Search results for: shared frailty survival models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8677

6547 Research and Application of Multi-Scale Three Dimensional Plant Modeling

Authors: Weiliang Wen, Xinyu Guo, Ying Zhang, Jianjun Du, Boxiang Xiao

Abstract:

Reconstructing and analyzing three-dimensional (3D) models from in situ measured data is important for a range of research areas and applications in plant science, including plant phenotyping, functional-structural plant modeling (FSPM), plant germplasm resource protection, and agricultural technology popularization. Plants span many scales, from cell, tissue, and organ up to whole plant and canopy, and the techniques currently used for data capture, feature analysis, and 3D reconstruction differ considerably across these scales. In this context, morphological data acquisition and the 3D analysis and modeling of plants at different scales are introduced systematically. The data capture equipment commonly used at each scale is introduced, and the open issues and difficulties of each scale are described. Several examples are given: micron-scale phenotyping quantification and 3D microstructure reconstruction of vascular bundles within maize stalks based on micro-CT scanning; 3D reconstruction of leaf surfaces and feature extraction from point clouds acquired with a 3D handheld scanner; and plant modeling by combining parameter-driven 3D organ templates. Several application examples using the resulting 3D models and analysis results are also introduced. A 3D maize canopy was constructed, and the light distribution within it was simulated and used for the design of ideal plant types. A grape tree model was constructed from 3D digitizing and point cloud data and used to produce science content for the 11th International Conference on Grapevine Breeding and Genetics. Using the tissue models of plants, Google Glass was used to look around visually inside a plant to understand its internal structure. With the development of information technology, 3D data acquisition and data processing techniques will play a greater role in plant science.

Keywords: plant, three dimensional modeling, multi-scale, plant phenotyping, three dimensional data acquisition

Procedia PDF Downloads 277
6546 The Relationship between Organizations' Acquired Skills, Knowledge, Abilities and Shareholders (SKAS) Wealth Maximization: The Mediating Role of Training Investment

Authors: Gabriel Dwomoh, Williams Kwasi Boachie, Kofi Kwarteng

Abstract:

The study examined the relationship between organizations’ acquired knowledge, skills, and abilities and shareholders’ wealth, with training investment playing the mediating role. The sample consisted of organizations that spent 10% or more of their annual budget on training and organizations whose training budget was less than 10% of the annual budget. A total of 620 questionnaires were distributed to employees working in various organizations, of which 580 (93.5%) were retrieved. The respondents constituting the sample were drawn using convenience sampling. The researchers used regression models for their analyses with the help of SPSS 16.0. Analyzing multiple models, it was found that organizations’ training investment has considerable direct and indirect effects, with partial mediation, on the relationship between organizations’ acquired skills, knowledge, and abilities and shareholders’ wealth. Shareholders should allow their agents to invest part of their holdings in developing the organization’s human capital, but this should be done with caution, since shareholders’ returns do not depend strongly on how much organizations spend on developing human resource capital.
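As an illustration of the mediation analysis described above, the sketch below estimates a direct and an indirect (mediated) effect from simulated data with ordinary least squares. The variable names, effect sizes, and sample size are hypothetical stand-ins, not the study's actual SPSS analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical illustration: X = acquired skills/knowledge/abilities score,
# M = training investment (mediator), Y = shareholder-wealth proxy.
X = rng.normal(size=n)
M = 0.6 * X + rng.normal(scale=0.5, size=n)            # a-path: X -> M
Y = 0.4 * M + 0.3 * X + rng.normal(scale=0.5, size=n)  # b-path and direct c'-path

def ols(y, *cols):
    """Least-squares slopes for the given columns (intercept appended last)."""
    A = np.column_stack(cols + (np.ones(len(y)),))
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

a = ols(M, X)[0]               # effect of X on the mediator
b, c_direct, _ = ols(Y, M, X)  # mediator effect and direct effect on Y
indirect = a * b               # mediated (indirect) effect

print(f"direct={c_direct:.2f}, indirect={indirect:.2f}")
```

Partial mediation corresponds to the case shown here: both the direct effect and the indirect product remain non-zero.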

Keywords: skills, knowledge, abilities, shareholders wealth, training investment

Procedia PDF Downloads 240
6545 Diversity in Finance Literature Revealed through the Lens of Machine Learning: A Topic Modeling Approach on Academic Papers

Authors: Oumaima Lahmar

Abstract:

This paper aims to define a structured topography for finance researchers seeking to navigate the body of knowledge in their extrapolation of finance phenomena. To make sense of the body of knowledge in finance, a probabilistic topic modeling approach is applied to 6,000 abstracts of academic articles published in three top finance journals between 1976 and 2020. This approach combines machine learning techniques and natural language processing to statistically identify the conjunctions between research articles and their shared topics, each described by relevant keywords. The topic modeling analysis reveals 35 coherent topics that depict the finance literature well and provide a comprehensive structure for the ongoing research themes. Comparing the extracted topics to the Journal of Economic Literature (JEL) classification system highlights a significant similarity between the characterizing keywords. On the other hand, we identify other topics that do not match the JEL classification despite being relevant in the finance literature.
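The keywords list perplexity, the standard held-out score for topic models. The sketch below is a minimal, self-contained illustration of how perplexity is computed for a document under a fitted topic mixture; the topic-word matrix, mixture weights, and document are invented toy values, not the paper's fitted model.

```python
import numpy as np

# Hypothetical 3-topic model over a 5-word vocabulary:
# rows = topics, columns = word probabilities (each row sums to 1).
topic_word = np.array([
    [0.5, 0.3, 0.1, 0.05, 0.05],
    [0.1, 0.1, 0.4, 0.3,  0.1 ],
    [0.2, 0.2, 0.2, 0.2,  0.2 ],
])
doc_topic = np.array([0.7, 0.2, 0.1])   # topic mixture for one document

# Word distribution implied by the mixture.
word_probs = doc_topic @ topic_word

# Held-out document as word indices; perplexity = exp(-mean log-likelihood).
doc = np.array([0, 0, 1, 2, 0, 3])
log_lik = np.log(word_probs[doc])
perplexity = np.exp(-log_lik.mean())
print(round(perplexity, 3))
```

Lower perplexity means the model assigns higher probability to held-out text; here the fitted mixture beats the uniform baseline (perplexity 5 for a 5-word vocabulary).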

Keywords: finance literature, textual analysis, topic modeling, perplexity

Procedia PDF Downloads 170
6544 Activity of Some Plant Extracts on the Larvae and Eggs of Culex quinquefasciatus in the Laboratory

Authors: A. A. El Maghrbi

Abstract:

The control of vectors such as mosquitoes has relied on the application of chemical insecticides, but these have adverse effects on the environment and have led to the development of resistance in most mosquito species, including vectors of important diseases. Ethanol and acetone extracts of nine plant species (Allium tuberosum, Apium leptophylum, Carica papaya, Cymbopogon citratus, Euphorbia cotinofolia, Melia azedarach, Ocimum canum, Ricinus communis, and Tagetes erecta) were tested for their effects on the eggs and larvae of Culex quinquefasciatus at concentrations of 100, 10, and 1 mg/L. Regarding larval survival, the ethanol extract of O. canum and the acetone extract of A. tuberosum at 100 mg/L showed larvicidal activity against L4 larvae of Cx. quinquefasciatus. Regarding egg hatching, the ethanol and acetone extracts of A. tuberosum (100 and 10 mg/L) and the acetone extract of C. citratus (100 mg/L) reduced the number of Cx. quinquefasciatus eggs hatched. Our results indicate that these plant extracts have potential for controlling mosquito populations and suggest that further studies are needed in this field.

Keywords: Cx. quinquefasciatus, plant extract, ethanol, acetone, larvae, eggs

Procedia PDF Downloads 365
6543 Prospect and Challenges of Public Bicycle Sharing System in Indian Cities

Authors: Anil Kumar

Abstract:

A Public Bicycle System (PBS), also known as a Public Bicycle Share System or Bike-Share, is a service in which a fleet of bicycles is made available to everyday commuters on a shared basis. The concept of PBS is new to the people of India and requires more study regarding its essential requirements, major infrastructural requirements, social acceptability, and various challenges. In various Indian cities, MRTS, BRTS, Monorail, and other modes have been adopted for the main haul of transport. These modes take more time and space and are also expensive to implement, whereas a PBS system is more economical and takes less time to implement. The main benefit of a PBS system is that it is more environmentally friendly. PBS is being implemented in many Indian cities for public use, but various challenges are associated with it. The study aims to determine the basic infrastructural requirements for PBS in India, as well as the extent to which a public bike sharing system can provide a quality, efficient service to passengers as a primary method of transportation.

Keywords: public bicycle sharing system, sustainable transport, infrastructure, smart city

Procedia PDF Downloads 192
6542 Count Data Regression Modeling: An Application to Spontaneous Abortion in India

Authors: Prashant Verma, Prafulla K. Swain, K. K. Singh, Mukti Khetan

Abstract:

Objective: In India, around 20,000 women die every year due to abortion-related complications. In the modelling of count variables, there is sometimes a preponderance of zero counts. This article concerns the estimation of various count regression models to predict the average number of spontaneous abortions among women in the Punjab state of India, and it assesses the factors associated with the number of spontaneous abortions. Materials and methods: The study included 27,173 married women of Punjab obtained from the DLHS-4 survey (2012-13). Poisson regression (PR), negative binomial (NB) regression, zero hurdle negative binomial (ZHNB), and zero-inflated negative binomial (ZINB) models were employed to predict the average number of spontaneous abortions and to identify the determinants affecting it. Results: Statistical comparisons among the four estimation methods revealed that the ZINB model provides the best prediction for the number of spontaneous abortions. Place of antenatal care (ANC), place of residence, total children born to a woman, and a woman's education and economic status were found to be the most significant factors affecting the occurrence of spontaneous abortion. Conclusions: The study offers a practical demonstration of techniques designed to handle count variables. The ZINB model provided the best prediction for the number of spontaneous abortions and is recommended for this purpose. The study suggests that women receive institutional antenatal care and attain limited parity. It also advocates promoting higher education among women in Punjab, India.
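To illustrate how a zero-inflated negative binomial model handles a preponderance of zeros, the sketch below evaluates the ZINB probability mass function: a structural-zero probability pi is mixed with an NB2 distribution (variance mu + alpha*mu^2). The parameter values are hypothetical, not estimates from the DLHS-4 data.

```python
import numpy as np
from scipy.stats import nbinom

def zinb_pmf(k, mu, alpha, pi):
    """Zero-inflated negative binomial pmf.
    mu: NB mean; alpha: overdispersion (var = mu + alpha*mu^2);
    pi: probability of a structural (excess) zero."""
    r = 1.0 / alpha        # NB size parameter
    p = r / (r + mu)       # NB success probability
    pmf = (1 - pi) * nbinom.pmf(k, r, p)
    return np.where(k == 0, pi + pmf, pmf)

# Hypothetical parameters for a spontaneous-abortion count variable.
k = np.arange(0, 50)
probs = zinb_pmf(k, mu=0.4, alpha=0.8, pi=0.3)
print(probs[0])   # probability of a zero count, inflated by pi
```

The zero class absorbs both structural zeros (pi) and NB sampling zeros, which is what lets the ZINB fit data where zeros are more common than a plain Poisson or NB model predicts.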

Keywords: count data, spontaneous abortion, Poisson model, negative binomial model, zero hurdle negative binomial, zero-inflated negative binomial, regression

Procedia PDF Downloads 155
6541 Recent Advancement in Dendrimer Based Nanotechnology for the Treatment of Brain Tumor

Authors: Nitin Dwivedi, Jigna Shah

Abstract:

Brain tumors are metastatic neoplasms of the central nervous system; in most cases they are life-threatening, with low survival rates. Despite enormous efforts in the development of therapeutics and diagnostic tools, the treatment of brain tumors and gliomas remains a considerable challenge in neuro-oncology. The main reason is the presence of physiological barriers, including the blood-brain barrier and the blood-brain tumor barrier, which limit the amount of therapeutic agent reaching the tumor site and result in inadequate destruction of gliomas. There is therefore a real need to empower brain tumor imaging for better characterization and delineation of tumors, visualization of malignant tissue during surgery, and tracking of response to chemotherapy and radiotherapy. Multifunctional dendrimers of different generations offer improved drug delivery to the site of brain tumors and gliomas. This article therefore emphasizes innovative dendrimer approaches to tumor targeting, tumor imaging, and delivery of therapeutic agents.

Keywords: blood brain barrier, dendrimer, gliomas, nanotechnology

Procedia PDF Downloads 561
6540 Investigations into the in situ Enterococcus faecalis Biofilm Removal Efficacies of Passive and Active Sodium Hypochlorite Irrigant Delivered into Lateral Canal of a Simulated Root Canal Model

Authors: Saifalarab A. Mohmmed, Morgana E. Vianna, Jonathan C. Knowles

Abstract:

The issue of apical periodontitis has received considerable critical attention. Bacteria integrate into communities, attach to surfaces, and consequently form biofilms. The biofilm structure provides bacteria with a measure of protection against antimicrobial agents and enhances pathogenicity (e.g., apical periodontitis). Sodium hypochlorite (NaOCl) has become the irrigant of choice for eliminating bacteria from the root canal system based on its antimicrobial properties. The aim of the study was to investigate the effect of different agitation techniques on the efficacy of 2.5% NaOCl in eliminating biofilm from the surface of the lateral canal, using residual biofilm and biofilm removal rate as outcome measures. The effect of canal complexity (lateral canal) on the efficacy of the irrigation procedure was also assessed. Forty root canal models (n = 10 per group) were manufactured using 3D printing and resin materials. Each model consisted of two halves of an 18 mm long root canal with apical size 30 and taper 0.06, and a lateral canal of 3 mm length and 0.3 mm diameter located 3 mm from the apical terminus. E. faecalis biofilms were grown on the apical 3 mm and lateral canal of the models for 10 days in Brain Heart Infusion broth. Biofilms were stained using crystal violet for visualisation. The model halves were reassembled, attached to an apparatus, and tested under a fluorescence microscope. A syringe-and-needle irrigation protocol was performed using 9 mL of 2.5% NaOCl for 60 seconds. The irrigant was either left stagnant in the canal or activated for 30 seconds using manual (gutta-percha), sonic, or ultrasonic methods. Images were then captured every second using an external camera. The percentages of residual biofilm were measured using image analysis software. The data were analysed using generalised linear mixed models.
The greatest removal was associated with the ultrasonic group (66.76%), followed by the sonic (45.49%), manual (43.97%), and passive irrigation (control) (38.67%) groups. No marked reduction in the efficiency of NaOCl to remove biofilm was found between the simple and complex anatomy models (p = 0.098). The removal efficacy of NaOCl on the biofilm was limited to the first 1 mm of the lateral canal. Agitation of NaOCl resulted in better penetration of the irrigant into the lateral canals, and ultrasonic agitation improved the removal of bacterial biofilm.

Keywords: 3D printing, biofilm, root canal irrigation, sodium hypochlorite

Procedia PDF Downloads 230
6539 Improving Chest X-Ray Disease Detection with Enhanced Data Augmentation Using Novel Approach of Diverse Conditional Wasserstein Generative Adversarial Networks

Authors: Malik Muhammad Arslan, Muneeb Ullah, Dai Shihan, Daniyal Haider, Xiaodong Yang

Abstract:

Chest X-rays are instrumental in the detection and monitoring of a wide array of diseases, including viral infections such as COVID-19, tuberculosis, pneumonia, lung cancer, and various cardiac and pulmonary conditions. To enhance the accuracy of diagnosis, artificial intelligence (AI) algorithms, particularly deep learning models like Convolutional Neural Networks (CNNs), are employed. However, these deep learning models demand a substantial and varied dataset to attain optimal precision. Generative Adversarial Networks (GANs) can be employed to create new data, thereby supplementing the existing dataset and enhancing the accuracy of deep learning models. Nevertheless, GANs have their limitations, such as issues related to stability, convergence, and the ability to distinguish between authentic and fabricated data. In order to overcome these challenges and advance the detection and classification of normal and abnormal CXR images, this study introduces a technique known as DCWGAN (Diverse Conditional Wasserstein GAN) for generating synthetic chest X-ray (CXR) images. The study evaluates the effectiveness of the DCWGAN technique using the ResNet50 model and compares its results with those obtained using the traditional GAN approach. The findings reveal that the ResNet50 model trained on the DCWGAN-generated dataset outperformed the model trained on the classic GAN-generated dataset. Specifically, the ResNet50 model utilizing DCWGAN synthetic images achieved impressive performance metrics with an accuracy of 0.961, precision of 0.955, recall of 0.970, and F1-measure of 0.963. These results indicate the promising potential of this approach for the early detection of diseases in CXR images.

Keywords: CNN, classification, deep learning, GAN, Resnet50

Procedia PDF Downloads 88
6538 Turkish Airlines' 85th Anniversary Commercial: An Analysis of the Institutional Identity of a Brand in Terms of Glocalization

Authors: Samil Ozcan

Abstract:

Airline companies target different customer segments in consideration of pricing, service quality, flight network, etc., and their brand positioning accords with the marketing strategies developed in the same direction. The object of this study, Turkish Airlines, has many peculiarities regarding its brand positioning as compared to its rivals in the sector. In the first place, it appeals to a global customer group because of its Star Alliance membership and its broad flight network with 315 destination points. The second group in its customer segmentation includes domestic customers. For this group, the company follows a marketing strategy that plays to local culture and accentuates the image of Turkishness as an emotional allurement. The advertisements and publicity projects designed in this regard put little emphasis on the service quality the company offers to its clients; they address the emotions of the consumers rather than individual benefits and rely on the historical memory of the nation and shared cultural values. This study examines the publicity work aimed at the second customer segment, focusing on Turkish Airlines’ 85th Anniversary Commercial through a symbolic meaning analysis approach. The commercial presents six stories with undertones of nationalism in its theme. Nationalism is not just the product of collective interests based on reason but a result of patriotism in the sense of loyalty to state and nation and love of ethnic belonging. While nationalism refers to concrete notions such as blood ties, common ancestors, and shared history, it is not from the actuality of these notions that it draws its real strength but from the emotions invested in them. The myths of origin, the idea of a common homeland, boundary definitions, and symbolic acculturation have instrumental importance in the development of these commonalities. The commercial offers concrete examples for an analysis of Connor’s definition of nationalism based on emotions.
Turning points in the history of the Turkish Republic, and the historical mission Turkish Airlines undertook in those moments, are narrated in six stories in the commercial with a highly emotional theme. These emotions, in general, depend on collective memory generated by national consciousness. Collective memory is not simply remembering the past; it is constructed through the reconstruction and reinterpretation of the past in the present moment. This study inquires into the motivations behind the nationalist emotions generated within the collective memory by engaging with the commercial released for the 85th anniversary of Turkish Airlines as the object of analysis. Symbols and myths can be read as key concepts that reveal the relation between 'identity and memory', because myths and symbols do not merely reflect collective memory; they reconstruct it as well. In this sense, the theme of the commercial defines the image of Turkishness with virtues such as self-sacrifice, helpfulness, humanity, and courage through a process of meaning creation based on symbolic mythologizations like flag and homeland. These virtues go beyond describing the image of Turkishness and become an instrument that defines and gives meaning to Turkish identity.

Keywords: collective memory, emotions, identity, nationalism

Procedia PDF Downloads 153
6537 Parameter Tuning of Complex Systems Modeled in Agent Based Modeling and Simulation

Authors: Rabia Korkmaz Tan, Şebnem Bora

Abstract:

The major problem encountered when modeling complex systems with agent-based modeling and simulation techniques is the existence of large parameter spaces. A complex system model cannot be expected to reflect the whole of the real system, but by specifying the most appropriate parameters, the actual system can be represented by the model under certain conditions. A review of studies conducted in recent years shows that there are few studies on the parameter tuning problem in agent-based simulations, and that these have focused on tuning the parameters of a single model. In this study, an approach to parameter tuning is proposed using metaheuristic algorithms such as the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Artificial Bee Colony (ABC), and Firefly (FA) algorithms. With this hybrid-structured approach, the parameter tuning problems of models in different fields were solved. The new approach was tested on two different models, and its performance on the different problems was compared. The simulations and results reveal that the proposed approach outperforms existing parameter tuning studies.
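As a sketch of the kind of metaheuristic search used for parameter tuning, the code below implements a basic Particle Swarm Optimization loop minimizing a stand-in "simulation error" objective; in a real application, the objective would run the agent-based simulation and compare its output with observed data. All constants here are illustrative defaults, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulation_error(params):
    """Stand-in objective: distance from a hypothetical target parameter set.
    In practice this would execute the agent-based model and score its
    output against reference data; here a simple quadratic surrogate."""
    target = np.array([1.5, -0.5])
    return np.sum((params - target) ** 2)

def pso(f, dim=2, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(-5, 5, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest, pbest_val = pos.copy(), np.array([f(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Inertia + pull toward personal best + pull toward global best.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([f(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, f(gbest)

best, err = pso(simulation_error)
print(best, err)
```

GA, ABC, and FA follow the same pattern: only the update rule for candidate parameter vectors changes, so swapping metaheuristics means swapping the loop body.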

Keywords: parameter tuning, agent based modeling and simulation, metaheuristic algorithms, complex systems

Procedia PDF Downloads 226
6536 Exchanging Messages in Ancient Greek Tragedy: The Use of δέλτος in the Euripidean and Sophoclean Stage

Authors: Maria-Agori Gravvani

Abstract:

Communication holds a significant place in human life. From the early beginnings of human history, humans tried to communicate orally with other people in order to survive and convey their needs. The level of education that the majority of Athenian citizens had the opportunity to acquire in the Classical period was very low. Only the wealthy had access to the higher form of education that led them to a career in politics, while the others struggled for their daily survival. In the corpus of Euripides' and Sophocles' tragedies, written communication appears as well. In Euripides' Iphigenia tragedies as well as in Sophocles' Trachiniae, the δέλτος (writing tablet) binds significant messages to people. These written means of private communication play an important role in the plot of the tragedy and conceal private messages from their owners. The main aim of this paper is to analyze the power of the written text of the deltos in Euripides' Iphigenia Taurica and Iphigenia Aulidensis and in Sophocles' Trachiniae.

Keywords: deltos, ancient Greek tragedy, Sophocles, Euripides

Procedia PDF Downloads 66
6535 A Hierarchical Bayesian Calibration of Data-Driven Models for Composite Laminate Consolidation

Authors: Nikolaos Papadimas, Joanna Bennett, Amir Sakhaei, Timothy Dodwell

Abstract:

Composite modeling of consolidation processes plays an important role in process and part design by indicating the possible formation of unwanted defects prior to expensive experimental iterative trial-and-development programs. Composite materials in their uncured state display complex constitutive behavior, which has received much academic interest, and different models have been proposed. Errors from modeling, and statistical errors arising from fitting, will propagate through any simulation in which the material model is used. A general hyperelastic polynomial representation was proposed, which can be readily implemented in various nonlinear finite element packages; in our case, FEniCS was chosen. The coefficients are assumed uncertain, and therefore the distribution of parameters is learned using Markov Chain Monte Carlo (MCMC) methods. In engineering, the approach often followed is to select a single set of model parameters which, on average, best fits a set of experiments; there are good statistical reasons why this is not a rigorous approach to take. To overcome these challenges, a hierarchical Bayesian framework was proposed in which the population distribution of model parameters is inferred from an ensemble of experimental tests. The resulting sampled distribution of hyperparameters is approximated using Maximum Entropy methods so that the distribution can be readily sampled when embedded within a stochastic finite element simulation. The methodology is validated and demonstrated on a set of consolidation experiments on AS4/8852 with various stacking sequences. The resulting distributions are then applied to stochastic finite element simulations of the consolidation of curved parts, leading to a distribution of possible model outputs. With this, the paper, as far as the authors are aware, presents the first stochastic finite element implementation in composite process modelling.
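A minimal sketch of the MCMC step described above: random-walk Metropolis-Hastings inferring a single uncertain material parameter from synthetic data under a Gaussian likelihood and a broad prior. The data, prior, and step size are illustrative assumptions, not the paper's hierarchical model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "experimental" data: noisy observations of one material parameter.
true_theta = 2.0
data = rng.normal(true_theta, 1.0, size=50)

def log_posterior(theta):
    # Gaussian likelihood (sigma = 1) with a broad N(0, 10^2) prior.
    log_lik = -0.5 * np.sum((data - theta) ** 2)
    log_prior = -0.5 * theta ** 2 / 100.0
    return log_lik + log_prior

# Random-walk Metropolis-Hastings.
n_samples, step = 20000, 0.5
samples = np.empty(n_samples)
theta = 0.0
lp = log_posterior(theta)
for i in range(n_samples):
    prop = theta + step * rng.normal()
    lp_prop = log_posterior(prop)
    if np.log(rng.random()) < lp_prop - lp:   # accept/reject
        theta, lp = prop, lp_prop
    samples[i] = theta

posterior = samples[5000:]          # discard burn-in
print(posterior.mean(), posterior.std())
```

The hierarchical version of this idea adds a second level: each experiment gets its own parameters, drawn from a population distribution whose hyperparameters are sampled in the same accept/reject loop.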

Keywords: data-driven, material consolidation, stochastic finite elements, surrogate models

Procedia PDF Downloads 146
6534 Optimum Design of Alkali Activated Slag Concretes for Low Chloride Ion Permeability and Water Absorption Capacity

Authors: Müzeyyen Balçikanli, Erdoğan Özbay, Hakan Tacettin Türker, Okan Karahan, Cengiz Duran Atiş

Abstract:

In this research, the effects of curing time (TC), curing temperature (CT), sodium concentration (SC), and silicate modulus (SM) on the compressive strength, chloride ion permeability, and water absorption capacity of alkali activated slag (AAS) concretes were investigated. The best possible combination of CT, TC, SC, and SM was determined to maximize compressive strength while minimizing chloride ion permeability and water absorption capacity. An experimental program was conducted using the central composite design method. The alkali solution-slag ratio was kept constant at 0.53 in all mixtures. The effects of the independent parameters on the measured properties (dependent parameters) were characterized and analyzed using statistically significant quadratic regression models. The proposed regression models are valid for AAS concretes with SC from 0.1% to 7.5%, SM from 0.4 to 3.2, CT from 20 °C to 94 °C, and TC from 1.2 hours to 25 hours. The results of the tests and analysis indicate that the most effective parameter for compressive strength, chloride ion permeability, and water absorption capacity is the sodium concentration.
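The quadratic regression models described above can be sketched as an ordinary least-squares fit of a second-order response surface in two of the factors (SC and SM). The coefficients and noise level below are invented for illustration, not fitted values from the experiments.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical response-surface illustration:
# strength ~ quadratic in sodium concentration (SC) and silicate modulus (SM),
# the form of model a central composite design is built to estimate.
n = 200
sc = rng.uniform(0.1, 7.5, n)        # sodium concentration, %
sm = rng.uniform(0.4, 3.2, n)        # silicate modulus
true = np.array([20.0, 4.0, 6.0, -0.4, -1.5, 0.5])  # b0,b1,b2,b11,b22,b12

X = np.column_stack([np.ones(n), sc, sm, sc**2, sm**2, sc * sm])
strength = X @ true + rng.normal(scale=0.2, size=n)   # simulated responses

coef, *_ = np.linalg.lstsq(X, strength, rcond=None)   # fitted surface
print(np.round(coef, 2))
```

The fitted second-order surface can then be optimized over the valid factor ranges to pick the mixture that maximizes strength while other responses are constrained.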

Keywords: alkali activation, slag, rapid chloride permeability, water absorption capacity

Procedia PDF Downloads 312
6533 Recognition of Gene Names from Gene Pathway Figures Using Siamese Network

Authors: Muhammad Azam, Micheal Olaolu Arowolo, Fei He, Mihail Popescu, Dong Xu

Abstract:

The number of biological papers is growing quickly, which means that the number of biological pathway figures in those papers is also increasing quickly. Each pathway figure shows extensive biological information, like the names of genes and how the genes are related. However, manually annotating pathway figures takes a lot of time and work. Even though using advanced image understanding models could speed up the process of curation, these models still need to be made more accurate. To improve gene name recognition from pathway figures, we applied a Siamese network to map image segments to a library of pictures containing known genes, in a similar way to person recognition from photos in many photo applications. We used a triplet loss function and a triplet spatial pyramid pooling network, combining a triplet convolutional neural network with spatial pyramid pooling (TSPP-Net). We compared VGG19 and VGG16 as the Siamese network backbone. VGG16 achieved better performance, with an accuracy of 93%, which is much higher than the OCR results.
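The triplet loss at the core of this approach can be sketched in a few lines: for an anchor, a positive (same gene), and a negative (different gene) embedding, the loss penalizes triplets where the anchor-positive distance is not smaller than the anchor-negative distance by a margin. The 2-D embeddings below are toy values standing in for network outputs.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Squared-distance triplet loss: pull the anchor toward the positive
    (same gene) and push it away from the negative (different gene)."""
    d_pos = np.sum((anchor - positive) ** 2)
    d_neg = np.sum((anchor - negative) ** 2)
    return max(0.0, d_pos - d_neg + margin)

# Toy embeddings standing in for Siamese network outputs.
a = np.array([0.0, 0.0])
p = np.array([0.1, 0.0])   # embedding of the same gene name
n = np.array([1.0, 1.0])   # embedding of a different gene

print(triplet_loss(a, p, n))   # well-separated triplet -> loss 0.0
```

At inference time, the same embedding distance is what matches an unknown image segment to its nearest entry in the library of known gene pictures.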

Keywords: biological pathway, image understanding, gene name recognition, object detection, Siamese network, VGG

Procedia PDF Downloads 291
6532 A Comparison of Convolutional Neural Network Architectures for the Classification of Alzheimer’s Disease Patients Using MRI Scans

Authors: Tomas Premoli, Sareh Rowlands

Abstract:

In this study, we investigate the impact of various convolutional neural network (CNN) architectures on the accuracy of diagnosing Alzheimer’s disease (AD) using patient MRI scans. Alzheimer’s disease is a debilitating neurodegenerative disorder that affects millions worldwide. Early, accurate, and non-invasive diagnostic methods are required for providing optimal care and symptom management. Deep learning techniques, particularly CNNs, have shown great promise in enhancing this diagnostic process. We aim to contribute to the ongoing research in this field by comparing the effectiveness of different CNN architectures and providing insights for future studies. Our methodology involved preprocessing MRI data, implementing multiple CNN architectures, and evaluating the performance of each model. We employed intensity normalization, linear registration, and skull stripping for our preprocessing. The selected architectures included VGG, ResNet, and DenseNet models, all implemented using the Keras library. We employed transfer learning and trained models from scratch to compare their effectiveness. Our findings demonstrated significant differences in performance among the tested architectures, with DenseNet201 achieving the highest accuracy of 86.4%. Transfer learning proved to be helpful in improving model performance. We also identified potential areas for future research, such as experimenting with other architectures, optimizing hyperparameters, and employing fine-tuning strategies. By providing a comprehensive analysis of the selected CNN architectures, we offer a solid foundation for future research in Alzheimer’s disease diagnosis using deep learning techniques. Our study highlights the potential of CNNs as a valuable diagnostic tool and emphasizes the importance of ongoing research to develop more accurate and effective models.

Keywords: Alzheimer’s disease, convolutional neural networks, deep learning, medical imaging, MRI

Procedia PDF Downloads 73
6531 Performance Improvement of Information System of a Banking System Based on Integrated Resilience Engineering Design

Authors: S. H. Iranmanesh, L. Aliabadi, A. Mollajan

Abstract:

Integrated resilience engineering (IRE) is capable of returning banking systems to the normal state under severe economic circumstances. In this study, the information system of a large bank (with several branches) is assessed and optimized under severe economic conditions. Data envelopment analysis (DEA) models are employed to achieve the objective of this study. Nine IRE factors are considered the outputs, and a dummy variable is defined as the input of the DEA models. A standard questionnaire is designed and distributed among executive managers, who are considered the decision-making units (DMUs). Reliability and validity of the questionnaire are examined based on Cronbach's alpha and a t-test. The most appropriate DEA model is determined based on average efficiency and a normality test. It is shown that the proposed integrated design provides higher efficiency than the conventional RE design. Results of sensitivity and perturbation analysis indicate that self-organization, fault tolerance, and reporting culture together compose about 50 percent of the total weight.
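The DEA step can be sketched as a linear program. The code below solves the input-oriented CCR multiplier model for each DMU on a toy single-input, single-output dataset; the study's nine-output, dummy-input setup would change only the data shapes.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: one input and one output per DMU (e.g. per manager).
x = np.array([[1.0], [2.0], [3.0]])   # inputs,  shape (n_dmu, n_in)
y = np.array([[1.0], [4.0], [3.0]])   # outputs, shape (n_dmu, n_out)

def ccr_efficiency(o):
    """Input-oriented CCR multiplier model for DMU o:
    max u'y_o  s.t.  v'x_o = 1,  u'y_j - v'x_j <= 0 for all j,  u, v >= 0."""
    n_out, n_in = y.shape[1], x.shape[1]
    c = np.concatenate([-y[o], np.zeros(n_in)])   # linprog minimizes -u'y_o
    A_ub = np.hstack([y, -x])                     # u'y_j - v'x_j <= 0
    b_ub = np.zeros(len(x))
    A_eq = np.concatenate([np.zeros(n_out), x[o]])[None, :]   # v'x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n_out + n_in))
    return -res.fun

effs = [ccr_efficiency(o) for o in range(len(x))]
print(np.round(effs, 3))   # DMU with the best output/input ratio scores 1.0
```

Efficiency scores are bounded by 1; DMUs scoring 1 lie on the efficient frontier, and the others are benchmarked against them.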

Keywords: banking system, Data Envelopment Analysis (DEA), Integrated Resilience Engineering (IRE), performance evaluation, perturbation analysis

Procedia PDF Downloads 188
6530 Exploration and Evaluation of the Effect of Multiple Countermeasures on Road Safety

Authors: Atheer Al-Nuaimi, Harry Evdorides

Abstract:

Every day, many people die or are disabled or injured on roads around the world, which necessitates more specific treatments for transportation safety issues. The International Road Assessment Programme (iRAP) model is one of the comprehensive road safety models, accounting for many factors that affect road safety in a cost-effective way in low- and middle-income countries. In the iRAP model, road safety is divided into five star ratings, from 1 star (the lowest level) to 5 stars (the highest level). These ratings are based on a star rating score calculated by the iRAP methodology from road attributes, traffic volumes, and operating speeds. The outputs of the iRAP methodology are treatments that can be used to improve road safety and reduce the numbers of fatalities and serious injuries (FSI). These countermeasures can be applied separately, as a single countermeasure, or combined as multiple countermeasures for a location. There is general agreement that a countermeasure's effectiveness is subject to consistent losses when it is used in combination with other countermeasures; that is, the crash reduction estimates of individual countermeasures cannot simply be added together. The iRAP methodology therefore uses multiple-countermeasure adjustment factors to predict reductions in the effectiveness of road safety countermeasures when more than one countermeasure is chosen. A multiple-countermeasure correction factor is calculated for every 100-meter segment and for every crash type. However, limitations of this approach include a probable over-estimation of the predicted crash reduction. This study aims to adjust this correction factor by developing new models to calculate the effect of using multiple countermeasures on the number of fatalities for a location or an entire road. Regression models have been used to establish relationships between crash frequencies and the factors that affect their rates.
Multiple linear regression, negative binomial regression, and Poisson regression techniques were used to develop models that can address the effectiveness of using multiple countermeasures. Analyses conducted using the R statistical computing environment showed that a model developed with the negative binomial regression technique gives more reliable predictions of the number of fatalities after the implementation of multiple road safety countermeasures than the iRAP model. The results also showed that the negative binomial regression approach is more precise than the multiple linear and Poisson regression techniques because of overdispersion and standard error issues.
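The preference for the negative binomial stems from overdispersion: crash counts whose variance exceeds their mean violate the Poisson assumption. A minimal sketch (in Python rather than the R environment used in the study, with purely hypothetical parameter values) illustrates the diagnostic:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate overdispersed crash counts for 500 road segments.
# A negative binomial arises as a Poisson with a gamma-distributed rate.
n_segments = 500
mu = 4.0      # mean crashes per segment (hypothetical value)
theta = 2.0   # dispersion parameter; smaller -> more overdispersion
rates = rng.gamma(shape=theta, scale=mu / theta, size=n_segments)
crashes = rng.poisson(rates)

# Under a Poisson model the variance equals the mean, so the Pearson
# dispersion statistic (chi-square / df) should be close to 1.
fitted_mean = crashes.mean()  # intercept-only Poisson MLE
pearson_chi2 = ((crashes - fitted_mean) ** 2 / fitted_mean).sum()
dispersion = pearson_chi2 / (n_segments - 1)

print(f"mean={fitted_mean:.2f}  variance={crashes.var(ddof=1):.2f}")
print(f"Pearson dispersion statistic: {dispersion:.2f}")
```

A dispersion statistic well above 1, as here, is the standard signal that a negative binomial model (variance mu + mu^2/theta) will give more honest standard errors than a Poisson.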

Keywords: international road assessment program, negative binomial, road multiple countermeasures, road safety

Procedia PDF Downloads 240
6529 What 4th-Year Primary-School Students Are Thinking: A Paper Airplane Problem

Authors: Neslihan Şahin Çelik, Ali Eraslan

Abstract:

In recent years, mathematics educators have frequently stressed the necessity of instructing students in models and modeling approaches that encompass cognitive and metacognitive thought processes, starting from the first years of school and continuing through higher education. The purpose of this study is to examine the thought processes of 4th-grade primary school students in their modeling activities and to explore any difficulties encountered in these processes. The study, of qualitative design, was conducted in the 2015-2016 academic year at a public state school located in a central city of the Black Sea Region of Turkey. A preliminary study was first implemented with the designated 4th-grade students, after which the criterion sampling method was used to select three students for a focus group. The focus group was asked to work on the model-eliciting activity of the Paper Airplane Problem, and the entire process was recorded on video. The Paper Airplane Problem required the students to determine the winner with respect to: (a) the plane that stays in the air the longest; (b) the plane that travels the greatest distance in a straight-line path; and (c) the overall winner of the contest. A written transcript was made of the video recording, after which the recording and the students' worksheets were analyzed using the Blum and Ferri modeling cycle. The results revealed that the students tested the hypotheses they had set up in relation to daily life, generated their own ideas, verified their models by making connections with real life, and tried to make their models generalizable. On the other hand, the students had some difficulties interpreting the table of data and operating on the data during the modeling processes.

Keywords: primary school students, model eliciting activity, mathematical modeling, modeling process, paper airplane problem

Procedia PDF Downloads 358
6528 Diagnostic Assessment for Mastery Learning of Engineering Students with a Bayesian Network Model

Authors: Zhidong Zhang, Yingchen Yang

Abstract:

In this study, a diagnostic assessment model for mastery learning in engineering was established, based on a group of undergraduate students enrolled in an engineering course. A diagnostic assessment model can examine students' learning processes and report achievement results. One unique characteristic is that the diagnostic assessment model can recognize errors and anything blocking students in their learning processes. Feedback is provided to help students learn how to solve learning problems with alternative strategies, and to help the instructor find alternative pedagogical strategies in the instructional design. Dynamics, the subject of this study, is a core course shared by several engineering programs, and its problems are very challenging for engineering students to solve. Knowledge acquisition and problem-solving skills are thus crucial for student success, and developing an effective and valid assessment model for student learning is therefore of great importance. Diagnostic assessment is such a model, providing effective feedback for both students and the instructor in the mastery of engineering learning.
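As an illustration of the kind of inference such a model performs (not the authors' actual network; the slip and guess probabilities below are invented for the example), a single latent "mastery" node observed through item responses can be updated with Bayes' rule:

```python
# Hypothetical slip/guess parameters; not taken from the paper.
def mastery_posterior(prior, responses, slip=0.1, guess=0.2):
    """Update P(mastery) after a sequence of correct (True) /
    incorrect (False) item responses, assuming conditional
    independence of items given mastery (a naive Bayesian network)."""
    p = prior
    for correct in responses:
        if correct:
            like_m, like_n = 1 - slip, guess  # P(correct | mastery / non-mastery)
        else:
            like_m, like_n = slip, 1 - guess  # P(incorrect | ...)
        num = like_m * p
        p = num / (num + like_n * (1 - p))    # Bayes' rule
    return p

# A student who answers three dynamics problems correctly:
print(round(mastery_posterior(0.5, [True, True, True]), 3))
# An incorrect answer pulls the estimate back down:
print(round(mastery_posterior(0.5, [True, False]), 3))
```

The same posterior machinery is what lets a diagnostic model distinguish a careless slip from a genuine gap in understanding and choose feedback accordingly.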

Keywords: diagnostic assessment, mastery learning, engineering, bayesian network model, learning processes

Procedia PDF Downloads 152
6527 Agile Project Management: A Real Application in a Multi-Project Research and Development Center

Authors: Aysegul Sarac

Abstract:

The aim of this study is to analyze the impact of integrating agile development principles and practices, in particular on reducing project lead time in a multi-project environment. We analyze the Arçelik Washing Machine R&D Center, in which multiple projects are conducted with shared resources. In the first part of the study, we illustrate the current waterfall system using a value stream map. We define all activities, starting from the first idea for a project through to the customer, and measure the process time and lead time of projects. In the second part of the study, we estimate potential improvements and select a subset of these improvements to integrate agile principles. We aim to develop a future state map and analyze the impact of integrating lean principles on project lead time. The main contribution of this study is that we analyze and integrate agile product development principles in a real multi-project system.

Keywords: agile project management, multi project system, project lead time, product development

Procedia PDF Downloads 305
6526 Long Short-Term Memory Stream Cruise Control Method for Automated Drift Detection and Adaptation

Authors: Mohammad Abu-Shaira, Weishi Shi

Abstract:

Adaptive learning, a commonly employed response to drift, involves updating predictive models online during their operation to react to concept drifts, making it a critical component and natural extension of online learning systems that learn incrementally from each example. This paper introduces LSTM-SCCM (Long Short-Term Memory Stream Cruise Control Method), a drift-adaptation-as-a-service framework for online learning. LSTM-SCCM automates drift adaptation through prompt detection, drift magnitude quantification, dynamic hyperparameter tuning, short-term optimization and model recalibration for immediate adjustments, and, when necessary, long-term model recalibration to ensure deeper improvements in model performance. LSTM-SCCM is incorporated into a suite of cutting-edge online regression models, and their performance is assessed across various types of concept drift using diverse datasets with varying characteristics. The findings demonstrate that LSTM-SCCM represents a notable advancement in both model performance and efficacy in handling concept drift occurrences. LSTM-SCCM stands out as the sole framework adept at effectively tackling concept drifts within regression scenarios. Its proactive approach to drift adaptation distinguishes it from conventional reactive methods, which typically rely on retraining after drifts have already caused significant degradation in model performance. Additionally, LSTM-SCCM employs an in-memory approach combined with the Self-Adjusting Memory (SAM) architecture to enhance real-time processing and adaptability. The framework incorporates variable thresholding techniques and does not assume any particular data distribution, making it well suited to managing high-dimensional datasets and efficiently handling large-scale data.
Our experiments, which include abrupt, incremental, and gradual drifts across both low- and high-dimensional datasets with varying noise levels, and applied to four state-of-the-art online regression models, demonstrate that LSTM-SCCM is versatile and effective, rendering it a valuable solution for online regression models to address concept drift.
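The detect-then-recalibrate loop that such a framework automates can be sketched in a few lines. This is only a generic illustration of error-based drift monitoring with a variable threshold, not the LSTM-SCCM algorithm itself; the window sizes and threshold factor below are arbitrary:

```python
from collections import deque

class DriftMonitor:
    """Minimal sketch of drift detection: track a short and a long window
    of absolute prediction errors and flag drift when the recent mean
    error exceeds the long-run mean by a configurable factor. A real
    framework would then trigger hyperparameter tuning or recalibration."""
    def __init__(self, short=20, long=200, factor=2.0):
        self.short = deque(maxlen=short)
        self.long = deque(maxlen=long)
        self.factor = factor

    def update(self, error):
        self.short.append(abs(error))
        self.long.append(abs(error))
        if len(self.long) < self.long.maxlen // 2:
            return False                        # not enough history yet
        baseline = sum(self.long) / len(self.long)
        recent = sum(self.short) / len(self.short)
        return recent > self.factor * baseline  # drift detected?

monitor = DriftMonitor()
# Stable phase: small errors; after step 300 an abrupt drift inflates them.
flags = [monitor.update(0.1 if t < 300 else 1.0) for t in range(340)]
print("first drift flag at step:", flags.index(True))
```

The short window makes detection prompt while the long window supplies the baseline; on detection, an adaptive system would recalibrate the model rather than wait for accumulated degradation.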

Keywords: automated drift detection and adaptation, concept drift, hyperparameters optimization, online and adaptive learning, regression

Procedia PDF Downloads 11
6525 Transformation of Industrial Policy towards Industry 4.0 and Its Impact on Firms' Competition

Authors: Arūnas Burinskas

Abstract:

Europe is on the threshold of a new industrial revolution, called Industry 4.0, which many believe will increase the flexibility of production, enable mass customization of products, and speed up service; it should also improve product quality and dramatically increase productivity. However, as expected, all these benefits of Industry 4.0 come with inevitable changes and challenges. One of them is the inevitable transformation of current competition and business models. This article examines the possible results of a competitive shift from the classic Bertrand and Cournot models to a qualitatively new competition based on innovation. The ability to deliver a new product quickly, and to produce individual designs (through flexible, quickly configurable factories) while reducing equipment failures and increasing process automation and control, becomes highly important. This study shows that the ongoing transformation of the competition model is changing the game. This, together with the creation of complex value networks, requires huge investments, which makes participation particularly difficult for small and medium-sized enterprises. In addition, the ongoing digitalization of data raises new concerns regarding legal obligations, intellectual property, and security.
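For reference, the classic models the article takes as its baseline have closed-form equilibria. A small sketch with textbook parameters (not taken from the article) contrasts them:

```python
# Illustrative textbook parameters: inverse demand P = a - b*(q1 + q2),
# constant marginal cost c for both firms.
def cournot_duopoly(a, b, c):
    """Symmetric Cournot-Nash equilibrium: each firm best-responds with
    q_i = (a - c - b*q_j) / (2b); solving the pair of best responses
    gives q* = (a - c) / (3b) and price P* = (a + 2c) / 3."""
    q = (a - c) / (3 * b)
    price = a - b * 2 * q
    return q, price

def bertrand_duopoly(c):
    """With identical goods and price competition, mutual undercutting
    drives the price down to marginal cost (zero economic profit)."""
    return c

q, p_cournot = cournot_duopoly(a=120, b=1, c=30)
print(f"Cournot: each firm produces {q}, market price {p_cournot}")
print(f"Bertrand: price equals marginal cost {bertrand_duopoly(30)}")
```

Under Cournot competition firms retain a margin (price 60 against cost 30 in this example), while Bertrand competition with identical goods erodes it entirely; innovation-based competition escapes both by differentiating the product itself rather than competing on quantity or price alone.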

Keywords: Bertrand and Cournot Competition, competition model, industry 4.0, industrial organisation, monopolistic competition

Procedia PDF Downloads 138
6524 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes

Authors: Igor A. Krichtafovitch

Abstract:

Evolutionary processes are not linear. Long periods of quiet, slow development turn into rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the 3 previously existing phyla. Contrary to common belief, natural selection, or survival of the fittest, cannot account for the dominant evolutionary vector: the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, proposes a satisfactory explanation of these phenomena. The proposed hypothesis offers a logical and plausible explanation of evolutionary processes in general. It is based on two postulates: (a) the biosphere is a single living organism, all parts of which are interconnected, and (b) the biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but accelerated growth of the computational complexity of living organisms. The following postulates summarize the proposed hypothesis: biological evolution, as natural life origin and development, is a reality. Evolution is a coordinated and controlled process. One of evolution's main development vectors is the growing computational complexity of living organisms and of the biosphere's intelligence. The intelligent matter that conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. Information acts like software stored in and controlled by the biosphere.
Random mutations trigger this software, as stipulated by Darwinian evolution theories, and it is further stimulated by the growing demand for the biosphere's global memory storage and computational complexity. A greater memory volume requires a greater number of more intellectually advanced organisms to store and handle it. More intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with accelerated evolutionary dynamics. New species emerge when two conditions are met: (a) crucial environmental changes occur and/or global memory storage volume reaches its limit, and (b) the biosphere's computational complexity reaches a critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life creation and evolution. It logically resolves many puzzling problems of current evolutionary theory: speciation, as a result of GM purposeful design; the evolutionary development vector, as a need for growing global intelligence; punctuated equilibrium, occurring when the two conditions (a) and (b) above are met; the Cambrian explosion; and mass extinctions, occurring when more intelligent species replace outdated creatures.

Keywords: supercomputer, biological evolution, Darwinism, speciation

Procedia PDF Downloads 164
6523 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction

Authors: Radul Shishkov, Orlin Davchev

Abstract:

The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in time and resources required for BIM model generation while maintaining high levels of accuracy and detail.
This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
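One common building block of such pipelines, segmenting planar elements such as walls and floors out of a scan, is RANSAC plane fitting. The sketch below is a generic illustration on synthetic data, not the authors' algorithm; production pipelines add normal estimation, region growing, and model refinement:

```python
import numpy as np

def ransac_plane(points, n_iters=200, tol=0.02, rng=None):
    """Minimal RANSAC plane detector: repeatedly fit a plane to three
    random points and keep the plane with the most inliers (points
    within `tol` of the plane)."""
    rng = rng or np.random.default_rng(0)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:  # degenerate (collinear) sample
            continue
        normal /= norm
        dist = np.abs((points - p0) @ normal)  # point-to-plane distances
        inliers = dist < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers

# Synthetic scan: 300 points on a "wall" (z ~ 0) plus 60 stray points.
rng = np.random.default_rng(1)
wall = np.c_[rng.uniform(0, 5, 300), rng.uniform(0, 3, 300),
             rng.normal(0, 0.005, 300)]
noise = rng.uniform(0, 5, (60, 3))
cloud = np.vstack([wall, noise])
mask = ransac_plane(cloud)
print("plane inliers found:", mask.sum())
```

Each detected plane, together with its extent and orientation, is the kind of geometric primitive that can then be mapped onto a parametric BIM wall or slab element.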

Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction

Procedia PDF Downloads 63
6522 Reducing Metabolism Residues in Maintenance Goldfish (Carassius auratus auratus) by Phytoremediation Plant

Authors: Anna Nurkhasanah, Hamzah Muhammad Ihsan, Nurul Wulandari

Abstract:

Water quality affects the body condition of aquatic organisms. One method of managing water quality, usually called phytoremediation, involves using aquatic plants. The purpose of this study is to find the best aquatic plant for reducing metabolic residues from aquatic organisms. Thirteen goldfish (Carassius auratus auratus) were maintained in five aquariums (40x30x30 cm) containing 100 grams of each of four different plants: water hyacinth (Eichhornia crassipes), salvinia (Salvinia molesta), cabomba (Cabomba caroliniana), and hydrilla (Hydrilla verticillata). The maintenance was conducted over one week, and water quality measurements were performed three times. The results show that pH values ranged between 7.22 and 8.72, and temperature varied between 25 and 26 °C. DO values varied between 5.2 and 10.5 mg/L, ammonia between 0.005 and 5.2 mg/L, nitrite between 0.005 and 2.356 mg/L, nitrate between 0.791 and 1.737 mg/L, and CO2 between 2.2 and 6.1 mg/L. The survival rate of the goldfish was 100% in all treatments. Based on this study, the best aquatic plant for reducing metabolic residues is hydrilla.

Keywords: phytoremediation, goldfish, aquatic plants, water quality

Procedia PDF Downloads 521
6521 Prediction of Gully Erosion with Stochastic Modeling by Using Geographic Information System and Remote Sensing Data in the North of Iran

Authors: Reza Zakerinejad

Abstract:

Gully erosion is a serious problem that threatens the sustainability of agricultural areas, rangelands, and water resources in a large part of Iran. This type of water erosion is the main source of sedimentation in many catchment areas in the north of Iran. Since many national assessment approaches have applied only qualitative models, the aim of this study is to predict the spatial distribution of gully erosion processes by means of detailed terrain analysis and GIS-based logistic regression on the loess deposits in a case study in Golestan Province. A DEM with 25-meter resolution derived from ASTER data was used, and Landsat ETM data were used to map land use. The TreeNet model, a stochastic modeling approach, was applied to predict the areas susceptible to gully erosion; 20% of the data were held out for model evaluation, and performance was assessed with ROC analysis. GIS and satellite image analysis techniques were thus used to derive the input information for the stochastic model. The result of this study is a highly accurate map of gully erosion potential.
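The GIS-based logistic regression step can be sketched generically: each raster cell contributes terrain attributes as predictors and a gully/no-gully label as the response. The example below uses synthetic data and invented coefficients, purely to illustrate the modeling step (the study itself uses the TreeNet ensemble, for which logistic regression is a simpler stand-in):

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Plain gradient-descent logistic regression on per-cell
    terrain attributes; returns intercept + coefficients."""
    X = np.c_[np.ones(len(X)), X]         # add intercept column
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))  # predicted gully probability
        w -= lr * X.T @ (p - y) / len(y)  # gradient of the log-loss
    return w

def predict(w, X):
    X = np.c_[np.ones(len(X)), X]
    return 1.0 / (1.0 + np.exp(-X @ w))

# Synthetic cells: two standardized attributes (e.g. slope, flow accumulation).
rng = np.random.default_rng(7)
X = rng.normal(size=(400, 2))
true_w = np.array([-0.5, 2.0, 1.0])       # hypothetical coefficients
p_true = 1 / (1 + np.exp(-(true_w[0] + X @ true_w[1:])))
y = (rng.uniform(size=400) < p_true).astype(float)

w = fit_logistic(X, y)
acc = ((predict(w, X) > 0.5) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Applied cell by cell across the DEM-derived attribute rasters, the fitted probabilities form exactly the kind of susceptibility map the study produces.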

Keywords: TreeNet model, terrain analysis, Golestan Province, Iran

Procedia PDF Downloads 535
6520 Day of the Week Patterns and the Financial Trends' Role: Evidence from the Greek Stock Market during the Euro Era

Authors: Nikolaos Konstantopoulos, Aristeidis Samitas, Vasileiou Evangelos

Abstract:

The purpose of this study is to examine whether financial trends influence not only stock markets' returns but also their anomalies. We study the day-of-the-week (DOW) effect in the Greek stock market during the euro period (2002-12) because, during this specific period, there were no significant structural changes and there were long-term financial trends. Moreover, in order to avoid the methodological counterarguments that usually arise in the literature, we apply several linear (OLS) and nonlinear (GARCH-family) models to our sample, concluding that the TGARCH model fits our sample better than any other. Our results suggest that in the Greek stock market there is a long-term predisposition for positive or negative returns depending on the weekday. However, the statistical significance is influenced by the financial trend. This influence may be the reason for the conflicting findings in the literature over time. Finally, combining the DOW empirical findings from 1985-2012, we may assume that in the Greek case there is a tendency for a long-lived turn-of-the-week effect.
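The basic DOW computation behind such studies, grouping returns by weekday and testing each mean against zero, can be sketched on synthetic data (the built-in Monday/Friday pattern below is invented for illustration; the study itself estimates TGARCH models on Athens Stock Exchange data):

```python
import numpy as np

rng = np.random.default_rng(3)
n_days = 1000
weekday = np.arange(n_days) % 5  # 0=Mon ... 4=Fri
# Build in a hypothetical Monday-negative / Friday-positive pattern:
drift = np.where(weekday == 0, -0.30, np.where(weekday == 4, 0.25, 0.0))
returns = drift + rng.normal(0, 1.0, n_days)

for d, name in enumerate(["Mon", "Tue", "Wed", "Thu", "Fri"]):
    r = returns[weekday == d]
    # t-statistic for the mean return on this weekday differing from zero
    t = r.mean() / (r.std(ddof=1) / np.sqrt(len(r)))
    print(f"{name}: mean={r.mean():+.3f}  t={t:+.2f}")
```

OLS with weekday dummies reproduces exactly these group means; the GARCH-family extension matters because daily return volatility is time-varying, which plain OLS t-statistics ignore.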

Keywords: day of the week effect, GARCH family models, Athens stock exchange, economic growth, crisis

Procedia PDF Downloads 410
6519 Importance of New Policies of Process Management for Internet of Things Based on Forensic Investigation

Authors: Venkata Venugopal Rao Gudlur

Abstract:

The proposed policies, referred to as "SOP", for Internet of Things (IoT)-based forensic investigation of process management are a recent development to save time and provide quick solutions for investigators. The forensic investigation process has been developed over many years; over time it has delivered the required information, but with no policies governing investigation processes. This research shows that current IoT-based forensic investigation of process management is increasingly tied to connected devices and their policies. All future development in real-time information gathering and monitoring evolves with smart sensor-based technologies connected directly to the IoT. This paper presents a conceptual framework for process management. Smart devices are leading the way in terms of the automated forensic models and frameworks established by different scholars. These models and frameworks have mostly focused on offering a roadmap for performing forensic operations, with no policies in place. The initiative proposed here would bring tremendous benefit to process management and to IoT forensic investigators by proposing policies. The forensic investigation process may thereby gain security and suffer fewer data losses and vulnerabilities.

Keywords: internet of things, process management, forensic investigation, M2M framework

Procedia PDF Downloads 102
6518 Parallel Transformation Processes of Historical Centres: The Cases of Sevilla and Valparaiso

Authors: Jorge Ferrada Herrera, Pablo M. Millán-Millán

Abstract:

The delimitation of heritage areas within cities implies strong processes of transformation, both social and material. This study shows how two cities as seemingly different as Seville (Spain) and Valparaíso (Chile) have shared the same transformation process since their declaration as heritage cities. The methodology used in the research has been, on the one hand, analytic-critical, which revealed all the processes and their degree of impact; on the other hand, direct observation allowed us to corroborate everything studied. In the face of these processes, the research identifies the social resources that people have developed to address each of them. The study concludes that the social and associative fabric in heritage areas needs to be strengthened as a resource to ensure the survival of heritage, not only material but also social and cultural. As examples, we have chosen Seville and Valparaíso: the gentrification of Seville prior to the 1992 universal exhibition, under quite specific plans, is paralleled by Valparaíso's plan to revitalize its port and its UNESCO-protected area. The whole of our theoretical discourse is based thereupon.

Keywords: historical centers, tourism, heritage, social processes

Procedia PDF Downloads 305