Search results for: light gradient boosting model (LGBM)
18654 Quantum Statistical Machine Learning and Quantum Time Series
Authors: Omar Alzeley, Sergey Utev
Abstract:
Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of this optimization, which is central to learning theory. One approach to complex systems, in which the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable, is time series analysis. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging; the quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model previously applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyse the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigations. The proposed model is examined using both real and simulated data, and we establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of applying QTS in the field of quantum chaos has been to find a model that explains chaotic behaviour; this model may reveal further insight into quantum chaos.
Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series
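The Kalman filter mentioned in the abstract can be illustrated with a minimal sketch: treating the unknown AR(1) coefficient of a time series as a hidden state and filtering it recursively. The series length, noise levels, and true coefficient below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) series y_t = phi * y_{t-1} + e_t (assumed parameters)
phi_true = 0.7
n = 500
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + rng.normal(scale=0.5)

# Kalman filter with the AR coefficient as a static hidden state:
#   state:       phi_t = phi_{t-1}             (no process noise)
#   observation: y_t   = y_{t-1} * phi_t + e_t
phi_hat, P, R = 0.0, 1.0, 0.25   # initial estimate, its variance, obs. noise var.
for t in range(1, n):
    H = y[t - 1]                  # time-varying observation "matrix"
    S = H * P * H + R             # innovation variance
    K = P * H / S                 # Kalman gain
    phi_hat += K * (y[t] - H * phi_hat)
    P *= (1 - K * H)

print(round(phi_hat, 2))          # should be near phi_true = 0.7
```

With enough data the recursive estimate converges to the least-squares AR(1) estimate, which is the sense in which the filter "estimates the unknown parameters of the model."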
Procedia PDF Downloads 469
18653 Methodology for Obtaining Static Alignment Model
Authors: Lely A. Luengas, Pedro R. Vizcaya, Giovanni Sánchez
Abstract:
In this paper, a methodology is presented for obtaining the static alignment model of any transtibial amputee. The proposed methodology starts from experimental data collected at the Hospital Militar Central, Bogotá, Colombia. The effects of transtibial prosthesis malalignment on amputees were measured in terms of joint angles, center of pressure (COP), and weight distribution. Statistical tools were used to obtain the model parameters, and mathematical predictive models of prosthetic alignment were created. The proposed models were validated on amputees, with promising results for prosthesis static alignment. The static alignment process is unique to each subject; nevertheless, the proposed methodology can be applied to any transtibial amputee.
Keywords: information theory, prediction model, prosthetic alignment, transtibial prosthesis
Procedia PDF Downloads 257
18652 Design and Implementation of Low-code Model-building Methods
Authors: Zhilin Wang, Zhihao Zheng, Linxin Liu
Abstract:
This study proposes a low-code model-building approach that aims to simplify the development and deployment of artificial intelligence (AI) models. With an intuitive drag-and-drop interface for connecting components, users can easily build complex models and integrate multiple algorithms for training. After training is completed, the system automatically generates a callable model service API. This method not only lowers the technical threshold of AI development and improves development efficiency but also enhances the flexibility of algorithm integration and simplifies the model deployment process. The core strength of this method lies in its ease of use and efficiency: users do not need a deep programming background and can complete the design and implementation of complex models with simple drag-and-drop operations. This feature greatly expands the reach of AI technology, allowing more non-technical people to participate in the development of AI models. At the same time, the method performs well in algorithm integration, supporting many different types of algorithms working together, which further improves the performance and applicability of the model. In the experimental part, we performed several performance tests on the method. The results show that, compared with traditional model construction methods, this method makes more efficient use of computing resources and greatly shortens model training time. In addition, the system-generated model service interface has been optimized for high availability and scalability and can adapt to the needs of different application scenarios.
Keywords: low-code, model building, artificial intelligence, algorithm integration, model deployment
Procedia PDF Downloads 29
18651 Effect of Sand Particle Distribution in Oil and Gas Pipeline Erosion
Authors: Christopher Deekia Nwimae, Nigel Simms, Liyun Lao
Abstract:
Erosion in pipe bends caused by particles is a major obstacle in oil and gas fields and can cause the breakdown of production equipment. This work studied the effects of flow velocity and solid-particle diameter on erosion in an elbow; the predicted erosion rate was verified against experimental data using a computational fluid dynamics (CFD) approach. A two-way coupled Euler-Lagrange discrete phase model was employed to calculate the air/solid-particle flow in the elbow. One erosion model and three particle-rebound models were used to predict the erosion rate on 90° elbows. The generic erosion model was used in the CFD-based erosion model, and comparison with experimental data showed good agreement with the CFD-based predictions.
Keywords: erosion, prediction, elbow, computational fluid dynamics
Procedia PDF Downloads 157
18650 6D Posture Estimation of Road Vehicles from Color Images
Authors: Yoshimoto Kurihara, Tad Gonsalves
Abstract:
In the field of object posture estimation, current research estimates the position and angle of an object by storing a 3D model of the object in a computer in advance and matching it against observations. In this research, however, we have succeeded in creating a module that is much simpler, smaller in scale, and faster in operation. Our 6D pose estimation model consists of two different networks: a classification network and a regression network. From a single RGB image, the trained model estimates the class of the object in the image, the coordinates of the object, and its rotation angle in 3D space. In addition, we compared the estimation accuracy for each camera position, i.e., the angle from which the object was captured. The highest accuracy was recorded when the camera position was 75°, where the accuracy of classification was about 87.3% and that of regression was about 98.9%.
Keywords: 6D posture estimation, image recognition, deep learning, AlexNet
Procedia PDF Downloads 155
18649 A Robust Optimization Model for Multi-Objective Closed-Loop Supply Chain
Authors: Mohammad Y. Badiee, Saeed Golestani, Mir Saman Pishvaee
Abstract:
In recent years, consumers and governments have increasingly pushed companies to design their activities so as to reduce negative environmental impacts, whether by producing renewable products or by adopting threat-free disposal policies. It is therefore important to optimize the various aspects of the total supply chain more accurately. Modeling a supply chain can be a challenging process because a large number of factors need to be considered in the model. Multi-objective optimization can help overcome these problems, since more information is used when designing the model. Uncertainty is inevitable in the real world; considering uncertainty in parameters, in addition to using multiple objectives, gives more flexibility to the decision-making process, since the process can take into account many more constraints and requirements. In this paper, we demonstrate a stochastic, scenario-based robust model to cope with uncertainty in a closed-loop multi-objective supply chain. By applying the proposed model to a real-world case, its power in handling data uncertainty is shown.
Keywords: supply chain management, closed-loop supply chain, multi-objective optimization, goal programming, uncertainty, robust optimization
Procedia PDF Downloads 416
18648 Generalized Additive Model Approach for the Chilean Hake Population in a Bio-Economic Context
Authors: Selin Guney, Andres Riquelme
Abstract:
The traditional bio-economic method for fisheries modeling uses estimates of the growth parameters and the system carrying capacity from a biological model of the population dynamics (usually a logistic population growth model), which is then analyzed as a traditional production function. The stock dynamic is transformed into a revenue function and compared with the extraction costs to estimate the maximum economic yield. In this paper, the logistic population growth model is combined with a forecast of the abundance and location of the stock using a generalized additive model (GAM) approach, focusing on the Chilean hake population. This method allows for the incorporation of climatic variables and interactions with other marine species, which in turn increases the reliability of the estimates and generates better extraction paths for different conservation objectives, such as the maximum biological yield or the maximum economic yield.
Keywords: bio-economic, fisheries, GAM, production
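The logistic surplus-production model that underlies this kind of bio-economic analysis fixes the maximum sustainable yield analytically at a biomass of K/2, giving MSY = rK/4. A short numerical check, using assumed r and K rather than the paper's Chilean hake estimates:

```python
import numpy as np

# Logistic surplus-production model: yield(B) = r * B * (1 - B / K)
r, K = 0.4, 1000.0           # intrinsic growth rate and carrying capacity (assumed)

biomass = np.linspace(0.0, K, 10001)
surplus = r * biomass * (1.0 - biomass / K)

b_msy = biomass[np.argmax(surplus)]   # biomass giving the max sustainable yield
msy = surplus.max()

print(b_msy, msy)  # analytically K/2 = 500 and r*K/4 = 100
```

The bio-economic step then compares this yield curve, converted to revenue, against extraction costs to locate the maximum economic yield, which lies at a higher biomass than B_MSY whenever costs rise as the stock thins.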
Procedia PDF Downloads 252
18647 The Relationship between the Epithermal Mineralization, Thermalism, and Basement Faults in the Region of Guelma: NE of Algeria
Authors: B. Merdas
Abstract:
The Guelma region constitutes a vast geothermal field whose local geothermal gradient is very high. Indeed, various thermal and thermo-mineral springs emerge in the region, some at relatively high temperatures. In the Mio-Pliocene Hammam N'bails basin, a hot spring emerges and has developed a thick series of thermal travertines linked to it. A very particular antimony, zinc, and lead mineralization has settled near the thermal emergences. The results of analyses of the thermal waters of the Hammam N'bails spring and of the associated travertines show anomalous values of Pb, Sb, Zn, As, and other metals, demonstrating the genetic link between these waters and the mineralization. By their mineral assemblage and their association with the hot springs, the Hammam N'bails mineralizations are very similar to epithermal precious-metal (gold and silver) deposits such as the Senator mine in Turkey or the Carlin-type deposits in Nevada (USA).
Keywords: hot springs, mineralization, basement faults, Guelma, NE Algeria
Procedia PDF Downloads 430
18646 Contractual Complexity and Contract Parties' Opportunistic Behavior in Construction Projects: In a Contractual Function View
Authors: Mengxia Jin, Yongqiang Chen, Wenqian Wang, Yu Wang
Abstract:
The complexity and specificity of construction projects have made opportunism a common phenomenon, and contractual governance of opportunism has been a topic of considerable ongoing research. Based on transaction cost economics (TCE), this research distinguishes control and coordination as different functions of the contract and investigates their complexity separately; in a nuanced way, the dimensionality of contractual control is also examined. Through an analysis of the motivation and capability behind strong- and weak-form opportunism, the framework focuses on the relationship between the complexity of the above contractual dimensions and different types of opportunistic behavior, and attempts to verify the possible explanatory mechanism. The explanatory power of the research model is evaluated in the light of empirical evidence from questionnaires. We collect data from Chinese companies in the construction industry; data collection is still in progress. The findings will speak to the debate surrounding the effects of contract complexity on opportunistic behavior. This nuanced research will derive implications for research on the role of contractual mechanisms in dealing with inter-organizational opportunism and offer suggestions for curbing contract parties' opportunistic behavior in construction projects.
Keywords: contractual complexity, contractual control, contractual coordination, opportunistic behavior
Procedia PDF Downloads 384
18645 A Model-Reference Sliding Mode for Dual-Stage Actuator Servo Control in HDD
Authors: S. Sonkham, U. Pinsopon, W. Chatlatanagulchai
Abstract:
This paper presents a method of designing and developing sliding mode control (SMC) for the servo system in a dual-stage actuator (DSA) hard disk drive. Mathematical models of the hard disk drive actuators are obtained by measuring the frequency responses of the voice-coil motor (VCM) and the PZT micro-actuator separately. Matlab software tools are used for model estimation as well as for controller design and simulation. A model-reference approach is selected as the proposed technique for the tracking requirement. The simulation results show that the model-reference SMC controller design for DSA servo control satisfies the tracking-error requirement and keeps the position of the head within +/-5% of the track width in the presence of internal and external disturbances. The overall results of the model-reference SMC design for the DSA meet the requirement specifications, and a significant reduction in percentage off-track is found compared to the single-stage actuator (SSA).
Keywords: hard disk drive, dual-stage actuator, track following, HDD servo control, sliding mode control, model reference, tracking control
Procedia PDF Downloads 365
18644 Stabilization Control of the Nonlinear AIDS Model Based on the Theory of Polynomial Fuzzy Control Systems
Authors: Shahrokh Barati
Abstract:
In this paper, we first introduce AIDS and propose a dynamic model that illustrates its progression. After a short history of nonlinear modeling by polynomial fuzzy systems, we consider the stability conditions of such systems; a large body of research exists on modeling and controlling AIDS in nonlinear dynamic form. In this approach, we use a framework for controlling any polynomial fuzzy model, a generalization of the Takagi-Sugeno (T-S) fuzzy model, and derive stability conditions based on polynomial functions in order to control the system more effectively. We then focus on designing an appropriate controller. First, we consider the equilibrium points of the system and their conditions and, in order to examine changes in the parameters, present the polynomial fuzzy model as a generalization of previous Takagi-Sugeno models. Using this framework, we evaluate the equations in both open loop and closed loop and, with the help of feedback control, calculate the closed-loop equations of the system. To simulate the nonlinear AIDS model, we use the polynomial fuzzy controller output, which is able to make the parameters of the nonlinear system properly follow a sustainable reference model.
Keywords: polynomial fuzzy, AIDS, nonlinear AIDS model, fuzzy control systems
Procedia PDF Downloads 468
18643 Using Daily Light Integral Concept to Construct the Ecological Plant Design Strategy of Urban Landscape
Authors: Chuang-Hung Lin, Cheng-Yuan Hsu, Jia-Yan Lin
Abstract:
Adopting a greenery approach on architectural bases is an indispensable strategy for improving ecological habitats, decreasing the heat-island effect, purifying air quality, and relieving surface runoff as well as noise pollution, all in an attempt to achieve a sustainable environment. Whether plant design can attain the best visual quality and ideal carbon dioxide fixation depends on whether greenery is used appropriately according to the nature of the architectural base. To achieve this goal, architects and landscape architects need sufficient local references. Current greenery studies focus mainly on the urban heat-island effect at large scale, and most architects still rely on people with years of expertise for the selection and disposition of plantings at the microclimate scale. Therefore, environmental design, which integrates science and aesthetics, requires fundamental research on landscape environment technology as distinct from building environment technology. By doing so, we can create mutual benefits between green buildings and the environment. This issue is extremely important for the greening design of the bases of green buildings in cities and various open spaces. The purpose of this study is to establish plant selection and allocation strategies under different building shading levels. Initially, with the shading of sunshine on the greened bases as the starting point, the effects of the shade produced by different building types on greening strategies were analyzed. Then, by measuring the PAR (photosynthetically active radiation), the relative DLI (daily light integral) was calculated and a DLI map was established in order to evaluate the effects of building shading on the established environmental greening, thereby serving as a reference for plant selection and allocation. The results are to be applied in the evaluation of the environmental greening of green buildings, establishing a "right plant, right place" design strategy of multi-level ecological greening for application in urban design and landscape design development, as well as greening criteria to feed back into eco-city green buildings.
Keywords: daily light integral, plant design, urban open space
Procedia PDF Downloads 511
18642 Vibration-Based Data-Driven Model for Road Health Monitoring
Authors: Guru Prakash, Revanth Dugalam
Abstract:
A road’s condition often deteriorates due to harsh loading, such as overloading by trucks, and severe environmental conditions, such as heavy rain, snow load, and cyclic loading. In the absence of proper maintenance planning, this results in potholes, wide cracks, bumps, and increased road roughness. In this paper, a data-driven model is developed to detect such damage using vibration and image signals. The key idea of the proposed methodology is that road anomalies manifest in these signals and can be detected by training a machine learning algorithm. The use of various machine learning techniques, such as the support vector machine and the random forest method, will be investigated. The proposed model will first be trained and tested with artificially simulated data, and the model architecture will be finalized by comparing the accuracies of the various models. Once a model is fixed, a field study will be performed and data will be collected. The field data will be used to validate the proposed model and to predict the road’s future health condition. The proposed model will help to automate the road condition monitoring process, repair cost estimation, and maintenance planning.
Keywords: SVM, data-driven, road health monitoring, pothole
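The pipeline described above — simulate vibration windows, extract features, and train a classifier — can be sketched in a few lines. The window length, feature set, and pothole spike model below are illustrative assumptions, not the authors' actual design.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

def make_window(pothole):
    """Simulate a window of vertical acceleration: road noise +/- a pothole impact."""
    signal = rng.normal(scale=0.1, size=128)          # baseline road noise
    if pothole:
        spike_at = rng.integers(20, 100)
        signal[spike_at:spike_at + 5] += rng.normal(2.0, 0.3, size=5)  # impact burst
    # Simple vibration features: RMS, peak amplitude, crest factor
    rms = np.sqrt(np.mean(signal ** 2))
    peak = np.max(np.abs(signal))
    return [rms, peak, peak / rms]

X = np.array([make_window(i % 2 == 1) for i in range(400)])
y = np.array([i % 2 for i in range(400)])              # 0 = smooth, 1 = pothole

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(clf.score(X_te, y_te))
```

On field data, the same structure applies: segment the accelerometer trace into windows, compute features per window, and let the trained model flag anomalous segments for maintenance planning.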
Procedia PDF Downloads 86
18641 An Integrated Intuitionistic Fuzzy ELECTRE Model for Multi-Criteria Decision-Making
Authors: Babek Erdebilli
Abstract:
The aim of this study is to develop and describe a new methodology for Multi-Criteria Decision-Making (MCDM) problems using an intuitionistic fuzzy ELECTRE (Elimination Et Choix Traduisant la Realite) model. The proposed model enables decision-makers (DMs) to carry out assessments using intuitionistic fuzzy numbers (IFNs). A numerical example is provided to demonstrate and clarify the proposed analysis procedure, and an empirical experiment is conducted to validate its effectiveness.
Keywords: multi-criteria decision-making, IFE, DMs, fuzzy ELECTRE model
Procedia PDF Downloads 651
18640 Computationally Efficient Electrochemical-Thermal Li-Ion Cell Model for Battery Management System
Authors: Sangwoo Han, Saeed Khaleghi Rahimian, Ying Liu
Abstract:
Vehicle electrification is gaining momentum, and many car manufacturers promise to deliver more electric vehicle (EV) models to consumers in the coming years. In controlling the battery pack, the battery management system (BMS) must maintain optimal battery performance while ensuring the safety of the pack. Tasks related to battery performance include determining state-of-charge (SOC), state-of-power (SOP), and state-of-health (SOH), cell balancing, and battery charging. Safety-related functions include making sure cells operate within their specified static and dynamic voltage windows and temperature range, derating power, detecting faulty cells, and warning the user if necessary. The BMS often utilizes an RC circuit model to model a Li-ion cell because of its robustness and low computation cost, among other benefits. Because an equivalent circuit model such as the RC model is not physics-based, it can never serve as a prognostic model to predict battery state-of-health and avoid safety risks before they occur. A physics-based Li-ion cell model, on the other hand, is more capable, at the expense of computation cost. To avoid the high computation cost associated with a full-order model, many researchers have demonstrated the use of a single particle model (SPM) for BMS applications. One drawback of the single-particle modeling approach is that it forces the use of the average current density in the calculation. The SPM is appropriate for simulating drive cycles in which there is insufficient time to develop a significant current distribution within an electrode; however, under a continuous or high-pulse electrical load, the model may fail to predict cell voltage or Li⁺ plating potential. To overcome this issue, a multi-particle reduced-order model is proposed here. The use of multiple particles, combined with either linear or nonlinear charge-transfer reaction kinetics, makes it possible to capture the current density distribution within an electrode under any type of electrical load. To keep the computational complexity comparable to that of an SPM, the governing equations are solved sequentially to minimize iterative solving processes. Furthermore, the model is validated against a full-order model implemented in COMSOL Multiphysics.
Keywords: battery management system, physics-based Li-ion cell model, reduced-order model, single-particle and multi-particle model
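The baseline RC equivalent circuit model that the abstract contrasts with physics-based models can be sketched in a few lines. Below is a first-order Thevenin (one RC pair) discharge simulation; the linear OCV curve and the R/C parameter values are assumed for illustration only, not fitted to any real cell.

```python
# First-order Thevenin (1-RC) cell model:
#   v_t = OCV(soc) - i * R0 - v1,   dv1/dt = -v1 / (R1 * C1) + i / C1
ocv = lambda soc: 3.0 + 1.2 * soc        # assumed linear OCV curve, 3.0-4.2 V
R0, R1, C1 = 0.01, 0.02, 2000.0           # ohmic / polarization parameters (assumed)
capacity_As = 5.0 * 3600.0                # 5 Ah cell in ampere-seconds

dt, soc, v1 = 1.0, 0.9, 0.0
current = 10.0                            # 2C discharge, amps (positive = discharge)
voltages = []
for _ in range(600):                      # simulate a 10-minute discharge, 1 s steps
    v1 += dt * (-v1 / (R1 * C1) + current / C1)   # forward-Euler RC dynamics
    soc -= dt * current / capacity_As             # coulomb counting
    voltages.append(ocv(soc) - current * R0 - v1)

print(round(voltages[0], 3), round(voltages[-1], 3))
```

A physics-based SPM replaces the RC pair with solid-phase diffusion and charge-transfer kinetics inside a representative particle, which is precisely what makes it predictive for degradation and Li⁺ plating where this circuit model is not.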
Procedia PDF Downloads 111
18639 Forecasting Model to Predict Dengue Incidence in Malaysia
Authors: W. H. Wan Zakiyatussariroh, A. A. Nasuhar, W. Y. Wan Fairos, Z. A. Nazatul Shahreen
Abstract:
Forecasting dengue incidence in a population can provide useful information to facilitate the planning of public health interventions. Many studies of dengue cases in Malaysia have been conducted, but few have modeled outbreaks and forecast incidence. This article proposes the most appropriate time series model to explain the behavior of dengue incidence in Malaysia for the purpose of forecasting future dengue outbreaks. Several seasonal auto-regressive integrated moving average (SARIMA) models were developed to model Malaysia's weekly dengue incidence, using data collected from January 2001 to December 2011. The SARIMA (2,1,1)(1,1,1)52 model was found to be the most suitable model for Malaysia's dengue incidence, with the lowest values of the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) for in-sample fitting. Out-of-sample forecast accuracy was further evaluated using four different accuracy measures. The results indicate that SARIMA (2,1,1)(1,1,1)52 performed well for both in-sample fitting and out-of-sample evaluation.
Keywords: time series modeling, Box-Jenkins, SARIMA, forecasting
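The model-selection step used above reduces to two formulas: AIC = 2k − 2 ln L and BIC = k ln n − 2 ln L, with the lowest value winning. The log-likelihoods below are hypothetical values invented only to illustrate the comparison; they are not the paper's fitted results.

```python
import numpy as np

def aic(log_lik, k):
    """Akaike information criterion: 2k - 2 ln L."""
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    """Bayesian information criterion: k ln n - 2 ln L."""
    return k * np.log(n) - 2 * log_lik

# Hypothetical in-sample log-likelihoods for two candidate SARIMA fits
# on n = 572 weekly observations (11 years x 52 weeks):
n = 572
candidates = {
    "SARIMA(1,1,1)(1,1,1)52": (-1805.2, 4),   # (log-likelihood, no. of parameters)
    "SARIMA(2,1,1)(1,1,1)52": (-1798.9, 5),
}
for name, (ll, k) in candidates.items():
    print(name, round(aic(ll, k), 1), round(bic(ll, k, n), 1))
```

Note that BIC penalizes the extra parameter more heavily than AIC (ln 572 ≈ 6.3 versus 2), so the two criteria can disagree; in the paper both happened to favor the SARIMA (2,1,1)(1,1,1)52 specification.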
Procedia PDF Downloads 486
18638 On the Utility of Bidirectional Transformers in Gene Expression-Based Classification
Authors: Babak Forouraghi
Abstract:
A genetic circuit is a collection of interacting genes and proteins that enables individual cells to implement and perform vital biological functions such as cell division, growth, death, and signaling. In cell engineering, synthetic gene circuits are engineered networks of genes specifically designed to implement functionalities that have not evolved in nature. These engineered networks enable scientists to tackle complex problems such as engineering cells to produce therapeutics within the patient's body, altering T cells to target cancer-related antigens for treatment, improving antibody production using engineered cells, tissue engineering, and the production of genetically modified plants and livestock. Constructing computational models to realize genetic circuits is an especially challenging task, since it requires discovering the flow of genetic information in complex biological systems. Building synthetic biological models is also a time-consuming process with relatively low prediction accuracy for highly complex genetic circuits. The primary goal of this study was to investigate the utility of a pre-trained bidirectional encoder transformer that can accurately predict gene expression in genetic circuit designs. The main reason for using transformers is their innate ability (the attention mechanism) to take account of the semantic context present in long DNA chains, which depends heavily on the spatial arrangement of the constituent genes. Previous approaches to gene circuit design, such as CNN and RNN architectures, are unable to capture semantic dependencies in long contexts, as required in most real-world applications of synthetic biology. For instance, RNN models (LSTM, GRU), although able to learn long-term dependencies, suffer greatly from vanishing gradients and low efficiency when they sequentially process past states and compress contextual information into a bottleneck with long input sequences. In other words, these architectures are not equipped with the attention mechanisms necessary to follow a long chain of genes with thousands of tokens. To address the above-mentioned limitations, a transformer model was built in this work as a variation on the existing DNA Bidirectional Encoder Representations from Transformers (DNABERT) model. It is shown that the proposed transformer is capable of capturing contextual information from long input sequences with an attention mechanism. In previous works on genetic circuit design, traditional approaches to classification and regression, such as Random Forest, Support Vector Machines, and Artificial Neural Networks, achieved reasonably high R² accuracy levels of 0.95 to 0.97. However, the transformer model utilized in this work, with its attention-based mechanism, achieved a perfect accuracy level of 100%. Furthermore, it is demonstrated that the efficiency of the transformer-based gene expression classifier does not depend on the presence of large amounts of training examples, which may be difficult to compile in many real-world gene circuit designs.
Keywords: machine learning, classification and regression, gene circuit design, bidirectional transformers
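The attention mechanism credited above can be sketched as scaled dot-product attention over a toy "gene token" sequence. The sequence length and embedding size below are arbitrary, and identity Q/K/V projections stand in for the learned ones in a real transformer such as DNABERT.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise token affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # numerically stable row softmax
    return weights @ V, weights

rng = np.random.default_rng(0)
seq_len, d_model = 6, 8              # six "gene tokens", 8-dimensional embeddings
X = rng.normal(size=(seq_len, d_model))

# In self-attention, Q, K, V are learned linear projections of the same sequence;
# identity projections are used here for brevity.
out, w = scaled_dot_product_attention(X, X, X)
print(out.shape, np.allclose(w.sum(axis=1), 1.0))
```

Because every token attends directly to every other token, the path length between distant genes is one step rather than thousands of recurrent steps, which is exactly the property the abstract contrasts with LSTM/GRU bottlenecks.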
Procedia PDF Downloads 61
18637 Phenological and Molecular Genetic Diversity Analysis among Saudi durum Wheat Landraces
Authors: Naser B. Almari, Salem S. Alghamdi, Muhammad Afzal, Mohamed Helmy El Shal
Abstract:
Wheat landraces are a rich genetic resource for improving agronomic qualities in breeding programs while also providing diversity and unique adaptation to local environmental conditions. These genotypes have grown increasingly important in the face of recent climate change challenges. This research examined the genetic diversity of Saudi durum wheat landraces using morpho-phenological and molecular data. Principal component analysis (PCA) showed that the first six PCs accounted for 78.47% of the total variance, with eigenvalues of 1.064 and above. The characters contributing most to the diversity in PC1 were the length of the awns at the tip relative to the length of the ear; culm glaucosity of the neck; flag leaf glaucosity of the sheath; flag leaf anthocyanin coloration of the auricles; frequency of plants with recurved flag leaves; ear length; and ear shape in profile. The genotypes contributing most to PC1 were 8, 14, 497, 650, 569, 590, 594, 598, 600, 601, and 604. The cluster analysis of the morphological data recorded a cophenetic correlation of 85.42 among the 22 wheat genotypes and grouped them into two main groups: the first contains 8 genotypes and the second 12, while two genotypes (13 and 497) stand alone in the dendrogram, not grouping with any other genotype. The second group was subdivided into two subgroups, with genotypes 14, 602, and 600 in the second sub-group. In the molecular analysis, the genotypes were again grouped into two main groups: the first contains 17 genotypes and the second 3 (8, 977, and 594), while genotype 602 stands alone, and genotypes 650 and 13 also stand alone within the first group. Using the Mantel test, a significant (R² = 0.0006) correlation between the phenotypic and genetic data was recorded among the 22 durum wheat genotypes.
Keywords: durum wheat, PCA, cluster analysis, SRAP, genetic diversity
Procedia PDF Downloads 115
18636 Optimization Model for Support Decision for Maximizing Production of Mixed Fresh Fruit Farms
Authors: Andrés I. Ávila, Patricia Aros, César San Martín, Elizabeth Kehr, Yovana Leal
Abstract:
Planning models for fresh products are very useful tools for improving net profits. To obtain an efficient supply chain model, several functions should be considered so as to achieve a complete simulation of the various operational units. We consider a linear programming model to help farmers decide what area should be planted with each of three kinds of export fruit, considering their future investment. The monthly based model incorporates area, investment, water, minimum productivity unit, and harvest restrictions to compute the average income over five years. Field conditions such as area, water availability, and initial investment are also required. Using Chilean costs and the dollar-peso exchange rate, we can simulate several scenarios to understand the possible risks associated with this market. The tool also helps support decisions for the government and for individual farmers.
Keywords: mixed integer problem, fresh fruit production, support decision model, agricultural and biosystems engineering
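The core area-allocation decision described above can be sketched as a small linear program; all incomes, resource coefficients, and limits below are hypothetical placeholder numbers, not the Chilean data used in the paper.

```python
from scipy.optimize import linprog

# Decision variables: hectares planted of three export fruits (hypothetical data)
income = [4.2, 3.5, 5.1]      # expected avg. income over 5 years, k$/ha
water  = [6.0, 4.0, 8.0]      # thousand m3 of water per ha per season
invest = [12.0, 8.0, 15.0]    # initial investment, k$/ha

c = [-v for v in income]      # linprog minimizes, so negate income to maximize
A_ub = [[1.0, 1.0, 1.0], water, invest]
b_ub = [100.0,                # total available area, ha
        550.0,                # water availability, thousand m3
        1100.0]               # available capital, k$
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.status, [round(x, 1) for x in res.x], round(-res.fun, 1))
```

With these numbers the water constraint binds and the solver splits the area between the two most water-efficient crops; the paper's actual model additionally carries monthly time steps, minimum-productivity and harvest constraints, and integer decisions (hence "mixed integer problem" in the keywords).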
Procedia PDF Downloads 438
18635 Analysis of the Impact of NVivo and EndNote on Academic Research Productivity
Authors: Sujit K. Basak
Abstract:
The aim of this paper is to analyze the impact of literature review software on researchers. This aim was achieved by analyzing models in terms of perceived usefulness, perceived ease of use, and acceptance level. The collected data were analyzed using WarpPLS 4.0 software. The study used two theoretical frameworks, namely the Technology Acceptance Model and the Training Needs Assessment Model. It was experimental and was conducted at a public university in South Africa. The results showed that acceptance level has the greatest impact on research workload and productivity, followed by perceived usefulness and perceived ease of use.
Keywords: technology acceptance model, training needs assessment model, literature review software, research productivity
Procedia PDF Downloads 503
18634 A Spatial Approach to Model Mortality Rates
Authors: Yin-Yee Leong, Jack C. Yue, Hsin-Chung Wang
Abstract:
Human longevity has been experiencing its largest increase since the end of World War II, and modeling mortality rates is therefore often the focus of many studies. Among all mortality models, the Lee–Carter model is the most popular approach, since it is fairly easy to use and has good accuracy in predicting mortality rates (e.g., for Japan and the USA). However, empirical studies from several countries have shown that the age parameters of the Lee–Carter model are not constant in time. Many modifications of the Lee–Carter model have been proposed to deal with this problem, including adding an extra cohort effect or another period effect. In this study, we propose a spatial modification and use clusters to explain why the age parameters of the Lee–Carter model are not constant. In spatial analysis, clusters are areas with unusually higher or lower mortality rates than their neighbors, where the “location” of a mortality rate is measured by age and time, that is, a two-dimensional coordinate. We use a popular cluster detection method, spatial scan statistics, a local statistical test based on the likelihood ratio, to evaluate where there are locations with mortality rates that cannot be described well by the Lee–Carter model. We first use computer simulation to demonstrate that the cluster effect is a possible source of the age parameters not being constant; we then show that adding the cluster effect can solve this problem. We also apply the proposed approach to mortality data from Japan, France, the USA, and Taiwan. The empirical results show that our approach yields better fits and smaller mean absolute percentage errors than the Lee–Carter model.
Keywords: mortality improvement, Lee–Carter model, spatial statistics, cluster detection
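The classical Lee–Carter decomposition ln m(x,t) = aₓ + bₓkₜ that this study modifies is usually estimated by a singular value decomposition of the centered log-mortality surface. A minimal sketch on synthetic data (the dimensions, parameter shapes, and noise level are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic log-mortality surface: ln m(x, t) = a_x + b_x * k_t + noise
ages, years = 20, 40
a = np.linspace(-8.0, -2.0, ages)               # mortality level by age
b = np.full(ages, 1.0 / ages)                    # age sensitivity (sums to 1)
k = np.linspace(5.0, -5.0, years)                # declining period index, mean zero
log_m = a[:, None] + b[:, None] * k[None, :] + rng.normal(scale=0.01, size=(ages, years))

# Lee-Carter estimation via SVD:
a_hat = log_m.mean(axis=1)                       # a_x = row (age) means
U, s, Vt = np.linalg.svd(log_m - a_hat[:, None], full_matrices=False)
b_hat = U[:, 0] / U[:, 0].sum()                  # normalize so sum(b) = 1
k_hat = s[0] * Vt[0] * U[:, 0].sum()             # rescale so b_hat * k_hat is the rank-1 fit

print(np.abs(b_hat - b).max(), np.abs(k_hat - k).max())
```

The residuals of this rank-1 fit, indexed by the (age, time) coordinate, are exactly where the paper's spatial scan statistic looks for clusters that the constant-in-time bₓ cannot absorb.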
Procedia PDF Downloads 171
18633 Nd³⁺: Si₂N₂O (Sinoite) Phosphors for White Light Emitting Diodes
Authors: Alparslan A. Balta, Hilmi Yurdakul, Orkun Tunckan, Servet Turan, Arife Yurdakul
Abstract:
Silicon oxynitride (Si₂N₂O), whose mineralogical name is “sinoite”, exhibits outstanding physical, mechanical, and thermal properties, e.g., good oxidation resistance at high temperatures, high fracture toughness due to its rod-shaped grains, high hardness, low theoretical density, and good thermal shock resistance owing to a low thermal expansion coefficient and high thermal conductivity. In addition, the orthorhombic crystal structure of Si₂N₂O can accommodate rare earth (RE) atoms along the “c” axis thanks to its large structural interstitial sites. Here, 0.02 to 0.12 wt. % Nd³⁺-doped Si₂N₂O samples were successfully synthesized by the spark plasma sintering (SPS) method at 30 MPa pressure and 1650 °C. Li₂O was also utilized as a sintering additive to take advantage of its low eutectic point during synthesis. The specimens were characterized in detail by scanning electron microscopy (SEM), transmission electron microscopy (TEM), X-ray diffraction (XRD), cathodoluminescence (CL) in the SEM, and photoluminescence (PL) spectroscopy. Based on the overall results, the Si₂N₂O phase content obtained by the SPS route was above 90%. Furthermore, the Nd³⁺: Si₂N₂O samples showed a very broad, intense emission peak between 400 and 700 nm, which corresponds to white light. Therefore, this material can be considered a promising candidate for white light-emitting diode (WLED) applications. This study was supported by TUBITAK under project number 217M667.
Keywords: neodymium, oxynitride, Si₂N₂O, WLEDs
Procedia PDF Downloads 138
18632 Impact of VARK Learning Model at Tertiary Level Education
Authors: Munazza A. Mirza, Khawar Khurshid
Abstract:
Individuals are generally associated with different learning styles, which have been explored extensively in the recent past. A learning style refers to the means by which an individual can most easily comprehend and retain information. Among the various learning style models, VARK is the most widely accepted; it categorizes learners with respect to their sensory characteristics. Based on the number of preferred learning modes, learners can be categorized as uni-modal, bi-modal, tri-modal, or quad/multi-modal. Although belief in learning styles is prevalent, the model is not frequently or effectively utilized in higher education. This research describes an identification model to validate the linkage between teachers’ didactic practices, students’ performance, and learning styles. The identification model is recommended for checking the effective application and evaluation of the various learning styles. The proposed model is a guideline for effectively implementing a learning styles inventory in order to ensure that it validates the linkage between performance and learning styles. If performance is linked with learning styles, this may help eradicate distrust of learning style theory. For this purpose, a comprehensive study was conducted to compare and understand how the VARK inventory model is being used to identify learning preferences and their correlation with learners’ performance. A comparative analysis of the findings of these studies is presented to understand the learning styles of tertiary students in various disciplines. It is concluded with confidence that the learning styles of students cannot be associated with any specific discipline. Furthermore, there is not enough empirical proof to link performance with learning styles.
Keywords: learning style, VARK, sensory preferences, identification model, didactic practices
Procedia PDF Downloads 278
18631 Applying the Extreme-Based Teaching Model in Post-Secondary Online Classroom Setting: A Field Experiment
Authors: Leon Pan
Abstract:
The first programming course within post-secondary education has long been recognized as a challenging endeavor for educators and students alike. Historically, these courses have exhibited high failure rates and a notable number of dropouts. Instructors often lament students’ lack of effort in their coursework, and students often express frustration that the teaching methods employed are not effective. Drawing inspiration from the successful principles of Extreme Programming, this study introduces the extreme-based teaching model, an approach aimed at enhancing the teaching of introductory programming courses. To empirically determine the effectiveness of the model, a comparison was made between a section taught using the extreme-based model and another utilizing traditional teaching methods. Notably, the extreme-based class required students to work collaboratively on projects while also demanding continuous assessment and performance enhancement within groups. This paper details the application of the extreme-based model within the post-secondary online classroom context and presents results that emphasize its effectiveness in advancing the teaching and learning experience. The extreme-based model led to a significant increase of 13.46 points in the weighted total average and a commendable 10% reduction in the failure rate.
Keywords: extreme-based teaching model, innovative pedagogical methods, project-based learning, team-based learning
Procedia PDF Downloads 59
18630 Thermodynamics of the Local Hadley Circulation Over Central Africa
Authors: Landry Tchambou Tchouongsi, Appolinaire Derbetini Vondou
Abstract:
This study describes the local Hadley circulation (HC) during the December–February (DJF) and June–August (JJA) seasons in Central Africa (CA), both from the divergent component of the mean meridional wind and from a new method based on the variation of the ψ vector. Historical data from the ERA5 reanalysis for the period 1983 to 2013 were used. The results show that the maximum of the upward branch of the local Hadley circulation in the DJF and JJA seasons is located over the Congo Basin (CB). However, seasonal and horizontal variations in the mean temperature gradient and thermodynamic properties are largely associated with the distribution of convection and large-scale upward motion. Thus, temperatures over the CB show a slight variation between the DJF and JJA seasons. Moreover, transport of moist static energy (MSE) adequately captures the mean flow component of the HC over the tropics. In addition, the divergence over the CB is enhanced by the presence of the western Cameroon low and the contribution of warm, dry air currents coming from the Sahara.
Keywords: circulation, reanalysis, thermodynamics, local Hadley circulation
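The mean meridional overturning that such diagnostics describe is conventionally summarized by the mass streamfunction ψ(φ, p) = (2πa cos φ / g) ∫₀ᵖ [v] dp′, where [v] is the zonal-mean meridional wind. A minimal numerical sketch on a toy wind field (the grid, the single-cell wind pattern, and all values are invented, not ERA5 data):

```python
import numpy as np

A_EARTH, G = 6.371e6, 9.81  # Earth radius (m), gravitational acceleration (m/s^2)

def meridional_streamfunction(v_mean, lats_deg, p_levels):
    """Mass streamfunction psi(lat, p) from the zonal-mean meridional wind.

    v_mean: array (n_lat, n_p) of zonal-mean v in m/s; p_levels in Pa,
    ordered from top of atmosphere down to the surface. Illustrative only.
    """
    coslat = np.cos(np.radians(lats_deg))[:, None]
    # Cumulative trapezoidal integration of v with respect to pressure.
    dp = np.diff(p_levels)
    v_mid = 0.5 * (v_mean[:, 1:] + v_mean[:, :-1])
    integral = np.concatenate(
        [np.zeros((v_mean.shape[0], 1)), np.cumsum(v_mid * dp, axis=1)], axis=1
    )
    return 2 * np.pi * A_EARTH * coslat / G * integral  # units: kg/s

# Toy example: one overturning cell between 15S and 15N, 100-1000 hPa.
lats = np.linspace(-15, 15, 31)
p = np.linspace(10000.0, 100000.0, 19)
v = np.sin(np.radians(lats))[:, None] * np.cos(np.pi * (p - p[0]) / (p[-1] - p[0]))[None, :]
psi = meridional_streamfunction(v, lats, p)
```

The sign convention (positive ψ for a clockwise cell in the latitude–pressure plane) follows the usual meteorological definition.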
Procedia PDF Downloads 89
18629 Functional Decomposition Based Effort Estimation Model for Software-Intensive Systems
Authors: Nermin Sökmen
Abstract:
An effort estimation model is needed for software-intensive projects that consist of hardware, embedded software, or some combination of the two, as well as high-level software solutions. This paper first focuses on functional decomposition techniques to measure the functional complexity of a computer system and investigates its impact on system development effort. Later, it examines the effects of technical difficulty and design team capability factors in order to construct the best effort estimation model. Using traditional regression analysis, the study develops a system development effort estimation model that takes functional complexity, technical difficulty, and design team capability as input parameters. Finally, the assumptions of the model are tested.
Keywords: functional complexity, functional decomposition, development effort, technical difficulty, design team capability, regression analysis
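A regression of the form effort = β₀ + β₁·FC + β₂·TD + β₃·TC can be fitted by ordinary least squares, as in the traditional regression analysis the abstract mentions. The sketch below uses invented project records and illustrative predictor scales, since the paper's data are not reproduced here:

```python
import numpy as np

# Hypothetical project records: functional complexity (FC), technical
# difficulty (TD), design team capability (TC) -> effort in person-months.
X = np.array([
    [120, 3, 8], [200, 5, 6], [80, 2, 9], [150, 4, 7],
    [300, 7, 5], [90, 3, 8], [250, 6, 6], [180, 4, 7],
], dtype=float)
effort = np.array([14.0, 31.0, 8.0, 20.0, 55.0, 11.0, 42.0, 26.0])

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, effort, rcond=None)
predicted = A @ coef
```

A real study would follow this with residual diagnostics to test the model assumptions, as the abstract's final step indicates.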
Procedia PDF Downloads 293
18628 Blood Volume Pulse Extraction for Non-Contact Photoplethysmography Measurement from Facial Images
Authors: Ki Moo Lim, Iman R. Tayibnapis
Abstract:
According to WHO estimates, 38 of the 56 million (68%) global deaths in 2012 were due to noncommunicable diseases (NCDs). One of the solutions for averting NCDs is early detection of disease. To that end, we developed a ‘U-Healthcare Mirror’, which is able to measure vital signs such as heart rate (HR) and respiration rate without any physical contact or conscious effort. To measure HR in the mirror, we utilized a digital camera that records red, green, and blue (RGB) discoloration from sequences of the user's facial images. We extracted the blood volume pulse (BVP) from the RGB discoloration because the discoloration of the facial skin is in accordance with the BVP. We used blind source separation (BSS) to extract the BVP from the RGB discoloration and adaptive filters to remove noise, implementing both the BSS and the adaptive filters with the singular value decomposition (SVD) method. HR was estimated from the obtained BVP. We conducted an experiment measuring HR with our method and with a previous method that used independent component analysis (ICA), and compared both against HR measurements from a commercial oximeter. The experiment was conducted at distances between 30 and 110 cm and light intensities between 5 and 2000 lux, with 7 measurements for each condition. The estimated HR showed a mean error of 2.25 bpm and a Pearson correlation coefficient of 0.73. The accuracy has improved compared to previous work. The optimal distance between the mirror and the user for HR measurement was 50 cm at medium light intensity, around 550 lux.
Keywords: blood volume pulse, heart rate, photoplethysmography, independent component analysis
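As an illustration of the SVD-based separation step, the following sketch recovers a synthetic pulse from noisy RGB channel means and reads the heart rate off the spectral peak. The sampling rate, mixing weights, and noise level are all invented for the example and are not the paper's values, and the adaptive filtering stage is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)
fs, seconds = 30, 20                      # 30 fps camera, 20 s window (assumed)
t = np.arange(fs * seconds) / fs
bvp_true = np.sin(2 * np.pi * 1.2 * t)    # 72 bpm pulse, purely synthetic

# RGB means of the face region: pulse mixed into each channel plus noise;
# the green channel is given the strongest weight, as is typical in rPPG.
mixing = np.array([0.3, 1.0, 0.6])
rgb = mixing[:, None] * bvp_true + 0.4 * rng.normal(size=(3, t.size))

# Standardise each channel, then separate sources with SVD (a simple
# stand-in for the BSS step described in the abstract).
z = (rgb - rgb.mean(axis=1, keepdims=True)) / rgb.std(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(z, full_matrices=False)
bvp_est = Vt[0]                           # dominant temporal component

# Heart rate from the spectral peak within a plausible band (42-180 bpm).
spectrum = np.abs(np.fft.rfft(bvp_est))
freqs = np.fft.rfftfreq(bvp_est.size, d=1 / fs)
band = (freqs > 0.7) & (freqs < 3.0)
hr_bpm = 60 * freqs[band][np.argmax(spectrum[band])]
```

The sign of the recovered component is arbitrary, which is why HR is taken from the spectrum magnitude rather than the waveform itself.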
Procedia PDF Downloads 329
18627 Application of an Analytical Model to Obtain Daily Flow Duration Curves for Different Hydrological Regimes in Switzerland
Authors: Ana Clara Santos, Maria Manuela Portela, Bettina Schaefli
Abstract:
This work assesses the performance of an analytical model framework for generating daily flow duration curves (FDCs) based on the climatic characteristics of catchments and on their streamflow recession coefficients. In the analytical framework, precipitation is considered a stochastic process, modeled as a marked Poisson process, and recession is considered deterministic, with parameters that can be computed from different models. The framework was tested on three case studies with different hydrological regimes located in Switzerland: pluvial, snow-dominated, and glacial. For that purpose, five time intervals were analyzed (the four meteorological seasons and the civil year) and two developments of the model were tested: one considering a linear recession model and the other adopting a nonlinear recession model. These developments were combined with recession coefficients obtained from two different approaches: forward and inverse estimation. The performance of the analytical framework with forward parameter estimation is poor in comparison with inverse estimation for both the linear and nonlinear models. For the pluvial catchment, inverse estimation shows exceptionally good results, especially for the nonlinear model, clearly suggesting that the model has the ability to describe FDCs. For the snow-dominated and glacial catchments, the seasonal results are better than the annual ones, suggesting that the model can describe streamflows in those conditions and that future efforts should focus on improving and combining seasonal curves instead of considering single annual ones.
Keywords: analytical streamflow distribution, stochastic process, linear and nonlinear recession, hydrological modelling, daily discharges
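The empirical daily FDC against which such an analytical model is assessed can be built by ranking the observed flows and assigning exceedance probabilities (Weibull plotting positions). A small sketch on synthetic flows, since the Swiss catchment data are not reproduced here:

```python
import numpy as np

def flow_duration_curve(daily_q):
    """Empirical FDC: discharge vs. exceedance probability (Weibull positions)."""
    q = np.sort(np.asarray(daily_q, dtype=float))[::-1]     # descending ranks
    exceedance = np.arange(1, q.size + 1) / (q.size + 1.0)  # P(Q >= q)
    return exceedance, q

# Hypothetical year of daily flows (gamma-distributed, purely illustrative).
rng = np.random.default_rng(42)
flows = rng.gamma(shape=2.0, scale=5.0, size=365)
p_exc, q_sorted = flow_duration_curve(flows)
q95 = q_sorted[np.searchsorted(p_exc, 0.95)]  # low-flow index Q95
```

Indices such as Q95 read directly off this curve are a common way to compare an analytical FDC with its empirical counterpart.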
Procedia PDF Downloads 162
18626 Robustified Asymmetric Logistic Regression Model for Global Fish Stock Assessment
Authors: Osamu Komori, Shinto Eguchi, Hiroshi Okamura, Momoko Ichinokawa
Abstract:
Long time-series data on population assessments are essential for global ecosystem assessment because the temporal change of biomass in such a database properly reflects the status of the global ecosystem. However, the available assessment data usually have limited sample sizes, and the ratio of populations with low biomass abundance (collapsed) to those with high abundance (non-collapsed) is highly imbalanced. To allow for the imbalance and uncertainty involved in ecological data, we propose a binary regression model with mixed effects for inferring ecosystem status through an asymmetric logistic model. In the estimation equation, we observe that the weights for the non-collapsed populations are relatively reduced, which in turn puts more importance on the small number of observations of collapsed populations. Moreover, we extend the asymmetric logistic regression model using propensity scores to allow for the sample biases observed in the labeled and unlabeled datasets. This robustifies the estimation procedure and improves the model fit.
Keywords: double robust estimation, ecological binary data, mixed effect logistic regression model, propensity score
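The core idea of down-weighting the abundant non-collapsed class in the logistic likelihood can be sketched with plain gradient ascent on simulated data. The weighting scheme and the data below are illustrative stand-ins; the paper's actual model additionally includes mixed effects, an asymmetric link, and propensity-score adjustment:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical imbalanced stock-status data: 1 = collapsed (rare), 0 = not.
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + covariate
true_w = np.array([-2.5, 1.8])
y = (rng.random(n) < 1 / (1 + np.exp(-X @ true_w))).astype(float)

# Class weights: keep collapsed observations at full weight and shrink the
# abundant non-collapsed class, mimicking the asymmetry in the abstract.
w_pos, w_neg = 1.0, y.mean() / (1 - y.mean())
sample_w = np.where(y == 1, w_pos, w_neg)

# Gradient ascent on the weighted logistic log-likelihood.
beta = np.zeros(2)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (sample_w * (y - p)) / n
    beta += 0.5 * grad
```

With the weighting applied, misclassifying a collapsed population costs relatively more than misclassifying a non-collapsed one, which is the intended effect.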
Procedia PDF Downloads 266
18625 Investigating the Challenges Faced by English Language Teachers in Implementing the Outcome Based Education Model in Engineering Universities of Sindh
Authors: Habibullah Pathan
Abstract:
The present study aims to explore the problems faced by English Language Teachers (ELT) while implementing the Outcome Based Education (OBE) model in the engineering universities of Sindh. OBE is an emerging model and an initiative of the International Engineering Alliance. Traditional educational systems are teacher-centered or curriculum-centered, and learners in them are often not able to achieve the desired outcomes, whereas the OBE model enables learners to know the outcomes before the start of the program. OBE is a circular process that begins with the needs and demands of society and its stakeholders, who ask the experts to produce alumni who can fulfill those needs, and ends with new enrollment in the respective programs of students who can work according to those demands. In engineering institutions, engineering courses as well as English language courses are taught on the OBE model. English language teachers were interviewed to explore their problems in depth. The study found that teachers were facing problems related to pedagogy, OBE training, assessment, evaluation, and administrative support. This study will serve as a guide for public- and private-sector English language teachers coping with these challenges while teaching English on the OBE model.
Keywords: problems of ELT teachers, outcome based education (OBE), implementing, assessment
Procedia PDF Downloads 98