Search results for: vector error correction model (VECM)
15804 Developing an Audit Quality Model for an Emerging Market
Authors: Bita Mashayekhi, Azadeh Maddahi, Arash Tahriri
Abstract:
The purpose of this paper is to develop a model for audit quality with regard to the contextual and environmental attributes of the audit profession in Iran. For this purpose, using an exploratory approach, and because of the special attributes of the auditing profession in Iran in terms of the legal environment, regulatory and supervisory mechanisms, audit firm size, etc., we used the grounded theory approach as a qualitative research method. We therefore gathered the opinions of experts in the auditing and capital market areas through unstructured interviews. As a result, the authors identified the determinants of audit quality and, using these determinants, developed an Integrated Audit Quality Model, including causal conditions, intervening conditions, context, as well as action strategies related to AQ and their consequences. In this research, audit quality is studied using a systemic approach: the quality of the inputs, processes, and outputs of auditing determines the quality of auditing, so the quality of all parts of this system is considered.
Keywords: audit quality, integrated audit quality model, demand for audit service, supply of audit, grounded theory
Procedia PDF Downloads 286
15803 Defining Methodology for Multi Model Software Process Improvement Framework
Authors: Aedah Abd Rahman
Abstract:
Software organisations may implement single or multiple frameworks in order to remain competitive. There is a wide selection of generic Software Process Improvement (SPI) frameworks, best practices, and standards, implemented with different focuses and goals. Issues and difficulties emerge in SPI practice in the context of software development and IT Service Management (ITSM). This research looks into the integration of multiple frameworks from the perspective of software development and ITSM. The research question of this study is how to define the steps of a methodology to solve the multi model software process improvement problem. The objective is to define the research approach and methodologies to produce a more integrated and efficient Multi Model Process Improvement (MMPI) solution. A multi-step methodology is used which comprises a case study, framework mapping, and a Delphi study. The research outcome demonstrates the usefulness and appropriateness of the proposed framework for SPI and quality practice in the Malaysian software industry. This mixed-method research approach is used to tackle problems from every angle in the context of software development and services, and to facilitate the implementation and management of a multi model environment of SPI frameworks across multiple domains.
Keywords: Delphi study, methodology, multi model software process improvement, service management
Procedia PDF Downloads 263
15802 Quantification of the Variables of the Information Model for the Use of School Terminology from 1884 to 2014 in Dalmatia
Authors: Vinko Vidučić, Tanja Brešan Ančić, Marijana Tomelić Ćurlin
Abstract:
Prior to quantifying the variables of the information model for using school terminology in Croatia's region of Dalmatia from 1884 to 2014, the most relevant model variables had to be determined: historical circumstances, standard of living, education system, linguistic situation, and media. The research findings show that there was no significant transfer of the 1884 school terms into 1949 usage; likewise, the 1949 school terms were not widely used in 2014. On the other hand, the research revealed that the meaning of school terms changed over the decades. The quantification of the variables will serve as the groundwork for creating an information model for using school terminology in Dalmatia from 1884 to 2014 and for defining direct growth rates in further research.
Keywords: education system, historical circumstances, linguistic situation, media, school terminology, standard of living
Procedia PDF Downloads 220
15801 Rail Degradation Modelling Using ARMAX: A Case Study Applied to Melbourne Tram System
Authors: M. Karimpour, N. Elkhoury, L. Hitihamillage, S. Moridpour, R. Hesami
Abstract:
Rail transportation authorities need a superior understanding of rail track degradation over time and of the factors influencing it. They need an accurate technique to identify when rail tracks fail or need maintenance; in turn, this helps increase the level of safety and comfort of the passengers and the vehicles, as well as improve the cost effectiveness of maintenance activities. An accurate model can play a key role in predicting the long-term behaviour of railroad tracks and can decrease the cost of maintenance. In this research, rail track degradation is predicted using an autoregressive moving average model with exogenous input (ARMAX). The ARMAX model has been applied to Melbourne tram data to estimate tram track degradation. Gauge values and rail usage in Million Gross Tonnes (MGT) are the main parameters used in the model. The developed model can accurately predict the future status of the tram tracks.
Keywords: ARMAX, dynamic systems, MGT, prediction, rail degradation
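A minimal sketch of the ARMAX idea behind this abstract: degradation at each period depends on its own previous value (AR part), on the exogenous traffic load in MGT (X part), and on the previous forecast error (MA part). The coefficients and data here are hypothetical placeholders, not the paper's fitted Melbourne values.

```python
# One-step-ahead ARMAX(1, 1) forecast of track gauge degradation:
#   y_t = a*y_{t-1} + b*x_t + c*e_{t-1} + e_t,  x_t = rail usage in MGT.
# All coefficients and observations below are illustrative.

def armax_predict(y_prev, x_now, e_prev, a=0.9, b=0.05, c=0.3):
    """One-step-ahead ARMAX(1,1) forecast (current noise e_t taken as 0)."""
    return a * y_prev + b * x_now + c * e_prev

history = [1.00, 1.12, 1.25]   # gauge deviation (mm), hypothetical
mgt = [2.1, 2.3, 2.2]          # traffic load per period (MGT), hypothetical
errors = [0.0]
preds = []
for t in range(1, len(history)):
    pred = armax_predict(history[t - 1], mgt[t], errors[-1])
    preds.append(pred)
    errors.append(history[t] - pred)   # observed minus predicted
```

In practice the coefficients would be estimated from the measured gauge series rather than fixed by hand.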
Procedia PDF Downloads 253
15800 The Influence of Contact Models on Discrete Element Modeling of the Ballast Layer Subjected to Cyclic Loading
Authors: Peyman Aela, Lu Zong, Guoqing Jing
Abstract:
Recently, there has been growing interest in numerical modeling of ballasted railway tracks. A commonly used mechanistic modeling approach for ballast is the discrete element method (DEM). Up to now, the effects of the contact model on ballast particle behavior have not been precisely examined. Selecting the appropriate contact model is mainly associated with the particle characteristics and the loading condition. Since ballast is a cohesionless material, different contact models, including the linear spring, Hertz-Mindlin, and hysteretic models, can be used to calculate particle-particle or wall-particle contact forces. Moreover, the simulation of a dynamic test is vital to investigate the effect of damping parameters on ballast deformation. In this study, ballast box tests were simulated by DEM to examine the influence of different contact models on the mechanical behavior of the ballast layer under cyclic loading. This paper shows how the contact model can affect the deformation and damping of a ballast layer subjected to cyclic loading in a ballast box.
Keywords: ballast, contact model, cyclic loading, DEM
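The Hertz(-Mindlin) normal contact law named above can be sketched as follows. The particle radii and elastic constants are illustrative values, not calibrated ballast properties.

```python
import math

# Hertz normal contact force between two spherical particles, as commonly
# used in DEM ballast models:
#   F_n = (4/3) * E_eff * sqrt(R_eff) * overlap**1.5
# with effective modulus 1/E_eff = (1-nu1^2)/E1 + (1-nu2^2)/E2 and
# effective radius R_eff = R1*R2/(R1+R2). Values below are illustrative.

def hertz_normal_force(overlap, r1, r2, e1, e2, nu1, nu2):
    if overlap <= 0.0:              # particles not touching: no force
        return 0.0
    e_eff = 1.0 / ((1 - nu1**2) / e1 + (1 - nu2**2) / e2)
    r_eff = (r1 * r2) / (r1 + r2)
    return (4.0 / 3.0) * e_eff * math.sqrt(r_eff) * overlap**1.5

f_small = hertz_normal_force(1e-4, 0.02, 0.02, 50e9, 50e9, 0.2, 0.2)
f_large = hertz_normal_force(2e-4, 0.02, 0.02, 50e9, 50e9, 0.2, 0.2)
```

Note the force grows faster than linearly with overlap (exponent 3/2), which is exactly what distinguishes Hertzian contact from the linear-spring model the abstract compares it against.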
Procedia PDF Downloads 205
15799 Supplier Selection in a Scenario Based Stochastic Model with Uncertain Defectiveness and Delivery Lateness Rates
Authors: Abeer Amayri, Akif A. Bulgak
Abstract:
Due to today’s globalization and companies’ outsourcing practices, Supply Chain (SC) performance has become more dependent on the efficient movement of material among geographically dispersed places, where there is more chance for disruptions. One such disruption is the quality and delivery uncertainty of outsourcing. These uncertainties could render products unsafe and, as a number of recent examples show, companies may end up recalling their products. There is therefore a need for a methodology for selecting suppliers globally in view of the risks associated with low quality and late delivery. Accordingly, we developed a two-stage stochastic model that captures the risks associated with uncertainty in quality and delivery, as well as a solution procedure for the model. The stochastic model simultaneously optimizes supplier selection and purchase quantities under price discounts over a time horizon. In particular, we target global organizations with multiple sites and multiple overseas suppliers, where pricing is offered in the suppliers’ local currencies. The proposed methodology is applied to a case study of a US automotive company with two assembly plants and four potential global suppliers to illustrate how the model works in practice.
Keywords: global supply chains, quality, stochastic programming, supplier selection
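A toy, single-stage stand-in for the scenario-based evaluation described above: each scenario carries a probability plus uncertain defect and lateness rates, and suppliers are ranked by expected unit cost. The prices, penalty rates, and scenario data are invented for illustration; the paper's actual model is a richer two-stage program with purchase quantities and discounts.

```python
# Scenario-based expected-cost comparison of suppliers. Penalty rates are
# hypothetical costs per unit of defect rate / lateness rate.

DEFECT_PENALTY = 20.0
LATE_PENALTY = 10.0

def expected_cost(price, scenarios):
    """scenarios: list of (probability, defect_rate, lateness_rate)."""
    return sum(p * (price + DEFECT_PENALTY * d + LATE_PENALTY * l)
               for p, d, l in scenarios)

suppliers = {
    "S1": (9.0,  [(0.5, 0.02, 0.05), (0.5, 0.10, 0.20)]),  # cheap, risky
    "S2": (10.0, [(0.5, 0.01, 0.02), (0.5, 0.02, 0.05)]),  # dearer, reliable
}
best = min(suppliers, key=lambda s: expected_cost(*suppliers[s]))
```

The point of the example: the nominally cheaper supplier loses once quality and lateness risk are priced into the expectation.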
Procedia PDF Downloads 463
15798 A Study on Inference from Distance Variables in Hedonic Regression
Authors: Yan Wang, Yasushi Asami, Yukio Sadahiro
Abstract:
In urban areas, several landmarks may affect housing prices and rents, so hedonic analysis should employ distance variables corresponding to each landmark. Unfortunately, the estimated effects of distances to landmarks on housing prices are generally not consistent with the true prices. These distance variables may cause magnitude errors in regression, pointing to a problem of spatial multicollinearity. In this paper, we provide approaches for obtaining samples with less bias, and a method for locating a specific sampling area that avoids the multicollinearity problem in the case of two landmarks.
Keywords: landmarks, hedonic regression, distance variables, collinearity, multicollinearity
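With only two distance variables, the multicollinearity described above can be diagnosed before running the regression via the variance inflation factor, which in the two-regressor case reduces to VIF = 1/(1 − r²), r being the Pearson correlation between the distances. The sample distances below are hypothetical.

```python
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Distances (km) from sampled dwellings to two landmarks; because the
# landmarks sit close to each other, the two distances move almost together.
d_station = [0.5, 1.0, 1.8, 2.4, 3.1, 4.0]
d_park    = [0.7, 1.1, 2.0, 2.5, 3.3, 4.1]

r = pearson_r(d_station, d_park)
vif = 1.0 / (1.0 - r ** 2)   # VIF >> 10 signals severe collinearity
```

Samples where the two distances decorrelate (e.g. dwellings spread around rather than along the line joining the landmarks) drive the VIF down, which is the intuition behind the paper's sampling-area recommendation.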
Procedia PDF Downloads 454
15797 Flow Characterization in Complex Terrain for Aviation Safety
Authors: Adil Rasheed, Mandar Tabib
Abstract:
The paper describes the ability of a high-resolution Computational Fluid Dynamics model to predict terrain-induced turbulence and wind shear close to the ground. Various sensitivity studies to choose the optimal simulation setup for modeling the flow characteristics in a complex terrain are presented. The capabilities of the model are demonstrated by applying it to the Sandnessjøen Airport, Stokka, in Norway, an airport that is located in a mountainous area. The model is able to forecast turbulence in real time and trigger an alert when atmospheric conditions might result in high wind shear and turbulence.
Keywords: aviation safety, terrain-induced turbulence, atmospheric flow, alert system
Procedia PDF Downloads 418
15796 Mistuning in Radial Inflow Turbines
Authors: Valentina Futoryanova, Hugh Hunt
Abstract:
One of the common failure modes of diesel engine turbochargers is high cycle fatigue of the turbine wheel blades. Mistuning of the blades due to the casting process is believed to contribute to this failure mode. A laser vibrometer is used to characterize mistuning for a population of turbine wheels through analysis of the blade response to noise induced by a piezo speaker. The turbine wheel design under investigation is radial and is typically used in 6-12 L diesel engine applications. Amplitudes and resonance frequencies are reviewed and summarized. The study also includes test results for a paddle wheel that represents a perfectly tuned system and acts as a reference. A mass-spring model is developed for the paddle wheel, and the model's suitability is tested against the actual data. Randomization is applied to the stiffness matrix to model the mistuning effect in the turbine wheels. Experimental data is shown to be in good agreement with the model.
Keywords: vibration, radial turbines, mistuning, turbine blades, modal analysis, periodic structures, finite element
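The randomized-stiffness idea can be sketched on the simplest possible model: identical blade oscillators whose stiffness is scattered around a nominal value, so the blade-alone natural frequencies scatter around the tuned frequency. The nominal stiffness, mass, and scatter level below are illustrative, not the measured wheel's.

```python
import math
import random

# Mistuning sketch: randomize each blade's stiffness k_i = k0*(1 + delta_i)
# and compare the resulting single-DOF frequencies f_i = sqrt(k_i/m)/(2*pi)
# with the perfectly tuned frequency.

def blade_frequencies(k_nominal, mass, n_blades, scatter, seed=0):
    rng = random.Random(seed)
    ks = [k_nominal * (1.0 + rng.uniform(-scatter, scatter))
          for _ in range(n_blades)]
    return [math.sqrt(k / mass) / (2.0 * math.pi) for k in ks]

tuned = math.sqrt(1.0e6 / 0.01) / (2.0 * math.pi)   # perfectly tuned blade
freqs = blade_frequencies(1.0e6, 0.01, n_blades=12, scatter=0.02)
spread = max(freqs) - min(freqs)                    # mistuning signature
```

A full model would randomize the coupled stiffness matrix and extract its eigenfrequencies, but the frequency scatter shown here is the quantity the vibrometer measurements characterize.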
Procedia PDF Downloads 435
15795 Long Term Love Relationships Analyzed as a Dynamic System with Random Variations
Authors: Nini Johana Marín Rodríguez, William Fernando Oquendo Patino
Abstract:
In this work, we model a coupled system in which we explore the effects of steady and random behavior on a linear system, as an extension of the classic Strogatz model. This is exemplified by modeling a couple's love dynamics as a linear system of two coupled differential equations and studying its stability for four types of lovers, chosen as CC = 'Cautious-Cautious', OO = 'Only other feelings', OP = 'Opposites', and RR = 'Romeo the Robot'. We explore the effects of, first, introducing saturation, and second, adding a random variation to one of the CC-type lovers, which shapes his character; we thereby try to model how this variability influences the dynamics between love and hate in a long-run relationship. This work could also be useful for modeling other kinds of systems whose interactions can be modeled as linear systems with external or internal random influence. We found that the final results are not easy to predict, and a strong dependence on initial conditions appears, which is a signature of chaos.
Keywords: differential equations, dynamical systems, linear system, love dynamics
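A minimal sketch of the underlying linear system in the Strogatz tradition, dR/dt = aR + bJ and dJ/dt = cR + dJ, integrated with Euler steps; an optional Gaussian noise term on the first lover stands in for the paper's random variation. All coefficients are illustrative.

```python
import math
import random

def simulate(a, b, c, d, r0, j0, dt=0.01, steps=1000, noise=0.0, seed=1):
    """Euler integration of the coupled linear love-dynamics system."""
    rng = random.Random(seed)
    r, j = r0, j0
    for _ in range(steps):
        dr = (a * r + b * j + noise * rng.gauss(0.0, 1.0)) * dt
        dj = (c * r + d * j) * dt
        r, j = r + dr, j + dj    # simultaneous update from old values
    return r, j

# A stable spiral: eigenvalues of [[-0.2, 1], [-1, -0.2]] have negative real
# part, so the feelings decay toward mutual indifference at (0, 0).
r_end, j_end = simulate(-0.2, 1.0, -1.0, -0.2, r0=1.0, j0=0.0)
```

Setting `noise > 0` perturbs the trajectory each step, which is the simplest way to probe the sensitivity to random variation the abstract describes.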
Procedia PDF Downloads 358
15794 Evaluation of Liquefaction Potential of Fine Grained Soil: Kerman Case Study
Authors: Reza Ziaie Moayed, Maedeh Akhavan Tavakkoli
Abstract:
This research aims to investigate and evaluate the liquefaction potential of fine-grained soils, based on different methods, for a project in Kerman city. Examination of the damage caused by recent earthquakes shows that fine-grained soils play an essential role in the level of damage caused by soil liquefaction. However, previous investigations of liquefaction have paid limited attention to evaluating the cyclic resistance ratio of fine-grained soils, especially with the SPT method. Although using the standard penetration test (SPT) to find the liquefaction potential of fine-grained soil is not common, it can be a helpful method given its rapidity, serviceability, and availability. In the present study, the liquefaction potential is first determined from the soil's physical properties obtained from laboratory tests. Then, using the SPT test and its available criteria for evaluating the cyclic resistance ratio and the safety factor against liquefaction, the correction for the effect of fines content is applied, and the results are compared. The results show that using the SPT test for liquefaction is more accurate than using laboratory tests in most cases, due to the contribution of different physical parameters of the soil, which leads to an increase in the ultimate N₁(60,cs).
Keywords: liquefaction, cyclic resistance ratio, SPT test, clay soil, cohesive soils
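For context, SPT-based checks of this kind typically pair the cyclic resistance ratio with the Seed-Idriss simplified cyclic stress ratio, CSR = 0.65 (a_max/g)(σv/σ'v) r_d, and report FS = CRR/CSR. In the sketch below the numbers are invented, and CRR is taken as a given input rather than derived from the N₁(60,cs) correlations the paper uses.

```python
# Simplified-procedure bookkeeping for a liquefaction check.
# a_max_g: peak ground acceleration as a fraction of g;
# sigma_v / sigma_v_eff: total / effective vertical stress (kPa);
# r_d: depth-dependent stress reduction coefficient.

def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, r_d):
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * r_d

def safety_factor(crr, csr):
    return crr / csr

csr = cyclic_stress_ratio(a_max_g=0.3, sigma_v=100.0,
                          sigma_v_eff=60.0, r_d=0.95)
fs = safety_factor(crr=0.25, csr=csr)   # CRR assumed, not derived here
liquefiable = fs < 1.0                  # FS below 1 flags liquefaction risk
```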
Procedia PDF Downloads 106
15793 Analysis of Users’ Behavior on Book Loan Log Based on Association Rule Mining
Authors: Kanyarat Bussaban, Kunyanuth Kularbphettong
Abstract:
This research aims to create a model for analyzing student behavior in using library resources, based on a data mining technique, in the case of Suan Sunandha Rajabhat University. The model was created using association rules with the Apriori algorithm. Fourteen rules were found; when the rules were tested against a testing data set, the classification accuracy was 79.24 percent and the MSE was 22.91. The results show that the user behavior model built with the association rule technique can be used to manage library resources.
Keywords: behavior, data mining technique, Apriori algorithm, knowledge discovery
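A self-contained sketch of Apriori-style rule mining on toy loan records, where each transaction is the set of book categories one student borrowed. The thresholds and data are hypothetical, not the university's.

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Level-wise Apriori: grow itemsets, pruning those below min_support."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    freq, k = {}, 1
    current = [frozenset([i]) for i in items]
    while current:
        counts = {c: sum(1 for t in transactions if c <= t) for c in current}
        survivors = {c: v / n for c, v in counts.items() if v / n >= min_support}
        freq.update(survivors)
        k += 1
        current = list({a | b for a in survivors for b in survivors
                        if len(a | b) == k})
    return freq

def rules(freq, min_conf):
    """Emit (lhs, rhs, confidence) rules from the frequent itemsets."""
    out = []
    for itemset, sup in freq.items():
        if len(itemset) < 2:
            continue
        for r in range(1, len(itemset)):
            for lhs in map(frozenset, combinations(itemset, r)):
                conf = sup / freq[lhs]   # support(X∪Y) / support(X)
                if conf >= min_conf:
                    out.append((set(lhs), set(itemset - lhs), conf))
    return out

tx = [{"math", "stats"}, {"math", "stats", "cs"}, {"cs"}, {"math", "stats"}]
found = rules(frequent_itemsets(tx, min_support=0.5), min_conf=0.8)
```

On this toy data the only frequent pair is {math, stats}, yielding the two rules math→stats and stats→math, each with confidence 1.0.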
Procedia PDF Downloads 410
15792 Single-Element Simulations of Wood Material in LS-DYNA
Authors: Ren Zuo Wang
Abstract:
In this paper, in order to investigate the behavior of wood structures, the nonlinear wood material model in LS-DYNA is adopted. It is difficult and inefficient to conduct experiments on ancient wood structures, so the LS-DYNA software can be used instead to simulate their nonlinear responses. In LS-DYNA, there is a material model called *MAT_WOOD, or *MAT_143. This model simulates the single-element response of wood subjected to tension and compression in the parallel and perpendicular material directions. Comparison of the LS-DYNA numerical results with the exact solution demonstrates the accuracy and efficiency of the proposed simulation method.
Keywords: LS-DYNA, wood structure, single-element simulations, MAT_143
Procedia PDF Downloads 665
15791 Grid Connected Photovoltaic Micro Inverter
Authors: S. J. Bindhu, Edwina G. Rodrigues, Jijo Balakrishnan
Abstract:
A grid-connected photovoltaic (PV) micro inverter with good performance properties is proposed in this paper. The proposed inverter uses a quadrupler, giving higher efficiency and lower voltage stress across the diodes. The stress across the diodes used in the inverter section is considerably low in the proposed converter, and the protection scheme provided can eliminate errors due to faults. The proposed converter is controlled using the perturb and observe algorithm, so that voltage fluctuation can be reduced and the maximum power point attained. Finally, simulation and experimental results are presented to demonstrate the effectiveness of the proposed converter.
Keywords: DC-DC converter, MPPT, quadrupler, PV panel
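The perturb and observe loop mentioned above can be sketched as follows: perturb the operating voltage, observe the power change, keep the perturbation direction if power rose, otherwise reverse it. The P-V curve here is a toy quadratic with its maximum power point at 17 V, not a real panel model.

```python
def pv_power(v):
    """Hypothetical P-V curve (W): peak of 100 W at 17 V."""
    return max(0.0, 100.0 - (v - 17.0) ** 2)

def perturb_and_observe(v0=12.0, step=0.5, iterations=60):
    v, direction = v0, +1
    p_prev = pv_power(v)
    for _ in range(iterations):
        v += direction * step        # perturb
        p = pv_power(v)              # observe
        if p < p_prev:               # power dropped: reverse direction
            direction = -direction
        p_prev = p
    return v

v_mpp = perturb_and_observe()
```

Once near the peak, the operating point oscillates within one step of the maximum power point; a smaller step reduces this residual fluctuation, which is what the abstract's control objective refers to.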
Procedia PDF Downloads 846
15790 Text Mining of Twitter Data Using a Latent Dirichlet Allocation Topic Model and Sentiment Analysis
Authors: Sidi Yang, Haiyi Zhang
Abstract:
Twitter is a microblogging platform where millions of users daily share their attitudes, views, and opinions. Using a probabilistic Latent Dirichlet Allocation (LDA) topic model to discern the most popular topics in Twitter data is an effective way to analyze a large set of tweets and find a set of topics in a computationally efficient manner. Sentiment analysis provides an effective method to show the emotions and sentiments found in each tweet, and an efficient way to summarize the results in a clearly understood manner. The primary goal of this paper is to explore text mining, extracting and analyzing useful information from unstructured text using two approaches: LDA topic modelling and sentiment analysis, by examining Twitter plain-text data in English. These two methods allow people to mine data more effectively and efficiently. The LDA topic model and sentiment analysis can also be applied to provide insights in business and scientific fields.
Keywords: text mining, Twitter, topic model, sentiment analysis
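A deliberately tiny, lexicon-based stand-in for the sentiment-analysis step: score each tweet by counting words from hand-made positive and negative lists. Real pipelines use full lexicons or trained classifiers; the word lists and tweets here are invented.

```python
POSITIVE = {"good", "great", "love", "happy", "excellent"}
NEGATIVE = {"bad", "awful", "hate", "sad", "terrible"}

def sentiment(tweet):
    """Label a tweet by (positive word count) - (negative word count)."""
    words = tweet.lower().split()
    score = (sum(w in POSITIVE for w in words)
             - sum(w in NEGATIVE for w in words))
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

labels = [sentiment(t) for t in
          ["I love this great phone", "awful battery sad", "it is a phone"]]
```

The LDA step would run alongside this, assigning each tweet a topic mixture; combining the two gives per-topic sentiment summaries of the kind the abstract describes.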
Procedia PDF Downloads 181
15789 Analysis on the Need of Engineering Drawing and Feasibility Study on 3D Model Based Engineering Implementation
Authors: Parthasarathy J., Ramshankar C. S.
Abstract:
Engineering drawings these days play an important role in every part of an industry, and by and large they are influential in every phase of the product development process. Traditionally, drawings are used for communication in industry because they are the clearest way to represent product manufacturing information. Until recently, manufacturing activities were driven by engineering data captured in 2D paper documents or digital representations of those documents, and the need for engineering drawings has seemed inevitable. Engineering drawings are nevertheless disadvantageous in that they require re-entry of data throughout the manufacturing life cycle: this document-based approach is prone to errors and incurs costly re-entry of data at every stage. There is therefore a case for eliminating engineering drawings throughout the product development process and implementing 3D Model Based Engineering (3D MBE or 3D MBD). Adopting MBD appears to be the next logical step to continue reducing time-to-market and improve product quality. Ideally, by fully applying the MBD concept, the product definition will no longer rely on engineering drawings throughout the product lifecycle. This project addresses the need for engineering drawings and their influence in various parts of an industry, as well as the need to implement 3D Model Based Engineering, with its advantages and the technical barriers that must be overcome in order to implement it. The project also addresses the requirements of neutral formats and their realisation in order to implement the digital product definition principles in a lightweight format. In order to prove the concepts of 3D Model Based Engineering, the screw jack body part is demonstrated. At ZF Windpower Coimbatore Limited, 3D Model Based Definition has been implemented for the torque arm (machining and casting), steel tube, pinion shaft, cover, and energy tube.
Keywords: engineering drawing, model based engineering MBE, MBD, CAD
Procedia PDF Downloads 437
15788 A Bi-Objective Model to Address Simultaneous Formulation of Project Scheduling and Material Ordering
Authors: Babak H. Tabrizi, Seyed Farid Ghaderi
Abstract:
Concurrent planning of project scheduling and material ordering has been increasingly addressed in recent decades as an approach to improving project execution costs. We therefore consider this problem here, aiming to maximize schedule quality robustness in addition to minimizing the relevant costs. A bi-objective mathematical model is developed to formulate the problem; moreover, it is possible to utilize all-unit discounts for material purchasing. The problem is then solved by the ε-constraint method, and the Pareto front is obtained for a variety of robustness values. Finally, the applicability and efficiency of the proposed model are tested on different numerical instances.
Keywords: ε-constraint method, material ordering, project management, project scheduling
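The ε-constraint idea can be sketched on a toy instance: keep one objective (cost) as the minimization target, turn the other (robustness) into a constraint with floor ε, and sweep ε to trace the Pareto front. The candidate (cost, robustness) pairs below are hypothetical schedules, not the paper's instances.

```python
# Each candidate is a (cost, robustness) pair for one feasible schedule.
candidates = [(100, 0.50), (120, 0.70), (150, 0.90), (160, 0.85), (200, 0.95)]

def eps_constraint(points, eps):
    """Minimize cost subject to robustness >= eps."""
    feasible = [p for p in points if p[1] >= eps]
    return min(feasible, key=lambda p: p[0]) if feasible else None

front = []
for eps in (0.5, 0.7, 0.9, 0.95):   # sweep the robustness floor
    sol = eps_constraint(candidates, eps)
    if sol is not None and sol not in front:
        front.append(sol)
```

Note that (160, 0.85) never appears on the front: it is dominated by (150, 0.90), which the ε-sweep correctly filters out.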
Procedia PDF Downloads 298
15787 The Impact of Malicious Attacks on the Performance of Routing Protocols in Mobile Ad-Hoc Networks
Authors: Habib Gorine, Rabia Saleh
Abstract:
Mobile ad-hoc networks are a special type of wireless network that shares common security requirements with other networks, such as confidentiality, integrity, authentication, and availability, which need to be addressed in order to secure data transfer through the network. Their routing protocols are vulnerable to various malicious attacks that could have devastating consequences for data security. In this paper, three types of attacks, namely selfish, gray hole, and black hole attacks, are applied to the two most important routing protocols in MANETs, dynamic source routing and ad-hoc on-demand distance vector, in order to analyse and compare the impact of these attacks on network performance in terms of throughput, average delay, packet loss, and energy consumption, using the NS2 simulator.
Keywords: MANET, wireless networks, routing protocols, malicious attacks, wireless networks simulation
Procedia PDF Downloads 322
15786 Estimation of Soil Moisture at High Resolution through Integration of Optical and Microwave Remote Sensing and Applications in Drought Analyses
Authors: Donglian Sun, Yu Li, Paul Houser, Xiwu Zhan
Abstract:
California has experienced severe drought conditions in past years. In this study, drought conditions in California are analyzed using soil moisture anomalies derived from integrated optical and microwave satellite observations, along with auxiliary land surface data. Based on the U.S. Drought Monitor (USDM) classifications, three typical drought conditions were selected for the analysis: extreme drought conditions in 2007 and 2013, severe drought conditions in 2004 and 2009, and normal conditions in 2005 and 2006. Drought is defined as a negative soil moisture anomaly. To estimate soil moisture at high spatial resolution, three approaches are explored: the universal triangle model, which estimates soil moisture from the Normalized Difference Vegetation Index (NDVI) and Land Surface Temperature (LST); a basic model, which estimates soil moisture under different conditions with auxiliary data such as precipitation, soil texture, topography, and surface type; and a refined model, which uses accumulated precipitation and its lagging effects. It is found that the basic model agrees better with the USDM classifications than the universal triangle model, while the refined model, using precipitation accumulated from the previous summer to the current time, demonstrates the closest agreement with the USDM patterns.
Keywords: soil moisture, high resolution, regional drought, analysis and monitoring
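The universal triangle approach reduces, in its common regression form, to a low-order polynomial in NDVI and LST. The coefficients below are invented placeholders purely to show the functional form; in practice they are fitted per region against in-situ or microwave soil moisture.

```python
def triangle_soil_moisture(ndvi, lst_scaled, coeffs):
    """Polynomial form: sum of coeffs[i][j] * ndvi**i * lst_scaled**j.

    Both inputs are assumed normalized to [0, 1]; coeffs is a small
    matrix of fitted (here: hypothetical) regression coefficients.
    """
    return sum(c * ndvi**i * lst_scaled**j
               for i, row in enumerate(coeffs)
               for j, c in enumerate(row))

COEFFS = [[0.35, -0.20],    # hypothetical first-order fit:
          [0.10, -0.05]]    # wetter when NDVI high, drier when LST high

sm_wet = triangle_soil_moisture(ndvi=0.7, lst_scaled=0.2, coeffs=COEFFS)
sm_dry = triangle_soil_moisture(ndvi=0.2, lst_scaled=0.9, coeffs=COEFFS)
anomaly = sm_dry - sm_wet    # drought shows up as a negative anomaly
```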
Procedia PDF Downloads 142
15785 Coverage Probability Analysis of WiMAX Network under Additive White Gaussian Noise and Predicted Empirical Path Loss Model
Authors: Chaudhuri Manoj Kumar Swain, Susmita Das
Abstract:
This paper presents a detailed procedure for predicting a path loss (PL) model and its application to estimating the coverage probability in a WiMAX network. A hybrid approach is followed in predicting an empirical PL model of a 2.65 GHz WiMAX network deployed in a suburban environment. Data collection, statistical analysis, and regression analysis are the phases of operation in this approach, and the importance of each phase is discussed. The procedure for collecting data such as the received signal strength indicator (RSSI) through an experimental setup is demonstrated. From the collected data set, empirical PL and RSSI models are predicted with regression techniques. Furthermore, with the aid of the predicted PL model, essential parameters such as the PL exponent and the coverage probability of the network are evaluated. This research work may significantly assist in the deployment and optimisation of any cellular network.
Keywords: WiMAX, RSSI, path loss, coverage probability, regression analysis
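The regression step can be sketched with the standard log-distance form, PL(d) = PL(d₀) + 10 n log₁₀(d/d₀), fitting the exponent n by least squares on x = 10 log₁₀(d/d₀). The samples below are synthetic (built from a true n of 3.5), not the paper's 2.65 GHz measurements.

```python
import math

def fit_path_loss_exponent(samples, pl0, d0=1.0):
    """Least-squares fit of n in PL(d) = pl0 + 10*n*log10(d/d0).

    samples: list of (distance, path_loss_dB); the intercept pl0 (path
    loss at reference distance d0) is assumed known/measured.
    """
    xs = [10.0 * math.log10(d / d0) for d, _ in samples]
    ys = [pl - pl0 for _, pl in samples]
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

samples = [(10.0, 75.0), (100.0, 110.0), (1000.0, 145.0)]  # (d in m, PL in dB)
n = fit_path_loss_exponent(samples, pl0=40.0)
```

The fitted exponent then feeds the coverage-probability calculation: for a given transmit power and receiver threshold, a larger n shrinks the cell radius over which the received signal stays above threshold.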
Procedia PDF Downloads 183
15784 Causes of Variation Orders in the Egyptian Construction Industry: Time and Cost Impacts
Authors: A. Samer Ezeldin, Jwanda M. El Sarag
Abstract:
Variation orders are of great importance in any construction project. A variation order is defined as any change in the scope of works of a project: an addition, omission, or even modification. This paper investigates the variation orders that occur during construction projects in Egypt. The literature review compares the causes of variation orders in Egypt, Tanzania, Nigeria, Malaysia, and the United Kingdom, and classifies variation orders by cause into owner-related factors, consultant-related factors, and other factors. These classified events were introduced in a survey of 19 events to observe their frequency of occurrence and their time and cost impacts. The survey data was obtained from 87 participants, including clients, consultants, and contractors, and a database of 42 scenarios was created. A model is then developed to help project managers predict the frequency of variations, budget for any additional costs, and minimize any delays. Two experts, each with more than 25 years of experience, reviewed the model and verified that it worked effectively. The model was then validated on a residential compound completed in July 2016 to prove that it produces acceptable results.
Keywords: construction, cost impact, Egypt, time impact, variation orders
Procedia PDF Downloads 187
15783 Analysis of a Coupled Hydro-Sedimentological Numerical Model for the Western Tombolo of Giens
Authors: Yves Lacroix, Van Van Than, Didier Léandri, Pierre Liardet
Abstract:
The western tombolo of the Giens peninsula in southern France, known as Almanarre beach, is subject to coastal erosion. We are trying to use computer simulation in order to propose solutions to stop this erosion. Our aim was first to determine the main factors driving this erosion, and then to apply a coupled hydro-sedimentological numerical model based on observations and measurements performed on the site for decades. We gathered all available information and data about waves, winds, currents, tides, bathymetry, the coastline, and sediments concerning the site. These were divided into two sets: one devoted to calibrating a numerical model using the Mike 21 software, the other serving as a reference in order to numerically compare the present situation to what it could be if different types of underwater constructions were implemented. This paper presents the first part of the study: selecting and merging different sources into a coherent data basis, identifying the main erosion factors, and calibrating the coupled software model against the selected reference period. Our results show good calibration of the numerical model, with good fitting coefficients. They also show that winter south-western storm events, conjugated with depressive weather conditions, constitute a major factor of erosion, mainly due to wave impact in the northern part of the Almanarre beach. The combined impact of current and wind is shown to be negligible.
Keywords: Almanarre beach, coastal erosion, hydro-sedimentological, numerical model
Procedia PDF Downloads 380
15782 Design of Knowledge Management System with Geographic Information System
Authors: Angga Hidayah Ramadhan, Luciana Andrawina, M. Azani Hasibuan
Abstract:
Data can be the core of a decision if it is properly treated and processed: data is processed into information, and information into knowledge, to produce wisdom or a decision. Today, many organizations have not realized this, including the XYZ University Admission Directorate, executor of the national admission process called Seleksi Masuk Bersama (SMB), whose workers have so far relied only on intuition to make decisions. If the data were analyzed instead, the directorate could make well-founded decisions and sell as many PINs as possible to the student candidates registering for SMB. Therefore, a Knowledge Management System (KMS) with a Geographic Information System (GIS), using the 5C4C approach, is needed to process the organization's data into something more useful and to support decision-making. This information system processes data into information based on PIN sales data with 5C (Contextualized, Categorized, Calculated, Corrected, Condensed) and converts information into knowledge with 4C (Comparing, Consequence, Connection, Conversation). Through these steps, the data becomes useful for making decisions or building wisdom, resolving problems, and communicating, and it helps inexperienced employees learn more quickly. The system also eases viewing and visualization based on spatial data, with GIS functionality that can indicate events in each province using the indicators provided in the system. In addition, the system can capture tacit knowledge and convert it into explicit knowledge in an expert system, based on the problems identified from the consequences of information. With the system, each team can make decisions in the same structured way and, most importantly, based on actual events and data.
Keywords: 5C4C, data, information, knowledge
Procedia PDF Downloads 467
15781 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models
Authors: Rossella Arcucci, Luisa D'Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti
Abstract:
This work is the first step in a rather wide research activity, in collaboration with the Euro-Mediterranean Center on Climate Change, aimed at introducing scalable approaches into Ocean Circulation Models. We discuss the design and implementation of a parallel algorithm for solving the Variational Data Assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model previously proposed by the authors, which uses a Domain Decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of three distinct stages: requirements and source code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped onto GPU architectures.
Keywords: data assimilation, GPU architectures, ocean models, parallel algorithm
Procedia PDF Downloads 416
15780 Giftedness Cloud Model: A Psychological and Ecological Vision of Giftedness Concept
Authors: Rimeyah H. S. Almutairi, Alaa Eldin A. Ayoub
Abstract:
The aim of this study was to identify empirical and theoretical studies that explore giftedness theories and identification, in order to assess and synthesize the mechanisms, outcomes, and impacts of gifted identification models. We thus sought to provide an evidence-informed answer to how current giftedness theories work and how effective they are, in order to develop a model that incorporates the advantages of existing models while avoiding their disadvantages as much as possible. We conducted a systematic literature review (SLR); the disciplined analysis resulted in a final sample of 30 relevant studies. The results indicated that: (a) there is no uniform and consistent definition of giftedness; (b) researchers use several inconsistent criteria to identify the gifted; and (c) the detection of talent is largely limited to early ages, with obvious neglect of adults. This study contributes to the development of the Giftedness Cloud Model (GCM), defined as a model that attempts to interpret giftedness within an interactive psychological and ecological framework. The GCM aims to help the talented reach the core of giftedness and manifest talent in creative productivity or invention. Besides that, the GCM suggests classifying giftedness into four levels: mastery, excellence, creative productivity, and manifestation. In addition, the GCM presents an idea for distinguishing between talent and giftedness.
Keywords: giftedness cloud model, talent, systematic literature review, giftedness concept
Procedia PDF Downloads 170
15779 Finite Element Simulation of RC Exterior Beam-Column Joints Using Damage Plasticity Model
Authors: A. M. Halahla, M. H. Baluch, M. K. Rahman, A. H. Al-Gadhib, M. N. Akhtar
Abstract:
In the present study, a 3D simulation of a typical exterior reinforced concrete (RC) beam–column joint (BCJ) strengthened with a carbon fiber-reinforced plastic (CFRP) sheet is carried out. Numerical investigations are performed using nonlinear finite element (FE) analysis in the commercial FE software ABAQUS, incorporating the concrete damage plasticity (CDP) model: the concrete response is modelled with compressive behaviour and tension softening, the reinforcing steel as linear plastic with isotropic hardening, and the CFRP sheets with a linear elastic lamina material model. The numerical models developed in the present study are validated against results obtained from experiments under monotonic loading using a hydraulic jack in displacement-control mode. The experimental program includes casting deficient BCJs loaded to failure for both the un-strengthened and strengthened cases. The failure mode and deformation response of CFRP-strengthened and un-strengthened joints, and the propagation of damage in the components of the BCJ, are discussed. The finite element simulations are compared with the experimental results and are noted to yield reasonable agreement. The damage plasticity model was able to capture with good accuracy the ultimate load and the mode of failure of the beam–column joint.
Keywords: reinforced concrete, exterior beam-column joints, concrete damage plasticity model, computational simulation, 3-D finite element model
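The abstract does not give the CDP material curves; as a hedged sketch, the compressive response of concrete is often idealised by a Hognestad-type parabola, σ = f′c[2(ε/ε₀) − (ε/ε₀)²] up to the peak strain ε₀, which is one common way to tabulate hardening input for a CDP model (the f′c and ε₀ values below are illustrative assumptions, not the study's data):

```python
# Hognestad-type parabola for concrete in uniaxial compression, a common
# idealisation used to build CDP hardening tables. Illustrative only.

def hognestad_stress(eps, fc=30.0, eps0=0.002):
    """Compressive stress (MPa) for strain eps <= eps0:
    sigma = fc * (2*(eps/eps0) - (eps/eps0)**2)."""
    r = eps / eps0
    return fc * (2.0 * r - r * r)

# Tabulate a few points of the pre-peak hardening curve
for eps in (0.0005, 0.001, 0.0015, 0.002):
    print(eps, round(hognestad_stress(eps), 2))
```

The stress rises parabolically and reaches f′c exactly at ε₀; post-peak softening and the tension branch would be tabulated separately in an actual CDP input deck.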
Procedia PDF Downloads 389
15778 An Interlock Model of Friction and Superlubricity
Authors: Azadeh Malekan, Shahin Rouhani
Abstract:
Superlubricity is a phenomenon in which two surfaces in contact show negligible friction; this may be because the asperities of the two surfaces do not interlock. Two rough surfaces, when pressed against each other, can settle into a configuration where the summits of asperities of one surface lock into the valleys of the other surface. The amount of interlock depends on the geometry of the two surfaces. We suggest that the friction force may then be proportional to the amount of interlock; this explains superlubricity as the situation where there is little interlock. The friction force will then be directly proportional to the normal force, as it is related to the work necessary to lift the upper surface in order to clear the interlock. To investigate this model, we simulate the contact of two surfaces. In order to validate the model, we first examine Amontons' law: assuming that asperities retain deformations on the time scale over which the top asperity moves across the lattice spacing, Amontons' law is observed. Structural superlubricity is examined under the hypothesis that the surfaces are very rigid and there is no deformation of the asperities; this may happen at small normal forces. When two identical surfaces come into contact and the top surface is rotated, we observe a peak in the friction force near the angle of orientation at which the two surfaces can interlock.
Keywords: friction, Amontons' law, superlubricity, contact model
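As a hedged toy illustration of the idea (not the authors' simulation), interlock between two rigid one-dimensional rough profiles can be quantified as the extra lift needed to clear all asperities and slide, with friction taken proportional to that lift times the normal force:

```python
# Toy 1-D interlock measure on a periodic lattice. The rigid upper profile
# rests on the lower one; interlock is the lift needed to clear all
# asperities and translate freely. Illustrative only.

def resting_height(lower, upper, shift=0):
    """Height at which a rigid upper profile rests on the lower one."""
    n = len(lower)
    return max(lower[i] + upper[(i + shift) % n] for i in range(n))

def interlock(lower, upper, shift=0):
    """Lift needed to slide: the highest resting position over all
    shifts minus the current resting position."""
    n = len(lower)
    h_clear = max(resting_height(lower, upper, s) for s in range(n))
    return h_clear - resting_height(lower, upper, shift)

def friction_force(lower, upper, shift=0, normal_force=1.0, k=1.0):
    # Friction proportional to the work (normal force times lift)
    # needed to clear the interlock -> Amontons-like behaviour.
    return k * normal_force * interlock(lower, upper, shift)

lower = [0, 1, 0, 1]
upper = [1, 0, 1, 0]                           # summits fall into valleys
print(friction_force(lower, upper, shift=0))   # interlocked -> finite friction
print(friction_force(lower, upper, shift=1))   # summit-on-summit -> zero (superlubric)
```

In this caricature, friction scales linearly with the normal force (Amontons' law) and vanishes whenever the geometry leaves no interlock, which is the superlubric configuration.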
Procedia PDF Downloads 151
15777 Simultaneous versus Sequential Model in Foreign Entry
Authors: Patricia Heredia, Isabel Saz, Marta Fernández
Abstract:
This article proposes that the decision to export and the choice of export channel are nested, non-independent decisions. We assume that firms make two sequential decisions before arriving at their final choice: the decision to access foreign markets and the decision about the type of channel. This hierarchical perspective on the choices involved in the process is appealing for two reasons. First, it supports the idea that people have limited analytical capacity: managers often break a complex decision down into a hierarchical process because this makes it more manageable. Second, it recognizes that important differences exist between entry modes. In light of the above, the objective of this study is to test different entry-mode choice processes: independent decisions versus nested, non-independent decisions. To do this, we estimate and compare the following two models: (i) a simultaneous single-stage model with three entry-mode choices (using a multinomial logit model); and (ii) a two-stage model in which the export decision precedes the channel decision (using a sequential logit model). The study uses resource-based factors to determine these internationalization decision processes, and the empirical analysis draws on a sample of 177 DOC Rioja firms. Using the Akaike and Schwarz information criteria, the empirical evidence supports the existence of a nested structure, where the decision about exporting precedes the export-mode decision. The implications and contributions of the findings are discussed.
Keywords: sequential logit model, two-stage choice process, export mode, wine industry
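The model comparison rests on information criteria; as a simple hedged illustration, AIC = 2k − 2 ln L and the Schwarz criterion BIC = k ln n − 2 ln L can be computed and compared as below (the log-likelihoods and parameter counts are invented for illustration, not the study's estimates):

```python
import math

def aic(log_lik, k):
    """Akaike information criterion: 2k - 2*lnL (lower is better)."""
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    """Schwarz (Bayesian) information criterion: k*ln(n) - 2*lnL."""
    return k * math.log(n) - 2 * log_lik

# Hypothetical fits on n = 177 firms: a simultaneous multinomial logit
# vs. a two-stage sequential logit. Values are invented.
n = 177
models = {
    "simultaneous (MNL)": {"log_lik": -150.0, "k": 8},
    "sequential (two-stage)": {"log_lik": -140.0, "k": 10},
}
for name, m in models.items():
    print(name,
          "AIC =", round(aic(m["log_lik"], m["k"]), 1),
          "BIC =", round(bic(m["log_lik"], m["k"], n), 1))
```

With these invented numbers, the sequential model's better fit outweighs its extra parameters, so it wins on AIC; the same comparison on the real estimates is what drives the paper's conclusion.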
Procedia PDF Downloads 36
15776 Protection and Immune Responses of DNA Vaccines Targeting Virulence Factors of Streptococcus iniae in Nile Tilapia (Oreochromis niloticus)
Authors: Pattanapon Kayansamruaj, Ha Thanh Dong, Nopadon Pirarat, Channarong Rodkhum
Abstract:
Streptococcus iniae (SI) is a devastating bacterial pathogen causing heavy mortality in farmed fish. Commercial bacterin vaccines have reportedly failed, as outbreaks of new SI serotypes emerged in farms after vaccination and subsequently caused severe losses. In the present study, we attempted to develop effective DNA vaccines against SI infection using Nile tilapia (Oreochromis niloticus) as an animal model. Two monovalent DNA vaccines were constructed by inserting the coding sequences of genes encoding cell wall-associated virulence factors, eno (α-enolase) and mtsB (hydrophobic membrane protein), into a cytomegalovirus expression vector (pCI-neo). In the animal trial, 30-g Nile tilapia were injected intramuscularly with 15 µg of each vaccine (the mock-vaccine group was injected with naked pCI-neo) and maintained for 35 days prior to challenge with pathogenic SI at a dose of 10⁷ CFU/fish. At 13 days post-challenge, the relative percent survival of the pEno, pMtsB, and mock-vaccine groups was 57%, 45%, and 27%, respectively. The expression levels of immune response-associated genes, namely IL-1β, TNF-α, TGF-β, COX-2, IL-6, IL-12, and IL-13, were investigated in the spleens of experimental animals at 7 days post-vaccination (PV) and 7 days post-challenge (PC) using quantitative RT-PCR. Generally, at 7 days PV, the pEno-vaccinated group exhibited the highest level of up-regulation (1.7 to 2.9 fold) of every gene but TGF-β, compared to the pMtsB and mock-vaccine groups. However, at 7 days PC, the pEno group showed significant up-regulation (1.4 to 8.5 fold) of immune-related genes, similar to the mock-vaccine group, while the pMtsB group had the lowest level of up-regulation (0.7 to 3.3 fold).
In summary, this study indicated that the pEno and pMtsB vaccines could elicit immune responses in the fish, and the magnitude of gene expression at 7 days PV was consistent with the protection level conferred by each vaccine.
Keywords: gene expression, DNA vaccine, Nile tilapia, Streptococcus iniae
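The fold-change values reported above come from quantitative RT-PCR; one standard way to obtain such values (not necessarily the exact method the authors used) is the 2^−ΔΔCt calculation, sketched here with invented Ct values:

```python
# Relative expression by the 2^-ddCt method. All Ct values below are
# invented for illustration; 'target' stands for an immune gene
# (e.g. IL-1beta) and 'ref' for a housekeeping reference gene.

def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    d_ct_treated = ct_target_treated - ct_ref_treated   # normalise to reference
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control                 # relative to control fish
    return 2.0 ** (-dd_ct)

# Vaccinated fish amplify the target 3 cycles earlier (relative to the
# reference gene) than control fish do -> 8-fold up-regulation.
print(fold_change(22.0, 18.0, 27.0, 20.0))  # 8.0
```

A fold-change of 1.0 means no change relative to the mock group; values like the 1.7 to 2.9 fold range above correspond to a ΔΔCt between roughly −0.8 and −1.5 cycles.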
Procedia PDF Downloads 333
15775 The Effect of Feature Selection on Pattern Classification
Authors: Chih-Fong Tsai, Ya-Han Hu
Abstract:
The aim of feature selection (or dimensionality reduction) is to filter out unrepresentative features (or variables) so that the classifier performs better than one trained without feature selection. Although there are many well-known feature selection algorithms, and different classifiers based on different selection results may perform differently, very few studies have examined the effect of performing different feature selection algorithms on the classification performance of different classifiers over different types of datasets. In this paper, two widely used algorithms, the genetic algorithm (GA) and information gain (IG), are used to perform feature selection. In addition, three well-known classifiers are constructed: the CART decision tree (DT), the multi-layer perceptron (MLP) neural network, and the support vector machine (SVM). Based on 14 different types of datasets, the experimental results show that in most cases IG is a better feature selection algorithm than GA. In addition, the combinations of IG with DT and IG with SVM perform best and second best for small- and large-scale datasets, respectively.
Keywords: data mining, feature selection, pattern classification, dimensionality reduction
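Information gain, one of the two selection criteria compared above, ranks a feature by how much it reduces the entropy of the class labels; a minimal stdlib sketch for a discrete feature and binary labels (the toy data are illustrative, not the paper's datasets):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """IG(Y; X) = H(Y) - sum_v p(X=v) * H(Y | X=v)."""
    n = len(labels)
    cond = 0.0
    for v in set(feature):
        subset = [y for x, y in zip(feature, labels) if x == v]
        cond += (len(subset) / n) * entropy(subset)
    return entropy(labels) - cond

labels  = [0, 0, 1, 1]
perfect = [0, 0, 1, 1]   # perfectly predictive feature
useless = [0, 1, 0, 1]   # independent of the label
print(information_gain(perfect, labels))  # 1.0
print(information_gain(useless, labels))  # 0.0
```

Features would then be ranked by this score and the top-k kept before training the DT, MLP, or SVM; GA-based selection instead searches feature subsets directly using classifier performance as the fitness function.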
Procedia PDF Downloads 672