Search results for: Model of community management
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9926

7976 On the Mathematical Model of Vascular Endothelial Growth Connected with a Tumor Proliferation

Authors: N. Khatiashvili, Ch. Pirumova, V. Akhobadze

Abstract:

In this paper, a mathematical model of tumor growth is considered. The formation of a new capillary network, which supplies cancer cells with nutrients, is taken into account. A formula estimating tumor growth as a function of the number of capillaries is obtained.
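
The abstract does not state the governing equations. As a purely illustrative sketch, assuming logistic-type growth of the tumor volume V(t) with a carrying capacity that increases with the number of capillaries N(t) (the functional form and the symbols \alpha, \beta, \gamma, K_0, k are assumptions, not the paper's), such a coupled model could read

\frac{dV}{dt} = \alpha V \left( 1 - \frac{V}{K_0 + k N} \right), \qquad \frac{dN}{dt} = \beta V - \gamma N,

where \alpha is the intrinsic growth rate, \beta the rate of tumor-driven angiogenesis, and \gamma the capillary degradation rate; the formula obtained in the paper may differ.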

Keywords: Differential Equations, Mathematical Models, Vascular Endothelial, Tumor

7975 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models

Authors: Rossella Arcucci, Luisa D’Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti

Abstract:

This work is the first step in a rather wide research activity, in collaboration with the Euro-Mediterranean Center on Climate Change, aimed at introducing scalable approaches in Ocean Circulation Models. We discuss the design and implementation of a parallel algorithm for solving the Variational Data Assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model, previously proposed by the authors, which uses a Domain Decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of three distinct stages: requirements and source code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped onto GPU architectures.
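
For context, a standard 3DVar formulation (the generic form, not the DD-DA-specific decomposition described in the paper) minimizes the cost functional

J(x) = \frac{1}{2} (x - x_b)^{T} B^{-1} (x - x_b) + \frac{1}{2} (\mathcal{H}(x) - y)^{T} R^{-1} (\mathcal{H}(x) - y),

where x_b is the background state, y the observation vector, \mathcal{H} the observation operator, and B and R the background- and observation-error covariance matrices; in the DD-DA approach this minimization is split across subdomains.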

Keywords: Data Assimilation, Parallel Algorithm, GPU architectures, Ocean Models.

7974 Continual Learning Using Data Generation for Hyperspectral Remote Sensing Scene Classification

Authors: Samiah Alammari, Nassim Ammour

Abstract:

When a massive number of tasks is provided successively to a deep learning process, good model performance requires preserving the data of previous tasks so that the model can be retrained for each upcoming classification. Otherwise, the model performs poorly due to the catastrophic forgetting phenomenon. To overcome this shortcoming, we developed a successful continual learning deep model for remote sensing hyperspectral image region classification. The proposed neural network architecture encapsulates two trainable subnetworks. The first module adapts its weights by minimizing the discrimination error between the land-cover classes during new task learning, and the second module learns to replicate the data of the previous tasks by discovering the latent data structure of the new task dataset. We conduct experiments on the Indian Pines hyperspectral image (HSI) dataset. The results confirm the capability of the proposed method.
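
As an illustration only (the exact architecture, layer sizes, and class names below, e.g. ReplayAutoencoder, are hypothetical and not taken from the paper), a two-subnetwork continual-learning setup of this kind might be sketched in PyTorch as:

import torch
import torch.nn as nn

class Classifier(nn.Module):
    # First subnetwork: discriminates the land-cover classes of the current task.
    def __init__(self, n_bands, n_classes, hidden=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_bands, hidden), nn.ReLU(),
                                 nn.Linear(hidden, n_classes))

    def forward(self, x):
        return self.net(x)

class ReplayAutoencoder(nn.Module):
    # Second subnetwork: learns to reproduce (replay) data of earlier tasks.
    def __init__(self, n_bands, latent=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(n_bands, latent), nn.ReLU())
        self.dec = nn.Linear(latent, n_bands)

    def forward(self, x):
        return self.dec(self.enc(x))

During each new task, the classifier would be trained on a mix of real new-task pixels and pseudo-samples reconstructed by the autoencoder; this is the generic generative-replay idea, and the authors' actual mechanism may differ.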

Keywords: Continual learning, data reconstruction, remote sensing, hyperspectral image segmentation.

7973 Conventional and PSO Based Approaches for Model Reduction of SISO Discrete Systems

Authors: S. K. Tomar, R. Prasad, S. Panda, C. Ardil

Abstract:

The reduction of Single Input Single Output (SISO) discrete systems to a lower-order model, using a conventional and an evolutionary technique, is presented in this paper. The conventional technique combines the advantages of the Modified Cauer Form (MCF) and differentiation. In this method, the original discrete system is first converted into an equivalent continuous system by applying the bilinear transformation. The denominator of the equivalent continuous system and its reciprocal are differentiated successively, and the reduced denominator of the desired order is obtained by combining the differentiated polynomials. The numerator is obtained by matching the quotients of the MCF. The reduced continuous system is then converted back into a discrete system using the inverse bilinear transformation. In the evolutionary technique, Particle Swarm Optimization (PSO) is employed to reduce the higher-order model. The PSO method is based on the minimization of the Integral Squared Error (ISE) between the transient responses of the original higher-order model and the reduced-order model for a unit step input. Both methods are illustrated through a numerical example.
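
For reference, the bilinear transformation used for the discrete-to-continuous conversion and the ISE criterion minimized by PSO take the standard forms (the variable convention z, w is assumed here, since the abstract does not state it):

z = \frac{1 + w}{1 - w} \;\Longleftrightarrow\; w = \frac{z - 1}{z + 1}, \qquad ISE = \int_{0}^{\infty} \left[ y(t) - y_r(t) \right]^{2} dt,

where y(t) and y_r(t) are the unit-step responses of the original and the reduced-order model, respectively.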

Keywords: Discrete System, Single Input Single Output (SISO), Bilinear Transformation, Reduced Order Model, Modified Cauer Form, Polynomial Differentiation, Particle Swarm Optimization, Integral Squared Error.

7972 Urban Flood Control and Management - An Integrated Approach

Authors: Ranjan Sarukkalige, Joseph Sanjaya Ma

Abstract:

Flood management is one of the important fields in urban storm water management. Floods are influenced by the increase in large storm events or by improper planning of the area. This study considers flood protection in four stages: planning, the flood event, responses, and evaluation. Flood protection is most effective when it is considered in the planning/design and evaluation stages, since both stages shape the land development of the area. Structural adjustments are often more reliable than non-structural adjustments in providing flood protection; however, structural adjustments are constrained by numerous factors such as political constraints and cost. Therefore, it is important to balance both types of adjustment against the situation. The technical decisions provided have to be approved by decision makers who have the power to decide on the final solution, and costs are often the biggest factor in determining the final decision. This study therefore recommends that the flood protection system be integrated and enforced more in the early stages (planning and design) as part of the storm water management plan. Factors constraining the technical decisions provided should be kept as low as possible to avoid a reduction in the expected performance of the proposed adjustments.

Keywords: Urban Flood, flood protection, water management, storm water, cost.

7971 A Middleware Management System with Supporting Holonic Modules for Reconfigurable Manufacturing Systems

Authors: Roscoe McLean, Jared Padayachee, Glen Bright

Abstract:

There is currently a gap in the technology covering the rapid establishment of control after a reconfiguration in a Reconfigurable Manufacturing System (RMS). This gap involves the detection of the factory-floor state and the communication link between the factory floor and the high-level software. In this paper, a thin, hardware-supported Middleware Management System (MMS) is proposed, and its design and implementation are discussed. The research found that a cost-effective localization technique can be combined with intelligent software to speed up the ramp-up of a reconfigured system. The MMS makes the process more intelligent, more efficient and less time-consuming, thus supporting the industrial implementation of the RMS paradigm.

Keywords: Intelligent systems, middleware, reconfigurable manufacturing.

7970 Multivalued Knowledge-Base based on Multivalued Datalog

Authors: Agnes Achs

Abstract:

The basic aim of our study is to give a possible model for handling uncertain information. This model is worked out in the framework of Datalog. The concept of a multivalued knowledge-base is defined as a quadruple consisting of background knowledge, a deduction mechanism, a connecting algorithm, and a function set of the program, which together determine the uncertainty levels of the results. First, the concept of fuzzy Datalog is summarized; then its extensions to intuitionistic and interval-valued fuzzy logic are given, and the concept of bipolar fuzzy Datalog is introduced. Based on these extensions, the concept of the multivalued knowledge-base is defined. This knowledge-base can be a possible background for a future agent model.

Keywords: Fuzzy-, intuitionistic-, bipolar datalog, multivalued knowledge-base

7969 The Strategies for Teaching Digital Art in the Classroom as a Way of Enhancing Pupils’ Artistic Creativity

Authors: Aber Salem Aboalgasm, Rupert Ward

Abstract:

Teaching art by digital means is a big challenge for the majority of teachers of art and design in primary schools, yet it allows relationships between art, technology and creativity to be clearly identified. The aim of this article is to present a modern way of teaching art, using digital tools in the art classroom to improve creative ability in pupils aged between nine and eleven years. It also presents a conceptual model for creativity based on digital art. The model could be useful for pupils interested in learning to draw by using an e-drawing package, and for teachers who are interested in teaching modern digital art in order to improve children’s creativity. By illustrating the strategy of teaching art through technology, this model may also help education providers to make suitable choices about which technological approaches are most effective in enhancing students’ creative ability, and which digital art tools can benefit children by developing their technical skills. It is also expected that use of this model will help to develop skills of social interaction, which may in turn improve intellectual ability.

Keywords: Digital tools, motivation, creative activity.

7968 Kinetic and Optimization Studies on Ethanol Production from Corn Flour

Authors: K. Manikandan, T. Viruthagiri

Abstract:

Simultaneous Saccharification and Fermentation (SSF) of corn flour, a major agricultural product, was studied in batch fermentation using the starch-digesting glucoamylase enzyme derived from Aspergillus niger together with the non-starch-digesting, sugar-fermenting yeast Saccharomyces cerevisiae. Experiments based on a Central Composite Design (CCD) were conducted to study the effect of substrate concentration, pH, temperature, and enzyme concentration on ethanol concentration, and these parameters were optimized using Response Surface Methodology (RSM). The optimum values of substrate concentration, pH, temperature, and enzyme concentration were found to be 160 g/l, 5.5, 30°C and 50 IU, respectively. The effect of inoculum age on ethanol concentration was also investigated. The corn flour solution equivalent to a 16% initial starch concentration gave the highest ethanol concentration of 63.04 g/l after 48 h of fermentation at the optimum pH and temperature. The Monod and Logistic models were used for the growth kinetics, and the Luedeking-Piret model was used for the product formation kinetics.
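
For reference, the three kinetic models cited take their usual forms (standard symbols, not the paper's notation):

\mu = \frac{\mu_{max} S}{K_s + S} \;\; \text{(Monod)}, \qquad \frac{dX}{dt} = k X \left( 1 - \frac{X}{X_{max}} \right) \;\; \text{(Logistic)}, \qquad \frac{dP}{dt} = \alpha \frac{dX}{dt} + \beta X \;\; \text{(Luedeking-Piret)},

where S, X and P denote the substrate, biomass and ethanol concentrations, respectively.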

Keywords: Simultaneous Saccharification and Fermentation (SSF), Corn Starch, Ethanol, Logistic Model.

7967 Exploring the Activity Fabric of an Intelligent Environment with Hierarchical Hidden Markov Theory

Authors: Chiung-Hui Chen

Abstract:

The Internet of Things (IoT) was designed for widespread convenience. With smart tags and the sensing network, a large quantity of dynamic information is immediately available in the IoT. Through internal communication and interaction, meaningful objects provide real-time services for users. Therefore, services with appropriate decision-making have become an essential issue. Based on the science of human behavior, this study employed an environment model to record the time sequences and locations of different behaviors and adopted the probability module of the hierarchical Hidden Markov Model for the inference. The statistical analysis was conducted to achieve the following objectives. First, define user behaviors and predict user behavior routes with the environment model in order to analyze user purposes. Second, construct the hierarchical Hidden Markov Model according to the logic framework and establish the sequential intensity among behaviors, in order to become acquainted with the use and activity fabric of the intelligent environment. Third, establish the intensity of the relation between objects and the probability of their being used; this indicator can describe the possible limitations of the mechanism. As the process is recorded in the information system created in this study, these data can be reused to adjust the procedure of intelligent design services.

Keywords: Behavior, big data, hierarchical Hidden Markov Model, intelligent object.

7966 Replicating Brain’s Resting State Functional Connectivity Network Using a Multi-Factor Hub-Based Model

Authors: B. L. Ho, L. Shi, D. F. Wang, V. C. T. Mok

Abstract:

The brain’s functional connectivity, while temporally non-stationary, does express consistency at a macro spatial level. The study of stable resting-state connectivity patterns hence provides opportunities for identifying diseases if such stability is severely perturbed. A mathematical model replicating the brain’s spatial connections is useful for understanding the brain’s representative geometry and complements the empirical model where it falls short. Empirical computations tend to involve large matrices and become infeasible with fine parcellation; the proposed analytical model has no such computational problems. To improve replicability, data from 92 subjects are obtained from two open sources. The proposed methodology, inspired by financial theory, uses multivariate regression to find the relationship of every cortical region of interest (ROI) with a set of pre-identified hubs. These hubs act as representatives for the entire cortical surface. A variance-covariance framework of all ROIs is then built from these relationships to link up all the ROIs. The result is a high level of match between model and empirical correlations, in the range of 0.59 to 0.66 after adjusting for sample size, an increase of almost forty percent. More significantly, the model framework provides an intuitive way to delineate between systemic drivers and idiosyncratic noise while reducing the dimensionality more than 30-fold, hence providing a way to conduct attribution analysis. Due to its analytical nature and simple structure, the model is useful as a standalone toolkit for network dependency analysis or as a module for other mathematical models.
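
A minimal numerical sketch of this hub-based, factor-model-style construction (array names and the diagonal idiosyncratic-noise assumption are illustrative, not taken from the paper):

import numpy as np

def hub_model_correlation(roi_ts, hub_ts):
    # roi_ts: (T, n_roi) ROI time series; hub_ts: (T, n_hub) hub time series.
    # Loadings of every ROI on the hubs via multivariate least-squares regression.
    B, _, _, _ = np.linalg.lstsq(hub_ts, roi_ts, rcond=None)       # shape (n_hub, n_roi)
    resid = roi_ts - hub_ts @ B
    # Model covariance: systemic part driven by the hubs plus diagonal idiosyncratic noise.
    cov = B.T @ np.cov(hub_ts, rowvar=False) @ B + np.diag(resid.var(axis=0))
    d = np.sqrt(np.diag(cov))
    return cov / np.outer(d, d)                                    # model correlation matrix

The empirical correlation matrix, np.corrcoef(roi_ts, rowvar=False), could then be compared against this model output, which is the kind of match reported above.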

Keywords: Functional magnetic resonance imaging, multivariate regression, network hubs, resting state functional connectivity.

7965 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks

Authors: Wang Yichen, Haruka Yamashita

Abstract:

In recent years, in the field of sports, decision making such as game-member selection and game strategy based on the analysis of accumulated sports data has been widely attempted. In fact, in the NBA, the basketball league where the world's highest-level players gather, teams analyze data using various statistical techniques in order to win games. However, it is difficult to analyze game data for each play, such as ball tracking or the motion of players, because the situation of the game changes rapidly and the structure of the data is complicated. Therefore, an analysis method for real-time game play data is needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, which is considered to be difficult for coaches. Because replacing the entire lineup is too complicated, the practical questions for player replacement are whether or not the lineup should be changed and whether or not a Small Ball lineup should be adopted. Therefore, we propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data can be accumulated for each play, which indicates a player's contribution to the game, and these scoring data can be treated as a time series. In order to compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time series data, with an NN (Neural Network) model, which can analyze the situation on the court, to build a score prediction model. This model is capable of identifying the current optimal lineup for different situations. In this research, we collected all of the accumulated NBA data from the 2019-2020 season and applied the method to actual basketball play data to verify the reliability of the proposed model.
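
A minimal sketch of such a combined RNN + NN score predictor, assuming a GRU over per-play scoring sequences and a feed-forward branch for the current game situation (layer sizes and names are illustrative, not the authors'):

import torch
import torch.nn as nn

class LineupScoreModel(nn.Module):
    def __init__(self, play_dim, situation_dim, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(play_dim, hidden, batch_first=True)                   # time series of plays
        self.sit = nn.Sequential(nn.Linear(situation_dim, hidden), nn.ReLU())   # current game situation
        self.head = nn.Linear(2 * hidden, 1)                                    # predicted score contribution

    def forward(self, plays, situation):
        _, h = self.rnn(plays)                      # h: (num_layers, batch, hidden)
        z = torch.cat([h[-1], self.sit(situation)], dim=-1)
        return self.head(z)

Candidate lineups (for example, with or without a Small Ball substitution) could then be ranked by the score predicted for the current situation.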

Keywords: Recurrent Neural Network, players lineup, basketball data, decision making model.

7964 Sampled-Data Model Predictive Tracking Control for Mobile Robot

Authors: Wookyong Kwon, Sangmoon Lee

Abstract:

In this paper, a sampled-data model predictive tracking control method is presented for mobile robots, which are modeled as constrained continuous-time linear parameter varying (LPV) systems. The presented sampled-data predictive controller is designed by a linear matrix inequality approach. Based on the input delay approach, a controller design condition is derived by constructing a new Lyapunov function. Finally, a numerical example is given to demonstrate the effectiveness of the presented method.

Keywords: Model predictive control, sampled-data control, linear parameter varying systems, LPV.

7963 Kuehne + Nagel's PharmaChain: IoT-Enabled Product Monitoring Using Radio Frequency Identification

Authors: Rebecca Angeles

Abstract:

This case study features the Kuehne + Nagel PharmaChain solution for ‘cold chain’ pharmaceutical and biologic product shipments, with IoT-enabled features for shipment temperature and location tracking. Using the case study method and content analysis, this research project investigates the application of the structurational model of technology theory introduced by Orlikowski in order to interpret the firm’s entry and participation in the IoT-impelled marketplace.

Keywords: Internet of things, IoT, radio frequency identification, supply chain management, business intelligence.

7962 Forecasting Malaria Cases in Bujumbura

Authors: Hermenegilde Nkurunziza, Albrecht Gebhardt, Juergen Pilz

Abstract:

The focus of this work is to assess which method allows better forecasting of malaria cases in Bujumbura (Burundi) when taking into account the association between climatic factors and the disease. For the period 1996-2007, real monthly data on both malaria epidemiology and climate in Bujumbura are described and analyzed. We propose a hierarchical approach to achieve our objective. We first fit a Generalized Additive Model to the malaria cases to obtain an accurate predictor, which is then used to predict future observations. Various well-known forecasting methods are compared, leading to different results. Based on the in-sample mean absolute percentage error (MAPE), the multiplicative exponential smoothing state space model with multiplicative error and seasonality performed best.
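
The in-sample criterion referred to is the usual mean absolute percentage error,

MAPE = \frac{100}{n} \sum_{t=1}^{n} \left| \frac{y_t - \hat{y}_t}{y_t} \right|,

where y_t is the observed number of malaria cases in month t and \hat{y}_t the corresponding fitted value.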

Keywords: Burundi, Forecasting, Malaria, Regression model, State space model.

7961 Human Growth Curve Estimation through a Combination of Longitudinal and Cross-sectional Data

Authors: Sedigheh Mirzaei S., Debasis Sengupta

Abstract:

Parametric models have been quite popular for studying human growth, particularly in relation to biological parameters such as peak size velocity and age at peak size velocity. Longitudinal data are generally considered to be vital for fitting a parametric model to individual-specific data, and for studying the distribution of these biological parameters in a human population. However, cross-sectional data are easier to obtain than longitudinal data. In this paper, we present a method of combining longitudinal and cross-sectional data for the purpose of estimating the distribution of the biological parameters. We demonstrate, through simulations in the special case of the Preece-Baines model, how estimates based on longitudinal data can be improved upon by harnessing the information contained in cross-sectional data. We study the extent of improvement for different mixes of the two types of data, and finally illustrate the use of the method through data collected by the Indian Statistical Institute.
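
For reference, Preece-Baines Model 1, the parametric curve used in the simulations, expresses size at age t as

h(t) = h_1 - \frac{2 (h_1 - h_\theta)}{\exp[s_0 (t - \theta)] + \exp[s_1 (t - \theta)]},

where h_1 is adult size, h_\theta the size at age \theta, and s_0, s_1 are rate parameters; peak size velocity and age at peak size velocity are functions of these five parameters.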

Keywords: Preece-Baines growth model, MCMC method, Mixed effect model

7960 A Quantitative Study about Assessing the Effectiveness of Electronic Customer Relationship Management: A Case of Two Hotels in Mauritius

Authors: Shaheena Erkiah, Adjnu Damar Ladkoo

Abstract:

Worldwide, improving tourism competitiveness has been on the agenda of many stakeholders of the hotel sector, and they seem to have agreed that one of the best ways to compete is via the implementation of electronic customer relationship management (e-CRM). In so doing, organizations enjoy strategic positioning in the competitive market by better managing not only the customers but also other business components, including knowledge and employee management. Over recent years, the tourism industry in Mauritius has witnessed a drastic economic boom at international and national levels, providing a new outlook to boost business performance through existing and potential customers. E-CRM has been one of the management tools used to achieve this position. Thus, this insightful context, Mauritius, was chosen for the study. The aim was to assess the effectiveness of e-CRM as a strategic tool in the hotel sector in Mauritius, through the implementation of business strategy, to create competitive advantage and impact business performance. To achieve the objectives of the study, a quantitative research methodology was adopted, and the research revealed that e-CRM is indeed an effective strategic tool in the hotel industry in Mauritius that can provide a competitive advantage and impact positively on the organization’s performance.

Keywords: Customer, electronic, management, relationship, strategic.

7959 Elasto-Plastic Behavior of Rock during Temperature Drop

Authors: N. Reppas, Y. L. Gui, B. Wetenhall, C. T. Davie, J. Ma

Abstract:

A theoretical constitutive model describing the stress-strain behavior of rock subjected to different confining pressures is presented. A bounding surface plastic model with hardening effects is proposed which includes the effect of temperature drop. The bounding surface is based on a mapping rule and the temperature effect on rock is controlled by Poisson’s ratio. Validation of the results against available experimental data is also presented. The relation of deviatoric stress and axial strain is illustrated at different temperatures to analyze the effect of temperature decrease in terms of stiffness of the material.

Keywords: Bounding surface, cooling of rock, plasticity model, rock deformation, elasto-plastic behavior.

7958 The Model of Blended Learning and Its Use at Foreign Language Teaching

Authors: A. A. Kudysheva, A. N. Kudyshev

Abstract:

In the present article, the model of Blended Learning, its advantages in foreign language teaching, and some problems that can arise during its use are considered. Blended Learning is a special organization of learning that allows classroom work to be combined with modern technologies in an electronic distance-teaching environment. Nowadays, many European educational institutions and companies use this technology. Through this method, the student gets the opportunity to learn in a group (classroom) with a teacher and additionally at home at a convenient time; the student sets the optimal speed and intensity of the learning process; and the method helps the student to become disciplined and learn to work independently.

Keywords: Foreign language, information and communication technology (ICT), model of Blended Learning, virtual cool room, technophobia

7957 Extraction of Forest Plantation Resources in Selected Forest of San Manuel, Pangasinan, Philippines Using LiDAR Data for Forest Status Assessment

Authors: Mark Joseph Quinto, Roan Beronilla, Guiller Damian, Eliza Camaso, Ronaldo Alberto

Abstract:

Forest inventories are essential to assess the composition, structure and distribution of forest vegetation, which can be used as baseline information for management decisions. Classical forest inventory is labor-intensive, time-consuming and sometimes even dangerous. The use of Light Detection and Ranging (LiDAR) in forest inventory can improve and overcome these restrictions. This study was conducted to determine the possibility of using LiDAR-derived data to extract high-accuracy forest biophysical parameters and as a non-destructive method for forest status analysis of San Manuel, Pangasinan. Forest resource extraction was carried out using LAStools, GIS, ENVI and .bat scripts with the available LiDAR data. The process includes the generation of derivatives such as the Digital Terrain Model (DTM), Canopy Height Model (CHM) and Canopy Cover Model (CCM) in .bat scripts, followed by the generation of 17 composite bands to be used in the extraction of forest classification covers using ENVI 4.8 and GIS software. The Diameter at Breast Height (DBH), Above-Ground Biomass (AGB) and Carbon Stock (CS) were estimated for each classified forest cover, and tree count extraction was carried out using GIS. Subsequently, field validation was conducted for accuracy assessment. Results showed that the forest of San Manuel has 73% forest cover, which is much higher than the 10% canopy cover requirement. For the extracted canopy height, 80% of the trees range from 12 m to 17 m in height. The CS of the three forest covers based on the AGB was 20819.59 kg/20x20 m for closed broadleaf, 8609.82 kg/20x20 m for broadleaf plantation and 15545.57 kg/20x20 m for open broadleaf. The average tree count for the forest plantation was 413 trees/ha. As such, the forest of San Manuel has a high percentage of forest cover and a high CS.
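
As a simple illustration of the derivative step (array-level only; the study itself used LAStools/ENVI scripts, and the 5 m canopy threshold here is an assumption), a CHM and a canopy-cover estimate can be computed from gridded surface and terrain models as:

import numpy as np

def canopy_metrics(dsm, dtm, height_threshold=5.0):
    # dsm, dtm: 2-D arrays of the digital surface and terrain models on the same grid, in metres.
    chm = np.clip(dsm - dtm, 0, None)                         # Canopy Height Model; negative gaps set to ground
    cover = float(np.mean(chm > height_threshold)) * 100.0    # percent of cells above the canopy threshold
    return chm, cover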

Keywords: Carbon stock, forest inventory, LiDAR, tree count.

7956 Dynamic Metadata Schemes in the Neutron and Photon Science Communities: A Case Study of X-Ray Photon Correlation Spectroscopy

Authors: Amir Tosson, Mohammad Reza, Christian Gutt

Abstract:

Metadata is one of the most important aspects for advancing data management practices within all research communities. Definitions and schemes of metadata are inter alia of particular significance in the domain of neutron and photon scattering experiments covering a broad area of different scientific disciplines. The demand of describing continuously evolving highly non-standardized experiments, including the resulting processed and published data, constitutes a considerable challenge for a static definition of metadata. Here, we present the concept of dynamic metadata for the neutron and photon scientific community, which enriches a static set of defined basic metadata. We explore the idea of dynamic metadata with the help of the use case of X-ray Photon Correlation Spectroscopy (XPCS), which is a synchrotron-based scattering technique that allows the investigation of nanoscale dynamic processes. It serves here as a demonstrator of how dynamic metadata can improve data acquisition, sharing, and analysis workflows. Our approach enables researchers to tailor metadata definitions dynamically and adapt them to the evolving demands of describing data and results from a diverse set of experiments. We demonstrate that dynamic metadata standards yield advantages that enhance data reproducibility, interoperability, and the dissemination of knowledge.

Keywords: Big data, metadata, schemas, XPCS, X-ray Photon Correlation Spectroscopy.

7955 Finite Volume Model to Study the Effect of Buffer on Cytosolic Ca2+ Advection Diffusion

Authors: Brajesh Kumar Jha, Neeru Adlakha, M. N. Mehta

Abstract:

Calcium [Ca2+] is an important second messenger that plays a key role in signal transduction. Several parameters, such as the buffer source, affect its concentration profile. The effect of a stationary immobile buffer on the Ca2+ concentration has been incorporated, as it is a very important parameter to take into account in order to make the model more realistic. The interdependence of the important parameters, such as the diffusion coefficient and the influx, with the [Ca2+] profile has been studied. The model is developed in the form of an advection-diffusion equation together with the buffer concentration. A program has been developed using the finite volume method for the entire problem and simulated on an AMD Turion 32-bit machine to compute the numerical results.
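
A generic one-dimensional form of such an advection-diffusion equation with a stationary immobile buffer (standard notation; the paper's exact terms and geometry are not given in the abstract) is

\frac{\partial [Ca^{2+}]}{\partial t} + v \frac{\partial [Ca^{2+}]}{\partial x} = D_{Ca} \frac{\partial^{2} [Ca^{2+}]}{\partial x^{2}} - k^{+} [B] [Ca^{2+}] + k^{-} [CaB] + \sigma,

where v is the advection velocity, D_{Ca} the diffusion coefficient, [B] and [CaB] the free and bound buffer concentrations, k^{+} and k^{-} the binding and unbinding rates, and \sigma the Ca^{2+} influx; the finite volume method then integrates this balance over each control volume.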

Keywords: Ca2+ profile, buffer, Astrocytes, Advection diffusion, FVM

7954 A DEA Model for Performance Evaluation in the Presence of Time Lag Effect

Authors: Yanshuang Zhang, Byungho Jeong

Abstract:

Data Envelopment Analysis (DEA) is a methodology that computes efficiency values for decision making units (DMUs) in a given period by comparing their outputs with their inputs. In many cases, there is some time lag between the consumption of inputs and the production of outputs. For a long-term research project, it is hard to avoid the production lead-time phenomenon, and this time lag effect should be considered in evaluating the performance of organizations. This paper suggests a model to calculate efficiency values for the performance evaluation problem with time lag. In the experimental part, the proposed methods are compared with the CCR model and an existing time lag model using the data set of the 21st Century Frontier R&D Program, a long-term national R&D program of Korea.
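
For context, the baseline CCR model used in the comparison evaluates a unit DMU_0 through the standard fractional program

\max_{u, v} \; \theta_0 = \frac{\sum_r u_r y_{r0}}{\sum_i v_i x_{i0}} \quad \text{s.t.} \quad \frac{\sum_r u_r y_{rj}}{\sum_i v_i x_{ij}} \le 1 \;\; \forall j, \qquad u_r, v_i \ge 0,

where x_{ij} and y_{rj} are the inputs and outputs of DMU_j and u_r, v_i are weights to be chosen; a time-lag model additionally accounts for the delay between the consumption of inputs and the appearance of outputs.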

Keywords: DEA, Efficiency, Time Lag

7953 Strategic Management via System Dynamics Simulation Models

Authors: G. Papageorgiou, A. Hadjis

Abstract:

This paper examines the problem of strategic management under highly turbulent and dynamic business environmental conditions. As shown, the high complexity of the problem can be managed with the use of System Dynamics models and computer simulation to obtain insights into, and a thorough understanding of, the interdependencies between the organizational structure and the elements of the business environment, so that effective product-market strategies can be designed. Simulation reveals the underlying forces that hold together the structure of an organizational system in relation to its environment. Such knowledge contributes to the avoidance of fundamental planning errors and enables appropriate, proactive, well-focused action.

Keywords: Strategic Management, System Dynamics, Modeling and Simulation, Strategic Planning, Organizational Dynamics

7952 Kinetic Modeling of the Fischer-Tropsch Reactions and Modeling Steady State Heterogeneous Reactor

Authors: M. Ahmadi Marvast, M. Sohrabi, H. Ganji

Abstract:

The rates of production of the main products of the Fischer-Tropsch reactions over an Fe/HZSM5 bifunctional catalyst in a fixed-bed reactor are investigated over a broad range of temperature, pressure, space velocity, H2/CO feed molar ratio and CO2, CH4 and water flow rates. Model discrimination and parameter estimation were performed according to the integral method of kinetic analysis. Due to the lack of an established mechanism for Fischer-Tropsch synthesis on bifunctional catalysts, 26 different models were tested and the best model was selected. Comprehensive one- and two-dimensional heterogeneous reactor models are developed to simulate the performance of fixed-bed Fischer-Tropsch reactors. To reduce the computational time for optimization purposes, an Artificial Feed Forward Neural Network (AFFNN) has been used to describe intra-particle mass and heat transfer diffusion in the catalyst pellet. It is seen that the product reaction rates have a direct relation with the H2 partial pressure and an inverse relation with the CO partial pressure. The results show that the hybrid model is in good agreement with the rigorous mechanistic model while being about 25-30 times faster.

Keywords: Fischer-Tropsch, heterogeneous modeling, kinetic study.

7951 Design and Analysis of Flexible Slider Crank Mechanism

Authors: Thanh-Phong Dao, Shyh-Chour Huang

Abstract:

This study presents the optimal design and the formulation of a kinematic model of a flexible slider crank mechanism. The objective of the proposed innovative design is to take extra advantage of the compliant mechanism and to maximize the fatigue life by applying the Taguchi method. A kinematic model is formulated using a pseudo-rigid-body model (PRBM). By means of mathematical models, the kinematic behaviors of the flexible slider crank mechanism are captured using MATLAB software. Finite element analysis (FEA) is used to show the stress distribution. The results show that the optimal design of the flexible hinge corresponds to a force of 8.5 N, a width of 9 mm and a thickness of 1.1 mm. Analysis of variance shows that the thickness of the proposed hinge is the most significant parameter, with an F value of 15.5. Finally, a prototype is manufactured to prepare for testing the kinematic and dynamic behaviors.

Keywords: Kinematic behavior, fatigue life, pseudo-rigid-body model, flexible slider crank mechanism.

7950 Hand Gesture Recognition Based on Combined Features Extraction

Authors: Mahmoud Elmezain, Ayoub Al-Hamadi, Bernd Michaelis

Abstract:

Hand gesture recognition is an active area of research in the vision community, mainly for the purposes of sign language recognition and Human-Computer Interaction. In this paper, we propose a system to recognize alphabet characters (A-Z) and numbers (0-9) in real time from stereo color image sequences using Hidden Markov Models (HMMs). Our system is based on three main stages: automatic segmentation and preprocessing of the hand regions, feature extraction, and classification. In the automatic segmentation and preprocessing stage, color and a 3D depth map are used to detect the hands, and the hand trajectory is then tracked using the Mean-shift algorithm and a Kalman filter. In the feature extraction stage, 3D combined features of location, orientation and velocity with respect to Cartesian systems are used. Then, k-means clustering is employed to build the HMM codewords. In the final classification stage, the Baum-Welch algorithm is used to fully train the HMM parameters. The gestures of alphabets and numbers are recognized using a Left-Right Banded model in conjunction with the Viterbi algorithm. Experimental results demonstrate that our system can successfully recognize hand gestures with a 98.33% recognition rate.

Keywords: Gesture Recognition, Computer Vision & Image Processing, Pattern Recognition.

7949 A Comparative Analysis of Solid Waste Treatment Technologies on Cost and Environmental Basis

Authors: Nesli Aydin

Abstract:

Waste management decision making in developing countries has moved towards being more pragmatic, transparent, sustainable and comprehensive. Turkey, as a candidate country of the European Union, is required to make its waste-related legislation compatible with European legislation. Improper practices such as open burning and open dumping must be abandoned urgently, and robust waste management systems have to be structured. The determination of an optimum waste management system in any region requires a comprehensive analysis in which many criteria are taken into account by stakeholders. In conducting this sort of analysis, there are two main criteria evaluated by waste management analysts: economic viability and environmental friendliness. From an analytical point of view, a central characteristic of sustainable development is economic-ecological integration. It is anticipated that building a robust waste management system will require significant effort and cooperation between the stakeholders in developing countries such as Turkey. In this regard, this study aims to provide data on the cost and environmental burdens of waste treatment technologies such as an incinerator, an autoclave (with different capacities), a hydroclave and a microwave, coupled with updated information on calculation methods, and a framework for comparing the performance of any proposed scenario on a cost and environmental basis.

Keywords: Decision making, economic viability, environmental friendliness, stakeholder, waste management systems.

7948 Improving Automotive Efficiency through Lean Management Tools: A Case Study

Authors: Raed EL-Khalil, Hussein Zeaiter

Abstract:

Managing and improving efficiency in the current highly competitive global automotive industry demands that companies adopt leaner and more flexible systems. During the past 20 years, the domestic automotive industry in North America has been focusing on establishing new management strategies in order to meet market demands. The lean management process, also known as the Toyota Production System (TPS) or lean manufacturing, encompasses tools and techniques that were established in order to provide the best-quality product with the fastest lead time at the lowest cost. The following paper presents a study that focused on improving labor efficiency at one of the Big Three (Ford, GM, Chrysler LLC) domestic automotive facilities in North America. The objective of the study was to utilize several lean management tools in order to optimize the efficiency and utilization levels at the “Pre-Marriage” chassis area in a truck manufacturing and assembly facility. Utilizing three different lean tools (i.e. standardization of work, the 7 wastes, and 5S), this research was able to improve efficiency by 51%, improve utilization by 246%, and reduce operations by 14%. The return on investment calculated based on the improvements made was 284%.

Keywords: Lean Manufacturing, Standardized Work, Operation Efficiency and Utilization, Operations Management.

7947 A Quantum Algorithm of Constructing Image Histogram

Authors: Yi Zhang, Kai Lu, Ying-hui Gao, Mo Wang

Abstract:

The histogram plays an important statistical role in digital image processing. However, the existing quantum image models are unable to perform this kind of statistical image processing because different gray scales are not distinguishable in them. In this paper, a novel quantum image representation model is first proposed, in which pixels with different gray scales can be distinguished and operated on simultaneously. Based on the new model, a fast quantum algorithm for constructing the histogram of a quantum image is designed. A performance comparison reveals that the new quantum algorithm achieves an approximately quadratic speedup over its classical counterpart. The proposed quantum model and algorithm are significant for future research on quantum image processing.

Keywords: Quantum Image Representation, Quantum Algorithm, Image Histogram.
