Search results for: empirical model
5325 Lattice Boltzmann Simulation of the Carbonization of Wood Particle
Authors: Ahmed Mahmoudi, Imen Mejri, Mohamed A. Abbassi, Ahmed Omri
Abstract:
A numerical study based on the Lattice Boltzmann Method (LBM) is proposed to solve one-, two- and three-dimensional heat and mass transfer for the isothermal carbonization of thick wood particles. To check the validity of the proposed model, computational results have been compared with published data and good agreement is obtained. The model is then used to study the effect of reactor temperature and thermal boundary conditions on the evolution of the local temperature and mass distributions of the wood particle during carbonization.
Keywords: Lattice Boltzmann Method, pyrolysis, conduction, carbonization, heat and mass transfer.
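For readers unfamiliar with the method, the following is a minimal sketch of a D1Q2 lattice Boltzmann scheme for one-dimensional heat conduction, the kind of transport step such a carbonization model builds on; the grid size, relaxation time and boundary temperatures are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal D1Q2 lattice Boltzmann sketch for 1-D heat conduction (BGK collision).
# Grid size, relaxation time and boundary temperatures are illustrative only.
nx, nsteps, tau = 100, 2000, 0.8
w = np.array([0.5, 0.5])          # lattice weights for D1Q2
T_left, T_right = 1.0, 0.0        # imposed boundary temperatures

f = np.zeros((2, nx))             # f[0]: right-moving population, f[1]: left-moving population
f[:] = w[:, None] * 0.0           # initialise at equilibrium for T = 0 everywhere

for step in range(nsteps):
    T = f.sum(axis=0)                         # macroscopic temperature = sum of populations
    feq = w[:, None] * T[None, :]             # equilibrium distributions
    f += -(f - feq) / tau                     # BGK collision (relaxation towards equilibrium)
    f[0, 1:] = f[0, :-1].copy()               # stream right-moving population
    f[1, :-1] = f[1, 1:].copy()               # stream left-moving population
    # Dirichlet boundaries: rebuild the incoming population from the wall temperature
    f[0, 0]  = T_left  - f[1, 0]
    f[1, -1] = T_right - f[0, -1]

print(f.sum(axis=0)[::10])   # temperature profile relaxing towards a linear steady state
```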
5324 Aspect-Level Sentiment Analysis with Multi-Channel and Graph Convolutional Networks
Authors: Jiajun Wang, Xiaoge Li
Abstract:
The purpose of the aspect-level sentiment analysis task is to identify the sentiment polarity of aspects in a sentence. Currently, most methods mainly focus on using neural networks and attention mechanisms to model the relationship between aspects and context, but they ignore the dependence of words over different ranges in the sentence, resulting in deviations when assigning relationship weights to words other than the aspect words. To solve these problems, we propose an aspect-level sentiment analysis model that combines a multi-channel convolutional network and a graph convolutional network (GCN). Firstly, the context and the degree of association between words are characterized by Long Short-Term Memory (LSTM) and a self-attention mechanism. In addition, a multi-channel convolutional network is used to extract the features of words over different ranges. Finally, a graph convolutional network is used to associate the node information of the dependency tree structure. We conduct experiments on four benchmark datasets. The experimental results, compared with those of other models, show that our model is more effective.
Keywords: Aspect-level sentiment analysis, attention, multi-channel convolution network, graph convolution network, dependency tree.
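The graph-convolution step over a dependency tree can be sketched as follows; the adjacency matrix, feature dimensions and single-layer form are illustrative assumptions rather than the authors' exact architecture.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: normalised adjacency times node features times weights."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))         # symmetric normalisation D^-1/2
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)   # ReLU activation

# Toy dependency tree for a 4-word sentence: edges (0-1), (1-2), (1-3).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
H = np.random.randn(4, 8)         # stand-in word representations (e.g. LSTM outputs)
W = np.random.randn(8, 8)         # learnable layer weights
print(gcn_layer(A, H, W).shape)   # (4, 8): updated node representations
```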
5323 An Empirical Analysis and Comparative Study of Liquidity Ratios and Asset-Liability Management of Banks Operating in India
Authors: Amit Kumar Meena, Joydip Dhar
Abstract:
This paper focuses on the analysis and comparison of liquidity ratios and asset-liability management practices in the top three banks from the public, private and foreign sectors in India. The analysis is based upon the calculation of liquidity ratios and the determination of maturity gap profiles for the banks under study. The paper also compares these banks' maturity gap profiles with those of their corresponding groups. It identifies the interest rate sensitivity of the balance sheet items of these banks to determine the gap between rate-sensitive assets and rate-sensitive liabilities. The results of this study suggest that, overall, banks in India have a very good short-term liquidity position and that all the banks are financing their short-term liabilities with their long-term assets.
Keywords: ALM, Liquidity Ratios, Rate Sensitive Assets, Rate Sensitive Liabilities.
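A minimal sketch of the rate-sensitivity gap computation described in the abstract is shown below; the maturity buckets and amounts are made-up placeholders, not data from the banks studied.

```python
# Rate-sensitivity gap per maturity bucket: gap = rate-sensitive assets - rate-sensitive liabilities.
# Bucket labels and amounts are illustrative, not data from the banks studied.
rsa = {"0-3m": 120.0, "3-6m": 80.0, "6-12m": 60.0, "1-5y": 150.0}   # rate-sensitive assets
rsl = {"0-3m": 150.0, "3-6m": 90.0, "6-12m": 40.0, "1-5y": 100.0}   # rate-sensitive liabilities

cumulative = 0.0
for bucket in rsa:
    gap = rsa[bucket] - rsl[bucket]
    cumulative += gap
    ratio = rsa[bucket] / rsl[bucket]            # a simple sensitivity ratio per bucket
    print(f"{bucket}: gap={gap:+.1f}, cumulative gap={cumulative:+.1f}, RSA/RSL={ratio:.2f}")
```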
5322 Design and Analysis of MEMS based Accelerometer for Automatic Detection of Railway Wheel Flat
Authors: Rajib Ul Alam Uzzal, Ion Stiharu, Waiz Ahmed
Abstract:
This paper presents the modeling of a MEMS based accelerometer for detecting the presence of a wheel flat in a railway vehicle. A haversine wheel flat is assigned to one wheel of a 5 DOF pitch plane vehicle model, which is coupled to a 3 layer track model. Based on the simulated acceleration response obtained from the vehicle-track model, an accelerometer is designed that meets all the requirements to detect the presence of a wheel flat. The proposed accelerometer can survive in a dynamic shock environment with accelerations up to ±150 g. The parameters of the accelerometer are calculated to achieve the required specifications using a lumped element approximation, and the results are used for the initial design layout. A finite element analysis code (COMSOL) is used to perform simulations of the accelerometer under various operating conditions and to determine the optimum configuration. The simulated results are within about 2% of the calculated values, which indicates the validity of the lumped element approach. The stability of the accelerometer is also determined over the desired range of operation, including shock conditions.
Keywords: MEMS accelerometer, Pitch plane vehicle, wheel flat.
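For reference, these are the standard lumped-element relations typically used for such a first-pass accelerometer design (general textbook forms, not the paper's specific values), with proof mass m, suspension stiffness k, damping coefficient c and applied acceleration a:

```latex
\omega_n = \sqrt{\frac{k}{m}}, \qquad
f_n = \frac{1}{2\pi}\sqrt{\frac{k}{m}}, \qquad
x_{\mathrm{static}} = \frac{m\,a}{k} = \frac{a}{\omega_n^{2}}, \qquad
Q = \frac{\sqrt{k\,m}}{c}
```

The static deflection relation shows the usual sensitivity/bandwidth trade-off: a stiffer suspension raises the natural frequency but lowers the displacement produced by a given acceleration.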
5321 Utilizing Dutch Auction in an Agent-based Model E-commerce System
Authors: Costin Badica, Maria Ganzha, Maciej Gawinecki, Pawel Kobzdej, Marcin Paprzycki
Abstract:
Recently, we have presented an initial implementation of a model agent-based e-commerce system, which utilized a simple price negotiation mechanism, the English Auction. In this note we discuss how a Dutch Auction involving multiple units of a product can be included in our system. We present UML diagrams of the agents involved in price negotiations and briefly discuss a rule-based mechanism exemplifying the Dutch Auction.
Keywords: e-commerce, rule-based price negotiation mechanism, Dutch Auction, agent system.
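A minimal sketch of a multi-unit Dutch (descending-price) auction loop of the kind the agents negotiate is given below; the starting price, decrement, reserve and buyer valuations are illustrative only.

```python
import random

# Multi-unit Dutch auction sketch: the seller lowers the price step by step and
# bidders accept as soon as the price falls to or below their private valuation.
# Starting price, decrement, reserve and valuations are illustrative only.
units, price, decrement, reserve = 5, 100.0, 5.0, 40.0
valuations = {f"buyer{i}": random.uniform(30, 95) for i in range(8)}
allocations = []

while units > 0 and price >= reserve:
    for buyer, value in list(valuations.items()):
        if value >= price and units > 0:
            allocations.append((buyer, 1, price))   # one unit per acceptance, for simplicity
            units -= 1
            del valuations[buyer]                   # each buyer bids at most once here
    price -= decrement                              # auctioneer lowers the asking price

print(allocations)
```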
5320 Dynamical Analysis of a Harvesting Model of Phytoplankton-Zooplankton Interaction
Authors: Anuj K. Sharma, Amit Sharma, Kulbhushan Agnihotri
Abstract:
In this work, we propose and analyze a model of Phytoplankton-Zooplankton interaction with harvesting, considering that some species are exploited commercially for food. Criteria for local stability, instability and global stability are derived, and some threshold harvesting levels are explored to maintain the population at an appropriate equilibrium level even if the species are exploited continuously. Further, biological and bionomic equilibria of the system are obtained, and an optimal harvesting policy is analysed using Pontryagin's Maximum Principle. Finally, the analytical findings are supported by numerical simulations.
Keywords: Phytoplankton-Zooplankton, Global stability, Bionomic Equilibrium, Pontryagin's Maximum Principle.
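A representative form of such a harvested plankton model (logistic phytoplankton growth, a saturating grazing term and catch-per-unit-effort harvesting) is shown below; the exact functional responses used in the paper may differ.

```latex
\frac{dP}{dt} = rP\left(1-\frac{P}{K}\right) - \frac{\alpha P Z}{1+\beta P} - q_1 E_1 P, \qquad
\frac{dZ}{dt} = \frac{c\,\alpha P Z}{1+\beta P} - dZ - q_2 E_2 Z
```

Here P and Z are the phytoplankton and zooplankton densities, r and K the logistic growth parameters, c the conversion efficiency, d the zooplankton death rate, and the q_i E_i terms the catchability-effort harvesting of each species.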
5319 Technological Deep Assessment of Automotive Parts Manufacturers: Case of Iranian Manufacturers
Authors: Manouchehre Ansari, Mahmoud Dehghan Nayeri, Reza Yousefi Zenouz
Abstract:
In order to develop any strategy, it is essential first to identify opportunities, threats, and weak and strong points. Assessment of the technology level provides the possibility of concentrating on weak and strong points. The results of technology assessment have a direct effect on the decision-making process in the field of technology transfer or the expansion of internal research capabilities, so it plays a critical role in technology management. This paper presents a conceptual model to analyze the technology capability of a company as a whole and in four main aspects of technology. The model was tested on 10 automotive parts manufacturers in Iran. Using this model, the capability level of the manufacturers was investigated in four fields: managing aspects, hard aspects, human aspects, and information and knowledge aspects. The results show that these firms concentrate on the hard aspect of technology, while the other aspects are poor and need more support. This industry should therefore develop the other aspects of technology as well as the hard aspect in order to make effective and efficient use of its technology. The findings are useful for technology planning and management in automotive parts manufacturers in Iran and in other industries that are technology followers and transfer the technologies they need.
Keywords: Technology, Technological Evaluation, Technology Maturity.
5318 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator
Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty
Abstract:
Verification and Validation of a simulated process model is the most important phase of the simulator life cycle. Evaluation of simulated process models based on Verification and Validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady state and transient conditions. The process of Verification and Validation helps in qualifying the process simulator for the intended purpose, whether it is for providing comprehensive training or for design verification. In general, model verification is carried out by comparison of simulated component characteristics with the original requirements to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters, either in standalone mode or in integrated mode. A Full Scope Replica Operator Training Simulator for PFBR (Prototype Fast Breeder Reactor), named KALBR-SIM (Kalpakkam Breeder Reactor Simulator), has been developed at IGCAR, Kalpakkam, India, wherein the main participants are engineers/experts belonging to the modeling team, the process design team and the instrumentation & control design team. This paper discusses the Verification and Validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, and the reference documents and standards used. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on the experts' comments, final qualification of the simulator for the intended purpose, and the difficulties faced while coordinating the various activities.
Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), Steady State, Transient State.
5317 Knowledge Acquisition and Client Organisations: Case Study of a Student as Producer
Authors: Barry Ardley, Abi Hunt, Nick Taylor
Abstract:
As a theoretical and practical framework, this study uses the student as producer approach to learning in higher education, as adopted by the Lincoln International Business School, University of Lincoln, UK. Student as producer positions learners as skilled and capable agents, able to participate as partners with tutors in live research projects. To illuminate the nature of this approach to learning and to highlight its critical issues, the authors report on two guided student consultancy projects. These were set up with the assistance of two local organisations in the city of Lincoln, UK. Using the student as producer model to deliver the projects enabled learners to acquire and develop a range of key skills and knowledge not easily accessible in more traditional educational settings. This paper presents a systematic case study analysis of the eight organising principles of the student as producer model, as adopted by university tutors. The experience of tutors implementing student as producer suggests that the model can be widely applied to benefit not only the learning and teaching experiences of higher education students and staff but also a university's research programme and its community partners.
Keywords: Experiential learning, consultancy clients, student as producer.
5316 The Acceptance of E-Assessment Considering Security Perspective: Work in Progress
Authors: Kavitha Thamadharan, Nurazean Maarop
Abstract:
The implementation of e-assessment as a tool to support the process of teaching and learning has become a popular technological means in universities. E-assessment provides many advantages to users, especially flexibility in teaching and learning, and an e-assessment system has the capability to improve the quality of the education it delivers. However, there is still a drawback in terms of security, which limits user acceptance of online learning systems. Even though there are studies providing solutions for identified security threats in e-learning usage, there is no particular model which addresses the factors that influence the acceptance of e-assessment systems by lecturers from a security perspective. The aim of this study is to explore the security aspects of e-assessment in regard to the acceptance of the technology. As a result, a conceptual model of secure acceptance of e-assessment is proposed. Both human and security factors are considered in the formulation of this conceptual model. In order to increase understanding of the critical issues related to the subject of this study, an interpretive approach involving a convergent mixed-methods design is proposed to execute the research. This study will be useful in providing a more insightful understanding of the factors that influence user acceptance of e-assessment systems from a security perspective.
Keywords: Secure Technology Acceptance, E-Assessment Security, E-Assessment, Education Technology.
5315 Minimizing Mutant Sets by Equivalence and Subsumption
Authors: Samia Alblwi, Amani Ayad
Abstract:
Mutation testing is the art of generating syntactic variations of a base program and checking whether a candidate test suite can identify all the mutants that are not semantically equivalent to the base; this technique can be used to assess the quality of a test suite. One of the main obstacles to the widespread use of mutation testing is cost, as even small programs (a few dozen lines of code) can give rise to a large number of mutants (up to hundreds); this has created an incentive to seek to reduce the number of mutants while preserving their collective effectiveness. Two criteria have been used to reduce the size of mutant sets: equivalence, which aims to partition the set of mutants into equivalence classes modulo semantic equivalence and to select one representative per class; and subsumption, which aims to define a partial ordering among mutants that ranks mutants by effectiveness and seeks to select maximal elements in this ordering. In this paper, we analyze these two policies using analytical and empirical criteria.
Keywords: Mutation testing, mutant sets, mutant equivalence, mutant subsumption, mutant set minimization.
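Both reduction policies are usually computed from a Boolean kill matrix (rows = mutants, columns = tests); the following sketch illustrates them on a made-up matrix, with dynamic (test-suite-relative) subsumption standing in for the semantic ordering.

```python
import numpy as np

# Rows = mutants, columns = tests; True means the test kills the mutant.
# The matrix is illustrative; in practice it comes from running the suite on each mutant.
kill = np.array([
    [1, 0, 1, 0],
    [1, 0, 1, 0],   # same kill vector as mutant 0 -> equivalent modulo this test suite
    [1, 1, 1, 0],   # killed by a strict superset of tests -> strictly subsumed by mutant 0
    [0, 0, 0, 0],   # killed by nothing -> behaves like an equivalent mutant here
], dtype=bool)

# Policy 1: partition by identical kill vectors and keep one representative per class.
classes = {}
for m, row in enumerate(kill):
    classes.setdefault(tuple(row), []).append(m)
representatives = [members[0] for members in classes.values()]

# Policy 2 (dynamic subsumption): mutant i subsumes mutant j if i is killed and every
# test killing i also kills j; keep only the mutants not strictly subsumed by another.
def subsumes(i, j):
    return i != j and kill[i].any() and np.all(~kill[i] | kill[j])

n = len(kill)
maximal = [i for i in range(n)
           if kill[i].any() and not any(subsumes(j, i) and not subsumes(i, j) for j in range(n))]

print("equivalence-class representatives:", representatives)
print("non-strictly-subsumed (hardest-to-kill) mutants:", maximal)
```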
5314 Automatic Distance Compensation for Robust Voice-based Human-Computer Interaction
Authors: Randy Gomez, Keisuke Nakamura, Kazuhiro Nakadai
Abstract:
A distant-talking voice-based HCI system suffers from performance degradation due to the mismatch between the acoustic speech (runtime) and the acoustic model (training). Mismatch is caused by the change in the power of the speech signal as observed at the microphones. This change is greatly influenced by the change in distance, affecting speech dynamics inside the room before the signal reaches the microphones. Moreover, as the speech signal is reflected, its acoustical characteristics are also altered by the room properties. In general, power mismatch due to distance is a complex problem. This paper presents a novel approach to dealing with distance-induced mismatch by intelligently sensing instantaneous voice power variation and compensating the model parameters. First, the distant-talking speech signal is processed through microphone array processing, and the corresponding distance information is extracted. Distance-sensitive Gaussian Mixture Models (GMMs), pre-trained to capture both speech power and room properties, are used to predict the optimal distance of the speech source. Consequently, pre-computed statistical priors corresponding to the optimal distance are selected to correct the statistics of the generic model, which was frozen during training. Thus, the model combinatorics are post-conditioned to match the power of the instantaneous speech acoustics at runtime. This results in an improved likelihood of predicting the correct speech command at farther distances. We experiment using real data recorded inside two rooms. Experimental evaluation shows that voice recognition performance using our method is more robust to the change in distance compared to the conventional approach. In our experiment, under the most acoustically challenging environment (i.e., Room 2 at 2.5 meters), our method achieved a 24.2% improvement in recognition performance over the best-performing conventional method.
Keywords: Human Machine Interaction, Human Computer Interaction, Voice Recognition, Acoustic Model Compensation, Acoustic Speech Enhancement.
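A minimal sketch of the distance-selection step is given below: one GMM per known training distance scores the runtime features, and the best-scoring distance selects a pre-computed compensation prior. The features, number of mixture components and offsets are synthetic placeholders, not the paper's data.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
distances = [0.5, 1.5, 2.5]                      # metres; training distances are assumptions
offsets = {0.5: 0.0, 1.5: -3.0, 2.5: -6.0}       # placeholder power-compensation priors (dB)

# Train one distance-sensitive GMM per known distance on synthetic power-like features.
gmms = {}
for k, d in enumerate(distances):
    feats = rng.normal(loc=-10.0 * k, scale=2.0, size=(500, 4))   # stand-in training features
    gmms[d] = GaussianMixture(n_components=4, random_state=0).fit(feats)

# At runtime, score the observed utterance features and pick the most likely distance.
runtime_feats = rng.normal(loc=-10.0, scale=2.0, size=(50, 4))    # looks like the 1.5 m condition
best = max(distances, key=lambda d: gmms[d].score(runtime_feats)) # average log-likelihood
print(f"estimated distance: {best} m -> apply compensation prior {offsets[best]} dB")
```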
5313 The Impact of Stakeholder Communication Strategies on Consumers' Acceptance and Financial Performance: In the Case of Fertilizer Industry in Malaysia
Authors: Hasnida Abdul Wahab, Shahrina Md Nordin, Lai Fong Woon, Hasrina Mustafa
Abstract:
There has been a growing emphasis in communication management, from simple coordination of promotional tools to a complex strategic process. This study will examine the current marketing communications and engagement strategies used in addressing the key stakeholders. In the case of the fertilizer industry in Malaysia, there has been little empirical research on stakeholder communication, while a major challenge facing the modern corporation is the need to communicate its identity, its values and its products in order to distinguish itself from competitors. The study will employ both quantitative and qualitative methods and the use of Structural Equation Modeling (SEM) to establish a causal relationship amongst the key factors of stakeholder communication strategies, the increment in consumers' choice/acceptance and the impact on financial performance. One of the major contributions is a conceptual framework for communication strategies and engagement in increasing consumers' acceptance level and the firm's financial performance.
Keywords: Consumers' acceptance, financial performance, stakeholder communication strategies.
5312 Construct the Fur Input Mixed Model with Activity-Based Benefit Assessment Approach of Leather Industry
Authors: M. F. Wu, F. T. Cheng
Abstract:
The leather industry is the most important traditional industry, having provided leather products to the world for thousands of years. The fierce global competitive environment and the common awareness of global carbon reduction have caused livestock supply quantities to fall, salted and wet blue leather materials to shrink, and prices to skyrocket significantly. Exchange rate fluctuations have led to decreasing sales revenue, due to differences in export exchange rates, and compress the overall profitability of the leather industry. This paper applies an activity-based benefit assessment approach to build a fitting fur input mixed model, where the fur is wet blue, concerned with four key factors: the output rate of wet blue, the unit cost of wet blue, the yield rate, and the grade level of wet blue, in order to achieve a low-cost strategy under the company's given unit price of the leather product. The research findings indicate that applying this model may improve the input cost structure, decrease the number of leather product inventories, and raise the competitive advantage of the enterprise in the future.
Keywords: Activity-Based Benefit Assessment Approach, Input mixed, Output Rate, Wet Blue.
5311 Gender based Barriers to Effective Collaboration: A Case Study on Children's Safeguard Partnerships
Authors: L. McAllister, A. Dudau
Abstract:
This paper explores gender-related barriers to interagency collaboration in statutory children's safeguard partnerships, against a theoretical framework that considers individuals, professions and organisations interacting as part of a complex adaptive system. We argue that gender-framed obstacles to effective communication between culturally discrepant agencies can ultimately impact the effectiveness of policy delivery. We focused our research on three partnership structures in Sefton Metropolitan Borough in order to observe how interactions occur, whether the agencies involved perceive their occupational environment as being gender affected, and whether they believe this can hinder effective collaboration with other biased organisations. Our principal empirical findings indicate that there is a general awareness amongst professionals of the role that gender plays in each of the agencies reviewed and that gender may well constitute a barrier to effective communication, but there is a sense that there is little scope for change in the short term. We aim to signal here, however, the need for change, given the risk of service failure.
Keywords: Children's safeguard, gender, gendered professions, inter-agency collaboration, partnerships.
5310 Improving Subjective Bias Detection Using Bidirectional Encoder Representations from Transformers and Bidirectional Long Short-Term Memory
Authors: Ebipatei Victoria Tunyan, T. A. Cao, Cheol Young Ock
Abstract:
Detecting subjectively biased statements is a vital task. This is because this kind of bias, when present in text or other forms of information dissemination media such as news, social media, scientific texts, and encyclopedias, can weaken trust in the information and stir conflicts amongst consumers. Subjective bias detection is also critical for many Natural Language Processing (NLP) tasks like sentiment analysis, opinion identification, and bias neutralization. Having a system that can adequately detect subjectivity in text will significantly boost research in the above-mentioned areas. It can also come in handy for platforms like Wikipedia, where the use of neutral language is of importance. The goal of this work is to identify subjectively biased language in text at the sentence level. With machine learning, we can solve complex AI problems, making it a good fit for the problem of subjective bias detection. A key step in this approach is to train a classifier based on BERT (Bidirectional Encoder Representations from Transformers) as the upstream model. BERT by itself can be used as a classifier; however, in this study, we use BERT as a data preprocessor as well as an embedding generator for a Bi-LSTM (Bidirectional Long Short-Term Memory) network incorporating an attention mechanism. This approach produces a deeper and better classifier. We evaluate the effectiveness of our model using the Wiki Neutrality Corpus (WNC), a benchmark dataset compiled from Wikipedia edits that removed various biased instances from sentences, on which we also compare our model to existing approaches. Experimental analysis indicates an improved performance, as our model achieved state-of-the-art accuracy in detecting subjective bias. This study focuses on the English language, but the model can be fine-tuned to accommodate other languages.
Keywords: Subjective bias detection, machine learning, BERT–BiLSTM–Attention, text classification, natural language processing.
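A minimal PyTorch sketch of the described pipeline, frozen BERT embeddings feeding a BiLSTM with attention pooling and a binary classifier, is given below; the hidden sizes, the freezing choice and the pooling form are assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BertBiLstmAttention(nn.Module):
    """Frozen BERT embeddings -> BiLSTM -> attention pooling -> subjective/neutral classifier."""
    def __init__(self, hidden=256, num_classes=2):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.lstm = nn.LSTM(self.bert.config.hidden_size, hidden,
                            batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)       # scores each token representation
        self.out = nn.Linear(2 * hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        with torch.no_grad():                      # BERT used here purely as an embedding generator
            emb = self.bert(input_ids=input_ids,
                            attention_mask=attention_mask).last_hidden_state
        h, _ = self.lstm(emb)                      # (batch, seq_len, 2 * hidden)
        scores = self.attn(h).squeeze(-1)
        scores = scores.masked_fill(attention_mask == 0, -1e9)   # ignore padding tokens
        alpha = torch.softmax(scores, dim=-1).unsqueeze(-1)
        context = (alpha * h).sum(dim=1)           # attention-weighted sentence vector
        return self.out(context)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(["The senator bravely defended the bill."], return_tensors="pt", padding=True)
logits = BertBiLstmAttention()(batch["input_ids"], batch["attention_mask"])
print(logits.shape)   # (1, 2): subjective vs. neutral scores
```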
5309 Robust Batch Process Scheduling in Pharmaceutical Industries: A Case Study
Authors: Tommaso Adamo, Gianpaolo Ghiani, Antonio D. Grieco, Emanuela Guerriero
Abstract:
Batch production plants give rise to a wide range of scheduling problems. In pharmaceutical industries a batch process is usually described by a recipe, consisting of an ordering of tasks to produce the desired product. In this research work we focused on pharmaceutical production processes requiring the culture of a microorganism population (i.e. bacteria, yeasts or antibiotics). Several sources of uncertainty may influence the yield of the culture processes, including (i) low performance and quality of the cultured microorganism population or (ii) microbial contamination. For these reasons, robustness is a valuable property for the considered application context. In particular, a robust schedule will not collapse immediately when a cell of microorganisms has to be thrown away due to a microbial contamination. Indeed, a robust schedule should change locally and in small proportions, and the overall performance measure (i.e. makespan, lateness) should change little if at all. In this research work we formulated a constraint programming optimization (COP) model for the robust planning of antibiotics production. We developed a discrete-time model with a multi-criteria objective, ordering the different criteria and performing a lexicographic optimization. A feasible solution of the proposed COP model is a schedule of a given set of tasks onto the available resources. The schedule has to satisfy task precedence constraints, resource capacity constraints and time constraints. In particular, the time constraints model task due dates and resource availability time windows. To improve schedule robustness, we modeled the concept of (a, b) super-solutions, where a and b are input parameters of the COP model. An (a, b) super-solution is one in which, if a variables (i.e. the completion times of a culture tasks) lose their values (i.e. the cultures are contaminated), the solution can be repaired by assigning these variables new values (i.e. the completion times of backup culture tasks) and changing at most b other variables (i.e. delaying the completion of at most b other tasks). The efficiency and applicability of the proposed model are demonstrated by solving instances taken from a real-life pharmaceutical company. Computational results showed that the determined super-solutions are near-optimal.
Keywords: Constraint programming, super-solutions, robust scheduling, batch process, pharmaceutical industries.
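A minimal constraint-programming sketch of the scheduling core (interval tasks with precedence, a due date and a shared resource, solved with OR-Tools CP-SAT) is shown below; the durations, due date and single-resource assumption are illustrative, and the sketch omits the paper's lexicographic multi-criteria objective and the (a, b) super-solution machinery.

```python
from ortools.sat.python import cp_model

# Minimal CP sketch of the scheduling core: three recipe tasks with durations, a due
# date and a single shared resource. All numbers are illustrative placeholders.
horizon = 50
tasks = {"culture": 12, "purification": 6, "filling": 4}
due = {"filling": 40}

model = cp_model.CpModel()
starts, ends, intervals = {}, {}, {}
for name, dur in tasks.items():
    starts[name] = model.NewIntVar(0, horizon, f"start_{name}")
    ends[name] = model.NewIntVar(0, horizon, f"end_{name}")
    intervals[name] = model.NewIntervalVar(starts[name], dur, ends[name], f"iv_{name}")

# Recipe ordering and due-date constraints.
model.Add(starts["purification"] >= ends["culture"])
model.Add(starts["filling"] >= ends["purification"])
model.Add(ends["filling"] <= due["filling"])
model.AddNoOverlap(list(intervals.values()))     # one shared resource

makespan = model.NewIntVar(0, horizon, "makespan")
model.AddMaxEquality(makespan, list(ends.values()))
model.Minimize(makespan)

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for name in tasks:
        print(name, solver.Value(starts[name]), "->", solver.Value(ends[name]))
```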
5308 Regional Role of Higher Education Institutions in Croatia
Authors: Mirjana Jeleč Raguž, Barbara Pisker
Abstract:
The development of a knowledge-based society carries multiple challenges for the higher education system. Some of the challenges laid before the higher education sector of countries which aspire to become knowledge-based societies are: entrepreneurial leadership of higher education institutions, finding new sources of financing in order to minimize dependence on public resources, creating connections with the labor market, commercial utilization of R&D results, promotion of innovation, and the overall promotion of scientific excellence relevant to the economic sector. Within the framework of this paper and its main subject of research, the challenge put before higher education institutions is the effort of establishing a regional mission of higher education through open collaboration with key regional actors, both private and public. The development of this collaboration and its contribution to overall regional development in Croatia is the main subject of the empirical research in this paper.
Keywords: Croatia, Higher Education Institutions, Regional Role, Science-Industry Interaction
5307 Inventory Control for a Joint Replenishment Problem with Stochastic Demand
Authors: Bassem Roushdy, Nahed Sobhy, Abdelrhim Abdelhamid, Ahmed Mahmoud
Abstract:
Most papers model the Joint Replenishment Problem (JRP) as a (kT, S) policy, where kT is a multiple of a common review period T, and S is a predefined order-up-to level. In general, the (T, S) policy is characterized by a long out-of-control period, which requires a large amount of safety stock compared to the (R, Q) policy. In this paper a probabilistic model is built in which the item with the shortest time between orders (T), call it item (i), is modeled under an (R, Q) policy and its inventory is continuously reviewed, while the rest of the items (j) are periodically reviewed at a definite time corresponding to item (i).
Keywords: Inventory management, Joint replenishment, policy evaluation, stochastic process.
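A minimal simulation sketch of the continuous-review (R, Q) part of such a policy is given below; the demand distribution, lead time, R and Q are illustrative values, not parameters from the paper.

```python
import random

# Minimal simulation of a continuous-review (R, Q) policy for the item with the shortest
# reorder interval: order Q whenever the inventory position drops to R or below.
# Demand distribution, lead time, R and Q are illustrative values only.
random.seed(1)
R, Q, lead_time, days = 20, 50, 3, 60
on_hand, on_order, pipeline = 40, 0, []   # pipeline holds (arrival_day, quantity) pairs

for day in range(days):
    arriving = sum(q for d, q in pipeline if d == day)
    pipeline = [p for p in pipeline if p[0] != day]
    on_hand += arriving
    on_order -= arriving
    demand = random.randint(0, 10)
    on_hand = max(on_hand - demand, 0)                 # unmet demand is lost in this sketch
    if on_hand + on_order <= R:                        # inventory-position trigger
        pipeline.append((day + lead_time, Q))
        on_order += Q
    print(f"day {day:2d}: demand={demand:2d}, on hand={on_hand:3d}, on order={on_order:3d}")
```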
5306 Relationships between Social Entrepreneurship, CSR and Social Innovation: In Theory and Practice
Authors: Krisztina Szegedi, Gyula Fülöp, Ádám Bereczk
Abstract:
The shared goal of social entrepreneurship, corporate social responsibility and social innovation is the advancement of society. The business model of social enterprises is characterized by unique strategies based on the competencies of the entrepreneurs, and is not aimed primarily at the maximization of profits, but rather at carrying out goals for the benefit of society. Corporate social responsibility refers to the active behavior of a company, by which it can create new solutions to meet the needs of society, either on its own or in cooperation with other social stakeholders. The objectives of this article are to define concepts, describe and integrate relevant theoretical models, develop a model and introduce some examples of international practice that can inspire initiatives for social development.
Keywords: Corporate social responsibility, CSR, social innovation, social entrepreneurship.
5305 A Visual Control Flow Language and Its Termination Properties
Authors: László Lengyel, Tihamér Levendovszky, Hassan Charaf
Abstract:
This paper presents the visual control flow support of Visual Modeling and Transformation System (VMTS), which facilitates composing complex model transformations out of simple transformation steps and executing them. The VMTS Visual Control Flow Language (VCFL) uses stereotyped activity diagrams to specify control flow structures and OCL constraints to choose between different control flow branches. This work discusses the termination properties of VCFL and provides an algorithm to support the termination analysis of VCFL transformations.
Keywords: Control Flow, Metamodel-Based Visual Model Transformation, OCL, Termination Properties, UML.
5304 The Role of Business Survey Measures in Forecasting Croatian Industrial Production
Authors: M. Cizmesija, N. Erjavec, V. Bahovec
Abstract:
While the European Union (EU) harmonized methodology is the benchmark for business survey (BS) methodology used worldwide, the choice of the variables that are components of the confidence indicators, used as leading indicators, is not strictly determined or unique. Therefore, the aim of this paper is to investigate and quantify the relationship between all business survey variables in the manufacturing industry and industrial production as a reference macroeconomic series in Croatia. The assumption is that there are variables in the business survey that are not components of the Industrial Confidence Indicator (ICI) and which can predict changes in Croatian industrial production accurately, and sometimes better than the ICI. Empirical analyses are conducted using quarterly data on BS variables in the manufacturing industry and on Croatian industrial production over the period from the first quarter of 2005 to the first quarter of 2013. The research results confirmed the assumption: three BS variables that are not components of the ICI (competitive position, demand and liquidity) are better leading indicators than the ICI in forecasting changes in Croatian industrial production instantaneously and one, two or three quarters ahead.
Keywords: Balance, Business Survey, Confidence Indicators, Industrial Production, Forecasting.
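A minimal sketch of testing a BS balance as a leading indicator, regressing industrial production growth on the one-quarter lag of the balance, is shown below; the series are synthetic placeholders, not the Croatian data.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic quarterly series standing in for industrial production growth and one
# business-survey balance (e.g. assessed demand); real data would replace these.
rng = np.random.default_rng(0)
n = 40
balance = rng.normal(0, 10, n)                                  # BS balance in percentage points
ip_growth = 0.3 * np.roll(balance, 1) + rng.normal(0, 1, n)     # growth led by last quarter's balance

# Regress IP growth on the one-quarter lag of the balance (forecast horizon h = 1).
y = ip_growth[1:]
X = sm.add_constant(balance[:-1])
model = sm.OLS(y, X).fit()
print(model.params, model.rsquared)    # a significant lag coefficient marks a leading indicator
```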
5303 Radiowave Propagation in Picocellular Environment Using 2.5D Ray Tracing Technique
Authors: Fathi Alwafie
Abstract:
This paper presents a ray tracing simulation technique to characterize radiowave propagation inside buildings, implementing an algorithm capable of enumerating a large number of propagation paths in interactive time for the special case of 2.5D. The effective dielectric constants of the building structure used in the simulations are indicated. The study describes an efficient 2.5D ray tracing model, which is compared with a 3D model. The result of the first investigations is that the indoor wave environment changes significantly as the electrical parameters of the construction materials are changed. A detailed analysis of the dependence of the indoor wave on the wideband characteristics of the channel, namely the root mean square (RMS) delay spread and the mean excess delay, is also presented.
Keywords: Picocellular, Propagation, Ray tracing
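A minimal 2D image-source sketch of the kind of specular reflection path such a ray tracer enumerates is given below; the geometry and frequency are illustrative values only.

```python
import numpy as np

# Image-source sketch for one wall: the first-order reflected path from transmitter to
# receiver has the same length as the straight line from the mirrored (image) source
# to the receiver. Geometry and frequency are illustrative values only.
c = 3e8
freq = 2.4e9
tx, rx = np.array([1.0, 2.0]), np.array([6.0, 3.0])
wall_x = 8.0                                   # a vertical wall at x = 8 m

image = np.array([2 * wall_x - tx[0], tx[1]])  # mirror the transmitter across the wall
d_direct = np.linalg.norm(rx - tx)
d_reflect = np.linalg.norm(rx - image)         # length of the reflected path

for name, d in [("direct", d_direct), ("1st-order reflection", d_reflect)]:
    delay_ns = d / c * 1e9
    fspl_db = 20 * np.log10(4 * np.pi * d * freq / c)    # free-space loss along the path
    print(f"{name}: length={d:.2f} m, delay={delay_ns:.2f} ns, free-space loss={fspl_db:.1f} dB")
```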
5302 Effect of Delay on Supply Side on Market Behavior: A System Dynamic Approach
Authors: M. Khoshab, M. J. Sedigh
Abstract:
Dynamic systems, which from a mathematical point of view are those governed by differential equations, are much more difficult to study and to predict than static systems, which are governed by algebraic equations. Economic systems such as markets are among the most complicated dynamic systems. This paper adopts a very simple mathematical model of a market and studies the effect of the supply and demand functions on market behavior when the supply side experiences a lag due to production restrictions.
Keywords: Dynamic System, Lag on Supply Demand, Market Stability, Supply Demand Model.
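A minimal cobweb-style difference-equation sketch of a market whose supply responds to the previous period's price is shown below; the linear demand and supply coefficients are illustrative only.

```python
# Cobweb-style sketch: demand reacts to the current price, supply reacts to the price
# one period earlier (the production lag). Coefficients are illustrative only.
a_d, b_d = 100.0, 2.0      # demand:  D_t = a_d - b_d * p_t
a_s, b_s = 10.0, 1.5       # supply:  S_t = a_s + b_s * p_{t-1}
price = 20.0
history = [price]

for t in range(15):
    supply = a_s + b_s * price            # producers committed at last period's price
    price = (a_d - supply) / b_d          # market-clearing price for that supply: D_t = S_t
    history.append(price)

print([round(p, 2) for p in history])     # converges when b_s < b_d, oscillates or diverges otherwise
```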
5301 Detection of Ultrasonic Images in the Presence of a Random Number of Scatterers: A Statistical Learning Approach
Authors: J. P. Dubois, O. M. Abdul-Latif
Abstract:
Support Vector Machine (SVM) is a statistical learning tool that was initially developed by Vapnik in 1979 and later extended into the more complex concept of structural risk minimization (SRM). SVM is playing an increasing role in applications to detection problems in various engineering areas, notably in statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, SVM was applied to the detection of medical ultrasound images in the presence of partially developed speckle noise. The simulation was done for single-look and multi-look speckle models to give a complete overview of and insight into the newly proposed model of the SVM-based detector. The structure of the SVM was derived and applied to clinical ultrasound images, and its performance in terms of the mean square error (MSE) metric was calculated. We showed that the SVM-detected ultrasound images have a very low MSE and are of good quality. The quality of the processed speckled images improved for the multi-look model. Furthermore, the contrast of the SVM-detected images was higher than that of the original non-noisy images, indicating that the SVM approach increased the distance between the pixel reflectivity levels (the detection hypotheses) in the original images.
Keywords: LS-SVM, medical ultrasound imaging, partially developed speckle, multi-look model.
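In the spirit of the described detector, the following sketch trains an SVM to decide the underlying reflectivity level (the detection hypotheses) from multi-look speckled observations; the speckle is simplified synthetic multiplicative noise and scikit-learn's standard SVC stands in for the LS-SVM formulation.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
levels = np.array([0.2, 0.5, 1.0])            # reflectivity levels = detection hypotheses
n_looks, n_train, n_test = 4, 300, 60

def speckled(level, n):
    # Multi-look intensity speckle approximated by averaging n_looks exponential draws
    # whose mean equals the underlying reflectivity level.
    return rng.exponential(level, size=(n, n_looks)).mean(axis=1, keepdims=True)

X_train = np.vstack([speckled(lv, n_train) for lv in levels])
y_train = np.repeat(np.arange(len(levels)), n_train)
X_test = np.vstack([speckled(lv, n_test) for lv in levels])
y_test = np.repeat(np.arange(len(levels)), n_test)

clf = SVC(kernel="rbf", C=10.0).fit(X_train, y_train)        # stand-in for the LS-SVM detector
pred = clf.predict(X_test)
mse = np.mean((levels[pred] - levels[y_test]) ** 2)           # error on the recovered reflectivity
print(f"accuracy={np.mean(pred == y_test):.2f}, reflectivity MSE={mse:.4f}")
```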
5300 Development of Decision Support System for House Evaluation and Purchasing
Authors: Chia-Yu Hsu, Julaimin Goh, Pei-Chann Chang
Abstract:
Home is important for Chinese people. Because the information regarding house attributes and surrounding environments is incomplete in most real estate agencies, it is difficult for house buyers to consider all the relevant factors effectively, and they can only search for candidates with a sorting-based approach. This study aims to develop a decision support system for house purchasing in which the surrounding facilities of each house are quantified. All the considered house factors and customer preferences are then incorporated into the Simple Multi-Attribute Ranking Technique (SMART) to support the housing evaluation. To evaluate the validity of the proposed approach, an empirical study was conducted with a real estate agency. Based on the customer requirements and preferences, the proposed approach can identify better candidate houses by considering the overall house attributes and surrounding facilities.
Keywords: decision support system, real estate, decision analysis, housing evaluation, SMART
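A minimal sketch of the SMART scoring step, normalising each attribute, weighting it by the buyer's preference and ranking by weighted sum, is given below; the attributes, weights and candidate houses are illustrative placeholders.

```python
# Minimal SMART sketch: normalise each attribute to [0, 1], weight it by the buyer's
# preference, and rank candidates by the weighted sum. All values are illustrative.
weights = {"price": 0.35, "distance_to_school": 0.25, "floor_area": 0.25, "park_nearby": 0.15}
benefit = {"price": False, "distance_to_school": False, "floor_area": True, "park_nearby": True}
houses = {
    "A": {"price": 320_000, "distance_to_school": 1.2, "floor_area": 95, "park_nearby": 1},
    "B": {"price": 280_000, "distance_to_school": 2.5, "floor_area": 80, "park_nearby": 0},
    "C": {"price": 350_000, "distance_to_school": 0.8, "floor_area": 110, "park_nearby": 1},
}

scores = {}
for name, attrs in houses.items():
    total = 0.0
    for attr, w in weights.items():
        values = [h[attr] for h in houses.values()]
        lo, hi = min(values), max(values)
        norm = (attrs[attr] - lo) / (hi - lo) if hi > lo else 1.0
        if not benefit[attr]:                 # cost-type attribute: smaller is better
            norm = 1.0 - norm
        total += w * norm
    scores[name] = total

print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))   # best candidate first
```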
5299 SVM-Based Detection of SAR Images in Partially Developed Speckle Noise
Authors: J. P. Dubois, O. M. Abdul-Latif
Abstract:
Support Vector Machine (SVM) is a statistical learning tool that was initially developed by Vapnik in 1979 and later extended into the more complex concept of structural risk minimization (SRM). SVM is playing an increasing role in applications to detection problems in various engineering areas, notably in statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, SVM was applied to the detection of SAR (synthetic aperture radar) images in the presence of partially developed speckle noise. The simulation was done for single-look and multi-look speckle models to give a complete overview of and insight into the newly proposed model of the SVM-based detector. The structure of the SVM was derived and applied to real SAR images, and its performance in terms of the mean square error (MSE) metric was calculated. We showed that the SVM-detected SAR images have a very low MSE and are of good quality. The quality of the processed speckled images improved for the multi-look model. Furthermore, the contrast of the SVM-detected images was higher than that of the original non-noisy images, indicating that the SVM approach increased the distance between the pixel reflectivity levels (the detection hypotheses) in the original images.
Keywords: Least Square-Support Vector Machine, Synthetic Aperture Radar, Partially Developed Speckle, Multi-Look Model.
5298 A Comparative Study of Additive and Nonparametric Regression Estimators and Variable Selection Procedures
Authors: Adriano Z. Zambom, Preethi Ravikumar
Abstract:
One of the biggest challenges in nonparametric regression is the curse of dimensionality. Additive models are known to overcome this problem by estimating only the individual additive effects of each covariate. However, if the model is misspecified, the accuracy of the estimator compared to the fully nonparametric one is unknown. In this work the efficiency of completely nonparametric regression estimators such as the Loess is compared to estimators that assume additivity in several situations, including additive and non-additive regression scenarios. The comparison is done by computing the oracle mean square error of the estimators with regard to the true nonparametric regression function. Then, a backward elimination selection procedure based on the Akaike Information Criterion is proposed, which is computed from either the additive or the nonparametric model. Simulations show that if the additive model is misspecified, the percentage of time it fails to select important variables can be higher than that of the fully nonparametric approach. A dimension reduction step is included when the nonparametric estimator cannot be computed due to the curse of dimensionality. Finally, the Boston housing dataset is analyzed using the proposed backward elimination procedure and the selected variables are identified.
Keywords: Additive models, local polynomial regression, residuals, mean square error, variable selection.
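A minimal sketch of the backward-elimination idea applied to a fully nonparametric (Nadaraya-Watson) estimator is shown below, using leave-one-out MSE as a simple stand-in for the paper's AIC-based criterion; the data and bandwidth are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n, h = 200, 0.4
X = rng.uniform(-1, 1, size=(n, 4))                         # covariates 2 and 3 are irrelevant
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.2, n)

def loo_mse(X, y, cols, h):
    """Leave-one-out MSE of a Nadaraya-Watson estimator using only the selected columns."""
    Z = X[:, cols]
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    K = np.exp(-d2 / (2 * h ** 2))                          # Gaussian product kernel
    np.fill_diagonal(K, 0.0)                                # leave each point out of its own fit
    yhat = (K @ y) / K.sum(axis=1)
    return np.mean((y - yhat) ** 2)

selected = list(range(X.shape[1]))
while len(selected) > 1:
    current = loo_mse(X, y, selected, h)
    trials = {c: loo_mse(X, y, [s for s in selected if s != c], h) for c in selected}
    worst = min(trials, key=trials.get)                     # variable whose removal helps most
    if trials[worst] >= current:
        break                                               # no removal improves the criterion
    selected.remove(worst)

print("selected covariates:", selected)                     # ideally [0, 1]
```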
5297 Capacity Flexibility within Production
Authors: Johannes Nywlt, Julian Becker, Sebastian Bertsch
Abstract:
Due to the high dynamics of current markets, the expectations regarding logistics increase steadily. However, the complexity and variety of products and production processes make it difficult to understand the interdependencies between logistical objectives and their determining factors. Therefore, specific models are needed to meet this challenge. The Logistic Operating Curves Theory is such a model. With its aid, the basic correlations between the logistic objectives can be described. Within this model, capacity flexibility represents an important parameter. However, a proper mathematical description of this parameter is still missing. In this paper such a description is developed in order to make the Logistic Operating Curves Theory more accurate.
Keywords: Capacity flexibility, Production controlling, Production logistics, Production management.
5296 Model Predictive Control of Turbocharged Diesel Engine with Exhaust Gas Recirculation
Authors: U. Yavas, M. Gokasan
Abstract:
Control of a diesel engine's air path has drawn a lot of attention due to its multi-input multi-output, closely coupled, non-linear nature. Today, precise control of the amount of air to be combusted is a must in order to meet tight emission limits and performance targets. In this study, a passenger-car-size diesel engine is modeled in AVL Boost RT and then simulated with standard, industry-level PID controllers. Finally, a linear model predictive controller is designed and simulated. This study shows the importance of modeling and control of diesel engines with flexible algorithm development in computer-based systems.
Keywords: Predictive control, engine control, engine modeling, PID control, feedforward compensation.
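A minimal linear MPC sketch in the spirit of the described air-path controller is given below: a small discrete-time linear model, a quadratic tracking cost over a finite horizon and input constraints solved as a QP with cvxpy; the two-state model and all numbers are illustrative stand-ins, not the AVL Boost RT engine model.

```python
import numpy as np
import cvxpy as cp

# Illustrative 2-state discrete-time linear model standing in for an identified air-path
# model (e.g. boost pressure and EGR rate responding to an actuator command).
A = np.array([[0.9, 0.1], [0.0, 0.8]])
B = np.array([[0.0], [0.5]])
x0 = np.array([0.0, 0.0])
x_ref = np.array([1.0, 0.5])                 # set-points to track
N, u_max = 20, 1.0                           # prediction horizon and actuator limit
Q, R = np.diag([10.0, 10.0]), np.array([[1.0]])

x = cp.Variable((2, N + 1))
u = cp.Variable((1, N))
cost, constraints = 0, [x[:, 0] == x0]
for k in range(N):
    cost += cp.quad_form(x[:, k + 1] - x_ref, Q) + cp.quad_form(u[:, k], R)
    constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                    cp.abs(u[:, k]) <= u_max]

cp.Problem(cp.Minimize(cost), constraints).solve()
print("first control move:", u.value[:, 0])   # only the first move is applied, then re-solve
```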