Search results for: Markov chains
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 232


142 Production Throughput Modeling under Five Uncertain Variables Using Bayesian Inference

Authors: Amir Azizi, Amir Yazid B. Ali, Loh Wei Ping

Abstract:

Throughput is an important measure of the performance of a production system. Analyzing and modeling production throughput is complex in today's dynamic production systems because of the uncertainties they face. The main reasons are that uncertainties materialize when the production line encounters changes in setup time, machinery breakdowns, manufacturing lead time, and scrap. In addition, demand fluctuates from time to time for each product type. These uncertainties affect production performance. This paper proposes Bayesian inference for throughput modeling under five production uncertainties. The Bayesian model uses prior distributions that encode previous information about the uncertainties, while the likelihood distributions are associated with the observed data. The Gibbs sampling algorithm, a robust Markov chain Monte Carlo procedure, was employed to sample the unknown parameters and estimate the posterior means of the uncertainties. The Bayesian model was validated with respect to the convergence and efficiency of its outputs. The results show that the proposed Bayesian models were capable of predicting the production throughput with an accuracy of 98.3%.
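A rough sketch of the Gibbs-sampling step described above is given below; it alternately draws the mean and variance of a single uncertain variable (here, a hypothetical setup-time sample) from their conditional posteriors under a conjugate normal model. The data, priors, and variable names are illustrative assumptions and do not reproduce the authors' actual five-variable model.

import numpy as np

rng = np.random.default_rng(0)
setup_times = rng.normal(12.0, 2.0, size=50)   # hypothetical observed setup times (minutes)

# Assumed conjugate priors: mu ~ N(mu0, tau0^2), sigma^2 ~ Inverse-Gamma(a0, b0)
mu0, tau0_sq, a0, b0 = 10.0, 25.0, 2.0, 2.0
n, ybar = len(setup_times), setup_times.mean()

mu, sigma_sq = ybar, setup_times.var()
draws = []
for _ in range(5000):
    # Draw mu given sigma^2 and the data (normal conditional posterior)
    prec = 1.0 / tau0_sq + n / sigma_sq
    mu = rng.normal((mu0 / tau0_sq + n * ybar / sigma_sq) / prec, np.sqrt(1.0 / prec))
    # Draw sigma^2 given mu and the data (inverse-gamma conditional posterior)
    b_n = b0 + 0.5 * np.sum((setup_times - mu) ** 2)
    sigma_sq = 1.0 / rng.gamma(a0 + n / 2.0, 1.0 / b_n)
    draws.append(mu)

print("posterior mean of setup time:", round(float(np.mean(draws[1000:])), 2), "minutes")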

Keywords: Bayesian inference, Uncertainty modeling, Monte Carlo Markov chain, Gibbs sampling, Production throughput

141 Classification of State Transition by Using a Microwave Doppler Sensor for Wandering Detection

Authors: K. Shiba, T. Kaburagi, Y. Kurihara

Abstract:

With global aging, the number of people who require care, such as people with dementia (PwD), is increasing in many developed countries. PwD may wander and unconsciously set foot outdoors, which can lead to serious accidents, such as traffic accidents. Round-the-clock monitoring by caregivers is therefore necessary, which can be a burden for the caregivers. An automatic wandering detection system is thus required: when an elderly person wanders outdoors, the detection system should report a ‘moving’ state followed by an ‘absence’ state. In this paper, we focus on the transition from the ‘resting’ state to the ‘absence’ state via the ‘moving’ state as one of the wandering transitions. To capture the transitions among the three states, we build a method based on the hidden Markov model (HMM). In our method, the constraint that the ‘resting’ and ‘absence’ states cannot transition directly to each other is applied. To validate the method, we conducted an experiment with 10 subjects. Our results show that the method can classify the three states with an accuracy of 0.92.
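A minimal sketch of how such a transition constraint could be encoded is given below: the ‘resting’/‘absence’ transitions are given zero probability in the transition matrix and the most likely state sequence is recovered with the Viterbi algorithm. The probabilities and the discretized Doppler observations are illustrative assumptions, not the values used in the paper.

import numpy as np

states = ["resting", "moving", "absence"]
# Transition matrix with the constraint: resting <-> absence is forbidden (probability 0)
A = np.array([[0.90, 0.10, 0.00],
              [0.10, 0.80, 0.10],
              [0.00, 0.10, 0.90]])
# Emission probabilities over 3 discretized Doppler-signal levels (illustrative values)
B = np.array([[0.7, 0.2, 0.1],
              [0.2, 0.3, 0.5],
              [0.8, 0.1, 0.1]])
pi = np.array([0.8, 0.1, 0.1])
obs = [0, 1, 2, 2, 0]  # hypothetical observation sequence

def viterbi(obs, pi, A, B):
    n, T = len(pi), len(obs)
    delta = np.zeros((T, n)); psi = np.zeros((T, n), dtype=int)
    delta[0] = np.log(pi + 1e-12) + np.log(B[:, obs[0]] + 1e-12)
    for t in range(1, T):
        for j in range(n):
            cand = delta[t - 1] + np.log(A[:, j] + 1e-12)
            psi[t, j] = np.argmax(cand)
            delta[t, j] = cand[psi[t, j]] + np.log(B[j, obs[t]] + 1e-12)
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):
        path.insert(0, psi[t, path[0]])
    return [states[s] for s in path]

print(viterbi(obs, pi, A, B))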

Keywords: Wander, microwave Doppler sensor, respiratory frequency band, the state transition, hidden Markov model.

140 Genetic-Based Multi Resolution Noisy Color Image Segmentation

Authors: Raghad Jawad Ahmed

Abstract:

Segmentation of a color image composed of different kinds of regions can be a hard problem, in particular when exact texture fields must be computed and the optimum number of segmentation areas must be decided for an image containing similar and/or non-stationary texture fields. A novel neighborhood-based segmentation approach is proposed. A genetic algorithm is used in the proposed segment-pass optimization process; in this pass, an energy function defined on Markov random fields is minimized. In this paper we use an adaptive threshold estimation method for image thresholding in the wavelet domain based on generalized Gaussian distribution (GGD) modeling of subband coefficients. This method, called NormalShrink, is computationally efficient and adaptive because the parameters required for estimating the threshold depend on the subband data energy used in the pre-segmentation stage. A quadtree is employed to implement the multiresolution framework, which enables the use of different strategies at different resolution levels and hence accelerates the computation. The experimental results using the proposed segmentation approach are very encouraging.
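For reference, the NormalShrink threshold is commonly stated in the wavelet-denoising literature in the following form (quoted here as general background rather than from the paper itself), where \hat{\sigma}^2 is the noise variance estimated from the finest diagonal subband HH_1 and \hat{\sigma}_y is the standard deviation of the subband being thresholded:

T_N = \frac{\beta\,\hat{\sigma}^2}{\hat{\sigma}_y}, \qquad
\beta = \sqrt{\log\!\left(\frac{L_k}{J}\right)}, \qquad
\hat{\sigma} = \frac{\operatorname{median}(|Y_{ij}|)}{0.6745}, \; Y_{ij} \in HH_1,

with L_k the length of the subband at the k-th scale and J the total number of wavelet decomposition levels.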

Keywords: Color image segmentation, Genetic algorithm, Markov random field, Scale space filter.

139 Using the Technology-Organization-Environment Framework and Zuboff’s Concepts for Understanding Environmental Sustainability and RFID: Two Case Studies

Authors: Rebecca Angeles

Abstract:

Radio frequency identification (RFID) has been recognized as a key enabler of efficient and effective supply chains. Recently, with increasing concern for environmental sustainability, researchers and practitioners have been exploring the role of RFID in supporting “green supply chains.” This qualitative study uses the technology-organization-environment framework of Tornatzky and Fleischer, and Zuboff’s concepts of automating-informating-transformating in analyzing two case studies involving RFID use: the recycling of Hewlett Packard inkjet printers and the garbage and recycling program of the City of Grand Rapids, Michigan.

Keywords: Environmental sustainability, green supply chain management, radio frequency identification, technology-organization-environment framework, Zuboff’s automate-informate-transformate concepts.

138 Business Intelligence and Strategic Decision Simulation

Authors: S. Sabbour, H. Lasi, P. von Tessin

Abstract:

The purpose of this study is two-fold. First, it attempts to explore potential opportunities for utilizing visual interactive simulations along with Business Intelligence (BI) as a decision support tool for strategic decision making. Second, it tries to identify the essential top-level managerial requirements that would transform strategic decision simulation into an integral component of BI systems. The domain of particular interest was the application of visual interactive simulation capabilities in the field of supply chains. A qualitative exploratory method was applied through interviews with two leading companies. The collected data were then analysed to demonstrate the difference between the literature perspective and the practical managerial perspective on the issue. The results of the study suggest that although the use of simulation, particularly in managing supply chains, is very evident in the literature, in practice such utilization is still in its infancy, particularly regarding strategic decisions. Based on these insights, a prototype of a simulation-based BI solution extension was developed and evaluated.

Keywords: Business Intelligence, decision support, strategic decisions, simulation, SCM.

137 Scientific Production on Lean Supply Chains Published in Journals Indexed by SCOPUS and Web of Science Databases: A Bibliometric Study

Authors: T. Botelho de Sousa, F. Raphael Cabral Furtado, O. Eduardo da Silva Ferri, A. Batista, W. Augusto Varella, C. Eduardo Pinto, J. Mimar Santa Cruz Yabarrena, S. Gibran Ruwer, F. Müller Guerrini, L. Adalberto Philippsen Júnior

Abstract:

Lean Supply Chain Management (LSCM) is an emerging research field in Operations Management (OM). As a strategic model that focuses on reducing cost and waste while fulfilling the needs of customers, LSCM attracts great interest among researchers and practitioners. The purpose of this paper is to present an overview of the Lean Supply Chain literature, based on a bibliometric analysis of 57 papers published in journals indexed by the SCOPUS and/or Web of Science databases. The results indicate that the last three years (2015, 2016, and 2017) were the most productive for the LSCM discussion, especially in the Supply Chain Management and International Journal of Lean Six Sigma journals. India, the USA, and the UK are the most productive countries; moreover, social network analysis detected cross-country studies conducted through collaboration among researchers as a research practice that appears to play an increasingly important role in LSCM studies. Despite existing limitations, such as the restricted indexed journal database, the bibliometric analysis helps to illuminate ongoing efforts in LSCM research, including the most used technical procedures and the collaboration network, and reveals important research gaps, especially for researchers from developing countries.

Keywords: Lean supply chains, bibliometric study, SCOPUS, Web of Science.

136 A Dataset of Program Educational Objectives Mapped to ABET Outcomes: Data Cleansing, Exploratory Data Analysis and Modeling

Authors: Addin Osman, Anwar Ali Yahya, Mohammed Basit Kamal

Abstract:

Datasets or collections are becoming important assets by themselves, and they can now be accepted as a primary intellectual output of research. The quality and usefulness of a dataset depend mainly on the context under which it has been collected, processed, analyzed, validated, and interpreted. This paper presents a collection of program educational objectives mapped to student outcomes, collected from self-study reports prepared by 32 engineering programs accredited by ABET. The manual mapping (classification) of these data is a notoriously tedious, time-consuming process that, in addition, requires experts in the area, who are often not available. The operational settings under which the collection was produced are described. The collection has been cleansed and preprocessed, some features have been selected, and preliminary exploratory data analysis has been performed so as to illustrate the properties and usefulness of the collection. Finally, the collection has been benchmarked using nine of the most widely used supervised multi-label classification techniques (Binary Relevance, Label Powerset, Classifier Chains, Pruned Sets, Random k-label sets, Ensemble of Classifier Chains, Ensemble of Pruned Sets, Multi-Label k-Nearest Neighbors, and Back-Propagation Multi-Label Learning). The techniques have been compared to each other using five well-known measures, including Accuracy, Hamming Loss, Micro-F, and Macro-F. The Ensemble of Classifier Chains and Ensemble of Pruned Sets achieved encouraging performance compared to the other multi-label classification methods tested, while the Classifier Chains method showed the worst performance. To recap, the benchmark has achieved promising results by utilizing the preliminary exploratory data analysis performed on the collection, proposing new trends for research and providing a baseline for future studies.
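A minimal sketch of one of the benchmarked techniques, Classifier Chains, together with two of the reported measures is shown below using scikit-learn; the TF-IDF features, the base estimator, and the toy objective texts and outcome labels are assumptions for illustration and do not reproduce the paper's setup.

import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multioutput import ClassifierChain
from sklearn.metrics import hamming_loss, f1_score

# Hypothetical program educational objectives with multi-label ABET outcome tags
texts = ["apply mathematics and engineering principles",
         "communicate effectively in written and oral form",
         "design a system to meet desired needs",
         "work effectively in multidisciplinary teams"]
Y = np.array([[1, 0, 0],   # columns stand for three illustrative outcomes only
              [0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]])

X = TfidfVectorizer().fit_transform(texts)
chain = ClassifierChain(LogisticRegression(max_iter=1000), order=None, random_state=0)
chain.fit(X, Y)

pred = chain.predict(X)
print("Hamming loss:", hamming_loss(Y, pred))
print("Micro-F1:", f1_score(Y, pred, average="micro"))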

Keywords: Benchmark collection, program educational objectives, student outcomes, ABET, Accreditation, machine learning, supervised multiclass classification, text mining.

135 An Efficient Motion Recognition System Based on LMA Technique and a Discrete Hidden Markov Model

Authors: Insaf Ajili, Malik Mallem, Jean-Yves Didier

Abstract:

Interest in human motion recognition has increased greatly in recent years due to its importance in a wide range of applications, such as human-computer interaction, intelligent surveillance, augmented reality, content-based video compression and retrieval, etc. However, it is still regarded as a challenging task, especially in realistic scenarios. It can be seen as a general machine learning problem that requires an effective human motion representation and an efficient learning method. In this work, we introduce a descriptor based on the Laban Movement Analysis technique, a formal and universal language for human movement, to capture both quantitative and qualitative aspects of movement. We use the Discrete Hidden Markov Model (DHMM) for training and classifying motions. We improve the classification algorithm by proposing two DHMMs for each motion class to process the motion sequence in two different directions, forward and backward. This modification helps avoid the misclassification that can occur when recognizing similar motions. Two experiments are conducted. In the first one, we evaluate our method on a public dataset, the Microsoft Research Cambridge-12 Kinect gesture dataset (MSRC-12), which is widely used for evaluating action/gesture recognition methods. In the second experiment, we build a dataset composed of 10 gestures (introduce yourself, wave, dance, move, turn left, turn right, stop, sit down, increase velocity, decrease velocity) performed by 20 persons. The evaluation of the system includes testing the efficiency of our LMA-based descriptor vector with the basic DHMM method and comparing the recognition results of the modified DHMM with those of the original one. Experimental results demonstrate that our method outperforms most existing methods on the MSRC-12 dataset and achieves a near-perfect classification rate on our own dataset.
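To make the two-direction idea concrete, the sketch below scores a discrete observation sequence with a forward model and a second model applied to the reversed sequence, then sums the log-likelihoods per class. The parameters are toy values and the training step (e.g., Baum-Welch) is omitted, so this is only an illustrative assumption about how such scoring could be wired together, not the authors' implementation.

import numpy as np

def log_likelihood(obs, pi, A, B):
    """Forward algorithm in log space for a discrete HMM."""
    alpha = np.log(pi) + np.log(B[:, obs[0]])
    for o in obs[1:]:
        alpha = np.logaddexp.reduce(alpha[:, None] + np.log(A), axis=0) + np.log(B[:, o])
    return np.logaddexp.reduce(alpha)

# Toy parameters for two gesture classes; in practice the forward and backward
# models would be trained separately -- here they share the same illustrative values.
wave = (np.array([0.9, 0.1]),
        np.array([[0.7, 0.3], [0.4, 0.6]]),
        np.array([[0.8, 0.2], [0.3, 0.7]]))
stop = (np.array([0.5, 0.5]),
        np.array([[0.6, 0.4], [0.5, 0.5]]),
        np.array([[0.2, 0.8], [0.9, 0.1]]))
classes = {"wave": (wave, wave), "stop": (stop, stop)}   # (forward model, backward model)

seq = [0, 0, 1, 1, 0]  # hypothetical quantized LMA descriptor sequence
scores = {}
for name, (fwd, bwd) in classes.items():
    scores[name] = (log_likelihood(seq, *fwd)
                    + log_likelihood(seq[::-1], *bwd))  # backward model sees reversed sequence
print(max(scores, key=scores.get), scores)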

Keywords: Human Motion Recognition, Motion representation, Laban Movement Analysis, Discrete Hidden Markov Model.

134 Advances in Artificial Intelligence Using Speech Recognition

Authors: Khaled M. Alhawiti

Abstract:

This research study presents a retrospective study of speech recognition systems and artificial intelligence. Speech recognition has become one of the most widely used technologies, as it offers a great opportunity to interact and communicate with automated machines. More precisely, speech recognition facilitates its users and helps them perform their daily routine tasks in a more convenient and effective manner. This research presents an illustration of recent technological advancements associated with artificial intelligence. Recent research has revealed that the decoding of speech is the foremost issue affecting speech recognition. In order to overcome this issue, different statistical models have been developed by researchers. Some of the most prominent statistical models include the acoustic model (AM), the language model (LM), the lexicon model, and hidden Markov models (HMM). The research will help in understanding all of these statistical models of speech recognition. Researchers have also formulated different decoding methods, which are utilized for realistic decoding tasks and constrained artificial languages. These decoding methods include pattern recognition, acoustic-phonetic, and artificial intelligence approaches. It has been recognized that artificial intelligence is the most efficient and reliable of the methods used in speech recognition.
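The statistical models listed above are usually combined through the standard Bayes decision rule for speech recognition, stated here as general background rather than as a formula taken from this paper: the acoustic model supplies P(X | W) and the language model supplies P(W),

\hat{W} = \arg\max_{W} P(W \mid X) = \arg\max_{W} P(X \mid W)\, P(W),

and the decoder searches over word sequences W for the one maximizing this product given the observed acoustic features X.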

Keywords: Speech recognition, acoustic phonetic, artificial intelligence, Hidden Markov Models (HMM), statistical models of speech recognition, human machine performance.

133 Speaker Independent Quranic Recognizer Based on Maximum Likelihood Linear Regression

Authors: Ehab Mourtaga, Ahmad Sharieh, Mousa Abdallah

Abstract:

An automatic speech recognition system for the formal Arabic language is needed. The Quran is the most formal spoken book in Arabic, and it is spoken all over the world. In this research, a speaker-independent automatic speech recognizer for Quranic speech was developed and tested. The system was developed based on the tri-phone Hidden Markov Model and Maximum Likelihood Linear Regression (MLLR). MLLR computes a set of transformations that reduce the mismatch between an initial model set and the adaptation data. It uses a regression class tree and estimates a set of linear transformations for the mean and variance parameters of a Gaussian mixture HMM system. The 30th chapter of the Quran, read by five of the most famous readers of the Quran, was used for training and testing. The chapter includes about 2000 distinct words. The advantages of using the Quranic verses as the database for this recognizer are the uniqueness of the words and the high level of orderliness between verses. The accuracy on the test data ranged from 68% to 85%.
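As background on the MLLR adaptation mentioned above (a standard formulation, not reproduced from the paper), each Gaussian mean \mu is adapted by a regression-class-specific affine transform estimated by maximizing the likelihood of the adaptation data:

\hat{\mu} = A\mu + b = W\xi, \qquad \xi = \begin{bmatrix} 1 \\ \mu \end{bmatrix},

where W = [\,b \;\; A\,] is a d x (d+1) transformation matrix shared by all Gaussians assigned to the same node of the regression class tree.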

Keywords: Hidden Markov Model (HMM), Maximum Likelihood Linear Regression (MLLR), Quran, Regression Class Tree, Speech Recognition, Speaker-independent.

132 A Model for Business Network Governance: Case Study in the Pharmaceutical Industry

Authors: Emil Crişan, Matthias Klumpp

Abstract:

This paper discusses the theory behind the existence of an idealistic model for business network governance and uses a clarifying case study containing governance structures and processes within a business network framework. The case study, from a German pharmaceutical industry company, complements existing literature by providing a comprehensive explanation of the relations between supply chains and business networks, and also between supply chain management and business network governance. Supply chains and supply chain management are only one side of interorganizational relationships and ensure short-term performance, while real-world governance structures are needed to ensure the long-term existence of a supply chain. Within this context, a comprehensive model for business network governance is presented. An interesting finding from the case study is that multiple business network governance systems co-exist within the evaluated supply chain.

Keywords: Business network, pharmaceutical industry, supply chain governance, supply chain management.

131 Comparison among Various Question Generations for Decision Tree Based State Tying in Persian Language

Authors: Nasibeh Nasiri, Dawood Talebi Khanmiri

Abstract:

The performance of any continuous speech recognition system is highly dependent on the performance of its acoustic models. Generally, the development of robust spoken language technology relies on the availability of large amounts of data. A common way to cope with limited data for training each state of the Markov models is tree-based state tying. This tying method applies contextual questions to tie states. The manual procedure for question generation suffers from human errors and is time consuming, so various automatically generated questions are used to construct the decision tree. There are three approaches to generating questions for constructing HMMs based on a decision tree: one approach is based on misrecognized phonemes, another uses a feature table, and the third is based on state distributions corresponding to context-independent subword units. In this paper, all these methods of automatic question generation are applied to the decision tree on the FARSDAT corpus in the Persian language, and their results are compared with those of manually generated questions. The results show that automatically generated questions yield much better results and can replace manually generated questions in the Persian language.
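For context, the split criterion typically used in tree-based state tying (a standard formulation, not quoted from this paper) approximates the log likelihood of a pool of tied states S by a single Gaussian and picks the question q that maximizes the likelihood gain:

L(S) \approx -\tfrac{1}{2}\,\Gamma(S)\,\bigl(d\,(1+\ln 2\pi) + \ln\lvert\Sigma(S)\rvert\bigr),
\qquad
\Delta_q = L(S_{\mathrm{yes}}(q)) + L(S_{\mathrm{no}}(q)) - L(S),

where \Gamma(S) is the total state occupancy of the pool, d the feature dimension, and \Sigma(S) the pooled covariance; the candidate questions are exactly those generated manually or automatically as described above.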

Keywords: Decision Tree, Markov Models, Speech Recognition, State Tying.

130 Analysis of Linguistic Disfluencies in Bilingual Children’s Discourse

Authors: Sheena Christabel Pravin, M. Palanivelan

Abstract:

Speech disfluencies are common in spontaneous speech. The primary purpose of this study was to distinguish linguistic disfluencies from stuttering disfluencies in bilingual Tamil–English (TE) speaking children. The secondary purpose was to determine whether their disfluencies are mediated by native-language dominance and/or an early onset of developmental stuttering in childhood. A detailed study was carried out to identify the prosodic and acoustic features that uniquely represent the disfluent regions of speech. This paper focuses on statistical modeling of repetitions, prolongations, pauses, and interjections in a speech corpus encompassing bilingual spontaneous utterances from school-going children in English and Tamil. Two classifiers, the Hidden Markov Model (HMM) and the Multilayer Perceptron (MLP), a class of feed-forward artificial neural network, were compared in the classification of disfluencies. The results of the classifiers document the patterns of disfluency in spontaneous speech samples of school-aged children and distinguish between Children Who Stutter (CWS) and Children with Language Impairment (CLI). The ability of the models to classify the disfluencies was measured in terms of F-measure, Recall, and Precision.
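A minimal sketch of the MLP side of such a comparison is shown below with scikit-learn, using synthetic feature vectors in place of the prosodic/acoustic features; the feature dimension, labels, and network size are assumptions, and the HMM classifier is omitted.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import precision_recall_fscore_support

rng = np.random.default_rng(0)
# Synthetic 12-dimensional prosodic/acoustic feature vectors; labels: 0 = CWS, 1 = CLI
X = rng.normal(size=(200, 12))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
mlp = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
mlp.fit(X_tr, y_tr)

p, r, f, _ = precision_recall_fscore_support(y_te, mlp.predict(X_te), average="binary")
print(f"Precision={p:.2f}  Recall={r:.2f}  F-measure={f:.2f}")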

Keywords: Bilingual, children who stutter, children with language impairment, Hidden Markov Models, multi-layer perceptron, linguistic disfluencies, stuttering disfluencies.

129 Analysis of Linked in Series Servers with Blocking, Priority Feedback Service and Threshold Policy

Authors: Walenty Oniszczuk

Abstract:

The use of buffer thresholds, blocking, and adequate service strategies is a well-known technique for congestion control of computer network traffic. This motivates the study of series queues with blocking, feedback (service under the Head-of-Line (HoL) priority discipline), and finite-capacity buffers with thresholds. In this paper, the external traffic is modelled using a Poisson process and the service times are modelled using the exponential distribution. We consider a three-station network with two finite buffers, for which a set of thresholds (tm1 and tm2) is defined. This computer network behaves as follows. A task that finishes its service at station B is sent back to station A for re-processing with probability o. When the number of tasks in the second buffer exceeds the threshold tm2 and the number of tasks in the first buffer is less than tm1, the fed-back task is served under the HoL priority discipline. In the opposite case, for fed-back tasks, a “no two priority services in succession” procedure (preventing a possible overflow in the first buffer) is applied. Using an open Markovian queueing schema with blocking, priority feedback service, and thresholds, a closed-form, cost-effective analytical solution is obtained. The model of servers linked in series is very accurate. It is derived directly from a two-dimensional state graph and a set of steady-state equations, followed by calculations of the main measures of effectiveness. Consequently, efficient expressions of low computational cost are determined. Based on numerical experiments and the collected results, we conclude that the proposed model with blocking, feedback, and thresholds can provide accurate performance estimates for networks of servers linked in series.
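As a generic illustration of the steady-state calculation mentioned above (not the authors' actual two-dimensional state graph), the sketch below solves pi Q = 0 with the normalization sum(pi) = 1 for a small continuous-time Markov chain and derives a simple performance measure from it; the generator matrix is a made-up example.

import numpy as np

# Illustrative generator matrix Q of a small CTMC (rows sum to zero); in a real
# threshold model the states would encode, e.g., (buffer-1 length, buffer-2 length).
Q = np.array([[-1.0,  1.0,  0.0],
              [ 0.5, -1.5,  1.0],
              [ 0.0,  2.0, -2.0]])

n = Q.shape[0]
# Solve pi Q = 0 subject to sum(pi) = 1 by replacing one balance equation
A = np.vstack([Q.T[:-1, :], np.ones(n)])
b = np.zeros(n); b[-1] = 1.0
pi = np.linalg.solve(A, b)

print("steady-state probabilities:", pi)
print("mean number in the 'queue' (toy measure):", float(np.dot(pi, np.arange(n))))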

Keywords: Blocking, Congestion control, Feedback, Markov chains, Performance evaluation, Threshold-base networks.

128 Event Information Extraction System (EIEE): FSM vs HMM

Authors: Shaukat Wasi, Zubair A. Shaikh, Sajid Qasmi, Hussain Sachwani, Rehman Lalani, Aamir Chagani

Abstract:

Automatic extraction of event information from social text streams (emails, social network sites, blogs, etc.) is a vital requirement for many applications, such as event planning and management systems and security applications. The key information components needed from event-related text are the event title, location, participants, date, and time. Emails have very distinct characteristics compared with other social text streams in terms of layout, format, and conversation style, and they are the most commonly used communication channel for broadcasting and planning events; therefore, we have chosen emails as our dataset. In our work, we have employed two statistical NLP methods, Finite State Machines (FSM) and the Hidden Markov Model (HMM), for the extraction of event-related contextual information. An application has been developed providing a comparison between the two methods on the event extraction task. It comprises two modules, one for each method, and works for both bulk and direct user input. The results are evaluated using Precision, Recall, and F-Score. Experiments show that both methods achieve high performance and accuracy; however, HMM performed better on title extraction, while FSM proved to be better for venue, date, and time.
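A minimal sketch of the finite-state side of such an extractor is given below: regular expressions (which compile to finite-state machines) pull candidate date and time strings out of an email body. The patterns and the sample email are illustrative assumptions and cover only a tiny subset of what the described system would handle.

import re

email_body = """Hi all,
Let's hold the project kickoff on 12/05/2023 at 10:30 AM
in the main conference room. Please confirm.
"""

# Simple finite-state patterns (regexes) for dates and times -- illustrative only
DATE_RE = re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b")
TIME_RE = re.compile(r"\b\d{1,2}:\d{2}\s?(?:AM|PM|am|pm)?\b")

event = {
    "date": DATE_RE.findall(email_body),
    "time": TIME_RE.findall(email_body),
}
print(event)   # {'date': ['12/05/2023'], 'time': ['10:30 AM']}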

Keywords: Emails, Event Extraction, Event Detection, Finite state machines, Hidden Markov Model.

127 Jeffrey's Prior for Unknown Sinusoidal Noise Model via Cramer-Rao Lower Bound

Authors: Samuel A. Phillips, Emmanuel A. Ayanlowo, Rasaki O. Olanrewaju, Olayode Fatoki

Abstract:

This paper employs the Jeffrey's prior technique in estimating the periodogram and frequency of a sinusoidal model for unknown noisy time-variant or oscillating events (data) in a Bayesian setting. The non-informative Jeffrey's prior was adopted for the posterior trigonometric function of the sinusoidal model, such that Cramer-Rao Lower Bound (CRLB) inference was used to carve out the minimum variance needed to curb the invariance-structure effect for unknown noisy time-observational and repeated circular patterns. An average monthly oscillating temperature series measured in degrees Celsius (°C) from 1901 to 2014 was subjected to the posterior solution of the unknown noisy events of the sinusoidal model via Markov Chain Monte Carlo (MCMC). It was deduced not only that a two-minute period is required to complete a cycle of changing temperature from one particular degree Celsius to another, but also that the sinusoidal model via the CRLB-Jeffrey's prior for unknown noisy events produced a smaller posterior Maximum A Posteriori (MAP) estimate compared to known noisy events.
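For reference, the single-frequency sinusoidal model and the non-informative Jeffreys-type prior usually adopted in this setting can be written as follows (a common textbook formulation, not necessarily the exact parameterization used by the authors):

y_t = A\cos(\omega t) + B\sin(\omega t) + \varepsilon_t, \qquad \varepsilon_t \sim \mathcal{N}(0, \sigma^2), \qquad p(A, B, \sigma) \propto \frac{1}{\sigma},

so the posterior p(A, B, \omega, \sigma \mid y_{1:T}) is explored via MCMC and the frequency \omega is read off from the dominant peak of the periodogram.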

Keywords: Cramer-Rao Lower Bound (CRLB), Jeffrey's prior, Sinusoidal, Maximum A Posteriori (MAP), Markov Chain Monte Carlo (MCMC), Periodograms.

126 Rheological and Thermomechanical Properties of Graphene/ABS/PP Nanocomposites

Authors: Marianna I. Triantou, Konstantina I. Stathi, Petroula A. Tarantili

Abstract:

In the present study, the incorporation of graphene into blends of acrylonitrile-butadiene-styrene terpolymer with polypropylene (ABS/PP) was investigated, focusing on the improvement of their thermomechanical characteristics and the effect on their rheological behavior. The blends were prepared by melt mixing in a twin-screw extruder and were characterized by measuring the MFI as well as by performing DSC, TGA and mechanical tests. The addition of graphene to ABS/PP blends tends to increase their melt viscosity, due to the confinement of polymer chain motion. Also, graphene causes an increase in the crystallization temperature (Tc), especially in blends with higher PP content, because of the reduction of the surface energy of PP nucleation, which is a consequence of the attachment of PP chains to the surface of graphene through the intermolecular CH-π interaction. Moreover, the above nanofiller improves the thermal stability of PP and increases the residue of thermal degradation at all the investigated blend compositions, due to the thermal isolation effect and the mass transport barrier effect. Regarding the mechanical properties, the addition of graphene improves the elastic modulus, because of its intrinsic mechanical characteristics and rigidity, and this effect is particularly strong in the case of pure PP.

Keywords: Acrylonitrile-butadiene-styrene terpolymer, blends, graphene, polypropylene.

125 Brain Image Segmentation Using Conditional Random Field Based On Modified Artificial Bee Colony Optimization Algorithm

Authors: B. Thiagarajan, R. Bremananth

Abstract:

A tumor is an uncontrolled growth of tissue in any part of the body. Tumors are of different types and have different characteristics and treatments. A brain tumor is inherently serious and life-threatening because of its behavior in the limited space of the intracranial cavity (the space formed inside the skull). Locating the tumor within an MR (magnetic resonance) image of the brain is an integral part of the treatment of brain tumors. This segmentation task requires the classification of each voxel as either tumor or non-tumor, based on the description of the voxel under consideration. Many studies in the medical field use Markov Random Fields (MRF) for the segmentation of MR images. Even though the segmentation results are good, computing the probabilities and estimating the parameters is difficult. In order to overcome the aforementioned issues, the Conditional Random Field (CRF) is used in this paper for segmentation, along with a modified artificial bee colony optimization and a modified fuzzy possibility c-means (MFPCM) algorithm. This work is mainly focused on reducing the computational complexities found in existing methods and aims at higher accuracy. The efficiency of this work is evaluated using parameters such as region non-uniformity, correlation and computation time. The experimental results are compared with existing methods such as MRF with an improved Genetic Algorithm (GA) and the MRF-Artificial Bee Colony (MRF-ABC) algorithm.

Keywords: Conditional random field, Magnetic resonance, Markov random field, Modified artificial bee colony.

124 The Impact of Quality Cost on Revenue Sharing in Supply Chain Management

Authors: Fayza Obied-Allah

Abstract:

Meeting customers’ needs and creating quality and value while reducing costs through supply chain management presents challenges and opportunities for companies and researchers. In the light of these challenges, modern ideas must help to counter them and exploit the opportunities. Therefore, this paper discusses the impact of quality cost on revenue sharing as one of the most important incentives for configuring business networks. The paper develops the quality cost approach to align with the modern era and builds a model to measure quality costs that might enable firms to manage revenue sharing in a supply chain. The developed model includes five categories: besides the well-known four categories (prevention costs, appraisal costs, internal failure costs, and external failure costs), a new category, recycle cost, has been developed in this research as a new vision of the relationship between quality costs and innovation in industry. This paper also examines whether such quality costs in supply chains influence the revenue sharing between partners. Using the author's quality cost model, the relationship between quality costs and revenue sharing among partners is examined through a case study of an Egyptian manufacturing company that is part of a supply chain. This paper argues that the revenue-sharing proportion allocated to the supplier increases as the supplier's recycle cost increases, and the revenue-sharing proportion allocated to the manufacturer increases as prevention and appraisal costs increase and as failure costs, the manufacturer's recycle costs, and the suppliers' recycle costs decrease. However, the results present surprising findings. The purposes of this study are to develop the quality cost approach and to understand the relationships between quality costs and revenue sharing in supply chains. Therefore, the present study contributes to theory and practice by explaining how the cost of recycling can be combined with the quality cost model to better understand revenue sharing among partners in supply chains.

Keywords: Quality cost, Recycle cost, Revenue sharing, Supply chain.

123 Comparison of Two Maintenance Policies for a Two-Unit Series System Considering General Repair

Authors: Seyedvahid Najafi, Viliam Makis

Abstract:

In recent years, maintenance optimization has attracted special attention due to the growing complexity of industrial systems. Maintenance costs are high for many systems, and preventive maintenance is effective when it increases operational reliability and safety at a reduced cost. The novelty of this research is to consider general repair in the modeling of multi-unit series systems and to solve the maintenance problem for such systems using the semi-Markov decision process (SMDP) framework. We propose an opportunistic maintenance policy for a series system composed of two main units. Unit 1, which is more expensive than unit 2, is subject to condition monitoring, and its deterioration is modeled using a gamma process. The hazard rate of unit 1 is estimated by the proportional hazards model (PHM), and two hazard rate control limits are considered as the thresholds of maintenance interventions for unit 1. Maintenance is performed on unit 2 considering an age control limit. The objective is to find the optimal control limits and minimize the long-run expected average cost per unit time. The proposed algorithm is applied to a numerical example to compare the effectiveness of the proposed policy (policy Ⅰ) with policy Ⅱ, which is similar to policy Ⅰ but performs replacement instead of general repair. Results show that policy Ⅰ leads to a lower average cost compared with policy Ⅱ.
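For reference, the proportional hazards model used for unit 1 is conventionally written as follows (standard form, with generic symbols rather than the paper's specific estimates), where Z(t) is the condition-monitoring covariate vector:

h(t \mid Z(t)) = h_0(t)\, \exp\bigl(\gamma^{\top} Z(t)\bigr),

with h_0(t) the baseline hazard and \gamma the covariate coefficient vector; a maintenance intervention is triggered when h(t \mid Z(t)) crosses one of the hazard rate control limits mentioned above.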

Keywords: Condition-based maintenance, proportional hazards model, semi-Markov decision process, two-unit series systems.

122 Optimisation of Intermodal Transport Chain of Supermarkets on Isle of Wight, UK

Authors: Jingya Liu, Yue Wu, Jiabin Luo

Abstract:

This work investigates an intermodal transportation system for delivering goods from a Regional Distribution Centre to supermarkets on the Isle of Wight (IOW) via the port of Southampton or Portsmouth in the UK. We consider this integrated logistics chain as a 3-echelon transportation system. In such a system, there are two types of transport methods used to deliver goods across the Solent Channel: one is accompanied transport, which is used by most supermarkets on the IOW, such as Spar, Lidl and Co-operative Food; the other is unaccompanied transport, which is used by Aldi. Five transport scenarios are studied based on different transport modes and ferry routes. The aim is to determine an optimal delivery plan for supermarkets of different business scales on the IOW, in order to minimise the total running cost, fuel consumption and carbon emissions. The problem is modelled as a vehicle routing problem with time windows and solved by a genetic algorithm. The computational results suggest that accompanied transport is more cost-efficient for small and medium business-scale supermarket chains on the IOW, while unaccompanied transport has the potential to improve the efficiency and effectiveness of large business-scale supermarket chains.

Keywords: Genetic algorithm, intermodal transport system, Isle of Wight, optimization, supermarket.

121 MIMO Antenna Selections using CSI from Reciprocal Channel

Authors: P. Uthansakul, K. Attakitmongkol, N. Promsuvana, M. Uthansakul

Abstract:

It is well known that the channel capacity of a Multiple-Input-Multiple-Output (MIMO) system increases as the number of antenna pairs between transmitter and receiver increases, but such a system suffers from the cost of multiple expensive RF chains. To reduce the cost of RF chains, the Antenna Selection (AS) method can offer a good trade-off between expense and performance. In a transmitting AS system, Channel State Information (CSI) feedback is required to choose the best subset of antennas, and the delays and errors occurring in the feedback channel are the most dominant factors degrading the performance of the AS method. This paper presents the concept of an AS method using CSI obtained from channel reciprocity instead of feedback. The reciprocity technique can easily acquire CSI by utilizing the reverse channel, where the forward and reverse channels are considered symmetric in time, frequency and location. In this work, the capacity performance of a MIMO system using the AS method at the transmitter with reciprocal channels is investigated with our own developed testbed. The obtained results show that the reciprocity technique offers capacity close to that of a system with perfect CSI and gains 0.9 to 2.2 bps/Hz higher capacity than a system without the AS method at an SNR of 10 dB.
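As background for the capacity figures quoted above (a standard result, not a formula from this paper), the capacity of a MIMO channel with N_t transmit and N_r receive antennas, channel matrix H and signal-to-noise ratio \rho, under equal power allocation, is

C = \log_2 \det\!\left( I_{N_r} + \frac{\rho}{N_t} H H^{H} \right) \ \text{bps/Hz},

and antenna selection chooses the submatrix of H (i.e., the subset of antennas) that maximizes this expression for the number of available RF chains.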

Keywords: Antenna Selection, Capacity, Channel, Measurement, MIMO, Reciprocity.

120 Thermo-Sensitive Hydrogel: Control of Hydrophilic-Hydrophobic Transition

Authors: Wanwipa Siriwatwechakul, Nutte Teraphongphom, Vatcharani Ngaotheppitak, Sureeporn Kunataned

Abstract:

The study investigated the hydrophilic-to-hydrophobic transition of a modified polyacrylamide hydrogel containing N-isopropylacrylamide (NIPAM). The modification was done by mimicking micellar polymerization, which resulted in a better arrangement of the NIPAM chains in the polyacrylamide network. The degree of NIPAM arrangement is described by the NH number. The hydrophilic-to-hydrophobic transition was measured through the partition coefficient, K, of Orange II and Methylene Blue between the hydrogel and water. These dyes were chosen as models for solutes with different degrees of hydrophobicity. The study showed that hydrogels with higher NH values gave better solubility of both dyes. Moreover, temperatures above the lower critical solution temperature (LCST) of poly(N-isopropylacrylamide) (PNIPAM) also caused the collapse of the NIPAM chains, which creates a more hydrophobic environment that increases the solubility of Methylene Blue and decreases the solubility of Orange II in the hydrogels containing NIPAM.

Keywords: Thermo-sensitive hydrogel, partition coefficient, the lower critical solution temperature (LCST), micellar polymerization.

119 VISMA: A Method for System Analysis in Early Lifecycle Phases

Authors: Walter Sebron, Hans Tschürtz, Peter Krebs

Abstract:

The choice of applicable analysis methods in safety or systems engineering depends on the depth of knowledge about a system and on the respective lifecycle phase. However, the analysis method chain still shows gaps, as it should support system analysis during the lifecycle of a system from a rough concept in the pre-project phase until end-of-life. This paper's goal is to discuss an analysis method, the VISSE Shell Model Analysis (VISMA) method, which aims at closing the gap in the early system lifecycle phases, such as the conceptual or pre-project phase, or the project start phase. It was originally developed to aid in the definition of the system boundary of electronic system parts, such as a control unit for a pump motor. Furthermore, it can also be applied to non-electronic system parts. The VISMA method is a graphical, sketch-like method that stratifies a system and its parts in inner and outer shells, like the layers of an onion. It analyses a system in a two-step approach, from the innermost to the outermost components, followed by the reverse direction. To ensure a complete view of a system and its environment, the VISMA should be performed by (multifunctional) development teams. To introduce the method, a set of rules and guidelines has been defined in order to enable a proper shell build-up. In the first step, the innermost system, named the system under consideration (SUC), is selected; it is the focus of the subsequent analysis. Then, its directly adjacent components, responsible for providing input to and receiving output from the SUC, are identified. These components are the content of the first shell around the SUC. Next, the input and output components of the components in the first shell are identified and form the second shell around the first one. Continuing this way, shell by shell is added with its respective parts until the border of the complete system (external border) is reached. Last, two external shells are added to complete the system view: the environment shell and the use case shell. This system view is also stored for future use. In the second step, the shells are examined in the reverse direction (outside to inside) in order to remove superfluous components or subsystems. Input chains to the SUC, as well as output chains from the SUC, are described graphically via arrows to highlight functional chains through the system. As a result, this method offers a clear and graphical description and overview of a system, its main parts and environment; however, the focus still remains on a specific SUC. It helps to identify the interfaces and interfacing components of the SUC, as well as important external interfaces of the overall system. It supports the identification of the first internal and external hazard causes and causal chains. Additionally, the method promotes a holistic picture and cross-functional understanding of a system, its contributing parts, internal relationships and possible dangers within a multidisciplinary development team.

Keywords: Analysis methods, functional safety, hazard identification, system and safety engineering, system boundary definition, system safety.

118 Feature Based Dense Stereo Matching using Dynamic Programming and Color

Authors: Hajar Sadeghi, Payman Moallem, S. Amirhassn Monadjemi

Abstract:

This paper presents a new feature-based dense stereo matching algorithm to obtain the dense disparity map via dynamic programming. After extracting suitable features, we use matching constraints such as the epipolar line, the disparity limit, ordering, and a limit on the directional derivative of the disparity. Also, a coarse-to-fine multiresolution strategy is used to decrease the search space and therefore increase the accuracy and processing speed. The proposed method links the detected feature points into chains and compares some of the feature points from different chains to increase the matching speed. We also employ color stereo matching to increase the accuracy of the algorithm. After feature matching, we use dynamic programming to obtain the dense disparity map. It differs from the classical DP methods in stereo vision, since it employs the sparse disparity map obtained from the feature-based matching stage. The DP is further performed on a scan line, between any two matched feature points on that scan line. Thus our algorithm is truly an optimization method. Our algorithm offers a good trade-off in terms of accuracy and computational efficiency. Regarding the results of our experiments, the proposed algorithm increases the accuracy from 20% to 70% and reduces the running time of the algorithm by almost 70%.

Keywords: Chain Correspondence, Color Stereo Matching, Dynamic Programming, Epipolar Line, Stereo Vision.

117 Some Applications of Transition Matrices via Eigen Values

Authors: Adil AL-Rammahi

Abstract:

In this short paper, new properties of the transition matrix are introduced. Eigenvalues of small-order transition matrices are calculated by a flexible method. To exploit these properties, their applications are studied in the solution of Markov chains via the steady-state vector and in information theory via the channel entropy. The implemented test examples show promise for practical use.
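A small sketch of the two applications mentioned above is given below: the steady-state vector of a transition matrix is recovered as the left eigenvector for eigenvalue 1, and a channel entropy is computed from the same matrix viewed as a channel's conditional distribution. The matrix itself is an arbitrary illustrative example, not one of the paper's test cases.

import numpy as np

# Illustrative row-stochastic transition matrix of a 3-state Markov chain
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

# Steady-state vector: left eigenvector of P for eigenvalue 1, normalized to sum to 1
vals, vecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(vals - 1.0))
pi = np.real(vecs[:, idx])
pi = pi / pi.sum()
print("steady-state vector:", pi)          # satisfies pi @ P == pi

# Channel entropy H(Y|X), treating row x of P as p(y|x) and pi as the input distribution
H_cond = -np.sum(pi[:, None] * P * np.log2(P))
print("channel entropy H(Y|X) =", round(float(H_cond), 4), "bits")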

Keywords: Eigenvalue problem, transition matrix, state vector, information theory.

116 A Selective Markovianity Approach for Image Segmentation

Authors: A. Melouah, H. Merouani

Abstract:

A new Markovianity approach is introduced in this paper. This approach reduces the response time of the classic Markov Random Fields approach. First, one region is determined by a clustering technique; then, this region is excluded from the study. The remaining pixels form the study zone and are selected for the Markovianity segmentation task. With the selective Markovianity approach, the segmentation process is faster than the classic one.

Keywords: Markovianity, response time, segmentation, study zone.

115 Enhancing Warehousing Operations in Cold Supply Chain through the Use of IoT and LiFi Technologies

Authors: S. El-Gamal, P. Hossam, A. Abd El Aziz, R. Mahmoud, A. Hassan, D. Hilal, E. Ayman, H. Haytham, O. Khamis

Abstract:

Several concerns fall upon the supply chain, especially in cold supply chains. These concerns arise mainly in the distribution and storage phases. This research focuses on the storage area, which contains several activities, such as the picking activity, which faces many obstacles and challenges. The implementation of IoT solutions enables businesses to monitor the temperature of food items, which is perhaps the most critical parameter in cold chains. Therefore, the research at hand proposes a practical solution that would help to eliminate the problems related to ineffective picking of products, especially fish and seafood products, by using IoT technology, most notably LiFi technology, thus guaranteeing efficient picking, reducing waste, and consequently lowering costs. A prototype was specially designed and examined. This research is a single case study. Two methods of data collection were used: observation and semi-structured interviews. Semi-structured interviews were conducted with managers and a decision maker at one of the biggest retail stores, Carrefour, in Alexandria, Egypt, to validate the problem and the proposed practical solution using IoT and LiFi technology. A total of three interviews were conducted. As a result, a SWOT analysis was carried out to highlight the strengths and weaknesses of using the recommended LiFi solution in the picking process. According to the investigation, the use of IoT and LiFi technology is cost-effective and efficient, reduces human error, and minimizes product waste, thus saving money. Therefore, increased customer satisfaction and profits could be achieved.

Keywords: Cold supply chain, IoT, LiFi, warehousing operation, picking process.

114 An Incomplete Factorization Preconditioner for LMS Adaptive Filter

Authors: Shazia Javed, Noor Atinah Ahmad

Abstract:

In this paper an efficient incomplete-factorization preconditioner is proposed for the Least Mean Squares (LMS) adaptive filter. The proposed preconditioner is approximated from a priori knowledge of the factors of the input correlation matrix with an incomplete strategy, motivated by the sparsity pattern of the upper triangular factor in the QRD-RLS algorithm. The convergence properties of the IPLMS algorithm are comparable with those of the transform-domain LMS (TDLMS) algorithm. Simulation results show the efficiency and robustness of the proposed algorithm with reduced computational complexity.

Keywords: Autocorrelation matrix, Cholesky's factor, eigenvalue spread, Markov input.

113 Model Order Reduction of Discrete-Time Systems Using Fuzzy C-Means Clustering

Authors: Anirudha Narain, Dinesh Chandra, Ravindra K. S.

Abstract:

A computationally simple approach to model order reduction for single-input single-output (SISO), linear, time-invariant discrete systems modeled in the frequency domain is proposed in this paper. The denominator of the reduced-order model is determined using fuzzy C-means clustering, while the numerator parameters are found by matching the time moments and Markov parameters of the high-order system.
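For reference, for a discrete transfer function G(z) the quantities matched above are conventionally defined as follows (standard definitions stated here as background, not reproduced from the paper):

G(z) = \sum_{i \ge 0} M_i\, z^{-i} \quad (\text{expansion about } z = \infty),
\qquad
G(z) = \sum_{i \ge 0} c_i\, (z-1)^i \quad (\text{expansion about } z = 1),

where the M_i are the Markov parameters and the c_i are proportional to the time moments; the reduced-order numerator is chosen so that the leading M_i and c_i of the reduced model agree with those of the high-order system.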

Keywords: Model Order reduction, Discrete-time system, Fuzzy C-Means Clustering, Padé approximation.
