Search results for: Decision Points
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2204

344 Risk Management Approach for Lean, Agile, Resilient and Green Supply Chain

Authors: Benmoussa Rachid, Deguio Roland, Dubois Sebastien, Rasovska Ivana

Abstract:

Implementation of LARG (Lean, Agile, Resilient, Green) practices in supply chain management is a complex task, mainly because ecological, economic and operational goals are usually in conflict. To implement these LARG practices successfully, companies need relevant decision-making tools that allow process performance control and visibility of improvement strategies. To contribute to this issue, this work tries to answer the following research question: How to master performance and anticipate problems in supply chain LARG practices implementation? To answer this question, a risk management approach (RMA) is adopted. Indeed, the proposed RMA basically aims to assess the ability of a supply chain, guided by “Lean, Green and Achievement” performance goals, to face “agility and resilience risk” factors. To prove its relevance, an academic logistics case study based on simulation is used to illustrate all its stages. In particular, it shows how to build the “LARG risk map”, which is the main output of this approach.
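
A risk map of this kind pairs the performance goals with the risk factors and rates each combination. The sketch below is a minimal illustration of such a scoring, assuming a likelihood × impact scheme on a 1-5 scale; the scores, scale and qualitative thresholds are invented for illustration and are not taken from the paper.

```python
# Minimal sketch of a risk-map computation: each (performance goal, risk factor)
# pair gets a likelihood and an impact score; their product gives a risk rating.
# Scores and thresholds below are hypothetical placeholders.

goals = ["Lean", "Green", "Achievement"]
risk_factors = ["agility risk", "resilience risk"]

# likelihood[goal][factor] and impact[goal][factor] on a 1-5 scale (assumed)
likelihood = {"Lean": {"agility risk": 4, "resilience risk": 2},
              "Green": {"agility risk": 2, "resilience risk": 3},
              "Achievement": {"agility risk": 3, "resilience risk": 4}}
impact = {"Lean": {"agility risk": 3, "resilience risk": 4},
          "Green": {"agility risk": 2, "resilience risk": 2},
          "Achievement": {"agility risk": 4, "resilience risk": 5}}

def risk_level(score):
    """Map a 1-25 likelihood*impact score to a qualitative level."""
    return "high" if score >= 15 else "medium" if score >= 8 else "low"

for g in goals:
    for f in risk_factors:
        score = likelihood[g][f] * impact[g][f]
        print(f"{g:12s} vs {f:15s}: score={score:2d} ({risk_level(score)})")
```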

Keywords: Risk approach, lean supply chain, agile supply chain, green supply chain, resilient supply chain.

343 The Effects of Production, Transportation and Storage Conditions on Mold Growth in Compound Feeds

Authors: N. Cetinkaya

Abstract:

The objective of the present study is to determine the critical control points during the production, transportation and storage of compound feeds, to be used in the Hazard Analysis Critical Control Point (HACCP) feed safety management system. A total of 40 feed samples were taken after storage periods of 20 and 40 days from the 10 dairy and 10 beef cattle farms, following transportation of the compound feeds from the factory. In addition, 20 samples (10 dairy and 10 beef cattle compound feeds) were taken immediately after production, before transportation from the factory, as day 0 samples. In all feed samples, chemical composition and total aflatoxin levels were determined. The aflatoxin levels in all feed samples, with the exception of 2 dairy cattle feeds, were below the maximum acceptable level. With the increase in storage period in dairy feeds, the aflatoxin level increased to 4.96 ppb only at the BS8 dairy farm; this value is below the maximum permissible level (10 ppb) in beef cattle feed. The aflatoxin levels of dairy feed samples taken after production varied between 0.44 and 2.01 ppb. Aflatoxin levels were found to be between 0.89 and 3.01 ppb in dairy cattle feeds taken on the 20th day of storage at the 10 dairy cattle farms. On the 40th day, feed aflatoxin levels at the same dairy cattle farms were found to be between 1.12 and 7.83 ppb. The aflatoxin levels increased to 7.83 and 6.31 ppb in 2 dairy farms after a storage period of 40 days; these values are above the maximum permissible level in dairy cattle feeds. Storage for 40 days in pellet form can therefore be considered a critical control point in the HACCP feed safety management system.

Keywords: Aflatoxin, beef cattle feed, compound feed, dairy cattle feed, HACCP.

342 Spatial and Temporal Variability of Fog Over the Indo-Gangetic Plains, India

Authors: Sanjay Kumar Srivastava, Anu Rani Sharma, Kamna Sachdeva

Abstract:

The aim of the paper is to analyze the characteristics of winter fog in terms of its trend and spatial-temporal variability over the Indo-Gangetic Plains (IGP). The study reveals that during the last four and a half decades (1971-2015), an alarming increasing trend in fog frequency has been observed during the winter months of December and January over the study area. The frequency of fog has increased by 118.4% during these peak winter months. It has also been observed that, on average, the central part of the IGP has 66.29% fog days, followed by the western IGP with 41.94% fog days. Further, Empirical Orthogonal Function (EOF) decomposition and Mann-Kendall variation analysis are used to analyze the spatial and temporal patterns of winter fog. The findings have significant implications for further research on fog over the IGP and for formulating robust strategies to adapt to fog variability and mitigate its effects. The decision by the Delhi Government to implement the odd-even scheme to restrict the use of private vehicles, in order to reduce pollution and improve air quality, may add to the alarming increasing trend of fog over Delhi and the surrounding regions of the IGP.
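
A minimal sketch of the Mann-Kendall trend test used here, applied to a synthetic annual fog-day series, is shown below; the series, the significance convention and the simplified tie handling are placeholders, not the paper's data.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Return the Mann-Kendall S statistic, normal-approximation Z and two-sided p-value.
    Ties are ignored in the variance for brevity."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

# Synthetic December-January fog-day counts, 1971-2015 (placeholder data)
rng = np.random.default_rng(0)
years = np.arange(1971, 2016)
fog_days = 20 + 0.4 * (years - 1971) + rng.normal(0, 3, len(years))
S, Z, p = mann_kendall(fog_days)
print(f"S={S:.0f}, Z={Z:.2f}, p={p:.4f}")  # positive Z indicates an increasing trend
```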

Keywords: Fog, climatology, spatial variability, temporal variability, empirical orthogonal function, visibility, Mann-Kendall test, variation point.

341 Improving Spatiotemporal Change Detection: A High Level Fusion Approach for Discovering Uncertain Knowledge from Satellite Image Database

Authors: Wadii Boulila, Imed Riadh Farah, Karim Saheb Ettabaa, Basel Solaiman, Henda Ben Ghezala

Abstract:

This paper investigates the problem of tracking spatiotemporal changes of a satellite image through the use of Knowledge Discovery in Databases (KDD). The purpose of this study is to help a given user effectively discover interesting knowledge and then build prediction and decision models. Unfortunately, the KDD process for spatiotemporal data is always marked by several types of imperfections. In our paper, we take these imperfections into consideration in order to provide more accurate decisions. To achieve this objective, different KDD methods are used to discover knowledge in satellite image databases. Each method presents a different point of view of the spatiotemporal evolution of a query model (which represents an extracted object from a satellite image). In order to combine these methods, we use evidence fusion theory, which considerably improves the spatiotemporal knowledge discovery process and increases our belief in the spatiotemporal model change. Experimental results on satellite images of the Auckland region in New Zealand show the improvement in overall change detection as compared to classical methods.
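
The evidence (Dempster-Shafer) fusion step can be illustrated with a minimal sketch of Dempster's rule of combination over two change hypotheses; the mass values and the two-source setting are illustrative assumptions, not the paper's actual belief assignments.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset of hypotheses -> mass)
    with Dempster's rule; conflicting mass is renormalised away."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    k = 1.0 - conflict
    return {h: v / k for h, v in combined.items()}

# Two KDD methods giving evidence about the evolution of a query object
# (hypotheses "change" / "no_change"; the masses are illustrative only).
CH, NC = frozenset({"change"}), frozenset({"no_change"})
BOTH = CH | NC  # ignorance
m_method1 = {CH: 0.6, NC: 0.1, BOTH: 0.3}
m_method2 = {CH: 0.5, NC: 0.2, BOTH: 0.3}
print(dempster_combine(m_method1, m_method2))
```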

Keywords: Knowledge discovery in satellite databases, knowledge fusion, data imperfection, data mining, spatiotemporal change detection.

340 An Analysis on the Appropriateness and Effectiveness of CCTV Location for Crime Prevention

Authors: Tae-Heon Moon, Sun-Young Heo, Sang-Ho Lee, Youn-Taik Leem, Kwang-Woo Nam

Abstract:

This study aims to investigate the possibility of crime prevention through CCTV by analyzing the appropriateness of CCTV locations, i.e., whether cameras are installed in the hotspots of crime-prone areas, and by exploring the crime prevention effect and the transition effect. The real crime and CCTV locations of the case city were converted into spatial data using GIS, and the data were analyzed by hotspot analysis and the weighted displacement quotient (WDQ). As study methods, first, it analyzed existing relevant studies to identify the trends of CCTV and crime studies based on big data from 1800 to 2014 and to understand the relation between CCTV and crime. Second, it investigated the current situation of nationwide CCTVs and analyzed the guidelines for CCTV installation and operation to highlight the problems and key points of CCTV use. Third, it investigated the crime occurrence in the case areas and the current situation of CCTV installation in spatial terms, and analyzed the appropriateness and effectiveness of CCTV installation to suggest a rational installation of CCTV and a strategic direction for crime prevention. The results demonstrate that the installation of CCTV had no significant effect on crime prevention in the case area. This indicates that CCTV should be installed and managed in a more scientific way, reflecting local crime situations. For CCTV, methods of spatial analysis such as GIS, which can evaluate the installation effect, and methods of economic analysis such as cost-benefit analysis should be developed. In addition, these methods should be distributed to local governments across the nation for the appropriate installation and operation of CCTV. This study intended to find a design guideline for optimal CCTV installation. In this regard, the study is meaningful in that it will contribute to the creation of a safe city.
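
As a rough illustration of the WDQ analysis, the sketch below computes one commonly cited form of the quotient (after Bowers and Johnson) from before/after crime counts in the CCTV (target), buffer and control areas; the choice of formulation and all counts are assumptions for illustration, not the study's data.

```python
def wdq(target_before, target_after, buffer_before, buffer_after,
        control_before, control_after):
    """Weighted Displacement Quotient (one common formulation):
    numerator   = change in the buffer area relative to control (displacement measure)
    denominator = change in the target (CCTV) area relative to control (success measure).
    """
    displacement = buffer_after / control_after - buffer_before / control_before
    success = target_after / control_after - target_before / control_before
    return displacement / success

# Invented before/after crime counts for illustration only
print(round(wdq(target_before=120, target_after=90,
                buffer_before=60, buffer_after=70,
                control_before=200, control_after=195), 3))
```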

Keywords: CCTV, Safe City, Crime Prevention, Spatial Analysis.

339 The Libyan Accounting Profession

Authors: Bubaker F. Shareia

Abstract:

The aim of this paper is to trace the historical development of the accounting profession in Libya, in order to identify challenges facing the profession as the country moves from a closed to emerging economy. The study is based on a literature review and archival research. Accounting information has a vital role to play in the achievement of economic goals in developing and emerging economies, but a well qualified accounting profession is required. In the context of institutional instability and unique cultural factors, the accounting profession in Libya faces educational and legal challenges if it is to achieve its potential in assisting the country to reach its economic goals. This study focuses on one country, which does limit its generalisability. However, it also suggests fruitful research areas in considering the impact and challenge of historic factors on the accounting profession in emerging economies. Centrally planned economies require a body of well trained professional accountants if they are to emerge onto the global economic arena. Studies on the accounting profession have focused primarily on those in developed economies, where the need for meaningful accounting information for decision making is taken for granted and there is a well trained, professional workforce. This study of the profession in an emerging economy highlights the efforts that will be needed to ensure the contribution of the profession to the economic wellbeing of other emerging economies.

Keywords: Accounting profession, developing countries, culture, planned economy, emerging economy.

338 Overall Function and Symptom Impact of Self-Applied Myofascial Release in Adult Patients with Fibromyalgia: A Seven-Week Pilot Study

Authors: Domenica Tambasco, Riina Bray

Abstract:

Fibromyalgia is a chronic condition characterized by widespread musculoskeletal pain, fatigue, and reduced function. Management of symptoms includes medications, physical treatments and mindfulness therapies. Myofascial Release is a modality that has been successfully applied in various musculoskeletal conditions; however, to the authors' best knowledge, it is not yet recognized as a self-management therapy option in Fibromyalgia. In this study, we investigated whether Self-applied Myofascial Release (SMR) is associated with overall improved function and symptoms in Fibromyalgia. Eligible adult patients with a confirmed diagnosis of Fibromyalgia at Women’s College Hospital were recruited to SMR. Sessions ran for 1 hour once a week for 7 weeks, led by the same two physiotherapists knowledgeable in this physical treatment modality. The main outcome measure was an overall impact score for function and symptoms based on the validated assessment tool for fibromyalgia, the Revised Fibromyalgia Impact Questionnaire (FIQR), measured pre- and post-intervention. Both descriptive and analytical methods were applied and reported. We analyzed results using a paired t-test to determine whether there was a statistically significant difference in mean FIQR scores between initial (pre-intervention) and final (post-intervention) scores. A clinically significant difference in FIQR was defined as a reduction in score by 10 or more points. Our pilot study showed that SMR appeared to be a safe and effective intervention for our fibromyalgia participants, with the overall impact on function and symptoms occurring in only 7 weeks. Further studies with larger sample sizes comparing SMR to other physical treatment modalities (such as stretching) in a randomized controlled trial (RCT) are recommended.
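
The pre/post comparison described above amounts to a paired t-test on FIQR scores; a minimal sketch follows, using invented scores and the study's 10-point threshold for clinical significance.

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post FIQR scores (0-100) for illustration; a drop of >= 10
# points is treated as clinically significant, as in the study design.
pre  = np.array([72.0, 65.5, 80.0, 58.0, 69.0, 74.5, 61.0, 77.0])
post = np.array([60.5, 58.0, 66.0, 52.5, 55.0, 63.0, 57.5, 64.0])

t_stat, p_value = stats.ttest_rel(pre, post)   # paired t-test
mean_change = np.mean(pre - post)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print(f"mean FIQR reduction = {mean_change:.1f} points "
      f"({'clinically significant' if mean_change >= 10 else 'not clinically significant'})")
```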

Keywords: Fibromyalgia, myofascial release, fibromyalgia impact questionnaire, fibromyalgia assessment status.

337 Capacity of Overloaded DS-CDMA System on Rayleigh Fading Channel with Timing Error

Authors: Preetam Kumar

Abstract:

The number of users supported in a DS-CDMA cellular system is typically less than the spreading factor (N), and the system is said to be underloaded. Overloading is a technique to accommodate more users than the spreading factor N. In the O/O overloading scheme, the first set is assigned to the N synchronous users and the second set is assigned to the additional synchronous users. An iterative multistage soft decision interference cancellation (SDIC) receiver is used to remove the high level of interference between the two sets. Performance is evaluated in terms of the maximum number of acceptable users such that the system performance is degraded only slightly compared to the single-user performance at a specified BER. In this paper, the capacity of the CDMA-based O/O overloading scheme is evaluated with the SDIC receiver. It is observed that the O/O scheme using orthogonal Gold codes provides 25% channel overloading (N=64) for a synchronous DS-CDMA system on an AWGN channel in the uplink at a BER of 1e-5. For a Rayleigh fading channel, the critical capacity is 40% at a BER of 5e-5 assuming synchronous users. In practical systems, however, perfect chip timing is very difficult to maintain in the uplink. We have shown that the overloading performance reduces to 11% for a timing synchronization error of 0.02 Tc at a BER of 1e-5.

Keywords: DS-CDMA, Interference Cancellation, Multiuser Detection, Orthogonal codes, Overloading.

336 Optimizing Dialogue Strategy Learning Using Learning Automata

Authors: G. Kumaravelan, R. Sivakumar

Abstract:

Modeling the behavior of dialogue management in the design of a spoken dialogue system using statistical methodologies is currently a growing research area. This paper presents work on developing an adaptive learning approach to optimize dialogue strategy. At the core of our system is a method formalizing dialogue management as sequential decision making under uncertainty whose underlying probabilistic structure is a Markov chain. Researchers have mostly focused on model-free algorithms for automating the design of dialogue management using machine learning techniques such as reinforcement learning. However, model-free algorithms face the dilemma of balancing exploration against exploitation. Hence, we present a model-based online policy learning algorithm using interconnected learning automata for optimizing dialogue strategy. The proposed algorithm is capable of deriving an optimal policy that prescribes what action should be taken in various states of the conversation so as to maximize the expected total reward to attain the goal, and it incorporates good exploration and exploitation in its updates to improve the naturalness of human-computer interaction. We test the proposed approach using the PARADISE evaluation framework for accessing the railway information system.
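
The learning-automata update at the heart of such an approach can be sketched with a single linear reward-inaction (L_R-I) automaton; the toy environment, reward probabilities and learning rate below are assumptions for illustration (the paper uses interconnected automata, one per dialogue state).

```python
import numpy as np

class LinearRewardInaction:
    """Single L_R-I learning automaton: on reward, shift probability mass
    toward the chosen action; on penalty, leave probabilities unchanged."""
    def __init__(self, n_actions, learning_rate=0.1, seed=0):
        self.p = np.full(n_actions, 1.0 / n_actions)
        self.a = learning_rate
        self.rng = np.random.default_rng(seed)

    def choose(self):
        return self.rng.choice(len(self.p), p=self.p)

    def update(self, action, reward):
        if reward:  # reward-inaction: probabilities change only on success
            self.p = (1 - self.a) * self.p
            self.p[action] += self.a

# Toy environment: action 2 succeeds 80% of the time, the others 30% (illustrative).
success_prob = [0.3, 0.3, 0.8]
automaton = LinearRewardInaction(n_actions=3)
rng = np.random.default_rng(1)
for _ in range(2000):
    act = automaton.choose()
    automaton.update(act, rng.random() < success_prob[act])
print(automaton.p)  # probability mass should concentrate on action 2
```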

Keywords: Dialogue management, Learning automata, Reinforcement learning, Spoken dialogue system

335 The Martingale Options Price Valuation for European Puts Using Stochastic Differential Equation Models

Authors: H. C. Chinwenyi, H. D. Ibrahim, F. A. Ahmed

Abstract:

In modern financial mathematics, valuing derivatives such as options is often a tedious task. This is simply because their fair and correct future prices are probabilistic. This paper examines three different Stochastic Differential Equation (SDE) models in finance: the Constant Elasticity of Variance (CEV) model, the Black-Karasinski model, and the Heston model. The martingale option price valuation formulas for these three models were obtained using the replicating portfolio method. Also, the numerical solution of the derived martingale option price valuation equations for the SDE models was carried out using the Monte Carlo method, which was implemented in MATLAB. Furthermore, results from the numerical examples using published data from the Nigerian Stock Exchange (NSE) All-Share Index show the effect of an increase in the underlying asset value (stock price) on the value of the European put option for these models. From the results obtained, we see that an increase in the stock price yields a decrease in the value of the European put option price. Hence, this guides the option holder in making a quality decision by not exercising the right on the option.
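
As a minimal sketch of the Monte Carlo valuation step (here in Python rather than MATLAB, and under plain geometric Brownian motion instead of the CEV, Black-Karasinski or Heston dynamics), the following prices a European put under the risk-neutral measure and reproduces the qualitative finding that the put value falls as the stock price rises; all parameters are illustrative.

```python
import numpy as np

def mc_european_put(S0, K, r, sigma, T, n_paths=200_000, seed=0):
    """Risk-neutral Monte Carlo price of a European put under GBM
    (a simplified stand-in for the SDE models discussed in the paper)."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    payoff = np.maximum(K - ST, 0.0)
    return np.exp(-r * T) * payoff.mean()

# Illustrative parameters: the put value decreases as the stock price increases.
for S0 in (30, 40, 50, 60):
    print(S0, round(mc_european_put(S0, K=50, r=0.05, sigma=0.25, T=1.0), 3))
```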

Keywords: Equivalent Martingale Measure, European Put Option, Girsanov Theorem, Martingales, Monte Carlo method, option price valuation, option price valuation formula.

334 A Robust Method for Finding Nearest-Neighbor using Hexagon Cells

Authors: Ahmad Attiq Al-Ogaibi, Ahmad Sharieh, Moh’d Belal Al-Zoubi, R. Bremananth

Abstract:

In pattern clustering, nearest-neighbor point computation is a challenging issue for many applications in research areas such as remote sensing, computer vision, pattern recognition and statistical imaging. Nearest-neighbor computation is an essential computation for providing sufficient classification among the volume of pixels (voxels) in order to localize the active regions of interest (AROI). Furthermore, it is needed to compute spatial metric relationships of diverse areas of imaging based on pattern recognition applications. In this paper, we propose a new methodology for finding the nearest-neighbor point, based on constructing a virtual grid of hexagon cells and then locating every point within the cells. An algorithm is suggested for minimizing the computation and improving the turnaround time of the process. The nearest neighbors of a query point Φ are fetched by searching the hexagon grid holistically, and searching is repeated until an AROI for Φ is found. If a point Υ is located, then searching continues in the nearest hexagons in a circular way. The first hexagon is considered to be level 0 (L0) and the surrounding hexagons level 1 (L1). If Υ is located in L1, then the search continues in the next level (L2) to ensure that Υ is the nearest neighbor of Φ. Based on the experimental results, we found that the proposed method has an advantage over traditional methods in terms of minimizing the time required for searching the neighbors; in turn, the efficiency of classification is improved accordingly.
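
A minimal sketch of this idea follows: points are bucketed by the axial coordinates of their containing hexagon, the query's cell (level 0) is searched first, then successive rings, with one extra ring scanned after the first hit, mirroring the L0/L1/L2 logic above. The hexagon size, the pointy-top axial/cube-rounding conversion and the sample points are illustrative assumptions, not the paper's exact algorithm.

```python
import math
from collections import defaultdict

SIZE = 1.0  # hexagon size (centre-to-corner distance); an assumed parameter

def point_to_hex(x, y, size=SIZE):
    """Map a 2-D point to axial (q, r) coordinates of the containing pointy-top hexagon."""
    q = (math.sqrt(3) / 3 * x - y / 3) / size
    r = (2 * y / 3) / size
    # cube rounding of (q, -q-r, r)
    cx, cy, cz = q, -q - r, r
    rx, ry, rz = round(cx), round(cy), round(cz)
    dx, dy, dz = abs(rx - cx), abs(ry - cy), abs(rz - cz)
    if dx > dy and dx > dz:
        rx = -ry - rz
    elif dy > dz:
        ry = -rx - rz
    else:
        rz = -rx - ry
    return int(rx), int(rz)

def build_grid(points):
    grid = defaultdict(list)
    for p in points:
        grid[point_to_hex(*p)].append(p)
    return grid

def nearest_neighbor(query, grid, max_level=50):
    """Search the query's hexagon (level 0), then surrounding rings (levels 1, 2, ...);
    once a candidate is found, one extra ring is scanned before deciding."""
    q0, r0 = point_to_hex(*query)
    best, best_d = None, float("inf")
    found_level = None
    for level in range(max_level + 1):
        for dq in range(-level, level + 1):
            for dr in range(max(-level, -dq - level), min(level, -dq + level) + 1):
                if max(abs(dq), abs(dr), abs(-dq - dr)) != level:
                    continue  # keep only the ring at exactly this level
                for (px, py) in grid.get((q0 + dq, r0 + dr), []):
                    d = math.hypot(px - query[0], py - query[1])
                    if d < best_d:
                        best, best_d = (px, py), d
        if best is not None and found_level is None:
            found_level = level
        if found_level is not None and level > found_level:
            return best  # scanned one ring beyond the first hit
    return best

points = [(0.2, 0.3), (2.5, 1.1), (-1.0, 4.2), (3.3, -2.7)]
grid = build_grid(points)
print(nearest_neighbor((2.4, 1.0), grid))
```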

Keywords: Hexagon cells, k-nearest neighbors, Nearest Neighbor, Pattern recognition, Query pattern, Virtual grid

333 A Medical Images Based Retrieval System using Soft Computing Techniques

Authors: Pardeep Singh, Sanjay Sharma

Abstract:

Content-Based Image Retrieval (CBIR) has been one of the most active research areas in the field of computer vision over the last 10 years. Many programs and tools have been developed to formulate and execute queries based on the visual or audio content and to help browse large multimedia repositories. Still, no general breakthrough has been achieved with respect to large varied databases with documents of differing sorts and with varying characteristics. Answers to many questions with respect to speed, semantic descriptors or objective image interpretations are still unanswered. In the medical field, images, and especially digital images, are produced in ever increasing quantities and used for diagnostics and therapy. In several articles, content-based access to medical images for supporting clinical decision making has been proposed to ease the management of clinical data, and scenarios for the integration of content-based access methods into Picture Archiving and Communication Systems (PACS) have been created. This paper gives an overview of soft computing techniques. New research directions are being defined that can prove to be useful. Still, there are very few systems that seem to be used in clinical practice. It needs to be stated as well that the goal is not, in general, to replace text-based retrieval methods as they exist at the moment.

Keywords: CBIR, GA, Rough sets, CBMIR

332 An AHP-Delphi Multi-Criteria Usage Cases Model with Application to Citrogypsum Decisions, Case Study: Kimia Gharb Gostar Industries Company

Authors: Mohsen Pirdashti, Masoomeh Omidi, Hemmatollah Pidashti

Abstract:

Today, the advantage of biotechnology over other technologies, especially in environmental issues, is indisputable. Kimia Gharb Gostar Industries Company, as the largest producer of citric acid in the Middle East, applies biotechnology for this goal. Citrogypsum is a by-product of citric acid production and is considered a significant residue of this company. This paper introduces a summary of citric acid production and the conditions of Citrogypsum production in the company, in addition to a definition of Citrogypsum production and its applications worldwide. Based on this information and an evaluation of present conditions regarding Iran's need for Citrogypsum, the best priority was introduced, with emphasis on strategy selection and proper planning for self-sufficiency. The Delphi technique was used to elicit expert opinions about criteria for evaluating the usages. The criteria identified by the experts were profitability, production capacity, degree of investment, marketability, ease of production and production time. The Analytical Hierarchy Process (AHP) and Expert Choice software were used to compare the alternatives on the criteria derived from the Delphi process.
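
A minimal sketch of the AHP weighting step follows: criterion weights are derived as the principal eigenvector of a pairwise comparison matrix and checked with a consistency ratio. The pairwise judgements below are invented placeholders (the study used Expert Choice on the experts' actual judgements).

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for the six criteria:
# profitability, production capacity, investment, marketability, ease, time.
A = np.array([
    [1,   3,   5,   3,   7,   5],
    [1/3, 1,   3,   1,   5,   3],
    [1/5, 1/3, 1,   1/3, 3,   1],
    [1/3, 1,   3,   1,   5,   3],
    [1/7, 1/5, 1/3, 1/5, 1,   1/3],
    [1/5, 1/3, 1,   1/3, 3,   1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                     # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                        # normalised priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)            # consistency index
ri = 1.24                                       # random index for n = 6 (Saaty)
print("weights:", np.round(weights, 3))
print("consistency ratio:", round(ci / ri, 3))  # < 0.1 is conventionally acceptable
```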

Keywords: Analytical Hierarchy Process, AHP, Delphi, Multi-criteria decision making, Citrogypsum

331 Mixed Integer Programming for Multi-Tier Rebate with Discontinuous Cost Function

Authors: Y. Long, L. Liu, K. V. Branin

Abstract:

One challenge faced by procurement decision-makers during the acquisition process is how to compare similar products from different suppliers and allocate orders among different products or services. This work focuses on allocating orders among multiple suppliers considering rebate. The objective function is to minimize the total acquisition cost, including purchasing cost and rebate benefit. Rebate benefit is complex and difficult to estimate at the ordering step: rebate rules vary across suppliers and usually change over time. In this work, we developed a system to collect rebate policies, standardized the rebate policies and developed two-stage optimization models for order allocation. Rebate policies with multiple tiers are considered in the modeling. The discontinuous cost function of rebate benefit is formulated for different scenarios, and a piecewise linear function is used to approximate it. A Mixed Integer Programming (MIP) model is then built for the order allocation problem with multi-tier rebate. A case study is presented, and it shows that our optimization model can reduce the total acquisition cost by considering rebate rules.
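
A minimal sketch of how multi-tier (all-units) rebates can enter a MIP order-allocation model is given below, using binary tier indicators to handle the discontinuous cost; it is written with the PuLP library, and the suppliers, tier breakpoints, rebate rates and demand are invented. The paper's actual two-stage formulation is not reproduced.

```python
from pulp import (LpProblem, LpVariable, LpMinimize, LpBinary,
                  lpSum, value, PULP_CBC_CMD)

# Illustrative data (invented): two suppliers, all-units rebate tiers
# tier = (min_qty, max_qty, rebate rate applied to the spend at that supplier)
suppliers = {
    "S1": {"price": 10.0, "tiers": [(0, 399, 0.00), (400, 1000, 0.05)]},
    "S2": {"price": 9.5,  "tiers": [(0, 599, 0.00), (600, 1000, 0.03)]},
}
demand = 900

prob = LpProblem("multi_tier_rebate_allocation", LpMinimize)
x, y, w = {}, {}, {}
for s, d in suppliers.items():
    x[s] = LpVariable(f"qty_{s}", lowBound=0)
    for t, (lo, hi, rate) in enumerate(d["tiers"]):
        y[s, t] = LpVariable(f"tier_{s}_{t}", cat=LpBinary)   # order falls in tier t
        w[s, t] = LpVariable(f"qty_in_tier_{s}_{t}", lowBound=0)
        prob += w[s, t] >= lo * y[s, t]
        prob += w[s, t] <= hi * y[s, t]
    prob += lpSum(y[s, t] for t in range(len(d["tiers"]))) == 1  # exactly one tier active
    prob += lpSum(w[s, t] for t in range(len(d["tiers"]))) == x[s]

prob += lpSum(x[s] for s in suppliers) == demand
# total cost = purchase cost minus rebate (the rate depends on the active tier)
prob += lpSum(d["price"] * x[s]
              - lpSum(rate * d["price"] * w[s, t]
                      for t, (_, _, rate) in enumerate(d["tiers"]))
              for s, d in suppliers.items())

prob.solve(PULP_CBC_CMD(msg=0))
for s in suppliers:
    print(s, value(x[s]))
print("total cost:", value(prob.objective))
```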

Keywords: Discontinuous cost function, mixed integer programming, optimization, procurement, rebate.

330 Preparation and Characterization of Pure PVA and PVA/MMT Matrix: Effect of Thermal Treatment

Authors: Albana Hasimi, Edlira Tako, Partizan Malkaj, Elvin Çomo, Blerina Papajani, Mirela Ndrita, Ledjan Malaj

Abstract:

Many efforts have been made during recent years to develop new artificial polymeric membranes that fulfill the demanded conditions for biomedical uses. One of the most tested polymers is poly(vinyl alcohol) [PVA]. Our work is based on the possibility of using PVA for personal protective equipment against COVID-19. For personal protective equipment, we explore the possibility of modifying the properties of the polymer by adding montmorillonite [MMT]. Heat treatment above the glass transition temperature is used to improve mechanical properties, mainly by increasing the crystallinity of the polymer, which acts as a physical network. Temperature-Modulated Differential Scanning Calorimetry (TMDSC) measurements indicated that the presence of 0.5% MMT in PVA causes a higher Tg value and a shaped crystallinity peak. A decomposition into two melting peaks of the crystals is observed during heating from 25 to 240 °C, and an overlap of the recrystallization ridges during cooling from 240 to 25 °C. This is indicative of the presence of two types (quality or structure) of polymer crystals. On the other hand, some indication of improvement of the quality of the crystals by heat treatment is given by the distinct non-reversing contribution to melting. Data on sorption and transport of water in PVA films, both pure PVA and the PVA/MMT matrix, modified by thermal treatment are presented. The membranes become more rigid as a result of the heat treatment, and because of this the water uptake of the membranes is significantly lower, as indicated by analysis of the resulting water uptake kinetics. The presence of 0.5% w/w of MMT has no significant impact on the properties of PVA membranes. Water uptake kinetics deviate from Fick's law due to slow relaxation of the glassy polymer matrix for all types of membranes.

Keywords: Crystallinity, montmorillonite, nanocomposite, poly(vinyl alcohol).

329 Hybrid of Hunting Search and Modified Simplex Methods for Grease Position Parameter Design Optimisation

Authors: P. Luangpaiboon, S. Boonhao

Abstract:

This study proposes a multi-response surface optimization problem (MRSOP) for determining the proper choices of a process parameter design (PPD) decision problem in the noisy environment of a grease position process in the electronics industry. The proposed model attempts to maximize dual process responses on the mean of parts between failure for the left and right processes. The conventional modified simplex method and its hybridization with a stochastic operator from the hunting search algorithm are applied to determine the proper levels of the controllable design parameters affecting the quality performances. A numerical example demonstrates the feasibility of applying the proposed model to the PPD problem via the two iterative methods, and its advantages are also discussed. Numerical results demonstrate that the hybridization is superior to the conventional method: in this study, the mean of parts between failure on the left and right lines improves by approximately 39.51%. All experimental data presented in this research have been normalized to disguise actual performance measures, as the raw data are considered confidential.

Keywords: Grease Position Process, Multi-response Surfaces, Modified Simplex Method, Hunting Search Method, Desirability Function Approach.

328 An Investigation into Kanji Character Discrimination Process from EEG Signals

Authors: Hiroshi Abe, Minoru Nakayama

Abstract:

The frontal area of the brain is known to be involved in behavioral judgement. Because a Kanji character can be discriminated visually and linguistically from other characters, we hypothesized that in Kanji character discrimination frontal event-related potential (ERP) waveforms reflect two discrimination processes in separate time periods: one based on visual analysis and the other based on lexical access. To examine this hypothesis, we recorded ERPs while subjects performed a Kanji lexical decision task. In this task, either a known Kanji character, an unknown Kanji character or a symbol was presented, and the subject had to report whether the presented character was a Kanji character known to them or not; the same response was required for unknown Kanji trials and symbol trials. As signal preprocessing, we examined the performance of a method using independent component analysis (ICA) for artifact rejection, found it effective, and therefore adopted it. In the ERP results, there were two time periods in which the frontal ERP waveforms differed significantly between the unknown Kanji trials and the symbol trials: around 170 ms and around 300 ms after stimulus onset. This result supported our hypothesis. In addition, the result suggests that Kanji character lexical access may be fully completed by around 260 ms after stimulus onset.
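
A minimal sketch of ICA-based artifact rejection on a multichannel EEG array follows, using scikit-learn's FastICA; the array shape, the choice of components to zero out, and the random data are placeholders (in practice artifact components are identified by inspection or by correlation with EOG channels).

```python
import numpy as np
from sklearn.decomposition import FastICA

# Hypothetical EEG array: n_channels x n_samples (e.g. 32 channels, 2 s at 500 Hz)
rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 1000))

ica = FastICA(n_components=32, random_state=0, max_iter=1000)
sources = ica.fit_transform(eeg.T)             # shape: n_samples x n_components

# Indices of components judged to be ocular/muscle artifacts (placeholders here)
artifact_components = [0, 5]
sources[:, artifact_components] = 0.0

eeg_clean = ica.inverse_transform(sources).T   # back to n_channels x n_samples
print(eeg_clean.shape)
```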

Keywords: Character discrimination, Event-related Potential, Independent Component Analysis, Kanji, Lexical access.

327 Application of Stochastic Models to Annual Extreme Streamflow Data

Authors: Karim Hamidi Machekposhti, Hossein Sedghi

Abstract:

This study was designed to find the best stochastic model (using time series analysis) for the annual extreme streamflow (peak and maximum streamflow) of the Karkheh River in Iran. The Auto-Regressive Integrated Moving Average (ARIMA) model was used to simulate these series and forecast them into the future. For the analysis, annual extreme streamflow data of the Jelogir Majin station (above the Karkheh dam reservoir) for the years 1958–2005 were used. A visual inspection of the time plot shows a slight increasing trend; therefore, the series is not stationary. The non-stationarity observed in the Auto-Correlation Function (ACF) and Partial Auto-Correlation Function (PACF) plots of annual extreme streamflow was removed using first-order differencing (d=1) for the development of the ARIMA model. The ARIMA(4,1,1) model developed was found to be the most suitable for simulating the annual extreme streamflow of the Karkheh River. The model was found to be appropriate for forecasting ten years of annual extreme streamflow and can assist decision makers in establishing priorities for water demand. The Statistical Analysis System (SAS) and Statistical Package for the Social Sciences (SPSS) codes were used to determine the best model for this series.
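
A minimal sketch of fitting and forecasting an ARIMA(4,1,1) model follows, using statsmodels in place of the SAS/SPSS codes mentioned above; the synthetic annual series merely stands in for the 1958–2005 Jelogir Majin record.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic annual peak-streamflow series standing in for the 1958-2005 record
rng = np.random.default_rng(42)
years = pd.date_range("1958", periods=48, freq="YS")
flow = pd.Series(500 + np.cumsum(rng.normal(5, 60, len(years))), index=years)

# ARIMA(4,1,1): the first-order differencing removes the slight increasing trend
model = ARIMA(flow, order=(4, 1, 1))
fit = model.fit()
print(fit.summary().tables[1])

forecast = fit.get_forecast(steps=10)      # ten-year-ahead forecast
print(forecast.predicted_mean)
print(forecast.conf_int())
```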

Keywords: Stochastic models, ARIMA, extreme streamflow, Karkheh River.

326 Application of GIS-Based Construction Engineering: An Electronic Document Management System

Authors: Mansour N. Jadid

Abstract:

This paper describes the implementation of a GIS to provide decision support for successfully monitoring the movements and storage of materials, hence ensuring that finished products travel from the point of origin to the destination construction site through the supply-chain management (SCM) system. This system ensures the efficient operation of suppliers, manufacturers, and distributors by determining the shortest path from the point of origin to the final destination to reduce construction costs, minimize time, and enhance productivity. Such systems are essential to the construction industry because they reduce costs and save time, thereby improving productivity and effectiveness. This study describes a typical supply-chain model and a geographical information system (GIS)-based SCM that focuses on implementing an electronic document management system, which maps the application framework to integrate geodetic support with the supply-chain system. This process provides guidance for locating the nearest suppliers to fill the information needs of project members in different locations. Moreover, this study illustrates the use of a GIS-based SCM as a collaborative tool in innovative methods for implementing Web mapping services, as well as aspects of their integration, by generating an interactive GIS platform for the construction industry.
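
The shortest-path and nearest-supplier queries described above can be sketched with a small weighted graph; the sketch below uses the networkx library, and all node names and distances are invented placeholders rather than data from the study.

```python
import networkx as nx

# Toy road network: nodes are locations, edge weights are travel distances in km
G = nx.Graph()
G.add_weighted_edges_from([
    ("supplier_A", "depot", 12.0),
    ("supplier_B", "depot", 9.5),
    ("depot", "junction_1", 7.2),
    ("junction_1", "construction_site", 4.8),
    ("supplier_B", "junction_1", 15.0),
])

path = nx.shortest_path(G, "supplier_A", "construction_site", weight="weight")
dist = nx.shortest_path_length(G, "supplier_A", "construction_site", weight="weight")
print(path, f"{dist:.1f} km")

# Nearest supplier to the site, by network distance rather than straight-line distance
suppliers = ["supplier_A", "supplier_B"]
nearest = min(suppliers,
              key=lambda s: nx.shortest_path_length(G, s, "construction_site", weight="weight"))
print("nearest supplier:", nearest)
```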

Keywords: Construction, coordinate, engineering, GIS, management, map.

325 A Novel Receiver Algorithm for Coherent Underwater Acoustic Communications

Authors: Liang Zhao, Jianhua Ge

Abstract:

In this paper, we propose a novel receiver algorithm for coherent underwater acoustic communications. The proposed receiver is composed of three parts: (1) Doppler tracking and correction, (2) time reversal channel estimation and combining, and (3) joint iterative equalization and decoding (JIED). To reduce computational complexity and optimize the equalization algorithm, time reversal (TR) channel estimation and combining is adopted to simplify the multi-channel adaptive decision feedback equalizer (ADFE) into a single-channel ADFE without reducing the system performance. Simultaneously, the turbo principle is adopted to form a joint iterative ADFE and convolutional decoder (JIED). In the JIED scheme, the ADFE and decoder exchange soft information in an iterative manner, which can enhance the equalizer performance using the decoding gain. The simulation results show that the proposed algorithm reduces computational complexity and improves the performance of the equalizer; therefore, the performance of coherent underwater acoustic communications can be improved greatly.

Keywords: Underwater acoustic communication, Time reversal (TR) combining, joint iterative equalization and decoding (JIED)

324 Exploring the Activity Fabric of an Intelligent Environment with Hierarchical Hidden Markov Theory

Authors: Chiung-Hui Chen

Abstract:

The Internet of Things (IoT) was designed for widespread convenience. With smart tags and the sensing network, a large quantity of dynamic information is immediately presented in the IoT. Through internal communication and interaction, meaningful objects provide real-time services for users. Therefore, services with appropriate decision making have become an essential issue. Based on the science of human behavior, this study employed an environment model to record the time sequences and locations of different behaviors and adopted the probability module of the hierarchical Hidden Markov Model for inference. The statistical analysis was conducted to achieve the following objectives: first, define user behaviors and predict user behavior routes with the environment model to analyze user purposes; second, construct the hierarchical Hidden Markov Model according to the logic framework and establish the sequential intensity among behaviors to characterize the use and activity fabric of the intelligent environment; third, establish the intensity of the relation between objects and the probability of their being used, an indicator that can describe the possible limitations of the mechanism. As the process is recorded in the information system created in this study, these data can be reused to adjust the procedure of intelligent design services.

Keywords: Behavior, big data, hierarchical Hidden Markov Model, intelligent object.

323 Analyzing Environmental Emotive Triggers in Terrorist Propaganda

Authors: Travis Morris

Abstract:

The purpose of this study is to measure the intersection of environmental security entities in terrorist propaganda. To the best of the author's knowledge, this is the first study of its kind to examine this intersection within terrorist propaganda. Rosoka natural language processing software and frame analysis are used to advance our understanding of how environmental frames function as emotive triggers. Violent jihadi demagogues use frames to suggest violent and non-violent solutions to their grievances. Emotive triggers are framed in a way that leverages individual and collective attitudes in psychological warfare. A comparative research design is used because of the differences and similarities that exist between two variants of violent jihadi propaganda that target western audiences. Analysis is based on salience and network text analysis, which generates violent jihadi semantic networks. Findings indicate that environmental frames are used as emotive triggers across both data sets, but also as tactical and information data points. A significant finding is that certain core environmental emotive triggers like “water,” “soil,” and “trees” are significantly salient at the aggregate level across both data sets. All environmental entities can be classified into two categories, symbolic and literal. Importantly, this research illustrates how demagogues use environmental emotive triggers in cyberspace from a subcultural perspective to mobilize target audiences to their ideology and praxis. Understanding the anatomy of propaganda construction is necessary in order to generate effective counter-narratives in information operations. This research advances an additional method to inform practitioners and policy makers of how environmental security and propaganda intersect.

Keywords: Emotive triggers, environmental security, natural language processing, propaganda analysis.

322 Machine Learning for Music Aesthetic Annotation Using MIDI Format: A Harmony-Based Classification Approach

Authors: Lin Yang, Zhian Mi, Jiacheng Xiao, Rong Li

Abstract:

Swimming with the tide of deep learning, the field of music information retrieval (MIR) is experiencing parallel development, and a sheer variety of feature-learning models has been applied to music classification and tagging tasks. Among these learning techniques, deep convolutional neural networks (CNNs) have been widely used, with better performance than traditional approaches, especially in music genre classification and prediction. However, regarding music recommendation, there is a large semantic gap between the corresponding audio genres and the various aspects of a song that influence user preference. In our study, aiming to bridge this gap, we strive to construct an automatic music aesthetic annotation model based on the MIDI format for better comparison and measurement of the similarity between music pieces by means of harmonic analysis. We use the matrix of qualification converted from MIDI files as input to train two different classifiers, a support vector machine (SVM) and a Decision Tree (DT). Experimental results on a tag prediction task have shown that both learning algorithms are capable of extracting high-level properties in an end-to-end manner from music information. The proposed model is helpful for learning audience taste, and the resulting recommendations are likely to appeal to a niche consumer.
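
A minimal sketch of the classification step follows, training an SVM and a Decision Tree with scikit-learn; the feature matrix standing in for the MIDI-derived "matrix of qualification", the number of aesthetic tag classes and the accuracy comparison are all placeholders built on random data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder feature matrix standing in for harmony features derived from MIDI
# (e.g. chord/interval statistics) together with aesthetic tag labels.
rng = np.random.default_rng(0)
X = rng.random((300, 24))              # 300 pieces, 24 harmony features (invented)
y = rng.integers(0, 3, 300)            # 3 aesthetic tag classes (invented)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

svm = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
dt = DecisionTreeClassifier(max_depth=6, random_state=0).fit(X_tr, y_tr)

# With random data both scores hover around chance; real features change that.
print("SVM accuracy:", accuracy_score(y_te, svm.predict(X_te)))
print("DT  accuracy:", accuracy_score(y_te, dt.predict(X_te)))
```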

Keywords: Harmonic analysis, machine learning, music classification and tagging, MIDI.

321 Accurate And Efficient Global Approximation using Adaptive Polynomial RSM for Complex Mechanical and Vehicular Performance Models

Authors: Y. Z. Wu, Z. Dong, S. K. You

Abstract:

Global approximation using a metamodel for a complex mathematical function or computer model over a large variable domain is often needed in sensitivity analysis, computer simulation, optimal control, and global design optimization of complex, multiphysics systems. To overcome the limitations of existing response surface (RS), surrogate or metamodel modeling methods for complex models over a large variable domain, a new adaptive and regressive RS modeling method using quadratic functions and local area model improvement schemes is introduced. The method applies an iterative, Latin hypercube sampling based RS update process, divides the entire domain of design variables into multiple cells, identifies rougher cells with large modeling error, and further divides these cells along the roughest dimension direction. A small number of additional sampling points from the original, expensive model are added over the small and isolated rough cells to improve the RS model locally until the model accuracy criteria are satisfied. The method then combines local RS cells to regenerate the global RS model with satisfactory accuracy. An effective RS cell sorting algorithm is also introduced to improve the efficiency of model evaluation. Benchmark tests are presented, and the use of the new metamodeling method to replace a complex hybrid electric vehicle powertrain performance model in vehicle design optimization and optimal control is discussed.

Keywords: Global approximation, polynomial response surface, domain decomposition, domain combination, multiphysics modeling, hybrid powertrain optimization

320 A Machine Learning Approach for Anomaly Detection in Environmental IoT-Driven Wastewater Purification Systems

Authors: Giovanni Cicceri, Roberta Maisano, Nathalie Morey, Salvatore Distefano

Abstract:

The main goal of this paper is to present a solution for a water purification system based on an Environmental Internet of Things (EIoT) platform to monitor and control water quality, together with machine learning (ML) models to support decision making and speed up the water purification processes. A real case study has been implemented by deploying an EIoT platform and a network of devices, called Gramb meters and belonging to the Gramb project, on wastewater purification systems located in Calabria, in the south of Italy. The data thus collected are used to control the wastewater quality, detect anomalies and predict the behaviour of the purification system. To this extent, three different statistical and machine learning models have been adopted and compared: Autoregressive Integrated Moving Average (ARIMA), Long Short-Term Memory (LSTM) autoencoder, and Facebook Prophet (FP). The results demonstrated that the ML solution (LSTM) outperforms the classical statistical approaches (ARIMA, FP) in terms of accuracy, efficiency and effectiveness in monitoring and controlling the wastewater purification processes.
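
A minimal sketch of LSTM-autoencoder anomaly detection on windows of sensor readings is shown below (Keras/TensorFlow); the window length, layer sizes, percentile threshold and random data are assumptions for illustration, not the Gramb deployment's configuration.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, RepeatVector, TimeDistributed, Dense

# Windows of sensor readings (e.g. three water-quality signals); random data
# stand in here for normal operating conditions.
timesteps, n_features = 24, 3
rng = np.random.default_rng(0)
X_train = rng.normal(0, 1, (500, timesteps, n_features)).astype("float32")

model = Sequential([
    LSTM(32, activation="tanh", input_shape=(timesteps, n_features)),
    RepeatVector(timesteps),
    LSTM(32, activation="tanh", return_sequences=True),
    TimeDistributed(Dense(n_features)),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, X_train, epochs=5, batch_size=32, verbose=0)

# Reconstruction error on training windows defines an anomaly threshold.
recon = model.predict(X_train, verbose=0)
errors = np.mean((X_train - recon) ** 2, axis=(1, 2))
threshold = np.percentile(errors, 99)

X_new = rng.normal(0, 1, (10, timesteps, n_features)).astype("float32")
X_new[0] += 4.0                                   # injected anomaly
new_errors = np.mean((X_new - model.predict(X_new, verbose=0)) ** 2, axis=(1, 2))
print(new_errors > threshold)                     # flags the anomalous window
```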

Keywords: EIoT, machine learning, anomaly detection, environment monitoring.

319 Improvement of Water Distillation Plant by Using Statistical Process Control System

Authors: Qasim Kriri, Harsh B. Desai

Abstract:

Water supply and sanitation in Saudi Arabia are characterized by both difficulties and accomplishments. One of the fundamental difficulties is water shortage. In order to overcome water shortage, significant ventures have been undertaken in seawater desalination, water distribution, sewerage, and wastewater treatment. The purpose of Statistical Process Control (SPC) is to determine whether the performance of a process is maintaining an acceptable quality level (AQL); SPC is an analytical decision-making method. A fundamental tool of SPC is the control chart, which tracks the variability in the measurements of the product quality attributes. By using the appropriate chart, management can decide whether changes should be made in order to keep the process in control. The two most important quality factors in the distilled water that were taken into consideration were pH (potential of hydrogen) and TDS (total dissolved solids). There were three stages at which the quality checks were done: (1) water at the source, (2) water after chemical treatment, and (3) water sent for packing. The upper specification limit, center line and lower specification limit are taken as per Saudi water standards. The capability of the process to meet the specifications set for the quality attributes of the Berain water factory was chosen as the focus of the proposed SPC system.
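
A minimal sketch of computing X-bar control chart limits for a monitored quality attribute follows; the subgroup structure, the pH values and the stage chosen are invented placeholders, and the A2 constant shown is the standard value for subgroups of size 4.

```python
import numpy as np

# Hypothetical pH measurements: 20 subgroups of 4 samples each, taken at the
# "water sent for packing" stage; values are invented for illustration.
rng = np.random.default_rng(1)
subgroups = rng.normal(loc=7.0, scale=0.05, size=(20, 4))

xbar = subgroups.mean(axis=1)             # subgroup means
rbar = np.ptp(subgroups, axis=1).mean()   # average subgroup range
A2 = 0.729                                # X-bar chart constant for subgroup size n = 4

center = xbar.mean()
ucl = center + A2 * rbar                  # upper control limit
lcl = center - A2 * rbar                  # lower control limit

print(f"CL = {center:.3f}, UCL = {ucl:.3f}, LCL = {lcl:.3f}")
print("out-of-control subgroups:", np.where((xbar > ucl) | (xbar < lcl))[0])
```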

Keywords: Acceptable quality level, statistical quality control, control charts, process charts.

318 Using the Combined Model of PROMETHEE and Fuzzy Analytic Network Process for Determining Question Weights in Scientific Exams through Data Mining Approach

Authors: Hassan Haleh, Amin Ghaffari, Parisa Farahpour

Abstract:

The need for an appropriate system for evaluating students' educational development is a key problem in achieving the predefined educational goals. The volume of related papers in recent years that try to prove or disprove the necessity and adequacy of student assessment corroborates this point. Some of these studies tried to increase the precision of determining question weights in scientific examinations, but in all of them there has been an attempt to adjust the initial question weights while the accuracy and precision of those initial weights are still in question. Thus, in order to increase the precision of the process of assessing students' educational development, the present study proposes a new method for determining the initial question weights by considering factors of the questions such as difficulty, importance and complexity, and by implementing a combined method of PROMETHEE and fuzzy analytic network process using a data mining approach to improve the model's inputs. The result of the implemented case study demonstrates the improvement in performance and precision of the proposed model.

Keywords: Assessing students, Analytic network process, Clustering, Data mining, Fuzzy sets, Multi-criteria decision making, and Preference function.

317 Defining a Pathway to Zero Energy Building: A Case Study on Retrofitting an Old Office Building into a Net Zero Energy Building for Hot-Humid Climate

Authors: Kwame B. O. Amoah

Abstract:

This paper focuses on retrofitting an old existing office building to a net-zero energy building (NZEB). An existing small office building in Melbourne, Florida, was chosen as a case study to integrate state-of-the-art design strategies and energy-efficient building systems to improve building performance and reduce energy consumption. The study aimed to explore possible ways to maximize energy savings and renewable energy generation sources to cover the building's remaining energy needs necessary to achieve net-zero energy goals. A series of retrofit options were reviewed and adopted with some significant additional decision considerations. Detailed processes and considerations leading to zero energy are well documented in this study, with lessons learned adequately outlined. Based on building energy simulations, multiple design considerations were investigated, such as emerging state-of-the-art technologies, material selection, improvements to the building envelope, optimization of the HVAC, lighting systems, and occupancy loads analysis, as well as the application of renewable energy sources. The comparative analysis of simulation results was used to determine how specific techniques led to energy saving and cost reductions. The research results indicate that this small office building can meet net-zero energy use after appropriate design manipulations and renewable energy sources.

Keywords: Energy consumption, building energy analysis, energy retrofits, energy-efficiency.

316 Application of Machine Learning Methods to Online Test Error Detection in Semiconductor Test

Authors: Matthias Kirmse, Uwe Petersohn, Elief Paffrath

Abstract:

As test costs in today's semiconductor industry can make up to 50 percent of the total production costs, efficient test error detection becomes more and more important. In this paper, we present a new machine learning approach to test error detection that should provide faster recognition of test system faults as well as improved test error recall. The key idea is to learn a classifier ensemble that detects typical test error patterns in wafer test results immediately after finishing these tests. Since test error detection has not yet been discussed in the machine learning community, we define the central problem-relevant terms and provide an analysis of important domain properties. Finally, we present comparative studies reflecting the failure detection performance of three individual classifiers and three ensemble methods based upon them. As base classifiers we chose a decision tree learner, a support vector machine and a Bayesian network, while the compared ensemble methods were simple and weighted majority vote as well as stacking. For the evaluation, we used cross validation and a specially designed practical simulation. By implementing our approach in a semiconductor test department for the observation of two products, we proved its practical applicability.
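
A minimal sketch of evaluating voting and stacking ensembles over heterogeneous base classifiers is given below with scikit-learn; the synthetic imbalanced data set, the use of Gaussian naive Bayes in place of a full Bayesian network, and the recall-based scoring are simplifying assumptions, not the paper's setup.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier, StackingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for wafer-test features labelled "test error" vs "no error"
X, y = make_classification(n_samples=600, n_features=20,
                           weights=[0.9, 0.1], random_state=0)

base = [("dt", DecisionTreeClassifier(max_depth=5, random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
        ("nb", GaussianNB())]

voting = VotingClassifier(estimators=base, voting="soft")
stacking = StackingClassifier(estimators=base, cv=5)

for name, clf in [("voting", voting), ("stacking", stacking)]:
    recall = cross_val_score(clf, X, y, cv=5, scoring="recall")
    print(name, "mean recall:", recall.mean().round(3))
```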

Keywords: Ensemble methods, fault detection, machine learning, semiconductor test.

315 Aircraft Automatic Collision Avoidance Using Spiral Geometric Approach

Authors: M. Orefice, V. Di Vito

Abstract:

This paper describes a collision avoidance algorithm developed from the mathematical modeling of insect flight in terms of spiral and conchospiral geometric paths. It is able to calculate a proper avoidance manoeuvre aimed at preventing the infringement of a predefined distance threshold between the ownship and the considered intruder, while minimizing the ownship trajectory deviation from the original path and complying with the aircraft performance limitations and dynamic constraints. The algorithm is designed to be suitable for real-time applications, so that it can be considered for implementation in the most recent airborne automatic collision avoidance systems using traffic data received through an ADS-B IN device. The presented approach is able to take into account the rules of the air, thanks to specifically designed decision-making logic, based on the encounter geometry, that selects the direction of the calculated collision avoidance manoeuvre so as to comply with the rules of the air, such as the fundamental right-of-way rule. In the paper, the proposed collision avoidance algorithm is presented, and its preliminary design and software implementation are described. The applicability of this method has been proved through preliminary simulation tests performed in a 2D environment considering single-intruder encounter geometries, as reported and discussed in the paper.
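
A minimal sketch of generating waypoints along a logarithmic spiral, the simplest member of the spiral family mentioned above, is given below; the growth rate, initial radius and angular span are illustrative assumptions, and the sketch does not reproduce the paper's conchospiral construction or its decision-making logic.

```python
import math

def log_spiral_points(r0, growth, theta_end, n_points=40):
    """Sample a logarithmic spiral r = r0 * exp(growth * theta), a simple stand-in
    for the spiral/conchospiral avoidance paths described in the paper."""
    pts = []
    for i in range(n_points):
        theta = theta_end * i / (n_points - 1)
        r = r0 * math.exp(growth * theta)
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts

# Illustrative manoeuvre: spiral outward from a 1-unit radius, producing a growing
# lateral offset from the original track while limiting the heading-change rate.
waypoints = log_spiral_points(r0=1.0, growth=0.15, theta_end=math.pi)
print(waypoints[:3], "...", waypoints[-1])
```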

Keywords: collision avoidance, RPAS, spiral geometry, ADS-B based application
