Search results for: stochastic subspace identification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3409


2569 Identification of Risks Associated with Process Automation Systems

Authors: J. K. Visser, H. T. Malan

Abstract:

A need exists to identify the sources of risk associated with process automation systems in petrochemical companies and similar energy-related industries. These companies use many different process automation technologies across their value chains. A crucial part of the process automation system is the information technology component in the supervisory control layer. The ever-changing technology within the process automation layers, and the rate at which it advances, poses a risk to safe and predictable automation system performance. The age of the automation equipment also challenges the plant's operations and maintenance managers through obsolescence and unavailability of spare parts. The main objective of this research was to determine the risk sources associated with the equipment that forms part of process automation systems. A secondary objective was to establish whether technology managers and technicians were aware of the risks and shared the same view on their importance. A conceptual model for the risk sources of automation systems was formulated from models and frameworks in the literature. This model comprises six categories of risk, which form the basis for identifying specific risks. It was used to develop a questionnaire that was sent to 172 instrument technicians and technology managers in the company to obtain primary data; 75 completed and usable responses were received. These responses were analyzed statistically to determine the highest-ranked risk sources and to test for differences in opinion between technology managers and technicians.
The most important risks revealed in this study are: 1) the lack of skilled technicians, 2) the integration capability of third-party system software, 3) the reliability of the process automation hardware, 4) excessive costs of performing maintenance and migrations on process automation systems, and 5) the requirements for third-party communication interfacing compatibility as well as real-time communication networks.

Keywords: distributed control system, identification of risks, information technology, process automation system

Procedia PDF Downloads 139
2568 Identifying Lead Poisoning Risk Factors among Non-Pregnant Adults in New York City through Motivational Interviewing Techniques

Authors: Nevila Bardhi, Joanna Magda, Kolapo Alex-Oni, Slavenka Sedlar, Paromita Hore

Abstract:

The New York City Department of Health and Mental Hygiene (NYC DOHMH) receives blood lead test results for NYC residents and conducts lead poisoning case investigations for individuals with elevated blood lead levels from occupational and non-occupational exposure. To (1) improve participant engagement, (2) aid the identification of potential lead sources, and (3) better tailor recommendations to reduce lead exposure, Motivational Interviewing (MI) techniques were incorporated into risk assessment interviews of non-pregnant adults by DOHMH's Adult Lead Poisoning Prevention (ALP) Program. MI is an evidence-based counseling method, used in clinical settings, that has been effective in promoting behavior change by resolving ambivalence and enhancing motivation in treating both physiological and psychological health conditions. Incorporating MI techniques in the ALP risk assessment interview improved the identification of lead sources for non-pregnant adult cases, making it possible to better tailor lead poisoning prevention recommendations. Embedding MI cues in the interview also significantly increased engagement, resulting in approximately 50 more interviews conducted per year and a decrease in interview refusals during case investigations. The pre-MI interview completion rate was 57%, while the post-MI completion rate was 68%. We recommend that other lead poisoning prevention programs use MI techniques during investigations in similarly diverse populations.

Keywords: lead poisoning prevention, motivational interviewing, behavior change, lead poisoning risk factors, self-efficacy

Procedia PDF Downloads 89
2567 Process for Analyzing Information Security Risks Associated with the Incorporation of Online Dispute Resolution Systems in the Context of Conciliation in Colombia

Authors: Jefferson Camacho Mejia, Jenny Paola Forero Pachon, Luis Carlos Gomez Florez

Abstract:

The innumerable possibilities offered by the use of Information Technology (IT) in different socio-economic activities have changed the social paradigm and given rise to the so-called information and knowledge society. The Colombian government, aware of this reality, has been promoting the use of IT as part of the E-government strategy adopted in the country. However, it is well known that the use of IT implies certain threats that put the security of information in the digital environment at risk. One of the priorities of the Colombian government is to improve access to alternative justice through IT, in particular, access to Alternative Dispute Resolution (ADR): conciliation, arbitration, and friendly composition, by means of which citizens can resolve their differences directly. To this end, a trend has been identified in the use of Online Dispute Resolution (ODR) systems, which extend the benefits of ADR to the digital environment through the use of IT. This article presents a process for analyzing the information security risks associated with incorporating ODR systems in the context of conciliation in Colombia, based on four fundamental stages identified in the literature: (i) identification of assets, (ii) identification of threats and vulnerabilities, (iii) estimation of the impact, and (iv) estimation of risk levels. The methodological design adopted for this research was grounded theory, since it involves interactions applied to a specific context and viewed from the perspective of diverse participants. As a result of this investigation, the activities for carrying out an information security risk analysis, in the context of conciliation in Colombia supported by ODR systems, are defined, thus contributing to the estimation of risks and making their subsequent treatment possible.
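The fourth stage can be sketched in a few lines: score each identified threat as likelihood times impact and rank the results. The assets, threats, and numeric values below are hypothetical illustrations, not data from the study.

```python
# Illustrative assets and threats with (likelihood, impact) scores; none of
# these values come from the study itself.
assets = {
    "case database": [("SQL injection", 0.4, 5), ("data loss", 0.2, 4)],
    "videoconference channel": [("eavesdropping", 0.3, 4)],
}

def risk_levels(asset_threats):
    """Final stage: risk level = likelihood x impact, ranked highest first."""
    rows = [
        (asset, threat, likelihood * impact)
        for asset, threats in asset_threats.items()
        for threat, likelihood, impact in threats
    ]
    return sorted(rows, key=lambda row: row[2], reverse=True)

ranked = risk_levels(assets)
```

The ranked list is exactly the input a treatment stage needs: the highest-risk pairs are addressed first.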

Keywords: alternative dispute resolution, conciliation, information security, online dispute resolution systems, process, risk analysis

Procedia PDF Downloads 239
2566 Methods for Solving Identification Problems

Authors: Fadi Awawdeh

Abstract:

In this work, we highlight the key concepts of semigroup theory as a methodology for constructing efficient formulas for solving inverse problems. The proposed method depends on results concerning integral equations. The experimental results show the potential and limitations of the method and suggest directions for future work.

Keywords: identification problems, semigroup theory, methods for inverse problems, scientific computing

Procedia PDF Downloads 481
2565 One-Step Time Series Predictions with Recurrent Neural Networks

Authors: Vaidehi Iyer, Konstantin Borozdin

Abstract:

Time series prediction problems have many important practical applications but are notoriously difficult for statistical modeling. Recently, machine learning methods have attracted significant interest as practical tools for a variety of problems, even though developments in this field tend to be semi-empirical. This paper explores the application of Long Short Term Memory based Recurrent Neural Networks to one-step prediction of time series with both trend and stochastic components. Two types of data are analyzed: daily stock prices, often considered a typical example of a random walk, and weather patterns dominated by seasonal variations. Results from both analyses are compared, and a reinforcement learning framework is used to select the more efficient of Recurrent Neural Networks and more traditional autoregression methods. It is shown that both methods are able to follow long-term trends and seasonal variations closely but have difficulties reproducing day-to-day variability. Future research directions and potential real-world applications are briefly discussed.
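As a point of comparison for the traditional autoregression baseline mentioned above, a minimal one-step AR(1) forecaster can be written without any ML framework; the seasonal toy series below is illustrative, not the paper's stock or weather data.

```python
import math

def ar1_one_step(series):
    """Fit AR(1), x_t - m = phi * (x_{t-1} - m), by least squares and return
    the one-step-ahead forecast together with the fitted coefficient phi."""
    m = sum(series) / len(series)
    d = [x - m for x in series]
    num = sum(d[t - 1] * d[t] for t in range(1, len(d)))
    den = sum(d[t - 1] ** 2 for t in range(1, len(d)))
    phi = num / den
    return m + phi * (series[-1] - m), phi

# toy seasonal series with a 12-step period (a stand-in for weather-like data)
series = [math.sin(2 * math.pi * t / 12) for t in range(120)]
forecast, phi = ar1_one_step(series)
```

For a pure sinusoid of angular step a, the fitted coefficient approaches cos(a), so the forecaster tracks the seasonal cycle one step ahead.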

Keywords: long short term memory, prediction methods, recurrent neural networks, reinforcement learning

Procedia PDF Downloads 228
2564 Cas9-Assisted Direct Cloning and Refactoring of a Silent Biosynthetic Gene Cluster

Authors: Peng Hou

Abstract:

Natural products produced by marine bacteria serve as an immense reservoir of anti-infective drugs and therapeutic agents. Nowadays, heterologous expression of gene clusters of interest has been widely adopted as an effective strategy for natural product discovery. Briefly, the heterologous expression workflow is: biosynthetic gene cluster identification, pathway construction and expression, and product detection. However, gene cluster capture using the traditional transformation-associated recombination (TAR) protocol is inefficient (0.5% positive colony rate). To make things worse, most of these putative new natural products are only predicted by bioinformatics tools such as antiSMASH, and their corresponding biosynthetic pathways are either not expressed or expressed at very low levels under laboratory conditions. These setbacks have inspired us to seek new technologies to efficiently edit and refactor biosynthetic gene clusters. Recently, two cutting-edge techniques have attracted our attention: CRISPR-Cas9 and Gibson Assembly. We have pretreated Brevibacillus laterosporus genomic DNA with CRISPR-Cas9 nucleases that specifically generate breaks near the gene cluster of interest. This trial increased the efficiency of gene cluster capture to 9%. Moreover, using Gibson Assembly to add or delete certain operons and tailoring enzymes regardless of end compatibility, the silent construct (~80 kb) has been successfully refactored into an active one, yielding a series of expected analogs. With these novel molecular tools, we are confident that the development of a high-throughput, mature pipeline for DNA assembly, transformation, and product isolation and identification would no longer be a daydream for marine natural product discovery.

Keywords: biosynthesis, CRISPR-Cas9, DNA assembly, refactor, TAR cloning

Procedia PDF Downloads 282
2563 Changes in Geospatial Structure of Households in the Czech Republic: Findings from Population and Housing Census

Authors: Jaroslav Kraus

Abstract:

Spatial information about demographic processes is a standard part of statistical outputs in the Czech Republic. That was also the case for the Population and Housing Census held in 2011, which is the starting point for a follow-up study devoted to two basic types of households: single-person households and households of one complete family. Together, these account for more than 80 percent of all households, but their share and spatial structure have been changing over the long term. The increase in single-person households results from a long-term decrease in fertility, an increase in divorce, and the growing possibility of living separately. There are regions in the Czech Republic with traditional demographic behavior, and regions, such as the capital Prague, with changing patterns. In line with international standards, the census is based on the concept of the currently living population. Three types of geospatial approaches are used for the analysis: (i) measures of geographic distribution; (ii) cluster mapping to identify the locations of statistically significant hot spots, cold spots, spatial outliers, and similar features; and (iii) pattern analysis as a starting point for more in-depth analyses (geospatial regression) in the future. For this type of data, the numbers of households by type are treated as distinct objects, and all events in a meaningfully delimited study region (e.g., municipalities) are included in the analysis. Commonly produced measures of central tendency and spread include identification of the center of the point set (at NUTS3 level), the median center, standard distance, weighted standard distance, and standard deviational ellipses.
Identifying that clustering exists in the census household dataset does not in itself provide a detailed picture of the nature and pattern of that clustering, but simple hot-spot (and cold-spot) identification techniques can usefully be applied to it. Once the spatial structure of households is determined, a measure of spatial autocorrelation can be constructed by defining a way to measure the difference between location attribute values. The most widely used measure is Moran's I, applied here to municipal units for which the numerical ratio is calculated. Local statistics arise naturally out of any of the methods for measuring spatial autocorrelation and can be applied to develop localized variants of almost any standard summary statistic. Local Moran's I gives an indication of household data homogeneity and diversity at the municipal level.
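Global Moran's I, the measure named above, can be sketched directly from its definition; the four-municipality example with binary contiguity weights below is illustrative, not census data.

```python
def morans_i(values, weights):
    """Global Moran's I; weights[i][j] is the spatial weight between units i, j."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_total = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j] for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_total) * (num / den)

# four municipalities on a line with binary contiguity weights; the household
# shares are made-up numbers, not census figures
shares = [10.0, 12.0, 30.0, 33.0]
contiguity = [
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
]
i_global = morans_i(shares, contiguity)  # positive: neighbors resemble each other
```

A value near +1 indicates clustering of similar values, near -1 a checkerboard pattern, and near 0 spatial randomness.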

Keywords: census, geo-demography, households, the Czech Republic

Procedia PDF Downloads 96
2562 Descriptive Study of Tropical Tree Species in Commercial Interest Biosphere Reserve Luki in the Democratic Republic of Congo (DRC)

Authors: Armand Okende, Joëlle De Weerdt, Esther Fichtler, Maaike De Ridder, Hans Beeckman

Abstract:

The rainforest plays a crucial role in regulating the climate balance. The biodiversity of tropical rainforests is undeniable, but many aspects remain poorly known, which directly influences their management. Despite efforts at sustainable forest management, human pressure in the form of exploitation and timber smuggling remains a problem for exploited species whose status is considered 'vulnerable' on the IUCN Red List. Commercial species in Class III of the Democratic Republic of Congo are the least known on the timber market, and knowledge of their biology is scarce or non-existent. Wood identification, in terms of descriptions and anatomical measurements, is in great demand among various stakeholders such as scientists, customs, and the IUCN. The objective of this study is the qualitative and quantitative description of the anatomical characteristics of commercial species in Class III of DR Congo. The Luki Biosphere Reserve was chosen as the study site because of its high tree species richness. This study focuses on the wood anatomy of 14 commercial species of Class III of DR Congo, for which thirty-four wooden discs were collected. The following parameters were measured in the field: diameter at breast height (DBH), total height, and geographic coordinates. Microtomy, measurement of vessel parameters (diameter, density, and grouping), photography of the microscopic sections, and age determination were performed in this study. The results are detailed anatomical descriptions of the Class III species of the Democratic Republic of Congo.

Keywords: sustainable management of forest, rainforest, commercial species of class iii, vessel diameter, vessel density, grouping vessel

Procedia PDF Downloads 214
2561 Non-Destructive Static Damage Detection of Structures Using Genetic Algorithm

Authors: Amir Abbas Fatemi, Zahra Tabrizian, Kabir Sadeghi

Abstract:

The location and severity of damage occurring in a structure can be found from changes in its static and dynamic characteristics. Non-destructive techniques are common, economic, and reliable means of detecting global or local damage in structures. This paper presents a non-destructive method for structural damage detection and assessment using a genetic algorithm (GA) and static data. A set of static forces is applied to some degrees of freedom (DOFs), and the static responses (displacements) are measured at another set of DOFs. An analytical model of the truss structure is developed based on the available specification and the properties derived from static data. Damage changes the stiffness of the structure, so the method determines damage from changes in the structural stiffness parameters. Changes in the static response caused by structural damage yield a set of simultaneous equations. Genetic algorithms are powerful tools for solving large optimization problems; here, optimization minimizes an objective function involving the difference between the static load vectors of the damaged and the healthy structure. Several damage-detection scenarios are defined (single and multiple damage). Static damage identification methods have many advantages, but some difficulties still exist, so achieving the best identification is important; obtaining it indicates that the method is reliable. The strategy is applied to a plane truss. Numerical results demonstrate the ability of the method to detect damage in the given structures, and the multiple-damage scenarios also yield efficient results. Even the presence of noise in the measurements does not reduce the accuracy of damage detection for these structures.
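A toy version of the approach can be sketched as a genetic algorithm recovering per-element stiffness reduction factors from static displacements; the three-spring model and all numbers below are hypothetical simplifications of the truss problem, not the paper's model.

```python
import random

random.seed(1)

K = [100.0, 100.0, 100.0]          # healthy element stiffnesses (hypothetical)
F = 10.0                           # applied static load
TRUE_ALPHA = [1.0, 0.6, 1.0]       # element 2 has lost 40% of its stiffness
# "measured" static displacements of the damaged structure, u = F / (alpha*k)
measured = [F / (a * k) for a, k in zip(TRUE_ALPHA, K)]

def fitness(alpha):
    """Squared mismatch between measured and modeled static displacements."""
    return sum((F / (a * k) - u) ** 2 for a, k, u in zip(alpha, K, measured))

def run_ga(pop_size=40, generations=80):
    pop = [[random.uniform(0.3, 1.0) for _ in K] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]                 # keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = random.sample(elite, 2)         # arithmetic crossover
            child = [(x + y) / 2 + random.gauss(0.0, 0.02) for x, y in zip(p1, p2)]
            children.append([min(max(c, 0.05), 1.0) for c in child])
        pop = elite + children
    return min(pop, key=fitness)

best = run_ga()
```

The GA should localize the damage to element 2 and estimate its severity (alpha close to 0.6) while leaving the healthy elements near 1.0.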

Keywords: damage detection, finite element method, static data, non-destructive, genetic algorithm

Procedia PDF Downloads 237
2560 Duplex Real-Time Loop-Mediated Isothermal Amplification Assay for Simultaneous Detection of Beef and Pork

Authors: Mi-Ju Kim, Hae-Yeong Kim

Abstract:

Product mislabeling and adulteration have raised increasing concern about processed meat products. Relatively inexpensive pork is substituted for meats such as beef for economic benefit, and such food fraud incidents involving pork raise economic, religious, and health concerns. In this study, a rapid on-site detection method using loop-mediated isothermal amplification (LAMP) was developed for the simultaneous identification of beef and pork. Specific LAMP primers for beef and for pork were designed targeting the mitochondrial D-loop region. The LAMP reaction was performed at 65 ℃ for 40 min. The specificity of each primer set was evaluated using DNA extracted from 13 animal species, including cattle and pig. The sensitivity of the duplex LAMP assay was examined by serial dilution of beef and pork DNA and by reference binary mixtures. The assay was then applied to processed meat products containing beef and pork for monitoring. Each primer set amplified only the targeted species, with no cross-reactivity with the other animal species. The limit of detection of the duplex real-time LAMP was 1 pg of DNA for both beef and pork, and 1% pork in a beef-meat mixture. Commercial meat products that declared the presence of beef and/or pork on the label showed positive results for those species. The method successfully detected beef and pork simultaneously in processed meat products, and the optimized duplex LAMP assay can identify both species within less than 40 min. The portable real-time fluorescence device used in this study is suitable for on-site detection of beef and pork in processed meat products. The developed assay is therefore an efficient tool for monitoring meat products.

Keywords: beef, duplex real-time LAMP, meat identification, pork

Procedia PDF Downloads 224
2559 Analysis of Real Time Seismic Signal Dataset Using Machine Learning

Authors: Sujata Kulkarni, Udhav Bhosle, Vijaykumar T.

Abstract:

Due to the closeness between seismic and non-seismic signals, it is difficult to detect earthquakes using conventional methods. In order to distinguish between seismic and non-seismic events depending on their amplitude, our study processes data that come from seismic sensors. The authors suggest a robust noise suppression technique that makes use of a bandpass filter, an IIR Wiener filter, recursive short-term average/long-term average (STA/LTA), and Carl STA/LTA for event identification. The trigger ratio used in the proposed study to differentiate between seismic and non-seismic activity is determined. The proposed work focuses on significant feature extraction for machine learning-based seismic event detection, which motivates compiling a dataset of all features for the identification and forecasting of seismic signals. We place a focus on feature vector dimension reduction techniques due to the temporal complexity. The proposed features were experimentally tested using a machine learning model, and the results on unseen data are optimal. Finally, a demonstration using a hybrid dataset (captured by different sensors) shows how this model may also be employed in a real-time setting while lowering false alarm rates. The study is based on the examination of seismic signals obtained from both individual sensors and sensor networks (SN). A wideband seismic signal from the BSVK and CUKG station sensors, located near Basavakalyan, Karnataka, and at the Central University of Karnataka, respectively, makes up the experimental dataset.
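The STA/LTA trigger ratio described above can be sketched with a plain sliding-window implementation (the recursive and Carl variants used in the paper differ in detail); the synthetic trace and threshold below are illustrative.

```python
def sta_lta(signal, n_sta, n_lta):
    """Sliding-window STA/LTA ratio on the absolute amplitude; a sketch of the
    classic trigger, not the recursive or Carl variants."""
    ratios = []
    for t in range(n_lta, len(signal)):
        sta = sum(abs(x) for x in signal[t - n_sta:t]) / n_sta
        lta = sum(abs(x) for x in signal[t - n_lta:t]) / n_lta
        ratios.append(sta / lta)
    return ratios

# synthetic trace: background noise, a 20-sample burst, then noise again
trace = [0.1] * 200 + [2.0] * 20 + [0.1] * 50
ratios = sta_lta(trace, n_sta=5, n_lta=100)
triggered = any(r > 3.0 for r in ratios)   # 3.0 is an illustrative threshold
```

During quiet intervals the ratio hovers near 1; the short window reacts to the burst far faster than the long one, so the ratio spikes at the event onset.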

Keywords: Carl STA/LTA, feature extraction, real time, dataset, machine learning, seismic detection

Procedia PDF Downloads 124
2558 A Knowledge-Based Development of Risk Management Approaches for Construction Projects

Authors: Masoud Ghahvechi Pour

Abstract:

Risk management is a systematic and regular process of identifying, analyzing, and responding to risks throughout a project's life cycle in order to achieve the optimal level of elimination, reduction, or control of risk. The purpose of project risk management is to increase the probability and effect of positive events and reduce the probability and effect of unpleasant events on the project. Risk management is one of the most fundamental parts of project management: unmanaged or untransferred risks can be a primary factor in a project's failure. Effective risk management is not simply risk avoidance, apparently the cheapest option. The main problem with that option is economic: what is potentially profitable is by definition risky, while what poses no risk is economically uninteresting and brings no tangible benefits. In relation to the implemented project, effective risk management therefore means finding a middle ground. On the one hand, it protects against risk through accurate identification and classification of risk, leading to a comprehensive analysis. On the other hand, management, using all available mathematical and analytical tools, should be based on checking the maximum benefits of these decisions. A detailed analysis, taking into account all aspects of the company, including stakeholder analysis, allows us to add what will become tangible benefits for the project in the future to effective risk management. Identifying project risk rests on the question of which types of risk may affect the project, and refers to specific parameters and to estimating the probability of their occurrence in the project.
These conditions can be divided into three groups: certainty, uncertainty, and risk, which in turn correspond to three investor attitudes: risk preference, risk neutrality, and risk aversion, together with their measurement. The result of risk identification and project analysis is a list of events indicating the cause and probability of each event, and a final assessment of its impact on the environment.

Keywords: risk, management, knowledge, risk management

Procedia PDF Downloads 66
2557 Measurement of Sarcopenia Associated with the Extent of Gastrointestinal Oncological Disease

Authors: Adrian Hang Yue Siu, Matthew Holyland, Sharon Carey, Daniel Steffens, Nabila Ansari, Cherry E. Koh

Abstract:

Introduction: Peritoneal malignancies are challenging cancers to manage. While cytoreductive surgery and hyperthermic intraperitoneal chemotherapy (CRS and HIPEC) may offer a cure, the procedure is considered radical and morbid. Pre-emptive identification of deconditioned patients for optimization may mitigate the risks of surgery. However, the difficulty lies in the scarcity of validated predictive tools for identifying high-risk patients. In recent times, there has been growing interest in sarcopenia, which can occur as a result of malnutrition and malignancy. The purpose of this study was therefore to assess the utility of sarcopenia in predicting post-operative outcomes. Methods: A single quaternary-center retrospective study of CRS and HIPEC patients between 2017 and 2020 was conducted to determine the association between pre-operative sarcopenia and post-operative outcomes. Lumbar CT images were analyzed using Slice-o-matic® to measure sarcopenia. Results: In the cohort (n=94), 40% had sarcopenia, a majority were female (53.2%), and the mean age was 55 years. Sarcopenia was significantly associated with lower weight compared with patients without sarcopenia (72.7 kg vs. 82.2 kg, p=0.014) and with shorter overall survival (1.4 years vs. 2.1 years, p=0.032). Post-operatively, patients with sarcopenia experienced more complications (p=0.001). Conclusion: Complex procedures often require optimization to prevent complications and improve survival. While patient biomarkers such as BMI and weight are used for optimization, this research advocates identifying sarcopenia status in pre-operative planning. Sarcopenia may be an indicator of advanced disease requiring further treatment and is an emerging area of research. Larger studies are required to confirm these findings and to assess the reversibility of sarcopenia after surgery.

Keywords: sarcopenia, cytoreductive surgery, hyperthermic intraperitoneal chemotherapy, surgical oncology

Procedia PDF Downloads 85
2556 Simulating Economic Order Quantity and Reorder Point Policy for a Repairable Items Inventory System

Authors: Mojahid F. Saeed Osman

Abstract:

A repairable items inventory system is a management tool that incorporates all information concerning inventory levels and movements of repaired and new items. This paper presents the development of an effective simulation model for managing the inventory of repairable items in a production system where production lines send their faulty items to a repair shop, considering stochastic failure behavior and repair times. The developed model imitates the handling of the on-hand inventory of repaired items and the replenishment of the inventory of new items under an economic order quantity (EOQ) and reorder point (ROP) ordering policy in a flexible and risk-free environment. The appropriateness and effectiveness of the proposed simulation model are demonstrated using an illustrative case problem. The model can be used as a reliable tool for estimating a healthy on-hand inventory of new and repaired items, backordered items, and downtime due to the unavailability of repaired items, and for validating and examining the EOQ and ROP ordering policy, which will be compared with other ordering strategies in future work.
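The EOQ and ROP quantities driving the replenishment policy can be sketched directly from their textbook formulas; the demand, cost, and lead-time figures below are hypothetical, not from the case problem.

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Classic economic order quantity: sqrt(2 * D * S / H)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

def reorder_point(daily_demand, lead_time_days, safety_stock=0.0):
    """Reorder when on-hand inventory falls to lead-time demand + safety stock."""
    return daily_demand * lead_time_days + safety_stock

# hypothetical item: 1200 units/year, $50 per order, $2/unit/year holding cost
q_star = eoq(annual_demand=1200, order_cost=50.0, holding_cost=2.0)
rop = reorder_point(daily_demand=1200 / 365, lead_time_days=7, safety_stock=5.0)
```

In the simulation, an order of size q_star would be triggered whenever the simulated on-hand inventory of new items drops to rop.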

Keywords: inventory system, repairable items, simulation, maintenance, economic order quantity, reorder point

Procedia PDF Downloads 144
2555 Modeling and Temperature Control of Water-cooled PEMFC System Using Intelligent Algorithm

Authors: Chen Jun-Hong, He Pu, Tao Wen-Quan

Abstract:

The proton exchange membrane fuel cell (PEMFC) is among the most promising future energy sources owing to its low operating temperature, high energy efficiency, high power density, and environmental friendliness. In this paper, a comprehensive control-oriented model of a PEMFC system is developed in the Matlab/Simulink environment, including the hydrogen supply, air supply, and thermal management subsystems. An Improved Artificial Bee Colony (IABC) algorithm is used for parameter identification of the PEMFC semi-empirical equations, keeping the maximum relative error between simulation data and experimental data below 0.4%. Operating temperature is essential for a PEMFC; both high and low temperatures are disadvantageous. In the thermal management subsystem, the water pump and fan are both controlled with PID controllers to maintain the appropriate operating temperature of the PEMFC for safe and efficient operation. To further improve the control effect, fuzzy control is introduced to optimize the PID controller of the pump, and a Radial Basis Function (RBF) neural network is introduced to optimize the PID controller of the fan. The results demonstrate that Fuzzy-PID and RBF-PID achieve a better control effect, with a 22.66% decrease in the Integral Absolute Error (IAE) criterion for T_st (temperature of the PEMFC) and a 77.56% decrease in the IAE for T_in (temperature of the inlet cooling water) compared with traditional PID. Finally, a novel thermal management structure is proposed in which the cooling air passing through the main radiator continues on to cool the secondary radiator. With this structure, parasitic power dissipation is reduced by 69.94%, and the control effect is improved with a 52.88% decrease in the IAE of T_in under the same controller.
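The PID loop at the core of the thermal management subsystem can be sketched with a textbook discrete controller driving a toy first-order thermal model; the gains, heat input, and cooling coefficient below are illustrative assumptions, not the paper's tuned values.

```python
class PID:
    """Textbook discrete PID controller (illustrative gains, not tuned values)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# toy first-order stack model: constant heat input, cooling proportional to effort
temp, setpoint, dt = 90.0, 70.0, 0.1
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=dt)
for _ in range(2000):
    u = pid.step(setpoint, temp)      # negative while the stack is too hot
    cooling = max(-u, 0.0)            # the actuator can only cool, never heat
    temp += dt * (5.0 - 0.5 * cooling)
```

The integral term settles at the value that exactly balances the constant heat input, so the temperature converges to the setpoint; a fuzzy or RBF layer would adjust kp, ki, and kd online instead of keeping them fixed.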

Keywords: PEMFC system, parameter identification, temperature control, Fuzzy-PID, RBF-PID, parasitic power

Procedia PDF Downloads 85
2554 Parametrical Simulation of Sheet Metal Forming Process to Control the Localized Thinning

Authors: Hatem Mrad, Alban Notin, Mohamed Bouazara

Abstract:

The sheet metal forming process consists of multiple successive steps, from sheet fixation to sheet evacuation. After the forming operation, the sheet often has defects requiring additional correction steps. In the drawing process, for example, the formed sheet may exhibit several defects such as springback, localized thinning, and bends. All these defects depend directly on process, geometric, and material parameters, and their prediction and elimination require control of the most sensitive parameters. The present study is concerned with a reliable parametric study of the deep forming process in order to control localized thinning. The proposed approach is based on the stochastic finite element method. In particular, a polynomial chaos expansion is used to establish a reliable relationship between the input variables (process, geometric, and material parameters) and the output variable (sheet thickness). The commercial software Abaqus is used to conduct the numerical finite element simulations. Automated parametric modification is provided by coupling a FORTRAN routine, a PYTHON script, and Abaqus input files.
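The idea behind a polynomial chaos expansion can be sketched in one dimension: project a response function of a standard Gaussian input onto probabilists' Hermite polynomials and read the mean and variance off the coefficients. The quadratic test response below stands in for the FE-computed sheet thickness and is purely illustrative.

```python
import math
import random

random.seed(7)

def hermite_e(k, x):
    """Probabilists' Hermite polynomial He_k(x) via He_{n+1} = x He_n - n He_{n-1}."""
    h_prev, h = 1.0, x
    if k == 0:
        return h_prev
    for n in range(1, k):
        h_prev, h = h, x * h - n * h_prev
    return h

def pce_moments(f, degree=3, n_samples=100_000):
    """Estimate PCE coefficients c_k = E[f(X) He_k(X)] / k! by Monte Carlo
    projection with X ~ N(0,1), then return (mean, variance)."""
    xs = [random.gauss(0.0, 1.0) for _ in range(n_samples)]
    coeffs = [
        sum(f(x) * hermite_e(k, x) for x in xs) / (n_samples * math.factorial(k))
        for k in range(degree + 1)
    ]
    mean = coeffs[0]
    variance = sum(math.factorial(k) * c * c for k, c in enumerate(coeffs) if k > 0)
    return mean, variance

# stand-in for the thickness response: f(x) = x^2 has mean 1 and variance 2
mean, variance = pce_moments(lambda x: x * x)
```

In the actual study the samples would come from the Abaqus runs driven by the FORTRAN/PYTHON coupling rather than from an analytic test function.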

Keywords: sheet metal forming, reliability, localized thinning, parametric simulation

Procedia PDF Downloads 423
2553 A Bayesian Network Approach to Customer Loyalty Analysis: A Case Study of Home Appliances Industry in Iran

Authors: Azam Abkhiz, Abolghasem Nasir

Abstract:

To achieve sustainable competitive advantage in the market, it is necessary to provide and improve customer satisfaction and loyalty. To reach this objective, companies need to identify and analyze their customers, so it is critical to measure the level of customer satisfaction and loyalty very carefully. This study builds a conceptual model to provide clear insights into customer loyalty. Using Bayesian networks (BNs), a model is proposed to evaluate customer loyalty and its consequences, such as repurchase and positive word of mouth. A BN is a probabilistic approach that predicts the behavior of a system based on observed stochastic events. The most relevant determinants of customer loyalty are identified through a literature review: perceived value, service quality, trust, corporate image, satisfaction, and switching costs are the most important variables explaining customer loyalty. The data were collected through a questionnaire-based survey of 1430 customers of a home appliances manufacturer in Iran. Four scenarios and sensitivity analyses were performed to analyze the impact of the different determinants on customer loyalty. The proposed model allows businesses not only to set their targets but also to proactively manage customer behavior.
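A miniature Bayesian network in the spirit of the proposed model can be sketched with exact inference by enumeration; the structure (quality to satisfaction to loyalty) and all probabilities below are illustrative, not estimates from the survey.

```python
from itertools import product

# Illustrative CPTs for a chain quality -> satisfaction -> loyalty
p_quality = {1: 0.7, 0: 0.3}            # P(quality)
p_sat_given_q = {1: 0.9, 0: 0.4}        # P(satisfied = 1 | quality)
p_loyal_given_s = {1: 0.8, 0: 0.2}      # P(loyal = 1 | satisfaction)

def joint(q, s, l):
    """Joint probability of one full assignment of the three binary nodes."""
    pq = p_quality[q]
    ps = p_sat_given_q[q] if s == 1 else 1 - p_sat_given_q[q]
    pl = p_loyal_given_s[s] if l == 1 else 1 - p_loyal_given_s[s]
    return pq * ps * pl

def p_loyal_given_quality(q):
    """P(loyal = 1 | quality = q) by enumerating the hidden satisfaction node."""
    num = sum(joint(q, s, 1) for s in (0, 1))
    den = sum(joint(q, s, l) for s, l in product((0, 1), repeat=2))
    return num / den
```

Sensitivity analysis in this setting amounts to perturbing a CPT entry and re-running the same query, which is how the scenarios above probe each determinant's impact on loyalty.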

Keywords: customer satisfaction, customer loyalty, Bayesian networks, home appliances industry

Procedia PDF Downloads 139
2552 Competition and Cooperation of Prosumers in Cournot Games with Uncertainty

Authors: Yong-Heng Shi, Peng Hao, Bai-Chen Xie

Abstract:

Solar prosumers are playing increasingly prominent roles in the power system. However, their uncertainty affects the outcomes and functioning of the power market, especially in an asymmetric information environment. An important issue, therefore, is how to take effective measures to reduce the impact of uncertainty on market equilibrium. We propose a two-level stochastic differential game model to explore the Cournot decision problem of prosumers. In particular, we study the impact of punishment and cooperation mechanisms on the efficiency of a Cournot game in which prosumers face uncertainty. The results show that under fixed-rate and variable-rate penalty mechanisms, prosumers tend to take conservative actions to hedge risks, and the variable-rate mechanism is more reasonable. Compared with the non-cooperative situation, prosumers can improve the efficiency of the game through cooperation, which we attribute to the superposition of market power and uncertainty reduction. In addition, a market environment of asymmetric information intensifies the role of uncertainty: it reduces social welfare but increases the income of prosumers. For regulators, promoting alliances is an effective measure to realize the integration, optimization, and stable grid connection of producers and consumers.
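The deterministic benchmark underneath such a model, the non-cooperative Cournot equilibrium, can be computed by best-response iteration. The linear demand and cost numbers below are arbitrary placeholders, far simpler than the paper's two-level stochastic differential game:

```python
# Symmetric two-player Cournot game with inverse demand P = a - b*Q
a, b, c = 100.0, 1.0, 10.0   # demand intercept, slope, marginal cost (made up)

def best_response(q_other):
    # argmax over q of profit (a - b*(q + q_other) - c) * q
    return max(0.0, (a - c - b * q_other) / (2 * b))

q1 = q2 = 0.0
for _ in range(100):
    q1, q2 = best_response(q2), best_response(q1)

# Iteration converges to the Nash quantity (a - c) / (3b) = 30 per firm;
# under full cooperation each firm would instead produce (a - c)/(4b) = 22.5,
# restricting output to raise price, the market-power effect noted above.
```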

Keywords: Cournot games, power market, uncertainty, prosumer cooperation

Procedia PDF Downloads 107
2551 Automated Method Time Measurement System for Redesigning Dynamic Facility Layout

Authors: Salam Alzubaidi, G. Fantoni, F. Failli, M. Frosolini

Abstract:

The dynamic facility layout problem is a critical issue in the competitive industrial market; solving it requires robust design and effective simulation systems. Sustainable simulation requires inputting reliable and accurate data into the system. This paper therefore describes an automated system, integrated into the real environment, that measures the duration of material handling operations, collects the data in real time, and determines the variances between the actual and estimated time schedules of the operations in order to update the simulation software and redesign the facility layout periodically. The automated method time measurement system collects the real data using Radio Frequency Identification (RFID) and Internet of Things (IoT) technologies. Attaching an RFID antenna reader and RFID tags enables the system to identify the locations of objects and gather timing data. The gathered durations are then processed by calculating the moving average duration of the material handling operations, choosing the shortest material handling path, and updating the simulation software to redesign the facility layout to accommodate the shortest/actual operation schedule. Periodic simulation in real time is more sustainable and reliable than a simulation system relying on analysis of historical data. The case study for this methodology is conducted in cooperation with a workshop team producing mechanical parts. Although there are some technical limitations, this methodology is promising and can be significantly useful in redesigning the manufacturing layout.
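The moving-average step described above is straightforward to sketch. The class below is a minimal illustration of smoothing RFID-timestamped handling durations; the window size and durations are hypothetical:

```python
from collections import deque

class MovingAverageTimer:
    """Keeps a rolling mean of the last `window` handling durations (seconds)."""
    def __init__(self, window=5):
        self.samples = deque(maxlen=window)   # old samples drop off automatically

    def record(self, duration):
        self.samples.append(duration)
        return sum(self.samples) / len(self.samples)

timer = MovingAverageTimer(window=3)
for d in (12.0, 14.0, 10.0, 16.0):            # durations from RFID reads
    avg = timer.record(d)
# after the loop, avg reflects only the last three samples: 14, 10, 16
```

In the described system, each new average would be compared against the estimated schedule, and a sufficiently large variance would trigger a simulation update.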

Keywords: dynamic facility layout problem, internet of things, method time measurement, radio frequency identification, simulation

Procedia PDF Downloads 120
2550 The Construct of Personal Choice within Individual Language Shift: A Phenomenological Qualitative Study

Authors: Kira Gulko Morse

Abstract:

Choosing one’s primary language may not be as common as choosing an additional foreign language to study or to use during travel. In some instances, however, it becomes a matter of internal personal struggle, as language is tied not only to specific circumstances but also to a person's background and identity. This phenomenological qualitative study focuses on the factors affecting the decision of a person to undergo a language shift. Specifically, it considers how these factors relate to identity negotiation and expression. The data for the study include the analysis of published autobiographical narratives and personal interviews conducted using the Responsive Interviewing model. While the research participants come from a variety of geographical locations and had different reasons for undergoing their individual language shift, the study identifies a number of common features shared by all of them. Specifically, while all the participants have been able to maintain their first language to varying degrees of proficiency, they have all completed the shift to establish a primary language different from their first. Additionally, the process of self-identification is found to be directly connected to the phenomenon of language choice for each of the participants. The findings of the study further tie the phenomenon of individual language shift to more comprehensive individual life choices: ethnic revival, immigration, and inter-cultural marriage, among others. The study discusses varying language roles, and the data indicate that language shift may occur whether language is a symbolic driving force or a secondary means of fulfilling a set life goal. The concept of language addition is suggested as an alternative to the arbitrariness of language shift. Thus, instead of focusing on subtractive bilingualism or language loss, the emphasis becomes the integration of languages within the individual.
The study emphasizes the importance of the construct of personal choice in its connection to individual language shift. It shifts the focus from society onto the individual and the individual's ability to make decisions in matters of linguistic identification.

Keywords: choice theory, identity negotiation, language shift, psycholinguistics

Procedia PDF Downloads 135
2549 A Taxonomic Study on Cephalopods (Mollusca: Cephalopoda) from the Northern Bay of Bengal

Authors: Foyezunnesa Setu, S. M. Sharifuzzaman

Abstract:

Cephalopods, belonging to the taxonomic class Cephalopoda under the phylum Mollusca, have a global distribution and are particularly common in the coastal waters of Bangladesh, specifically in the southeast and southwest regions. Identifying them can be difficult due to their pliable anatomical characteristics. Because of the presence of cryptic cephalopod species within the orders Sepioidea, Teuthoidea, and Octopoda, these invertebrates, which share common characteristics, are frequently misidentified. Until now, cephalopods in this region have been largely neglected because knowledge of the specific species is insufficient and the necessary preliminary research has not been done. This study offers a systematic description of various cephalopod species found along the southeastern coast of Bangladesh. A total of 25 cuttlefish specimens, four squid specimens, and five octopus specimens were gathered from the shores of Saint Martin's Island and Cox's Bazar. Based on morphological analysis, 14 cephalopod species are identified: Sepia aculeata, Sepia esculenta, Sepia pharaonis, Sepia prashadi, Sepiella inermis, Sepiella japonica, Uroteuthis duvauceli, Doryteuthis singhalensis, Sepioteuthis sepioidea, Euprymna stenodactyla, Amphioctopus aegina, Callistoctopus macropus, Octopus cyanea, and Octopus vulgaris. Six of these, Sepia prashadi, Sepiella japonica, Sepioteuthis sepioidea, Euprymna stenodactyla, Callistoctopus macropus, and Octopus cyanea, are new records for Bangladesh. Taxonomically, the identification of cephalopods is difficult due to the strong resemblance between species and the scarcity of information and preparatory research. This study offers significant insights into the cephalopod fauna of the northern region of the Bay of Bengal.

Keywords: cephalopods, new records, northern bay of bengal, taxonomic identification

Procedia PDF Downloads 89
2548 Revisiting the Fiscal Theory of Sovereign Risk from the DSGE View

Authors: Eiji Okano, Kazuyuki Inagaki

Abstract:

We revisit Uribe's `Fiscal Theory of Sovereign Risk', which advocates that there is a trade-off between stabilizing inflation (SI) and suppressing default (SD). We develop a class of dynamic stochastic general equilibrium (DSGE) models with nominal rigidities and compare two de facto inflation stabilization policies, optimal monetary policy and optimal monetary and fiscal policy, with an interest-rate-spread-minimizing policy that completely suppresses default. Under the optimal monetary and fiscal policy, not only the nominal interest rate but also the tax rate works to minimize welfare costs by stabilizing inflation. Under the optimal monetary and fiscal policy, both inflation and the output gap are completely stabilized, whereas they fluctuate under the optimal monetary policy alone. In addition, volatility in the default rate under the optimal monetary and fiscal policy is considerably lower than that under the optimal monetary policy; thus, there is no SI-SD trade-off. Moreover, while the interest-rate-spread-minimizing policy makes the inflation rate severely volatile, the optimal monetary and fiscal policy stabilizes both inflation and default. The trade-off between stabilizing inflation and suppressing default is therefore not as severe as Uribe points out.

Keywords: sovereign risk, optimal monetary policy, fiscal theory of the price level, DSGE

Procedia PDF Downloads 321
2547 Identification of Promiscuous Epitopes for Cellular Immune Responses in the Major Antigenic Protein Rv3873 Encoded by Region of Difference 1 of Mycobacterium tuberculosis

Authors: Abu Salim Mustafa

Abstract:

Rv3873 is a relatively large protein (371 amino acids), and its gene is located in the immunodominant genomic region of difference (RD)1, which is present in the genome of Mycobacterium tuberculosis but deleted from the genomes of all vaccine strains of Bacillus Calmette-Guerin (BCG) and most other mycobacteria. However, when tested for cellular immune responses using peripheral blood mononuclear cells from tuberculosis patients and BCG-vaccinated healthy subjects, this protein was found to be a major stimulator of cell-mediated immune responses in both groups. In order to identify the sequences of immunodominant epitopes and explore their Human Leukocyte Antigen (HLA) restriction, 24 peptides (25-mers overlapping neighboring peptides by 10 residues) covering the sequence of Rv3873 were synthesized chemically using fluorenylmethyloxycarbonyl chemistry and tested in cell-mediated immune response assays. The results of these experiments identified an immunodominant peptide, P9, that was recognized by people expressing varying HLA-DR types. Furthermore, it was predicted to be a promiscuous binder with multiple epitopes for binding to the HLA-DR, HLA-DP, and HLA-DQ alleles of HLA class II molecules, which present antigens to T helper cells, and to HLA class I molecules, which present antigens to cytotoxic T cells. In addition, evaluation of peptide P9 using an immunogenicity prediction server yielded a high score (0.94), indicating a high probability that this peptide elicits a protective cellular immune response. In conclusion, P9, a peptide with multiple epitopes and the ability to bind several HLA class I and class II molecules for presentation to cells of the cellular immune response, may be useful as a peptide-based vaccine against tuberculosis.
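The peptide panel layout described above (25-mers overlapping by 10 residues) can be generated mechanically. The sketch below uses a dummy 371-residue string rather than the actual Rv3873 sequence:

```python
import string

def overlapping_peptides(seq, length=25, overlap=10):
    """Sliding windows of `length` residues, each overlapping its
    neighbour by `overlap` residues (step = length - overlap = 15)."""
    step = length - overlap
    return [seq[i:i + length] for i in range(0, len(seq) - length + 1, step)]

# A 371-residue dummy sequence stands in for Rv3873
dummy = (string.ascii_uppercase * 15)[:371]
peps = overlapping_peptides(dummy)
# 24 windows, matching the panel described in the abstract;
# neighbouring peptides share their terminal 10 residues
```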

Keywords: mycobacterium tuberculosis, PPE68, peptides, vaccine

Procedia PDF Downloads 135
2546 Comparative Correlation Investigation of Polynuclear Aromatic Hydrocarbons (PAHs) in Soils of Different Land Uses: Sources Evaluation Perspective

Authors: O. Onoriode Emoyan, E. Eyitemi Akporhonor, Charles Otobrise

Abstract:

Polycyclic aromatic hydrocarbons (PAHs) are formed mainly as a result of incomplete combustion of organic materials during industrial and domestic activities, or through natural occurrence. Their toxicity and contamination of terrestrial and aquatic ecosystems have been established. Though with limited validity indices, previous research has focused on PAH isomer pair ratios of variable physicochemical properties for source identification. The objective of this investigation was to determine the empirical validity of the Pearson correlation coefficient (PCC) and cluster analysis (CA) in PAH source identification across soil samples of different land uses. To this end, 16 PAHs classified as endocrine disrupting substances (EDSs) were determined seasonally in topsoil and subsoil at 10 sampling stations. PAHs were determined using a Varian 300 gas chromatograph interfaced with a flame ionization detector. Instruments and reagents used were of standard and chromatographic grades, respectively. PCC and CA results showed that classifying PAHs into kinetically favoured and thermodynamically favoured groups, together with those derived directly from plant products through biologically mediated processes, indicates what the predominant PAH sources are likely to be. The studied stations, however, contained trace quantities of the vast majority of the sixteen unsubstituted PAHs, which may ultimately inhibit authentication of the actual source signature. The type and extent of bacterial metabolism, transformation products/substrates, and environmental factors such as salinity, pH, oxygen concentration, nutrients, light intensity, temperature, co-substrates, and environmental medium are hereby recommended as factors to be considered when evaluating possible sources of PAHs.
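The PCC step used for source evaluation is simple to illustrate. The concentrations below are hypothetical numbers, not the study's measurements; a strong positive correlation between two PAHs across stations is the kind of evidence used to suggest a common source:

```python
import numpy as np

# Hypothetical concentrations (ng/g) of two PAHs across 10 stations
phenanthrene = np.array([12.1, 8.4, 15.2, 9.9, 20.3, 11.7, 7.8, 14.6, 18.0, 10.5])
pyrene       = np.array([10.9, 7.1, 14.8, 9.2, 19.5, 10.8, 6.9, 13.1, 17.2, 9.8])

def pearson(x, y):
    """Pearson correlation coefficient: covariance over product of std devs."""
    xd, yd = x - x.mean(), y - y.mean()
    return (xd * yd).sum() / np.sqrt((xd**2).sum() * (yd**2).sum())

r = pearson(phenanthrene, pyrene)
# r near +1 suggests the two compounds co-vary, consistent with a shared source
```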

Keywords: comparative correlation, kinetically and thermodynamically-favored PAHs, pearson correlation coefficient, cluster analysis, sources evaluation

Procedia PDF Downloads 419
2545 Bayesian Value at Risk Forecast Using Realized Conditional Autoregressive Expectile Model with an Application to Cryptocurrency

Authors: Niya Chen, Jennifer Chan

Abstract:

In the financial market, risk management helps to minimize potential loss and maximize profit. There are two ways to assess risks. The first is to calculate the risk directly based on volatility; the most common risk measurements are Value at Risk (VaR), the Sharpe ratio, and beta. Alternatively, we can look at quantiles of the return distribution to assess the risk. Popular return models such as GARCH and stochastic volatility (SV) model the return distribution by capturing the volatility dynamics; the quantile/expectile approach, by contrast, gives us a view of the distribution at extreme return values and allows us to forecast VaR directly from returns. The advantage of these non-parametric methods is that they are not bound by the distributional assumptions of parametric methods. The difference between them is that expectile regression uses a second-order loss function while quantile regression uses a first-order loss function. We consider several quantile functions, different volatility measures, and estimates from some volatility models. To estimate the expectiles, we use the Realized Conditional Autoregressive Expectile (CARE) model with a Bayesian method. We would like to see whether our proposed models outperform existing models on cryptocurrency, testing mainly on Bitcoin as well as Ethereum.
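The second-order (asymmetric squared) loss that defines an expectile can be sketched with a plain sample estimator. This is only the static building block, not the realized CARE model or its Bayesian estimation; the simulated "returns" are a stand-in for cryptocurrency data:

```python
import numpy as np

def expectile(returns, tau=0.05, tol=1e-10):
    """tau-expectile via iteratively reweighted least squares:
    minimizes sum of |tau - 1{r <= m}| * (r - m)^2 over m."""
    m = returns.mean()
    for _ in range(200):
        w = np.where(returns <= m, 1 - tau, tau)   # asymmetric weights
        m_new = (w * returns).sum() / w.sum()      # first-order condition
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

rng = np.random.default_rng(1)
r = rng.standard_normal(10_000) * 0.02     # simulated daily returns
low_tail = expectile(r, tau=0.05)          # a VaR-style loss threshold
```

The tau = 0.5 expectile reduces to the sample mean, while small tau pulls the estimate into the loss tail, which is what makes expectiles usable as risk measures.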

Keywords: expectile, CARE Model, CARR Model, quantile, cryptocurrency, Value at Risk

Procedia PDF Downloads 109
2544 Off-Farm Work and Cost Efficiency in Staple Food Production among Small-Scale Farmers in North Central Nigeria

Authors: C. E. Ogbanje, S. A. N. D. Chidebelu, N. J. Nweze

Abstract:

The study evaluated off-farm work and cost efficiency in staple food production among small-scale farmers in North Central Nigeria. A multistage sampling technique was used to select 360 respondents (participants and non-participants in off-farm work). Primary data obtained were analysed using a stochastic cost frontier and tests of mean differences. Capital input was lower for participants (N2,596.58) than for non-participants (N11,099.14). Gamma (γ) was statistically significant. Farm size significantly (p<0.01) increased cost outlay for both participants and non-participants. Average input prices of enterprises one and two significantly (p<0.01) increased cost. Sex, household size, credit obtained, formal education, farming experience, and farm income significantly (p<0.05) reduced cost inefficiency for non-participants. Average cost efficiency was 11%, indicating that farm capital was wasted. Participants' substitution of capital for labour did not put them at a disadvantage. Extension agents should encourage farmers to obtain financial relief from off-farm work, but not to the extent of endangering farm cost efficiency.
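The composed-error idea behind a stochastic cost frontier can be sketched with simulated data and a corrected-OLS (COLS) approximation; the study itself would use maximum likelihood, and all numbers below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 360                                   # one observation per respondent
farm_size = rng.uniform(0.5, 5.0, n)      # hectares (hypothetical)

# Composed error: symmetric noise v plus one-sided inefficiency u >= 0,
# so observed cost sits on or above the frontier
v = rng.normal(0.0, 0.1, n)
u = np.abs(rng.normal(0.0, 0.3, n))
log_cost = 1.0 + 0.8 * np.log(farm_size) + v + u

# OLS fit, then shift the intercept down to the best performer (COLS)
X = np.column_stack([np.ones(n), np.log(farm_size)])
beta, *_ = np.linalg.lstsq(X, log_cost, rcond=None)
resid = log_cost - X @ beta
frontier_resid = resid - resid.min()       # distance above the cost frontier
cost_efficiency = np.exp(-frontier_resid)  # 1 = fully efficient
```

A low average of `cost_efficiency` plays the same interpretive role as the 11% figure reported above: most farms spend far more than the frontier cost for their output.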

Keywords: cost efficiency, mean difference, North Central Nigeria, off-farm work, participants and non-participants, small-scale farmers

Procedia PDF Downloads 362
2543 The Crossroads of Corruption and Terrorism in the Global South

Authors: Stephen M. Magu

Abstract:

The 9/11 and Christmas Day bombing attacks in the United States are mostly associated with the inability of intelligence agencies to connect dots based on intelligence that was already available. The 1998, 2002, 2013, and several 2014 terrorist attacks in Kenya, on the other hand, were probably driven by a completely different dynamic: the invisible hand of corruption. The World Bank and Transparency International annually compute the Worldwide Governance Indicators and the Corruption Perception Index, respectively. What is perhaps not adequately captured in these corruption metrics is the impact of corruption on terrorism. The World Bank data include variables such as control of corruption, (estimates of) government effectiveness, political stability and absence of violence/terrorism, regulatory quality, rule of law, and voice and accountability. TI's CPI does not include measures related to terrorism, but it is plausible to expect some terrorism impact arising from corruption. This paper, by examining the incidence, frequency, and total number of terrorist attacks that have occurred since 1990, and further examining the specific cases of Kenya and Nigeria, argues that in addition to its major effects on governance, corruption has an even more frightening impact: facilitating the violation of security mechanisms to the extent that foreign nationals can easily obtain identification that enables them to perpetrate major attacks targeting powerful countries' interests in countries with weak corruption-fighting mechanisms. The paper aims to model interactions that demonstrate the cost-benefit analysis and agents' seemingly rational calculations as being non-rational, given the ultimate impact.
It argues that eradication of corruption is not just a matter of a better business environment, but is implicit in national security, and that for anti-corruption crusaders this is an argument more potent than the economic cost-of-doing-business argument.

Keywords: corruption, global south, identification, passports, terrorism

Procedia PDF Downloads 422
2542 Security Model for RFID Systems

Authors: John Ayoade

Abstract:

Radio Frequency Identification (RFID) has gained much popularity in all walks of life due to its usefulness and the diverse use of the technology in almost every application. However, there have been security concerns, especially regarding how readers and tags can confirm each other's authenticity before confidential data are exchanged between them. In this paper, the Kerberos protocol is adopted for the mutual authentication of RFID system components in order to ensure secure communication between those components and to verify the authenticity of the communicating components.
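The mutual-authentication goal can be illustrated with a simplified challenge-response exchange over a shared session key. This is only a sketch of the idea: real Kerberos involves a trusted key distribution center issuing time-limited tickets, which is omitted here:

```python
import hashlib
import hmac
import os

# Assume a KDC has already distributed this session key to reader and tag
session_key = os.urandom(16)

def prove(key, challenge):
    """Keyed response proving possession of the key without revealing it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

# Reader authenticates the tag: fresh nonce out, keyed response back
reader_nonce = os.urandom(16)
tag_response = prove(session_key, reader_nonce)        # computed by the tag
assert hmac.compare_digest(tag_response, prove(session_key, reader_nonce))

# Tag authenticates the reader symmetrically with its own fresh nonce
tag_nonce = os.urandom(16)
reader_response = prove(session_key, tag_nonce)        # computed by the reader
assert hmac.compare_digest(reader_response, prove(session_key, tag_nonce))
```

Fresh nonces on both sides prevent replay of old responses; only a party holding the session key can answer either challenge.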

Keywords: RFID, security, mutual authentication, Kerberos

Procedia PDF Downloads 469
2541 Optimizing the Passenger Throughput at an Airport Security Checkpoint

Authors: Kun Li, Yuzheng Liu, Xiuqi Fan

Abstract:

High security standards and high screening efficiency appear to be contradictory goals in the airport security check process, so improving efficiency as far as possible while maintaining the same security standard is significantly meaningful. This paper utilizes knowledge of operations research and stochastic processes to establish mathematical models to explore this problem. We analyze the current airport security check process and use the M/G/1 and M/G/k models of queuing theory to describe it. We find that the least efficient part, the bottleneck of the queuing system, is the pre-check lane. To improve passenger throughput and reduce the variance of passengers' waiting times, we adjust our models, apply the Monte Carlo method, and put forward three modifications: flexibly adjusting the ratio of pre-check lanes to regular lanes, determining the optimal number of security screening lines based on cost analysis, and adjusting the distribution of arrival and service times based on Monte Carlo simulation results. We also analyze the impact of cultural differences as a sensitivity analysis. Finally, we give recommendations for the current airport security check process.
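For the M/G/1 description of a single lane, the Pollaczek-Khinchine formula gives the mean waiting time in closed form. The passenger rates below are illustrative numbers, not figures from the paper:

```python
# Pollaczek-Khinchine mean waiting time for one screening lane (M/G/1)
def mg1_wait(lam, mean_s, var_s):
    """lam: arrivals per minute; mean_s, var_s: service-time mean and variance
    (minutes). Returns expected time in queue, E[Wq] = lam*E[S^2] / (2*(1-rho))."""
    rho = lam * mean_s                     # server utilization
    if rho >= 1:
        raise ValueError("unstable queue: utilization must be < 1")
    second_moment = var_s + mean_s**2      # E[S^2]
    return lam * second_moment / (2 * (1 - rho))

# Illustrative: 1.6 pax/min, 30 s mean service, 15 s standard deviation
w = mg1_wait(lam=1.6, mean_s=0.5, var_s=0.25**2)   # -> 1.25 minutes
```

Note that the formula depends on the service-time variance, not just its mean, which is exactly why reducing variance (the paper's third modification) shortens waits even at fixed throughput.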

Keywords: queuing theory, security check, stochastic process, Monte Carlo simulation

Procedia PDF Downloads 200
2540 Robust Processing of Antenna Array Signals under Local Scattering Environments

Authors: Ju-Hong Lee, Ching-Wei Liao

Abstract:

An adaptive array beamformer is designed to automatically preserve desired signals while cancelling interference and noise. Providing robustness against model mismatches and tracking possible environment changes calls for robust adaptive beamforming techniques. The design criterion yields the well-known generalized sidelobe canceller (GSC) beamformer. In practice, knowledge of the desired steering vector can be imprecise, which often occurs due to estimation errors in the direction of arrival (DOA) of the desired signal or imperfect array calibration. In these situations, the signal of interest (SOI) is treated as interference, and the performance of the GSC beamformer is known to degrade. This undesired behavior results in a reduction of the array output signal-to-interference-plus-noise ratio (SINR). Therefore, it is worth developing robust techniques to deal with the problems caused by local scattering environments. As to implementation, the computational complexity of adaptive beamforming is enormous when the array is equipped with massive antenna sensors. To alleviate this difficulty, a GSC with partial adaptivity, requiring fewer adaptive degrees of freedom and offering faster adaptive response, has been proposed in the literature. Unfortunately, it has been shown that conventional GSC-based adaptive beamformers are usually very sensitive to the mismatch problems caused by local scattering. In this paper, we present an effective GSC-based beamformer that counters the mismatch problems mentioned above. The proposed beamformer adaptively estimates the actual direction of the desired signal by using the presumed steering vector and the received array data snapshots. We utilize the predefined steering vector and a presumed angle tolerance range to carry out the estimation required for obtaining an appropriate steering vector. A matrix associated with the direction vectors of the signal sources is first created.
Projection matrices related to this matrix are then generated and utilized to iteratively estimate the actual direction vector of the desired signal. As a result, the quiescent weight vector and the signal blocking matrix required for adaptive beamforming can easily be found. Using the proposed GSC-based beamformer, we find that the performance degradation due to local scattering environments can be effectively mitigated. To further enhance beamforming performance, a signal subspace projection matrix is also introduced into the proposed beamformer. Several computer simulation examples show that the proposed GSC-based beamformer outperforms existing robust techniques.
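The baseline GSC structure the paper builds on can be sketched in a few lines for a narrowband uniform linear array with an ideal (mismatch-free) steering vector. The scenario below is invented for illustration; the paper's actual contribution, iteratively estimating the true direction under local scattering, is not reproduced here:

```python
import numpy as np

def steering(theta_deg, n=8):
    """Steering vector of an n-element, half-wavelength-spaced ULA."""
    phase = np.pi * np.sin(np.deg2rad(theta_deg)) * np.arange(n)
    return np.exp(1j * phase)

n, T = 8, 2000
rng = np.random.default_rng(3)
a = steering(0.0, n)                       # presumed desired direction
a_i = steering(40.0, n)                    # interference direction

s = rng.standard_normal(T)                 # desired signal
i_sig = 3.0 * rng.standard_normal(T)       # strong interferer
noise = 0.1 / np.sqrt(2) * (rng.standard_normal((n, T))
                            + 1j * rng.standard_normal((n, T)))
x = np.outer(a, s) + np.outer(a_i, i_sig) + noise   # array snapshots

# Quiescent branch: unit (distortionless) response toward the presumed direction
w_q = a / (a.conj() @ a)
d = w_q.conj() @ x

# Blocking matrix: columns orthogonal to a, so z contains no desired signal
_, _, Vh = np.linalg.svd(a.conj().reshape(1, -1))
B = Vh[1:].conj().T
z = B.conj().T @ x

# Adaptive weights minimize output power over the snapshots (least squares)
w_a = np.linalg.solve(z @ z.conj().T, z @ d.conj())
y = d - w_a.conj() @ z                     # beamformer output
```

With an exact steering vector the output `y` retains the desired signal while the adaptive branch cancels the interferer; under steering mismatch the blocking matrix leaks the SOI into `z`, which is precisely the self-cancellation problem the abstract addresses.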

Keywords: adaptive antenna beamforming, local scattering, signal blocking, steering mismatch

Procedia PDF Downloads 112