Search results for: fuzzy sets.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1444

544 Optimizing and Evaluating Performance Quality Control of the Production Process of Disposable Essentials Using a Vague Goal Programming Approach

Authors: Hadi Gholizadeh, Ali Tajdin

Abstract:

To have effective production planning, it is necessary to control the quality of processes. This paper aims at improving the performance of the disposable essentials process using statistical quality control and goal programming in a vague environment, which expresses the uncertainty that arises because measurement error always exists in the real world. The conditions are therefore examined in a vague, distance-based environment. The disposable essentials process at Kach Company was studied. Statistical control tools were used to characterize the existing process for four factor responses: the average weights, heights, crater diameters, and volumes of disposable glasses. Goal programming was then used to find the combination of optimal factor settings in a vague environment, which captures the uncertainty of the initial information when some of the model parameters are vague; a fuzzy regression model is also used to predict the responses of the four described factors. Optimization results show that the process capability index values for the average weights, heights, crater diameters, and volumes of the disposable glasses were improved. This increases product quality and reduces waste, which lowers the cost of the finished product and ultimately improves customer satisfaction, and that satisfaction translates into increased sales.
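
As a concrete illustration of the process capability index referred to in the optimization results, the short sketch below computes Cpk for one response from sample measurements; the specification limits and weight data are hypothetical placeholders, not values from the Kach Company study.

```python
import numpy as np

def cpk(samples, lsl, usl):
    """Process capability index Cpk from a sample, assuming approximate normality."""
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

# Hypothetical weights (g) of disposable glasses and illustrative spec limits.
rng = np.random.default_rng(0)
weights = rng.normal(loc=5.2, scale=0.08, size=200)
print(f"Cpk = {cpk(weights, lsl=4.9, usl=5.5):.3f}")
```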

Keywords: Goal programming, quality control, vague environment, disposable glasses’ optimization, fuzzy regression.

Downloads: 1039
543 Reducing the Need for Multi-Input Multi-Output in Multi-Beam Base Transceiver Station Antennas Using Orthogonally-Polarized Feeds with an Arbitrary Number of Ports

Authors: Mohamed Sanad, Noha Hassan

Abstract:

A multi-beam BTS (Base Transceiver Station) antenna has been developed using dual parabolic cylindrical reflectors. The ±45° polarization feeds used in spatial diversity MIMO (Multi-Input Multi-Output) can be replaced by single-port orthogonally polarized feeds. Then, with two sets of beams generated above each other, the ±45° polarization ports of any conventional transceiver can be connected to two of these beam sets. Thus, with two-port transceivers, the system will be equivalent to 4x4 MIMO instead of 2x2. Radio Frequency (RF) power combiners/splitters can also be used to combine the multiple beams into a single beam or any arbitrary number of beams/ports. The gain of the combined beam will be 20-24 dBi, instead of the 17-18 dBi of conventional wide-beam antennas. Furthermore, the gain of the combined beam will be high over the whole beam angle, so users will always be close to the peak gain value of the combined beam regardless of their location within the combined beam angle. The frequency bands of all the combined beams are adjusted such that they all share the same frequency band. Different configurations of RF power splitters/combiners can be used to provide any arbitrary number of beams/ports according to the requirements of any existing base station configuration.

Keywords: 5G mobile communications, BTS antennas, MIMO, orthogonally polarized antennas, multi-beam antennas.

Downloads: 703
542 Expert Based System Design for Integrated Waste Management

Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy

Abstract:

Recently, an increasing number of researchers have been focusing on working out realistic solutions to sustainability problems. As sustainability issues gain importance for organisations, the management of such decisions becomes critical. Knowledge representation is a fundamental issue in complex knowledge-based systems, and many types of sustainability problems would benefit from models based on experts’ knowledge. Cognitive maps have been used for analyzing and aiding decision making, and a cognitive map can be made of almost any system or problem. A fuzzy cognitive map (FCM) can successfully represent knowledge and human experience, introducing concepts to represent the essential elements and the cause-and-effect relationships among the concepts to model the behaviour of any system. Integrated waste management systems (IWMS) are complex systems that can be decomposed into related and non-related subsystems and elements, where many factors have to be taken into consideration that may be complementary, contradictory, and competitive; these factors influence each other and determine the overall decision process of the system. The goal of the present paper is to construct an efficient IWMS which considers these various factors. The authors propose an expert-based system design approach for implementing expert decision support in the area of IWMSs and introduce an appropriate methodology for the development and analysis of group FCMs. A framework for such a methodology, consisting of development and application phases, is presented.
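
To make the FCM mechanics concrete, here is a minimal, self-contained sketch of the standard FCM inference rule (sigmoid-squashed spreading of weighted activations). The four concepts and the weight matrix are illustrative placeholders, not the experts' group map developed in the paper.

```python
import numpy as np

def fcm_simulate(weights, state, steps=20, lam=1.0):
    """Iterate a fuzzy cognitive map: A(t+1) = sigmoid(A(t) + W^T A(t))."""
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-lam * x))
    for _ in range(steps):
        state = sigmoid(state + weights.T @ state)
    return state

# Hypothetical 4-concept IWMS map: collection rate, recycling capacity,
# landfill load, public participation (weights are illustrative only;
# W[j, i] is the influence of concept j on concept i).
W = np.array([[0.0,  0.4, -0.3,  0.0],
              [0.0,  0.0, -0.5,  0.0],
              [0.0,  0.0,  0.0, -0.2],
              [0.6,  0.3,  0.0,  0.0]])
A0 = np.array([0.5, 0.4, 0.7, 0.6])
print(np.round(fcm_simulate(W, A0), 3))
```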

Keywords: Factors, fuzzy cognitive map, group decision, integrated waste management system.

Downloads: 1961
541 Eliciting and Confirming Data, Information, Knowledge and Wisdom in a Specialist Health Care Setting: The WICKED Method

Authors: S. Impey, D. Berry, S. Furtado, M. Galvin, L. Grogan, O. Hardiman, L. Hederman, M. Heverin, V. Wade, L. Douris, D. O'Sullivan, G. Stephens

Abstract:

Healthcare is a knowledge-rich environment. This knowledge, while valuable, is not always accessible outside the borders of individual clinics. This research aims to address part of this problem (at a study site) by constructing a maximal data set (knowledge artefact) for motor neurone disease (MND). This data set is proposed as an initial knowledge base for a concurrent project to develop an MND patient data platform. It represents the domain knowledge at the study site for the duration of the research (12 months). A knowledge elicitation method was also developed from the lessons learned during this process - the WICKED method. WICKED is an anagram of the words: eliciting and confirming data, information, knowledge, wisdom. But it is also a reference to the concept of wicked problems, which are complex and challenging, as is eliciting expert knowledge. The method was evaluated at a second site, and benefits and limitations were noted. Benefits include that the method provided a systematic way to manage data, information, knowledge and wisdom (DIKW) from various sources, including healthcare specialists and existing data sets. Limitations include the time required and the fact that the data set produced only represents the DIKW known during the research period. Future work is underway to address these limitations.

Keywords: Healthcare, knowledge acquisition, maximal data sets, action design science.

Downloads: 539
540 Market Feasibility for New Brand Coffee House: The Case Study of Thailand

Authors: Pongsiri K.

Abstract:

This research aimed to study the market feasibility of a new brand coffee house, with Thailand as the case study. The study uses a mixed-methods design combining quantitative and qualitative research. For the primary data, 350 questionnaires were distributed, and 320 high-quality completed questionnaires were returned. The research sample consists of customers of high-end department stores in Thailand. Secondary data were carefully selected from highly reliable public and private sector sources. The results were used to classify customers into two main groups: those younger than 25 and those older than 25 years of age. The younger group gives priority to the coffee house and its services dimension above the others, followed by the branding dimension and the product dimension. The older group, by contrast, rates branding as most important, followed by the coffee house and its services, and then the product. Coffee consumption is not just a trend; it has become part of people’s lifestyle, and new coffee cultures are also created by astute businesses. Coffee has long been produced and consumed in Thailand, yet, surprisingly, the high-end brand coffee houses in the Thai market are mostly imported brands. The feasibility of a Thai-brand coffee house in the Thai market is discussed in the paper.

Keywords: Coffee house, café, coffee consumption, new entry branding, market feasibility.

Downloads: 14043
539 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery

Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene

Abstract:

Data flows and the purposes for which data are reported differ and depend on business needs. Different parameters are reported and transferred regularly during freight delivery. This business practice forms a data set, constructed for each time point, that contains all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrated methodological approach must be developed to address this problem. The proposed methodology contains several steps: (1) collecting context data sets and validating the data; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study uses Grubbs' outlier analysis, particularly for data cleaning and for identifying the statistical significance of data reporting event cases. The Grubbs test is often used because it tests one extreme value at a time for exceeding the bounds of the standard normal distribution. In the study area, the test has not been widely applied, except where Grubbs' outlier detection was used to identify outliers in fuel consumption data. In this study, the authors applied the test with a confidence level of 99%. For the multi-objective analysis, the authors select forms of genetic algorithm construction that are better able to extract the best solution. For freight delivery management, genetic algorithm schemas are used as a more effective technique, and an adaptable genetic algorithm is therefore applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and to select the appropriate transport corridor. The authors suggest a methodology for the multi-objective analysis which evaluates the collected context data sets and uses this evaluation to determine a delivery corridor for freight transfer services in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value for the management of multi-modal transportation processes.
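
A minimal sketch of the Grubbs validation step described above, applied at the 99% confidence level used in the study; the fuel-consumption figures are made-up illustrations, and SciPy is assumed to be available for the t-distribution quantile.

```python
import numpy as np
from scipy import stats

def grubbs_outlier(x, alpha=0.01):
    """Grubbs test for a single extreme value (two-tailed form), as used for data cleaning."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    mean, sd = x.mean(), x.std(ddof=1)
    idx = np.argmax(np.abs(x - mean))
    g = abs(x[idx] - mean) / sd
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = ((n - 1) / np.sqrt(n)) * np.sqrt(t**2 / (n - 2 + t**2))
    return (idx, x[idx]) if g > g_crit else None

# Hypothetical fuel-consumption reports (l/100 km) with one suspicious value.
fuel = [28.1, 27.6, 29.0, 28.4, 41.7, 27.9, 28.8, 28.2]
print(grubbs_outlier(fuel, alpha=0.01))   # 99% confidence level as in the study
```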

Keywords: Multi-objective decision support, analysis, data validation, freight delivery, multi-modal transportation, genetic programming methods.

Downloads: 483
538 A Comparative Analysis Approach Based on Fuzzy AHP, TOPSIS and PROMETHEE for the Selection Problem of GSCM Solutions

Authors: Omar Boutkhoum, Mohamed Hanine, Abdessadek Bendarag

Abstract:

Sustainable economic growth is nowadays driving firms to adopt many green supply chain management (GSCM) solutions. However, the evaluation and selection of these solutions requires careful decisions and is complex owing to the many associated factors. To resolve this problem, a comparative analysis approach based on multi-criteria decision-making methods is proposed for the adequate evaluation of sustainable supply chain management solutions. In the present paper, we propose an integrated decision-making model based on FAHP (Fuzzy Analytic Hierarchy Process), TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) and PROMETHEE (Preference Ranking Organisation METHod for Enrichment Evaluations) to contribute to a better understanding and development of new sustainable strategies for industrial organizations. Because the selected criteria vary in importance, FAHP is used to identify the evaluation criteria and assign an importance weight to each criterion, while the TOPSIS and PROMETHEE methods use these weighted criteria as inputs to evaluate and rank the alternatives. The main objective is to provide a comparative analysis based on the TOPSIS and PROMETHEE processes to help make sound and reasoned decisions on the GSCM solution selection problem.
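
To show how the weighted criteria feed the ranking stage, here is a minimal TOPSIS sketch; the three candidate GSCM solutions, the criteria, and the FAHP-style weights are invented for illustration, and the PROMETHEE ranking is not reproduced.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS; the criteria weights would come from FAHP."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.sqrt((m ** 2).sum(axis=0))          # vector normalisation
    v = norm * weights                                 # weighted normalised matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((v - anti)  ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)                     # closeness coefficient

# Three hypothetical GSCM solutions scored on cost, CO2 reduction, implementation time.
scores  = [[120, 0.30, 8], [150, 0.45, 12], [100, 0.25, 6]]
weights = [0.5, 0.3, 0.2]                # illustrative FAHP-derived weights
benefit = [False, True, False]           # cost and time are to be minimised
print(np.round(topsis(scores, weights, benefit), 3))
```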

Keywords: GSCM solutions, multi-criteria analysis, FAHP, TOPSIS, PROMETHEE, decision support system.

Downloads: 937
537 Dynamic Features Selection for Heart Disease Classification

Authors: Walid MOUDANI

Abstract:

The healthcare environment is generally perceived as being information rich yet knowledge poor, and there is a lack of effective analysis tools to discover hidden relationships and trends in the data. In fact, valuable knowledge can be discovered by applying data mining techniques to healthcare systems. This study presents a proficient methodology for extracting significant patterns from coronary heart disease data warehouses to predict heart attacks, which unfortunately remain a leading cause of mortality worldwide. For this purpose, we enumerate dynamically the optimal subsets of reduced features of high interest by using the rough sets technique combined with dynamic programming, and we validate the classification using a Random Forest (RF) decision tree to identify the risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions, based on the medical profiles of patients. Moreover, experts' knowledge in this field has been taken into consideration to define the disease and its risk factors and to establish significant knowledge relationships among the medical factors. A computer-aided system is developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated against a set of benchmark techniques applied to this classification problem.
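
The core rough-set idea behind the feature reduction step can be shown in a few lines: an attribute subset is a good reduct candidate if it keeps the dependency degree (the share of records whose indiscernibility class is decision-consistent) at the level of the full attribute set. The toy discretised patient records below are invented, not data from the 525-adult study, and the dynamic-programming enumeration itself is not reproduced.

```python
from collections import defaultdict

def dependency_degree(rows, feature_idx, decision_idx=-1):
    """Rough-set dependency gamma(B): share of rows whose B-indiscernibility
    class maps to a single decision value (the positive region)."""
    classes = defaultdict(set)
    counts = defaultdict(int)
    for row in rows:
        key = tuple(row[i] for i in feature_idx)
        classes[key].add(row[decision_idx])
        counts[key] += 1
    consistent = sum(c for k, c in counts.items() if len(classes[k]) == 1)
    return consistent / len(rows)

# Hypothetical discretised patient records: (age_band, smoker, bp_band, chd_risk).
data = [("40-50", "yes", "high", "risky"),
        ("40-50", "no",  "high", "risky"),
        ("30-40", "no",  "low",  "safe"),
        ("40-50", "yes", "low",  "risky"),
        ("30-40", "yes", "low",  "safe")]
print(dependency_degree(data, feature_idx=[0, 1, 2]))  # full attribute set
print(dependency_degree(data, feature_idx=[0, 2]))     # candidate reduct, same dependency
```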

Keywords: Multi-Classifier Decision Tree, Feature Reduction, Dynamic Programming, Rough Sets.

Downloads: 2531
536 Computing Entropy for Ortholog Detection

Authors: Hsing-Kuo Pao, John Case

Abstract:

Biological sequences from different species are called orthologs if they evolved from a sequence of a common ancestor species and they have the same biological function. Approximations of Kolmogorov complexity or entropy of biological sequences are already well known to be useful in extracting similarity information between such sequences - in the interest, for example, of ortholog detection. As is well known, the exact Kolmogorov complexity is not algorithmically computable. In practice one can approximate it by computable compression methods. However, such compression methods do not provide a good approximation to Kolmogorov complexity for short sequences. Herein is suggested a new approach to overcome the problem that compression approximations may not work well on short sequences. This approach is inspired by new, conditional computations of Kolmogorov entropy. A main contribution of the empirical work described shows the new set of entropy-based machine learning attributes provides good separation between positive (ortholog) and negative (non-ortholog) data - better than with good, previously known alternatives (which do not employ some means to handle short sequences well). Also empirically compared are the new entropy-based attribute set and a number of other, more standard similarity attribute sets commonly used in genomic analysis. The various similarity attributes are evaluated by cross validation, through boosted decision tree induction C5.0, and by Receiver Operating Characteristic (ROC) analysis. The results point to the conclusion: the new, entropy-based attribute set by itself is not the one giving the best prediction; however, it is the best attribute set for use in improving the other, standard attribute sets when conjoined with them.
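
A minimal sketch of the computable compression approximation the abstract builds on, using zlib and the standard normalised compression distance; the short nucleotide strings are invented, and the paper's new conditional-entropy attributes are not reproduced here.

```python
import zlib

def c(s: bytes) -> int:
    """Compressed length as a computable stand-in for Kolmogorov complexity."""
    return len(zlib.compress(s, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalised compression distance between two sequences."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Hypothetical short coding fragments; note how noisy the estimate is at this
# length, which is exactly the limitation the paper's attributes aim to address.
seq_a = b"ATGGCGTACGTTAGCATGGCGTACGTTAGC"
seq_b = b"ATGGCGTACGATAGCATGGCGTACGATAGC"
seq_c = b"TTTTAACCGGGTTAACCGGTTAACCGGTTT"
print(round(ncd(seq_a, seq_b), 3), round(ncd(seq_a, seq_c), 3))
```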

Keywords: compression, decision tree, entropy, ortholog, ROC.

Downloads: 1825
535 Personalized Applications for Advanced Healthcare through AI-ML and Blockchain

Authors: Anuja Vyas, Aikel Indurkhya, Hari Krishna Garg

Abstract:

Nearly 25 years have passed since the landmark publication of the Human Genome Project, yet scientists have only begun to scratch the surface of its potential benefits. To bridge this gap, a personalized genomic application has been envisioned as a transformative tool accessible to people worldwide. This innovative solution proposes an integrated framework combining blockchain technology, genome-specific applications, and data compression techniques, ensuring operations to be swift, secure, transparent, and space-efficient. The software harnesses advanced Artificial Intelligence and Machine Learning methodologies, such as neural networks, evaluation matrices, fuzzy logic, and expert systems, to analyze individual genomic data. It generates personalized reports by comparing a user's genome with a reference genome, highlighting significant differences. Blockchain technology, with its inherent security, encryption, and immutability features, is leveraged for robust data transport and storage. In addition, a 'Data Abbreviation' technique ensures that genetic data and reports occupy minimal space. This integrated approach promises to be a significant leap forward, potentially transforming human health and well-being on a global scale.
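
As a toy illustration of the report-generation idea (comparing a user's genome against a reference and flagging differences), here is a short sketch; the sequences are invented, and a real pipeline would of course operate on aligned variant data rather than raw strings.

```python
def variant_report(reference: str, sample: str):
    """List positions where a sample sequence differs from the reference
    (a toy stand-in for the personalised comparison report)."""
    if len(reference) != len(sample):
        raise ValueError("aligned sequences of equal length expected")
    return [(i, ref, alt) for i, (ref, alt) in enumerate(zip(reference, sample)) if ref != alt]

# Hypothetical aligned fragment of a reference genome and a user genome.
ref_seq  = "ATGGCGTACGTTAGCATGACG"
user_seq = "ATGGCGTACATTAGCATGACG"
print(variant_report(ref_seq, user_seq))   # -> [(9, 'G', 'A')]
```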

Keywords: Artificial intelligence in genomics, blockchain technology, data abbreviation, data compression, data security in genomics, data storage, expert systems, fuzzy logic, genome applications, genomic data analysis, human genome project, neural networks, personalized genomics.

Downloads: 35
534 New Wavelet Indices to Assess Muscle Fatigue during Dynamic Contractions

Authors: González-Izal M., Rodríguez-Carreño I, Mallor-Giménez F, Malanda A, Izquierdo M

Abstract:

The purpose of this study was to evaluate new indices based on the discrete wavelet transform and compare them with other spectral parameters proposed in the literature - mean average voltage, median frequency, and ratios between spectral moments - for estimating acute exercise-induced changes in power output, i.e., for assessing peripheral muscle fatigue during a dynamic fatiguing protocol. Fifteen trained subjects performed 5 sets of 10 leg presses, with 2 minutes of rest between sets. Surface electromyography was recorded from the vastus medialis (VM) muscle. Several surface electromyographic parameters were compared for detecting peripheral muscle fatigue: mean average voltage (MAV), median spectral frequency (Fmed), the Dimitrov spectral index of muscle fatigue (FInsm5), and five further parameters obtained from the discrete wavelet transform (DWT) as ratios between different scales. The new wavelet indices achieved the best Pearson correlation coefficients with power output changes during acute dynamic contractions, and their regressions were significantly different from those of MAV and Fmed. They also showed the highest robustness in the presence of additive white Gaussian noise at different signal-to-noise ratios (SNRs). Therefore, peripheral impairments assessed by sEMG wavelet indices may be a relevant factor involved in the loss of power output after a dynamic high-loading fatiguing task.
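
A minimal sketch of a DWT scale-energy ratio of the kind described above, using the PyWavelets package; the wavelet, decomposition depth, band grouping and synthetic signals are illustrative assumptions, not the five specific ratios proposed in the paper.

```python
import numpy as np
import pywt

def wavelet_fatigue_index(emg, wavelet="db5", level=5):
    """Ratio of low-frequency to high-frequency sub-band energy from a DWT
    decomposition; sEMG power shifts toward low frequencies with fatigue,
    so this kind of ratio is expected to rise during a fatiguing protocol."""
    coeffs = pywt.wavedec(emg, wavelet, level=level)   # [cA5, cD5, cD4, cD3, cD2, cD1]
    e_low  = sum(np.sum(c ** 2) for c in coeffs[:3])   # cA5 + cD5 + cD4 (lowest bands)
    e_high = sum(np.sum(c ** 2) for c in coeffs[-2:])  # cD2 + cD1 (highest bands)
    return e_low / e_high

# Synthetic 1 s sEMG-like bursts sampled at 1 kHz: a "fresh" high-frequency burst
# and a "fatigued" one whose dominant component has slowed down.
t = np.arange(0, 1, 1 / 1000)
rng = np.random.default_rng(1)
fresh    = np.sin(2 * np.pi * 150 * t) + 0.3 * rng.standard_normal(t.size)
fatigued = np.sin(2 * np.pi * 50  * t) + 0.3 * rng.standard_normal(t.size)
print(round(wavelet_fatigue_index(fresh), 2), round(wavelet_fatigue_index(fatigued), 2))
```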

Keywords: Median frequency, EMG, wavelet transform, muscle fatigue.

Downloads: 1866
533 Applications of Support Vector Machines on Smart Phone Systems for Emotional Speech Recognition

Authors: Wernhuar Tarng, Yuan-Yuan Chen, Chien-Lung Li, Kun-Rong Hsie, Mingteh Chen

Abstract:

An emotional speech recognition system for applications on smart phones was proposed in this study, to be combined with 3G mobile communications and social networks to provide users and their groups with more interaction and care. This study developed a mechanism using support vector machines (SVM) to recognize the emotions of speech, such as happiness, anger, sadness, and neutral (normal) speech. The mechanism uses a hierarchical classifier to adjust the weights of acoustic features and divides the various parameters into energy and frequency categories for training. In this study, 28 commonly used acoustic features, including pitch and volume, were proposed for training. In addition, a time-frequency parameter obtained by continuous wavelet transforms was also used to identify the accent and intonation in a sentence during the recognition process. The Berlin Database of Emotional Speech was used, divided into male and female data sets for training. According to the experimental results, the accuracies of the male and female test sets increased by 4.6% and 5.2% respectively after the time-frequency parameter was used to classify happy and angry emotions. For the classification of all emotions, the average accuracy, including male and female data, was 63.5% for the test set and 90.9% for the whole data set.
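
A minimal classification sketch in the spirit of the mechanism described, using scikit-learn's SVC on per-utterance acoustic feature vectors; the 28-dimensional features and labels below are random placeholders rather than features extracted from the Berlin database, so the printed accuracy is only a plumbing check, and the hierarchical energy/frequency weighting is not reproduced.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# Hypothetical acoustic feature matrix (e.g., pitch, energy and MFCC statistics
# per utterance) with four emotion labels, standing in for the Berlin database.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 28))            # 28 features per utterance
y = rng.integers(0, 4, size=200)              # 0=happy, 1=angry, 2=sad, 3=neutral

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
clf.fit(X_tr, y_tr)
print("test accuracy:", round(clf.score(X_te, y_te), 3))   # ~chance on random data
```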

Keywords: Smart phones, emotional speech recognition, social networks, support vector machines, time-frequency parameter, Mel-scale frequency cepstral coefficients (MFCC).

Downloads: 1841
532 Using Artificial Neural Network to Forecast Groundwater Depth in Union County Well

Authors: Zahra Ghadampour, Gholamreza Rakhshandehroo

Abstract:

A concern that researchers usually face in different applications of Artificial Neural Networks (ANN) is determining the size of the effective domain in a time series. In this paper, trial and error was used on a groundwater depth time series to determine the size of the effective domain for an observation well in Union County, New Jersey, U.S. Domains of 20, 40, 60, 80, 100, and 120 preceding days were examined, and 80 days was taken as the effective domain length. Data sets for the different domains were fed to a feed-forward back-propagation ANN with one hidden layer, and the groundwater depths were forecasted. The Root Mean Square Error (RMSE) and the correlation factor (R2) of estimated and observed groundwater depths were determined for all domains. In general, the groundwater depth forecast improved, as evidenced by lower RMSEs and higher R2s, as the domain length increased from 20 to 120 days. However, 80 days was selected as the effective domain because the improvement beyond that was less than 1%. Forecasted groundwater depths utilizing measured daily data (set #1) and data averaged over the effective domain (set #2) were compared. It was postulated that the more accurate nature of the measured daily data was the reason for the better forecast, with a lower RMSE (0.1027 m compared to 0.255 m), in set #1. However, the size of the input data in this set was 80 times that of set #2, a factor that may increase the computational effort unpredictably. It was concluded that data averaged over the 80-day domain may be successfully utilized to lower the size of the input data sets considerably while maintaining the effective information in the data set.
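
A minimal sliding-window forecasting sketch in the spirit of set #1 (daily values over an 80-day domain feeding a feed-forward network); the synthetic depth series, network size, and scikit-learn MLPRegressor are stand-ins for the study's data and ANN, so the printed RMSE/R2 are not comparable to the reported values.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, r2_score

def make_windows(series, domain=80):
    """Build (input window, next-day target) pairs from a daily depth series."""
    X = np.array([series[i:i + domain] for i in range(len(series) - domain)])
    y = np.array(series[domain:])
    return X, y

# Synthetic daily groundwater depths (m) standing in for the Union County record.
rng = np.random.default_rng(0)
days = np.arange(2000)
depth = 10 + 0.5 * np.sin(2 * np.pi * days / 365) + 0.05 * rng.standard_normal(days.size)

X, y = make_windows(depth, domain=80)          # 80-day effective domain, as in the study
split = int(0.8 * len(X))
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("RMSE:", round(np.sqrt(mean_squared_error(y[split:], pred)), 4),
      "R2:", round(r2_score(y[split:], pred), 3))
```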

Keywords: Neural networks, groundwater depth, forecast.

Downloads: 2514
531 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores

Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay

Abstract:

Automated product recognition in retail stores is an important real-world application in the domain of Computer Vision and Pattern Recognition. In this paper, we consider the problem of automatically identifying the classes of products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon existing approaches in terms of effectiveness and memory requirement by developing a two-stage object detection and recognition pipeline comprising a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction, and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function following an online-hard-negative-mining strategy for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product object placed in a rack image. Extensive experiments using the Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.
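
A minimal PyTorch sketch of the recognition half of the pipeline: a ResNet-18 backbone used as an embedding encoder trained with a triplet margin loss. The embedding size, margin, optimiser settings and dummy tensors are illustrative assumptions; the Faster-RCNN localizer and the actual online hard-negative mining loop are not shown.

```python
import torch
import torch.nn as nn
from torchvision import models

# ResNet-18 backbone truncated before its classification head, producing an
# embedding that can be trained with a triplet loss (hard negatives would be
# mined online from the gallery built by augmenting the query/product images).
backbone = models.resnet18(weights=None)
backbone.fc = nn.Identity()                    # keep the 512-d pooled feature
embedder = nn.Sequential(backbone, nn.Linear(512, 128))

triplet_loss = nn.TripletMarginLoss(margin=0.2)
optimizer = torch.optim.Adam(embedder.parameters(), lr=1e-4)

# One illustrative training step on dummy crops (anchor / positive / hard negative).
anchor, positive, negative = (torch.randn(8, 3, 224, 224) for _ in range(3))
loss = triplet_loss(embedder(anchor), embedder(positive), embedder(negative))
loss.backward()
optimizer.step()
print(float(loss))
```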

Keywords: Retail stores, Faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition.

Downloads: 581
530 Weakly Generalized Closed Map

Authors: R. Parimelazhagan, N. Nagaveni

Abstract:

In this paper, we introduce a new class of mg-continuous mappings and study some of their basic properties. We obtain some characterizations of such functions. Moreover, we define a sub-minimal structure and further study certain properties of mg-closed sets.

Keywords: M-structure, mg-continuous mapping, minimal structure, mg T2 space, sub-minimal structure, T1/2 space, mg-compact set.

Downloads: 1535
529 Quality Fed-Batch Bioprocess Control: A Case Study

Authors: Mihai Caramihai, Irina Severin

Abstract:

Bioprocesses are appreciated as difficult to control because their dynamic behavior is highly nonlinear and time varying, in particular, when they are operating in fed batch mode. The research objective of this study was to develop an appropriate control method for a complex bioprocess and to implement it on a laboratory plant. Hence, an intelligent control structure has been designed in order to produce biomass and to maximize the specific growth rate.
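
To make the fuzzy control idea concrete, here is a toy zero-order Sugeno-style sketch that maps the error in the specific growth rate to a substrate feed-rate correction using triangular membership functions; the membership ranges, rule consequents, and units are invented for illustration and are not the controller actually implemented on the laboratory plant.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def feed_adjustment(mu_error):
    """Toy rule base mapping the error in specific growth rate (set point minus
    estimate, 1/h) to a feed-rate correction via a weighted average of rule
    outputs. Ranges and consequents are illustrative only."""
    negative = tri(mu_error, -0.20, -0.10, 0.00)
    zero     = tri(mu_error, -0.05,  0.00, 0.05)
    positive = tri(mu_error,  0.00,  0.10, 0.20)
    # Rule consequents (ml/h): slow the feed, hold it, or speed it up.
    rules = [(negative, -5.0), (zero, 0.0), (positive, +5.0)]
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0

for e in (-0.12, 0.0, 0.07):
    print(e, "->", round(feed_adjustment(e), 2), "ml/h")
```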

Keywords: Fed-batch bioprocess, mass-balance model, fuzzy control.

Downloads: 1467
528 Some Properties of IF Rough Relational Algebraic Operators in Medical Databases

Authors: Chhaya Gangwal, R. N. Bhaumik, Shishir Kumar

Abstract:

Some properties of Intuitionistic Fuzzy (IF) rough relational algebraic operators under an IF rough relational data model are investigated and illustrated using diabetes and heart disease databases. These properties are important and desirable for processing queries in an effective and efficient manner.

 

Keywords: IF Set, Rough Set, IF Rough Relational Database, IF rough Relational Operators.

Downloads: 1453
527 Overloading Scheme for Cellular DS-CDMA using Quasi-Orthogonal Sequences and Iterative Interference Cancellation Receiver

Authors: Preetam Kumar, Saswat Chakrabarti

Abstract:

Overloading is a technique to accommodate more users than the spreading factor N; it is a bandwidth-efficient scheme for increasing the number of users within a fixed bandwidth. One of the efficient ways to overload a CDMA system is to use two sets of orthogonal signal waveforms (O/O): the first set is assigned to the N users, the second set is assigned to the additional M users, and an iterative interference cancellation technique is used to cancel interference between the two sets of users. In this paper, the performance of an overloading scheme is evaluated in which the first N users are assigned Walsh-Hadamard orthogonal codes and the extra users are assigned the same WH codes overlaid by a fixed (quasi) bent sequence [11]. This particular scheme, called the Quasi-Orthogonal Sequence (QOS) O/O scheme, is part of the cdma2000 standard [12] for providing overloading in the downlink using a single-user detector. QOS schemes are balanced O/O schemes, in which the correlations between any set-1 and set-2 users are equalized. The allowable overload of this scheme is investigated in the uplink on AWGN and Rayleigh fading channels, so that the uncoded performance with an iterative multistage interference cancellation detector remains close to the single-user bound. It is shown that this scheme provides 19% and 11% overloading with the SDIC technique for N = 16 and 64 respectively, with an SNR degradation of less than 0.35 dB compared to the single-user bound at a BER of 0.00001. On a Rayleigh fading channel, however, the channel overloading is 45% (29 extra users) at a BER of 0.0005, with an SNR degradation of about 1 dB compared to single-user performance for N = 64. This is a significant amount of channel overloading on a Rayleigh fading channel.
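
As a small, self-contained illustration of the set-1 codes, the sketch below generates a Walsh-Hadamard matrix by the Sylvester construction and checks that its rows are mutually orthogonal; the quasi-bent overlay sequence that produces the set-2 codes is specific to the cited scheme and is not reproduced here.

```python
import numpy as np

def walsh_hadamard(n):
    """Sylvester construction of an n x n Walsh-Hadamard matrix (n a power of 2);
    each row is one orthogonal spreading code for the first N users."""
    h = np.array([[1]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])
    return h

W = walsh_hadamard(16)
print((W @ W.T == 16 * np.eye(16)).all())      # rows are mutually orthogonal

# Set-2 users would reuse these rows element-wise multiplied ("overlaid") by a
# fixed quasi-bent scrambling sequence s, e.g. code2_k = W[k] * s, so that the
# cross-correlations between the two sets are equalised (sequence not shown here).
```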

Keywords: DS-CDMA, iterative interference cancellation, orthogonal codes, overloading.

Downloads: 1715
526 Bioprocess Intelligent Control: A Case Study

Authors: Mihai Caramihai Ana A Chirvase, Irina Severin

Abstract:

Bioprocesses are appreciated as difficult to control because their dynamic behavior is highly nonlinear and time varying, in particular, when they are operating in fed batch mode. The research objective of this study was to develop an appropriate control method for a complex bioprocess and to implement it on a laboratory plant. Hence, an intelligent control structure has been designed in order to produce biomass and to maximize the specific growth rate.

Keywords: Fed batch bioprocess, mass-balance model, fuzzy control.

Downloads: 1561
525 An Intelligent Controller Augmented with Variable Zero Lag Compensation for Antilock Braking System

Authors: Benjamin C. Agwah, Paulinus C. Eze

Abstract:

The antilock braking system (ABS) is one of the important contributions of the automobile industry, designed to ensure road safety by keeping vehicles steerable and stable during emergency braking. This paper presents a wheel-slip-based intelligent controller with variable zero lag compensation for ABS. The requirement is to achieve very fast, accurate wheel slip tracking during hard braking, to eliminate chattering, and to improve transient and steady-state performance, while shortening the stopping distance using an effective braking torque less than the maximum allowable torque to bring a braking vehicle to a stop. The dynamics of a vehicle braking from a velocity of 30 ms⁻¹ on a straight line were determined and modelled in the MATLAB/Simulink environment to represent a conventional ABS system without a controller. Simulation results indicated that the system without a controller was not able to track the desired wheel slip, and the stopping distance was 135.2 m. Hence, an intelligent fuzzy logic controller (FLC) was designed, with a variable zero lag compensator (VZLC) added to enhance the performance of the FLC control variable by eliminating steady-state error and providing improved bandwidth to suppress the effect of high-frequency noise such as chattering during braking. The simulation results showed that the FLC-VZLC provided fast tracking of the desired wheel slip, eliminated chattering, and reduced the stopping distance by 70.5% (39.92 m), 63.3% (49.59 m), 57.6% (57.35 m) and 50% (69.13 m) on dry, wet, cobblestone and snow road surface conditions respectively. Overall, the proposed system used an effective braking torque below the maximum allowable braking torque to achieve efficient wheel slip tracking and robust control performance on different road surfaces.
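
The quantity being tracked is the longitudinal wheel slip; a minimal sketch of its computation is below. The 30 m/s braking velocity matches the simulation set-up, but the wheel radius, wheel speeds, and the "about 0.2" target slip are illustrative assumptions rather than values from the paper.

```python
def wheel_slip(vehicle_speed, wheel_speed, wheel_radius):
    """Longitudinal wheel slip during braking: lambda = (v - omega*R) / v.
    The ABS controller tries to track a target slip (often around 0.2)."""
    if vehicle_speed <= 0:
        return 0.0
    return (vehicle_speed - wheel_speed * wheel_radius) / vehicle_speed

# Illustrative values: an over-braked wheel versus a well-regulated one.
R = 0.3                                      # assumed wheel radius, m
print(round(wheel_slip(30.0, 60.0, R), 2))   # omega*R = 18 m/s -> slip 0.4 (too high)
print(round(wheel_slip(30.0, 80.0, R), 2))   # omega*R = 24 m/s -> slip 0.2 (near target)
```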

Keywords: ABS, Fuzzy Logic Controller, Variable Zero Lag Compensator, Wheel Slip Tracking.

Downloads: 341
524 Evaluating Refactoring with a Quality Index

Authors: Crt Gerlec, Marjan Hericko

Abstract:

The aim of every software product is to achieve an appropriate level of software quality. Developers and designers try to produce readable, reliable, maintainable, reusable and testable code, and several approaches have been utilized to help achieve these goals. In this paper, refactoring was evaluated using a quality index, which is composed of different metric sets that describe various quality aspects.

Keywords: Refactoring, Software Metrics, Software Quality, Quality Index, Agile methodologies

Downloads: 1621
523 A Study of Priority Evaluation and Resource Allocation for Revitalization of Cultural Heritages in the Urban Development

Authors: Wann-Ming Wey, Yi-Chih Huang

Abstract:

Proper maintenance and preservation of significant cultural heritages or historic buildings is necessary. It not only enhances environmental benefits and a sense of community, but also preserves a city's history and people’s memory, allowing the next generation to glimpse our past and achieving the goal of sustainably preserved cultural assets. However, maintenance has so far not been managed appropriately for many designated heritages or historic buildings, and the planning and implementation of reuse still lacks adequate specification. As a result, many heritages are merely "reserved" as a formality rather than conserved in the real sense of the word. Issues of restoration and preservation of cultural heritages are very important because of their historical significance, symbolism, and economic effects. However, decision makers, such as officials in the public sector, often face the question of which heritage should be restored first under limited budgets. Very few techniques are available today to determine appropriate restoration priorities for diverse historical heritages, perhaps because of a lack of systematized decision-making aids proposed to date. In the past, discussions of the management and maintenance of cultural assets were limited to the selection of reuse alternatives rather than the allocation of resources. In view of this, this research adopts integrated research methods to solve the problems that decision makers encounter when allocating resources to the management and maintenance of heritages and historic buildings.

The purpose of this study is to develop a sustainable decision-making model that helps local governments resolve these problems. We propose an alternative decision support model to prioritize restoration needs within limited budgets. The model is constructed using the fuzzy Delphi, fuzzy analytic network process (FANP) and goal programming (GP) methods. To avoid misallocating resources, this research proposes a precise procedure that takes multiple stakeholders' views, limited costs and resources into consideration, and combines many factors and goals to find the highest-priority, feasible solution. To illustrate the proposed approach, seven cultural heritages in Taipei City are used as an empirical study, and the results are analyzed in depth to explain the application of the approach.

Keywords: Cultural Heritage, Historic Buildings, Priority Evaluation, Multi-Criteria Decision Making, Goal Programming, Fuzzy Analytic Network Process, Resource Allocation.

Downloads: 2311
522 Analyzing Environmental Emotive Triggers in Terrorist Propaganda

Authors: Travis Morris

Abstract:

The purpose of this study is to measure the intersection of environmental security entities in terrorist propaganda. To the best of the authors’ knowledge, this is the first study of its kind to examine this intersection within terrorist propaganda. Rosoka natural language processing software and frame analysis are used to advance our understanding of how environmental frames function as emotive triggers. Violent jihadi demagogues use frames to suggest violent and non-violent solutions to their grievances. Emotive triggers are framed in a way to leverage individual and collective attitudes in psychological warfare. A comparative research design is used because of the differences and similarities that exist between two variants of violent jihadi propaganda that target western audiences. Analysis is based on salience and network text analysis, which generates violent jihadi semantic networks. Findings indicate that environmental frames are used as emotive triggers across both data sets, but also as tactical and information data points. A significant finding is that certain core environmental emotive triggers like “water,” “soil,” and “trees” are significantly salient at the aggregate level across both data sets. All environmental entities can be classified into two categories, symbolic and literal. Importantly, this research illustrates how demagogues use environmental emotive triggers in cyberspace from a subcultural perspective to mobilize target audiences to their ideology and praxis. Understanding the anatomy of propaganda construction is necessary in order to generate effective counter-narratives in information operations. This research advances an additional method to inform practitioners and policy makers of how environmental security and propaganda intersect.

Keywords: Emotive triggers, environmental security, natural language processing, propaganda analysis.

Downloads: 952
521 Entropy Measures on Neutrosophic Soft Sets and Its Application in Multi Attribute Decision Making

Authors: I. Arockiarani

Abstract:

The focus of this paper is to furnish entropy measures for neutrosophic sets and neutrosophic soft sets, which quantify the uncertainty that permeates discourse and systems. Various characterizations of the entropy measures are derived. Further, we exemplify the concept by applying entropy to various real-time decision-making problems.

Keywords: Entropy measure, Hausdorff distance, neutrosophic set, soft set.

Downloads: 932
520 Remaining Useful Life Estimation of Bearings Based on Nonlinear Dimensional Reduction Combined with Timing Signals

Authors: Zhongmin Wang, Wudong Fan, Hengshan Zhang, Yimin Zhou

Abstract:

In data-driven prognostic methods, the accuracy of remaining useful life estimates for bearings depends mainly on the performance of health indicators, which are usually fused from statistical features extracted from vibration signals. However, existing health indicators have two drawbacks: (1) the statistical features have different ranges and therefore contribute differently to the health indicator, so expert knowledge is required to extract them; (2) when convolutional neural networks are used to handle the time-frequency features of signals, the time-series nature of the signals is not considered. To overcome these drawbacks, this study proposes a method combining a convolutional neural network with a gated recurrent unit to extract time-frequency image features, which are then used to construct a health indicator and predict the remaining useful life of bearings. First, the original signals are converted into time-frequency images using the continuous wavelet transform to form the original feature sets. Second, the convolutional and pooling layers of the convolutional neural network select the most sensitive features of the time-frequency images from the original feature sets. Finally, the selected features are fed into the gated recurrent unit to construct the health indicator. The results show that the proposed method performs better than related studies that used the same bearing dataset provided by PRONOSTIA.
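
A minimal PyTorch sketch of the CNN-plus-GRU structure described above: a small CNN extracts features from each CWT time-frequency image, the GRU keeps the temporal order of the frames, and a linear head outputs a scalar health indicator. Layer sizes, image resolution and sequence length are illustrative assumptions, not the architecture trained on the PRONOSTIA data.

```python
import torch
import torch.nn as nn

class CnnGruHealthIndicator(nn.Module):
    """Per-frame CNN features from CWT images, fed to a GRU so the temporal
    order is preserved, then mapped to a scalar health indicator."""
    def __init__(self, hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())          # -> 16 features per frame
        self.gru = nn.GRU(input_size=16, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, seq, 1, H, W) CWT images
        b, s = x.shape[:2]
        feats = self.cnn(x.reshape(b * s, *x.shape[2:])).reshape(b, s, -1)
        out, _ = self.gru(feats)
        return self.head(out[:, -1])       # health indicator for the latest frame

# One dummy batch: 4 bearings, 10 consecutive 64x64 time-frequency images each.
model = CnnGruHealthIndicator()
print(model(torch.randn(4, 10, 1, 64, 64)).shape)   # torch.Size([4, 1])
```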

Keywords: Continuous wavelet transform, convolutional neural network, gated recurrent unit, health indicators, remaining useful life.

Downloads: 766
519 Load Flow Analysis: An Overview

Authors: P. S. Bhowmik, D. V. Rajan, S. P. Bose

Abstract:

The load flow study in a power system is of paramount importance: it reveals the electrical performance and power flows (real and reactive) for a specified condition when the system is operating in steady state. This paper gives an overview of the different techniques used for load flow studies under different specified conditions.
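
As a small illustration of the Y-matrix (Gauss-Seidel) iteration listed among the techniques, the sketch below solves a two-bus case with one slack bus and one PQ load bus; the line impedance and load values are invented per-unit numbers.

```python
import numpy as np

# Minimal 2-bus Gauss-Seidel load flow: bus 0 is the slack bus held at 1.0 pu,
# bus 1 is a PQ load bus drawing S = 0.5 + j0.2 pu over a short line.
z_line = 0.02 + 0.08j
y = 1 / z_line
Y = np.array([[y, -y],
              [-y, y]])                     # bus admittance matrix
S_injected = np.array([0.0, -(0.5 + 0.2j)]) # load = negative injection
V = np.array([1.0 + 0j, 1.0 + 0j])          # flat start

for _ in range(50):                         # Gauss-Seidel sweeps over the PQ bus
    i = 1
    V[i] = (np.conj(S_injected[i]) / np.conj(V[i]) - Y[i, 0] * V[0]) / Y[i, i]

print(f"|V1| = {abs(V[1]):.4f} pu, angle = {np.degrees(np.angle(V[1])):.2f} deg")
```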

Keywords: Load Flow Studies, Y-matrix and Z-matrix iteration, Newton-Raphson method, Fast Decoupled method, Fuzzy logic, Artificial Neural Network.

Downloads: 6855
518 Exploring Management of the Fuzzy Front End of Innovation in a Product Driven Startup Company

Authors: Dmitry K. Shaytan, Georgy D. Laptev

Abstract:

In our research, we aimed to test a managerial approach for the fuzzy front end (FFE) of innovation by creating a controlled experiment/business case around the development of a breakthrough innovation. The experiment took place in the sport industry and covered all aspects of the customer discovery stage, from ideation to prototyping, followed by a patent application. In the paper we describe and analyze the milestones, tasks, management challenges and decisions made to create the breakthrough innovation, and we evaluate the overall managerial efficiency at the FFE stage considered. We define the managerial outcome of the FFE stage as having a valid product concept in hand, and we introduce a hypothetical construct, the “Q-factor”, that helps us distinguish the quality of FFE outcomes in the experiment. The experiment simulated the FFE of innovation for an entrepreneur and placed the responsibility for reaching a valid product concept on his shoulders. While developing the managerial approach to reach this outcome, we decided to look at the product concept from the cognitive psychology and cognitive science point of view. This view helped us develop the profile of a person whose projection (mental representation) of a new product could optimize FFE activities for a manager or entrepreneur. In the experiment, this profile was used to develop a breakthrough innovation for swimmers. Following the managerial approach, a product concept was created to help swimmers feel/sense the water, and a working prototype was developed to estimate the validity of the product concept and its value-added effect for customers. Feedback from coaches and swimmers indicated a strong positive effect with high value for customers and, for the experiment, a valid product concept developed by the proposed managerial approach for the FFE. In conclusion, we suggest a managerial approach derived from the experiment.

Keywords: Concept development, concept testing, customer discovery, entrepreneurship, entrepreneurial management, idea generation, idea screening, startup management.

Downloads: 1852
517 Pruning Algorithm for the Minimum Rule Reduct Generation

Authors: Şahin Emrah Amrahov, Fatih Aybar, Serhat Doğan

Abstract:

In this paper, we consider the rule reduct generation problem. The Rule Reduct Generation (RG) and Modified Rule Generation (MRG) algorithms, which are used to solve this problem, are well known. As an alternative to these algorithms, we develop the Pruning Rule Generation (PRG) algorithm and compare it with RG and MRG.

Keywords: Rough sets, Decision rules, Rule induction, Classification.

Downloads: 2048
516 Inverse Sets-based Recognition of Video Clips

Authors: Alexei M. Mikhailov

Abstract:

The paper discusses the mathematics of pattern indexing and its application to the recognition of visual patterns found in video clips. It is shown that (a) pattern indexes can be represented by collections of inverted patterns, and (b) solutions to pattern classification problems can be found as intersections and histograms of inverted patterns, so that matching of the original patterns is avoided.
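
A toy sketch of the inverted-pattern idea: quantised features index into postings, and a query is classified by histogramming (voting over) the postings its features hit, rather than by matching the original patterns. The feature names and clips below are invented placeholders.

```python
from collections import Counter

# Toy inverted index: each quantised visual feature maps to the set of
# (clip_id, frame) postings where it occurs.
inverted = {
    "f17": {("clipA", 3), ("clipB", 7)},
    "f42": {("clipA", 3), ("clipA", 4)},
    "f99": {("clipC", 1), ("clipA", 3)},
}

def recognise(query_features):
    """Histogram the postings hit by the query's features; the top-voted clip
    is the recognition result, with no direct pattern-to-pattern matching."""
    votes = Counter()
    for f in query_features:
        for clip_id, _frame in inverted.get(f, ()):
            votes[clip_id] += 1
    return votes.most_common()

print(recognise(["f17", "f42", "f99"]))   # clipA accumulates the most votes
```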

Keywords: Artificial neural cortex, computational biology, data mining, pattern recognition.

Downloads: 2114
515 A Practical Approach for Testing the Process Quality

Authors: Mou-Yuan Liao, Chien-Wei Wu, Chien-Hua Lin

Abstract:

The process capability index Cpk is the most widely used index in managerial decision making, since it provides bounds on the process yield for normally distributed processes. However, existing methods for assessing process performance that are constructed by statistical inference may not give satisfactory results, because uncertainties exist in most real-world applications. This study therefore adopts fuzzy inference for testing Cpk: a brief score is obtained for assessing a supplier's process instead of a strict evaluation.

Keywords: Process capability analysis, quality control.

Downloads: 1425