Search results for: associated probabilities of a fuzzy measure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4027


3487 An Integrated Label Propagation Network for Structural Condition Assessment

Authors: Qingsong Xiong, Cheng Yuan, Qingzhao Kong, Haibei Xiong

Abstract:

Deep-learning-driven approaches based on vibration responses have attracted growing attention for rapid structural condition assessment, while obtaining sufficient measured training data with corresponding labels is relatively costly and often inaccessible in practical engineering. This study proposes an integrated label propagation network for structural condition assessment, which diffuses labels from continuously generated measurements of the intact structure to unlabeled damage scenarios. The integrated network embeds damage-sensitive feature extraction by a deep autoencoder and pseudo-label propagation by optimized fuzzy clustering, whose architecture and mechanism are elaborated. With a sophisticated network design and specific strategies for improving performance, the network extends the strengths of self-supervised representation learning, unsupervised fuzzy clustering, and supervised classification into an integrated framework for assessing damage conditions. Both numerical simulations and full-scale laboratory shaking table tests of a two-story building structure were conducted to validate its capability to detect post-earthquake damage. The identification accuracy of the proposed network was 0.95 in numerical validations and 0.86 on average in laboratory case studies. Notably, the training procedure of all models in the network relies on no labeled data from damage scenarios, only several samples of the intact structure, which indicates a significant advantage in model adaptability and practical applicability.
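The pseudo-label diffusion idea can be illustrated with a minimal, generic label-propagation sketch (a Zhou-style iterative scheme on a similarity graph, not the paper's actual network); the toy graph, the alpha value, and the two clusters are all invented for illustration:

```python
# Minimal label propagation: labels from a few "intact" seed samples diffuse
# over a row-normalized similarity graph via F <- alpha * A * F + (1-alpha) * Y.
# Hypothetical sketch only; not the paper's autoencoder + fuzzy clustering network.

def propagate(adj, seed_labels, n_classes, alpha=0.8, iters=50):
    """adj: row-normalized adjacency (list of lists);
    seed_labels: dict node -> class for the few labeled samples."""
    n = len(adj)
    Y = [[0.0] * n_classes for _ in range(n)]      # one-hot seed matrix
    for node, c in seed_labels.items():
        Y[node][c] = 1.0
    F = [row[:] for row in Y]
    for _ in range(iters):
        new_F = []
        for i in range(n):
            row = []
            for c in range(n_classes):
                diffused = sum(adj[i][j] * F[j][c] for j in range(n))
                row.append(alpha * diffused + (1 - alpha) * Y[i][c])
            new_F.append(row)
        F = new_F
    return [max(range(n_classes), key=lambda c: F[i][c]) for i in range(n)]

# toy graph: nodes 0-2 form one cluster, 3-5 another; only nodes 0 and 3 labeled
adj = [
    [0.0, 0.5, 0.5, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0, 0.0, 0.0],
    [0.5, 0.5, 0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0, 0.5, 0.5],
    [0.0, 0.0, 0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.5, 0.5, 0.0],
]
labels = propagate(adj, {0: 0, 3: 1}, n_classes=2)
```

Each unlabeled node inherits the label that has diffused to it most strongly from the seeded cluster it is connected to.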

Keywords: autoencoder, condition assessment, fuzzy clustering, label propagation

Procedia PDF Downloads 81
3486 An Attempt to Measure Afro-Polychronism Empirically

Authors: Aïda C. Terblanché-Greeff

Abstract:

Afro-polychronism is a unique amalgamated cultural value of social self-construal and time orientation. The construct is conceptually analysed by focusing on Ubuntu as collectivism and African time as polychronism, and it is argued that these cultural values have a reciprocal and thus inseparable relationship. As it is general practice to measure cultural values empirically, the author conducted empirically engaged philosophy and aimed to develop a scale to measure Afro-polychronism based on its two dimensions: Ubuntu as social self-construal and African time as time orientation. From the scale's psychometric properties, it was determined that the scale was, in fact, neither reliable nor valid. The correlation between the Ubuntu dimension and the African time dimension was found to be moderate (albeit statistically significant). In conclusion, the author abduced why this cultural value cannot be empirically measured on the basis of its theoretical definition and indicated a more promising alternative path.

Keywords: African time, Afro-polychronism, empirically engaged African philosophy, Ubuntu

Procedia PDF Downloads 127
3485 Analysis of Critical Success Factors for Implementing Industry 4.0 and Circular Economy to Enhance Food Traceability

Authors: Mahsa Pishdar

Abstract:

Food traceability through the supply chain is facing increased demand. The Internet of Things (IoT) and blockchain are among the Industry 4.0 tools that could be integrated to help implement Circular Economy (CE) principles while enhancing food traceability solutions. However, such tools need intelligent systems and infrastructure to be established as guidance along the way, helping to overcome obstacles. That is why the critical success factors for implementing Industry 4.0 and circular economy principles in food traceability are analyzed in this paper by a combination of the interval type-2 fuzzy Best Worst Method and Measurement of Alternatives and Ranking according to Compromise Solution (interval type-2 fuzzy BWM-MARCOS). Results indicate that "knowledge of Industry 4.0 obligations and CE principles" is the most important factor and the basis of success, followed by "management commitment and support". This will assist decision makers in securing a competitive advantage while reducing costs through the supply chain.

Keywords: food traceability, Industry 4.0, internet of things, blockchain, best worst method, MARCOS

Procedia PDF Downloads 182
3484 Music Genre Classification Based on Non-Negative Matrix Factorization Features

Authors: Soyon Kim, Edward Kim

Abstract:

In order to retrieve information from the massive stream of songs in the music industry, music search by title, lyrics, artist, mood, and genre has become more important. Despite the subjectivity and controversy over the definition of music genres across different nations and cultures, automatic genre classification systems that facilitate the process of music categorization have been developed. Manual genre selection by music producers provides the statistical data for designing automatic genre classification systems. In this paper, an automatic music genre classification system utilizing non-negative matrix factorization (NMF) is proposed. Short-term characteristics of the music signal are captured by timbre features such as the mel-frequency cepstral coefficient (MFCC), decorrelated filter bank (DFB), octave-based spectral contrast (OSC), and octave band sum (OBS). Long-term time-varying characteristics of the music signal are summarized with (1) statistical features such as the mean, variance, minimum, and maximum of the timbre features and (2) modulation spectrum features such as the spectral flatness measure, spectral crest measure, spectral peak, spectral valley, and spectral contrast of the timbre features. In addition to these conventional long-term feature vectors, NMF-based feature vectors are proposed for use in genre classification. In the training stage, NMF basis vectors were extracted for each genre class. The NMF features were calculated in the log spectral magnitude domain (NMF-LSM) as well as in the basic feature vector domain (NMF-BFV). For NMF-LSM, the entire full-band spectrum was used; for NMF-BFV, only the low-band spectrum was used, since the high-frequency modulation spectrum of the basic feature vectors did not contain important information for genre classification.
In the test stage, using the set of pre-trained NMF basis vectors, the genre classification system extracted the NMF weighting values of each genre as the NMF feature vectors. A support vector machine (SVM) was used as the classifier. The GTZAN multi-genre music database, composed of 10 genres with 100 songs each, was used for training and testing, and 10-fold cross-validation was used to increase the reliability of the experiments. For a given input song, an extracted NMF-LSM feature vector was composed of 10 weighting values corresponding to the classification probabilities for the 10 genres; an NMF-BFV feature vector likewise had a dimensionality of 10. Combined with the basic long-term features (statistical and modulation spectrum features), the NMF features provided increased accuracy with only a slight increase in feature dimensionality. The conventional basic features by themselves yielded 84.0% accuracy, but the basic features with NMF-LSM and NMF-BFV provided 85.1% and 84.2% accuracy, respectively. The basic features required a dimensionality of 460, while NMF-LSM and NMF-BFV each required a dimensionality of only 10. Combining the basic features, NMF-LSM, and NMF-BFV with an SVM using a radial basis function (RBF) kernel produced a significantly higher classification accuracy of 88.3% with a feature dimensionality of 480.
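A minimal sketch of the NMF machinery, assuming standard Lee-Seung multiplicative updates on a toy nonnegative "spectrum" matrix (the real system trains per-genre bases on GTZAN features); all dimensions and data here are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

def nmf(V, k, iters=200, eps=1e-9):
    """Factor nonnegative V (m x n) into W (m x k) @ H (k x n)
    with Lee-Seung multiplicative updates."""
    m, n = V.shape
    W = rng.random((m, k)) + 0.1
    H = rng.random((k, n)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

def nmf_weights(v, W, iters=200, eps=1e-9):
    """Project a new spectrum v onto a fixed, pre-trained basis W:
    only H is updated, giving the weighting values used as features."""
    h = rng.random((W.shape[1], 1)) + 0.1
    v = v.reshape(-1, 1)
    for _ in range(iters):
        h *= (W.T @ v) / (W.T @ W @ h + eps)
    return h.ravel()

# toy rank-2 "spectra": two bases emphasizing different frequency bands
true_W = np.array([[3, 0], [3, 0], [3, 0], [0, 0],
                   [0, 3], [0, 3], [0, 3], [0, 0]], float)
true_H = rng.random((2, 20))
V = true_W @ true_H + 0.01

W, H = nmf(V, k=2)
w = nmf_weights(V[:, 0], W)                      # feature vector for one "song"
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

In the full system, one such basis is trained per genre, and the concatenated weighting values form the 10-dimensional NMF feature vector fed to the SVM.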

Keywords: mel-frequency cepstral coefficient (MFCC), music genre classification, non-negative matrix factorization (NMF), support vector machine (SVM)

Procedia PDF Downloads 278
3483 Influence of Transportation Mode to the Deterioration Rate: Case Study of Food Transport by Ship

Authors: Danijela Tuljak-Suban, Valter Suban

Abstract:

Food, as perishable goods, represents a specific and sensitive part of supply chain theory, since changes in its physical or chemical characteristics considerably influence the approach to stock management. The most delicate phase of this process is transportation, where it becomes difficult to ensure the stability conditions that limit deterioration, since the deterioration rate can easily be influenced by the transportation mode. Fuzzy definition of variables allows these variations to be taken into account, and an appropriate choice of defuzzification method permits adapting results, as much as possible, to real conditions. In this article, those methods are applied to the relationship between the deterioration rate of perishable goods and transport by ship, with the aims of (a) minimizing the total cost function, defined as the sum of the ordering cost, holding cost, disposal cost, and transportation cost, and (b) improving supply chain sustainability by reducing the environmental impact and waste disposal costs.
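For illustration, a triangular fuzzy deterioration rate can be reduced to a crisp value by centroid defuzzification before entering a cost function; the fuzzy rates, cost coefficients, and the toy cost slice below are hypothetical, not the article's model:

```python
def triangular_centroid(a, b, c):
    """Centroid (center-of-gravity) defuzzification of a triangular
    fuzzy number (a, b, c); for a triangle this reduces to (a+b+c)/3."""
    return (a + b + c) / 3.0

# hypothetical fuzzy deterioration rates per transport mode (fraction lost/day)
rates = {
    "reefer_ship": (0.002, 0.004, 0.007),   # stable conditions, slow spoilage
    "open_deck":   (0.010, 0.020, 0.035),   # exposed, faster spoilage
}
crisp = {mode: triangular_centroid(*t) for mode, t in rates.items()}

def holding_plus_disposal(q, days, rate, h=0.1, d=2.0):
    """Toy slice of the total cost: holding cost on surviving stock plus
    disposal cost on deteriorated stock (coefficients are illustrative)."""
    spoiled = q * rate * days
    return h * (q - spoiled) * days + d * spoiled

cost_reefer = holding_plus_disposal(100, 10, crisp["reefer_ship"])
cost_open   = holding_plus_disposal(100, 10, crisp["open_deck"])
```

The crisp rate then plugs into the total cost function to be minimized, so a mode with a higher defuzzified deterioration rate yields a higher cost here.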

Keywords: perishable goods, fuzzy reasoning, transport by ship, supply chain sustainability

Procedia PDF Downloads 529
3482 Coupling Fuzzy Analytic Hierarchy Process with Storm Water Management Model for Site Selection of Appropriate Adaptive Measures

Authors: Negin Binesh, Mohammad Hossein Niksokhan, Amin Sarang

Abstract:

Best Management Practices (BMPs) have been considered one of the most important structural adaptive measures to climate change and urban development challenges in recent decades. However, not every location in a watershed is appropriate for applying BMPs. In this paper, location prioritization of two kinds of BMPs was performed: porous pavement and detention ponds. The West Flood-Diversion (WFD) catchment in the northern parts of Tehran, Iran, was considered as the case study. The methodology integrates the results of the Storm Water Management Model (SWMM) into the Fuzzy Analytic Hierarchy Process (FAHP) method using a Geographic Information System (GIS). The results indicate that the mostly suburban areas in the northern parts of the watershed are appropriate for detention basins, while the downstream high-density urban areas are more suitable for permeable pavement.
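A hedged sketch of how FAHP criterion weights might be obtained, assuming Buckley's fuzzy-geometric-mean method with centroid defuzzification; the siting criteria and the pairwise judgments below are invented for illustration:

```python
import math

# Triangular fuzzy pairwise comparisons (l, m, u) for three hypothetical
# siting criteria: runoff depth, slope, land availability.
M = [
    [(1, 1, 1),         (2, 3, 4),       (4, 5, 6)],
    [(1/4, 1/3, 1/2),   (1, 1, 1),       (1, 2, 3)],
    [(1/6, 1/5, 1/4),   (1/3, 1/2, 1),   (1, 1, 1)],
]

def fuzzy_weights(M):
    """Buckley's method: row-wise fuzzy geometric means, fuzzy division by
    their sum, then centroid defuzzification to crisp normalized weights."""
    n = len(M)
    r = [tuple(math.prod(M[i][j][k] for j in range(n)) ** (1 / n)
               for k in range(3)) for i in range(n)]
    total = [sum(x[k] for x in r) for k in range(3)]
    # fuzzy division: (l, m, u) / (L, M_, U) -> (l/U, m/M_, u/L)
    w = [(x[0] / total[2], x[1] / total[1], x[2] / total[0]) for x in r]
    c = [(l + m + u) / 3 for (l, m, u) in w]       # centroid defuzzification
    s = sum(c)
    return [x / s for x in c]

w = fuzzy_weights(M)   # crisp weights used to score candidate BMP locations
```

The resulting weights would then be combined with SWMM-derived layer scores in GIS to rank candidate sites.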

Keywords: adaptive measures, BMPs, location prioritization, urban flooding

Procedia PDF Downloads 345
3481 Assessing Knowledge Management Impacts: Challenges, Limits and Base for a New Framework

Authors: Patrick Mbassegue, Mickael Gardoni

Abstract:

In a market environment centered more and more on services and the digital economy, knowledge management becomes a framework that can help organizations create value and improve their overall performance. Given the need for optimal allocation of scarce resources, managers are interested in demonstrating the added value generated by knowledge management projects. One of the challenges organizations face is the difficulty of measuring the impacts and concrete results of knowledge management initiatives. The present article concerns measuring the concrete results of knowledge management projects based on the balanced scorecard model. One goal is to outline what can be done with this model, but also to highlight its limits. The article is structured in five parts: 1) knowledge management projects and organizational impacts; 2) a framework and a methodology to measure organizational impacts; 3) application illustrated in two case studies; 4) limits of the proposed framework; 5) the proposal of a new framework to measure organizational impacts.

Keywords: knowledge management, project, balanced scorecard, impacts

Procedia PDF Downloads 248
3480 Scoring System for the Prognosis of Sepsis Patients in Intensive Care Units

Authors: Javier E. García-Gallo, Nelson J. Fonseca-Ruiz, John F. Duitama-Munoz

Abstract:

Sepsis is a syndrome that occurs with physiological and biochemical abnormalities induced by severe infection and carries high mortality and morbidity; therefore, the severity of a patient's condition must be interpreted quickly. After admission to an intensive care unit (ICU), it is necessary to synthesize the large volume of information collected from the patient into a value that represents the severity of their condition. Traditional severity-of-illness scores seek to be applicable to all patient populations and usually assess in-hospital mortality. However, the use of machine learning techniques and data from a population that shares a common characteristic could lead to customized mortality prediction scores with better performance. This study presents the development of a score for one-year mortality prediction for patients admitted to an ICU with a sepsis diagnosis. 5650 ICU admissions extracted from the MIMIC-III database were evaluated, divided into two groups: 70% to develop the score and 30% to validate it. Comorbidities, demographics, and clinical information from the first 24 hours after ICU admission were used to develop the score. LASSO (least absolute shrinkage and selection operator) and SGB (stochastic gradient boosting) variable-importance methodologies were used to select the set of variables that make up the score. Each of these variables was dichotomized at a cut-off point that divides the population into two groups with different mean mortalities: a patient in the higher-mortality group is assigned a one for that variable, otherwise a zero. These binary variables were used in a logistic regression (LR) model, and its coefficients were rounded to the nearest integer. The resulting integers are the point values that make up the score when multiplied by each binary variable and summed.
The one-year mortality probability was estimated using the score as the only variable in an LR model. The predictive power of the score was evaluated on the 1695 admissions of the validation subset, obtaining an area under the receiver operating characteristic curve of 0.7528, which outperforms the Sequential Organ Failure Assessment (SOFA), Oxford Acute Severity of Illness Score (OASIS), and Simplified Acute Physiology Score II (SAPS II) scores on the same validation subset. Observed and predicted mortality rates within deciles of estimated probability were compared graphically and found to be similar, indicating that the risk estimate obtained with the score is close to the observed mortality; the number of events (deaths) indeed increases from the decile with the lowest probabilities to the decile with the highest. Sepsis carries a high mortality, 43.3% for the patients included in this study; therefore, tools that help clinicians quickly and accurately predict a worse prognosis are needed. This work demonstrates the importance of customizing mortality prediction scores, since the developed score provides better performance than traditional scoring systems.
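The dichotomize-round-sum construction can be sketched as follows; the predictor names, point values, and LR coefficients below are invented for illustration and are not the published score:

```python
import math

# Hypothetical dichotomized predictors (1 = patient falls in the
# higher-mortality group at that variable's cut-off) with illustrative
# rounded logistic-regression coefficients as point values.
POINTS = {"age_over_65": 2, "high_lactate": 3,
          "low_urine_output": 1, "mech_ventilation": 2}
INTERCEPT, SLOPE = -3.0, 0.5   # illustrative second-stage LR parameters

def score(flags):
    """Integer score: sum of rounded LR coefficients over positive flags."""
    return sum(POINTS[k] for k, v in flags.items() if v)

def mortality_prob(s):
    """One-year mortality from an LR model with the score as sole covariate."""
    return 1 / (1 + math.exp(-(INTERCEPT + SLOPE * s)))

low  = score({"age_over_65": 0, "high_lactate": 0,
              "low_urine_output": 0, "mech_ventilation": 0})
high = score({"age_over_65": 1, "high_lactate": 1,
              "low_urine_output": 1, "mech_ventilation": 1})
```

The integer points keep the score computable at the bedside, while the second-stage logistic model maps it back to a calibrated probability.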

Keywords: intensive care, logistic regression model, mortality prediction, sepsis, severity of illness, stochastic gradient boosting

Procedia PDF Downloads 199
3479 Investigating the Relationship between Place Attachment and Sustainable Development of Urban Spaces

Authors: Hamid Reza Zeraatpisheh, Ali Akbar Heidari, Soleiman Mohammadi Doust

Abstract:

This study examined the relationship between place attachment and the sustainable development of urban spaces. Place attachment was measured with a standardized questionnaire across three domains: place identity (cognitive), emotional attachment (affective), and place attachment and social bonding (behavioral). To measure sustainable development, three components were considered: society, economy, and environment. The study is descriptive. The assessment instrument for place attachment is the standard questionnaire of Safarnia; to measure sustainable development, a questionnaire was constructed by the researcher based on the combined theoretical framework. The statistical population of this research was the city of Shiraz, with Hafeziyeh as the statistical sample site. SPSS software was used to analyze the data with both descriptive and inferential statistics; in the inferential analysis, the Pearson correlation coefficient was used to examine the hypotheses. In this study, both place attachment and sustainable development measured at high levels, and the results suggest a positive relationship between attachment to place and sustainable development.

Keywords: place attachment, sustainable development, economy-society-environment, Hafez's tomb

Procedia PDF Downloads 684
3478 Modeling and Temperature Control of Water-cooled PEMFC System Using Intelligent Algorithm

Authors: Chen Jun-Hong, He Pu, Tao Wen-Quan

Abstract:

Proton exchange membrane fuel cells (PEMFC) are among the most promising future energy sources owing to their low operating temperature, high energy efficiency, high power density, and environmental friendliness. In this paper, a comprehensive control-oriented PEMFC system model is developed in the Matlab/Simulink environment, including the hydrogen supply, air supply, and thermal management subsystems. An Improved Artificial Bee Colony (IABC) algorithm is used for parameter identification of the PEMFC semi-empirical equations, making the maximum relative error between simulation and experimental data less than 0.4%. Operating temperature is essential for a PEMFC; both high and low temperatures are disadvantageous. In the thermal management subsystem, the water pump and fan are both controlled with PID controllers to maintain an appropriate operating temperature for safe and efficient operation. To further improve the control effect, fuzzy control is introduced to optimize the PID controller of the pump, and a Radial Basis Function (RBF) neural network is introduced to optimize the PID controller of the fan. The results demonstrate that Fuzzy-PID and RBF-PID achieve a better control effect, with a 22.66% decrease in the Integral Absolute Error criterion (IAE) of T_st (the PEMFC temperature) and a 77.56% decrease in the IAE of T_in (the inlet cooling water temperature) compared with traditional PID. Finally, a novel thermal management structure is proposed in which the cooling air passing through the main radiator continues on to cool the secondary radiator. With this structure, parasitic power dissipation can be reduced by 69.94%, and the control effect improves with a 52.88% decrease in the IAE of T_in under the same controller.
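The IAE criterion used to compare controllers can be illustrated with a plain PID loop on a first-order thermal plant (a toy stand-in for the PEMFC thermal subsystem; the gains, time constant, and absence of actuator saturation are all simplifying assumptions):

```python
def simulate_pid(kp, ki, kd, t_set=70.0, t_amb=25.0, tau=10.0,
                 dt=0.1, steps=2000):
    """PI(D) regulation of a toy first-order thermal plant
    dT/dt = (-(T - t_amb) + u) / tau (no actuator saturation),
    returning the final temperature and the Integral Absolute Error."""
    T = t_amb
    integral = 0.0
    prev_err = t_set - T
    iae = 0.0
    for _ in range(steps):
        err = t_set - T
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv   # heater input
        prev_err = err
        T += dt * (-(T - t_amb) + u) / tau          # explicit Euler update
        iae += abs(err) * dt                        # IAE criterion
    return T, iae

T_fast, iae_fast = simulate_pid(kp=2.0, ki=0.5, kd=0.0)   # tighter PI tuning
T_slow, iae_slow = simulate_pid(kp=0.5, ki=0.1, kd=0.0)   # sluggish PI tuning
```

A lower IAE means the temperature spends less cumulative time away from the setpoint, which is exactly how the Fuzzy-PID and RBF-PID improvements are quantified.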

Keywords: PEMFC system, parameter identification, temperature control, Fuzzy-PID, RBF-PID, parasitic power

Procedia PDF Downloads 64
3477 Using “Eckel” Model to Measure Income Smoothing Practices: The Case of French Companies

Authors: Feddaoui Amina

Abstract:

Income smoothing represents an attempt by a company's management to reduce variations in earnings through the manipulation of accounting principles. In this study, we aimed to measure income smoothing practices in a sample of 30 French joint stock companies during the period 2007-2009. We used the dummy variables method and the Eckel model to measure income smoothing practices, and a binomial test in the SPSS program to confirm or refute our hypothesis. The study concluded that there are no statistically significant indicators of income smoothing in the sample of French companies studied during 2007-2009; the income series of the sample is thus characterized by stability and non-volatility, without management intervention through accounting manipulation. However, this type of accounting manipulation should still be monitored, and efforts should be made by control bodies to apply the Eckel model and generalize its use at the global level.
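A minimal sketch of the Eckel classification, assuming the usual form of the index (coefficient of variation of income changes over that of sales changes, with values below 1 flagging a smoother); the firm data below are invented:

```python
import statistics

def eckel_index(income, sales):
    """Eckel smoothing index: CV of one-period income changes divided by
    the CV of sales changes; a value below 1 flags a likely smoother."""
    d_inc = [b - a for a, b in zip(income, income[1:])]
    d_sal = [b - a for a, b in zip(sales, sales[1:])]
    cv = lambda xs: statistics.stdev(xs) / abs(statistics.mean(xs))
    return cv(d_inc) / cv(d_sal)

# hypothetical firm: sales fluctuate while reported income climbs steadily
sales  = [100, 130, 115, 160, 150, 190]
income = [10.0, 10.3, 10.55, 10.9, 11.15, 11.5]

idx = eckel_index(income, sales)
is_smoother = idx < 1
```

In the dummy-variables step, each firm-year is then coded 1 or 0 according to this classification before the binomial test is applied.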

Keywords: income, smoothing, 'Eckel', French companies

Procedia PDF Downloads 141
3476 A Closed-Loop Design Model for Sustainable Manufacturing by Integrating Forward Design and Reverse Design

Authors: Yuan-Jye Tseng, Yi-Shiuan Chen

Abstract:

In this paper, a new concept of a closed-loop design model is presented, developed by integrating forward design and reverse design. Based on this concept, a closed-loop design model for sustainable manufacturing is built through integrated evaluation of forward design, reverse design, and green manufacturing using a fuzzy analytic network process. In the design stage of a product, a given product requirement and objective can be met by different detailed components and specifications; there can therefore be different design cases achieving the same requirement and objective, and the design evaluation stage must analyze and evaluate them. The purpose of this research is to develop a model for evaluating such design cases through integrated evaluation of the forward design, reverse design, and green manufacturing models. A fuzzy analytic network process model is presented for the integrated evaluation of the criteria in the three models, and the comparison matrices for evaluating the criteria in the three groups are established. The total relational values among the three groups represent the total relational effects. In application, a supermatrix can be created, and the total relational values used to evaluate the design cases and select the final design case. An example product is demonstrated, showing that the model is useful for integrated evaluation of forward design, reverse design, and green manufacturing to achieve a closed-loop design for the sustainable manufacturing objective.
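One way to read the supermatrix step: raising a column-stochastic supermatrix to high powers yields limiting priorities across the element groups. A minimal sketch with an invented 3x3 weighted supermatrix (the groups and values are illustrative assumptions):

```python
def mat_mult(A, B):
    """Plain matrix product of two nested-list matrices."""
    n, m, p = len(A), len(B[0]), len(B)
    return [[sum(A[i][k] * B[k][j] for k in range(p)) for j in range(m)]
            for i in range(n)]

def limit_supermatrix(W, power=64):
    """Raise a column-stochastic supermatrix to a high power; for a
    primitive matrix the columns converge to the global priority vector."""
    R = W
    for _ in range(power):
        R = mat_mult(R, W)
    return R

# hypothetical weighted supermatrix over three element groups
# (forward design, reverse design, green manufacturing); columns sum to 1
W = [
    [0.6, 0.2, 0.2],
    [0.3, 0.5, 0.3],
    [0.1, 0.3, 0.5],
]
L = limit_supermatrix(W)
priorities = [row[0] for row in L]   # any column of the limit matrix
```

Every column of the limit matrix carries the same priority vector, which is then used to score and rank the competing design cases.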

Keywords: design evaluation, forward design, reverse design, closed-loop design, supply chain management, closed-loop supply chain, fuzzy analytic network process

Procedia PDF Downloads 657
3475 Clustering Based Level Set Evaluation for Low Contrast Images

Authors: Bikshalu Kalagadda, Srikanth Rangu

Abstract:

The main objective of image segmentation is to extract objects with respect to some input features, and the level set method is one of the important methods for this task. Medical and synthetic images often have low-contrast pixel profiles, which makes it difficult to locate the features of interest. In the conventional level set formulation, irregularities develop as the object contour evolves, destroying the stability of the evolution process. As a remedy, a new hybrid algorithm, Clustering-Based Level Set Evolution, is proposed: kernel fuzzy particle swarm optimization clustering is combined with the Distance Regularized Level Set (DRLS) and the Selective Binary and Gaussian Filtering Regularized Level Set (SBGFRLS) methods. Identifying different regions becomes easier, with improved speed. The efficiency of the modified method is evaluated by comparison with the previous methods under similar specifications, using both medical and synthetic images.

Keywords: segmentation, clustering, level set function, re-initialization, Kernel fuzzy, swarm optimization

Procedia PDF Downloads 334
3474 Uncertain Time-Cost Trade off Problems of Construction Projects Using Fuzzy Set Theory

Authors: V. S. S. Kumar, B. Vikram

Abstract:

The development of effective decision support tools adopted in the construction industry is vital in today's world, since it can lead to substantial cost reduction and efficient resource consumption. Solving time-cost trade-off problems and their variants is at the heart of scientific research on optimizing construction planning. In general, classical optimization techniques have difficulty with TCT problems; one of the main reasons for their failure is that they are easily trapped in local minima. This paper investigates the application of meta-heuristic techniques to two particular variants of time-cost trade-off analysis: the time-cost trade-off problem (TCT), in which the total project cost should be minimized, and the time-cost trade-off optimization problem (TCO), in which the total project cost and total project duration should be minimized simultaneously. It is expected that the optimization models developed in this paper will contribute significantly to efficient planning and management of construction projects.

Keywords: fuzzy sets, uncertainty, optimization, time-cost trade-off problems

Procedia PDF Downloads 338
3473 Passenger Flow Characteristics of Seoul Metropolitan Subway Network

Authors: Kang Won Lee, Jung Won Lee

Abstract:

Characterizing network flow is of fundamental importance for understanding the complex dynamics of networks, and the passenger flow characteristics of a subway network are highly relevant for effective transportation management in urban cities. In this study, passenger flow in the Seoul metropolitan subway network is investigated and characterized through statistical analysis. The traditional betweenness centrality measure considers only the topological structure of the network and ignores transportation factors. This paper proposes a weighted betweenness centrality measure that incorporates monthly passenger flow volume. We apply the proposed measure to the Seoul metropolitan subway network, involving 493 stations and 16 lines, and derive several interesting insights about the network from the new measure. Using the Kolmogorov-Smirnov test, we also find that the monthly passenger flow between any two stations follows a power-law distribution, while other traffic characteristics, such as congestion level and through-flow traffic, follow exponential distributions.
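The Kolmogorov-Smirnov comparison can be sketched on synthetic data standing in for station-pair flows; the Pareto exponent and the exponential alternative below are illustrative assumptions, not fitted values from the Seoul data:

```python
import math
import random

random.seed(42)

def ks_statistic(samples, cdf):
    """Kolmogorov-Smirnov distance between the empirical CDF of the
    samples and a candidate theoretical CDF."""
    xs = sorted(samples)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d

alpha = 2.5                                                  # illustrative exponent
flows = [random.paretovariate(alpha) for _ in range(5000)]   # synthetic "flows"

# Pareto(alpha) with x_m = 1 has CDF 1 - x^(-alpha) for x >= 1; the shifted
# exponential below is a mean-matched rival model for comparison.
d_power = ks_statistic(flows, lambda x: 1 - x ** (-alpha))
d_expo  = ks_statistic(flows, lambda x: 1 - math.exp(-1.5 * (x - 1)))
```

The power-law hypothesis yields a much smaller KS distance than the exponential rival on heavy-tailed data, which is the shape of the argument made for the station-pair flows.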

Keywords: betweenness centrality, correlation coefficient, power-law distribution, Korea traffic DB

Procedia PDF Downloads 270
3472 On Coverage Probability of Confidence Intervals for the Normal Mean with Known Coefficient of Variation

Authors: Suparat Niwitpong, Sa-aat Niwitpong

Abstract:

Statistical inference for a normal mean with known coefficient of variation has been investigated recently. This situation commonly occurs in environmental and agricultural experiments, where the scientist knows the coefficient of variation of the experiment. In this paper, we construct new confidence intervals for the normal population mean with known coefficient of variation and derive analytic expressions for the coverage probability of each confidence interval. To confirm our theoretical results, Monte Carlo simulation is used to assess the performance of these intervals based on their coverage probabilities.
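The Monte Carlo assessment of coverage probability can be sketched with the ordinary known-sigma z-interval as a baseline (not the paper's new intervals); treating sigma = cv * mu as known is the simplifying assumption:

```python
import random
import statistics

random.seed(7)

def coverage(mu, cv, n, reps=5000, z=1.96):
    """Monte Carlo estimate of the coverage probability of the
    known-sigma z-interval, treating sigma = cv * mu as known
    (an illustrative baseline, not the paper's new intervals)."""
    sigma = cv * mu
    half = z * sigma / n ** 0.5
    hits = 0
    for _ in range(reps):
        xbar = statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
        hits += (xbar - half <= mu <= xbar + half)
    return hits / reps

cov = coverage(mu=10.0, cv=0.2, n=15)   # should sit near the nominal 0.95
```

An interval is judged by how close its estimated coverage stays to the nominal confidence level across parameter settings.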

Keywords: confidence interval, coverage probability, expected length, known coefficient of variation

Procedia PDF Downloads 373
3471 A Geospatial Consumer Marketing Campaign Optimization Strategy: Case of Fuzzy Approach in Nigeria Mobile Market

Authors: Adeolu O. Dairo

Abstract:

Getting the consumer marketing strategy right is a crucial and complex task for firms with a large customer base, such as mobile operators in a competitive mobile market. While empirical studies have made efforts to identify key constructs, no geospatial model has been developed to comprehensively assess the viability and interdependency of the ground realities regarding the customer, competition, channel, and network quality of mobile operators. With this research, a geo-analytic framework is proposed for strategy formulation and resource allocation for mobile operators. First, a fuzzy analytic network depicting the interrelationships amongst these ground realities is developed using a self-organizing feature map clustering technique, based on inputs from managers and the literature. The model is tested with a mobile operator in the Nigerian mobile market. As a result, a customer-centric geospatial and visualization solution is developed, providing a consolidated and integrated insight that serves as a transparent, logical, and practical guide for strategic, tactical, and operational decision making.
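The self-organizing feature map step can be sketched minimally in one dimension; the two "ground reality" clusters and all hyperparameters below are invented for illustration:

```python
import math
import random

random.seed(3)

def train_som_1d(data, n_units=4, iters=500, lr0=0.5, radius0=1.0):
    """Minimal 1-D self-organizing feature map: each sample pulls its
    best-matching unit (and, early on, that unit's neighbors) toward
    itself, with decaying learning rate and neighborhood radius."""
    dim = len(data[0])
    # spread the units along the diagonal of the unit square
    units = [[u / (n_units - 1)] * dim for u in range(n_units)]
    for t in range(iters):
        lr = lr0 * (1 - t / iters)
        radius = max(radius0 * (1 - t / iters), 0.01)
        x = random.choice(data)
        bmu = min(range(n_units),
                  key=lambda u: sum((units[u][d] - x[d]) ** 2
                                    for d in range(dim)))
        for u in range(n_units):
            h = math.exp(-((u - bmu) ** 2) / (2 * radius ** 2))
            for d in range(dim):
                units[u][d] += lr * h * (x[d] - units[u][d])
    return units

# two invented "ground reality" clusters (e.g., scaled customer and
# network-quality indicators)
data = ([[random.gauss(0.1, 0.02), random.gauss(0.1, 0.02)] for _ in range(50)]
        + [[random.gauss(0.9, 0.02), random.gauss(0.9, 0.02)] for _ in range(50)])
units = train_som_1d(data)
```

After training, the ends of the map settle on the two clusters, giving the segment prototypes that the fuzzy analytic network then reasons over.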

Keywords: geospatial, geo-analytics, self-organizing map, customer-centric

Procedia PDF Downloads 162
3470 Speaker Identification by Atomic Decomposition of Learned Features Using Computational Auditory Scene Analysis Principals in Noisy Environments

Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic

Abstract:

Speaker recognition is performed in high Additive White Gaussian Noise (AWGN) environments using principles of Computational Auditory Scene Analysis (CASA). CASA methods often classify sounds from images in the time-frequency (T-F) plane, using spectrograms or cochleagrams as the image. In this paper, atomic decomposition implemented by matching pursuit performs the transform from time-series speech signals to the T-F plane. The atomic decomposition creates a sparsely populated T-F vector in "weight space", where each populated T-F position contains an amplitude weight. The weight-space vector, along with the atomic dictionary, represents a denoised, compressed version of the original signal, and the arrangement of the atomic indices in the T-F vector is used for classification. Unsupervised feature learning, implemented by a sparse autoencoder, learns a single dictionary of basis features from a collection of envelope samples from all speakers. The approach is demonstrated using pairs of speakers from the TIMIT data set, selected randomly from a single district. Each speaker has 10 sentences: two are used for training and eight for testing. Atomic index probabilities are created for each training sentence and each test sentence, and classification is performed by finding the lowest Euclidean distance between the probabilities from the training sentences and those from the test sentences. Training is done at a 30 dB signal-to-noise ratio (SNR); testing is performed at SNRs of 0 dB, 5 dB, 10 dB, and 30 dB. The algorithm has a baseline classification accuracy of ~93%, averaged over 10 pairs of speakers from the TIMIT data set. The baseline accuracy is attributable to the short sequences of training and test data, as well as the overall simplicity of the classification algorithm. The accuracy is not affected by AWGN, remaining ~93% at 0 dB SNR.
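Matching pursuit itself can be sketched compactly; the tiny orthonormal dictionary below stands in for a real Gabor dictionary, and the signal is invented:

```python
def matching_pursuit(signal, atoms, n_iter=3):
    """Greedy matching pursuit: repeatedly pick the unit-norm atom with
    the largest |inner product| with the residual and subtract its
    contribution, yielding a sparse (index, weight) decomposition."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    residual = list(signal)
    picks = []
    for _ in range(n_iter):
        best = max(range(len(atoms)),
                   key=lambda k: abs(dot(residual, atoms[k])))
        w = dot(residual, atoms[best])
        residual = [r - w * a for r, a in zip(residual, atoms[best])]
        picks.append((best, w))
    return picks, residual

# tiny orthonormal "dictionary" (standard basis of R^4, a stand-in for
# the learned Gabor-like atoms)
atoms = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
signal = [0.0, 3.0, 0.0, -1.5]
picks, residual = matching_pursuit(signal, atoms, n_iter=2)
```

The selected indices populate the sparse weight-space vector; it is the pattern of those indices, not the raw waveform, that feeds the speaker classifier.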

Keywords: time-frequency plane, atomic decomposition, envelope sampling, Gabor atoms, matching pursuit, sparse dictionary learning, sparse autoencoder

Procedia PDF Downloads 278
3469 Application Research of Stilbene Crystal for the Measurement of Accelerator Neutron Sources

Authors: Zhao Kuo, Chen Liang, Zhang Zhongbing, Ruan Jinlu, He Shiyi, Xu Mengxuan

Abstract:

Stilbene (C₁₄H₁₂) is well known as one of the most useful organic scintillators for the pulse shape discrimination (PSD) technique owing to its good scintillation properties. An on-line acquisition system and an off-line acquisition system were developed, the former from several CAMAC standard plug-ins, NIM plug-ins, and a neutron/γ discriminating plug-in (2160A), and the latter around a digital oscilloscope with a high sampling rate; both used stilbene crystals coupled to photomultiplier tube (PMT) detectors for accelerator neutron source measurements carried out at the China Institute of Atomic Energy. Pulse amplitude spectra and charge amplitude spectra were recorded in real time after good neutron/γ discrimination, with best PSD figures of merit (FoMs) of 1.756 for the D-D accelerator neutron source and 1.393 for the D-T accelerator neutron source. For the D-D source, the proportion of neutron events among total events was 80% and the neutron detection efficiency was 5.21%; the corresponding values for the D-T source were 50% and 1.44% after subtracting the scattering background observed by the on-line acquisition system. Pulse waveform signals were sampled at random by the off-line acquisition system while the on-line system was running. After off-line waveform digitization processing with the charge integration method on just 1000 pulses, the PSD FoMs obtained by the off-line system were 2.158 for the D-D source and 1.802 for the D-T source. In addition, the proportions of neutron events obtained by the off-line system agreed very well with those of the on-line system. The pulse information recorded by the off-line system can be reused to adjust PSD parameters or methods and to obtain neutron charge amplitude spectra or pulse amplitude spectra by digital analysis of a limited number of pulses.
The off-line acquisition system showed equivalent or better measurement performance than the on-line system with a limited number of pulses, which indicates a feasible stilbene-crystal-detector-based method for measuring prompt neutron sources, such as pulsed accelerator neutron sources that emit a large number of neutrons in a short time.
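
The charge integration method and the figure of merit used above can be sketched as follows. This is a minimal illustration on synthetic pulses: the decay constants, gate positions, and pulse shapes are invented assumptions, not the parameters used in the study.

```python
import numpy as np

def psd_ratio(pulse, n_baseline=10, tail_gate=30):
    """Charge integration method: tail charge divided by total charge."""
    p = pulse - pulse[:n_baseline].mean()        # baseline subtraction
    return p[tail_gate:].sum() / p.sum()

def figure_of_merit(r_a, r_b):
    """FoM = peak separation / (FWHM_a + FWHM_b); Gaussian FWHM = 2.355 * sigma."""
    fwhm = lambda r: 2.355 * r.std()
    return abs(r_a.mean() - r_b.mean()) / (fwhm(r_a) + fwhm(r_b))

def synthetic_pulse(rng, slow_frac, n=110, onset=10, tau_fast=5.0, tau_slow=30.0):
    """Toy scintillation pulse: fast plus slow exponential decay, with noise.
    Neutrons carry a larger slow component than gammas in organic scintillators."""
    t = np.arange(n - onset)
    p = np.zeros(n)
    p[onset:] = (1 - slow_frac) * np.exp(-t / tau_fast) + slow_frac * np.exp(-t / tau_slow)
    return p + rng.normal(0, 0.002, n)

rng = np.random.default_rng(0)
gammas   = np.array([psd_ratio(synthetic_pulse(rng, 0.02)) for _ in range(500)])
neutrons = np.array([psd_ratio(synthetic_pulse(rng, 0.20)) for _ in range(500)])
fom = figure_of_merit(gammas, neutrons)
```

A larger tail fraction pushes the neutron ratio distribution away from the gamma one, and the FoM quantifies how cleanly the two peaks separate relative to their widths.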

Keywords: stilbene crystal, accelerator neutron source, neutron / γ discrimination, figure-of-merits, CAMAC, waveform digitization

Procedia PDF Downloads 172
3468 Heart Rate Variability as a Measure of Dairy Calf Welfare

Authors: J. B. Clapp, S. Croarkin, C. Dolphin, S. K. Lyons

Abstract:

Chronic pain or stress in farm animals impacts both their welfare and their productivity. Measuring chronic pain or stress can be problematic using hormonal or behavioural changes, because hormones are modulated by homeostatic mechanisms and observed behaviour can be highly subjective. We propose that heart rate variability (HRV) can quantify chronic pain or stress in farmed animals and represents a more robust and objective measure of their welfare.

Keywords: dairy calf, welfare, heart rate variability, non-invasive, biomonitor

Procedia PDF Downloads 584
3467 Proposing an Index for Determining Key Knowledge Management Processes in Decision Making Units Using Fuzzy Quality Function Deployment (QFD), Data Envelopment Analysis (DEA) Method

Authors: Sadegh Abedi, Ali Yaghoubi, Hamidreza Mashatzadegan

Abstract:

This paper proposes an approach to identify the key processes required by an organization in the field of knowledge management and to align them with organizational objectives. For this purpose, the organization's most important non-financial objectives that are affected by knowledge management processes are first identified and then, using a house of quality, linked with the knowledge management processes, which are regarded as the technical elements. Using this method, the processes most in need of improvement and attention are prioritized according to their significance: a process that has more influence on the organization's objectives and is in a worse situation than the others is selected first for improvement. In this research, process dominance is treated as an influential element in process ranking, in addition to the relationship matrix, which is why DEA techniques are used to prioritize the processes in the house of quality. The results of implementing the method at Khuzestan Steel Company demonstrate its capability to identify the key processes that require improvement in the organization's knowledge management system.
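
In the simplest single-input, single-output case, the DEA efficiency score used for this kind of ranking reduces to each unit's output/input ratio normalized by the best ratio. The sketch below illustrates that reduction with invented process scores, not the company's data; multi-input DEA would require solving a linear program per unit instead.

```python
def dea_efficiency(inputs, outputs):
    """CCR-style efficiency for one input and one output per decision-making
    unit: productivity ratio normalized by the best-performing unit."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical knowledge-management processes:
# input = improvement effort, output = impact on objectives.
effort = [2.0, 4.0, 3.0]
impact = [2.0, 2.0, 4.5]
scores = dea_efficiency(effort, impact)   # efficient units score 1.0
```

A process with a low efficiency score but high influence on objectives would be prioritized for improvement in the house of quality.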

Keywords: knowledge management, organizational performance, fuzzy data envelopment analysis

Procedia PDF Downloads 402
3466 Evaluation of the Matching Optimization of Human-Machine Interface Matching in the Cab

Authors: Yanhua Ma, Lu Zhai, Xinchen Wang, Hongyu Liang

Abstract:

In this paper, based on the current state of human-machine interfaces in automobile cabs, a combined subjective and objective evaluation system for assessing human-machine interface matching optimization in the cab was established. The human-machine interface of the cab was divided into a software interface and a hard interface. The objective evaluation method of software human factor analysis is used to evaluate hard-interface matching, while the analytic hierarchy process is used to establish the evaluation index system for software-interface matching optimization, and the multi-level fuzzy comprehensive evaluation method is used to evaluate the software interface. This article takes the Dongfeng Sokon (DFSK) C37 automobile as an example: the evaluation method given in the paper is used to carry out the analysis, and corresponding optimization suggestions are given, which have reference value for designers.

Keywords: analytic hierarchy process, fuzzy comprehensive evaluation method, human-machine interface, matching optimization, software human factor analysis

Procedia PDF Downloads 126
3465 Measure the Gas to Dust Ratio Towards Bright Sources in the Galactic Bulge

Authors: Jun Yang, Norbert Schulz, Claude Canizares

Abstract:

Knowing the dust content of the interstellar matter is necessary to understand the composition and evolution of the interstellar medium (ISM). The metal composition of the ISM enables us to study the cooling and heating processes that dominate the star formation rates in our Galaxy. The Chandra High Energy Transmission Grating (HETG) Spectrometer provides a unique opportunity to measure elemental dust compositions through X-ray edge absorption structure. We measure gas to dust optical depth ratios towards 9 bright Low-Mass X-ray Binaries (LMXBs) in the Galactic Bulge with the highest precision so far. Well-calibrated and pile-up-free optical depths are measured with the HETG spectrometer, relative to the broadband hydrogen-equivalent absorption, in the bright LMXBs 4U 1636-53, Ser X-1, GX 3+1, 4U 1728-34, 4U 1705-44, GX 340+0, GX 13+1, GX 5-1, and GX 349+2. From the optical depth results, we deduce gas to dust ratios for various silicates in the ISM and present our results for the Si K edge along different lines of sight towards the Galactic Bulge.

Keywords: low-mass X-ray binaries, interstellar medium, gas to dust ratio, spectrometer

Procedia PDF Downloads 131
3464 Optimizing Operation of Photovoltaic System Using Neural Network and Fuzzy Logic

Authors: N. Drir, L. Barazane, M. Loudini

Abstract:

It is well known that photovoltaic (PV) cells are an attractive source of energy. Abundant and ubiquitous, this source is one of the important renewable energy sources whose use has been increasing worldwide year by year. However, the V-P characteristic curve of a PV generator has a maximum point, called the maximum power point (MPP), which depends closely on the atmospheric conditions and on the rotation of the earth. These output characteristics are nonlinear and change with variations in temperature and irradiation, so a controller called a maximum power point tracker (MPPT) is needed to extract the maximum power at the terminals of the photovoltaic generator. In this context, the authors propose to model a photovoltaic system and to find an appropriate method for optimizing the operation of the PV generator using two intelligent controllers to track this point: the first based on artificial neural networks and the second on fuzzy logic. After the design and integration of each controller into the global process, their performances are examined and compared through a series of simulations. Both controllers demonstrated good tracking of the MPP compared with the other methods proposed to date.
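
The keywords list perturb-and-observe (P&O), the classical baseline against which such intelligent MPPT controllers are usually compared. A minimal P&O sketch on a toy P-V curve follows; the quadratic curve, the MPP location, and the step size are illustrative assumptions, not the paper's simulation model.

```python
def pv_power(v):
    """Toy P-V characteristic with a single maximum power point at v = 17 V."""
    return 100.0 - (v - 17.0) ** 2

def perturb_and_observe(v0=10.0, step=0.5, iters=100):
    """Classical P&O: keep perturbing the voltage in the same direction while
    power rises, reverse direction when it falls; the operating point ends up
    oscillating around the MPP within one perturbation step."""
    v, direction = v0, 1.0
    p_prev = pv_power(v)
    for _ in range(iters):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:
            direction = -direction
        p_prev = p
    return v

v_final = perturb_and_observe()
```

The steady-state oscillation around the MPP is exactly the drawback that neural-network and fuzzy controllers aim to reduce.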

Keywords: maximum power point tracking, neural networks, photovoltaic, P&O

Procedia PDF Downloads 321
3463 Hybrid Wavelet-Adaptive Neuro-Fuzzy Inference System Model for a Greenhouse Energy Demand Prediction

Authors: Azzedine Hamza, Chouaib Chakour, Messaoud Ramdani

Abstract:

Energy demand prediction plays a crucial role in achieving next-generation power systems for agricultural greenhouses. High prediction quality is therefore required for efficient smart grid management and, in turn, for low-cost energy consumption. The aim of this paper is to investigate the effectiveness of a hybrid data-driven model for day-ahead energy demand prediction. The proposed model consists of the Discrete Wavelet Transform (DWT) and an Adaptive Neuro-Fuzzy Inference System (ANFIS): the DWT is employed to decompose the original signal into a set of subseries, and an ANFIS is then used to generate the forecast for each subseries. The proposed hybrid method (DWT-ANFIS) was evaluated using one week of greenhouse energy demand data and compared with a plain ANFIS. The performance of the different models was evaluated by comparing the corresponding values of the Mean Absolute Percentage Error (MAPE). It was demonstrated that the discrete wavelet transform can improve agricultural greenhouse energy demand modeling.
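
The decompose-forecast-recombine pipeline can be sketched with a one-level Haar transform. In this sketch a naive persistence forecaster stands in for the ANFIS stage, and the demand signal is synthetic; both are purely illustrative assumptions.

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar DWT: split an even-length signal into approximation
    and detail subseries."""
    x = np.asarray(x, dtype=float)
    s = np.sqrt(2.0)
    return (x[0::2] + x[1::2]) / s, (x[0::2] - x[1::2]) / s

def haar_idwt(approx, detail):
    """Inverse one-level Haar DWT (perfect reconstruction)."""
    s = np.sqrt(2.0)
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / s
    x[1::2] = (approx - detail) / s
    return x

def forecast_next(subseries):
    """Stand-in for the per-subseries ANFIS forecaster: persistence model."""
    return subseries[-1]

# Synthetic 'energy demand' signal: decompose, forecast each subseries,
# then recombine the forecasts through the inverse transform.
demand = np.sin(np.linspace(0, 4 * np.pi, 64)) + 2.0
approx, detail = haar_dwt(demand)
next_pair = haar_idwt(np.append(approx, forecast_next(approx)),
                      np.append(detail, forecast_next(detail)))[-2:]
```

Forecasting the smoother approximation and the high-frequency detail separately is what lets the hybrid model outperform a single forecaster on the raw signal.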

Keywords: wavelet transform, ANFIS, energy consumption prediction, greenhouse

Procedia PDF Downloads 68
3462 Rehabilitation of CP Using Pediatric Functional Independent Measure (WeeFIM) as Indicator Instruments Suitable for CP: Saudi's Perspective

Authors: Bara M. Yousef

Abstract:

In the Kingdom of Saudi Arabia (KSA), large numbers of traffic accident victims with severe, moderate, and mild levels of impairment are admitted to Sultan Bin Abdulaziz Humanitarian City. Over a period of 4 months, the city received 111 male and 79 female subjects with CP, who underwent 4-6 weeks of rehabilitation, with the WeeFIM score used to measure rehabilitation outcomes. WeeFIM covers various domains such as self-care, mobility, locomotion, communication, and other psycho-social aspects. Our findings show that nearly 85% of patients across the severe, moderate, and mild levels improved after the rehabilitation program services: the average score rose from 59 out of 128 on the WeeFIM scale at admission to about 72 out of 128 at discharge for the entire study sample. The WeeFIM score provides fair evidence for rehabilitation specialists to assess their outcomes. However, other instruments need to be implemented and compared with WeeFIM in order to reach better outcomes at the discharge level.

Keywords: cerebral palsy (CP), pediatric functional independence measure (WeeFIM), rehabilitation, disability

Procedia PDF Downloads 207
3461 A Multicriteria Model for Sustainable Management in Agriculture

Authors: Basil Manos, Thomas Bournaris, Christina Moulogianni

Abstract:

The European agricultural policy supports all member states in applying agricultural development plans for the development of their agricultural sectors. A specific measure of the agricultural development plans aims to bring young people into the agricultural sector. This measure helps the participating young farmers achieve maximum efficiency, using environmentally friendly methods and practices, by altering their farm plans. This study applies a multicriteria mathematical programming model to find farm plans for young farmers that achieve the maximum gross margin and the minimum environmental impacts (less use of fertilizers and irrigation water). The analysis was carried out in the region of Central Macedonia, Greece, among young farmers who participated in the 'Setting up Young Farmers' measure during 2007-2010. It includes the implementation of the multicriteria model for farm plan optimization and the comparison of selected environmental indicators with those of the existing situation.
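
The trade-off the model resolves, maximum gross margin against minimum fertilizer and irrigation use, can be sketched by scoring candidate farm plans with a weighted compromise function. The plans, criterion values, and weights below are invented for illustration; the actual study solves a multicriteria mathematical programming model rather than comparing a fixed list of plans.

```python
# Hypothetical farm plans: (gross margin EUR/ha, fertilizer kg/ha, irrigation m3/ha).
plans = {
    "existing": (900.0, 220.0, 4500.0),
    "plan A":   (1050.0, 180.0, 4000.0),
    "plan B":   (980.0, 140.0, 3500.0),
}

def compromise_score(plan, w_margin=0.5, w_fert=0.25, w_irr=0.25):
    """Weighted compromise: maximize margin, minimize fertilizer and irrigation.
    Each criterion is normalized by the best value achieved across plans."""
    margins = [p[0] for p in plans.values()]
    ferts   = [p[1] for p in plans.values()]
    irrs    = [p[2] for p in plans.values()]
    m, f, i = plan
    return (w_margin * m / max(margins)
            + w_fert * min(ferts) / f
            + w_irr * min(irrs) / i)

best = max(plans, key=lambda k: compromise_score(plans[k]))
```

Shifting the weights toward the environmental criteria would reproduce the study's comparison between margin-oriented and environment-oriented farm plans.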

Keywords: multicriteria, optimum farm plans, environmental impacts, sustainable management

Procedia PDF Downloads 322
3460 Application of Two Stages Adaptive Neuro-Fuzzy Inference System to Improve Dissolved Gas Analysis Interpretation Techniques

Authors: Kharisma Utomo Mulyodinoto, Suwarno, A. Abu-Siada

Abstract:

Dissolved gas analysis (DGA) is an impressive technique to detect and predict internal faults of transformers using the gases generated in a transformer oil sample. A number of methods are used to interpret the dissolved gases in transformer oil samples: the Doernenberg Ratio Method, the IEC (International Electrotechnical Commission) Ratio Method, and the Duval Triangle Method. While the assessment of dissolved gases within transformer oil samples has been standardized over the past two decades, analysis of the results is not always straightforward, as it depends on personnel expertise more than on mathematical formulas. To overcome this limitation, this paper aims at improving the interpretation of the Doernenberg Ratio Method, the IEC Ratio Method, and the Duval Triangle Method using a two-stage Adaptive Neuro-Fuzzy Inference System (ANFIS). Dissolved gas analysis data from 520 faulty transformers were analyzed to establish the proposed ANFIS model. Results show that the developed ANFIS model is accurate and can standardize the dissolved gas interpretation process with an accuracy higher than 90%.
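
The ratio methods that feed such an ANFIS start from a handful of gas concentration ratios. The sketch below computes the three IEC-style ratios; the ratio definitions follow the methods the abstract cites, but the threshold rules and the sample concentrations are simplified placeholders, not the IEC 60599 code table or the paper's data.

```python
def iec_ratios(gases):
    """Three IEC-style gas ratios from dissolved-gas concentrations (ppm)."""
    return (gases["C2H2"] / gases["C2H4"],
            gases["CH4"] / gases["H2"],
            gases["C2H4"] / gases["C2H6"])

def classify(gases):
    """Toy interpretation with placeholder thresholds (illustrative only)."""
    r1, r2, r3 = iec_ratios(gases)
    if r1 > 1.0:
        return "arcing suspected"
    if r2 < 0.1:
        return "partial discharge suspected"
    if r3 > 1.0:
        return "thermal fault suspected"
    return "normal / inconclusive"

sample = {"H2": 100.0, "CH4": 120.0, "C2H2": 1.0, "C2H4": 50.0, "C2H6": 65.0}
label = classify(sample)
```

Replacing the hard thresholds with membership functions learned from the 520 faulty-transformer records is essentially what the ANFIS stage contributes.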

Keywords: ANFIS, dissolved gas analysis, Doernenberg ratio method, Duval triangular method, IEC ratio method, transformer

Procedia PDF Downloads 133
3459 Analysis of Six Sigma in the Aerospace Industry

Authors: Masimuddin Mohd Khaled

Abstract:

This paper contributes to the discussion of Six Sigma in the aerospace industry. Its main aim is to review the literature on Six Sigma with an emphasis on aerospace applications. The stages of Six Sigma implementation are studied, including how the improvement cycle 'Define, Measure, Analyze, Improve, Control' (DMAIC) and the design cycle 'Define, Measure, Analyze, Design, Verify' (DMADV) are used. The paper also examines how the implementation of Six Sigma at an aerospace company has brought a positive effect to the company.

Keywords: six sigma, DMAIC, DMADV, aerospace

Procedia PDF Downloads 352
3458 Multi-Spectral Medical Image Enhancement Using Weber's Law

Authors: Muna F. Al-Sammaraie

Abstract:

The aim of this research is to present multi-spectral image enhancement methods for digital images that populate only a small portion of the available range of digital values. A quantitative measure of image enhancement is also presented; this measure is related to concepts of Weber's law of the human visual system. For decades, several image enhancement techniques have been proposed; although most of them require numerous advanced and critical steps, the results for the perceived image are often unsatisfactory. This study involves changing the original values so that more of the available range is used, which increases the contrast between features and their backgrounds. It consists of reading the binary image pixel by pixel (byte-wise) and displaying it, calculating the statistics of the image, automatically enhancing the color of the image based on the calculated statistics, and working with the RGB color bands. Finally, the enhanced image is displayed along with its histogram. A number of experimental results illustrate the performance of these algorithms; in particular, the quantitative measure has helped to select the optimal processing parameters and transform.
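
The 'use more of the available range' step described above amounts to a per-band linear contrast stretch followed by a histogram of the result. The sketch below applies it to a synthetic low-contrast RGB array; the image data is invented, and this is only the stretching step, not the paper's full Weber-law-based measure.

```python
import numpy as np

def stretch_band(band):
    """Linearly map a band's [min, max] onto the full 8-bit range [0, 255]."""
    band = band.astype(float)
    lo, hi = band.min(), band.max()
    return ((band - lo) / (hi - lo) * 255.0).astype(np.uint8)

def enhance_rgb(image):
    """Stretch each RGB band independently, as the statistics-based step does."""
    return np.dstack([stretch_band(image[..., c]) for c in range(3)])

# Synthetic low-contrast image occupying only values 100-140.
rng = np.random.default_rng(1)
img = rng.integers(100, 141, size=(32, 32, 3), dtype=np.uint8)
out = enhance_rgb(img)
hist, _ = np.histogram(out, bins=256, range=(0, 256))  # enhanced-image histogram
```

After stretching, the histogram spans the whole 0-255 range instead of the narrow original band, which is exactly the contrast gain the enhancement targets.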

Keywords: image enhancement, multi-spectral, RGB, histogram

Procedia PDF Downloads 311