Search results for: weighted permutation entropy (WPE)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 891

711 Cold Spray High Entropy Alloy Coating Surface Microstructural Characterization and Mechanical Testing

Authors: Raffaella Sesana, Nazanin Sheibanian, Luca Corsaro, Sedat Özbilen, Rocco Lupoi, Francesco Artusio

Abstract:

High Entropy Alloy (HEA) coatings of Al0.1-0.5CoCrCuFeNi and MnCoCrCuFeNi on Mg substrates were prepared from mechanically alloyed HEA powder feedstocks at three different Cold Spray (CS) process gas (N2) temperatures (650, 750 and 850°C). Mechanically alloyed and cold-sprayed HEA coatings were characterized by macro photography, OM, SEM+EDS study, micro-hardness testing, roughness, and porosity measurements. As a result of mechanical alloying (MA), harder particles are deformed and fractured. The particles in the Cu-rich region were coarser and more globular than those in the A1 phase, which is relatively soft and ductile. In addition to the A1 particles, there were some separate Cu-rich regions. Due to its brittle nature and acicular shape, the Mn-HEA powder exhibited a different trend, with smaller particle sizes. It is observed that MA results in a loose structure characterized by many gaps, cracks, signs of plastic deformation, and small particles attached to the surfaces of the particles. Considering the experimental results obtained, it is not possible to conclude that the chemical composition of the high entropy alloy influences the roughness of the coating. It has been observed that the deposited volume increases with temperature only in the case of the Al0.1 and Mn-based HEA, while for the rest of the Al-based HEA there are no noticeable changes. There is a direct correlation between micro-hardness and the chemical composition of a coating: the micro-hardness of a coating increases as the percentage of aluminum in the sample increases. Compared to the substrate, the coating has a much higher hardness, and the hardness measured at the interface is intermediate.

Keywords: characterisation, cold spraying, HEA coatings, SEM+EDS

Procedia PDF Downloads 33
710 The Bayesian Premium Under Entropy Loss

Authors: Farouk Metiri, Halim Zeghdoudi, Mohamed Riad Remita

Abstract:

Credibility theory is an experience rating technique in actuarial science which can be seen as one of the quantitative tools that allow insurers to perform experience rating, that is, to adjust future premiums based on past experience. It is usually used in automobile insurance, workers' compensation premiums, and IBNR (claims incurred but not reported to the insurer), where credibility theory can be used to estimate the claim size amount. In this study, we focus on a popular tool in credibility theory, the Bayesian premium estimator, considering the Lindley distribution as the claim distribution. We derive this estimator under the entropy loss, which is asymmetric, and under the squared error loss, which is a symmetric loss function, with informative and non-informative priors. In a purely Bayesian setting, the prior distribution represents the insurer's prior belief about the insured's risk level after collection of the insured's data at the end of the period. However, the explicit form of the Bayesian premium when the prior is not a member of the exponential family can be quite difficult to obtain, as it involves a number of integrations which are not analytically solvable. The paper finds a solution to this problem by deriving the estimator using a numerical approximation (the Lindley approximation), which is one of the suitable approximation methods for such problems; it approaches the ratio of the integrals as a whole and produces a single numerical result. A simulation study using the Monte Carlo method is then performed to evaluate this estimator, and a mean squared error comparison is made between the Bayesian premium estimators under the above loss functions.
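
As a rough illustration of how the choice of loss function changes the Bayesian premium, the sketch below computes the premium from a set of posterior samples: under squared error loss the Bayes rule is the posterior mean, while under the entropy loss L(theta, d) = d/theta - ln(d/theta) - 1 it is the reciprocal of the posterior mean of 1/theta. The Gamma posterior and all numbers are hypothetical stand-ins; the paper itself works with the Lindley distribution and the Lindley approximation rather than sampling.

import numpy as np

# Hypothetical posterior sample of the risk parameter theta (e.g. obtained
# by MCMC or by a Lindley-type approximation as in the paper); here we just
# draw from a Gamma posterior for illustration.
rng = np.random.default_rng(0)
theta_post = rng.gamma(shape=3.0, scale=2.0, size=100_000)

# Bayesian premium under squared error loss: posterior mean.
premium_se = theta_post.mean()

# Bayesian premium under the entropy loss L(theta, d) = d/theta - ln(d/theta) - 1:
# the Bayes rule is the inverse of the posterior mean of 1/theta.
premium_entropy = 1.0 / np.mean(1.0 / theta_post)

print(f"premium (squared error loss): {premium_se:.3f}")
print(f"premium (entropy loss):       {premium_entropy:.3f}")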

Keywords: Bayesian estimator, credibility theory, entropy loss, Monte Carlo simulation

Procedia PDF Downloads 298
709 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem

Authors: Ouafa Amira, Jiangshe Zhang

Abstract:

Clustering is an unsupervised machine learning technique; its aim is to extract the data structure, in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means (FCM) clustering is one of the most well-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where the dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function which takes values in the interval [0, 1]. In FCM clustering, the membership degree is constrained by the condition that the sum of a data object's memberships over all clusters must equal one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization has been incorporated into the fuzzy c-means technique; it introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where the optimization problem aims to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because an appropriate membership degree leads to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy.
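
For orientation, the following minimal sketch shows one common entropy-regularized fuzzy c-means iteration, in which the membership update becomes a softmax of negative squared distances and the centers are membership-weighted means. The parameter gamma and the synthetic data are illustrative, and the exact relative entropy formulation of the paper may differ from this variant.

import numpy as np

def entropy_regularized_fcm(X, n_clusters, gamma=1.0, n_iter=100, seed=0):
    """Entropy-regularized fuzzy c-means (illustrative sketch).

    Memberships follow a softmax of negative squared distances, which is the
    closed-form minimizer when an entropy regularization term is added to the
    usual within-cluster dissimilarity objective.
    """
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        # Squared Euclidean distances between every point and every center.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        # Membership update: softmax over clusters (rows sum to one).
        u = np.exp(-d2 / gamma)
        u /= u.sum(axis=1, keepdims=True)
        # Center update: membership-weighted means.
        centers = (u.T @ X) / u.sum(axis=0)[:, None]
    return u, centers

# Toy example with two well-separated Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
u, centers = entropy_regularized_fcm(X, n_clusters=2, gamma=0.5)
print(np.round(centers, 2))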

Keywords: clustering, fuzzy c-means, regularization, relative entropy

Procedia PDF Downloads 240
708 An EWMA P-Chart Based on Improved Square Root Transformation

Authors: Saowanit Sukparungsee

Abstract:

Generally, the traditional Shewhart p chart has been developed for charting binomial data. This chart was developed using the normal approximation under the conditions of a low defect level and a small to moderate sample size. Real applications, however, often depart from these assumptions due to skewness in the exact distribution. In this paper, a modified Exponentially Weighted Moving Average (EWMA) control chart for detecting a change in binomial data is proposed by improving the square root transformation, namely the ISRT p EWMA control chart. The numerical results show that the ISRT p EWMA chart is superior to the ISRT p chart for small to moderate shifts, whereas the latter is better for large shifts.
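
A hedged sketch of the charting idea follows: binomial counts are variance-stabilized by a square root transformation and then monitored with an EWMA recursion. Since the abstract does not reproduce the improved square root transformation (ISRT), the classical sqrt(x + 3/8) transform is used here purely as a stand-in, and the control-limit constants are illustrative.

import numpy as np

def ewma_chart(counts, n, lam=0.1, L=3.0, p0=0.05):
    """EWMA chart on square-root-transformed binomial counts (sketch).

    The transformation sqrt(x + 3/8) is used here as a generic variance-
    stabilizing stand-in for the paper's improved square root transformation
    (ISRT), whose exact form is not reproduced in the abstract.
    """
    y = np.sqrt(np.asarray(counts, dtype=float) + 3.0 / 8.0)
    mu0 = np.sqrt(n * p0 + 3.0 / 8.0)       # approximate in-control mean
    sigma0 = 0.5                            # approximate in-control std of sqrt(X)
    z = np.empty_like(y)
    z_prev = mu0
    for t, yt in enumerate(y):
        z_prev = lam * yt + (1.0 - lam) * z_prev   # EWMA recursion
        z[t] = z_prev
    # Asymptotic EWMA control limits.
    half_width = L * sigma0 * np.sqrt(lam / (2.0 - lam))
    return z, mu0 - half_width, mu0 + half_width

rng = np.random.default_rng(2)
counts = rng.binomial(n=200, p=0.05, size=30)            # in-control data
counts = np.append(counts, rng.binomial(200, 0.09, 10))  # shifted data
z, lcl, ucl = ewma_chart(counts, n=200)
print("signals at samples:", np.where((z < lcl) | (z > ucl))[0])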

Keywords: number of defects, exponentially weighted moving average, average run length, square root transformations

Procedia PDF Downloads 405
707 Influence of Sintering Temperature on Microhardness and Tribological Properties of Equi-Atomic Ti-Al-Mo-Si-W Multicomponent Alloy

Authors: Rudolf L. Kanyane, Nicolaus Malatji, Patritia A. Popoola

Abstract:

Tribological failure of materials during application can lead to catastrophic events which also carry economic penalties. High entropy alloys (HEAs) have shown outstanding tribological properties in applications such as mechanical parts where moving parts under high friction are required. This work aims to investigate the effect of sintering temperature on the microhardness and tribological properties of novel equiatomic TiAlMoSiW HEAs fabricated via spark plasma sintering. The effect of spark plasma sintering temperature on morphological evolution and phase formation was also investigated. The microstructure and the phases formed for the developed HEAs were examined using scanning electron microscopy (SEM) and X-ray diffractometry (XRD), respectively. The microhardness and tribological properties were studied using a diamond-base microhardness tester and an Rtec tribometer. The developed HEAs showed improved mechanical properties as the sintering temperature increased.

Keywords: sintering, high entropy alloy, microhardness, tribology

Procedia PDF Downloads 106
706 Study on Hydrogen Isotope Permeability of High Entropy Alloy Coating

Authors: Long Wang, Yongjin Feng, Xiaofang Luo

Abstract:

Tritium permeation through structural materials is a significant issue for fusion demonstration (DEMO) reactor blankets in terms of fuel cycle efficiency and radiological safety. The reduced activation ferritic/martensitic (RAFM) steel CLF-1 is a prime candidate for China's CFETR blanket structural material, but it exhibits high permeability to hydrogen isotopes at reactor operating temperatures. To confine tritium as much as possible in the reactor, surface modification of the steels, including fabrication of a tritium permeation barrier (TPB), attracts much attention. As a new alloy system, a high entropy alloy (HEA) contains at least five principal elements, each of which ranges from 5 at% to 35 at%. This high mixing effect endows HEAs with extraordinary comprehensive performance, so it is attractive to introduce HEAs into surface alloying for protective use. At present, studies on the hydrogen isotope permeability of HEA coatings are still insufficient and the corresponding mechanism is not clear. In our study, we prepared three kinds of HEA coatings: AlCrTaTiZr, (AlCrTaTiZr)N and (AlCrTaTiZr)O. After comprehensive characterization by SEM, XPS, AFM, XRD and TEM, the structure and composition of the HEA coatings were obtained. Deuterium permeation tests were conducted to evaluate the hydrogen isotope permeability of the AlCrTaTiZr, (AlCrTaTiZr)N and (AlCrTaTiZr)O HEA coatings. Results showed that the (AlCrTaTiZr)N and (AlCrTaTiZr)O HEA coatings had better hydrogen isotope permeation resistance. By analyzing and characterizing the hydrogen isotope permeation results of the corroded samples, an internal link between the hydrogen isotope permeation behavior and the structure of the HEA coatings was established. The results provide a valuable reference for the engineering design of structural and TPB materials for future fusion devices.

Keywords: high entropy alloy, hydrogen isotope permeability, tritium permeation barrier, fusion demonstration reactor

Procedia PDF Downloads 142
705 Statistical Convergence of the Szasz-Mirakjan-Kantorovich-Type Operators

Authors: Rishikesh Yadav, Ramakanta Meher, Vishnu Narayan Mishra

Abstract:

The main aim of this article is to investigate the statistical convergence of a summation-integral type of operators and to obtain weighted statistical convergence. The rate of statistical convergence by means of the modulus of continuity and for functions belonging to the Lipschitz class is also studied. We discuss the convergence of the defined operators through graphical representation and obtain a better rate of convergence than the Szasz-Mirakjan-Kantorovich operators. In the last section, we extend the operators to bivariate operators and study the rate of convergence in the sense of the modulus of continuity and by means of the Lipschitz class for functions of two variables.
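
For reference, the classical Szasz-Mirakjan-Kantorovich operator, of which the operators studied in the paper are a summation-integral variant, can be written (for f integrable on [0, infinity)) as

K_n(f; x) = n e^{-nx} \sum_{k=0}^{\infty} \frac{(nx)^{k}}{k!} \int_{k/n}^{(k+1)/n} f(t)\, dt, \qquad x \ge 0, \; n \in \mathbb{N}.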

Keywords: Szasz-Mirakjan-Kantorovich operators, statistical convergence, modulus of continuity, Peetre K-functional, weighted modulus of continuity

Procedia PDF Downloads 168
704 Hybrid Artificial Bee Colony and Least Squares Method for Rule-Based Systems Learning

Authors: Ahcene Habbi, Yassine Boudouaoui

Abstract:

This paper deals with the problem of automatic rule generation for fuzzy systems design. The proposed approach is based on hybrid artificial bee colony (ABC) optimization and the weighted least squares (LS) method, and aims to find the structure and parameters of fuzzy systems simultaneously. More precisely, two ABC-based fuzzy modeling strategies are presented and compared. The first strategy uses global optimization to learn fuzzy models; the second hybridizes ABC and the weighted least squares estimation method. The performances of the proposed ABC and ABC-LS fuzzy modeling strategies are evaluated on complex modeling problems and compared to other advanced modeling methods.

Keywords: automatic design, learning, fuzzy rules, hybrid, swarm optimization

Procedia PDF Downloads 409
703 Weighted-Distance Sliding Windows and Cooccurrence Graphs for Supporting Entity-Relationship Discovery in Unstructured Text

Authors: Paolo Fantozzi, Luigi Laura, Umberto Nanni

Abstract:

The problem of entity relation discovery in unstructured data, a well-covered topic in the literature, consists in searching within unstructured sources (typically, text) in order to find connections among entities. These can be a whole dictionary, or a specific collection of named items. In many cases machine learning and/or text mining techniques are used for this goal. These approaches might be unfeasible in computationally challenging problems, such as processing massive data streams. A faster approach consists in collecting the cooccurrences of any two words (entities) in order to create a graph of relations - a cooccurrence graph. Indeed, each cooccurrence highlights some degree of semantic correlation between the words, because it is more common to have related words close to each other than at opposite ends of the text. Some authors have used sliding windows for this problem: they count all the cooccurrences within a sliding window running over the whole text. In this paper we generalise this technique, arriving at a Weighted-Distance Sliding Window, where each occurrence of two named items within the window is counted with a weight depending on the distance between the items: a closer distance implies stronger evidence of a relationship. We develop an experiment in order to support this intuition, by applying this technique to a data set consisting of the text of the Bible, split into verses.
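
The following minimal sketch illustrates the weighted-distance counting idea on a toy token sequence: every pair of distinct tokens within the window contributes a weight that decays with their distance (here simply 1/distance, an illustrative choice rather than the paper's exact weight function).

from collections import defaultdict

def weighted_cooccurrence_graph(tokens, window=5):
    """Build a weighted cooccurrence graph with distance-dependent weights.

    Every pair of tokens appearing within `window` positions of each other
    contributes 1/distance to the edge weight, so closer pairs give stronger
    evidence of a relationship.
    """
    graph = defaultdict(float)
    for i, w1 in enumerate(tokens):
        for j in range(i + 1, min(i + window + 1, len(tokens))):
            w2 = tokens[j]
            if w1 == w2:
                continue
            edge = tuple(sorted((w1, w2)))
            graph[edge] += 1.0 / (j - i)   # closer distance -> larger weight
    return graph

text = "in the beginning god created the heaven and the earth"
g = weighted_cooccurrence_graph(text.split(), window=4)
for edge, weight in sorted(g.items(), key=lambda kv: -kv[1])[:5]:
    print(edge, round(weight, 2))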

Keywords: cooccurrence graph, entity relation graph, unstructured text, weighted distance

Procedia PDF Downloads 115
702 Active Thermography Technique for High-Entropy Alloy Characterization Deposited with Cold Spray Technique

Authors: Nazanin Sheibanian, Raffaella Sesana, Sedat Ozbilen

Abstract:

In recent years, high-entropy alloys (HEAs) have attracted considerable attention due to their unique properties and potential applications. In this study, novel HEA coatings were prepared on Mg substrates using mechanically alloyed HEA powder feedstocks based on Al_(0.1-0.5)CoCrCuFeNi and MnCoCrCuFeNi multi-material systems. The coatings were deposited by the Cold Spray (CS) process using three different process gas (N2) temperatures (650°C, 750°C, and 850°C) to examine the effect of gas temperature on coating properties. In this study, infrared thermography (non-destructive) was examined as a possible quality control technique for HEA coatings applied to magnesium substrates. Active thermography was employed to characterize coating properties using the thermal response of the coating. Various HEA chemical compositions and deposition temperatures have been investigated. As part of this study, a comprehensive macro- and microstructural analysis of the Cold Spray (CS) HEA coatings has been conducted using macrophotography, optical microscopy, scanning electron microscopy with energy-dispersive X-ray spectroscopy (SEM+EDS), X-ray diffraction (XRD), X-ray photoelectron spectroscopy (XPS), microhardness tests, roughness measurements, and porosity assessments. These analyses provided insight into phase identification, microstructure characterization, deposition and particle deformation behavior, and bonding mechanisms, and helped identify a possible relationship between physical properties and thermal responses. The results show that the Maximum Relative Radiance (∆RMax) of each sample differs depending on both the chemical composition of the HEA and the temperature at which Cold Spray is applied.

Keywords: active thermography, coating, cold spray, high-entropy alloy, material characterization

Procedia PDF Downloads 44
701 Breast Cancer Survivability Prediction via Classifier Ensemble

Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia

Abstract:

This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups. After that, it tries to find the most important features among the four groups that maximize the weighted average F-score of a certain classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER through the feature selection module. On top of them, another classifier is used to give the final decision based on the output decisions and confidence scores from each of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all published systems to date when evaluated against the exact same SEER data (period of 1973-2002). It gives an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period of 1973-2014), the overall weighted average F-score jumps to 92.4% on the held-out unseen test set.
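
A rough scikit-learn sketch of the "classifier on top of classifiers" design is shown below on synthetic data. The SEER feature-group selection step is omitted, and since scikit-learn offers no Bayesian network classifier, only a decision tree and naive Bayes are used as base learners with naive Bayes as the combiner; all names and numbers are illustrative.

from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for SEER features (the real study uses four SEER feature
# groups selected to maximize weighted-average F-score; that step is omitted).
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Base learners feed their probabilistic outputs to a Naive Bayes combiner,
# mirroring the paper's "classifier on top of classifiers" design.
ensemble = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(max_depth=6, random_state=0)),
                ("naive_bayes", GaussianNB())],
    final_estimator=GaussianNB(),
    stack_method="predict_proba",
)
ensemble.fit(X_tr, y_tr)
print(f"held-out accuracy: {ensemble.score(X_te, y_te):.3f}")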

Keywords: classifier ensemble, breast cancer survivability, data mining, SEER

Procedia PDF Downloads 293
700 Using Scale Invariant Feature Transform Features to Recognize Characters in Natural Scene Images

Authors: Belaynesh Chekol, Numan Çelebi

Abstract:

The main purpose of this work is to recognize individual characters extracted from natural scene images using scale invariant feature transform (SIFT) features as input to a K-nearest neighbor (KNN) classification learner. For this task, 1,068 and 78 images of English alphabet characters taken from the Chars74k data set are used to train and test the classifier, respectively. For each character image, we generated descriptive features using the SIFT algorithm. This set of features is fed to the learner so that it can recognize and label new images of English characters. Two types of KNN (fine KNN and weighted KNN) were trained, and the resulting classification accuracies are 56.9% and 56.5%, respectively. The training time taken was the same for both fine and weighted KNN.
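
A simplified sketch of the pipeline is given below: each image's SIFT descriptors are mean-pooled into a single 128-dimensional vector (one common simplification; the paper does not state its aggregation), which is then classified with a fine (1-nearest-neighbor) and a distance-weighted KNN. Random arrays stand in for real descriptors so the snippet runs without the Chars74k images; the OpenCV call that would produce real descriptors is noted in a comment.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def image_to_feature(descriptors):
    """Mean-pool an image's SIFT descriptors into one 128-d vector.

    With real images the descriptors would come from, e.g.:
        sift = cv2.SIFT_create()
        _, descriptors = sift.detectAndCompute(gray_image, None)
    Random arrays stand in for them here so the sketch is self-contained.
    """
    return descriptors.mean(axis=0)

# Fake "descriptor sets" for 26 character classes, 40 samples each.
X, y = [], []
for label in range(26):
    for _ in range(40):
        descs = rng.normal(loc=label, scale=5.0, size=(30, 128))
        X.append(image_to_feature(descs))
        y.append(label)
X, y = np.array(X), np.array(y)

fine_knn = KNeighborsClassifier(n_neighbors=1)                        # "fine" KNN
weighted_knn = KNeighborsClassifier(n_neighbors=10, weights="distance")
for name, clf in [("fine", fine_knn), ("weighted", weighted_knn)]:
    clf.fit(X[::2], y[::2])                  # even-indexed samples for training
    acc = clf.score(X[1::2], y[1::2])        # odd-indexed samples for testing
    print(f"{name} KNN accuracy: {acc:.3f}")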

Keywords: character recognition, KNN, natural scene image, SIFT

Procedia PDF Downloads 253
699 Understanding the Processwise Entropy Framework in a Heat-powered Cooling Cycle

Authors: P. R. Chauhan, S. K. Tyagi

Abstract:

Adsorption refrigeration technology offers a sustainable and energy-efficient cooling alternative to traditional refrigeration technologies for meeting fast-growing cooling demands. With its ability to utilize natural refrigerants, low-grade heat sources, and modular configurations, it has the potential to revolutionize the cooling industry. Despite these benefits, the commercial viability of this technology is hampered by several fundamental limiting constraints, including its large size, low uptake capacity, and poor performance as a result of deficient heat and mass transfer characteristics. The primary causes of these deficient heat and mass transfer characteristics and the magnitude of exergy loss in the various real processes of an adsorption cooling system can be assessed by entropy generation rate analysis, i.e., the second law of thermodynamics. Therefore, this article presents a second-law-based investigation in terms of the entropy generation rate (EGR) to identify the energy losses in the various processes of the HPCC-based adsorption system using MATLAB R2021b software. The adsorption-based cooling system consists of two beds made up of silica gel and arranged in a single stage, while water is employed as the refrigerant, coolant, and hot fluid. The variation in process-wise EGR is examined over the cycle time, and a comparative analysis is also presented. Moreover, the EGR is also evaluated in the external units, such as the heat source and heat sink units used for regeneration and heat rejection, respectively. The research findings revealed that the combination of the adsorber and desorber, which operates across heat reservoirs with a higher temperature gradient, shares more than half of the total EGR. Moreover, the EGR caused by the heat transfer process is found to be the highest, followed by the heat sink, heat source, and mass transfer, respectively. In the case of the heat transfer process, the operation of the valve is found to be responsible for more than half (54.9%) of the overall EGR during heat transfer. The combined contribution of the external units, such as the source (18.03%) and sink (21.55%), to the total EGR is 35.59%. The analysis and findings of the present research are expected to pinpoint the sources of energy waste in HPCC-based adsorption cooling systems.
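
As a minimal illustration of the bookkeeping behind a process-wise EGR analysis, the entropy generated when heat Q flows across a finite temperature gap between two reservoirs is Q(1/Tc - 1/Th); the numbers below are purely illustrative and the paper's bed, valve, and mass-transfer models are far more detailed.

def entropy_generation_rate(q_watts, t_hot_k, t_cold_k):
    """Entropy generation rate (W/K) for heat q flowing from t_hot to t_cold."""
    return q_watts * (1.0 / t_cold_k - 1.0 / t_hot_k)

# Illustrative numbers only (not taken from the paper): 1 kW crossing the gap
# between a 363 K heat source and a 303 K heat sink.
print(entropy_generation_rate(1000.0, t_hot_k=363.0, t_cold_k=303.0))  # ~0.55 W/K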

Keywords: adsorption cooling cycle, heat transfer, mass transfer, entropy generation, silica gel-water

Procedia PDF Downloads 76
698 Cognitive Weighted Polymorphism Factor: A New Cognitive Complexity Metric

Authors: T. Francis Thamburaj, A. Aloysius

Abstract:

Polymorphism is one of the main pillars of the object-oriented paradigm. It induces hidden forms of class dependencies which may impact software quality, resulting in a higher cost factor for comprehending, debugging, testing, and maintaining the software. In this paper, a new cognitive complexity metric called Cognitive Weighted Polymorphism Factor (CWPF) is proposed. Apart from the structural complexity of the software, it includes cognitive complexity on the basis of type. The cognitive weights are calibrated based on 27 empirical studies with 120 persons. A case study and experimentation with the new software metric show positive results. Further, a comparative study is made, and the correlation test shows that the CWPF complexity metric is a better, more comprehensive, and more realistic indicator of software complexity than Abreu's Polymorphism Factor (PF) complexity metric.

Keywords: cognitive complexity metric, object-oriented metrics, polymorphism factor, software metrics

Procedia PDF Downloads 411
697 Utilizing Waste Heat from Thermal Power Plants to Generate Power by Modelling an Atmospheric Vortex Engine

Authors: Mohammed Nabeel Khan, C. Perisamy

Abstract:

Convective vortices are natural features of the atmosphere that ingest lower-entropy energy at higher temperatures than the temperature at which they reject higher-entropy energy to space. On the basis of thermodynamic efficiency, it has been predicted that the intensity of convective vortices depends on the depth of the convective layer. The atmospheric vortex engine is proposed as a device for producing mechanical energy by means of an artificially generated vortex. The operation of the engine is based on the fact that the atmosphere is heated from the bottom and cooled from the top. By generating an artificial vortex, it is intended to eliminate the physical solar updraft tower and reduce the capital cost of solar chimney power plants. The study presents the fundamentals of the atmospheric vortex engine and reviews the state of the art on the subject. Moreover, the study discusses the idea of using solar energy as the heat source to operate the system. Overall, the system is feasible and promising for electrical power production.

Keywords: AVE, atmospheric vortex engine, atmosphere, updraft, vortex

Procedia PDF Downloads 127
696 The Cardiac Diagnostic Prediction Applied to a Designed Holter

Authors: Leonardo Juan Ramírez López, Javier Oswaldo Rodriguez Velasquez

Abstract:

We have designed a Holter that measures the heart's activity for over 24 hours, implemented a prediction methodology, and generated alarms as well as indicators for patients and treating physicians. Various diagnostic advances have been developed in clinical cardiology thanks to Holter implementation; however, their interpretation has largely been conditioned by clinical analysis and measurements adjusted to diverse population characteristics, thus turning it into a subjective examination. This, in turn, requires vast population studies for validation, which have not achieved the ultimate goal: mortality prediction. Given this context, our Insight Research Group developed a mathematical methodology that assesses cardiac dynamics through entropy and probability, creating a numerical and geometrical attractor which allows quantifying the normalcy of chronic and acute disease as well as the evolution between such states, and our Tigum Research Group developed a Holter device with 12 channels and advanced computer software. This has been shown in different contexts with 100% sensitivity and specificity results.

Keywords: attractor, cardiac, entropy, holter, mathematical, prediction

Procedia PDF Downloads 136
695 Spatio-temporal Distribution of the Groundwater Quality in the El Milia Plain, Kebir Rhumel Basin, Algeria

Authors: Lazhar Belkhiri, Ammar Tiri, Lotfi Mouni

Abstract:

In this research, we analyzed the groundwater quality index in the El Milia plain, Kebir Rhumel Basin, Algeria. Thirty-three groundwater samples were collected from wells in the El Milia plain during April 2015. In this study, pH and electrical conductivity (EC) were measured at each sampling well. Eight hydrochemical parameters, namely calcium (Ca), magnesium (Mg), sodium (Na), potassium (K), chloride (Cl), sulfate (SO4), bicarbonate (HCO3), and nitrate (NO3), were analysed. The entropy water quality index (EWQI) method was employed to evaluate the groundwater quality in the study area. Moran's I and the ordinary kriging (OK) interpolation technique were used to examine the spatial distribution pattern of the hydrochemical parameters in the groundwater. It was found that the hydrochemical parameters Ca, Cl, and HCO3 showed strong spatial autocorrelation in the El Milia plain, indicating spatial dependence and clustering of these parameters in the groundwater. The results showed that approximately 86% of the total groundwater samples in the study area fall within the moderate groundwater quality category. The spatial map of the EWQI values indicated an increasing trend from the south-west to the north-east, following the direction of groundwater flow. The highest EWQI values were observed near El Milia city in the center of the plain. This spatial pattern suggests variations in groundwater quality across the study area, with potentially higher risks near the city center. Therefore, the results obtained in this research provide very useful information to decision-makers.
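
A hedged sketch of one common form of the entropy water quality index is shown below: information-entropy weights are derived from the sample-parameter matrix and combined with quality ratings q_j = 100·C_j/S_j relative to permissible limits. The data, limits, and normalization details are illustrative and may differ from the exact EWQI formulation used in the paper.

import numpy as np

def entropy_weights(data):
    """Information-entropy weights for each hydrochemical parameter (columns)."""
    # Min-max normalize each parameter, then form a probability-like ratio.
    norm = (data - data.min(axis=0)) / (data.max(axis=0) - data.min(axis=0) + 1e-12)
    p = (norm + 1e-12) / (norm + 1e-12).sum(axis=0)
    e = -(p * np.log(p)).sum(axis=0) / np.log(len(data))   # entropy per parameter
    return (1.0 - e) / (1.0 - e).sum()

def ewqi(data, standards):
    """Entropy water quality index: weighted sum of quality ratings q_j = 100*C_j/S_j."""
    w = entropy_weights(data)
    q = 100.0 * data / standards
    return q @ w

# Toy sample-by-parameter matrix (columns e.g. Ca, Mg, Na, Cl) and permissible limits.
rng = np.random.default_rng(3)
samples = rng.uniform([20, 5, 10, 50], [200, 80, 150, 400], size=(33, 4))
limits = np.array([200.0, 150.0, 200.0, 250.0])
print(np.round(ewqi(samples, limits)[:5], 1))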

Keywords: entropy water quality index (EWQI), Moran's I, ordinary kriging interpolation, El Milia plain

Procedia PDF Downloads 22
694 Extended Constraint Mask Based One-Bit Transform for Low-Complexity Fast Motion Estimation

Authors: Oğuzhan Urhan

Abstract:

In this paper, an improved motion estimation (ME) approach based on the weighted constrained one-bit transform is proposed for block-based ME employed in video encoders. Binary ME approaches utilize a low bit-depth representation of the original image frames with a Boolean exclusive-OR based, hardware-efficient matching criterion to decrease the computational burden of the ME stage. The weighted constrained one-bit transform (WC-1BT) based approach improves the performance of conventional C-1BT based ME by employing a 2-bit depth constraint mask instead of a 1-bit depth mask. In this work, the range of the constraint mask is further extended to increase the ME performance of the WC-1BT approach. Experiments reveal that the proposed method provides better ME accuracy compared to existing similar ME methods in the literature.

Keywords: fast motion estimation, low-complexity motion estimation, video coding

Procedia PDF Downloads 292
693 The Microstructure and Corrosion Behavior of High Entropy Metallic Layers Electrodeposited by Low and High-Temperature Methods

Authors: Zbigniew Szklarz, Aldona Garbacz-Klempka, Magdalena Bisztyga-Szklarz

Abstract:

Typical metallic alloys are based on one major alloying component, where the addition of other elements is intended to improve or modify certain properties, most of all the mechanical properties. However, in 1995 a new concept of metallic alloys was described and defined. High Entropy Alloys (HEA) contain at least five alloying elements, each in an amount from 5 to 20 at.%. A common feature of this type of alloy is the absence of intermetallic phases, high homogeneity of the microstructure, and a unique chemical composition, which leads to materials with very high strength indicators, stable structures (also at high temperatures), and excellent corrosion resistance. Hence, HEA can successfully be used as substitutes for typical metallic alloys in various applications where sufficiently high properties are desirable. For fabricating HEA, a few routes are applied: 1/ from the liquid phase, i.e., casting (usually arc melting); 2/ from the solid phase, i.e., powder metallurgy (sintering methods preceded by mechanical synthesis); 3/ from the gas phase, e.g., sputtering; or 4/ other deposition methods like electrodeposition from liquids. Different production methods create different microstructures of HEA, which can entail differences in their properties. The last two methods also allow coatings with HEA structures to be obtained, hereinafter referred to as High Entropy Films (HEF). With reference to the above, the crucial aim of this work was the optimization of the manufacturing process of multi-component metallic layers (HEF) by low- and high-temperature electrochemical deposition (ED). The low-temperature deposition process was carried out at ambient or elevated temperature (up to 100 ᵒC) in an organic electrolyte. The high-temperature electrodeposition (several hundred degrees Celsius), in turn, allowed the HEF layer to be formed by electrochemical reduction of metals from molten salts. The basic chemical composition of the coatings was CoCrFeMnNi (known as the Cantor alloy); however, it was modified with other selected elements like Al or Cu. The optimization of the parameters that allow as homogeneous and equimolar a composition of the HEF as possible is the main result of the presented studies. In order to analyse and compare the microstructure, SEM/EBSD, TEM and XRD techniques were employed. Moreover, the determination of the corrosion resistance of the CoCrFeMnNi(Cu or Al) layers in selected electrolytes (i.e., organic and non-organic liquids) was no less important than the above-mentioned objectives.

Keywords: high entropy alloys, electrodeposition, corrosion behavior, microstructure

Procedia PDF Downloads 53
692 Entropy Generation Analysis Due to the Steady Natural Convection of Newtonian Fluid in a Square Enclosure

Authors: T. T. Naas, Y. Lasbet, C. Kezrane

Abstract:

Thermal control in many systems is widely accomplished by applying a mixed convection process due to its low cost, reliability, and easy maintenance. Typical applications include aircraft electronic equipment, rotating-disc heat exchangers, turbomachinery, and nuclear reactors. Natural convection in an inclined square enclosure heated via a wall heater has been studied numerically. The finite volume method is used for solving the momentum and energy equations in the stream function-vorticity form. The right and left walls are kept at a constant temperature, while the other parts are adiabatic. The range of the inclination angle covers a whole revolution. The method is validated for a vertical cavity. A general power law dependence of the Nusselt number on the Rayleigh number, with the coefficient and exponent as functions of the inclination angle, is presented. For a fixed Rayleigh number, whether the Nusselt number increases or decreases with the inclination angle is determined.

Keywords: natural convection in enclosure, inclined enclosure, Nusselt number, entropy generation analysis

Procedia PDF Downloads 221
691 DCDNet: Lightweight Document Corner Detection Network Based on Attention Mechanism

Authors: Kun Xu, Yuan Xu, Jia Qiao

Abstract:

Document detection plays an important role in optical character recognition and text analysis. Because traditional detection methods have weak generalization ability, and deep neural networks have complex structures and large numbers of parameters that cannot be applied well on mobile devices, this paper proposes a lightweight Document Corner Detection Network (DCDNet). DCDNet is a two-stage architecture. The first stage, with an Encoder-Decoder structure, adopts depthwise separable convolution to greatly reduce the network parameters. After introducing the Feature Attention Union (FAU) module, the second stage enhances the feature information of the spatial and channel dimensions and adaptively adjusts the size of the receptive field to enhance the feature expression ability of the model. Aiming at solving the problem of the large difference in the pixel distribution between corner and non-corner regions, a Weighted Binary Cross Entropy Loss (WBCE Loss) is proposed that defines corner detection as a classification problem to make the training process more efficient. In order to make up for the lack of datasets for document corner detection, a dataset containing 6,620 images, named the Document Corner Detection Dataset (DCDD), is made. Experimental results show that the proposed method can obtain fast, stable and accurate detection results on DCDD.
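
A minimal NumPy sketch of a weighted binary cross entropy of this kind is given below; the single positive-class weight used to up-weight the rare corner pixels is an illustrative choice, since the abstract does not give the paper's exact weighting scheme.

import numpy as np

def weighted_bce(y_true, y_pred, pos_weight, eps=1e-7):
    """Weighted binary cross entropy for a corner-probability map.

    `pos_weight` up-weights the rare corner (positive) pixels so that the
    overwhelming number of non-corner pixels does not dominate the loss.
    """
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    loss = -(pos_weight * y_true * np.log(y_pred)
             + (1.0 - y_true) * np.log(1.0 - y_pred))
    return loss.mean()

# Toy 8x8 map with 4 corner pixels out of 64.
y_true = np.zeros((8, 8))
y_true[[0, 0, 7, 7], [0, 7, 0, 7]] = 1.0
y_pred = np.full((8, 8), 0.1)
y_pred[0, 0] = 0.8
print(weighted_bce(y_true, y_pred, pos_weight=(y_true == 0).sum() / y_true.sum()))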

Keywords: document detection, corner detection, attention mechanism, lightweight

Procedia PDF Downloads 324
690 Hub Traveler Guidance Signage Evaluation via Panoramic Visualization Using Entropy Weight Method and TOPSIS

Authors: Si-yang Zhang, Chi Zhao

Abstract:

Comprehensive transportation hubs are important nodes of the transportation network, and their internal signage functions as guidance and distribution assistance, which directly affects the operational efficiency of traffic in and around the hubs. Reasonably installed signage effectively attracts the visual focus of travelers and improves wayfinding efficiency. Among the elements of signage, the visual guidance effect is the key factor affecting information conveyance, and it should be evaluated during the design and optimization process. However, existing evaluation methods mostly focus on the layout and are not able to fully determine whether signage caters to travelers' needs. This study conducted field investigations and developed panoramic videos for multiple transportation hubs in China, and designed surveys accordingly. Human subjects were recruited to watch the panoramic videos via virtual reality (VR) and respond to the surveys. In this paper, Pudong Airport and Xi'an North Railway Station were studied and compared as examples due to their high traveler volumes and relatively well-developed traveler service systems. Visual attention was captured by an eye tracker, and subjective satisfaction ratings were collected through surveys. The Entropy Weight Method (EWM) was utilized to evaluate the effectiveness of the signage elements, and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) was used to further rank the importance of the elements. The results show that the degree of visual attention of travelers significantly affects the evaluation results of guidance signage. Key factors affecting visual attention include accurate legibility, obstruction and defacement rates, informativeness, and whether signage is set up in a hierarchical manner.
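
The sketch below shows how entropy weights and TOPSIS closeness coefficients can be combined to rank alternatives, with rows as signage designs and columns as criteria; the criteria, scores, and normalization details are hypothetical and are not taken from the study.

import numpy as np

def entropy_weight_topsis(matrix, benefit):
    """Rank alternatives with entropy weights (EWM) and TOPSIS (sketch).

    `matrix` holds alternatives in rows and criteria in columns; `benefit`
    flags which criteria are larger-is-better.
    """
    m = np.asarray(matrix, dtype=float)
    # EWM: entropy of each criterion -> objective weights.
    p = m / m.sum(axis=0)
    e = -(p * np.log(p + 1e-12)).sum(axis=0) / np.log(len(m))
    w = (1 - e) / (1 - e).sum()
    # TOPSIS: vector-normalize, weight, measure distance to ideal solutions.
    v = w * m / np.sqrt((m ** 2).sum(axis=0))
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((v - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)          # closeness coefficient, higher is better

# Four signage designs scored on legibility, informativeness (benefit criteria)
# and obstruction rate (cost criterion) -- purely hypothetical numbers.
scores = [[0.82, 0.70, 0.10],
          [0.75, 0.80, 0.05],
          [0.60, 0.65, 0.20],
          [0.90, 0.55, 0.15]]
print(np.round(entropy_weight_topsis(scores, benefit=[True, True, False]), 3))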

Keywords: traveler guidance signage, panoramic video, visual attention, entropy weight method, TOPSIS

Procedia PDF Downloads 29
689 Estimation and Forecasting with a Quantile AR Model for Financial Returns

Authors: Yuzhi Cai

Abstract:

This talk presents a Bayesian approach to quantile autoregressive (QAR) time series model estimation and forecasting. We establish that the joint posterior distribution of the model parameters and future values is well defined. The associated MCMC algorithm for parameter estimation and forecasting converges to the posterior distribution quickly. We also present a combining-forecasts technique to produce more accurate out-of-sample forecasts by using a weighted sequence of fitted QAR models. A moving window method to check the quality of the estimated conditional quantiles is developed. We verify our methodology using simulation studies and then apply it to currency exchange rate data. An application of the method to the USD to GBP daily currency exchange rates will also be discussed. The results obtained show that an unequally weighted combining method performs better than the other forecasting methodologies.
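
For intuition, the following frequentist sketch fits a QAR(1) model at a given quantile level by minimizing the check (pinball) loss; the paper's actual treatment is Bayesian with MCMC and adds a weighted combination of several fitted QAR models for forecasting.

import numpy as np
from scipy.optimize import minimize

def fit_qar1(y, tau):
    """Fit a first-order quantile autoregression by minimizing the check loss.

    This illustrates what a QAR(1) model targets: the conditional tau-quantile
    of y_t as a linear function of y_{t-1}.
    """
    y_lag, y_cur = y[:-1], y[1:]

    def check_loss(params):
        a, b = params
        u = y_cur - (a + b * y_lag)
        return np.sum(u * (tau - (u < 0)))   # rho_tau(u) = u * (tau - 1{u<0})

    res = minimize(check_loss, x0=np.array([0.0, 0.5]), method="Nelder-Mead")
    return res.x

rng = np.random.default_rng(4)
y = np.zeros(500)
for t in range(1, 500):                      # simulated AR(1) returns
    y[t] = 0.3 * y[t - 1] + rng.normal(scale=0.5)
for tau in (0.05, 0.5, 0.95):
    a, b = fit_qar1(y, tau)
    print(f"tau={tau}: intercept={a:+.3f}, slope={b:+.3f}")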

Keywords: combining forecasts, MCMC, quantile modelling, quantile forecasting, predictive density functions

Procedia PDF Downloads 315
688 Application of Complete Ensemble Empirical Mode Decomposition with Adaptive Noise and Multipoint Optimal Minimum Entropy Deconvolution in Railway Bearings Fault Diagnosis

Authors: Yao Cheng, Weihua Zhang

Abstract:

Although the measured vibration signal contains rich information on machine health conditions, white noise interference and the discrete harmonics coming from the blades, shaft, and gear mesh make the fault diagnosis of rolling element bearings difficult. In order to overcome the interference of useless signals, a new fault diagnosis method combining Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) and Multipoint Optimal Minimum Entropy Deconvolution (MOMED) is proposed for the fault diagnosis of high-speed train bearings. Firstly, the CEEMDAN technique is applied to adaptively decompose the raw vibration signal into a series of finite intrinsic mode functions (IMFs) and a residue. Compared with Ensemble Empirical Mode Decomposition (EEMD), CEEMDAN provides an exact reconstruction of the original signal and a better spectral separation of the modes, which improves the accuracy of fault diagnosis. An effective sensitivity index based on the Pearson correlation coefficients between the IMFs and the raw signal is adopted to select sensitive IMFs that contain bearing fault information. The composite signal of the sensitive IMFs is used for further fault identification analysis. Next, for the purpose of identifying the fault information precisely, the MOMED is utilized to enhance the periodic impulses in the composite signal. As a non-iterative method, MOMED has better deconvolution performance than classical deconvolution methods such as Minimum Entropy Deconvolution (MED) and Maximum Correlated Kurtosis Deconvolution (MCKD). Third, envelope spectrum analysis is applied to detect the existence of a bearing fault. Simulated bearing fault signals with white noise and discrete harmonic interference are used to validate the effectiveness of the proposed method. Finally, the superiorities of the proposed method are further demonstrated on high-speed train bearing fault datasets measured from a test rig. The analysis results indicate that the proposed method has strong practicability.

Keywords: bearing, complete ensemble empirical mode decomposition with adaptive noise, fault diagnosis, multipoint optimal minimum entropy deconvolution

Procedia PDF Downloads 339
687 Detection of Efficient Enterprises via Data Envelopment Analysis

Authors: S. Turkan

Abstract:

In this paper, Turkey's Top 500 Industrial Enterprises data for 2014 were analyzed by data envelopment analysis. Data envelopment analysis is used to detect efficient decision-making units, such as universities, hospitals, schools, etc., by using inputs and outputs. The decision-making units in this study are enterprises. To detect efficient enterprises, some financial ratios are determined as inputs and outputs. For this reason, financial indicators related to the productivity of enterprises are considered. The efficient foreign weighted owned capital enterprises are detected via the super efficiency model. According to the results, Mercedes-Benz is found to be the most efficient foreign weighted owned capital enterprise in Turkey.
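
An illustrative sketch of the basic input-oriented CCR envelopment program, solved with linear programming, is given below; the paper additionally uses a super-efficiency model to rank the efficient enterprises, and the financial ratios shown are hypothetical.

import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(inputs, outputs):
    """Input-oriented CCR DEA efficiency scores (illustrative sketch).

    `inputs` and `outputs` have one row per enterprise (DMU). Only the basic
    CCR envelopment program is shown here, not the super-efficiency variant.
    """
    X, Y = np.asarray(inputs, float), np.asarray(outputs, float)
    n = len(X)
    scores = []
    for o in range(n):
        c = np.r_[1.0, np.zeros(n)]                       # minimize theta
        A_in = np.c_[-X[o][:, None], X.T]                 # sum_j lam_j x_ij <= theta * x_io
        A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]    # sum_j lam_j y_rj >= y_ro
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(X.shape[1]), -Y[o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n)
        scores.append(res.x[0])
    return np.array(scores)

# Hypothetical financial ratios: two inputs (e.g. assets, employees) and
# one output (e.g. profit) for five enterprises.
inputs = [[100, 50], [120, 40], [80, 60], [150, 70], [90, 45]]
outputs = [[30], [36], [20], [33], [29]]
print(np.round(dea_ccr_efficiency(inputs, outputs), 3))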

Keywords: data envelopment analysis, super efficiency, logistic regression, financial ratios

Procedia PDF Downloads 301
686 Application of RS and GIS Technique for Identifying Groundwater Potential Zone in Gomukhi Nadhi Sub Basin, South India

Authors: Punitha Periyasamy, Mahalingam Sudalaimuthu, Sachikanta Nanda, Arasu Sundaram

Abstract:

India holds 17.5% of the world's population but has only 2% of the total geographical area of the world, where 27.35% of the area is categorized as wasteland due to a lack of, or limited, groundwater. So there is an excessive demand for groundwater for agricultural and non-agricultural activities to balance its growth rate. With this in mind, an attempt is made to find the groundwater potential zones in the Gomukhi River sub-basin of the Vellar River basin, Tamil Nadu, India, covering an area of 1,146.6 sq. km and consisting of 9 blocks, from Peddanaickanpalayam to Villupuram, that fall in the sub-basin. Thematic maps such as geology, geomorphology, lineament, land use/land cover, and drainage are prepared for the study area using IRS P6 data. The collateral data, including rainfall, water level, and soil maps, are collected for analysis and inference. The digital elevation model (DEM) is generated using Shuttle Radar Topography Mission (SRTM) data, and the slope of the study area is obtained. ArcGIS 10.1 acts as a powerful spatial analysis tool to find the groundwater potential zones in the study area by means of weighted overlay analysis. Each individual parameter of the thematic maps is ranked and weighted in accordance with its influence on increasing the water level in the ground. The potential zones in the study area are classified as very good, good, moderate, and poor, with areal extents of 15.67, 381.06, 575.38, and 174.49 sq. km, respectively.

Keywords: ArcGIS, DEM, groundwater, recharge, weighted overlay

Procedia PDF Downloads 415
685 Modelling and Mapping Malnutrition in Toddlers in Bojonegoro Regency with Mixed Geographically Weighted Regression Approach

Authors: Elvira Mustikawati P.H., Iis Dewi Ratih, Dita Amelia

Abstract:

Bojonegoro has proclaimed a policy of zero malnutrition. Therefore, as an effort to address the cases of child malnutrition in Bojonegoro, this study used the Mixed Geographically Weighted Regression (MGWR) approach to determine the factors that influence the percentage of malnourished children under five, in which the factors can be divided into locally influential factors in each district and global factors that influence all districts. Based on the goodness-of-fit tests, the R2 and AIC values of the GWR model are better than those of the MGWR model: the R2 and AIC values of the MGWR model are 84.37% and 14.28, while those of the GWR model are 91.04% and -62.04, respectively. Based on the analysis with the GWR model, the districts of Sekar, Bubulan, Gondang, and Dander are districts in which three predictor variables (the percentage of vitamin A, the percentage of births assisted by health personnel, and the percentage of clean water) significantly influence the percentage of malnourished children under five.

Keywords: GWR, MGWR, R2, AIC

Procedia PDF Downloads 253
684 Frank Norris’ McTeague: An Entropic Melodrama

Authors: Mohsen Masoomi, Fazel Asadi Amjad, Monireh Arvin

Abstract:

According to Naturalistic principles, human destiny, in the form of blind chance and determinism, entraps the individual, so man is a defenceless creature unable to escape from the ruthless paws of a stoical universe. In Naturalism, nonetheless, melodrama mirrors a conscious alternative with a peculiar function. A typical American Naturalistic character thus cannot be a subject for social criticism of American society, since they are not victims of the ongoing virtual slavery, the capitalist system, nor of a ruined milieu, but of their own volition and, more importantly, their frailty of character. Through a Postmodern viewpoint, each Naturalistic work can encompass entropic trends and changes culminating in entire failure and devastation. Frank Norris in McTeague displays the futile struggles of ordinary men and how they end up brutes. McTeague encompasses intoxication, abuse, violation, and ruthless homicides. Norris' depictions of the falling individual as a demon represent the entropic dimension of Naturalistic novels. McTeague's defeat is somewhat his own fault, the result of his own blunders and resolution, not the result of sheer accident. Throughout the novel, each character is a kind of insane quester indicating McTeague's decadence and, by inference, the decadence of Western civilisation. McTeague seems to designate Norris' solicitude for a community fabricated by the elements of negative human demeanours and conduct carrying acute symptoms of infectious dehumanisation. The aim of this article is to illustrate how one specific negative human disposition can gradually, like a running fire, spread everywhere and burn everything in itself. The author applies the concept of entropy metaphorically to describe the individual devolutions that together comprise community entropy in McTeague, a dying universe.

Keywords: animal imagery, entropy, Gypsy, melodrama

Procedia PDF Downloads 256
683 The Relationship between Rhythmic Complexity and Listening Engagement as a Proxy for Perceptual Interest

Authors: Noah R. Fram

Abstract:

Although it has been confirmed by multiple studies, the inverted-U relationship between stimulus complexity and preference (liking) remains contentious. Research aimed at substantiating the model is largely reliant upon anecdotal self-assessments of subjects and basic measures of complexity, leaving potential confounds unresolved. This study attempts to address the topic by assessing listening time as a behavioral correlate of liking (with the assumption that engagement prolongs listening time) and by looking for latent factors underlying several measures of rhythmic complexity. Participants listened to groups of rhythms, stopping each one when they started to lose interest, and were asked to rate each rhythm in each group in terms of interest, complexity, and preference. Subjects were not informed that the time spent listening to each rhythm was the primary measure of interest. The hypothesis that listening time demonstrates the same inverted-U relationship with complexity as verbal reports of liking was confirmed using a variety of metrics for rhythmic complexity, including meter-dependent measures of syncopation and meter-independent measures of entropy.

Keywords: complexity, entropy, rhythm, syncopation

Procedia PDF Downloads 143
682 The Analysis of Changes in Urban Hierarchy of Isfahan Province in the Fifty-Year Period (1956-2006)

Authors: Hamidreza Joudaki, Yousefali Ziari

Abstract:

The appearance of cities and urbanism is one of the important processes which have affected social communities. Industrialization and urbanism developed along with each other throughout history; moreover, they have had a simple relationship for more than six thousand years, that is, since the appearance of the first cities. In the 18th century, with the emergence of industrial capitalism, progressive development took place in urbanism around the world. In Iran, the city of each region made its decisions by itself, and the regional capital (downtown) was the only central place, a regional city that controlled its realm without any hierarchy. However, during the last three decades this method of ruling has changed because of political, social, and economic changes that have altered rural-urban relationships; moreover, these changes have altered the variety of functions of cities and the systematic urban network in Iran. Today, the urban system has a vastly imbalanced spatial and functional structure. In Isfahan, the trend of urbanism is like that in the other parts of Iran, and the systematic urban hierarchy is neither suitable nor normal. This article is quantitative and analytical. The statistical population comprises the cities of Isfahan Province, and the changes in the urban network and its hierarchy during a fifty-year period (1956-2006) have been surveyed. The data have been analyzed by the rank-size model and the entropy index. In this article, the cities of Iran, the entropy factor of the primate city, and the urban hierarchy of Isfahan Province are introduced. The urban share of residents of this province has risen from 55 percent to 83% (2006). The analytical data reflect a mismatch and imbalance between cities, because the entropy index was 0.91 in 1956 and decreased to 0.63 in 2006. Isfahan city is the primate city throughout these periods. Moreover, the second and third cities have a population gap with regard to the other cities and, finally, they do not follow the rank-size rule.
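
For illustration, the two measures mentioned in the abstract can be computed as follows: the rank-size rule is checked by regressing log population on log rank (a slope near -1 indicates a balanced hierarchy), and the entropy index is the normalized Shannon entropy of the city-size shares; the city populations used here are hypothetical, not the Isfahan census figures.

import numpy as np

def rank_size_slope(populations):
    """Slope of log(population) on log(rank); a slope near -1 matches the rank-size rule."""
    pop = np.sort(np.asarray(populations, float))[::-1]
    ranks = np.arange(1, len(pop) + 1)
    slope, _ = np.polyfit(np.log(ranks), np.log(pop), 1)
    return slope

def entropy_index(populations):
    """Normalized Shannon entropy of the city-size distribution (1 = perfectly even)."""
    p = np.asarray(populations, float)
    p = p / p.sum()
    return float(-(p * np.log(p)).sum() / np.log(len(p)))

# Hypothetical city populations (thousands) dominated by a primate city.
cities = [1800, 280, 180, 120, 90, 70, 55, 40]
print(f"rank-size slope: {rank_size_slope(cities):+.2f}")
print(f"entropy index:   {entropy_index(cities):.2f}")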

Keywords: urban network, urban hierarchy, primate city, Isfahan province, urbanism, first cities

Procedia PDF Downloads 222