Search results for: entropy weights analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27331

27181 Efficient Implementation of Finite Volume Multi-Resolution Weno Scheme on Adaptive Cartesian Grids

Authors: Yuchen Yang, Zhenming Wang, Jun Zhu, Ning Zhao

Abstract:

An easy-to-implement and robust finite volume multi-resolution Weighted Essentially Non-Oscillatory (WENO) scheme is proposed on adaptive Cartesian grids in this paper. This multi-resolution WENO scheme is combined with the ghost-cell immersed boundary method (IBM) and a wall-function technique to solve the Navier-Stokes equations. Unlike k-exact finite volume WENO schemes, which involve large amounts of extra storage, repeatedly solving the matrix generated by a least-squares method, or calculating optimal linear weights on adaptive Cartesian grids, the present methodology adds very little overhead and can be easily implemented in existing edge-based computational fluid dynamics (CFD) codes with minor modifications. Moreover, the linear weights of this adaptive finite volume multi-resolution WENO scheme can be any positive numbers on the condition that their sum is one. This bypasses the calculation of the optimal linear weights and avoids dealing with negative linear weights on adaptive Cartesian grids. Some benchmark viscous problems are numerically solved to show the efficiency and good performance of this adaptive multi-resolution WENO scheme. Compared with a second-order edge-based method, the presented method can be implemented on an adaptive Cartesian grid with only slight modifications for high-Reynolds-number problems.
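
To make the arbitrary-linear-weights idea concrete, the following is a minimal 1-D sketch in Python of a WENO-type blend of a high-order and a low-order candidate (a schematic illustration, not the authors' adaptive-Cartesian implementation; the low-order smoothness indicator is a simple surrogate, and the pair gamma can be any positive numbers summing to one):

```python
import numpy as np

def mr_weno_face_value(um, u0, up, gamma=(0.85, 0.15), eps=1e-10):
    """Blend a quadratic (3rd-order) and a constant (1st-order) candidate
    at the cell face x_{i+1/2} from cell averages u_{i-1}, u_i, u_{i+1}."""
    q_hi = (-um + 5.0 * u0 + 2.0 * up) / 6.0          # high-order candidate
    q_lo = u0                                          # robust fallback
    # smoothness indicators: Jiang-Shu form for the quadratic, a simple
    # first-difference surrogate for the constant candidate
    b_hi = 13.0 / 12.0 * (um - 2.0 * u0 + up) ** 2 + 0.25 * (um - up) ** 2
    b_lo = 0.25 * (up - um) ** 2
    tau = (b_hi - b_lo) ** 2                           # WENO-Z style indicator
    a_hi = gamma[0] * (1.0 + tau / (b_hi + eps))
    a_lo = gamma[1] * (1.0 + tau / (b_lo + eps))
    w_hi, w_lo = a_hi / (a_hi + a_lo), a_lo / (a_hi + a_lo)
    return w_hi * q_hi + w_lo * q_lo

print(mr_weno_face_value(0.9, 1.0, 1.1))  # smooth data: close to 1.05
print(mr_weno_face_value(0.0, 0.0, 1.0))  # jump: the fallback gains weight
```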

Keywords: adaptive mesh refinement method, finite volume multi-resolution WENO scheme, immersed boundary method, wall-function technique

Procedia PDF Downloads 121
27180 Collision Theory Based Sentiment Detection Using Discourse Analysis in Hadoop

Authors: Anuta Mukherjee, Saswati Mukherjee

Abstract:

Data is growing every day. Social networking sites such as Twitter are becoming an integral part of our daily lives, contributing to a large increase in the growth of data. Twitter is a rich source for sentiment detection and mining in particular, since people often express honest opinions through tweets. However, although sentiment analysis is a well-researched topic for text, analysis of Twitter data poses additional challenges, since tweets are unstructured, full of abbreviations, and without strict grammatical correctness. We have employed collision theory to achieve sentiment analysis of Twitter data. We have also incorporated discourse analysis into the collision-theory-based model to detect accurate sentiment from tweets. In addition, we have used the retweet field to assign weights to certain tweets and obtained the overall weightage of a topic provided in the form of a query. Hadoop has been exploited for speed. Our experiments show effective results.
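
The retweet-based weighting step can be illustrated with a minimal Python sketch (the per-tweet sentiment scores are assumed to come from the collision-theory model, which is not reproduced here):

```python
def topic_weightage(tweets):
    """Weight each tweet by 1 + its retweet count, so widely re-shared
    opinions contribute more to the overall score of the queried topic."""
    num = sum((1 + t["retweets"]) * t["sentiment"] for t in tweets)
    den = sum(1 + t["retweets"] for t in tweets)
    return num / den if den else 0.0

tweets = [{"sentiment": 0.8, "retweets": 120},   # heavily re-shared
          {"sentiment": -0.4, "retweets": 3}]
print(topic_weightage(tweets))  # dominated by the retweeted opinion
```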

Keywords: sentiment analysis, Twitter, collision theory, discourse analysis

Procedia PDF Downloads 501
27179 Effect of Aluminium Content on Bending Properties and Microstructure of AlₓCoCrFeNi Alloy Fabricated by Induction Melting

Authors: Marzena Tokarewicz, Malgorzata Gradzka-Dahlke

Abstract:

High-entropy alloys (HEAs) have gained significant attention due to their great potential as functional and structural materials. HEAs have very good mechanical properties (in particular, alloys based on CoCrNi). They also show the ability to maintain their strength at high temperatures, which is extremely important in some applications. AlCoCrFeNi alloy is one of the most studied high-entropy alloys. Scientists often study the effect of changing the aluminium content in this alloy because it causes significant changes in phase presence and microstructure and consequently affects its hardness, ductility, and other properties. The research conducted by the authors likewise investigates the effect of the aluminium content in AlₓCoCrFeNi alloy on its microstructure and mechanical properties. AlₓCoCrFeNi alloys were prepared by vacuum induction melting. The obtained samples were examined for chemical composition, microstructure, and microhardness. The three-point bending method was used to determine the bending strength, bending modulus, and conventional bending yield strength. The obtained results confirm the influence of the aluminium content on the properties of AlₓCoCrFeNi alloy. Most studies on AlₓCoCrFeNi alloy focus on the determination of mechanical properties in compression or tension, and much less in bending. The achieved results therefore provide valuable information on the bending properties of AlₓCoCrFeNi alloy and lead to interesting conclusions.

Keywords: bending properties, high-entropy alloys, induction melting, microstructure

Procedia PDF Downloads 120
27178 Noise Detection Algorithm for Skin Disease Image Identification

Authors: Minakshi Mainaji Sonawane, Bharti W. Gawali, Sudhir Mendhekar, Ramesh R. Manza

Abstract:

People's lives and health are severely impacted by skin diseases. This study proposes an effective method for identifying different forms of skin disease. Image denoising is a technique for improving image quality after it has been degraded by noise. The proposed technique is based on the wavelet transform, which is well suited to image analysis because of its ability to split an image into sub-bands; these sub-bands are used to estimate the noise ratio of the noisy image. According to the experimental results, the proposed method gives the best MSE, PSNR, and entropy values for the denoised images. Moreover, by testing different types of wavelet transform filters, the proposed approach obtained the best results of 23.13, 20.08, and 50.7 for MSE, PSNR, and entropy, respectively, in the image denoising process.
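
A minimal sketch of the wavelet-thresholding idea in Python (using PyWavelets; the wavelet, level, and threshold rule here are common defaults assumed for illustration, not necessarily the paper's exact choices):

```python
import numpy as np
import pywt

def wavelet_denoise(img, wavelet="db4", level=2):
    """Soft-threshold the detail sub-bands; the noise level is estimated
    from the finest diagonal sub-band (median absolute deviation)."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(img.size))        # universal threshold
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(d, thr, mode="soft") for d in details)
        for details in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)

def psnr(ref, est, peak=255.0):
    return 10 * np.log10(peak ** 2 / np.mean((ref - est) ** 2))

clean = np.tile(np.linspace(0, 255, 64), (64, 1))
noisy = clean + np.random.normal(0, 10, clean.shape)
out = wavelet_denoise(noisy)[:64, :64]
print(psnr(clean, noisy), psnr(clean, out))  # PSNR should improve
```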

Keywords: MSE, PSNR, entropy, Gaussian filter, DWT

Procedia PDF Downloads 189
27177 Determination of Verapamil Hydrochloride in the Tablet and Injection Solution by the Verapamil-Sensitive Electrode and Possibilities of Application in Pharmaceutical Analysis

Authors: Faisal A. Salih, V. V. Egorov

Abstract:

Verapamil is a drug used in medicine for arrhythmia, angina, and hypertension as a calcium channel blocker. In this study, a verapamil-selective electrode was prepared with the following membrane composition: PVC (32.8 wt %), O-NPhOE (66.6 wt %), and KTPClPB (0.6 wt %, or approximately 0.01 M). An inner solution containing 1 x 10⁻³ M verapamil hydrochloride was introduced, and the electrodes were conditioned overnight in a 1 x 10⁻³ M verapamil hydrochloride solution in 1 x 10⁻³ M orthophosphoric acid. These studies have demonstrated that O-NPhOE and KTPClPB are the best plasticizer and ion exchanger, respectively, and that both direct potentiometry and potentiometric titration can be used for the determination of verapamil hydrochloride in tablets and injection solutions. Normalized weights of verapamil per tablet were 80.4±0.2, 80.7±0.2, and 81.0±0.4 mg. Weights of verapamil per average tablet weight, determined for the same set of tablets by direct potentiometry and potentiometric titration, were 80.4±0.2 and 80.7±0.2 mg, respectively. The masses of verapamil in injection solutions, determined by direct potentiometry for two ampoules from one set, were 5.00±0.015 and 5.004±0.006 mg. In all cases, good reproducibility and excellent correspondence with the declared quantities were observed.

Keywords: verapamil, potentiometry, ion-selective electrode, lipophilic physiologically active amines

Procedia PDF Downloads 64
27176 Personnel Selection Based on Step-Wise Weight Assessment Ratio Analysis and Multi-Objective Optimization on the Basis of Ratio Analysis Methods

Authors: Emre Ipekci Cetin, Ebru Tarcan Icigen

Abstract:

The personnel selection process is considered one of the most important and most difficult issues in human resources management. At the personnel selection stage, applicants are evaluated according to certain criteria, and efforts are made to select the most appropriate candidate. However, this process can be complicated for the managers who carry out the selection. Candidates should be evaluated according to different criteria such as work experience, education, and foreign language level. It is crucial that a rational selection process is carried out by considering all the criteria in an integrated structure. In this study, the problem of choosing the front office manager of a five-star accommodation enterprise operating in Antalya is addressed by using multi-criteria decision-making methods. In this context, the SWARA (step-wise weight assessment ratio analysis) and MOORA (multi-objective optimization on the basis of ratio analysis) methods, which have relatively few applications compared with other methods, were used together. First, the SWARA method was used to calculate the weights of the criteria and sub-criteria determined by the business. After the criteria weights were obtained, the MOORA method was used to rank the candidates using the ratio system and the reference point approach. Recruitment processes differ from sector to sector and from operation to operation, and there are a number of criteria that must be taken into consideration by businesses in accordance with the structure of each sector. It is of utmost importance that all candidates are evaluated objectively within the framework of these criteria once the criteria for suitable candidates have been carefully selected.
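
The two stages combine as in the following Python sketch (the criteria, scores, and comparative-importance values are hypothetical; the real study elicits them from the business):

```python
import numpy as np

def swara_weights(s):
    """SWARA: s[j] is the comparative importance of criterion j relative
    to the criterion ranked just above it (s[0] = 0 for the top one)."""
    k = 1.0 + np.asarray(s)          # k_j = s_j + 1
    q = np.cumprod(1.0 / k)          # q_j = q_{j-1} / k_j, with q_1 = 1
    return q / q.sum()

def moora_rank(X, w, benefit):
    """MOORA ratio system: vector-normalize each criterion, then add the
    weighted benefit criteria and subtract the weighted cost criteria."""
    N = X / np.sqrt((X ** 2).sum(axis=0))
    y = (N * w * np.where(benefit, 1.0, -1.0)).sum(axis=1)
    return np.argsort(-y), y

# rows: candidates; columns: experience, language score, salary demand
X = np.array([[8.0, 7.0, 4.5], [6.0, 9.0, 3.8], [9.0, 6.0, 5.2]])
w = swara_weights([0.0, 0.30, 0.25])    # criteria already ranked by importance
order, scores = moora_rank(X, w, benefit=[True, True, False])
print(order, scores.round(3))
```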

Keywords: accommodation establishments, human resource management, multi-objective optimization on the basis of ratio analysis, multi-criteria decision making, step-wise weight assessment ratio analysis

Procedia PDF Downloads 312
27175 Developing a Risk Rating Tool for Shopping Centres

Authors: Prandesha Govender, Chris Cloete

Abstract:

Purpose: The objective of the paper is to develop a tool for the evaluation of the financial risk of a shopping center. Methodology: Important factors that indicate the success of a shopping center were identified from the available literature. Weights were allocated to these factors and a risk rating was calculated for 505 shopping centers in the largest province in South Africa by taking the factor scores, factor weights, and category weights into account. The ratings for ten randomly selected shopping centers were correlated with consumer feedback and standardized against the ECAI (External Credit Assessment Institutions) data for the same centers. The ratings were also mapped to corporates with the same risk rating to provide a better intuitive assessment of the meaning of the inherent risk of each center. Results: The proposed risk tool shows a strong linear correlation with consumer views and can be compared to expert opinions, such as that of fund managers and REITs. Interpretation of the tool was also illustrated by correlating the risk rating of selected shopping centers to the risk rating of reputable and established entities. Conclusions: The proposed Shopping Centre Risk Tool, used in conjunction with financial inputs from the relevant center, should prove useful to an investor when the desirability of investment in or expansion, renovation, or purchase of a shopping center is being considered.
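
The scoring logic reduces to a two-level weighted sum, sketched below in Python (the category and factor names, weights, and scores are illustrative assumptions, not the study's calibrated values):

```python
def risk_rating(categories):
    """categories: {name: (category_weight, {factor: (weight, score)})}.
    Factor scores combine within each category; category scores then
    combine into one rating. Weights are assumed to sum to one."""
    return sum(
        cat_w * sum(w * s for w, s in factors.values())
        for cat_w, factors in categories.values()
    )

centre = {
    "location": (0.4, {"visibility": (0.6, 7.0), "access": (0.4, 8.0)}),
    "tenant_mix": (0.6, {"anchor": (0.5, 6.0), "vacancy": (0.5, 5.0)}),
}
print(risk_rating(centre))   # one comparable rating per centre
```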

Keywords: risk, shopping centres, risk modelling, investment, rating tool, rating scale

Procedia PDF Downloads 85
27174 Automatic Registration of Rail Profile Based on Local Maximum Curvature Entropy

Authors: Hao Wang, Shengchun Wang, Weidong Wang

Abstract:

To address the influence of train vibration and environmental noise on the measurement of track wear, we propose a method for the automatic extraction of the circular arcs on the inner and outer sides of the rail waist and achieve high-precision registration of the rail profile. First, a polynomial fitting method based on a truncated residual histogram is proposed to find the optimal fitting curve of the profile and reduce the influence of noise on profile curve fitting. Then, based on the curvature distribution characteristics of the fitting curve, an interval search algorithm based on the maximum curvature entropy of a dynamic window is proposed to realize the automatic segmentation of the small circular arcs. Finally, we fit the two circle centers as matching reference points based on the small circular arcs on both sides and realize the alignment of the measured profile to the standard designed profile. Static experimental results show that the mean and standard deviation of the method are controlled within 0.01 mm, with small measurement errors and high repeatability. A dynamic test also verified the repeatability of the method in the train-running environment; the dynamic measurement deviation of rail wear is within 0.2 mm with high repeatability.
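
The window-search step can be sketched as follows in Python (a schematic version: the fit below is an ordinary polynomial fit rather than the truncated-residual-histogram fit, and the window width and degree are assumed values):

```python
import numpy as np

def curvature(coeff, x):
    p = np.poly1d(coeff)
    return np.abs(p.deriv(2)(x)) / (1.0 + p.deriv(1)(x) ** 2) ** 1.5

def max_curvature_entropy_window(x, y, width=40, deg=7):
    """Slide a window over the fitted profile and return the interval
    whose curvature distribution has maximum Shannon entropy: on a
    circular arc the curvature is near-constant, so the normalized
    distribution is near-uniform and the entropy is near its maximum."""
    kappa = curvature(np.polyfit(x, y, deg), x) + 1e-12
    best, best_h = 0, -np.inf
    for i in range(len(x) - width):
        p = kappa[i:i + width]
        p = p / p.sum()
        h = -(p * np.log(p)).sum()
        if h > best_h:
            best, best_h = i, h
    return x[best], x[best + width - 1]

theta = np.linspace(0.2, 1.2, 300)            # noisy arc stand-in
x, y = np.cos(theta), np.sin(theta) + np.random.normal(0, 1e-4, 300)
print(max_curvature_entropy_window(x, y))
```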

Keywords: curvature entropy, profile registration, rail wear, structured light, train-running

Procedia PDF Downloads 228
27173 New Refrigerant La₀.₇Ca₀.₁₅Sr₀.₁₅Mn₁₋ₓGaₓO₃ for Application in Magnetic Refrigeration

Authors: Essebti Dhahri

Abstract:

We present new refrigerant La₀.₇Ca₀.₁₅Sr₀.₁₅Mn₁₋ₓGaₓO₃ (x = 0.0–0.1) manganites. These compounds were prepared by the sol-gel method. Refinement of the X-ray diffraction data reveals that all samples crystallize in a rhombohedral structure (space group R-3c). Detailed measurements of the magnetization as a function of temperature and applied magnetic field, M(µ₀H, T), were carried out. From the M(µ₀H, T) curves, we calculated the magnetic entropy change (ΔSM) according to the Maxwell relation. The temperature dependence of the magnetization M(T) reveals a decrease of M with increasing Ga content x. The magnetic entropy change (ΔSM) reaches a maximum value near room temperature. It was also found that these compounds exhibit a large magnetocaloric effect (MCE), which increases with decreasing Ga concentration. The studied compounds could therefore be considered potential materials for magnetic refrigeration applications.
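
For reference, the Maxwell relation used to extract ΔSM from the magnetization isotherms, together with its standard discretized form, reads:

```latex
\Delta S_M(T,\mu_0 H_{\max}) = \mu_0 \int_0^{H_{\max}}
  \left(\frac{\partial M}{\partial T}\right)_{H} dH
\;\approx\; \mu_0 \sum_i
  \frac{M(T_{k+1},H_i)-M(T_k,H_i)}{T_{k+1}-T_k}\,\Delta H_i .
```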

Keywords: magnetic measurements, Rietveld refinement, magnetic refrigeration, magnetocaloric effect

Procedia PDF Downloads 64
27172 Microstructure, Mechanical and Tribological Properties of (TiTaZrNb)Nₓ Medium Entropy Nitride Coatings: Influence of Nitrogen Content and Bias Voltage

Authors: Mario Alejandro Grisales, M. Daniela Chimá, Gilberto Bejarano Gaitán

Abstract:

High entropy alloys (HEA) and nitrides (HEN) are currently very attractive to the automotive, aerospace, metalworking, and materials-forming manufacturing industries, among others, for exhibiting higher mechanical properties, wear resistance, and thermal stability than binary and ternary alloys. In this work, medium-entropy TiTaZrNb coatings and their (TiTaZrNb)Nₓ nitrides were synthesized on AISI 420 and M2 steel samples by the direct-current magnetron sputtering technique. The influence of the bias voltage supplied to the substrate on the microstructure, chemical composition, and phase composition of the matrix coating was evaluated, and the effect of the nitrogen flow on the microstructural, mechanical, and tribological properties of the corresponding nitrides was studied. A change in the crystalline structure from BCC for the TiTaZrNb coatings to FCC for (TiTaZrNb)Nₓ was observed, which is associated with the incorporation of nitrogen into the matrix and the consequent formation of a (TiTaZrNb)Nₓ solid solution. An increase in hardness and residual stresses with increasing bias voltage was observed for TiTaZrNb, reaching 12.8 GPa for the coating deposited at a bias of -130 V. In the case of the (TiTaZrNb)Nₓ nitride, a greater hardness of 23 GPa was achieved for the coating deposited with an N₂ flow of 12 sccm, which drops slightly to 21.7 GPa for that deposited with an N₂ flow of 15 sccm. The slight reduction in hardness could be associated with the precipitation of the TiN and ZrN phases that form at higher nitrogen flows. The specific wear rate of the deposited coatings ranged between 0.5×10¹³ and 0.6×10¹³ N/m², while the steel substrate exhibited an average hardness of 2.0 GPa and a specific wear rate of 203.2×10¹³ N/m². The hardness of the synthesized nitride coatings was thus higher, and their specific wear rate lower, than those of the steel substrate, showing a protective effect of the coatings against wear of the steel.

Keywords: medium entropy coatings, hard coatings, magnetron sputtering, tribology, wear resistance

Procedia PDF Downloads 41
27171 A Neural Network Approach to Evaluate Supplier Efficiency in a Supply Chain

Authors: Kishore K. Pochampally

Abstract:

The success of a supply chain heavily relies on the efficiency of the suppliers involved. In this paper, we propose a neural network approach to evaluate the efficiency of a supplier, which is being considered for inclusion in a supply chain, using the available linguistic (fuzzy) data of suppliers that already exist in the supply chain. The approach is carried out in three phases, as follows: In phase one, we identify criteria for evaluation of the supplier of interest. Then, in phase two, we use performance measures of already existing suppliers to construct a neural network that gives weights (importance values) of criteria identified in phase one. Finally, in phase three, we calculate the overall rating of the supplier of interest. The following are the major findings of the research conducted for this paper: (i) linguistic (fuzzy) ratings of suppliers such as 'good', 'bad', etc., can be converted (defuzzified) to numerical ratings (1 – 10 scale) using fuzzy logic so that those ratings can be used for further quantitative analysis; (ii) it is possible to construct and train a multi-level neural network in order to determine the weights of the criteria that are used to evaluate a supplier; and (iii) Borda’s rule can be used to group the weighted ratings and calculate the overall efficiency of the supplier.
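
Findings (i) and (iii) can be illustrated with a small Python sketch (the linguistic-to-numeric mapping and the criterion weights are hypothetical here; in the paper the weights come from the trained neural network):

```python
import numpy as np

SCALE = {"very bad": 1.5, "bad": 3.0, "fair": 5.0,
         "good": 7.0, "very good": 9.0}   # defuzzified 1-10 values

def borda_scores(candidates, weights):
    """Borda's rule per criterion: rank candidates (worst = 0 points,
    best = m-1), then combine the points with the criterion weights."""
    X = np.array([[SCALE[r] for r in ratings] for ratings in candidates])
    scores = np.zeros(len(candidates))
    for j, w in enumerate(weights):
        scores += w * X[:, j].argsort().argsort()   # rank positions
    return scores

suppliers = [["good", "fair", "very good"],
             ["very good", "bad", "good"]]
print(borda_scores(suppliers, weights=[0.5, 0.3, 0.2]))
```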

Keywords: fuzzy data, neural network, supplier, supply chain

Procedia PDF Downloads 85
27170 Sustainable Approach for Strategic Planning of Construction of Buildings Using Multi-Criteria Decision Making Tools

Authors: Kishor Bhagwat, Gayatri Vyas

Abstract:

The construction industry is characterized by complex processes that depend on the nature and scope of the project. In recent years, developments in this sector have been remarkable and have had both positive and negative impacts on the environment and on human beings. Sustainable construction can be regarded as one of the solutions to overcome the negative impacts. Since sustainable construction is a vast concept that includes many parameters, the use of multi-criteria decision-making (MCDM) tools sometimes becomes necessary. The main objective of this study is to determine the weights of sustainable building parameters with the help of MCDM tools. A questionnaire survey was conducted to examine respondents' perspectives on the relative importance of the criteria; the respondents were architects, green building consultants, and civil engineers. This paper presents an overview of research related to Indian and international green building rating systems and MCDM. The results show that economy, environmental health and safety, site selection, climatic condition, etc., are important parameters in sustainable construction.
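
One common way to derive such weights objectively is the entropy weight method listed in the keywords; a minimal Python sketch (the score matrix is a made-up stand-in for the survey data):

```python
import numpy as np

def entropy_weights(X):
    """Criteria whose scores vary more across alternatives carry more
    information, hence receive larger weights."""
    P = X / X.sum(axis=0)                                  # proportions
    E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(len(X))
    d = 1.0 - E                                            # diversification
    return d / d.sum()

# rows: alternatives (buildings); columns: sustainability parameters
X = np.array([[7.0, 9.0, 6.0], [8.0, 3.0, 6.5], [6.0, 8.0, 5.5]])
print(entropy_weights(X).round(3))   # feeds an AHP/TOPSIS-style ranking
```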

Keywords: green building, sustainability, multi-criteria decision making method [MCDM], analytical hierarchy process [AHP], technique for order preference by similarity to an ideal solution [TOPSIS], entropy

Procedia PDF Downloads 63
27169 Phase Transition of Aqueous Ternary (THF + Polyvinylpyrrolidone + H₂O) System as Revealed by Terahertz Time-Domain Spectroscopy

Authors: Hyery Kang, Dong-Yeun Koh, Yun-Ho Ahn, Huen Lee

Abstract:

Determination of the behavior of clathrate hydrates with inhibitors in the THz region will provide useful information for hydrate plug control in the upstream oil and gas industry. In this study, terahertz time-domain spectroscopy (THz-TDS) revealed the inhibition of the THF clathrate hydrate system upon dosage of polyvinylpyrrolidone (PVP) of three different molecular weights. Distinct footprints of the phase transition in the THz region (0.4–2.2 THz) were analyzed, and the absorption coefficients and the real part of the refractive indices were obtained in the temperature range of 253 K to 288 K. Along with the optical properties, the ring breathing and stretching modes for the different molecular weights of PVP in THF hydrate were analyzed by Raman spectroscopy.

Keywords: clathrate hydrate, terahertz spectroscopy, tetrahydrofuran, inhibitor

Procedia PDF Downloads 309
27168 Highly Accurate Target Motion Compensation Using Entropy Function Minimization

Authors: Amin Aghatabar Roodbary, Mohammad Hassan Bastani

Abstract:

One of the defects of stepped-frequency radar systems is their sensitivity to target motion. In such systems, target motion causes range cell shift, false peaks, Signal-to-Noise Ratio (SNR) reduction, and range profile spreading, because the power spectrum of each range cell interferes with adjacent range cells; this distorts the High Resolution Range Profile (HRRP) and disrupts the target recognition process. Compensation of the effects of the Target Motion Parameters (TMPs) should therefore be employed. In this paper, such a method for estimating the TMPs (velocity and acceleration), and consequently eliminating or suppressing their unwanted effects on the HRRP, based on entropy minimization is proposed. The method is carried out in two major steps: in the first step, a discrete search over the whole acceleration-velocity lattice, within a specific interval, finds a coarse minimum point of the entropy function. In the second step, a 1-D search over velocity is done in the locus of the minimum, along several constant-acceleration lines, to refine the minimum point found in the first step. The provided simulation results demonstrate the effectiveness of the proposed method.
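
A schematic Python sketch of the two-step search (the stepped-frequency signal model and all numerical values below are illustrative assumptions):

```python
import numpy as np

C = 3e8  # m/s

def hrrp_entropy(profile):
    p = np.abs(profile) ** 2
    p = p / p.sum()
    return -(p * np.log(p + 1e-12)).sum()

def compensate(echo, freqs, t, v, a):
    """Remove the phase induced by radial motion R(t) = v t + a t^2 / 2,
    then form the HRRP by inverse FFT over the frequency steps."""
    phase = 4.0 * np.pi * freqs * (v * t + 0.5 * a * t ** 2) / C
    return np.fft.ifft(echo * np.exp(1j * phase))

def estimate_tmp(echo, freqs, t, v_grid, a_grid):
    # step 1: coarse 2-D search on the acceleration-velocity lattice
    E = [[hrrp_entropy(compensate(echo, freqs, t, v, a)) for v in v_grid]
         for a in a_grid]
    ia, iv = np.unravel_index(np.argmin(E), (len(a_grid), len(v_grid)))
    # step 2: fine 1-D search over velocity near the coarse minimum
    v_fine = np.linspace(v_grid[iv] - 5, v_grid[iv] + 5, 201)
    ent = [hrrp_entropy(compensate(echo, freqs, t, v, a_grid[ia]))
           for v in v_fine]
    return v_fine[np.argmin(ent)], a_grid[ia]

# synthetic single-scatterer burst moving at v = 30 m/s, a = 4 m/s^2
N = 64
freqs = 10e9 + 1e6 * np.arange(N)
t = 1e-3 * np.arange(N)
echo = np.exp(-1j * 4.0 * np.pi * freqs * (100 + 30 * t + 2 * t ** 2) / C)
print(estimate_tmp(echo, freqs, t, np.arange(0.0, 61, 5), np.arange(0.0, 9, 1)))
```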

Keywords: automatic target recognition (ATR), high resolution range profile (HRRP), motion compensation, stepped frequency waveform technique (SFW), target motion parameters (TMPs)

Procedia PDF Downloads 126
27167 Entropy-Based Multichannel Stationary Measure for Characterization of Non-Stationary Patterns

Authors: J. D. Martínez-Vargas, C. Castro-Hoyos, G. Castellanos-Dominguez

Abstract:

In this work, we propose a novel approach for measuring the stationarity level of a multichannel time series. The measure is based on a stationarity definition over the time-varying spectrum, and it aims to quantify the relation between local stationarity (single channel) and global dynamic behavior (multichannel dynamics). To assess the validity of the proposed approach, we use a well-known EEG-BCI database constructed to separate motor imagery tasks. Thus, based on the premise that imagination of movements implies an increase in EEG dynamics, we use as discriminant features the proposed measure computed over an estimate of the non-stationary components of the input time series. As a measure of separability, we use a Student's t-test, and the obtained results show that the measure is able to accurately detect the brain areas, projected on the scalp, where motor tasks are realized.
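
One simple way to instantiate such a measure (not the authors' exact definition) is a per-channel spectral entropy over the time-varying spectrum, compared against the multichannel average:

```python
import numpy as np
from scipy.signal import spectrogram

def channel_stationarity(x, fs=128.0, nperseg=64):
    """Entropy of each frequency band's distribution over time: a
    stationary channel spreads power evenly over time (high entropy);
    transient dynamics concentrate it (lower entropy)."""
    f, t, S = spectrogram(x, fs=fs, nperseg=nperseg)
    P = S / S.sum(axis=1, keepdims=True)
    return -(P * np.log(P + 1e-12)).sum(axis=1).mean()

rng = np.random.default_rng(0)
stat = rng.standard_normal(512)
burst = stat.copy()
burst[200:260] += 5 * np.sin(np.arange(60))        # local non-stationarity
local = [channel_stationarity(ch) for ch in (stat, burst)]
print(local, np.mean(local))   # the bursty channel scores lower
```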

Keywords: stationary measure, entropy, sub-space projection, multichannel dynamics

Procedia PDF Downloads 368
27164 Design an Algorithm for Software Development in CBSE Environment Using Feed Forward Neural Network

Authors: Amit Verma, Pardeep Kaur

Abstract:

In software development organizations, component-based software engineering (CBSE) is an emerging paradigm for software development that has gained wide acceptance, as it often results in increased quality of the software product within development time and budget. In component reuse, the main challenge is identifying the right component in large repositories at the right time. The major objective of this work is to provide an efficient algorithm for the storage and effective retrieval of components using a neural network and parameters based on user choice through clustering. This paper proposes an algorithm that provides an error-free and automatic process for component retrieval during reuse. In this algorithm, keywords (or components) are extracted from software documents, after which the k-means clustering algorithm is applied. Weights are then assigned to those keywords based on their frequency, and an ANN predicts whether the correct weight has been assigned to each keyword (or component); if not, the process back-propagates to the initial step and re-assigns the weights. Finally, all keywords are stored in repositories for effective retrieval. The proposed algorithm is very effective in error correction and detection, with user-based choice of components for reuse and efficient retrieval.
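
A compact Python sketch of the storage-and-retrieval side (the ANN weight-verification loop is omitted; the component descriptions are made up for illustration):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import CountVectorizer

docs = ["stack push pop peek", "queue enqueue dequeue peek",
        "json parse serialize", "xml parse serialize validate"]

vec = CountVectorizer()
X = vec.fit_transform(docs).toarray().astype(float)
weights = X / X.sum(axis=1, keepdims=True)        # frequency-based weights
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(weights)

def retrieve(query):
    """Route the query to the nearest cluster, then rank that cluster's
    components by weighted keyword overlap."""
    q = vec.transform([query]).toarray().astype(float)
    q = q / max(q.sum(), 1.0)
    idx = np.where(km.labels_ == km.predict(q)[0])[0]
    scores = weights[idx] @ q.ravel()
    return [docs[i] for i in idx[np.argsort(-scores)]]

print(retrieve("parse json"))
```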

Keywords: component-based development, clustering, back-propagation algorithm, keyword-based retrieval

Procedia PDF Downloads 357
27165 Solanum Nigrum Shows Anti-Obesity Effects on High Fat Diet Fed Sprague Dawley Rats

Authors: Kathryn Nderitu, Atunga Nyachieo, Ezekiel Mecha

Abstract:

Introduction: Solanum nigrum, also known as black nightshade, biosynthesizes various phytochemical compounds with diverse pharmacological activities, including the treatment of cardiovascular diseases and type 2 diabetes, among others. Materials and Methods: To assess the anti-obesity effects of Solanum nigrum using rats fed a high-fat diet, Sprague Dawley male rats (n = 35) weighing 160–180 g were assigned randomly to seven groups of n = 5 rats each. Each group was fed for 11 weeks as follows: normal group (normal chow rat feed); high-fat diet control (HFD); HFD and standard drug (orlistat, 30 mg/kg bw); HFD and methanolic extract, 150 mg/kg bw; HFD and methanolic extract, 300 mg/kg bw; HFD and dichloromethane extract, 150 mg/kg bw; HFD and dichloromethane extract, 300 mg/kg bw. Body mass index and food intake were monitored weekly, and an oral glucose tolerance test was performed in weeks 5 and 10. Lipid profiles, liver function tests, adipose tissue and liver weights, and phytochemical analysis of Solanum nigrum were later carried out. Results: The high-fat diet control group rats exhibited a significant increase in body mass index (BMI), while rats administered leaf extracts of Solanum nigrum showed a reduction in BMI. Both the low dose of the dichloromethane extract (150 mg/kg bw) and the high dose of the methanol extract (300 mg/kg bw) showed a better reduction in BMI than the other treatment groups. A significant decrease (p < 0.05) in low-density lipoprotein cholesterol, triglycerides, and cholesterol was observed among the rats administered Solanum nigrum extracts compared to the HFD control. Moreover, the HFD control group showed significantly increased liver and adipose tissue weights compared to the other treatment groups (p < 0.05). Solanum nigrum also decreased glycemic levels and normalized the hepatic enzymes of the HFD control. However, food intake among the groups showed no significant difference (p > 0.05). Qualitative analysis of Solanum nigrum leaf extracts indicated the presence of various bioactive compounds associated with anti-obesity activity. Conclusion: These results validate the use of Solanum nigrum in controlling obesity.

Keywords: Solanum nigrum, high-fat diet, phytocompounds, obesity

Procedia PDF Downloads 28
27164 Searching the Efficient Frontier for the Coherent Covering Location Problem

Authors: Felipe Azocar Simonet, Luis Acosta Espejo

Abstract:

In this article, we approximate the efficient frontier for the bi-objective location problem with coherent coverage for two levels of hierarchy (CCLP). We present the mathematical formulation of the model used. Supported and unsupported efficient solutions are obtained by solving the bi-objective combinatorial problem through the weights method using a Lagrangian heuristic. Subsequently, the results are validated through DEA analysis with the GEM index (Global Efficiency Measurement).
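
The weights method itself reduces to scanning scalarizations of the two objectives; a tiny Python sketch over made-up objective values (it recovers only the supported efficient solutions, which is why the unsupported ones require the Lagrangian heuristic):

```python
import numpy as np

# hypothetical (objective 1, objective 2) values of feasible solutions
F = np.array([[4.0, 9.0], [5.0, 6.0], [7.0, 4.0], [9.0, 3.5], [8.0, 8.0]])

supported = set()
for w in np.linspace(0.01, 0.99, 99):          # weights method
    supported.add(int(np.argmin(w * F[:, 0] + (1 - w) * F[:, 1])))
print(sorted(supported))   # indices of supported efficient solutions
```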

Keywords: coherent covering location problem, efficient frontier, Lagrangian relaxation, data envelopment analysis

Procedia PDF Downloads 306
27163 A Network-Theoretical Perspective on Music Analysis

Authors: Alberto Alcalá-Alvarez, Pablo Padilla-Longoria

Abstract:

The present paper describes a framework for constructing mathematical networks that encode relevant musical information from a music score for structural analysis. These graphs aggregate statistical information about musical elements such as notes, chords, rhythms, and intervals, and the relations among them, and so become helpful in visualizing and understanding important stylistic features of a music fragment. To build such networks, musical data is parsed out of a digital symbolic music file. This data undergoes different analytical procedures from graph theory, such as measuring the centrality of nodes, community detection, and entropy calculation. The resulting networks reflect important structural characteristics of the fragment in question: predominant elements, connectivity between them, and the complexity of the information contained in it. Music pieces in different styles are analyzed, and the results are contrasted with the outcomes of traditional analysis in order to show the consistency and potential utility of this method for music analysis.
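
A minimal Python sketch of the pipeline with networkx (the note sequence is invented; a real run would parse it from a symbolic score file):

```python
import math
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

notes = ["C4", "E4", "G4", "E4", "C4", "G4", "A4", "G4", "C4"]

G = nx.DiGraph()
for a, b in zip(notes, notes[1:]):             # one edge per transition
    w = G[a][b]["weight"] + 1 if G.has_edge(a, b) else 1
    G.add_edge(a, b, weight=w)

centrality = nx.degree_centrality(G)            # predominant elements
communities = greedy_modularity_communities(G.to_undirected(),
                                            weight="weight")
total = sum(d["weight"] for _, _, d in G.edges(data=True))
entropy = -sum(d["weight"] / total * math.log2(d["weight"] / total)
               for _, _, d in G.edges(data=True))   # transition complexity
print(centrality, [sorted(c) for c in communities], round(entropy, 3))
```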

Keywords: computational musicology, mathematical music modelling, music analysis, style classification

Procedia PDF Downloads 64
27162 Digital Watermarking Based on Visual Cryptography and Histogram

Authors: R. Rama Kishore, Sunesh

Abstract:

Nowadays, robust and secure watermarking algorithms and their optimization have become the need of the hour. A watermarking algorithm is presented to achieve copyright protection for the owner based on visual cryptography, the histogram shape property, and entropy. Both the host image and the watermark are preprocessed: the host image with a Butterworth filter, and the watermark with visual cryptography. Applying visual cryptography to the watermark generates two shares. One share is used for embedding the watermark, and the other is used for resolving any dispute with the aid of a trusted authority. Use of the histogram shape makes the process more robust against geometric and signal processing attacks. The combination of visual cryptography, the Butterworth filter, the histogram, and entropy makes the algorithm more robust and imperceptible while protecting the owner's copyright.
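
The share-generation step can be sketched as follows (a compact XOR-based (2, 2) variant for a binary watermark, used here for illustration; classical visual cryptography schemes with pixel expansion work analogously):

```python
import numpy as np

def vc_shares(wm, seed=7):
    """Share 1 is random; share 2 copies it for white (0) watermark
    pixels and flips it for black (1), so XOR-stacking the two shares
    recovers the watermark exactly."""
    rng = np.random.default_rng(seed)
    share1 = rng.integers(0, 2, size=wm.shape)
    share2 = np.where(wm == 0, share1, 1 - share1)
    return share1, share2

wm = np.array([[0, 1], [1, 0]])
s1, s2 = vc_shares(wm)
print(np.array_equal(s1 ^ s2, wm))   # True: the share kept by the
                                     # trusted authority resolves disputes
```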

Keywords: digital watermarking, visual cryptography, histogram, Butterworth filter

Procedia PDF Downloads 324
27161 Comparative Analysis of Classical and Parallel Inpainting Algorithms Based on Affine Combinations of Projections on Convex Sets

Authors: Irina Maria Artinescu, Costin Radu Boldea, Eduard-Ionut Matei

Abstract:

The paper is a comparative study of two classical variants of parallel projection methods for solving the convex feasibility problem with their equivalents that involve variable weights in the construction of the solutions. We used a graphical representation of these methods for inpainting a convex area of an image in order to investigate their effectiveness in image reconstruction applications. We also presented a numerical analysis of the convergence of these four algorithms in terms of the average number of steps and execution time in classical CPU and, alternatively, in parallel GPU implementation.
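
The iteration under study has the generic form sketched below in Python (halfspaces stand in for the image-domain convex sets, and the "variable weights" rule shown is one simple choice):

```python
import numpy as np

def project_halfspace(x, a, b):
    """Projection onto the convex set {x : a.x <= b}."""
    v = a @ x - b
    return x if v <= 0 else x - v * a / (a @ a)

def parallel_projections(x, sets, iters=100, lam=1.0, variable=True):
    """x_{k+1} = x_k + lam * sum_i w_i (P_i x_k - x_k); with variable
    weights, the most-violated sets receive the larger weights."""
    for _ in range(iters):
        proj = np.array([project_halfspace(x, a, b) for a, b in sets])
        res = np.linalg.norm(proj - x, axis=1)
        w = (res / res.sum() if variable and res.sum() > 0
             else np.full(len(sets), 1.0 / len(sets)))
        x = x + lam * (w @ (proj - x))
    return x

sets = [(np.array([1.0, 0.0]), 1.0), (np.array([0.0, 1.0]), 1.0),
        (np.array([-1.0, -1.0]), 0.0)]
print(parallel_projections(np.array([5.0, 5.0]), sets))  # -> near (1, 1)
```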

Keywords: convex feasibility problem, convergence analysis, inpainting, parallel projection methods

Procedia PDF Downloads 138
27160 An Earth Mover’s Distance Algorithm Based DDoS Detection Mechanism in SDN

Authors: Yang Zhou, Kangfeng Zheng, Wei Ni, Ren Ping Liu

Abstract:

Software-defined networking (SDN) provides a solution for a scalable network framework with decoupled control and data planes. However, this architecture is also susceptible to a particular distributed denial-of-service (DDoS) attack that can affect or even overwhelm the SDN network. The DDoS attack detection problem has to date been mostly researched as an entropy comparison problem; however, such approaches make little use of SDN itself, and their results are not accurate. In this paper, we propose a DDoS attack detection method that interprets DDoS detection as a signature matching problem, formulated as an Earth Mover's Distance (EMD) model. Considering feasibility and accuracy, we further propose to define the cost function of the EMD as a generalized Kullback-Leibler divergence. Simulation results show that our proposed method can detect DDoS attacks by comparing EMD values with the ones computed in the attack-free case. Moreover, our method can significantly increase the true positive rate of detection.
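
A minimal sketch of the signature-comparison idea (using SciPy's 1-D EMD with its default ground cost for brevity; the paper's generalized Kullback-Leibler cost would replace it):

```python
import numpy as np
from scipy.stats import wasserstein_distance

def flow_signature(dst_ips, bins=16):
    """Histogram of destination addresses hashed into bins: a DDoS flood
    concentrates traffic on few targets and deforms this signature."""
    h = np.bincount([hash(ip) % bins for ip in dst_ips], minlength=bins)
    return h / h.sum()

baseline = flow_signature([f"10.0.{i % 8}.{i % 50}" for i in range(400)])
attack = flow_signature(["10.0.0.1"] * 350 +
                        [f"10.0.{i % 8}.{i % 50}" for i in range(50)])

support = np.arange(len(baseline))
print(wasserstein_distance(support, support, baseline, attack))
# raise an alarm when the EMD to the attack-free signature
# exceeds a calibrated threshold
```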

Keywords: DDoS detection, EMD, relative entropy, SDN

Procedia PDF Downloads 303
27159 Logical-Probabilistic Modeling of the Reliability of Complex Systems

Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia

Abstract:

The paper presents logical-probabilistic methods, models, and algorithms for the reliability assessment of complex systems, on the basis of which a web application for structural analysis and reliability assessment of systems was created. It is important to design systems based on structural analysis, research, and the evaluation of efficiency indicators. One of the important efficiency criteria is the reliability of the system, which depends on the components of the structure. Quantifying the reliability of large-scale systems is a computationally complex process, and it is advisable to perform it with the help of a computer. Logical-probabilistic modeling is one of the effective means of describing the structure of a complex system and quantitatively evaluating its reliability, and it formed the basis of our application. The reliability assessment process included the following stages, which are reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition by a logical function in disjunctive normal form (DNF); 4) transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of the logical elements with probabilistic elements in the ODNF, yielding a reliability estimation polynomial and a quantitative reliability value; 6) calculation of the "weights" of the elements of the system. Using the logical-probabilistic methods, models, and algorithms discussed in the paper, special software was created by means of which a quantitative assessment of the reliability of systems with a complex structure is produced. As a result, structural analysis of systems, research, and the design of systems with optimal structure are carried out.
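
For very small systems, the reliability polynomial and the element "weights" can be checked against brute-force state enumeration, as in the Python sketch below (the two shortest paths are a made-up example; the orthogonalization algorithm is precisely what makes large systems tractable, which enumeration is not):

```python
from itertools import product

PATHS = [{1, 2}, {1, 3, 4}]        # shortest paths of successful functioning

def system_reliability(p):
    """Sum the probabilities of all element states containing a complete
    path -- equivalent to evaluating the ODNF reliability polynomial."""
    r = 0.0
    for state in product([0, 1], repeat=len(p)):
        up = {i + 1 for i, s in enumerate(state) if s}
        if any(path <= up for path in PATHS):
            prob = 1.0
            for i, s in enumerate(state):
                prob *= p[i] if s else 1.0 - p[i]
            r += prob
    return r

p = [0.9, 0.8, 0.85, 0.7]
print(f"system reliability R = {system_reliability(p):.4f}")
for i in range(len(p)):            # element "weight" (Birnbaum importance)
    hi, lo = p.copy(), p.copy()
    hi[i], lo[i] = 1.0, 0.0
    print(f"element {i + 1}: {system_reliability(hi) - system_reliability(lo):.3f}")
```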

Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability of systems, “weights” of elements

Procedia PDF Downloads 38
27158 Real-Time Episodic Memory Construction for Optimal Action Selection in Cognitive Robotics

Authors: Deon de Jager, Yahya Zweiri, Dimitrios Makris

Abstract:

The three most important components in the cognitive architecture for cognitive robotics are memory representation, memory recall, and the action selection performed by the executive. In this paper, action selection, performed by the executive, is defined as a memory quantification and optimization process. The methodology describes the real-time construction of episodic memory through semantic memory optimization. The optimization is performed by set-based particle swarm optimization, using an adaptive entropy memory quantification approach for fitness evaluation. The performance of the approach is experimentally evaluated by simulation, where a UAV is tasked with the collection and delivery of a medical package. The experiments show that the UAV dynamically uses the episodic memory to autonomously control its velocity while successfully completing its mission.

Keywords: cognitive robotics, semantic memory, episodic memory, maximum entropy principle, particle swarm optimization

Procedia PDF Downloads 119
27157 Prediction of the Thermodynamic Properties of Hydrocarbons Using Gaussian Process Regression

Authors: N. Alhazmi

Abstract:

Knowing the thermodynamic properties of hydrocarbons is vital when it comes to analyzing the outcomes of related chemical reactions and understanding the reaction process, especially in terms of petrochemical industrial applications, combustion, and catalytic reactions. However, measuring thermodynamic properties experimentally is time-consuming and costly. In this paper, Gaussian process regression (GPR) has been used to directly predict the main thermodynamic properties - standard enthalpy of formation, standard entropy, and heat capacity - for more than 360 cyclic and non-cyclic alkanes, alkenes, and alkynes. A simple workflow is proposed that can be applied to directly predict the main properties of any hydrocarbon from its descriptors and chemical structure, and that can be generalized to predict the main properties of any material. The model was evaluated with the coefficient of determination, R², which was more than 0.9794 for all the predicted properties.
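
A minimal version of the workflow with scikit-learn (the descriptors and training values below are a toy stand-in for the 360-molecule data set; the enthalpies are approximate literature values):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# descriptors: [carbon count, hydrogen count, rings, double bonds]
X = np.array([[2, 6, 0, 0], [3, 8, 0, 0], [4, 10, 0, 0],
              [2, 4, 0, 1], [3, 6, 0, 1], [6, 6, 1, 3]], dtype=float)
# standard enthalpy of formation, kJ/mol
y = np.array([-84.0, -104.7, -125.6, 52.4, 20.0, 82.9])

kernel = ConstantKernel(1.0) * RBF(length_scale=np.ones(X.shape[1]))
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True,
                               alpha=1e-2).fit(X, y)

mean, std = gpr.predict(np.array([[5.0, 12.0, 0.0, 0.0]]), return_std=True)
print(f"predicted enthalpy: {mean[0]:.1f} +/- {std[0]:.1f} kJ/mol")
print(f"training R^2: {gpr.score(X, y):.4f}")
```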

Keywords: thermodynamic, Gaussian process regression, hydrocarbons, regression, supervised learning, entropy, enthalpy, heat capacity

Procedia PDF Downloads 188
27156 Multivariate Analysis on Water Quality Attributes Using Master-Slave Neural Network Model

Authors: A. Clementking, C. Jothi Venkateswaran

Abstract:

Mathematical and computational functionalities such as descriptive mining, optimization, and prediction are espoused to support natural resource planning, and optimization techniques are adopted for water quality prediction and for determining the influence of its attributes. Water properties are tainted when one water resource is merged with another. This work aimed to predict the connectivity of water resource distribution, in accordance with water quality and sediment, using an innovative master-slave back-propagation neural network model. The experimental results were arrived at by collecting water quality attributes, computing the water quality index, designing and developing a neural network model to determine water quality and sediment, applying the master-slave back-propagation model to determine variations in water quality and sediment attributes between water resources, and making recommendations for connectivity. Homogeneous and parallel biochemical reactions influence water quality and sediment while water is distributed from one location to another. Therefore, an innovative master-slave neural network model [M(9:9:2)::S(9:9:2)] was designed and developed to predict the attribute variations. The training dataset is given as input to the master model, and its maximum weights are passed as input to the slave model to predict the water quality. The developed master-slave model predicted physicochemical attribute weight variations for 85% to 90% of the water quality target values. The sediment level variations were also predicted, from 0.01% to 0.05% of each water quality percentage. The model produced significant variations in the physicochemical attribute weights. Based on the predicted experimental weight variations on the training data set, effective recommendations are made to connect the different resources.

Keywords: master-slave back-propagation neural network model (MSBPNNM), water quality analysis, multivariate analysis, environmental mining

Procedia PDF Downloads 444
27155 Cold Spray High Entropy Alloy Coating Surface Microstructural Characterization and Mechanical Testing

Authors: Raffaella Sesana, Nazanin Sheibanian, Luca Corsaro, Sedat Özbilen, Rocco Lupoi, Francesco Artusio

Abstract:

High Entropy Alloy (HEA) coatings of Al₀.₁₋₀.₅CoCrCuFeNi and MnCoCrCuFeNi on Mg substrates were prepared from mechanically alloyed HEA powder feedstocks at three different Cold Spray (CS) process gas (N₂) temperatures (650, 750, and 850°C). The mechanically alloyed and cold-sprayed HEA coatings were characterized by macro photography, OM, SEM+EDS, micro-hardness testing, and roughness and porosity measurements. As a result of mechanical alloying (MA), harder particles are deformed and fractured. The particles in the Cu-rich region were coarser and more globular than those in the A1 phase, which is relatively soft and ductile. In addition to the A1 particles, there were some separate Cu-rich regions. Due to the brittle nature of the powder and its acicular shape, the Mn-HEA powder exhibited a different trend, with smaller particle sizes. It is observed that MA results in a loose structure characterized by many gaps, cracks, signs of plastic deformation, and small particles attached to the particle surface. Considering the experimental results obtained, it is not possible to conclude that the chemical composition of the high entropy alloy influences the roughness of the coating. It was observed that the deposited volume increases with temperature only in the case of the Al₀.₁ and Mg-based HEA, while for the rest of the Al-based HEAs there are no noticeable changes. There is a direct correlation between micro-hardness and the chemical composition of a coating: the micro-hardness of a coating increases as the percentage of aluminium in the sample increases. Compared to the substrate, the coating has a much higher hardness, and the hardness measured at the interface is intermediate.

Keywords: characterisation, cold spraying, HEA coatings, SEM+EDS

Procedia PDF Downloads 33
27154 The Bayesian Premium Under Entropy Loss

Authors: Farouk Metiri, Halim Zeghdoudi, Mohamed Riad Remita

Abstract:

Credibility theory is an experience rating technique in actuarial science and one of the quantitative tools that allow insurers to perform experience rating, that is, to adjust future premiums based on past experience. It is usually used in automobile insurance, workers' compensation premiums, and IBNR (incurred but not reported) claims, where credibility theory can be used to estimate the claim size amount. In this study, we focus on a popular tool in credibility theory, the Bayesian premium estimator, considering the Lindley distribution as the claim distribution. We derive this estimator under entropy loss, which is asymmetric, and under squared error loss, which is symmetric, with both informative and non-informative priors. In a purely Bayesian setting, the prior distribution represents the insurer's belief about the insured's risk level, which is updated upon collection of the insured's data at the end of the period. However, the explicit form of the Bayesian premium, in the case when the prior is not a member of the exponential family, can be quite difficult to obtain, as it involves a number of integrations that are not analytically solvable. The paper solves this problem by deriving the estimator using a numerical approximation (the Lindley approximation), which is one of the suitable approximation methods for such problems: it approximates the ratio of the integrals as a whole and produces a single numerical result. A simulation study using the Monte Carlo method is then performed to evaluate this estimator, and the mean squared error criterion is used to compare the Bayesian premium estimator under the above loss functions.
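
For reference, the entropy loss in question and the corresponding Bayes estimators take the standard forms (the Lindley approximation then handles the posterior expectations when they are not available in closed form):

```latex
L(\hat{\theta},\theta) \propto \frac{\hat{\theta}}{\theta}
   - \ln\frac{\hat{\theta}}{\theta} - 1,
\qquad
\hat{\theta}_{E} = \Big[\mathbb{E}\big(\theta^{-1}\mid \text{data}\big)\Big]^{-1},
\qquad
\hat{\theta}_{SE} = \mathbb{E}\big(\theta \mid \text{data}\big).
```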

Keywords: Bayesian estimator, credibility theory, entropy loss, Monte Carlo simulation

Procedia PDF Downloads 298
27153 A PROMETHEE-BELIEF Approach for Multi-Criteria Decision Making Problems with Incomplete Information

Authors: H. Moalla, A. Frikha

Abstract:

Multi-criteria decision aid methods consider decision problems in which numerous alternatives are evaluated on several criteria. These methods assume perfect information. In practice, however, this information requirement is clearly too strict. In fact, the imperfect data provided by more or less reliable decision makers usually affect decision results, since any decision is closely linked to the quality and availability of information. In this paper, a PROMETHEE-BELIEF approach is proposed to support multi-criteria decisions based on incomplete information. This approach handles problems with an incomplete decision matrix and unknown weights within the PROMETHEE method. On the basis of belief function theory, our approach first determines the distributions of belief masses based on PROMETHEE's net flows and then calculates the weights. Subsequently, it aggregates the mass distributions associated with each criterion using Murphy's modified combination rule in order to infer a global belief structure. The final ranking of actions is obtained via the pignistic probability transformation. A real-world case study concerning the location of a treatment center for healthcare waste with infectious risk in the center of Tunisia illustrates the detailed process of the PROMETHEE-BELIEF approach.

Keywords: belief function theory, incomplete information, multiple criteria analysis, PROMETHEE method

Procedia PDF Downloads 134
27152 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem

Authors: Ouafa Amira, Jiangshe Zhang

Abstract:

Clustering is an unsupervised machine learning technique; its aim is to extract the data structures in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means (FCM) clustering is one of the most well-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function with values in the interval [0, 1]. In FCM clustering, the membership degrees are constrained so that the sum of a data object's memberships over all clusters equals one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization has therefore been incorporated into the fuzzy c-means technique: it introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by the relative entropy approach, where in our optimization problem we aim to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because appropriate membership degrees lead to accurate clustering results. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy.
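
For intuition, a minimal Python sketch of the closely related (Shannon) entropy-regularized FCM, whose membership update is available in closed form (the relative entropy regularizer of the paper modifies this update; lam is an assumed regularization strength):

```python
import numpy as np

def entropy_regularized_fcm(X, k=2, lam=2.0, iters=100, seed=0):
    """Minimize sum_ij u_ij d_ij^2 + lam * sum_ij u_ij log u_ij subject
    to sum_j u_ij = 1: the memberships become Gibbs weights of the
    squared distances, and centers are membership-weighted means."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        U = np.exp(-d2 / lam)
        U /= U.sum(axis=1, keepdims=True)      # closed-form memberships
        centers = (U.T @ X) / U.sum(axis=0)[:, None]
    return centers, U

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
centers, U = entropy_regularized_fcm(X)
print(centers.round(2))    # recovers the two cluster centers
```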

Keywords: clustering, fuzzy c-means, regularization, relative entropy

Procedia PDF Downloads 240