Search results for: weighted permutation entropy (WPE)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 896

686 The Analysis of Changes in Urban Hierarchy of Isfahan Province in the Fifty-Year Period (1956-2006)

Authors: Hamidreza Joudaki, Yousefali Ziari

Abstract:

The emergence of cities and urbanism is one of the important processes shaping human communities. Industrialization and urbanization have developed alongside each other throughout history, and cities have existed for more than six thousand years. In the 18th century, with the rise of industrial capitalism, urbanism developed rapidly around the world. In Iran, each regional capital was historically the sole administrative centre, controlling its territory without any urban hierarchy. Over the last three decades, however, political, social, and economic changes have altered rural-urban relationships, the functional variety of cities, and the structure of the national urban network; today, the urban system shows severe spatial and functional imbalance. The trend of urbanization in Isfahan Province resembles the rest of Iran, and its urban hierarchy is neither balanced nor normal. This article is a quantitative and analytical study. The statistical population comprises the cities of Isfahan Province, and changes in the urban network and its hierarchy over a fifty-year period (1956-2006) are surveyed. The data were analyzed using the rank-size model and the entropy index, and the primacy of the leading city and the urban hierarchy of Isfahan Province are examined. The share of urban residents in the province rose from 55% to 83% (2006). The analysis reflects a mismatch and imbalance between the cities: the entropy index fell from 0.91 in 1956 to 0.63 in 2006, and Isfahan remained the primate city throughout the period. Moreover, there is a large population gap between the second- and third-ranked cities and the remaining cities, so the province does not follow the rank-size rule.
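The entropy index used here can be illustrated as a normalized Shannon entropy over city-population shares, where 1 indicates a perfectly balanced hierarchy and values near 0 indicate primacy (the populations below are hypothetical, not the Isfahan data):

```python
import math

def urban_entropy_index(populations):
    """Normalized Shannon entropy of city population shares.
    1.0 = perfectly balanced hierarchy; values near 0 = strong primacy."""
    total = sum(populations)
    shares = [p / total for p in populations]
    h = -sum(s * math.log(s) for s in shares if s > 0)
    return h / math.log(len(populations))

# Hypothetical hierarchies: one balanced, one dominated by a primate city
balanced = urban_entropy_index([100, 100, 100, 100])
primate = urban_entropy_index([1000, 100, 50, 20])
```

A falling index over time, as reported for Isfahan Province, signals growing concentration in the primate city.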

Keywords: urban network, urban hierarchy, primate city, Isfahan province, urbanism, first cities

Procedia PDF Downloads 230
685 Proposals of Exposure Limits for Infrasound From Wind Turbines

Authors: M. Pawlaczyk-Łuszczyńska, T. Wszołek, A. Dudarewicz, P. Małecki, M. Kłaczyński, A. Bortkiewicz

Abstract:

Human tolerance to infrasound is governed by the hearing threshold: infrasound that cannot be heard (or felt) is not annoying and is not thought to have any other adverse health effects. Recent research has largely confirmed earlier findings. ISO 7196:1995 recommends the G-weighting characteristic for the assessment of infrasound, and there is a strong correlation between G-weighted SPL and perceived annoyance. The aim of this study was to propose exposure limits for infrasound from wind turbines. Only a few countries have set limits for infrasound; these limits are usually no higher than 85-92 dBG, and none of them are specific to wind turbines. Over the years, a number of studies have been carried out to determine hearing thresholds below 20 Hz. It has been found that 10% of young people can perceive a 10 Hz tone at around 90 dB, and that the difference in median hearing thresholds between young adults aged around 20 years and older adults aged over 60 years is around 10 dB, irrespective of frequency. This shows that older people (up to about 60 years of age) retain good hearing in the low-frequency range, while their sensitivity to higher frequencies is often significantly reduced. In terms of exposure limits, the average hearing threshold corresponds to a tone with a G-weighted SPL of about 96 dBG, whereas infrasound at Lp,G levels below 85-90 dBG is usually inaudible. The individual hearing threshold can therefore be 10-15 dB lower than the average threshold, so the recommended limits for environmental infrasound could be 75 dBG or 80 dBG. It is worth noting that the G86 curve has been taken as the threshold of auditory perception of infrasound reached by 90-95% of the population, so the G75 and G80 curves can be taken as criterion curves for wind turbine infrasound.
Finally, two assessment methods and corresponding exposure limit values have been proposed for wind turbine infrasound: Method I, based on G-weighted sound pressure level measurements, and Method II, based on frequency analysis in 1/3-octave bands in the range 4-20 Hz. Separate limit values have been set for outdoor living areas in the open countryside (Area A) and for noise-sensitive areas (Area B). For Method I, infrasound limit values of 80 dBG (Area A) and 75 dBG (Area B) have been proposed, while for Method II the criterion curves G80 and G75 have been chosen (for Areas A and B, respectively).

Keywords: infrasound, exposure limit, hearing thresholds, wind turbines

Procedia PDF Downloads 55
684 E-Hailing Taxi Industry Management Mode Innovation Based on the Credit Evaluation

Authors: Yuan-lin Liu, Ye Li, Tian Xia

Abstract:

There are shortcomings in China's existing taxi management modes. This paper suggests establishing a third-party comprehensive information management platform and puts forward a credit-based evaluation model. Four indicators are used to evaluate a driver's credit: the passengers' evaluation score, a driving-behaviour evaluation, the driver's average number of bad records, and a personal credit score. A weighted clustering method is used to assign a credit level to each taxi driver. The taxi industry is then managed on the basis of credit level: a driver's grade follows from the credit rating, which in turn determines costs, income level, market access, the validity period of the licence, wages and bonuses, as well as violation fines. These measures make the credit evaluation effective. In conclusion, more credit data will help to build a more accurate and detailed classification standard library.
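A minimal sketch of combining the four indicators into a weighted credit score; the weights, the inversion of the bad-record count, and the A/B/C banding thresholds are illustrative assumptions, not values from the paper:

```python
def driver_credit_score(passenger_score, behaviour_score, bad_records, personal_credit,
                        weights=(0.4, 0.3, 0.1, 0.2)):
    """Weighted aggregate of the four indicators (all assumed rescaled to 0-100).
    The bad-record count is inverted so that fewer records give a higher score."""
    indicators = (passenger_score, behaviour_score, 100 - bad_records, personal_credit)
    return sum(w * x for w, x in zip(weights, indicators))

def credit_level(score):
    # Hypothetical banding into A/B/C credit levels
    return "A" if score >= 85 else "B" if score >= 70 else "C"
```

In the paper's scheme a clustering step, rather than fixed thresholds, would derive the level boundaries from the data.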

Keywords: credit, mobile internet, e-hailing taxi, management mode, weighted cluster

Procedia PDF Downloads 293
683 The Moments of the Optimal Average Run Length of the Multivariate Exponentially Weighted Moving Average Control Chart for Equally Correlated Variables

Authors: Edokpa Idemudia Waziri, Salisu S. Umar

Abstract:

Hotelling's T^2 is a well-known statistic for detecting a shift in the mean vector of a multivariate normal distribution, and control charts based on T^2 have been widely used in statistical process control for monitoring multivariate processes. Although it is a powerful tool, the T^2 statistic is deficient when the shift to be detected in the mean vector is small and consistent. The Multivariate Exponentially Weighted Moving Average (MEWMA) control chart is one of the control statistics used to overcome this drawback. In this paper, the probability distribution of the run length of the MEWMA control chart, when the quality characteristics exhibit substantial cross-correlation and the process is in-control or out-of-control, was derived using a Markov chain algorithm. The probability functions and the moments of the run-length distribution were also obtained, and they are consistent with existing results for the in-control and out-of-control situations. By simulation, the procedure identified a class of Average Run Lengths (ARLs) for the MEWMA chart in both situations. We observed that the MEWMA scheme is quite adequate for detecting a small shift and a good way to improve the quality of goods and services in a multivariate setting. We also observed that as the in-control average run length ARL0 or the number of variables p increases, the optimal ARL increases asymptotically, and as the magnitude of the shift σ increases, the optimal ARL decreases. Finally, we use an example from the literature to illustrate the method and demonstrate its efficiency.
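As a sketch of the statistic underlying the chart (not the authors' Markov-chain derivation), the MEWMA vector Z_i = λX_i + (1-λ)Z_{i-1} is monitored through T_i^2 = Z_i' Σ_{Z_i}^{-1} Z_i; a minimal implementation, assuming a known in-control covariance and zero in-control mean:

```python
import numpy as np

def mewma_statistics(X, sigma, lam=0.1):
    """MEWMA plotting statistic T_i^2 for each observation vector.
    X: (n, p) observations with in-control mean zero; sigma: (p, p) covariance."""
    n, p = X.shape
    z = np.zeros(p)
    stats = []
    for i, x in enumerate(X, start=1):
        z = lam * x + (1 - lam) * z
        # Exact (time-varying) covariance of the EWMA vector
        cov_z = (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * i)) * sigma
        stats.append(float(z @ np.linalg.solve(cov_z, z)))
    return np.array(stats)
```

A signal is raised when T_i^2 exceeds a control limit chosen to give the desired in-control ARL.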

Keywords: average run length, markov chain, multivariate exponentially weighted moving average, optimal smoothing parameter

Procedia PDF Downloads 394
682 Analysis of Spatial Heterogeneity of Residential Prices in Guangzhou: An Empirical Study Based on a POI Geographically Weighted Regression Model

Authors: Zichun Guo

Abstract:

Guangzhou's housing prices have long lagged behind those of China's other three major cities. As Guangzhou's housing-price ladder steepens, the factors influencing housing prices have gradually attracted attention. This article uses housing price data and point-of-interest (POI) data, together with the Kriging spatial interpolation method and a geographically weighted regression model, to explore the spatial distribution of housing prices and the influence of POI factors. The effects are especially marked in Huadu District and the city centre; the response in surrounding areas may be related to housing positioning, and economic POIs close to the city centre show stronger responses. Identifying the factors affecting housing prices in this way supports management and macro-control by the relevant departments, better meets the demand for home purchases, and aids financing-side reform.

Keywords: housing prices, spatial heterogeneity, Guangzhou, POI

Procedia PDF Downloads 17
681 Image Encryption Using Eureqa to Generate an Automated Mathematical Key

Authors: Halima Adel Halim Shnishah, David Mulvaney

Abstract:

Traditional symmetric cryptography algorithms rely on secret keys whose strength provides immunity against different attacks. One popular technique for generating automated secret keys is evolutionary computing using the Eureqa API tool, which gained attention in 2013. In this paper, we generate automated secret keys for image encryption and decryption using the Eureqa API. Eureqa models pseudo-random input data obtained from a suitable source to generate secret keys. The validity of the generated secret keys is investigated by performing various statistical tests (histogram, chi-square, correlation of two adjacent pixels, correlation between original and encrypted images, entropy, and key sensitivity). Experimental results obtained from these methods, including histogram analysis, correlation coefficients, entropy, and key sensitivity, show that the proposed image encryption algorithms are secure and reliable, with the potential to be adapted for secure image communication applications.
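Two of the statistical tests mentioned, image entropy and adjacent-pixel correlation, can be sketched generically (independent of the Eureqa-based key generation); a good cipher image should have entropy near 8 bits and near-zero adjacent-pixel correlation:

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy in bits of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def adjacent_correlation(img):
    """Correlation coefficient between horizontally adjacent pixel pairs."""
    a, b = img[:, :-1].ravel(), img[:, 1:].ravel()
    return float(np.corrcoef(a, b)[0, 1])

rng = np.random.default_rng(42)
cipher = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # stand-in for an encrypted image
```

A natural image typically shows adjacent-pixel correlation above 0.9, so a drop to near zero after encryption indicates successful decorrelation.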

Keywords: image encryption algorithms, Eureqa, statistical measurements, automated key generation

Procedia PDF Downloads 461
680 Two-Dimensional Hardy-Type Inequalities on Time Scales via the Steklov Operator

Authors: Wedad Albalawi

Abstract:

Mathematical inequalities are at the core of mathematical analysis and are used in almost all branches of mathematics as well as in various areas of science and engineering. The inequalities of Hardy, Littlewood, and Polya were the first significant synthesis of the subject; their work presented fundamental ideas, results, and techniques and has had great influence on research in various branches of analysis. Since 1934, numerous inequalities have been produced and studied in the literature, and some have been formulated in terms of operators. In 1989, weighted Hardy inequalities were obtained for integration operators; weighted estimates were then obtained for Steklov operators, which are used in the solution of the Cauchy problem for the wave equation. These results were improved in 2011 to include the boundedness of integral operators from a weighted Sobolev space to a weighted Lebesgue space. Several inequalities have been demonstrated and improved using the Hardy-Steklov operator, and recently many integral inequalities have been refined via differential operators. The Hardy inequality is one of the tools used to study the integrability of solutions of differential equations. Dynamic inequalities of Hardy and Copson type have been extended and improved by various integral operators; such inequalities are interesting to apply in different fields of mathematics (function spaces, partial differential equations, mathematical modeling). New special versions involving Copson and Hardy inequalities have appeared on time scales. A time scale is a closed subset of the real numbers; inequalities in the time-scale setting have received a great deal of attention and form a major field in both pure and applied mathematics.
There are many applications of dynamic equations on time scales in quantum mechanics, electrical engineering, neural networks, heat transfer, combinatorics, and population dynamics. This study focuses on double integrals to obtain new time-scale inequalities of Copson type driven by the Steklov operator; they can be applied in the solution of the Cauchy problem for the wave equation. The proofs are carried out by imposing restrictions on the operator in several cases, and the obtained inequalities rely on standard tools of time-scale calculus such as Fubini's theorem and Hölder's inequality.
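For orientation, the classical integral Hardy inequality that these weighted, operator-based, and time-scale versions generalize can be stated as:

```latex
\int_0^\infty \left( \frac{1}{x}\int_0^x f(t)\,dt \right)^{p} dx
  \;\le\; \left( \frac{p}{p-1} \right)^{p} \int_0^\infty f^{p}(x)\,dx,
  \qquad p > 1,\quad f \ge 0,
```

where the constant (p/(p-1))^p is best possible; weighted versions replace the measures dx on the two sides by weight functions u(x)dx and v(x)dx.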

Keywords: time scales, inequality of Hardy, inequality of Copson, Steklov operator

Procedia PDF Downloads 51
679 Gray Level Image Encryption

Authors: Roza Afarin, Saeed Mozaffari

Abstract:

The aim of this paper is image encryption using a Genetic Algorithm (GA). The proposed encryption method consists of two phases. In the modification phase, pixel locations are altered to reduce the correlation among adjacent pixels; pixel values are then changed in the diffusion phase to encrypt the input image. Both phases are performed by a GA with binary chromosomes. For the modification phase, the binary patterns are generated by the Local Binary Pattern (LBP) operator, while for the diffusion phase the binary chromosomes are obtained by Bit Plane Slicing (BPS). The initial population in the GA consists of the rows and columns of the input image. Instead of a subjective selection of parents from this initial population, a random generator with a predefined key is utilized, which makes it possible to decrypt the coded image and reconstruct the initial input image. The fitness function is defined as the average number of 0-to-1 transitions in the LBP image for the modification phase and as histogram uniformity for the diffusion phase. The randomness of the encrypted image is measured by entropy, correlation coefficients, and histogram analysis. Experimental results show that the proposed method is fast enough and can be used effectively for image encryption.
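The 3x3 LBP operator used to generate the binary patterns can be sketched as follows (a basic formulation; the neighbour ordering is a convention, here clockwise from the top-left):

```python
import numpy as np

def local_binary_pattern(img):
    """3x3 LBP: each interior pixel is encoded as an 8-bit code obtained by
    thresholding its 8 neighbours against the centre value."""
    out = np.zeros((img.shape[0] - 2, img.shape[1] - 2), dtype=np.uint8)
    # Clockwise neighbour offsets starting at the top-left corner
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    centre = img[1:-1, 1:-1]
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        out |= ((neigh >= centre).astype(np.uint8) << bit)
    return out
```

On a uniform region every neighbour equals the centre, so all eight bits are set and the code saturates at 255.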

Keywords: correlation coefficients, genetic algorithm, image encryption, image entropy

Procedia PDF Downloads 304
678 Optimization of Smart Beta Allocation by Momentum Exposure

Authors: J. B. Frisch, D. Evandiloff, P. Martin, N. Ouizille, F. Pires

Abstract:

Smart Beta strategies aim to be an asset-management revolution relative to classical cap-weighted indices. These strategies allow better control of portfolio risk factors and an optimized asset allocation, by taking into account specific risks or by seeking to generate alpha through outperformance of the index ('Beta'). Among the many strategies in use, this paper focuses on four: the Minimum Variance Portfolio, the Equal Risk Contribution Portfolio, the Maximum Diversification Portfolio, and the Equal-Weighted Portfolio. Their efficiency has been demonstrated under conditions such as momentum or other market phenomena, suggesting a reconsideration of cap-weighting.
To further increase strategy return efficiency, it is proposed here to compare their strengths and weaknesses inside time intervals corresponding to specific identifiable market phases, in order to define adapted strategies for pre-specified situations.
Results are presented as performance curves for different combinations compared to a benchmark; if a combination outperforms the applicable benchmark in well-defined actual market conditions, it is preferred. It is shown that such investment 'rules', based on both historical data and the evolution of Smart Beta strategies, and implemented according to available market data, provide very interesting optimal results with higher return performance and lower risk. Such combinations have not been fully exploited yet, which justifies the present approach aimed at identifying the relevant elements characterizing them.
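Two of the four strategies can be sketched directly: the unconstrained minimum-variance portfolio has the closed form w proportional to the inverse covariance times the ones vector, while the equal-weighted portfolio needs no estimation at all. The covariance matrix below is illustrative, not from the paper:

```python
import numpy as np

def min_variance_weights(cov):
    """Analytic unconstrained minimum-variance portfolio: w ∝ Σ^{-1} 1,
    normalized so the weights sum to one (full investment)."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

def portfolio_vol(w, cov):
    """Portfolio volatility sqrt(w' Σ w)."""
    return float(np.sqrt(w @ cov @ w))

cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])  # illustrative annualized covariance
w_mv = min_variance_weights(cov)
w_ew = np.ones(3) / 3
```

By construction the minimum-variance weights achieve a volatility no higher than the equal-weighted portfolio on the same covariance estimate; the paper's contribution is switching among such strategies by market phase.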

Keywords: smart beta, minimum variance portfolio, equal risk contribution portfolio, maximum diversification portfolio, equal weighted portfolio, combinations

Procedia PDF Downloads 317
677 Spatial Analysis of Flood Vulnerability in Highly Urbanized Area: A Case Study in Taipei City

Authors: Liang Weichien

Abstract:

Without adequate information and mitigation plans for natural disasters, the risk to densely populated urban areas will increase as populations grow, especially in Taiwan. Taiwan is recognized as one of the world's highest-risk areas, experiencing an average of 5.7 floods per year, and its cities should seek to strengthen coherence and consensus on how to plan for floods and climate change. This study therefore aims at understanding the vulnerability to flooding in Taipei City, Taiwan, by creating indicators and calculating the vulnerability of each study unit. The indicators were grouped into sensitivity and adaptive capacity, following the Intergovernmental Panel on Climate Change definition of vulnerability, and were weighted using Principal Component Analysis (PCA). Most current research, however, assumes that the composition and influence of the indicators are the same in different areas, disregarding spatial correlation and risking inaccurate explanations of local vulnerability. This study used Geographically Weighted Principal Component Analysis (GWPCA), adding a geographic weighting matrix, to capture the different dominant flood-impact characteristics of different areas. The cross-validation method and the Akaike Information Criterion were used to decide the bandwidth, with a Gaussian pattern as the bandwidth weighting scheme. The outcome can be used to reduce damage potential by integrating the outputs into local mitigation plans and urban planning.
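The global PCA weighting step can be sketched as scoring each study unit by the first principal component of its standardized indicators; GWPCA would repeat this with a geographic weighting matrix at each location. This is a generic sketch on synthetic data, not the study's code:

```python
import numpy as np

def pca_weighted_index(X):
    """Score each unit (row) by the first principal component of its
    standardized indicators, a common PCA weighting scheme."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    cov = np.cov(Z, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    w = eigvec[:, -1]              # loadings of the largest eigenvalue
    if w.sum() < 0:                # fix the sign for interpretability
        w = -w
    return Z @ w
```

Units scoring high on all indicators end up at the top of the index, which is the behaviour a composite vulnerability score needs.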

Keywords: flood vulnerability, geographically weighted principal components analysis, GWPCA, highly urbanized area, spatial correlation

Procedia PDF Downloads 269
676 Hybrid Direct Numerical Simulation and Large Eddy Simulating Wall Models Approach for the Analysis of Turbulence Entropy

Authors: Samuel Ahamefula

Abstract:

Turbulent motion is a highly nonlinear and complex phenomenon, and its modelling is still very challenging. In this study, we developed a hybrid computational approach to accurately simulate fluid turbulence. The focus is on the coupling and transitioning between Direct Numerical Simulation (DNS) and Large Eddy Simulating Wall Models (LES-WM) regions. In the framework, high-order, high-fidelity fluid-dynamical methods are utilized to simulate the unsteady compressible Navier-Stokes equations in the Eulerian formulation on unstructured moving grids. The coupling and transitioning of the DNS and LES-WM regions are conducted through a linearly staggered Dirichlet-Neumann coupling scheme. The framework is verified and validated against the ability of DNS to capture the full range of turbulent scales with accurate results, and the efficiency of LES-WM in simulating the near-wall turbulent boundary layer using wall models.

Keywords: computational methods, turbulence modelling, turbulence entropy, navier-stokes equations

Procedia PDF Downloads 71
675 Genetic Algorithm for In-Theatre Military Logistics Search-and-Delivery Path Planning

Authors: Jean Berger, Mohamed Barkaoui

Abstract:

Discrete search path planning in a time-constrained, uncertain environment relying upon imperfect sensors is known to be hard, and the problem-solving techniques proposed so far to compute near real-time efficient path plans are mainly limited to providing few-move solutions. A new information-theoretic, open-loop decision model that explicitly incorporates false-alarm sensor readings is presented to solve a single-agent military logistics search-and-delivery path planning problem with anticipated feedback. The decision model consists of minimizing expected entropy over a given time horizon, considering anticipated possible observation outcomes. The model captures the uncertainty associated with observation events for all possible scenarios; entropy here measures uncertainty about the searched target's location. Feedback information resulting from possible sensor observation outcomes along the projected path plan is exploited to update anticipated unit target occupancy beliefs. For the first time, a compact belief-update formulation is generalized to explicitly include false-positive observation events that may occur during plan execution. A novel genetic algorithm is then proposed to efficiently solve search path planning, providing near-optimal solutions for practical, realistic problem instances. Given the run-time performance of the algorithm, a natural extension to a closed-loop environment that progressively integrates real visit outcomes on a rolling time horizon can easily be envisioned. Computational results show the value of the approach in comparison to alternative heuristics.
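The belief update with false positives can be sketched as a plain Bayes rule over candidate cells, with entropy as the uncertainty measure; the detection and false-alarm probabilities below are illustrative assumptions, not the paper's values:

```python
import math

def update_beliefs(beliefs, cell, detection, pd=0.8, pfa=0.1):
    """Bayes update of target-location beliefs after sensing `cell`.
    pd = probability of detection, pfa = false-alarm probability (assumed)."""
    post = []
    for i, b in enumerate(beliefs):
        like = (pd if i == cell else pfa) if detection else \
               ((1 - pd) if i == cell else (1 - pfa))
        post.append(like * b)
    total = sum(post)
    return [p / total for p in post]

def entropy(beliefs):
    """Shannon entropy (bits) of the belief vector."""
    return -sum(b * math.log2(b) for b in beliefs if b > 0)
```

A planner of the kind described would choose the path whose anticipated observation outcomes minimize the expected value of this entropy.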

Keywords: search path planning, false alarm, search-and-delivery, entropy, genetic algorithm

Procedia PDF Downloads 338
674 Development of Impressive Tensile Properties of Hybrid Rolled Ta0.5Nb0.5Hf0.5ZrTi1.5 Refractory High Entropy Alloy

Authors: M. Veeresham

Abstract:

The microstructure, texture, phase stability, and tensile properties of the annealed Ta0.5Nb0.5Hf0.5ZrTi1.5 alloy have been investigated in the present research. The alloy was severely hybrid-rolled up to a 93.5% thickness reduction, and the rolled samples were subsequently annealed at 800 °C and 1000 °C for 1 h. The rolled condition and both annealed conditions have a body-centered cubic (BCC) structure. Quantitative texture measurements (orientation distribution function (ODF) analysis) and microstructural examinations (analytical electron backscatter diffraction (EBSD) maps) made it possible to establish a good relationship between the annealing texture and the microstructure, and a universal testing machine (UTM) was utilized to obtain the mechanical properties. An impressive combination of room-temperature tensile properties, with a tensile strength of 1380 MPa and an elongation of 24.7%, is achieved for the 800 °C heat-treated condition. The coarse microstructure observed after annealing at 1000 °C is ascribed to the influence of the higher thermal energy.

Keywords: refractory high entropy alloys, hybrid-rolling, recrystallization, microstructure, tensile properties

Procedia PDF Downloads 113
673 Determination of the Cooling Rate Dependency of High Entropy Alloys Using a High-Temperature Drop-on-Demand Droplet Generator

Authors: Saeedeh Imani Moqadam, Ilya Bobrov, Jérémy Epp, Nils Ellendt, Lutz Mädler

Abstract:

High entropy alloys (HEAs), which have adjustable properties and enhanced stability compared with intermetallic compounds, are solid-solution alloys that contain five or more principal elements in almost equal atomic percentages. The concept of producing such alloys paves the way for developing advanced materials with unique properties. However, the synthesis of such alloys may require advanced processes with high cooling rates, depending on which alloying elements are used. In this study, microspheres of different diameters of HEAs were generated via a drop-on-demand droplet generator and subsequently solidified during free fall in an argon atmosphere. Such droplet generators can generate individual droplets with high reproducibility regarding droplet diameter, trajectory, and cooling, while avoiding any interparticle momentum or thermal coupling. Metallography as well as X-ray diffraction investigations for each diameter of the generated metallic droplets were then carried out to obtain information about the microstructural state. To calculate the cooling rate of the droplets, a droplet cooling model was developed and validated using model alloys such as CuSn6 and AlCu4.5, for which the correlation between secondary dendrite arm spacing (SDAS) and cooling rate is well known. Droplets were generated from these alloys and their SDAS was determined using quantitative metallography; the cooling rate was then determined from the SDAS and used to validate the cooling rates obtained from the droplet cooling model. Applying the model to the HEA then yields the cooling-rate dependency and hence the identification of process windows for the synthesis of these alloys. These process windows were compared with the cooling rates obtained in processes such as powder production, spray forming, selective laser melting, and casting to predict whether synthesis is possible with these processes.

Keywords: cooling rate, drop-on-demand, high entropy alloys, microstructure, single droplet generation, X-ray Diffractometry

Procedia PDF Downloads 187
672 Using the Smith-Waterman Algorithm to Extract Features in the Classification of Obesity Status

Authors: Rosa Figueroa, Christopher Flores

Abstract:

Text categorization is the problem of assigning a new document to a set of predetermined categories on the basis of a training set of free-text data containing documents whose category membership is known. To train a classification model, it is necessary to extract characteristics in the form of tokens that facilitate the learning and classification process. In text categorization, feature extraction typically involves word sequences known as N-grams, and documents belonging to the same category are expected to share similar features. The Smith-Waterman (SW) algorithm is a dynamic programming algorithm that performs local sequence alignment to determine similar regions between two strings or protein sequences. This work explores the use of the SW algorithm as an alternative for feature extraction in text categorization. The dataset used for this purpose contains 2,610 annotated documents with the classes Obese/Non-Obese. The dataset was represented in matrix form using the bag-of-words approach, with term frequency-inverse document frequency (TF-IDF) as the score representing the occurrence of tokens in each document. To extract features for classification, four experiments were conducted: the first used SW to extract features, the second used unigrams (single words), the third used bigrams (two-word sequences), and the last used a combination of unigrams and bigrams. To test the effectiveness of the extracted feature sets, a Support Vector Machine (SVM) classifier was tuned using 20% of the dataset; the remaining 80%, together with 5-fold cross-validation, was used to evaluate and compare the performance of the four feature extraction experiments. Results from the tuning process suggest that SW performs better than the N-gram-based feature extraction.
These results were confirmed on the remaining 80% of the dataset, where SW performed best (accuracy = 97.10%, weighted average F-measure = 97.07%). The second best was the combination of unigrams and bigrams (accuracy = 96.04%, weighted average F-measure = 95.97%), closely followed by bigrams (accuracy = 94.56%, weighted average F-measure = 94.46%) and finally unigrams (accuracy = 92.96%, weighted average F-measure = 92.90%).
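A minimal sketch of the SW local-alignment score used for comparing documents; the scoring parameters are illustrative, and the same routine works on character strings or word-token lists alike:

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Smith-Waterman local alignment score between two sequences,
    with a linear gap penalty. Returns the best local score."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # Local alignment floors every cell at zero
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best
```

Applied to tokenized documents, a high score indicates a shared local word sequence, which is the similarity signal exploited for feature extraction.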

Keywords: comorbidities, machine learning, obesity, Smith-Waterman algorithm

Procedia PDF Downloads 272
671 Multi-Criteria Goal Programming Model for Sustainable Development of India

Authors: Irfan Ali, Srikant Gupta, Aquil Ahmed

Abstract:

Every country needs sustainable development (SD) for its economic growth, achieved by forming suitable policies and initiating programs for the development of the country's different sectors. This paper comprises the modeling and optimization of different sectors of India, which together form a multi-criteria model. We developed a fractional goal programming (FGP) model that helps provide an efficient allocation of resources while simultaneously achieving sustainability goals for gross domestic product (GDP), electricity consumption (EC), and greenhouse gas (GHG) emissions by the year 2030. A weighted FGP model is also presented to obtain varying solutions according to the priorities set by the policy maker for achieving future goals of GDP growth, EC, and GHG emissions. The presented models provide useful insight to decision makers for implementing strategies in different sectors.

Keywords: sustainable and economic development, multi-objective fractional programming, fuzzy goal programming, weighted fuzzy goal programming

Procedia PDF Downloads 198
670 Optimal Design for SARMA(P,Q)L Process of EWMA Control Chart

Authors: Yupaporn Areepong

Abstract:

The main goal of this paper is to study Statistical Process Control (SPC) with the Exponentially Weighted Moving Average (EWMA) control chart when observations are serially correlated. The key characteristic of a control chart is the Average Run Length (ARL), the average number of samples taken before an action signal is given. Ideally, the ARL of an in-control process, denoted ARL0, should be large, while the ARL when the process is out of control, the Average Delay Time (ARL1) or mean time to a true alarm, should be small. We find explicit formulas for the ARL of the EWMA control chart for Seasonal Autoregressive Moving Average (SARMA) processes with exponential white noise. The ARL values obtained from the explicit formulas and from the integral equation are in good agreement. In particular, these formulas for evaluating ARL0 and ARL1 make it possible to obtain a set of optimal parameters, depending on the smoothing parameter (λ) and the width of the control limit (H), for designing an EWMA chart with minimum ARL1.
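A Monte Carlo sketch of the two run-length quantities (illustrative only: it uses i.i.d. Gaussian observations on an individuals chart, whereas the paper derives explicit formulas for SARMA processes with exponential white noise):

```python
import random

def ewma_run_length(lam=0.1, h=0.5, mu=0.0, seed=None):
    """Number of observations until the EWMA statistic leaves the band ±h.
    Observations are i.i.d. N(mu, 1); mu=0 gives an in-control run."""
    rng = random.Random(seed)
    z, t = 0.0, 0
    while abs(z) <= h:
        t += 1
        z = lam * rng.gauss(mu, 1.0) + (1 - lam) * z
    return t

def average_run_length(mu, n=500, **kw):
    """Monte Carlo estimate of the ARL for a given process mean."""
    return sum(ewma_run_length(mu=mu, seed=s, **kw) for s in range(n)) / n
```

Designing the chart amounts to choosing λ and h so that the in-control ARL is large while the out-of-control ARL stays small, which is exactly the trade-off the explicit formulas optimize.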

Keywords: average run length, optimal parameters, exponentially weighted moving average (EWMA), control chart

Procedia PDF Downloads 533
669 Detection Method of Federated Learning Backdoor Based on Weighted K-Medoids

Authors: Xun Li, Haojie Wang

Abstract:

Federated learning is a training mode that combines distributed training with centralized aggregation and is of great value for protecting user privacy. To address the vulnerability of federated learning models to backdoor attacks, a backdoor-attack detection method based on a weighted k-medoids algorithm is proposed. First, the update parameters of the clients are collated to construct a vector group; then the principal component analysis (PCA) algorithm is used to extract the corresponding feature information from the vector group; finally, an improved k-medoids clustering algorithm is used to separate normal from backdoor update parameters. In the simulation experiment, a backdoor is implanted in the federated learning model through the model-replacement attack, and the update parameters coming from the attacker are effectively detected and removed by the proposed defense method.
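A toy sketch of the pipeline (PCA via SVD, then a plain 2-medoids clustering, flagging the smaller cluster as suspicious); the paper's weighted k-medoids variant and its parameters are not reproduced here:

```python
import numpy as np

def two_medoids(D, iters=50):
    """Plain 2-medoids on a distance matrix D, initialized at the two
    points farthest apart. Returns (labels, medoid indices)."""
    m = [int(i) for i in np.unravel_index(int(D.argmax()), D.shape)]
    for _ in range(iters):
        labels = np.argmin(D[:, m], axis=1)
        new_m = []
        for k in (0, 1):
            idx = np.where(labels == k)[0]
            costs = D[np.ix_(idx, idx)].sum(axis=1)
            new_m.append(int(idx[costs.argmin()]))
        if new_m == m:
            break
        m = new_m
    return np.argmin(D[:, m], axis=1), m

def detect_suspicious_updates(updates, n_components=2):
    """Project client updates onto the top principal components (via SVD),
    cluster with 2-medoids, and flag the smaller cluster as suspicious."""
    X = updates - updates.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    Z = X @ Vt[:n_components].T
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=2)
    labels, _ = two_medoids(D)
    flagged = 0 if (labels == 0).sum() <= (labels == 1).sum() else 1
    return np.where(labels == flagged)[0]
```

The aggregator would simply exclude the flagged clients' updates from the averaging step.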

Keywords: federated learning, backdoor attack, PCA, k-medoids, backdoor defense

Procedia PDF Downloads 85
668 Hyperspectral Image Classification Using Tree Search Algorithm

Authors: Shreya Pare, Parvin Akhter

Abstract:

Remotely sensed image classification is a very challenging task owing to the high dimensionality of hyperspectral images. Pixel-wise classification methods fail to exploit the spatial structure of an image; integrating spatial information into the classification process can therefore improve performance. In this paper, a multilevel thresholding algorithm based on a modified fuzzy entropy (MFE) function is used to segment hyperspectral images. The fuzzy parameters of the MFE function are optimized by a new meta-heuristic based on the tree search algorithm (TSA). The segmented image is classified by a large margin distribution machine (LDM) classifier. Experimental results on a hyperspectral image dataset indicate that the proposed technique (MFE-TSA-LDM) achieves much higher classification accuracy for hyperspectral images than state-of-the-art classification techniques. The proposed algorithm provides accurate segmentation and classification maps, making it well suited to image classification with large spatial structures.
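
As a minimal illustration of entropy-based multilevel thresholding, the sketch below picks a single threshold maximizing Shannon entropy over a histogram; the plain entropy criterion and exhaustive search are simplified stand-ins for the paper's modified fuzzy entropy optimized by the tree search algorithm:

```python
import numpy as np

def entropy_threshold(hist):
    """Pick the single threshold that maximizes the sum of the
    background and foreground Shannon entropies (Kapur-style)."""
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1          # class-conditional dists
        h = -sum(q * np.log(q) for q in p0 if q > 0) \
            - sum(q * np.log(q) for q in p1 if q > 0)
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Bimodal toy histogram: dark peak around bin 3, bright peak around bin 12.
hist = np.zeros(16)
hist[[2, 3, 4]] = [30, 60, 30]
hist[[11, 12, 13]] = [25, 50, 25]
t = entropy_threshold(hist)      # lands between the two peaks
```

Multilevel thresholding generalizes this to several thresholds, which is where a meta-heuristic search replaces the exhaustive scan.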

Keywords: classification, hyperspectral images, large margin distribution machine, modified fuzzy entropy function, multilevel thresholding, tree search algorithm

Procedia PDF Downloads 142
667 Evaluation of Fetal Brain Using Magnetic Resonance Imaging

Authors: Mahdi Farajzadeh Ajirlou

Abstract:

Normal fetal brain development can be assessed by in vivo magnetic resonance imaging (MRI) from the 18th gestational week (GW) to term and relies primarily on T2-weighted and diffusion-weighted (DW) sequences. The brain pathologies most commonly referred to fetal MRI for further evaluation are ventriculomegaly, absent corpus callosum, and anomalies of the posterior fossa. Brain segmentation is a crucial first step in neuroimage analysis. In the case of fetal MRI, it is particularly challenging and important due to the arbitrary orientation of the fetus, the organs surrounding the fetal head, and irregular fetal motion. Several promising methods have been proposed but remain limited in their performance in challenging cases and in real-time segmentation. Fetal MRI is routinely performed on a 1.5-Tesla scanner without maternal or fetal sedation. The mother lies supine during the examination, which typically lasts 45 to 60 minutes. The availability and continued validation of normative references for fetal brain development will provide critical tools for the early detection of impaired fetal brain growth, upon which high-risk pregnancies can be managed.

Keywords: brain, fetal, MRI, imaging

Procedia PDF Downloads 50
666 Analysis of Factors Affecting the Number of Infant and Maternal Mortality in East Java with Geographically Weighted Bivariate Generalized Poisson Regression Method

Authors: Luh Eka Suryani, Purhadi

Abstract:

Poisson regression is a non-linear regression model whose response variable is count data following a Poisson distribution. A pair of count variables showing high correlation can be modeled by bivariate Poisson regression. The numbers of infant deaths and maternal deaths are count data suited to such analysis. Poisson regression assumes equidispersion, i.e., equal mean and variance; actual count data, however, often have a variance greater or smaller than the mean (overdispersion or underdispersion). Violation of this assumption can be overcome by applying Generalized Poisson Regression. Furthermore, the characteristics of each regency can affect the number of cases that occur; this spatial heterogeneity can be accommodated by geographically weighted regression. This study analyzes the numbers of infant and maternal deaths in East Java in 2016 using the Geographically Weighted Bivariate Generalized Poisson Regression (GWBGPR) method. Modeling with adaptive bisquare kernel weighting produces 3 regency groups based on the infant mortality rate and 5 regency groups based on the maternal mortality rate. The variables that significantly influence infant and maternal mortality are the percentages of pregnant women who visit health workers at least four times during pregnancy, of pregnant women who receive Fe3 tablets, of obstetric complications handled, of households with clean and healthy living behavior, and of women whose first marriage was under the age of 18.
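
The adaptive bisquare kernel weighting mentioned above can be sketched as follows; the toy regency coordinates and the neighbour count are illustrative assumptions:

```python
import numpy as np

def bisquare_weights(dists, bandwidth):
    """Bisquare kernel: w = (1 - (d/b)^2)^2 for d < b, else 0."""
    w = (1 - (dists / bandwidth) ** 2) ** 2
    return np.where(dists < bandwidth, w, 0.0)

def adaptive_bandwidth(dists, k):
    """Adaptive rule: bandwidth = distance to the k-th nearest location,
    so dense areas get small bandwidths and sparse areas large ones."""
    return np.sort(dists)[k]

# Toy centroids for 5 hypothetical regencies (x, y in km).
pts = np.array([[0, 0], [10, 0], [20, 5], [5, 15], [30, 30]], float)
d0 = np.linalg.norm(pts - pts[0], axis=1)    # distances from regency 0
b = adaptive_bandwidth(d0, k=3)              # covers the 3 nearest regencies
w = bisquare_weights(d0, b)                  # local regression weights
```

In GWR-type models, each location's regression is fitted with these weights, so nearby regencies influence the local coefficients more than distant ones.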

Keywords: adaptive bisquare kernel, GWBGPR, infant mortality, maternal mortality, overdispersion

Procedia PDF Downloads 135
665 Multi-Objective Variable Neighborhood Search Algorithm to Solving Scheduling Problem with Transportation Times

Authors: Majid Khalili

Abstract:

This paper deals with a bi-objective hybrid no-wait flowshop scheduling problem that minimizes the makespan and the total weighted tardiness, in which transportation times between stages are considered. Obtaining an optimal solution for this type of complex, large-sized problem in reasonable computational time using traditional approaches and optimization tools is extremely difficult. This paper presents a new multi-objective variable neighborhood search algorithm (MOVNS). A set of experimental instances is used to evaluate the algorithm with advanced multi-objective performance measures. The algorithm is carefully evaluated against an available algorithm by means of multi-objective performance measures and statistical tools. The results show that a variant of the proposed MOVNS performs soundly compared with the other algorithm.
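
The basic variable neighborhood search loop underlying MOVNS can be sketched in single-objective form; using one objective, shaking without a separate local-search phase, and a toy single-machine weighted-tardiness instance (rather than the hybrid no-wait flowshop) are all simplifying assumptions:

```python
import random

def vns(cost, x0, neighborhoods, iters=200, seed=0):
    """Basic VNS: shake in neighborhood k; accept an improvement and
    reset to the first neighborhood, otherwise move to the next one."""
    rng = random.Random(seed)
    x, k = list(x0), 0
    for _ in range(iters):
        y = neighborhoods[k](x, rng)
        if cost(y) < cost(x):
            x, k = y, 0
        else:
            k = (k + 1) % len(neighborhoods)
    return x

def swap(x, rng):
    """Neighborhood 1: exchange two random positions."""
    y = list(x)
    i, j = rng.sample(range(len(y)), 2)
    y[i], y[j] = y[j], y[i]
    return y

def reinsert(x, rng):
    """Neighborhood 2: remove one random job and reinsert it elsewhere."""
    y = list(x)
    i, j = rng.sample(range(len(y)), 2)
    y.insert(j, y.pop(i))
    return y

# Toy single-machine instance: jobs as (processing time, weight, due date).
jobs = [(3, 2, 6), (2, 1, 4), (4, 3, 12), (1, 5, 3)]

def total_weighted_tardiness(order):
    t, total = 0, 0
    for i in order:
        p, w, d = jobs[i]
        t += p
        total += w * max(0, t - d)
    return total

best = vns(total_weighted_tardiness, range(len(jobs)), [swap, reinsert])
```

A multi-objective variant replaces the scalar acceptance test with a Pareto-dominance check over an archive of non-dominated schedules.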

Keywords: no-wait hybrid flowshop scheduling, multi-objective variable neighborhood algorithm, makespan, total weighted tardiness

Procedia PDF Downloads 399
664 Conjugate Mixed Convection Heat Transfer and Entropy Generation of Cu-Water Nanofluid in an Enclosure with Thick Wavy Bottom Wall

Authors: Sanjib Kr Pal, S. Bhattacharyya

Abstract:

Mixed convection of a Cu-water nanofluid in an enclosure with a thick wavy bottom wall has been investigated numerically. A coordinate transformation method is used to map the computational domain onto an orthogonal coordinate system. The governing equations in the computational domain are solved through a pressure-correction-based iterative algorithm. The fluid flow and heat transfer characteristics are analyzed for a wide range of Richardson number (0.1 ≤ Ri ≤ 5), nanoparticle volume concentration (0.0 ≤ ϕ ≤ 0.2), amplitude (0.0 ≤ α ≤ 0.1) of the wavy thick bottom wall, and wave number (ω) at a fixed Reynolds number. The results show that the heat transfer rate increases remarkably with the addition of nanoparticles. The heat transfer rate depends on the wavy wall amplitude and wave number and, for fixed amplitude and wave number, decreases with increasing Richardson number. The Bejan number and the entropy generation are determined to analyze the thermodynamic optimization of the mixed convection.

Keywords: conjugate heat transfer, mixed convection, nanofluid, wall waviness

Procedia PDF Downloads 236
663 Thermodynamic Approach of Lanthanide-Iron Double Oxides Formation

Authors: Vera Varazashvili, Murman Tsarakhov, Tamar Mirianashvili, Teimuraz Pavlenishvili, Tengiz Machaladze, Mzia Khundadze

Abstract:

The standard Gibbs energy of formation ΔGfor(298.15) of lanthanide-iron double oxides with the garnet-type crystal structure R3Fe5O12 (RIG, where R is a rare-earth ion) from the initial oxides is evaluated. The calculation is based on the standard entropies S298.15 and standard enthalpies of formation ΔH298.15 of the compounds involved in garnet synthesis. The Gibbs energy of formation is presented as a function of temperature, ΔGfor(T), over the range 300-1600 K. The necessary starting thermodynamic data were obtained from calorimetric study of the heat capacity-temperature functions and by using a semi-empirical method to calculate ΔH298.15 of formation. The thermodynamic functions at standard temperature - enthalpy, entropy, and Gibbs energy - are recommended as reference data for technological evaluations. Across the isostructural series of rare-earth iron garnets, the correlation between thermodynamic properties and the characteristics of the lanthanide ions is elucidated.
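
The temperature dependence ΔGfor(T) follows from the Gibbs relation ΔG = ΔH − T·ΔS; a minimal sketch with purely hypothetical values (not the paper's measured data) is:

```python
def gibbs_energy(dH, dS, T):
    """ΔG(T) = ΔH − T·ΔS, with ΔH in J/mol and ΔS in J/(mol·K)."""
    return dH - T * dS

# Purely illustrative (hypothetical) values for a garnet-forming reaction.
dH, dS = -60_000.0, -25.0            # J/mol, J/(mol·K)
dG = [gibbs_energy(dH, dS, T) for T in range(300, 1700, 100)]
```

With a negative reaction entropy, ΔG grows less negative as T rises, so the formation becomes less favorable at high temperature.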

Keywords: calorimetry, entropy, enthalpy, heat capacity, Gibbs energy of formation, rare earth iron garnets

Procedia PDF Downloads 357
662 Content-Based Image Retrieval Using HSV Color Space Features

Authors: Hamed Qazanfari, Hamid Hassanpour, Kazem Qazanfari

Abstract:

In this paper, a method is provided for content-based image retrieval. A content-based image retrieval system searches an image database using the visual content of a query image to retrieve similar images. To simulate the sensitivity of the human visual system to edges and color features, the concept of the color difference histogram (CDH) is used. The CDH captures the perceptual color difference between two neighboring pixels with regard to color and edge orientation. Since the HSV color space is close to the human visual system, the CDH is calculated in this color space. In addition, to strengthen the color features, the color histogram in HSV space is also used as a feature. Among the extracted features, efficient ones are selected using entropy and correlation criteria, so that the final features capture image content most efficiently. The proposed method has been evaluated on three standard databases: Corel 5k, Corel 10k, and UKBench. Experimental results show that the accuracy of the proposed image retrieval method is significantly improved compared to recently developed methods.
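
A heavily simplified version of the color difference histogram can be sketched as follows; the hue-only binning, horizontal-neighbour differences, and omission of edge orientation are simplifying assumptions relative to the CDH described above:

```python
import numpy as np

def color_difference_histogram(hsv, bins=8):
    """Simplified CDH: for each pixel, add the Euclidean colour difference
    to its right-hand neighbour into the histogram bin of the pixel's
    quantised hue (edge orientation is omitted in this sketch)."""
    hue = (hsv[..., 0] * bins).astype(int).clip(0, bins - 1)
    diff = np.linalg.norm(hsv[:, :-1] - hsv[:, 1:], axis=-1)
    cdh = np.zeros(bins)
    for b in range(bins):
        cdh[b] = diff[hue[:, :-1] == b].sum()
    return cdh

# Toy 4x4 HSV image in [0, 1]: left half reddish, right half bluish.
img = np.zeros((4, 4, 3))
img[:, :2] = [0.0, 1.0, 1.0]     # hue 0.0 (red)
img[:, 2:] = [0.6, 1.0, 1.0]     # hue 0.6 (blue)
cdh = color_difference_histogram(img)
```

All the accumulated difference lands in the red hue bin here, because only red pixels border the colour boundary on its left side.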

Keywords: content-based image retrieval, color difference histogram, efficient features selection, entropy, correlation

Procedia PDF Downloads 225
661 Applied Complement of Probability and Information Entropy for Prediction in Student Learning

Authors: Kennedy Efosa Ehimwenma, Sujatha Krishnamoorthy, Safiya Al‑Sharji

Abstract:

Probabilities lie in the interval [0, 1], with values determined by the number of outcomes of events in a sample space S. An event A that will never occur has probability Pr(A) = 0; an event B that will certainly occur has probability Pr(B) = 1; both A and B are thus deterministic. Furthermore, the sum of probabilities Pr(E₁) + Pr(E₂) + … + Pr(Eₙ) of a finite set of mutually exclusive, exhaustive events in a sample space S equals 1. Conversely, the difference between two such unit sums is 0. This paper first discusses Bayes' rule, the complement of probability, and the difference of probabilities for occurrences of learning events before applying them to the prediction of learning objects in student learning. Given that the probabilities sum to 1, to make a recommendation for student learning, this paper proposes that the difference between argMaxPr(S) and the probability of student performance quantifies the weight of learning objects for students. Using a skill-set dataset, the computational procedure demonstrates i) the probability of skill-set events that have occurred and would lead to higher-level learning; ii) the probability of events that have not occurred and require subject-matter relearning; iii) the accuracy of a decision tree in predicting student performance into class labels; and iv) the information entropy of the skill-set data and its implications for student cognitive performance and the recommendation of learning.
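
The complement and entropy computations described above can be sketched as follows; the pass probability is an illustrative assumption:

```python
from math import log2

def complement(p):
    """Pr(not A) = 1 − Pr(A)."""
    return 1.0 - p

def entropy(probs):
    """Shannon entropy H = −Σ p·log2(p) over a distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A learner passes a skill-set pre-assessment with probability 0.8;
# the complement 0.2 weights the case that the skill must be relearned.
p_pass = 0.8
relearn_weight = complement(p_pass)
h = entropy([p_pass, relearn_weight])   # uncertainty of the pass/relearn outcome
```

Low entropy means the outcome is nearly certain, so the recommendation carries little risk; entropy peaks at 1 bit when pass and relearn are equally likely.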

Keywords: complement of probability, Bayes’ rule, prediction, pre-assessments, computational education, information theory

Procedia PDF Downloads 135
660 Assessment of Ground Water Potential Zone: A Case Study of Paramakudi Taluk, Ramanathapuram, Tamilnadu, India

Authors: Shri Devi

Abstract:

This study was conducted to identify the groundwater potential zones of Paramakudi taluk, Ramanathapuram, Tamil Nadu, India, with a total areal extent of 745 sq. km. Thematic maps of soil, geology, geomorphology, drainage, and land use of the study area were prepared from the 1:50,000 toposheet. A digital elevation model (DEM) was generated from contours at a 10 m interval, and a slope map was derived from it. The groundwater potential zones of the region were obtained by weighted overlay analysis, for which all the thematic maps were overlaid in ArcGIS 10.2. Ranks were assigned to the parameters of each thematic layer, with weightages of 25% for soil, 25% for geomorphology, 25% for land use/land cover, 15% for slope, 5% for lineament, and 5% for drainage. The resulting potential zone map was overlaid with the village map to identify the regions of good, moderate, and low groundwater potential.
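
The weighted overlay step can be sketched with the stated weights (soil 25%, geomorphology 25%, land use 25%, slope 15%, lineament 5%, drainage 5%); the toy rasters and rank values are illustrative assumptions:

```python
import numpy as np

def weighted_overlay(layers, weights):
    """Weighted overlay: each layer holds per-cell suitability ranks;
    the output is the weighted sum of the ranked layers."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must total 100%"
    out = np.zeros_like(layers[0], dtype=float)
    for layer, w in zip(layers, weights):
        out += w * layer
    return out

# Toy 2x2 rasters ranked 1 (poor) to 5 (good) groundwater prospect.
soil  = np.array([[5, 3], [2, 4]])
geom  = np.array([[4, 4], [1, 5]])
lulc  = np.array([[3, 2], [2, 5]])
slope = np.array([[5, 1], [3, 4]])
linea = np.array([[1, 5], [2, 3]])
drain = np.array([[2, 2], [4, 5]])
potential = weighted_overlay(
    [soil, geom, lulc, slope, linea, drain],
    [0.25, 0.25, 0.25, 0.15, 0.05, 0.05])
```

Cells with high weighted scores correspond to good groundwater potential zones; classifying the score range into intervals yields the good/moderate/low map.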

Keywords: GIS, ground water, Paramakudi, weighted overlay analysis

Procedia PDF Downloads 317
659 Gait Biometric for Person Re-Identification

Authors: Lavanya Srinivasan

Abstract:

Biometric identification recognizes unique features of a person, such as fingerprints, iris, ear shape, and voice, and typically requires the subject's permission and physical contact. Gait biometrics instead identify a person by extracting features of their unique walking pattern, and their main advantage is that they work at a distance, without any physical contact. In this work, gait biometrics are used for person re-identification. Each person walking naturally is compared with the same person walking with a bag, coat, or case, recorded using longwave-infrared, shortwave-infrared, midwave-infrared, and visible cameras in both rural and urban environments. Pre-processing includes person detection using YOLO, background subtraction, silhouette extraction, and synthesis of the Gait Entropy Image by averaging the silhouettes. The moving features are extracted from the Gait Entropy Image, reduced in dimensionality by principal component analysis, and recognised using different classifiers. The comparative results show that linear discriminant analysis outperforms the other classifiers, with 95.8% for the visible camera on the rural dataset and 94.8% for the longwave-infrared camera on the urban dataset.
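
The synthesis of a Gait Entropy Image by averaging silhouettes can be sketched as follows; the toy silhouette sequence is an illustrative assumption:

```python
import numpy as np

def gait_entropy_image(silhouettes):
    """Average binary silhouettes over a gait cycle, then map each pixel's
    foreground frequency p to its binary entropy −p·log2(p) − (1−p)·log2(1−p);
    moving limbs yield high entropy, static regions yield zero."""
    p = np.mean(silhouettes, axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    return np.nan_to_num(h)   # pixels with p in {0, 1} carry zero entropy

# Toy sequence: a static torso column and a limb that alternates position.
frames = np.zeros((4, 5, 5))
frames[:, :, 2] = 1        # torso: always foreground -> entropy 0
frames[0::2, 4, 0] = 1     # limb present in half the frames -> entropy 1
gei = gait_entropy_image(frames)
```

The high-entropy pixels localize the moving parts of the body, which is exactly where the discriminative gait features live.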

Keywords: biometric, gait, silhouettes, YOLO

Procedia PDF Downloads 152
658 A Study on the Assessment of Prosthetic Infection after Total Knee Replacement Surgery

Authors: Chun-Lang Chang, Chun-Kai Liu

Abstract:

In this study, patients who underwent total knee replacement surgery, drawn from the 2010 National Health Insurance database, were adopted as the study participants. Important factors were screened and selected through literature collection and interviews with physicians. The weights of the factors were obtained through the Cross Entropy Method (CE), Genetic Algorithm Logistic Regression (GALR), and Particle Swarm Optimization (PSO). These weights, coupled with Excel VBA, were then used to construct a Case-Based Reasoning (CBR) system. Statistical tests show that GALR and PSO produced no significant differences, and the accuracy of both models was above 97%; the area under the ROC curve for both models also exceeded 0.87. This study can serve as a reference to assist medical staff in the clinical assessment of infections, in order to effectively enhance medical service quality and efficiency, avoid unnecessary medical waste, and contribute substantially to resource allocation in medical institutions.

Keywords: Case Based Reasoning, Cross Entropy Method, Genetic Algorithm Logistic Regression, Particle Swarm Optimization, Total Knee Replacement Surgery

Procedia PDF Downloads 299
657 Performance Analysis of Next Generation OCDM-RoF-Based Hybrid Network under Diverse Conditions

Authors: Anurag Sharma, Rahul Malhotra, Love Kumar, Harjit Pal Singh

Abstract:

This paper demonstrates an OCDM-RoF-based hybrid architecture in which data/voice communication is enabled via a combination of Optical Code Division Multiplexing (OCDM) and Radio-over-Fiber (RoF) techniques under diverse conditions. An OCDM-RoF hybrid network of 16 users with the DPSK modulation format has been designed, and the performance of the proposed network is analyzed for fiber span lengths of 100, 150, and 200 km under the influence of linear and nonlinear effects. It is found that Polarization Mode Dispersion (PMD) has the least effect, while the other nonlinearities degrade the performance of the proposed network.

Keywords: OCDM, RoF, DPSK, PMD, eye diagram, BER, Q factor

Procedia PDF Downloads 612