Search results for: Elliptic Curve Digital Signature Algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4692

3822 Comparison of Data Reduction Algorithms for Image-Based Point Cloud Derived Digital Terrain Models

Authors: M. Uysal, M. Yilmaz, I. Tiryakioğlu

Abstract:

A Digital Terrain Model (DTM) is a digital numerical representation of the Earth's surface. DTMs have been applied to a diverse range of tasks, such as urban planning, military applications, glacier mapping, and disaster management. Expressing the Earth's surface as a mathematical model would require an infinite number of point measurements. Since this is impossible, points at regular intervals are measured to characterize the Earth's surface, and a DTM of the Earth is generated. Until now, classical measurement techniques and photogrammetry have been widely used to construct DTMs; at present, RADAR, LiDAR, and stereo satellite images are also used. In recent years, mainly because of its advantages, Airborne Light Detection and Ranging (LiDAR) has seen increased use in DTM applications: LiDAR technology creates a 3D point cloud by acquiring very large numbers of point measurements. More recently, advances in image mapping methods and the use of unmanned aerial vehicles (UAVs) for photogrammetric data acquisition have increased DTM generation from image-based point clouds. The accuracy of a DTM depends on various factors, such as the data collection method, the distribution of elevation points, the point density, the properties of the surface, and the interpolation method. In this study, a random data reduction method is evaluated for DTMs generated from image-based point cloud data. The original image-based point cloud data set (100%) is reduced to a series of subsets by a random selection algorithm, representing 75, 50, 25, and 5% of the original data set. Using the ANS campus of Afyon Kocatepe University as the test area, the DTM constructed from the original image-based point cloud data set is compared with DTMs interpolated from the reduced data sets by the Kriging interpolation method. The results show that the random data reduction method can reduce image-based point cloud data sets to a 50% density level while still maintaining the quality of the DTM.
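
As a rough illustration of the reduction experiment described above, the following sketch subsamples a synthetic point cloud at the same percentage levels and compares each interpolated surface against the full-density surface. SciPy's griddata is used here only as a simple stand-in for the paper's Kriging interpolation, and the point cloud, grid, and RMSE measure are all assumptions.

```python
# Minimal sketch: random point-cloud reduction and DTM comparison.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
cloud = rng.random((10_000, 3)) * [1000.0, 1000.0, 50.0]  # synthetic (x, y, z) samples

# Regular evaluation grid on which each DTM surface is compared.
gx, gy = np.meshgrid(np.linspace(0, 1000, 200), np.linspace(0, 1000, 200))

def dtm(points):
    """Interpolate a DTM surface from (x, y, z) points (stand-in for Kriging)."""
    return griddata(points[:, :2], points[:, 2], (gx, gy), method='linear')

reference = dtm(cloud)  # DTM from the full (100%) data set

for fraction in (0.75, 0.50, 0.25, 0.05):
    keep = rng.choice(len(cloud), size=int(fraction * len(cloud)), replace=False)
    reduced = dtm(cloud[keep])
    rmse = np.sqrt(np.nanmean((reduced - reference) ** 2))
    print(f"{fraction:>5.0%} of points -> RMSE vs. full DTM: {rmse:.3f} m")
```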

Keywords: DTM, unmanned aerial vehicle, UAV, random, Kriging.

3821 Numerical Analysis of Electrical Interaction between two Axisymmetric Spheroids

Authors: Kuan-Liang Liu, Eric Lee, Jung-Jyh Lee, Jyh-Ping Hsu

Abstract:

The electrical interaction between two axisymmetric spheroidal particles in an electrolyte solution is examined numerically. A Galerkin finite element method combined with a Newton-Raphson iteration scheme is proposed to evaluate the spatial variation in the electrical potential, and the result obtained is used to estimate the interaction energy between the two particles. We show that if the surface charge density is fixed, the potential gradient is larger at a point with larger curvature, and if the surface potential is fixed, the surface charge density is proportional to the curvature. Also, if the curve of total interaction energy against closest surface-to-surface distance exhibits a primary maximum, the maximum follows the order (oblate-oblate) > (sphere-sphere) > (oblate-prolate) > (prolate-prolate), and if the curve has a secondary minimum, the absolute value of the minimum follows the same order.
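
To make the numerical approach concrete, here is a minimal one-dimensional illustration of the Newton-Raphson treatment of the nonlinear Poisson-Boltzmann equation, psi'' = sinh(psi) in scaled units, with finite differences standing in for the paper's axisymmetric Galerkin finite element discretization; all parameters are illustrative.

```python
# 1-D Poisson-Boltzmann solved by Newton-Raphson on a finite difference grid.
import numpy as np

n, L, psi0 = 200, 10.0, 2.0              # nodes, domain (Debye lengths), wall potential
h = L / (n - 1)
psi = np.linspace(psi0, 0.0, n)          # initial guess with both BCs built in

for it in range(50):
    F = np.zeros(n)
    F[1:-1] = (psi[:-2] - 2 * psi[1:-1] + psi[2:]) / h**2 - np.sinh(psi[1:-1])
    J = np.zeros((n, n))
    J[0, 0] = J[-1, -1] = 1.0            # Dirichlet rows (zero residual, zero update)
    idx = np.arange(1, n - 1)
    J[idx, idx - 1] = J[idx, idx + 1] = 1 / h**2
    J[idx, idx] = -2 / h**2 - np.cosh(psi[idx])
    delta = np.linalg.solve(J, -F)       # Newton update
    psi += delta
    if np.max(np.abs(delta)) < 1e-10:
        break

print(f"converged in {it + 1} Newton steps; psi at one Debye length ~ {psi[int(1 / h)]:.4f}")
```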

Keywords: interaction energy, interaction force, Poisson-Boltzmann equation, spheroid.

3820 Synthetic Daily Flow Duration Curves for the Çoruh River Basin, Turkey

Authors: Fatih Tosunoğlu, İbrahim Can

Abstract:

The flow duration curve (FDC) is an informative method that represents the flow regime's properties for a river basin. Therefore, the FDC is widely used in water resource projects such as hydropower, water supply, irrigation, and water quality management. The primary purpose of this study is to obtain synthetic daily flow duration curves for the Çoruh basin, Turkey. To this end, we first developed univariate auto-regressive moving average (ARMA) models for the daily flows of 9 stations located in the Çoruh basin; these models were then used to generate 100 synthetic flow series, each having the same size as the historical series. Second, the flow duration curve of each synthetic series was drawn, and the flow values exceeded 10, 50, and 95% of the time, together with the 95% confidence limits of these flows, were calculated. As a result, the flood, mean, and low flow potential of the Çoruh basin is comprehensively represented.
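
A minimal sketch of the flow duration statistics described above: the Qp notation follows the exceedance levels in the abstract, and a lognormal series stands in for the ARMA-generated synthetic daily flows.

```python
# Flow duration curve ordinates: flow exceeded p% of the time.
import numpy as np

rng = np.random.default_rng(1)
flows = rng.lognormal(mean=3.0, sigma=0.8, size=365 * 30)  # stand-in daily flows

def exceedance_flow(q, p):
    """Flow exceeded p percent of the time (FDC ordinate)."""
    return np.percentile(q, 100.0 - p)

for p in (10, 50, 95):
    print(f"Q{p}: flow exceeded {p}% of the time = {exceedance_flow(flows, p):.1f} m^3/s")
```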

Keywords: ARMA models, Çoruh basin, flow duration curve, Turkey.

3819 Path Planning of a Robot Manipulator using Retrieval RRT Strategy

Authors: K. Oh, J. P. Hwang, E. Kim, H. Lee

Abstract:

This paper presents an algorithm which extends the rapidly-exploring random tree (RRT) framework to deal with changes in the task environment. The algorithm, called the Retrieval RRT Strategy (RRS), combines a support vector machine (SVM) with RRT and plans the robot motion in the presence of changes in the surrounding environment. The algorithm consists of two levels. At the first level, the SVM is built and selects a proper path from the bank of RRTs for a given environment. At the second level, a real path is planned by the RRT planners for the given environment. The suggested method is applied to the control of KUKA™, a commercial 6 DOF robot manipulator, and its feasibility and efficiency are demonstrated via the co-simulation of MATLAB™ and RecurDyn™.
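
For readers unfamiliar with the second-level planner, here is a minimal 2-D RRT sketch; the step size, goal bias, sampling region, and obstacle are illustrative assumptions, not the paper's settings.

```python
# Minimal 2-D rapidly-exploring random tree (RRT).
import math, random

def rrt(start, goal, is_free, step=0.5, goal_tol=0.5, max_iter=5000):
    """Grow a tree from start toward random samples; return a path to goal."""
    nodes, parent = [start], {0: None}
    for _ in range(max_iter):
        sample = goal if random.random() < 0.1 else (random.uniform(0, 10), random.uniform(0, 10))
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))  # nearest node
        nx, ny = nodes[i]
        d = math.dist((nx, ny), sample)
        new = sample if d <= step else (nx + step * (sample[0] - nx) / d,
                                        ny + step * (sample[1] - ny) / d)
        if not is_free(new):
            continue                      # reject steps that hit the obstacle
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < goal_tol:
            path, k = [], len(nodes) - 1  # backtrack to recover the path
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None

free = lambda p: math.dist(p, (5.0, 5.0)) > 1.5   # circular obstacle at (5, 5)
print(rrt((1.0, 1.0), (9.0, 9.0), free))
```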

Keywords: Path planning, RRT, 6 DOF manipulator, SVM.

3818 Trustworthy Link Failure Recovery Algorithm for Highly Dynamic Mobile Adhoc Networks

Authors: Y. Harold Robinson, M. Rajaram

Abstract:

This paper introduces a trustworthy link failure recovery algorithm that provides forwarding continuity even under compound link failures. Ephemeral failures are common in IP networks, and several existing proposals address them through local rerouting. To ensure forwarding continuity even under compound link failures, we introduce the compound link failure recovery algorithm. For forwarding, each packet carries a blacklist, a minimal set of failed links encountered along its path, and the next hop is chosen by excluding the blacklisted links. We describe how the method can be applied to ensure forwarding to all reachable destinations in case of any two or more link or node failures in the network. Extensive NS2 simulations show that the proposed protocol achieves exceptional performance even under elevated node mobility when using the trustworthy link failure recovery algorithm.
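
A minimal sketch of blacklist-aware forwarding as described above: the next hop is chosen by a shortest-path search that skips links recorded in the packet's blacklist. The topology, link weights, and function names are illustrative assumptions.

```python
# Choose a next hop while excluding the failed links carried in the packet.
import heapq

def next_hop(graph, src, dst, blacklist):
    """Dijkstra from src to dst, skipping blacklisted links; return first hop."""
    dist, prev, heap = {src: 0}, {}, [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float('inf')):
            continue
        for v, w in graph[u]:
            if (u, v) in blacklist or (v, u) in blacklist:
                continue                      # exclude failed links
            if d + w < dist.get(v, float('inf')):
                dist[v] = d + w
                prev[v] = u
                heapq.heappush(heap, (d + w, v))
    if dst not in prev and dst != src:
        return None                           # no blacklist-free route exists
    node = dst                                # walk back to the hop adjacent to src
    while prev.get(node) is not None and prev[node] != src:
        node = prev[node]
    return node

graph = {'A': [('B', 1), ('C', 4)], 'B': [('A', 1), ('D', 2)],
         'C': [('A', 4), ('D', 1)], 'D': [('B', 2), ('C', 1)]}
print(next_hop(graph, 'A', 'D', blacklist={('A', 'B')}))  # reroutes via C
```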

Keywords: Wireless Sensor Networks, Predistribution Scheme, Cryptographic Techniques.

3817 Feature Based Dense Stereo Matching using Dynamic Programming and Color

Authors: Hajar Sadeghi, Payman Moallem, S. Amirhassn Monadjemi

Abstract:

This paper presents a new feature-based dense stereo matching algorithm that obtains the dense disparity map via dynamic programming. After extraction of some proper features, we apply matching constraints such as the epipolar line, disparity limit, ordering, and a limit on the directional derivative of disparity. A coarse-to-fine multiresolution strategy is also used to decrease the search space and thereby increase the accuracy and processing speed. The proposed method links the detected feature points into chains and compares some of the feature points from different chains to increase the matching speed. We also employ color stereo matching to increase the accuracy of the algorithm. After feature matching, we use dynamic programming to obtain the dense disparity map. The method differs from classical DP methods in stereo vision, since it employs the sparse disparity map obtained from the feature-based matching stage. DP is then performed along each scan line, between any two matched feature points on that scan line, so the algorithm is truly an optimization method. It offers a good trade-off between accuracy and computational efficiency. In our experiments, the proposed algorithm increases the accuracy by 20 to 70% and reduces the running time by almost 70%.
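
As a toy illustration of the scan-line optimization, the sketch below runs a simple dynamic program over disparities for one rectified line pair; the absolute-difference data cost and linear smoothness penalty are illustrative, not the authors' exact formulation.

```python
# Scan-line dynamic programming for disparity estimation (toy version).
import numpy as np

def scanline_dp(left, right, max_disp=8, smooth=0.2):
    """Per-pixel disparities for one rectified scan line pair."""
    n, D = len(left), max_disp + 1
    data = np.full((n, D), np.inf)                 # matching cost |L[x] - R[x-d]|
    for d in range(D):
        data[d:, d] = np.abs(left[d:] - right[:n - d])
    cost, back = data.copy(), np.zeros((n, D), dtype=int)
    for x in range(1, n):
        for d in range(D):                          # smoothness-penalized transition
            trans = cost[x - 1] + smooth * np.abs(np.arange(D) - d)
            back[x, d] = np.argmin(trans)
            cost[x, d] = data[x, d] + trans[back[x, d]]
    disp = np.zeros(n, dtype=int)                   # backtrack the optimal path
    disp[-1] = np.argmin(cost[-1])
    for x in range(n - 1, 0, -1):
        disp[x - 1] = back[x, disp[x]]
    return disp

left = np.sin(np.linspace(0, 6, 64)) * 100          # synthetic 1-D scan line
right = np.roll(left, -3)                            # right line shifted by 3 pixels
print(scanline_dp(left, right))                      # ~3 away from the borders
```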

Keywords: Chain Correspondence, Color Stereo Matching, Dynamic Programming, Epipolar Line, Stereo Vision.

3816 Breast Skin-Line Estimation and Breast Segmentation in Mammograms using Fast-Marching Method

Authors: Roshan Dharshana Yapa, Koichi Harada

Abstract:

Breast skin-line estimation and breast segmentation are important pre-processing steps in mammogram image processing and computer-aided diagnosis of breast cancer. Limiting the area to be processed to a specific target region in an image increases the accuracy and efficiency of the processing algorithms. In this paper we present a new algorithm for estimating the skin-line and segmenting the breast using the fast marching method. Fast marching is a partial differential equation-based numerical technique for tracking the evolution of interfaces. We have introduced some modifications to the traditional fast marching method, specifically to improve the accuracy of skin-line estimation and breast tissue segmentation. The proposed modifications ensure that the evolving front stops near the desired boundary. We evaluated the performance of the algorithm using 100 mammogram images taken from the mini-MIAS database. The experimental results indicate that the algorithm captures 98.6% of the ground-truth breast region and that the accuracy of the segmentation is 99.1%. The algorithm is also capable of partially extracting the nipple when it is visible in the profile.
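
A first-order sketch in the spirit of fast marching: arrival times grow outward from a seed and the front is slowed near edges by a speed image. The gradient-based speed function and the threshold used to stop the front are illustrative assumptions, not the paper's modifications.

```python
# Dijkstra-style first-arrival propagation on a pixel grid (fast-marching-like).
import heapq
import numpy as np

def travel_time(speed, seed):
    """Approximate first-arrival times T from seed, with local speed F=speed."""
    T = np.full(speed.shape, np.inf)
    T[seed] = 0.0
    heap = [(0.0, seed)]
    while heap:
        t, (i, j) = heapq.heappop(heap)
        if t > T[i, j]:
            continue
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < speed.shape[0] and 0 <= nj < speed.shape[1]:
                nt = t + 1.0 / max(speed[ni, nj], 1e-9)  # slower where speed is low
                if nt < T[ni, nj]:
                    T[ni, nj] = nt
                    heapq.heappush(heap, (nt, (ni, nj)))
    return T

img = np.zeros((64, 64)); img[20:44, 20:44] = 1.0        # toy "tissue region"
grad = np.hypot(*np.gradient(img))
speed = 1.0 / (1.0 + 50.0 * grad)                        # front stalls at edges
T = travel_time(speed, (32, 32))
segmentation = T < np.percentile(T, 30)                  # threshold arrival times
print(segmentation.sum(), "pixels inside the evolving front")
```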

Keywords: Mammogram, fast marching method, mathematical morphology.

3815 Content Based Sampling over Transactional Data Streams

Authors: Mansour Tarafdar, Mohammad Saniee Abade

Abstract:

This paper investigates the problem of sampling from transactional data streams. We introduce CFISDS, a content-based sampling algorithm that works on a landmark window model of data streams and preserves a more informative sample in the sample space. The algorithm, which is based on closed frequent itemset mining, first initiates a concept lattice using the initial data and then updates the lattice structure using an incremental mechanism that inserts, updates, and deletes lattice nodes in batch manner. The algorithm extracts the final sample on user demand. Experimental results show the accuracy of CFISDS on synthetic and real datasets, although CFISDS is not faster than existing sampling algorithms such as Z and DSS.

Keywords: Sampling, data streams, closed frequent item set mining.

3814 Improved Segmentation of Speckled Images Using an Arithmetic-to-Geometric Mean Ratio Kernel

Authors: J. Daba, J. Dubois

Abstract:

In this work, we improve a previously developed segmentation scheme aimed at extracting edge information from speckled images using a maximum likelihood edge detector. The scheme was based on finding a threshold for the probability density function of a new kernel defined as the arithmetic mean-to-geometric mean ratio field over a circular neighborhood set and, in a general context, is founded on a likelihood random field model (LRFM). The segmentation algorithm was applied to discriminated speckle areas obtained using simple elliptic discriminant functions based on measures of the signal-to-noise ratio with fractional order moments. A rigorous stochastic analysis was used to derive an exact expression for the cumulative distribution function of the random field. Based on this, an accurate probability of error was derived and the performance of the scheme was analysed. The improved segmentation scheme performed well for both simulated and real images and showed superior results to those previously obtained using the original LRFM scheme and standard edge detection methods. In particular, the false alarm probability was markedly lower than that of the original LRFM method, with oversegmentation artifacts virtually eliminated. The importance of this work lies in the development of a stochastic-based segmentation allowing an accurate quantification of the probability of false detection. Non-visual quantification and misclassification in medical ultrasound speckled images is relatively new and is of interest to clinicians.
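
A minimal sketch of the kernel itself: the arithmetic-to-geometric mean ratio computed per pixel, here with square uniform filters as a simple stand-in for the paper's circular neighborhood set, and a toy gamma-distributed field standing in for speckle.

```python
# Arithmetic-to-geometric mean ratio field as an edge statistic for speckle.
import numpy as np
from scipy.ndimage import uniform_filter

def am_gm_ratio(img, size=7, eps=1e-12):
    """AM/GM ratio per pixel; the ratio rises near edges in speckled images."""
    am = uniform_filter(img, size)
    gm = np.exp(uniform_filter(np.log(img + eps), size))  # GM via mean of logs
    return am / (gm + eps)

rng = np.random.default_rng(2)
img = rng.gamma(shape=4.0, scale=25.0, size=(128, 128))   # speckle-like field
img[:, 64:] *= 3.0                                        # synthetic edge
ratio = am_gm_ratio(img)
edges = ratio > np.percentile(ratio, 99)                  # threshold the kernel
print("edge pixels flagged:", edges.sum())
```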

Keywords: Discriminant function, false alarm, segmentation, signal-to-noise ratio, skewness, speckle.

3813 Comparative Correlation Investigation of Polynuclear Aromatic Hydrocarbons (PAHs) in Soils of Different Land Use: Sources Evaluation Perspective

Authors: O. Onoriode Emoyan, E. Eyitemi Akporhonor, Charles Otobrise

Abstract:

Polycyclic Aromatic Hydrocarbons (PAHs) are formed mainly through the incomplete combustion of organic materials during industrial and domestic activities, or through natural occurrence. Their toxicity and their contamination of terrestrial and aquatic ecosystems have been established. However, with limited validity indices, previous research has focused on PAH isomer pair ratios of variable physicochemical properties for source identification. The objective of this investigation was to determine the empirical validity of the Pearson Correlation Coefficient (PCC) and Cluster Analysis (CA) in PAH source identification across soil samples of different land uses. Therefore, 16 PAHs grouped as Endocrine Disruption Substances (EDSs) were determined at 10 sample stations in top and sub soils seasonally. PAHs were determined using a Varian 300 gas chromatograph interfaced with a flame ionization detector. The instruments and reagents used are of standard and chromatographic grades, respectively. The PCC and CA results showed that the classification of PAHs along the pyrolytic and petrogenic organics used in source signatures depends on the predominant PAHs in the environmental matrix. The distribution of PAHs in the studied stations revealed the presence of trace quantities of the vast majority of the sixteen PAHs, which may ultimately inhibit actual source signature authentication. Factors recommended for consideration when evaluating possible sources of PAHs therefore include the type and extent of bacterial metabolism, transformation products/substrates, and environmental factors such as salinity, pH, oxygen concentration, nutrients, light intensity, temperature, co-substrates, and the environmental medium.
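
A minimal sketch of the PCC and CA steps, assuming a (stations × compounds) concentration matrix; the data, cluster count, and linkage choice are illustrative.

```python
# Pearson correlation matrix and hierarchical clustering of PAH compounds.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
conc = rng.lognormal(mean=0.0, sigma=1.0, size=(10, 16))  # 10 stations, 16 PAHs

pcc = np.corrcoef(conc, rowvar=False)                     # 16 x 16 Pearson matrix
print("strongest pairwise correlation:",
      np.max(np.abs(pcc - np.eye(16))).round(3))

# Cluster compounds on correlation distance (1 - r): co-varying compounds group
# together, which underlies the pyrolytic/petrogenic interpretation.
dist = 1.0 - pcc[np.triu_indices(16, k=1)]                # condensed distances
groups = fcluster(linkage(dist, method='average'), t=2, criterion='maxclust')
print("PAH cluster labels:", groups)
```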

Keywords: Comparative correlation, kinetically-favored PAHs, polynuclear aromatic hydrocarbons, thermodynamically-favored PAHs, sources evaluation.

3812 Human Digital Twin for Personal Conversation Automation Using Supervised Machine Learning Approaches

Authors: Aya Salama

Abstract:

Digital Twin has emerged as a compelling research area, capturing the attention of scholars over the past decade. It finds applications across diverse fields, including smart manufacturing and healthcare, offering significant time and cost savings. Notably, it often intersects with other cutting-edge technologies such as Data Mining, Artificial Intelligence, and Machine Learning. However, the concept of a Human Digital Twin (HDT) is still in its infancy and requires further demonstration of its practicality. HDT takes the notion of Digital Twin a step further by extending it to living entities, notably humans, who are vastly different from inanimate physical objects. The primary objective of this research was to create an HDT capable of automating real-time human responses by simulating human behavior. To achieve this, the study delved into various areas, including clustering, supervised classification, topic extraction, and sentiment analysis. The paper successfully demonstrated the feasibility of HDT for generating personalized responses in social messaging applications. Notably, the proposed approach achieved an overall accuracy of 63%, a highly promising result that could pave the way for further exploration of the HDT concept. The methodology employed Random Forest for clustering the question database and matching new questions, while K-nearest neighbor was utilized for sentiment analysis.
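
An illustrative sketch of the two supervised components named above, a Random Forest routing a new question to a question group and a k-nearest-neighbor model for sentiment, with a toy data set standing in for the question database.

```python
# Toy sketch: question routing with Random Forest, sentiment with KNN.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

questions = ["how are you", "what time is it", "are you free today",
             "what is the date", "how do you feel", "when do we meet"]
group = [0, 1, 0, 1, 0, 1]                      # toy question clusters
sentiment = ["pos", "neu", "pos", "neu", "pos", "neu"]

vec = TfidfVectorizer().fit(questions)          # shared text features
X = vec.transform(questions)

router = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, group)
senti = KNeighborsClassifier(n_neighbors=3).fit(X, sentiment)

q = vec.transform(["how is your day"])
print("matched group:", router.predict(q)[0], "| sentiment:", senti.predict(q)[0])
```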

Keywords: Human Digital twin, sentiment analysis, topic extraction, supervised machine learning, unsupervised machine learning, classification and clustering.

3811 A Study of Cooperative Co-evolutionary Genetic Algorithm for Solving Flexible Job Shop Scheduling Problem

Authors: Lee Yih Rou, Hishammuddin Asmuni

Abstract:

The Flexible Job Shop Problem (FJSP) is an extension of the classical Job Shop Problem (JSP). The FJSP extends the routing flexibility of the JSP, i.e., the assignment of a machine to each operation, which makes it more difficult than the JSP. In this study, a Cooperative Co-evolutionary Genetic Algorithm (CCGA) is presented to solve the FJSP. Makespan (the time needed to complete all jobs) is used as the performance measure for the CCGA. To test the performance and efficiency of our CCGA, benchmark problems are solved. Computational results show that the proposed CCGA is comparable with other approaches.

Keywords: Co-evolution, Genetic Algorithm (GA), Flexible Job Shop Problem (FJSP).

3810 Fuzzy Controller Design for Ball and Beam System with an Improved Ant Colony Optimization

Authors: Yeong-Hwa Chang, Chia-Wen Chang, Hung-Wei Lin, C.W. Tao

Abstract:

In this paper, an improved ant colony optimization (ACO) algorithm is proposed to enhance the performance of the global optimum search. The strategy of the proposed algorithm has the capability of fuzzy pheromone updating, adaptive parameter tuning, and mechanism resetting. The proposed method is used to tune the parameters of the fuzzy controller for a real ball and beam system. Simulation and experimental results indicate that better performance can be achieved compared to conventional ACO algorithms in terms of convergence speed and accuracy.

Keywords: Ant colony algorithm, Fuzzy control, ball and beam system.

3809 A New Particle Filter Inspired by Biological Evolution: Genetic Filter

Authors: S. Park, J. Hwang, K. Rou, E. Kim

Abstract:

In this paper, we consider a new particle filter inspired by biological evolution. In the standard particle filter, a resampling scheme is used to decrease the degeneracy phenomenon and improve estimation performance. Unfortunately, however, it can also cause the undesired particle deprivation problem. In order to overcome this problem, we propose a novel filtering method called the genetic filter. In the proposed filter, we embed the genetic algorithm into the particle filter and thereby overcome the problems of the standard particle filter. The validity of the proposed method is demonstrated by computer simulation.
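
A toy sketch of the genetic-filter idea: a bootstrap particle filter whose resampling step is replaced by fitness-proportional selection, arithmetic crossover, and mutation. The scalar random-walk model and all parameters are illustrative, not the authors'.

```python
# Particle filter with GA-style selection/crossover/mutation in place of resampling.
import numpy as np

rng = np.random.default_rng(4)
N, steps = 200, 50
x_true, particles = 0.0, rng.normal(0.0, 1.0, N)

for _ in range(steps):
    x_true += rng.normal(0, 0.5)                  # true state transition
    z = x_true + rng.normal(0, 1.0)               # noisy measurement
    particles += rng.normal(0, 0.5, N)            # propagate particles
    w = np.exp(-0.5 * (z - particles) ** 2) + 1e-12  # likelihood = GA fitness
    w /= w.sum()
    parents = rng.choice(N, size=(N, 2), p=w)     # fitness-proportional selection
    alpha = rng.random(N)                          # arithmetic crossover weights
    children = (alpha * particles[parents[:, 0]]
                + (1 - alpha) * particles[parents[:, 1]])
    children += rng.normal(0, 0.05, N)            # mutation preserves diversity
    particles = children

print(f"true state {x_true:.2f}, estimate {particles.mean():.2f}")
```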

Keywords: Particle filter, genetic algorithm, evolutionary algorithm.

3808 WormHex: A Volatile Memory Analysis Tool for Retrieval of Social Media Evidence

Authors: Norah Almubairik, Wadha Almattar, Amani Alqarni

Abstract:

Social media applications are increasingly being used in our everyday communications. These applications utilise end-to-end encryption mechanisms, which make them suitable tools for criminals to exchange messages. These messages are preserved in volatile memory until the device is restarted; volatile forensics has therefore become an important branch of digital forensics. In this study, the WormHex tool was developed to inspect memory dump files from Windows- and Mac-based workstations. The tool supports digital investigators by enabling them to extract valuable data written in Arabic and English through the web-based WhatsApp and Twitter applications. The results confirm that social media applications write their data into memory regardless of the operating system running the application, with no major differences between Windows and Mac.
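
A minimal sketch of the string-carving step such a tool might perform: scanning a raw dump for printable ASCII and Arabic runs with regular expressions. The file name, patterns, and chat-candidate filter are illustrative; real WhatsApp and Twitter artifacts differ by version.

```python
# Carve message-like strings from a raw memory dump with regular expressions.
import re

with open('memdump.bin', 'rb') as f:           # illustrative dump file name
    raw = f.read()

text = raw.decode('utf-8', errors='ignore')    # salvage decodable fragments

english = re.findall(r'[ -~]{8,}', text)                            # printable ASCII runs
arabic = re.findall(r'[\u0600-\u06FF][\u0600-\u06FF\s]{3,}', text)  # Arabic runs

# Narrow candidates to message-shaped strings, e.g. JSON-like chat fields.
hits = [s for s in english if '"text"' in s or 'message' in s.lower()]
print(len(english), 'ASCII strings,', len(arabic), 'Arabic strings,',
      len(hits), 'chat candidates')
```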

Keywords: Volatile memory, REGEX, digital forensics, memory acquisition

3807 Oil Displacement by Water in Hauterivian Sandstone Reservoir of Kashkari Oil Field

Authors: A. J. Nazari, S. Honma

Abstract:

This paper evaluates oil displacement by water in the Hauterivian sandstone reservoir of the Kashkari oil field in northern Afghanistan. Core samples from this oil field were taken from well No. 21, and the relative permeability and fractional flow are analyzed. Steady-state flow laboratory experiments are performed to empirically obtain the fractional flow curves and relative permeability at different water saturation ratios. The relative permeability represents the simultaneous flow behavior in the reservoir, while the fractional flow approach describes the individual phases as fractions of the total flow. The fractional flow curve describes oil displacement by water, and the tangent to the fractional flow curve gives the average water saturation behind the water front. Relative permeability and fractional flow curves are therefore suitable for describing the displacement of oil by water in a petroleum reservoir. The effects of irreducible water saturation and residual oil saturation on the displaceable amount of oil are investigated through Buckley-Leverett analysis.
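
A minimal sketch of constructing a fractional flow curve and reading off the Welge tangent, using Corey-type relative permeability curves; the endpoints, exponents, and viscosities are assumed values, not the measured Kashkari core data.

```python
# Fractional flow curve f_w = 1 / (1 + k_ro*mu_w / (k_rw*mu_o)) and Welge tangent.
import numpy as np

mu_w, mu_o = 0.5e-3, 5e-3          # water/oil viscosities, Pa.s (assumed)
Swi, Sor = 0.2, 0.25               # irreducible water / residual oil saturation

Sw = np.linspace(Swi, 1 - Sor, 101)
Se = (Sw - Swi) / (1 - Swi - Sor)  # effective (normalized) saturation
krw = 0.4 * Se ** 3                # Corey water curve (assumed endpoints)
kro = 0.9 * (1 - Se) ** 2          # Corey oil curve

fw = 1.0 / (1.0 + (kro * mu_w) / (np.maximum(krw, 1e-12) * mu_o))

# Welge tangent from (Swi, 0): the touching point gives the front saturation,
# and extending the tangent to fw = 1 gives the average saturation behind it.
slope = fw / (Sw - Swi + 1e-12)
front = np.argmax(slope)
print(f"front saturation Swf ~ {Sw[front]:.3f}, fw at front ~ {fw[front]:.3f}")
```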

Keywords: Fractional flow, oil displacement, relative permeability, simultaneous flow.

3806 Simulated Annealing Application for Structural Optimization

Authors: Farhad Kolahan, M. Hossein Abolbashari, Samaeddin Mohitzadeh

Abstract:

Several methods are available for the weight and shape optimization of structures, among which Evolutionary Structural Optimization (ESO) is one of the most widely used. In ESO, however, the optimization criterion is completely case-dependent, and only improving solutions are accepted during the search. In this paper a Simulated Annealing (SA) algorithm is used for the structural optimization problem. This algorithm differs from other random search methods by also accepting non-improving solutions. The SA algorithm is implemented so as to reduce the number of finite element analyses (function evaluations). Computational results show that SA can efficiently and effectively solve such optimization problems within a short search time.
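
A generic SA sketch showing the accept-worse-moves rule the paper relies on; the one-dimensional objective stands in for the (expensive) finite element evaluation, and the cooling schedule is an illustrative choice.

```python
# Simulated annealing with Metropolis acceptance of non-improving moves.
import math, random

def objective(x):                      # placeholder for one FE analysis
    return (x - 3.0) ** 2 + 2.0 * math.sin(5.0 * x)

x = random.uniform(-10, 10)
fx = objective(x)
T = 10.0
while T > 1e-3:
    cand = x + random.gauss(0, 0.5)    # random neighbor move
    fc = objective(cand)
    # Non-improving moves are accepted with probability exp(-delta / T):
    if fc < fx or random.random() < math.exp(-(fc - fx) / T):
        x, fx = cand, fc
    T *= 0.99                          # geometric cooling schedule
print(f"final x = {x:.3f}, f = {fx:.3f}")
```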

Keywords: Simulated annealing, Structural optimization, Compliance, C.V. product.

3805 Improved Artificial Bee Colony Algorithm for Non-Convex Economic Power Dispatch Problem

Authors: Badr M. Alshammari, T. Guesmi

Abstract:

This study presents a modified version of the artificial bee colony (ABC) algorithm that includes a local search technique for solving the non-convex economic power dispatch problem. The local search step is incorporated at the end of each iteration. Total system losses, valve-point loading effects, and prohibited operating zones have been incorporated in the problem formulation; the problem thus becomes highly nonlinear, with a discontinuous objective function. The proposed technique is validated using an IEEE benchmark system with ten thermal units. Simulation results demonstrate that the proposed optimization algorithm has better convergence characteristics than the original ABC algorithm.
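
A sketch of the non-convex unit cost model referred to above: quadratic fuel cost plus a rectified-sine valve-point term. The coefficient values are illustrative, not the IEEE test system data.

```python
# Fuel cost with valve-point loading effect (non-smooth, non-convex).
import numpy as np

def unit_cost(P, a, b, c, e, f, Pmin):
    """Quadratic fuel cost plus the |e*sin(f*(Pmin - P))| valve-point term."""
    return a + b * P + c * P**2 + np.abs(e * np.sin(f * (Pmin - P)))

P = np.linspace(100, 500, 5)           # unit output levels in MW (illustrative)
print(unit_cost(P, a=240.0, b=7.0, c=0.007, e=300.0, f=0.035, Pmin=100.0))
```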

Keywords: Economic power dispatch, artificial bee colony, valve-point loading effects, prohibited operating zones.

3804 Performance Comparison of Prim’s and Ant Colony Optimization Algorithm to Select Shortest Path in Case of Link Failure

Authors: Rimmy Yadav, Avtar Singh

Abstract:

Ant Colony Optimization (ACO) is a promising modern approach to combinatorial optimization. Here, ACO is applied to finding the shortest path during a communication link failure. In this paper, the performances of Prim's algorithm and ACO are compared. Using time complexity and program execution time as the set of parameters, we demonstrate the good performance of ACO in finding an excellent solution to the shortest path problem during a communication link failure.

Keywords: Ant colony optimization, link failure, prim’s algorithm.

3803 An Efficient Algorithm for Delay and Delay-Variation Bounded Least Cost Multicast Routing

Authors: Manas Ranjan Kabat, Manoj Kumar Patel, Chita Ranjan Tripathy

Abstract:

Many multimedia communication applications require a source to transmit messages to multiple destinations subject to a quality of service (QoS) delay constraint. To support delay-constrained multicast communications, computer networks need to guarantee an upper bound on the end-to-end delay from the source node to each of the destination nodes. This is known as the multicast delay problem. On the other hand, if the same message fails to arrive at each destination node at the same time, inconsistency and unfairness problems may arise among users. This is the multicast delay-variation problem. The problem of finding a minimum cost multicast tree with delay and delay-variation constraints has been proven to be NP-complete. In this paper, we propose an efficient heuristic algorithm, the Economic Delay and Delay-Variation Bounded Multicast (EDVBM) algorithm, based on a novel heuristic function, to construct an economic delay- and delay-variation-bounded multicast tree. A noteworthy feature of this algorithm is that it has a very high probability of finding the optimal solution in polynomial time with low computational complexity.

Keywords: EDVBM, Heuristic algorithm, Multicast tree, QoS routing, Shortest path.

3802 Combined Simulated Annealing and Genetic Algorithm to Solve Optimization Problems

Authors: Younis R. Elhaddad

Abstract:

Combinatorial optimization problems arise in many scientific and practical applications, so many researchers try to find or improve methods that solve these problems with high-quality results in less time. Genetic Algorithms (GA) and Simulated Annealing (SA) have both been used to solve optimization problems, and both search a solution space through a sequence of iterative states. However, there are also significant differences between them: the GA mechanism works in parallel on a set of solutions and exchanges information using the crossover operation, whereas SA works on a single solution at a time. In this work, SA and GA are combined using a new technique in order to overcome the disadvantages of both algorithms.

Keywords: Genetic Algorithm, Optimization problems, Simulated Annealing, Traveling Salesman Problem

3801 Wavelet Entropy Based Algorithm for Fault Detection and Classification in FACTS Compensated Transmission Line

Authors: Amany M. El-Zonkoly, Hussein Desouki

Abstract:

Distance protection of transmission lines including advanced flexible AC transmission system (FACTS) devices has been a very challenging task. The FACTS devices of interest in this paper are the static synchronous series compensator (SSSC) and the unified power flow controller (UPFC). In this paper, a new algorithm is proposed to detect and classify the fault and identify the fault position in a transmission line with respect to a FACTS device placed at the midpoint of the line. Discrete wavelet transformation and wavelet entropy calculations are used to analyze the during-fault current and voltage signals of the compensated transmission line. The proposed algorithm is very simple and accurate in fault detection and classification. A variety of fault cases and simulation results are presented to show the effectiveness of the algorithm.
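
A minimal sketch of the wavelet-entropy feature, assuming the PyWavelets package; the fault-like signal is synthetic, not a simulated FACTS-compensated line record.

```python
# Wavelet entropy of a signal from the energy distribution across DWT levels.
import numpy as np
import pywt

fs = 10_000
t = np.arange(0, 0.1, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t)                        # fundamental at 50 Hz
signal[500:] += 0.8 * np.sin(2 * np.pi * 900 * t[500:])    # fault-like transient

coeffs = pywt.wavedec(signal, 'db4', level=4)              # multilevel DWT
energy = np.array([np.sum(c ** 2) for c in coeffs])        # energy per band
p = energy / energy.sum()                                  # relative energies
wavelet_entropy = -np.sum(p * np.log(p + 1e-12))           # Shannon entropy
print(f"wavelet entropy = {wavelet_entropy:.3f}")
```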

Keywords: Entropy calculation, FACTS, SSSC, UPFC, wavelet transform.

3800 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between the explanatory variables and the predicted variables; past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested using predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when applied to large amounts of data. In fact, because of their volume, their nature (semi-structured or unstructured), and their variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to the algorithm are presented, and a version of the extended algorithm is then defined to make it applicable to huge quantities of data.

Keywords: Predictive analysis, big data, predictive analysis algorithms, CART algorithm.

3799 Digital Image Encryption Scheme using Chaotic Sequences with a Nonlinear Function

Authors: H. Ogras, M. Turk

Abstract:

In this study, a system of encryption based on chaotic sequences is described. The system is used for encrypting digital image data for the purpose of secure image transmission. An image secure communication scheme based on logistic map chaotic sequences with a nonlinear function is proposed. Encryption and decryption keys are obtained from a one-dimensional logistic map that generates the secret key for the input of the nonlinear function. The receiver can recover the information using the received signal and identical key sequences through the inverse system technique. Computer simulations indicate that the transmitted source image can be correctly and reliably recovered using the proposed scheme, even over a noisy channel. The performance of the system is discussed by evaluating the quality of the recovered image with and without channel noise.
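
A minimal sketch of the keystream idea described above: iterate the logistic map, pass the state through a (here deliberately simple) nonlinear output function, and XOR the result with the pixel bytes. The map parameters stand in for the secret key, and the output function is an assumption, not the authors' construction.

```python
# Logistic-map keystream XORed with image bytes; decryption reuses the key.
import numpy as np

def keystream(x0, r, n):
    x, out = x0, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)                  # logistic map iteration
        out[i] = int((x * x) * 256) % 256      # toy nonlinear output function
    return out

img = np.random.default_rng(5).integers(0, 256, (64, 64), dtype=np.uint8)
ks = keystream(x0=0.3141592, r=3.99, n=img.size)

cipher = img ^ ks.reshape(img.shape)           # encryption
plain = cipher ^ ks.reshape(img.shape)         # decryption with identical key
assert (plain == img).all()
```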

Keywords: Digital image, Image encryption, Secure communication

3798 Fast Generation of High-Performance Driveshafts: A Digital Approach to Automated Linked Topology and Design Optimization

Authors: Willi Zschiebsch, Alrik Dargel, Sebastian Spitzer, Philipp Johst, Robert Böhm, Niels Modler

Abstract:

In this article, we investigate an approach that digitally links individual development process steps, using the drive shaft of an aircraft engine as a representative example of a fiber polymer composite. Such high-performance lightweight composite structures have many adjustable parameters that influence the mechanical properties, and only a combination of optimal parameter values can lead to energy-efficient lightweight structures. The development tools required for the Engineering Design Process (EDP) are often isolated solutions, and their compatibility with each other is limited. A digital framework is presented in this study which allows individual specialised tools to be linked via the generated data in such a way that automated optimization across programs becomes possible. This is demonstrated by linking geometry generation with numerical structural analysis. The proposed digital framework for automated design optimization demonstrates the feasibility of a completely digital approach to design optimization. The methodology shows promising potential for achieving optimal solutions in terms of mass, material utilization, eigenfrequency, and deformation under lateral load with less development effort. The development of such a framework is an important step towards a more efficient design approach that can lead to stable and balanced results.

Keywords: Digital Linked Process, composite, CFRP, multi-objective, EDP, NSGA-2, NSGA-3, TPE.

3797 Development of Perez-Du Mortier Calibration Algorithm for Ground-Based Aerosol Optical Depth Measurement with Validation using SMARTS Model

Authors: Jedol Dayou, Jackson Hian Wui Chang, Rubena Yusoff, Ag. Sufiyan Abd. Hamid, Fauziah Sulaiman, Justin Sentian

Abstract:

Aerosols are small particles suspended in the air that have widely varying spatial and temporal distributions. The concentration of aerosol in the total columnar atmosphere is normally measured using the aerosol optical depth (AOD). In long-term monitoring stations, accurate AOD retrieval is often difficult due to the lack of frequent calibration. To overcome this problem, a near-sea-level Langley calibration algorithm is developed using a combination of a clear-sky detection model and a statistical filter. It attempts to produce a dataset consisting only of homogeneous and stable atmospheric conditions for the Langley calibration. In this paper, a radiance-based validation method is performed to further investigate the feasibility and consistency of the proposed algorithm at different locations, days, and times. The algorithm is validated using the SMARTS model based on direct normal irradiance (DNI) values. The overall results confirm that the proposed calibration algorithm is feasible and consistent for measurements taken at different sites and under different weather conditions.
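
A minimal sketch of the Langley regression underlying the calibration: ln(V) is linear in air mass with slope -tau and intercept ln(V0). The data are synthetic and assumed to be pre-filtered for stable, clear-sky conditions, which is what the clear-sky detection model and statistical filter are meant to provide.

```python
# Langley plot: regress ln(V) on air mass to recover tau and the V0 constant.
import numpy as np

rng = np.random.default_rng(6)
m = np.linspace(1.0, 5.0, 40)                  # relative air mass over a morning
tau_true, V0_true = 0.12, 1.50
V = V0_true * np.exp(-tau_true * m) * (1 + rng.normal(0, 0.005, m.size))

slope, intercept = np.polyfit(m, np.log(V), 1)
print(f"tau = {-slope:.4f}  (true {tau_true}),  V0 = {np.exp(intercept):.4f}")
```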

Keywords: Aerosol optical depth, direct normal irradiance, Langley calibration, radiance-based validation, SMARTS.

3796 Unpacking Chilean Preservice Teachers’ Beliefs on Practicum Experiences through Digital Stories

Authors: Claudio Díaz, Mabel Ortiz

Abstract:

An EFL teacher education programme in Chile takes five years to train a future teacher of English. Preservice teachers are prepared to learn an advanced level of English and to teach the language from 5th to 12th grade in the Chilean educational system. In the context of their first EFL Methodology course in year four, preservice teachers have to create a five-minute digital story that starts from a critical incident they have experienced as teachers-to-be during their observations or interventions in the schools. A critical incident can be defined as a happening, a specific incident or event, either observed by them or involving them. The happening sparks their thinking and may make them subsequently think differently about the particular event. When they create their digital stories, preservice teachers put technology, teaching practice, and theory together to narrate a story that is complemented by still images, moving images, text, sound effects, and music. The story should be told as a personal narrative explaining the critical incident. This presentation focuses on the creation process of 50 Chilean preservice teachers' digital stories, highlighting the critical incidents from which they started their stories. It also unpacks preservice teachers' beliefs and reflections when approaching their teaching practices in schools. These beliefs are coded and categorized through content analysis to evidence preservice teachers' most deeply rooted conceptions about English teaching and learning in Chilean schools. The findings indicate that preservice teachers' beliefs are strongly mediated by contextual and affective factors.

Keywords: Beliefs, Digital stories, Preservice teachers, Practicum.

3795 Proposing a Pareto-based Multi-Objective Evolutionary Algorithm to Flexible Job Shop Scheduling Problem

Authors: Seyed Habib A. Rahmati

Abstract:

During the last decades, developing multi-objective evolutionary algorithms for optimization problems has attracted considerable attention. The flexible job shop scheduling problem, as an important scheduling optimization problem, has attracted this attention too. However, most of the multi-objective algorithms developed for this problem use non-Pareto approaches; in other words, most of them combine their objectives and then solve the multi-objective problem through single-objective approaches, with only a few studies using Pareto-based algorithms. Therefore, in this paper, a new Pareto-based algorithm called the controlled elitism non-dominated sorting genetic algorithm (CENSGA) is proposed for the multi-objective FJSP (MOFJSP). The considered objectives are makespan, critical machine workload, and total workload of machines. The proposed algorithm is also compared statistically with one of the best Pareto-based algorithms in the literature on several multi-objective criteria.

Keywords: Scheduling, Flexible job shop scheduling problem, controlled elitism non-dominated sorting genetic algorithm

3794 A Comparison between Heuristic and Meta-Heuristic Methods for Solving the Multiple Traveling Salesman Problem

Authors: San Nah Sze, Wei King Tiong

Abstract:

The multiple traveling salesman problem (mTSP) can be used to model many practical problems. The mTSP is more complicated than the traveling salesman problem (TSP) because it requires determining which cities to assign to each salesman as well as the optimal ordering of the cities within each salesman's tour. Previous studies proposed that Genetic Algorithms (GA), Integer Programming (IP), and several neural network (NN) approaches could be used to solve the mTSP. This paper compares the results for the mTSP solved with a Genetic Algorithm (GA) and the Nearest Neighbor Algorithm (NNA). The cities are clustered into a few groups using the k-means clustering technique, with the number of groups depending on the number of salesmen. Each group is then solved with the NNA and the GA as an independent TSP. It is found that k-means clustering with the NNA is superior to the GA in terms of performance (evaluated by the fitness function) and computing time.
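
A minimal sketch of the cluster-first, route-second approach compared in the paper: k-means assigns cities to salesmen, then each cluster is toured with the nearest neighbor heuristic. The coordinates and cluster count are random stand-ins.

```python
# Cluster-first (k-means), route-second (nearest neighbor) for the mTSP.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
cities = rng.random((40, 2))
k = 3                                            # number of salesmen

label = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(cities)

def nn_tour(pts):
    """Greedy nearest neighbor tour over pts, starting from index 0."""
    todo, tour = list(range(1, len(pts))), [0]
    while todo:
        last = pts[tour[-1]]
        nxt = min(todo, key=lambda i: np.linalg.norm(pts[i] - last))
        tour.append(nxt)
        todo.remove(nxt)
    return tour

for s in range(k):
    pts = cities[label == s]
    tour = nn_tour(pts)
    length = sum(np.linalg.norm(pts[tour[i]] - pts[tour[i + 1]])
                 for i in range(len(tour) - 1))
    print(f"salesman {s}: {len(pts)} cities, tour length {length:.2f}")
```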

Keywords: Multiple Traveling Salesman Problem, Genetic Algorithm, Nearest Neighbor Algorithm, k-Means Clustering.

3793 A Novel Plausible Deniability Scheme in Secure Steganography

Authors: Farshad Amin, Majid Soleimanipour, Alireza Karimi

Abstract:

The goal of steganography is to avoid drawing suspicion to the transmission of a hidden message; if suspicion is raised, steganography may fail. The success of steganography thus depends on the secrecy of the action. If the steganography is detected, the system fails, but the security of the data then depends on the robustness of the applied algorithm. In this paper, we propose a novel plausible deniability scheme in steganography that uses a diversionary message encrypted with a DES-based algorithm. We then compress the secret message, encrypt it with the receiver's public key along with the stego key, and embed both messages in a carrier using an embedding algorithm. We demonstrate how this method supports plausible deniability and is robust against steganalysis.
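
A toy LSB sketch of the two-message idea: a diversionary message in bit-plane 0 and the (already encrypted) secret in bit-plane 1. Real schemes use keyed, scattered embedding positions; the fixed bit-planes and plain-text payloads here are purely illustrative.

```python
# Embed/extract two payloads in separate bit planes of a carrier image.
import numpy as np

def embed(carrier, payload, plane):
    """Write payload bits into the given bit plane of the carrier pixels."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    out = carrier.copy().ravel()
    out[:bits.size] = (out[:bits.size] & ~np.uint8(1 << plane)) | (bits << plane)
    return out.reshape(carrier.shape)

def extract(stego, nbytes, plane):
    """Read nbytes back out of the given bit plane."""
    bits = (stego.ravel()[:nbytes * 8] >> plane) & 1
    return np.packbits(bits).tobytes()

carrier = np.random.default_rng(8).integers(0, 256, (64, 64), dtype=np.uint8)
stego = embed(carrier, b'weather is fine today', plane=0)   # diversionary message
stego = embed(stego, b'SECRET-CIPHERTEXT', plane=1)         # hidden payload

print(extract(stego, 21, plane=0))
print(extract(stego, 17, plane=1))
```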

Keywords: Steganography, Cryptography, Information Hiding.
