Search results for: genetic data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8017

6817 The Pixel Value Data Approach for Rainfall Forecasting Based on GOES-9 Satellite Image Sequence Analysis

Authors: C. Yaiprasert, K. Jaroensutasinee, M. Jaroensutasinee

Abstract:

This paper presents an approach for forecasting possible rainfall areas in Thailand based on pixel values extracted from remote sensing satellite images. First, pixel value data are extracted automatically from the satellite image sequence. Then, a data-processing step is designed to infer correlations between pixel values and possible rainfall occurrences. The results show that days with a high averaged pixel value in the daily water vapor imagery also tend to have a high amount of daily rainfall, suggesting that the averaged pixel value can be used as an indicator of rain events. Positive associations between the pixel values of daily water vapor images and daily rainfall amounts are found at rain-gauge stations throughout Thailand. The proposed approach can serve as a helpful aid for rainfall forecasting by meteorologists through the automated analysis and interpretation of meteorological remote sensing data.
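
The core of the method is a simple association test between the daily averaged water vapor pixel value and the daily rainfall recorded at each gauge. A minimal sketch of that step, assuming the GOES-9 frames and gauge records have already been aligned by date (the file names and array shapes below are hypothetical, not the authors' data):

```python
import numpy as np

# hypothetical inputs: one water-vapor frame per day (day, rows, cols) and the
# matching daily rainfall totals (day, station) for the same dates
frames = np.load("goes9_water_vapor_daily.npy")      # shape (n_days, H, W)
rainfall = np.load("daily_rainfall_by_station.npy")  # shape (n_days, n_stations)

# averaged pixel value of each daily water-vapor image
daily_mean_pixel = frames.reshape(frames.shape[0], -1).mean(axis=1)

# Pearson correlation between averaged pixel value and rainfall at each station
for s in range(rainfall.shape[1]):
    r = np.corrcoef(daily_mean_pixel, rainfall[:, s])[0, 1]
    print(f"station {s}: r = {r:+.2f}")
```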

Keywords: Pixel values, satellite image, water vapor, rainfall, image processing.

6816 Multistage Data Envelopment Analysis Model for Malmquist Productivity Index Using Grey's System Theory to Evaluate Performance of Electric Power Supply Chain in Iran

Authors: Mesbaholdin Salami, Farzad Movahedi Sobhani, Mohammad Sadegh Ghazizadeh

Abstract:

Evaluation of organizational performance is among the most important measures that help organizations and entities continuously improve their efficiency. Organizations can use the existing data and the results from comparisons of the units under investigation to obtain an estimation of their performance. The Malmquist Productivity Index (MPI) is an important index in the evaluation of overall productivity, since it considers technological developments and technical efficiency at the same time. This article proposes a model based on the multistage MPI that accounts for limited data (Grey's theory). The model can evaluate the performance of units using limited and uncertain data in a multistage process. It was applied by the electricity market manager to Iran's electric power supply chain (EPSC), which contains uncertain data, to evaluate the performance of its actors. Results from solving the model showed an improvement in the accuracy of the estimated future performance of the units under investigation when Grey's system theory was used. The model can be used in any case study in which the MPI is applied and data are limited or uncertain.
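
For reference, the conventional geometric-mean form of the Malmquist Productivity Index between periods t and t+1 is reproduced below; the authors' multistage grey variant builds on this definition, which is stated here only in its standard form:

```latex
\mathrm{MPI}^{t,t+1}
  = \left[
      \frac{D^{t}\left(x^{t+1},\,y^{t+1}\right)}{D^{t}\left(x^{t},\,y^{t}\right)}
      \cdot
      \frac{D^{t+1}\left(x^{t+1},\,y^{t+1}\right)}{D^{t+1}\left(x^{t},\,y^{t}\right)}
    \right]^{1/2}
```

Here D^t(x, y) is the distance function of a unit with inputs x and outputs y measured against the period-t frontier; a value greater than one indicates productivity growth between the two periods.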

Keywords: Malmquist Index, Grey's Theory, Charnes Cooper & Rhodes (CCR) Model, network data envelopment analysis, Iran electricity power chain.

6815 Optimization of Shear Frame Structures Applying Various Forms of Wavelet Transforms

Authors: Seyed Sadegh Naseralavi, Sohrab Nemati, Ehsan Khojastehfar, Sadegh Balaghi

Abstract:

In the present research, various formulations of the wavelet transform are applied to the acceleration time history of an earthquake. These transforms decompose the strong ground motion into low- and high-frequency parts. Since the high-frequency portion of the strong ground motion has only a minor effect on the dynamic response of structures, the structure is excited by the low-frequency part alone. Consequently, the seismic response of the structure is predicted in roughly half the computational time of a conventional time history analysis. To reduce the computational effort needed in seismic optimization, the seismic optimization of a shear frame structure is carried out by applying the various forms of the transform within a genetic algorithm.
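
The filtering step can be illustrated with PyWavelets: the record is decomposed, the detail (high-frequency) coefficients are discarded, and a low-frequency excitation history is reconstructed for the structural analysis. The wavelet family, decomposition depth and the synthetic record below are placeholders, not the authors' choices.

```python
import numpy as np
import pywt

dt = 0.02                               # sampling step of the accelerogram (s)
t = np.arange(0.0, 40.0, dt)
acc = np.sin(2 * np.pi * 1.5 * t) + 0.3 * np.random.randn(t.size)  # placeholder record

# one-level decomposition into approximation (low-frequency) and detail coefficients
cA, cD = pywt.dwt(acc, 'db4')

# keep only the low-frequency content and reconstruct the excitation history
acc_low = pywt.idwt(cA, np.zeros_like(cD), 'db4')[: acc.size]
```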

Keywords: Time history analysis, wavelet transform, optimization, earthquake.

6814 Optimization of Distribution Network Configuration for Loss Reduction Using Artificial Bee Colony Algorithm

Authors: R. Srinivasa Rao, S.V.L. Narasimham, M. Ramalingaraju

Abstract:

Network reconfiguration in a distribution system is realized by changing the status of sectionalizing switches in order to reduce the power loss in the system. This paper presents a new method that applies the artificial bee colony (ABC) algorithm to determine the sectionalizing switches to be operated so as to solve the distribution system loss minimization problem. The ABC algorithm is a new population-based metaheuristic approach inspired by the intelligent foraging behavior of a honeybee swarm. One advantage of the ABC algorithm is that it does not require external parameters such as the crossover rate and mutation rate, which are needed by genetic algorithms and differential evolution and are hard to determine in advance. Another advantage is that its global search ability is implemented through a neighborhood source production mechanism, which acts similarly to a mutation process. To demonstrate the validity of the proposed algorithm, computer simulations are carried out on 14-, 33-, and 119-bus systems and compared with different approaches available in the literature. The proposed method outperforms the other methods in terms of solution quality and computational efficiency.
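
The neighborhood source production the abstract refers to is the usual ABC update v_ij = x_ij + phi * (x_ij - x_kj). A minimal sketch on a generic real-valued encoding is given below; mapping the encoding back to switch states is problem-specific and omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

def neighbour_source(foods, i):
    """ABC neighbourhood production: v_ij = x_ij + phi * (x_ij - x_kj)."""
    n_food, dim = foods.shape
    j = int(rng.integers(dim))                              # one randomly chosen dimension
    k = rng.choice([m for m in range(n_food) if m != i])    # a different food source
    phi = rng.uniform(-1.0, 1.0)
    v = foods[i].copy()
    v[j] = foods[i, j] + phi * (foods[i, j] - foods[k, j])
    return v

# usage on a hypothetical population of 10 candidate solutions in 5 dimensions
foods = rng.uniform(size=(10, 5))
candidate = neighbour_source(foods, 0)
```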

Keywords: Distribution system, Network reconfiguration, Loss reduction, Artificial Bee Colony Algorithm.

6813 Comparative Analysis of the Public Funding for Greek Universities: An Ordinal DEA/MCDM Approach

Authors: Yiannis Smirlis, Dimitris K. Despotis

Abstract:

This study performs a comparative analysis of the 21 Greek universities in terms of the public funding awarded to cover their operating expenditure. First, it introduces a DEA/MCDM model that allocates the fund to four expenditure factors in the most favorable way for each university. Then, it presents a common, consensual assessment model that reallocates the amounts while keeping the total public budget at the same level. The analysis shows that a number of universities cannot justify their public funding in terms of their size and operational workload. For these universities, a sufficient reduction of the public funding amount is estimated as a future target. Due to the lack of precise data for a number of expenditure criteria, the analysis is based on a mixed crisp-ordinal data set.

Keywords: Data envelopment analysis, Greek universities, operating expenditures, ordinal data.

6812 Inversion of Electrical Resistivity Data: A Review

Authors: Shrey Sharma, Gunjan Kumar Verma

Abstract:

High-density electrical prospecting has been widely used in groundwater investigation, civil engineering and environmental surveys. For efficient inversion, the forward modeling routine, sensitivity calculation, and inversion algorithm must all be efficient. This paper attempts to provide a brief summary of the past and ongoing developments of the method. It includes reviews of the procedures used for data acquisition, processing and inversion of electrical resistivity data, based on a compilation of the academic literature. In recent times there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D and even 4-D surveys efficiently to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method is also presented.
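
The linearized least-squares inversion mentioned above typically iterates a smoothness-constrained (Occam-type) model update. A common form of that update, stated here in general terms rather than as the specific scheme of any reviewed paper, is:

```latex
\Delta \mathbf{m}_{k}
  = \left( \mathbf{J}_{k}^{\mathsf{T}} \mathbf{J}_{k} + \lambda\, \mathbf{C}^{\mathsf{T}} \mathbf{C} \right)^{-1}
    \mathbf{J}_{k}^{\mathsf{T}} \left( \mathbf{d}_{\mathrm{obs}} - f(\mathbf{m}_{k}) \right),
  \qquad
  \mathbf{m}_{k+1} = \mathbf{m}_{k} + \Delta \mathbf{m}_{k}
```

where m_k is the resistivity model at iteration k, f(.) the forward response, J_k the sensitivity (Jacobian) matrix, C a roughness (smoothing) operator, lambda the damping factor and d_obs the observed apparent resistivities.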

Keywords: Resistivity, inversion, optimization.

6811 Increasing the System Availability of Data Centers by Using Virtualization Technologies

Authors: Chris Ewe, Naoum Jamous, Holger Schrödl

Abstract:

Like most entrepreneurs, data center operators pursue goals such as profit maximization, improvement of the company’s reputation, or simply remaining in the market. Part of these aims is to guarantee a given quality of service. Quality characteristics are specified in a contract called the service level agreement. A central part of this agreement is the non-functional properties of an IT service. System availability is one of the most important of these properties, as will be shown in this paper. To comply with availability requirements, data center operators can use virtualization technologies. A clear model to assess the effect of virtualization functions on the parts of a data center in relation to system availability is still missing. This paper aims to introduce a basic model that shows these connections and to consider whether the identified effects are positive or negative. Thus, this work also points out possible disadvantages of the technology. In consequence, the paper shows opportunities as well as risks of data center virtualization in relation to system availability.

Keywords: Availability, cloud computing IT service, quality of service, service level agreement, virtualization.

6810 Biodiversity of Micromycetes Isolated from Soils of Different Agricultures in Kazakhstan and Their Plant Growth Promoting Potential

Authors: L. V. Ignatova, Y. V. Brazhnikova, T. D. Mukasheva, A. A. Omirbekova, R. Zh. Berzhanova, R. K. Sydykbekova, T. A. Karpenyuk, A. V. Goncharova

Abstract:

A comparative analysis of different taxonomic groups of microorganisms isolated from dark chernozem soils under different crops (alfalfa, melilot, sainfoin, soybean, rapeseed) in the Almaty region of Kazakhstan was conducted. It was shown that the greatest number of micromycetes was typical of the soil planted with alfalfa and canola. The species diversity of micromycetes markedly decreases as the root surface is approached, so that the species composition in the rhizosphere is much more uniform than in the virgin soil. Promising strains of microscopic fungi and yeasts with plant growth-promoting activity towards the crops were selected. Among the selected fungi are representatives of Penicillium bilaiae, Trichoderma koningii, Fusarium equiseti and Aspergillus ustus. The highest rates of growth and development of plant seedlings were observed under the influence of the yeasts Aureobasidium pullulans, Rhodotorula mucilaginosa and Metschnikovia pulcherrima. The identification of the selected micromycetes was confirmed using molecular-genetic techniques.

Keywords: Agricultures, biodiversity, micromycetes, plant growth-promoting microorganisms.

6809 Data Oriented Modeling of Uniform Random Variable: Applied Approach

Authors: Ahmad Habibizad Navin, Mehdi Naghian Fesharaki, Mirkamal Mirnia, Mohamad Teshnelab, Ehsan Shahamatnia

Abstract:

In this paper, we introduce a new data-oriented model of the uniform random variable that is well matched with computing systems. Due to this conformity with the structure of current computers, the model can be used efficiently in statistical inference.

Keywords: Uniform random variable, Data oriented modeling, Statistical inference, Prodigraph, Statistically complete tree, Uniform digital probability digraph, Uniform n-complete probability tree.

6808 Enhancement of Stereo Video Pairs Using SDNs To Aid In 3D Reconstruction

Authors: Lewis E. Hibell, Honghai Liu, David J. Brown

Abstract:

This paper presents the results of enhancing the images from a left and right stereo pair in order to increase the resolution of a 3D representation of the scene generated from that same pair. A new neural network structure known as a Self Delaying Dynamic Network (SDN) has been used to perform the enhancement. The advantage of SDNs over existing techniques such as bicubic interpolation is their ability to cope with motion and noise effects. SDNs are used to generate two high resolution images, one based on frames taken from the left view of the subject and one based on frames from the right. This new high resolution stereo pair is then processed by a disparity map generator. The disparity map generated is compared to two other disparity maps generated from the same scene. The first is a map generated from an original high resolution stereo pair and the second is a map generated using a stereo pair enhanced by bicubic interpolation. The maps generated using the SDN-enhanced pairs match the target maps more closely. The addition of extra noise to the input images is less problematic for the SDN system, which is still able to outperform bicubic interpolation.
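
The evaluation pipeline can be approximated with OpenCV. Since no public SDN implementation is available, the sketch below covers only the bicubic baseline and the disparity-map stage of the comparison; the file names and matcher settings are placeholders.

```python
import cv2
import numpy as np

# hypothetical low-resolution stereo pair
left_lr = cv2.imread('left.png', cv2.IMREAD_GRAYSCALE)
right_lr = cv2.imread('right.png', cv2.IMREAD_GRAYSCALE)

# upscale 2x with bicubic interpolation (the baseline the SDN is compared against)
left_hr = cv2.resize(left_lr, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)
right_hr = cv2.resize(right_lr, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)

# disparity map from the enhanced pair (SGBM returns fixed-point values scaled by 16)
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=5)
disparity = matcher.compute(left_hr, right_hr).astype(np.float32) / 16.0
```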

Keywords: Genetic Evolution, Image Enhancement, Neural Networks, Stereo Vision.

6807 Topic Modeling Using Latent Dirichlet Allocation and Latent Semantic Indexing on South African Telco Twitter Data

Authors: Phumelele P. Kubheka, Pius A. Owolawi, Gbolahan Aiyetoro

Abstract:

Twitter is one of the most popular social media platforms where users share their opinions on different subjects. Twitter can be considered a great source for text mining due to the high volumes of data generated through the platform daily. Many industries, such as telecommunication companies, can leverage the availability of Twitter data to better understand their markets and make appropriate business decisions. This study performs topic modeling on Twitter data using Latent Dirichlet Allocation (LDA). The obtained results are benchmarked with another topic modeling technique, Latent Semantic Indexing (LSI). The study aims to retrieve topics from a Twitter dataset containing user tweets on South African Telcos. Results from this study show that LSI is much faster than LDA. However, LDA yields better results, with topic coherence higher by 8% for the best-performing model in this experiment. A higher topic coherence score indicates better performance of the model.
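
Both models, and the c_v topic-coherence score used for the comparison, are available in Gensim. A minimal sketch on a few hypothetical, already tokenised tweets (not the authors' dataset) is:

```python
from gensim import corpora
from gensim.models import LdaModel, LsiModel, CoherenceModel

# hypothetical cleaned and tokenised tweets about SA Telcos
tweets = [["network", "down", "again"], ["data", "bundle", "expired"],
          ["great", "fibre", "speed"], ["billing", "error", "support"]]

dictionary = corpora.Dictionary(tweets)
corpus = [dictionary.doc2bow(doc) for doc in tweets]

lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=2, random_state=1)
lsi = LsiModel(corpus=corpus, id2word=dictionary, num_topics=2)

# compare the two models by c_v topic coherence
for name, model in [("LDA", lda), ("LSI", lsi)]:
    cm = CoherenceModel(model=model, texts=tweets, dictionary=dictionary, coherence="c_v")
    print(name, "coherence:", round(cm.get_coherence(), 3))
```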

Keywords: Big data, latent Dirichlet allocation, latent semantic indexing, Telco, topic modeling, Twitter.

6806 Pin type Clamping Attachment for Remote Setup of Machining Process

Authors: Afzeri, R. Muhida, Darmawan, A. N. Berahim

Abstract:

Sharing a manufacturing facility through remote operation and monitoring of a machining process is a challenge for the effective use of production facilities. Several automation tools, in terms of both hardware and software, are necessary for the successful remote operation of a machine. This paper presents a prototype of a workpiece-holding attachment for the remote operation of a milling process through self-configuration of the workpiece setup. The prototype is designed with a mechanism that reorients the work surface into the machining spindle direction with high positioning accuracy. A variety of part geometries can be held by the attachment to perform single-setup machining. Pins arranged in an array pattern additionally clamp the workpiece surface from two opposite directions to increase the machining rigidity. The optimum pin configuration conforming to the workpiece geometry with minimum deformation is determined through hybrid algorithms, namely Genetic Algorithms (GA) and Particle Swarm Optimization (PSO). The prototype, together with the intelligent optimization technique, is able to hold a variety of workpiece geometries and is suitable for low-repetition production in remote operation.

Keywords: Optimization, Remote machining, Genetic Algorithms, Machining Fixture.

6805 Prediction of Compressive Strength of SCC Containing Bottom Ash using Artificial Neural Networks

Authors: Yogesh Aggarwal, Paratibha Aggarwal

Abstract:

The paper presents a comparative performance of models developed to predict 28-day compressive strength using neural network techniques, for data taken from the literature (ANN-I) and data developed experimentally for SCC containing bottom ash as a partial replacement of fine aggregates (ANN-II). The data used in the models are arranged as six and eight input parameters, respectively, covering the contents of cement, sand, coarse aggregate, fly ash as partial replacement of cement, bottom ash as partial replacement of sand, water and water/powder ratio, and superplasticizer dosage; the output parameters are the 28-day compressive strength for ANN-I and the compressive strengths at 7, 28, 90 and 365 days for ANN-II. The importance of the different input parameters for predicting the strengths at various ages using the neural network is also given. The model developed from literature data could be easily extended to the experimental data with bottom ash as partial replacement of sand, with some modifications.

Keywords: Self compacting concrete, bottom ash, strength, prediction, neural network, importance factor.

6804 Ride Control of Passenger Cars with Semi-active Suspension System Using a Linear Quadratic Regulator and Hybrid Optimization Algorithm

Authors: Ali Fellah Jahromi, Wen Fang Xie, Rama B. Bhat

Abstract:

A semi-active control strategy for the suspension systems of passenger cars is presented, employing magnetorheological (MR) dampers. The vehicle is modeled with seven DOFs, including the roll, pitch and bounce of the car body and the vertical motion of the four tires. In order to design an optimal controller based on the actuator constraints, a Linear Quadratic Regulator (LQR) is designed. The design procedure of the LQR consists of selecting two weighting matrices to minimize the energy of the control system. This paper presents a hybrid optimization procedure, a combination of gradient-based and evolutionary algorithms, to choose the weighting matrices with regard to the actuator constraints. The optimization algorithm is formulated on the basis of maximum comfort and the actuator constraints. It is noted that utilizing the present control algorithm may significantly reduce the vibration response of the passenger car, thus providing a comfortable ride.
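
Once the weighting matrices Q and R have been fixed by the hybrid optimizer, the LQR gain follows from the algebraic Riccati equation. Below is a minimal sketch on a single-DOF sprung-mass model, not the authors' 7-DOF full-car model; all numbers are placeholders.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# single-DOF sprung-mass model standing in for the 7-DOF full-car model
m, c, k = 300.0, 1200.0, 16000.0          # mass (kg), damping (Ns/m), stiffness (N/m)
A = np.array([[0.0, 1.0], [-k / m, -c / m]])
B = np.array([[0.0], [1.0 / m]])

# weighting matrices: in the paper these are tuned by the SQP/GA hybrid
Q = np.diag([1e4, 1e2])
R = np.array([[1e-3]])

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)           # state-feedback gain, u = -K x
print("LQR gain:", K)
```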

Keywords: Full car model, Linear Quadratic Regulator, Sequential Quadratic Programming, Genetic Algorithm

6803 Using the Simple Fixed Rate Approach to Solve Economic Lot Scheduling Problem under the Basic Period Approach

Authors: Yu-Jen Chang, Yun Chen, Hei-Lam Wong

Abstract:

The Economic Lot Scheduling Problem (ELSP) is a valuable mathematical model that can support decision-makers in making scheduling decisions. The basic period approach is effective for solving the ELSP. The assumption behind the basic period approach is that a product must be produced at its maximum production rate. However, a product can lower its production rate to reduce the average total cost when the facility has extra idle time. Past research has discussed how a product adjusts its production rate under the common cycle approach. To the best of our knowledge, no studies have addressed how a product lowers its production rate under the basic period approach, and this paper is the first to discuss the topic. The research develops a simple fixed rate approach that adjusts the production rate of a product under the basic period approach to solve the ELSP. Our numerical example shows that the approach can find a better solution than the traditional basic period approach. Our mathematical model, which applies the fixed rate approach under the basic period approach, can serve as a reference for other related research.

Keywords: Economic Lot, Basic Period, Genetic Algorithm, Fixed Rate.

6802 A Review: Comparative Analysis of Different Categorical Data Clustering Ensemble Methods

Authors: S. Sarumathi, N. Shanthi, M. Sharmila

Abstract:

A substantial amount of work has been done on data clustering research under the unsupervised learning paradigm in data mining. Several algorithms and methods have been proposed, focusing on clustering different data types, on the representation of cluster models, and on the accuracy of the resulting clusters. However, no single clustering algorithm proves to be the most efficient in providing the best results for every data set. To address this issue, the cluster ensemble method was developed. Cluster ensembles are a good alternative approach to the cluster analysis problem: their aim is to merge different clustering solutions in such a way as to achieve accuracy and to improve the quality of the individual data clusterings. The continual development of new methods in data mining, and the ongoing interest in inventing new algorithms, make a critical analysis of the existing techniques and of likely future developments necessary. This paper presents a comparative study of different cluster ensemble methods, along with their features, their systematic working process, and the average accuracy and error rates of each method. This comprehensive analysis should be useful to the community of clustering practitioners and should help in deciding the most suitable method for the problem at hand.
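
Most of the surveyed ensembles combine base partitions through a consensus function such as the co-association (evidence accumulation) matrix. A minimal sketch of that idea on toy numeric data, not a re-implementation of any specific categorical method from the survey, is:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# toy numeric data stands in for a real (categorical) data set in this sketch
X, _ = make_blobs(n_samples=60, centers=3, random_state=7)
n, n_runs = X.shape[0], 20
rng = np.random.default_rng(7)

# evidence accumulation: fraction of base clusterings in which two points co-occur
co = np.zeros((n, n))
for r in range(n_runs):
    labels = KMeans(n_clusters=int(rng.integers(2, 6)), n_init=5, random_state=r).fit_predict(X)
    co += (labels[:, None] == labels[None, :])
co /= n_runs

# consensus partition from the co-association matrix via average-linkage clustering
dist = 1.0 - co
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="average")
consensus = fcluster(Z, t=3, criterion="maxclust")
```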

Keywords: Clustering, Cluster Ensemble methods, Co-association matrix, Consensus function, Median partition.

6801 Application of Data Envelopment Analysis to Assess Quality Management Efficiency

Authors: Chuen Tse Kuah, Kuan Yew Wong, Farzad Behrouzi

Abstract:

This paper aims to illustrate the application of Data Envelopment Analysis (DEA) as a tool to assess Quality Management (QM) efficiency. A variant of DEA, the slack-based measure (SBM), is used for this purpose. The study finds that DEA is suitable for measuring QM efficiency and for giving improvement suggestions to the inefficient QM units.

Keywords: Quality Management, Data Envelopment Analysis, Slack Based Measure, Efficiency Measurement.

6800 Exploring the Correlation between Population Distribution and Urban Heat Island under Urban Data: Taking Shenzhen Urban Heat Island as an Example

Authors: Wang Yang

Abstract:

Shenzhen is a modern city shaped by China's reform and opening-up policy, and the development of its urban morphology has been driven by the administration of the Chinese government. The city's planning paradigm is primarily affected by its spatial structure and by human behavior, with the urban agglomeration divided into several groups and centers; under this influence, the intrinsic laws of city development tend to be neglected. With the continuous development of the Internet, big data technology has been introduced in China, and data mining and data analysis have become important tools in municipal research. Data mining has been used to improve data cleaning for sources such as business data, traffic data and population data. Before data mining became available, government data were collected by traditional means and then analyzed in city-relationship research, which delayed the timeliness of urban studies; this is a particular problem for the contemporary city, where Internet-based data are updated very quickly. In this study, the city's points of interest (POI) serve as the data source reflecting city design, while satellite remote sensing is used as a reference object; the city is analyzed in both directions, moving beyond the government's administrative paradigm and restoring a data-driven view of urban research. The use of data mining in urban analysis is therefore very important. Satellite remote sensing data for Shenzhen from July 2018, measured by the MODIS sensor, are used to perform land surface temperature inversion and to analyze the distribution of the urban heat island in Shenzhen. This article acquired and classified data for Shenzhen using data crawler technology. Heat island data and points of interest for Shenzhen were simulated and analyzed on a GIS platform to discover the main features of the distribution of functionally equivalent areas. Shenzhen extends in an east-west direction, and its main streets follow the direction of city development, so the functional areas of the city are also distributed east to west. The urban heat island can be expressed as a heat map over the functional urban areas, and the regional POIs show a clear correspondence: the research results show that the distribution of the urban heat island and the distribution of urban POIs are in one-to-one correspondence. The urban heat island is primarily influenced by the properties of the underlying surface, setting aside the impact of urban climate. Taking urban POIs as the analysis object, the distribution of municipal POIs is closely connected with population aggregation, so that the distribution of the population corresponds with the distribution of the urban heat island.

Keywords: POI, satellite remote sensing, the population distribution, urban heat island thermal map.

6799 An Improved Particle Swarm Optimization Technique for Combined Economic and Environmental Power Dispatch Including Valve Point Loading Effects

Authors: Badr M. Alshammari, T. Guesmi

Abstract:

In recent years, combined economic and emission power dispatch has become one of the main problems in electrical power systems. It aims to schedule the power generation of the generating units in order to minimize the production cost and the emission of harmful gases, such as CO, CO2, NOx, and SO2, caused by fossil-fueled thermal units. To solve this complicated multi-objective problem, an improved version of the particle swarm optimization technique that includes the non-dominated sorting concept is proposed. Valve point loading effects and system losses are considered. The three-unit and ten-unit benchmark systems are used to show the effectiveness of the suggested optimization technique for solving this kind of nonconvex problem. The simulation results are compared with those obtained using a genetic algorithm based method. The comparison shows that the proposed approach can provide a higher quality solution with better performance.
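
The valve point loading effect enters the problem through the rectified-sinusoid term conventionally added to the quadratic fuel cost. The standard textbook form of the two objectives (not necessarily the exact coefficients or notation of this paper) is:

```latex
F_{i}(P_{i}) = a_{i} + b_{i} P_{i} + c_{i} P_{i}^{2}
               + \left| e_{i} \sin\!\left( f_{i} \left( P_{i}^{\min} - P_{i} \right) \right) \right|,
\qquad
E_{i}(P_{i}) = \alpha_{i} + \beta_{i} P_{i} + \gamma_{i} P_{i}^{2}
               + \zeta_{i} \exp\!\left( \lambda_{i} P_{i} \right)
```

The total fuel cost and total emission are the sums of F_i and E_i over all committed units, subject to the power balance (including losses) and the generation limit constraints.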

Keywords: Power dispatch, valve point loading effects, multiobjective optimization, Pareto solutions.

6798 Prototype of a Federative Factory Data Management for the Support of Factory Planning Processes

Authors: Christian Mosch, Reiner Anderl, Antonio Álvaro de Assis Moura, Klaus Schützer

Abstract:

Due to short product life cycles, an increasing variety of products and short cycles of leap innovations, manufacturing companies have to increase the flexibility of their factory structures. Flexibility of factory structures is based on defined factory planning processes in which product, process and resource data from various partial domains have to be considered. Factory planning processes can thus be characterized as iterative, interdisciplinary and participative processes [1]. To support the interdisciplinary and participative character of these planning processes, a federative factory data management (FFDM) is described as a holistic solution. FFDM has already been implemented in the form of a prototype, and the interim results of its development are presented in this paper. The principle is to extract product, process and resource data from the documents of the various partial domains and to provide them as web services on a server. The described data can then be requested by the factory planner using an FFDM browser.

Keywords: BRAGECRIM, Factory Planning Process, Factory Data Management, Web Services.

6797 Multi-Criteria Optimization of High-Temperature Reversed Starter-Generator

Authors: Flur R. Ismagilov, Irek Kh. Khayrullin, Vyacheslav E. Vavilov, Ruslan D. Karimov, Anton S. Gorbunov, Danis R. Farrakhov

Abstract:

The paper presents an alternative structural scheme for a high-temperature starter-generator with an external rotor to be installed on the High Pressure Shaft (HPS) of aircraft engines (AE) in order to implement the More Electrical Engine concept. The basic materials for making this starter-generator (SG) were selected and justified. A multi-criteria optimization of the developed structural scheme was performed using a genetic algorithm and the Pareto method. The Pareto-optimal active length and permanent magnet thickness of the SG were selected as a result of the optimization. Using the dimensions obtained made it possible to reduce the weight of the designed SG by 10 kg relative to a base option at constant thermal loads. A multidisciplinary computer simulation was performed on the basis of the optimum geometric dimensions, which proved the performance efficiency of the design. We further plan to build a full-scale sample of the HPS starter-generator and to publish the results of its experimental research.

Keywords: High-temperature starter-generator, More electrical engine, multi-criteria optimization, permanent magnet.

6796 A Differential Calculus Based Image Steganography with Crossover

Authors: Srilekha Mukherjee, Subha Ash, Goutam Sanyal

Abstract:

Information security plays a major role in raising the standard of secure communication over global media. In this paper, we suggest a technique of encryption followed by insertion before transmission, implementing two different concepts to carry out these tasks. A two-point crossover technique from the genetic algorithm is used to facilitate the encryption process. For each of the uniquely identified rows of pixels, different mathematical methodologies are applied for several condition checks in order to determine all the parent pixels on which the crossover operation is performed. This is done by selecting two crossover points within the pixels, thereby producing the newly encrypted child pixels and hence the encrypted cover image. In the next stage, the first and second order derivative operators are evaluated to increase the security and robustness. The last stage ensures a reapplication of the crossover procedure to form the final stego-image. The complexity of this system as a whole is high, thereby deterring third-party interference. The embedding capacity is also very high, so a larger amount of secret image information can be hidden. The imperceptibility of the obtained stego-image clearly demonstrates the proficiency of this approach.
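
The encryption primitive described above is a two-point crossover between a pair of parent pixels. A minimal bit-level sketch of that single operation is given below; the parent-selection rules, the derivative stages and the re-application step of the paper are omitted, and the example pixel values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def two_point_crossover(parent_a, parent_b):
    """Swap the bit segment between two random crossover points of two 8-bit pixel values."""
    a = np.unpackbits(np.array([parent_a], dtype=np.uint8))
    b = np.unpackbits(np.array([parent_b], dtype=np.uint8))
    p1, p2 = sorted(rng.choice(8, size=2, replace=False))
    a[p1:p2], b[p1:p2] = b[p1:p2].copy(), a[p1:p2].copy()
    return int(np.packbits(a)[0]), int(np.packbits(b)[0])

# two hypothetical parent pixel values from one image row
child1, child2 = two_point_crossover(200, 57)
print(child1, child2)
```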

Keywords: Steganography, Crossover, Differential Calculus, Peak Signal to Noise Ratio, Cross-correlation Coefficient.

6795 Analysis of a WDM System for Tanzania

Authors: Shaban Pazi, Chris Chatwin, Rupert Young, Philip Birch

Abstract:

Internet infrastructures in most parts of the world have been supported by advances in optical fiber technology, most notably wavelength division multiplexing (WDM) systems. Optical technology based on WDM has revolutionized long-distance data transport and has resulted in high data capacity, cost reductions, extremely low bit error rates, and operational simplification of the overall Internet infrastructure. This paper analyses and compares the system impairments that occur at data transmission rates of 2.5 Gb/s and 10 Gb/s per wavelength channel in our proposed optical WDM system for Internet infrastructure in Tanzania. The results show that the 2.5 Gb/s transmission rate has minimum system impairments compared with the 10 Gb/s rate per wavelength channel, and achieves sufficient system performance to provide a good Internet access service.

Keywords: Internet infrastructure, WDM system, standard single mode fibers, system impairments.

6794 Parameter Tuning of Complex Systems Modeled in Agent Based Modeling and Simulation

Authors: Rabia Korkmaz Tan, Şebnem Bora

Abstract:

The major problem encountered when modeling complex systems with agent-based modeling and simulation techniques is the existence of large parameter spaces. A complex system model cannot be expected to reflect the whole of the real system, but by specifying the most appropriate parameters, the actual system can be represented by the model under certain conditions. A review of studies conducted in recent years shows that there are few studies on the parameter tuning problem in agent-based simulations, and that these studies have focused on tuning the parameters of a single model. In this study, a parameter tuning approach is proposed that uses metaheuristic algorithms such as the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Artificial Bee Colony (ABC) and Firefly (FA) algorithms. With this hybrid-structured study, the parameter tuning problems of models from different fields were solved. The new approach was tested on two different models, and its achievements on the different problems were compared. The simulations and the results reveal that the proposed approach performs better than existing parameter tuning studies.

Keywords: Parameter tuning, agent based modeling and simulation, metaheuristic algorithms, complex systems.

6793 Comparison of Data Reduction Algorithms for Image-Based Point Cloud Derived Digital Terrain Models

Authors: M. Uysal, M. Yilmaz, I. Tiryakioğlu

Abstract:

A Digital Terrain Model (DTM) is a digital numerical representation of the Earth's surface. DTMs have been applied to a diverse range of tasks, such as urban planning, military applications, glacier mapping and disaster management. To express the Earth's surface as a mathematical model, an infinite number of point measurements would be needed; since this is impossible, points at regular intervals are measured to characterize the Earth's surface and a DTM of the Earth is generated. Classical measurement techniques and photogrammetry have hitherto been widely used in the construction of DTMs; at present, RADAR, LiDAR, and stereo satellite images are also used. In recent years, especially because of its advantages, Airborne Light Detection and Ranging (LiDAR) has seen increasing use in DTM applications: a 3D point cloud is created with LiDAR technology by obtaining numerous point data. More recently, developments in image mapping methods and the use of unmanned aerial vehicles (UAV) for photogrammetric data acquisition have increased DTM generation from image-based point clouds. The accuracy of a DTM depends on various factors such as the data collection method, the distribution of elevation points, the point density, the properties of the surface and the interpolation method. In this study, a random data reduction method is evaluated for DTMs generated from image-based point cloud data. The original image-based point cloud data set (100%) is reduced to a series of subsets by using a random algorithm, representing 75, 50, 25 and 5% of the original data set. Over the ANS campus of Afyon Kocatepe University as the test area, the DTM constructed from the original image-based point cloud data set is compared with DTMs interpolated from the reduced data sets by the Kriging interpolation method. The results show that the random data reduction method can be used to reduce image-based point cloud data sets to the 50% density level while still maintaining the quality of the DTM.
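
The reduction step itself is a plain random subsampling of the point cloud to the 75, 50, 25 and 5% levels. A minimal sketch, with a placeholder file name and loader rather than the authors' data, is:

```python
import numpy as np

# hypothetical (N, 3) array of X, Y, Z ground points from the image-based point cloud
points = np.loadtxt("ans_campus_points.xyz")

rng = np.random.default_rng(2024)
subsets = {}
for pct in (75, 50, 25, 5):
    k = int(points.shape[0] * pct / 100)
    idx = rng.choice(points.shape[0], size=k, replace=False)  # sample without replacement
    subsets[pct] = points[idx]
```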

Keywords: DTM, unmanned aerial vehicle, UAV, random, Kriging.

6792 Frequent Itemset Mining Using Rough-Sets

Authors: Usman Qamar, Younus Javed

Abstract:

Frequent pattern mining is the process of finding a pattern (a set of items, subsequences, substructures, etc.) that occurs frequently in a data set. It was proposed in the context of frequent itemsets and association rule mining. Frequent pattern mining is used to find inherent regularities in data, for example which products are often purchased together. Its applications include basket data analysis, cross-marketing, catalog design, sale campaign analysis, Web log (click stream) analysis, and DNA sequence analysis. However, one of the bottlenecks of frequent itemset mining is that, as the data grow, the amount of time and resources required to mine them increases at an exponential rate. In this investigation, a new algorithm is proposed that can be used as a pre-processor for frequent itemset mining. FASTER (FeAture SelecTion using Entropy and Rough sets) is a hybrid pre-processor algorithm that utilizes entropy and rough sets to carry out record reduction and feature (attribute) selection, respectively. As a pre-processor for frequent itemset mining, FASTER can produce a speed-up of 3.1 times compared to the original algorithm while maintaining an accuracy of 71%.
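
As an illustration of the entropy half of the pre-processor, the sketch below ranks the item columns of a tiny Boolean basket matrix by Shannon entropy; the rough-set reduct and record-reduction stages of FASTER are not reproduced, and the data are hypothetical.

```python
import numpy as np
from collections import Counter

def entropy(column):
    """Shannon entropy of a categorical attribute (here, one item column)."""
    counts = np.array(list(Counter(column).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# tiny hypothetical basket matrix: rows = transactions, columns = items
baskets = np.array([[1, 0, 1, 1],
                    [1, 1, 0, 1],
                    [0, 1, 1, 1],
                    [1, 0, 1, 0]])

# rank item columns by entropy (higher entropy = more informative attribute)
ranking = sorted(range(baskets.shape[1]), key=lambda j: entropy(baskets[:, j]), reverse=True)
print("columns ranked by entropy:", ranking)
```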

Keywords: Rough-sets, Classification, Feature Selection, Entropy, Outliers, Frequent itemset mining.

6791 Primer Design with Specific PCR Product using Particle Swarm Optimization

Authors: Cheng-Hong Yang, Yu-Huei Cheng, Hsueh-Wei Chang, Li-Yeh Chuang

Abstract:

Before performing polymerase chain reactions (PCR), a feasible primer set is required, and many primer design methods have been proposed for designing such sets. However, the majority of these methods require a relatively long time to obtain an optimal solution, since large quantities of template DNA need to be analyzed, and the designed primer sets usually do not provide a specific PCR product. In recent years, evolutionary computation has been applied to PCR primer design and has yielded promising results. In this paper, a particle swarm optimization (PSO) algorithm is proposed to solve primer design problems associated with providing a specific product for PCR experiments. A test set of the gene CYP1A1, associated with a heightened lung cancer risk, was analyzed, and accuracy and running time were compared with the genetic algorithm (GA) and memetic algorithm (MA). The comparison of results indicates that the proposed PSO method for primer design finds optimal or near-optimal primer sets and effective PCR products in a relatively short time.
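
Primer design fitness functions typically combine constraints such as GC content and melting temperature. A minimal sketch of two such terms, using the simple Wallace rule for Tm, is given below; the PSO wrapper, the product-specificity check and the CYP1A1 template handling are omitted, and the example sequence is hypothetical.

```python
def primer_properties(primer):
    """Two common feasibility terms: GC fraction and Wallace-rule melting temperature."""
    primer = primer.upper()
    gc_count = primer.count("G") + primer.count("C")
    at_count = primer.count("A") + primer.count("T")
    gc_fraction = gc_count / len(primer)
    tm = 4 * gc_count + 2 * at_count      # Wallace rule: Tm = 4(G+C) + 2(A+T)
    return gc_fraction, tm

gc, tm = primer_properties("ATGCGTACGTTAGCCT")
print(f"GC = {gc:.2f}, Tm = {tm} degrees C")
```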

Keywords: polymerase chain reaction (PCR), primer design, evolutionary computation, particle swarm optimization (PSO).

6790 Analysis of Cross-Sectional and Retrograde Data on the Prevalence of Marginal Gingivitis

Authors: Ilma Robo, Saimir Heta, Nedja Hysi, Vera Ostreni

Abstract:

Introduction: Marginal gingivitis is a disease with considerable frequency among patients who present routinely for periodontal control and treatment. In fact, this disease may not produce alarming symptoms and may go unnoticed by patients themselves when personal hygiene conditions are optimal. The aim of this study was to collect retrograde data on the prevalence of marginal gingivitis in the respective groups of patients, evaluated with specific periodontal diagnostic tools. Materials and methods: The study was conducted in two patient groups. The first group comprised 34 patients examined during December 2019-January 2020, and the second group comprised 64 patients examined during 2010-2018 (in the same monthly period of each year). The bacterial plaque index, hemorrhage index, amount of gingival fluid, and the presence of xerostomia and candidiasis were recorded for each patient. Results: Analysis of the collected data showed that susceptibility to marginal gingivitis is higher according to the retrograde data than to the cross-sectional data. Susceptibility to candidiasis and the occurrence of xerostomia, even in combination, as risk factors for the occurrence of marginal gingivitis, also show higher values according to the retrograde data. Females presented with a lower bacterial plaque index than males and, more importantly, this index in females was also associated with a lower gingival hemorrhage index, in contrast to males. Conclusions: The cross-sectional data show a lower prevalence of marginal gingivitis than the retrograde data, based on the hemorrhage index and the bacterial plaque index together. Changes in the amount of gingival fluid produced indicate a higher prevalence of marginal gingivitis in the cross-sectional data than in the retrograde data; this is attributed to the increasing sophistication of the way the data are recorded, which evolves over time, and to growing professional sensitivity to this phenomenon.

Keywords: Marginal gingivitis, cross-sectional, retrograde, prevalence.

6789 Growing Self Organising Map Based Exploratory Analysis of Text Data

Authors: Sumith Matharage, Damminda Alahakoon

Abstract:

Textual data play an important role in the modern world. The possibilities for applying data mining techniques to uncover hidden information present in large volumes of text collections are immense. The Growing Self Organizing Map (GSOM) is a highly successful member of the Self Organising Map family and has been used as a clustering and visualisation tool across a wide range of disciplines to discover hidden patterns present in data. A comprehensive analysis of the GSOM’s capabilities as a text clustering and visualisation tool has so far not been published. These functionalities, namely its map visualisation capabilities, automatic cluster identification and hierarchical clustering capabilities, are presented in this paper and are further demonstrated with experiments on a benchmark text corpus.

Keywords: Text Clustering, Growing Self Organizing Map, Automatic Cluster Identification, Hierarchical Clustering.

6788 Impact of Fixation Time on Subjective Video Quality Metric: a New Proposal for Lossy Compression Impairment Assessment

Authors: M. G. Albanesi, R. Amadeo

Abstract:

In this paper, a new approach to quality assessment of lossy compressed digital video is proposed. The research is based on visual fixation data recorded by an eye tracker. The method involves both a new paradigm for subjective quality evaluation and a subsequent statistical analysis that matches the subjective scores provided by the observers to the data obtained from the eye tracker experiments. The study brings improvements to the state of the art, as it solves some problems highlighted in the literature. The experiments prove that data obtained from an eye tracker can be used to classify videos according to the level of impairment due to compression. The paper presents the methodology, the experimental results and their interpretation. The conclusions suggest that the eye tracker can be useful in quality assessment if the data are collected and analyzed in a proper way.

Keywords: Eye tracker, video compression, video quality assessment, visual attention.
