Search results for: machining accuracy.
523 Intelligent Earthquake Prediction System Based On Neural Network
Authors: Emad Amar, Tawfik Khattab, Fatma Zada
Abstract:
Earthquake prediction is an important problem in the geosciences. Accurate prediction allows people to take effective measures to minimize personal and economic losses, such as large numbers of casualties, destroyed buildings and disrupted traffic, which can occur within a few seconds. The United States Geological Survey (USGS) provides reliable scientific information about earthquakes recorded throughout history, and the preliminary database from the National Earthquake Information Center (NEIC) reveals several useful factors for predicting earthquakes in a seismic area such as the Aleutian Arc in the U.S. state of Alaska. The main advantage of this prediction method is that it does not require any assumptions; it makes predictions according to the future evolution of the object's time series. The article compares the simulation results of trained BP and RBF neural networks against the actual outputs computed from the system, and therefore focuses on the analysis of data relating to real earthquakes. Evaluation results show better accuracy and higher speed when using the radial basis function (RBF) neural network.
Keywords: BP neural network, Prediction, RBF neural network.
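The abstract does not give the network architectures, so the following is only a minimal, hypothetical sketch of how an RBF network (Gaussian centers plus a linear least-squares output layer) can be fitted to a scalar time series for one-step-ahead prediction using NumPy. The window length, number of centers and width are illustrative assumptions, not values from the paper.

```python
import numpy as np

def make_windows(series, lag=4):
    """Turn a 1-D series into (X, y) pairs for one-step-ahead prediction."""
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X, y

def train_rbf(X, y, n_centers=10, sigma=1.0, seed=0):
    """Fit a Gaussian RBF network: random centers, linear output weights."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_centers, replace=False)]
    # Hidden-layer activations: Gaussian of the distance to each center.
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    H = np.exp(-d**2 / (2 * sigma**2))
    # Output weights by linear least squares.
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return centers, sigma, w

def predict_rbf(model, X):
    centers, sigma, w = model
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-d**2 / (2 * sigma**2)) @ w

if __name__ == "__main__":
    # Synthetic stand-in for a seismic time series (not USGS/NEIC data).
    t = np.linspace(0, 20, 400)
    series = np.sin(t) + 0.05 * np.random.default_rng(1).standard_normal(t.size)
    X, y = make_windows(series)
    model = train_rbf(X, y, n_centers=20, sigma=0.5)
    rmse = np.sqrt(np.mean((predict_rbf(model, X) - y) ** 2))
    print(f"training RMSE: {rmse:.4f}")
```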
522 Pallet Tracking and Cost Optimization of the Flow of Goods in Logistics Operations by Serial Shipping Container Code
Authors: Dominika Crnjac Milic, Martina Martinovic, Vladimir Simovic
Abstract:
The case study method in this paper shows the implementation of Information Technology (IT) and the Serial Shipping Container Code (SSCC) in a Croatian company that deals with logistics operations and provides logistics services in the cold chain segment. This company is aware of the sensitivity of the goods entrusted to it by the users of its services, as well as of the importance of speed and accuracy in providing logistics services. To that end, it has implemented and used the latest IT to ensure a consistently high standard of logistics services for its customers. Looking for efficiency and optimization of supply chain management, while maintaining a high level of quality of the products that are sold, today's users of outsourced logistics services are open to the implementation of new IT products that ultimately deliver savings. By analysing the positive results and the difficulties that arise when using this technology, we aim to provide an insight into the potential of this approach for logistics service providers.
Keywords: Logistics operations, serial shipping container code, SSCC, information technology, cost optimization.
521 Hippocampus Segmentation using a Local Prior Model on its Boundary
Authors: Dimitrios Zarpalas, Anastasios Zafeiropoulos, Petros Daras, Nicos Maglaveras
Abstract:
Segmentation techniques based on Active Contour Models have benefited strongly from the use of prior information during their evolution. Shape prior information is captured from a training set and is introduced into the optimization procedure to restrict the evolution to allowable shapes. In this way, the evolution converges onto regions even with weak boundaries. Although significant effort has been devoted to different ways of capturing and analyzing prior information, very little thought has been given to the way image information is combined with prior information. This paper focuses on a more natural way of incorporating prior information into the level set framework. As a proof of concept, the method is applied to hippocampus segmentation in T1-MR images. Hippocampus segmentation is a very challenging task, due to the multivariate surrounding region and the missing boundary with the neighboring amygdala, whose intensities are identical. The proposed method mimics the way humans perform this segmentation and thus shows enhancements in segmentation accuracy.
Keywords: Medical imaging and processing, brain MRI segmentation, hippocampus segmentation, hippocampus-amygdala missing boundary, weak boundary segmentation, region based segmentation, prior information, local weighting scheme in level sets, spatial distribution of labels, gradient distribution on boundary.
520 On the Efficient Implementation of a Serial and Parallel Decomposition Algorithm for Fast Support Vector Machine Training Including a Multi-Parameter Kernel
Authors: Tatjana Eitrich, Bruno Lang
Abstract:
This work deals with aspects of support vector machine learning for large-scale data mining tasks. Based on a decomposition algorithm for support vector machine training that can be run in serial as well as in shared-memory parallel mode, we introduce a transformation of the training data that allows the use of an expensive generalized kernel without additional cost. We present experiments for the Gaussian kernel, but other kernel functions can be used as well. In order to further speed up the decomposition algorithm, we analyze the critical problem of working set selection for large training data sets. In addition, we analyze the influence of the working set sizes on the scalability of the parallel decomposition scheme. Our tests and conclusions led to several modifications of the algorithm and to improved overall support vector machine learning performance. Our method allows extensive parameter search methods to be used to optimize classification accuracy.
Keywords: Support Vector Machine Training, Multi-Parameter Kernels, Shared Memory Parallel Computing, Large Data.
519 Reliability Factors Based Fuzzy Logic Scheme for Spectrum Sensing
Authors: Tallataf Rasheed, Adnan Rashdi, Ahmad Naeem Akhtar
Abstract:
Accurate spectrum sensing is a fundamental requirement of dynamic spectrum access for the deployment of a Cognitive Radio Network (CRN). To achieve this requirement, a Reliability Factors based Fuzzy Logic (RFL) scheme for spectrum sensing is proposed in this paper. A Cognitive Radio User (CRU) predicts the presence or absence of the Primary User (PU) using an energy detector and calculates the reliability factors, which are the SNR of the sensing node, the threshold of the energy detector, and the decision difference of each node with respect to the other nodes in a cooperative spectrum sensing environment. The decision of the energy detector is then combined with the reliability factors of the sensing node using fuzzy logic. The reliability factors used in the RFL scheme describe the reliability of the decision made by a CRU, improving local spectrum sensing, and the fuzzy combining scheme improves the accuracy of the decision made by the sensor node. Simulation results show that the proposed technique provides a better PU detection probability than existing spectrum sensing techniques.
Keywords: Cognitive radio, spectrum sensing, energy detector, reliability factors, fuzzy logic.
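As an illustration only (the paper's actual membership functions and rule base are not given in the abstract), the sketch below combines an energy-detector decision with a reliability weight derived from node SNR using simple triangular fuzzy memberships; all thresholds, fuzzy sets and scores are hypothetical.

```python
import numpy as np

def energy_detect(samples, threshold):
    """Classical energy detector: compare average signal energy to a threshold."""
    energy = np.mean(np.abs(samples) ** 2)
    return energy, energy > threshold

def tri(x, a, b, c):
    """Triangular fuzzy membership on [a, c] with peak at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def fuzzy_reliability(snr_db):
    """Map node SNR (dB) to a reliability weight in [0, 1] (hypothetical sets)."""
    low = tri(snr_db, -10.0, -5.0, 0.0)
    med = tri(snr_db, -5.0, 2.5, 10.0)
    high = tri(snr_db, 5.0, 15.0, 25.0)
    # Weighted-average (centroid-like) defuzzification with scores 0.2/0.5/0.9.
    return (0.2 * low + 0.5 * med + 0.9 * high) / (low + med + high + 1e-12)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    snr_db = 3.0
    noise = rng.standard_normal(1000)
    signal = 10 ** (snr_db / 20) * np.sin(2 * np.pi * 0.1 * np.arange(1000))
    energy, hard_decision = energy_detect(signal + noise, threshold=1.5)
    weight = fuzzy_reliability(snr_db)
    # Soft decision: the hard detector output scaled by the node's reliability.
    soft = weight * float(hard_decision)
    print(f"energy={energy:.2f}, hard={hard_decision}, reliability={weight:.2f}, soft={soft:.2f}")
```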
518 Impact of Stack Caches: Locality Awareness and Cost Effectiveness
Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang
Abstract:
Treating data based on its location in memory has received much attention in recent years due to its different properties, which offer important opportunities for cache utilization. Stack data and non-stack data may interfere with each other's locality in the data cache. One of the important properties of stack data is its high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into stack and non-stack caches in order to keep stack data and non-stack data separate. We observe that the overall hit rate of the non-unified cache design is sensitive to the size of the non-stack cache. We then investigate the appropriate size and associativity for the stack cache to achieve a high hit ratio, especially since over 99% of accesses are directed to the stack cache. The results show that, on average, a stack cache hit rate of more than 99% is achieved using 2 KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate when adding a small, fixed-size stack cache at level 1 to a unified cache architecture. The results show that the overall hit rate of the unified cache design with an added 1 KB stack cache improves by approximately 3.9% on average for the Rijndael benchmark. The stack cache is simulated using the SimpleScalar toolset.
Keywords: Hit rate, Locality of program, Stack cache, and Stack data.
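The cache parameters below (2 KB, 1-way/direct-mapped, 32-byte lines) mirror the configuration mentioned in the abstract, but the access trace and the simulator itself are a simplified toy sketch, not the SimpleScalar model used in the paper.

```python
# Toy hit-rate simulator for a small direct-mapped (1-way) stack cache.
def simulate_direct_mapped(addresses, cache_bytes=2048, line_bytes=32):
    n_lines = cache_bytes // line_bytes
    tags = [None] * n_lines          # one tag per cache line (1-way associative)
    hits = 0
    for addr in addresses:
        block = addr // line_bytes
        index = block % n_lines
        tag = block // n_lines
        if tags[index] == tag:
            hits += 1
        else:
            tags[index] = tag        # miss: fill the line
    return hits / len(addresses)

if __name__ == "__main__":
    # Synthetic "stack-like" trace: tight push/pop activity near the stack top.
    import random
    random.seed(0)
    sp = 0x7FFFF000
    trace = []
    for _ in range(50_000):
        sp += random.choice((-8, -4, 0, 4, 8))          # small stack movements
        trace.append(sp + random.randrange(0, 64, 4))   # accesses near the top
    print(f"hit rate: {simulate_direct_mapped(trace):.4%}")
```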
517 Empirical Modeling of Air Dried Rubberwood Drying System
Authors: S. Khamtree, T. Ratanawilai, C. Nuntadusit
Abstract:
Rubberwood is a crucial commercial timber in Southern Thailand. All processes in rubberwood production depend on the knowledge and expertise of the technicians, especially the drying process. This research aims to develop an empirical model for the drying kinetics of rubberwood. During the experiment, the temperature of the hot air was kept at 80-100 °C and the average air flow velocity at 1.75 m/s. Drying was considered complete when the moisture content of the samples fell below 12% (dry basis). The drying kinetics were simulated using an empirical solver. The experimental results showed that the moisture content decreased as the drying temperature and time increased. The agreement of the moisture ratio between the empirical and experimental models was tested with three statistical parameters, R-square (R²), Root Mean Square Error (RMSE) and Chi-square (χ²), to assess the accuracy of the parameters. The experimental moisture ratio fitted the empirical model well. The results indicated that the Henderson and Pabis model described the drying of rubberwood with a suitable level of agreement, giving an excellent estimation of the moisture movement (R² = 0.9963) compared to the other models. Therefore, the empirical results are valid and can be used in future experiments.
Keywords: Empirical models, hot air, moisture ratio, rubberwood.
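The Henderson and Pabis thin-layer model referred to in the abstract has the standard form MR = a·exp(-k·t). A minimal curve-fitting sketch with the three goodness-of-fit statistics mentioned (R², RMSE, χ²) might look as follows; the drying data here are synthetic placeholders, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def henderson_pabis(t, a, k):
    """Thin-layer drying model: moisture ratio MR = a * exp(-k * t)."""
    return a * np.exp(-k * t)

# Synthetic drying data (time in hours, dimensionless moisture ratio).
t = np.array([0, 2, 4, 6, 8, 10, 12, 14, 16])
mr = np.array([1.00, 0.78, 0.62, 0.48, 0.39, 0.30, 0.24, 0.19, 0.15])

(a, k), _ = curve_fit(henderson_pabis, t, mr, p0=(1.0, 0.1))
pred = henderson_pabis(t, a, k)

ss_res = np.sum((mr - pred) ** 2)
ss_tot = np.sum((mr - mr.mean()) ** 2)
r2 = 1 - ss_res / ss_tot                      # coefficient of determination
rmse = np.sqrt(np.mean((mr - pred) ** 2))     # root mean square error
chi2 = ss_res / (len(mr) - 2)                 # reduced chi-square (2 fitted parameters)

print(f"a={a:.3f}, k={k:.3f}, R2={r2:.4f}, RMSE={rmse:.4f}, chi2={chi2:.5f}")
```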
516 NSGA Based Optimal Volt / Var Control in Distribution System with Dispersed Generation
Authors: P. N. Hrisheekesha, Jaydev Sharma
Abstract:
In this paper, a method based on the Non-Dominated Sorting Genetic Algorithm (NSGA) is presented for volt/var control in power distribution systems with dispersed generation (DG). A genetic algorithm approach is used due to its broad applicability, ease of use and high accuracy, and it is well suited to volt/var control problems. A multi-objective optimization problem is formulated for the volt/var control of the distribution system. The non-dominated sorting genetic algorithm based method proposed in this paper alleviates the problem of tuning the weighting factors required when solving multi-objective volt/var control optimization problems. Based on simulation studies carried out on the distribution system, the proposed scheme was found to be simple, accurate and easy to apply to the multi-objective volt/var control optimization problem of a distribution system with dispersed generation. A sketch of the sorting step is given below.
Keywords: Dispersed generation, distribution system, Non-Dominated Sorting Genetic Algorithm, voltage/reactive power control.
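NSGA ranks candidate solutions by Pareto dominance. The abstract does not detail the implementation, so the following is only a generic sketch of non-dominated sorting for a minimization problem, applied to hypothetical objective vectors such as (power loss, voltage deviation).

```python
def dominates(p, q):
    """True if objective vector p Pareto-dominates q (minimization)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def non_dominated_sort(objectives):
    """Return a list of fronts (lists of indices), front 0 being the Pareto front."""
    n = len(objectives)
    dominated_by = [[] for _ in range(n)]   # solutions each i dominates
    counts = [0] * n                        # how many solutions dominate i
    for i in range(n):
        for j in range(n):
            if i != j and dominates(objectives[i], objectives[j]):
                dominated_by[i].append(j)
            elif i != j and dominates(objectives[j], objectives[i]):
                counts[i] += 1
    fronts = [[i for i in range(n) if counts[i] == 0]]
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in dominated_by[i]:
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

if __name__ == "__main__":
    # Hypothetical (loss_kW, voltage_deviation) pairs for five candidate settings.
    objs = [(3.2, 0.04), (2.8, 0.06), (3.0, 0.05), (3.5, 0.03), (2.9, 0.07)]
    print(non_dominated_sort(objs))
```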
515 Model based Soft-Sensor for Industrial Crystallization: On-line Mass of Crystals and Solubility Measurement
Authors: Cédric Damour, Michel Benne, Brigitte Grondin-Perez, Jean-Pierre Chabriat
Abstract:
Monitoring and control of cane sugar crystallization processes depend on the stability of the supersaturation (σ) state. The most widely used information to represent σ is the electrical conductivity κ of the solution. Nevertheless, previous studies point out the shortcomings of this approach: κ may be regarded as inappropriate for guaranteeing an accurate estimation of σ in impure solutions. To improve process control efficiency, additional information is necessary. The mass of crystals in the solution (m_c) and the solubility (the mass ratio of sugar to water, m_s/m_w) provide the relevant complementary information. Indeed, m_c inherently contains information about the mass balance, and m_s/m_w contains information about the supersaturation state of the solution. The main problem is that m_c and m_s/m_w are not available on-line. In this paper, a model based soft-sensor is presented for a final crystallization stage (C sugar). Simulation results obtained on industrial data show the reliability of this approach, m_c and the crystal content (cc) being estimated with sufficient accuracy for on-line monitoring in industry.
Keywords: Soft-sensor, on-line monitoring, cane sugar crystallization.
514 State Estimation Solution with Optimal Allocation of Phasor Measurement Units Considering Zero Injection Bus Modeling
Authors: M. Ravindra, R. Srinivasa Rao, V. Shanmukha Naga Raju
Abstract:
This paper presents state estimation with Phasor Measurement Unit (PMU) allocation to obtain complete observability of the network. A matrix is designed with modeling of zero injection constraints to minimize the number of PMUs allocated. A state estimation algorithm is developed with optimal allocation of PMUs to find accurate states of the network. The incorporation of PMUs into the traditional state estimation process improves accuracy and computational performance for large power systems. The nonlinearity introduced by the zero injection (ZI) constraints is remodeled into a linear frame to optimize the number of PMUs. The optimal PMU allocation problem is considered with modeling of ZI constraints, PMU loss or line outage, cost factor and redundant measurements. The proposed state estimation with optimal PMU allocation is compared with the traditional state estimation process to show its importance. MATLAB implementations on the IEEE 14, 30, 57, and 118 bus networks are carried out using the Binary Integer Programming (BIP) method and compared with other methods to show its effectiveness.
Keywords: Observability, phasor measurement units, synchrophasors, SCADA measurements, zero injection bus.
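The abstract casts PMU placement as a binary integer program: minimize the number of PMUs subject to every bus being observed by a PMU at itself or at a neighboring bus (zero-injection constraints then refine this). A brute-force sketch of the basic covering formulation on a small hypothetical 7-bus graph is shown below; the IEEE test systems and the ZI modeling of the paper are not reproduced.

```python
import itertools
import numpy as np

def min_pmu_placement(adjacency):
    """Smallest PMU set such that A @ x >= 1 for every bus (A = adjacency + I)."""
    n = len(adjacency)
    A = np.array(adjacency) + np.eye(n, dtype=int)
    for k in range(1, n + 1):                      # try increasing set sizes
        for combo in itertools.combinations(range(n), k):
            x = np.zeros(n, dtype=int)
            x[list(combo)] = 1
            if np.all(A @ x >= 1):                 # every bus observed
                return list(combo)
    return list(range(n))

if __name__ == "__main__":
    # Hypothetical 7-bus network (1 = branch between buses).
    adj = [[0, 1, 0, 0, 1, 0, 0],
           [1, 0, 1, 0, 0, 0, 0],
           [0, 1, 0, 1, 0, 0, 1],
           [0, 0, 1, 0, 1, 0, 0],
           [1, 0, 0, 1, 0, 1, 0],
           [0, 0, 0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0, 0, 0]]
    print("PMU buses:", min_pmu_placement(adj))
```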
513 Rapid Method for Low Level 90Sr Determination in Seawater by Liquid Extraction Technique
Authors: S. Visetpotjanakit, N. Nakkaew
Abstract:
Determination of low level 90Sr in seawater has been widely developed for the purpose of environmental monitoring and radiological research, because 90Sr is one of the most hazardous radionuclides released into the atmosphere during the testing of nuclear weapons, from waste discharged by nuclear energy generation, and from nuclear accidents occurring at power plants. A liquid extraction technique using bis(2-ethylhexyl) phosphoric acid to separate and purify yttrium, followed by Cherenkov counting with a liquid scintillation counter to determine 90Y in secular equilibrium with 90Sr, was developed to monitor 90Sr in the Asia Pacific Ocean. The analytical performance was validated against accuracy, precision, and trueness criteria. 90Sr determination was performed on 30-liter seawater samples spiked at various low concentrations in the range 0.01–1 Bq/L and on a 0.5-liter IAEA-RML-2015-01 proficiency test sample for statistical evaluation. The results had a relative bias in the range of 3.41% to 12.28%, which is below the accepted relative bias of ±25% and passes the criteria, confirming that our analytical approach for the determination of low levels of 90Sr in seawater is acceptable. Moreover, the approach is economical, non-laborious and fast.
Keywords: Proficiency test, radiation monitoring, seawater, strontium determination.
512 An Information Theoretic Approach to Rescoring Peptides Produced by De Novo Peptide Sequencing
Authors: John R. Rose, James P. Cleveland, Alvin Fox
Abstract:
Tandem mass spectrometry (MS/MS) is the engine driving high-throughput protein identification. Protein mixtures, possibly representing thousands of proteins from multiple species, are treated with proteolytic enzymes that cut the proteins into smaller peptides, which are then analyzed to generate MS/MS spectra. The task of determining the identity of a peptide from its spectrum is currently the weak point in the process. Current approaches to de novo sequencing are able to compute candidate peptides efficiently; the problem lies in the limitations of current scoring functions. In this paper we introduce the concept of a proteome signature. By examining proteins and compiling proteome signatures (amino acid usage) it is possible to characterize likely combinations of amino acids and better distinguish between candidate peptides. Our results strongly support the hypothesis that a scoring function that considers amino acid usage patterns is better able to distinguish between candidate peptides, which in turn leads to higher accuracy in peptide prediction.
Keywords: Tandem mass spectrometry, proteomics, scoring, peptide, de novo, mutual information.
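To make the idea of a proteome signature concrete: amino acid usage frequencies compiled from a reference proteome can be turned into a log-probability score that favors candidate peptides whose composition matches the organism. The sketch below is a simplified illustration with made-up sequences, not the paper's mutual-information rescoring function.

```python
import math
from collections import Counter

def build_signature(proteome_sequences):
    """Amino acid usage frequencies compiled from a set of protein sequences."""
    counts = Counter()
    for seq in proteome_sequences:
        counts.update(seq)
    total = sum(counts.values())
    return {aa: c / total for aa, c in counts.items()}

def usage_score(peptide, signature, floor=1e-4):
    """Average log-probability of the peptide's residues under the signature."""
    return sum(math.log(signature.get(aa, floor)) for aa in peptide) / len(peptide)

if __name__ == "__main__":
    # Toy "proteome" and two candidate peptides with different compositions.
    proteome = ["MKTAYIAKQR", "GAVLKKAEEA", "MLSTAAGGKK"]
    sig = build_signature(proteome)
    for candidate in ("AAKGLK", "WWCHHF"):
        print(candidate, f"{usage_score(candidate, sig):.3f}")
```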
511 Auditing of Building Information Modeling Application in Decoration Engineering Projects in China
Authors: Lan Luo
Abstract:
In China’s construction industry, it is a normal practice to separately subcontract the decoration engineering part from construction engineering, and Building Information Modeling (BIM) is also done separately. Application of BIM in decoration engineering should be integrated with other disciplines, but Chinese current practice makes this very difficult and complicated. Currently, there are three barriers in the auditing of BIM application in decoration engineering in China: heavy workload; scarcity of qualified professionals; and lack of literature concerning audit contents, standards, and methods. Therefore, it is significant to perform research on what (contents) should be evaluated, in which phase, and by whom (professional qualifications) in BIM application in decoration construction so that the application of BIM can be promoted in a better manner. Based on this consideration, four principles of BIM auditing are proposed: Comprehensiveness of information, accuracy of data, aesthetic attractiveness of appearance, and scheme optimization. In the model audit, three methods should be used: Collision, observation, and contrast. In addition, BIM auditing at six stages is discussed and a checklist for work items and results to be submitted is proposed. This checklist can be used for reference by decoration project participants.
Keywords: Audit, evaluation, dimensions, methods, standards, building information modeling application, decoration engineering projects.
510 Splitting Modified Donor-Cell Schemes for Spectral Action Balance Equation
Authors: Tanapat Brikshavana, Anirut Luadsong
Abstract:
The spectral action balance equation is used to simulate short-crested wind-generated waves in shallow water areas such as coastal regions and inland waters. This equation involves two spatial dimensions, wave direction, and wave frequency, and can be solved by the finite difference method. When this equation, with its dominating propagation velocity terms, is discretized using central differences, stability problems occur if the grid spacing is chosen too coarse. In this paper, we introduce the splitting modified donor-cell scheme to avoid stability problems and prove that it is consistent with the modified donor-cell scheme and has the same accuracy. The splitting modified donor-cell scheme is adopted to split the wave spectral action balance equation into four one-dimensional problems, each of which yields an independent tridiagonal linear system. These smaller systems can be solved by direct or iterative methods at the same time, which is very fast when performed on a multi-core computer.
Keywords: Donor-cell scheme, parallel algorithm, spectral action balance equation, splitting method.
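Each one-dimensional sub-problem produced by the splitting leads to a tridiagonal linear system, which can be solved in O(n) with the Thomas algorithm. A generic sketch is given below; the actual coefficients arising from the donor-cell discretization are not reproduced here.

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal, d = rhs."""
    n = len(b)
    cp = np.empty(n); dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

if __name__ == "__main__":
    n = 6
    a = np.full(n, -1.0); a[0] = 0.0            # sub-diagonal (a[0] unused)
    c = np.full(n, -1.0); c[-1] = 0.0           # super-diagonal (c[-1] unused)
    b = np.full(n, 2.0)                         # main diagonal
    d = np.ones(n)
    x = thomas(a, b, c, d)
    # Verify against the dense solve.
    A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
    print(x, np.allclose(A @ x, d))
```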
509 Software Reliability Prediction Model Analysis
Authors: L. Mirtskhulava, M. Khunjgurua, N. Lomineishvili, K. Bakuria
Abstract:
Software reliability prediction provides a great opportunity to measure the software failure rate at any point throughout system test. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software, and various approaches can be used to improve its reliability. In this article we focus on a software reliability model, assuming that there is time redundancy, the value of which (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that not only irreversible failures may occur in the system, but also failures that can be regarded as self-repairing and that significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the distribution function (DF) of the transmission time of an instruction sequence consisting of a random number of basic blocks. We consider the system software unreliable, with the time between adjacent failures following an exponential distribution.
Keywords: Exponential distribution, conditional mean time to failure, distribution function, mathematical model, software reliability.
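For reference, the exponential failure-time assumption used in the abstract corresponds to the standard relations below, where λ is the failure rate; these are textbook formulas, not results derived in the paper.

```latex
% Exponential time-between-failures model with failure rate \lambda
F(t) = \Pr(T \le t) = 1 - e^{-\lambda t}, \qquad
R(t) = 1 - F(t) = e^{-\lambda t}, \qquad
\mathrm{MTTF} = \int_0^{\infty} R(t)\,dt = \frac{1}{\lambda}.
```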
508 Portable Virtual Piano Design
Authors: Yu-Xiang Zhao, Chien-Hsing Chou, Mu-Chun Su, Yi-Zeng Hsieh
Abstract:
The purpose of this study is to design a portable virtual piano. By utilizing optical fiber gloves and the virtual piano software designed in this study, the user can play the piano anywhere at any time. This virtual piano consists of three major parts: finger tapping identification, hand movement and positioning identification, and MIDI software sound effect simulation. To play the virtual piano, the user wears optical fiber gloves and simulates piano key tapping motions. The finger bending information detected by the optical fiber gloves indicates when piano key tapping motions are made. Images captured by a video camera are analyzed, hand locations and moving directions are determined, and the corresponding scales are found. The system integrates finger tapping identification with information about hand placement relative to the corresponding piano key positions, and generates MIDI piano sound effects based on this data. The experiment shows that the proposed method achieves an accuracy rate of 95% for determining when a piano key is tapped.
Keywords: Virtual piano, portable, identification, optical fiber gloves.
507 Simulation Model for Predicting Dengue Fever Outbreak
Authors: Azmi Ibrahim, Nor Azan Mat Zin, Noraidah Sahari Ashaari
Abstract:
Dengue fever is prevalent in Malaysia, with numerous cases, including fatalities, recorded over the years. Public education on the prevention of the disease through various means has been carried out, alongside the enforcement of legal measures to eradicate the breeding grounds of the Aedes mosquito, the dengue vector. Hence, other means need to be explored, such as predicting the seasonal peak period of a dengue outbreak and identifying related climate factors contributing to the increase in the number of mosquitoes. A simulation model can be employed for this purpose. In this study, we created a system dynamics simulation to predict the spread of a dengue outbreak in Hulu Langat, Selangor, Malaysia. The prototype was developed using STELLA 9.1.2 software. The main data inputs are rainfall, temperature and dengue cases. Data analysis from the graph showed that dengue cases can be predicted accurately using these two main variables, rainfall and temperature. However, the model will be further tested over a longer time period to ensure its accuracy, reliability and efficiency as a prediction tool for dengue outbreaks.
Keywords: Dengue fever, prediction, system dynamics, simulation.
506 Quality Estimation of Video Transmitted over an Additive WGN Channel Based on Digital Watermarking and Wavelet Transform
Authors: Mohamed S. El-Mahallawy, Attalah Hashad, Hazem Hassan Ali, Heba Sami Zaky
Abstract:
This paper presents an evaluation of a wavelet-based digital watermarking technique used to estimate the quality of video sequences transmitted over an Additive White Gaussian Noise (AWGN) channel in terms of a classical objective metric, the Peak Signal-to-Noise Ratio (PSNR), without the need for the original video. In this method, a watermark is embedded into the Discrete Wavelet Transform (DWT) domain of the original video frames using a quantization method. The degradation of the extracted watermark can be used to estimate the video quality in terms of PSNR with good accuracy. We calculated the PSNR for video frames contaminated with AWGN and compared the values with those estimated using the watermarking-DWT based approach. The calculated and estimated quality measures of the video frames are highly correlated, suggesting that this method can provide a good quality measure for video frames transmitted over an AWGN channel without the need for the original video.
Keywords: AWGN, DWT, PSNR, watermarking, video quality.
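For clarity, the reference metric the estimate is compared against is computed as follows; this is the standard PSNR definition for 8-bit frames, with a synthetic frame and AWGN standing in for the paper's video data.

```python
import numpy as np

def psnr(reference, degraded, max_value=255.0):
    """Peak Signal-to-Noise Ratio in dB between two 8-bit frames."""
    mse = np.mean((reference.astype(np.float64) - degraded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_value ** 2 / mse)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.integers(0, 256, size=(288, 352), dtype=np.uint8)     # synthetic frame
    noisy = np.clip(frame + rng.normal(0, 10, frame.shape), 0, 255).astype(np.uint8)
    print(f"PSNR after AWGN (sigma=10): {psnr(frame, noisy):.2f} dB")
```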
505 A Fuzzy Approach to Liver Tumor Segmentation with Zernike Moments
Authors: Abder-Rahman Ali, Antoine Vacavant, Manuel Grand-Brochier, Adélaïde Albouy-Kissi, Jean-Yves Boire
Abstract:
In this paper, we present a new segmentation approach for liver lesions in regions of interest within MRI (Magnetic Resonance Imaging). This approach, based on a two-cluster Fuzzy C-Means methodology, considers the parameter variable compactness to handle uncertainty. Fine boundaries are detected by a local recursive merging of ambiguous pixels using a sequential forward floating selection with Zernike moments. The method has been tested on both synthetic and real images. When applied to synthetic images, the proposed approach provides good performance: the segmentations obtained are accurate, their shape is consistent with the ground truth, and the extracted information is reliable. The results obtained on MR images confirm these observations. Our approach allows, even in difficult cases of MR images, extraction of a segmentation with good performance in terms of accuracy and shape, which implies that the geometry of the tumor is preserved for further clinical activities (such as automatic extraction of pharmaco-kinetic properties, lesion characterization, etc.).
Keywords: Defuzzification, floating search, fuzzy clustering, Zernike moments.
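The two-cluster Fuzzy C-Means step at the core of the approach iterates the standard membership and centroid updates. A compact sketch on 1-D intensity values is shown below; the compactness variable, the Zernike-moment refinement and the MRI data of the paper are not included.

```python
import numpy as np

def fuzzy_c_means(x, n_clusters=2, m=2.0, n_iter=100, seed=0):
    """Standard FCM on 1-D data: returns cluster centers and membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)            # fuzzy memberships sum to 1
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)    # weighted centroids
        dist = np.abs(x[:, None] - centers[None, :]) + 1e-12
        inv = dist ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True) # membership update
    return centers, u

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic intensities: "lesion" around 90, "background" around 160.
    x = np.concatenate([rng.normal(90, 8, 200), rng.normal(160, 10, 300)])
    centers, u = fuzzy_c_means(x)
    print("centers:", np.round(centers, 1))
    print("hard labels of first 5 pixels:", u[:5].argmax(axis=1))
```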
504 The Relationship between the Environmental and Financial Performance of Australian Electricity Producers
Authors: S. Forughi, A. De Zoysa, S. Bhati
Abstract:
The present study focuses on the environmental performance of companies in the electricity-producing sector and its relationship with their financial performance. We review the major studies that have examined the relationship between the environmental and financial performance of firms in various industries. While classical economic debates consider environmentally friendly activities costly and harmful to a firm's profitability, it is claimed that firms will be rewarded with higher profitability in the long run through investments in environmentally friendly activities. In this context, prior studies have examined the relationship between the environmental and financial performance of firms operating in different industry sectors. Our study employs an environmental indicator to increase the accuracy of the results; it serves as an independent variable in our econometric model to evaluate the impact of the financial performance of firms on their environmentally friendly activities in the context of companies operating in the Australian electricity-producing sector. We expect our methodology to contribute to the literature, and the findings of the study will help us to provide recommendations and policy implications to electricity producers.
Keywords: Australian electricity sector, efficiency measurement, environmental-financial performance interaction, environmental index.
503 Developing Rice Disease Analysis System on Mobile via iOS Operating System
Authors: Rujijan Vichivanives, Kittiya Poonsilp, Canasanan Wanavijit
Abstract:
This research aims to create a mobile tool to analyze rice disease quickly and easily. The principles of object-oriented software engineering and the Objective-C language were used as the software development methodology, and the decision tree technique was used as the analysis method. Application users can select the features of a rice disease or the color that appears on the rice leaves to obtain recognition analysis results on the iOS mobile screen. After the software development was completed, unit testing and integration testing were used to check program validity. In addition, three plant experts and forty farmers assessed the usability and benefit of this system. Overall user satisfaction was found to be at a good level (57%). The plant experts commented that various disease symptoms should be added to the database for more precise analysis results. For further research, it is suggested that an image processing system be developed as a tool that allows users to search for and analyze rice diseases more conveniently and with great accuracy.
Keywords: Rice disease, analysis system, mobile application, iOS operating system.
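The analysis itself is a decision tree over symptom features. A sketch of the same idea using scikit-learn on hypothetical, hand-made feature encodings (lesion color, lesion shape, leaf part) is shown below; the real app was written in Objective-C with its own rule set, so this Python fragment is purely illustrative.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical encoded observations: [lesion_color, lesion_shape, leaf_part]
# color: 0=brown 1=grey 2=yellow; shape: 0=spot 1=streak; part: 0=blade 1=sheath
X = [
    [0, 0, 0], [0, 0, 0], [1, 0, 0], [1, 0, 0],
    [2, 1, 0], [2, 1, 0], [0, 1, 1], [0, 1, 1],
]
y = [
    "brown_spot", "brown_spot", "blast", "blast",
    "bacterial_blight", "bacterial_blight", "sheath_blight", "sheath_blight",
]

clf = DecisionTreeClassifier(random_state=0).fit(X, y)
print(export_text(clf, feature_names=["lesion_color", "lesion_shape", "leaf_part"]))
print(clf.predict([[1, 0, 0]]))   # grey spot on the blade -> "blast" in this toy set
```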
502 Power Generation Potential of Dynamic Architecture
Authors: Ben Richard Hughes, Hassam Nasarullah Chaudhry
Abstract:
The main aim of this work is to establish the capabilities of new green buildings to provide off-grid electricity generation, based on the integration of wind turbines into the conceptual model of a rotating tower [2] in Dubai. An in-depth performance analysis of the WinWind 3.0 MW wind turbine [3] is performed. Data from the Dubai Meteorological Services are collected and analyzed in conjunction with the performance analysis of this wind turbine. The mathematical model is compared with Computational Fluid Dynamics (CFD) results based on a conceptual rotating tower design model. The comparison results are further validated and verified for accuracy by conducting experiments on a scaled prototype of the tower design. The study concluded that integrating wind turbines inside a rotating tower can generate enough electricity to meet the required power consumption of the building, which equates to a wind farm containing 9 horizontal axis wind turbines located over an approximate area of 3,237,485 m² [14].
Keywords: Computational fluid dynamics, green building, horizontal axis wind turbine, rotating tower, velocity gradient.
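For context, the power recoverable from wind by a horizontal-axis turbine is commonly estimated with the standard relation below (a textbook expression, not a formula quoted from the paper), where ρ is air density, A the rotor swept area, v the wind speed and C_p the power coefficient:

```latex
P = \frac{1}{2}\,\rho\,A\,v^{3}\,C_{p}
```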
501 A Comparison of Air Pollution in Developed and Developing Cities: A Case Study of London and Beijing
Abstract:
With the rapid development of industrialization, countries at different stages of development have gradually begun to pay attention to the impact of air pollution on health and the environment. Air pollution control in developed countries provides an effective reference for air pollution control in developing countries, and artificial intelligence and other technologies also play a positive role in the prediction of air pollution. By comparing the annual changes in pollution in London and Beijing, this paper concludes that pollution in the developed city is relatively low and stable, while pollution in Beijing is relatively heavy and unstable, but is clearly improving. In addition, analysis of the changes in major pollutants in Beijing over the past eight years shows that all pollutants except O3 exhibit a significant downward trend and that all pollutants except O3 are correlated to some degree; for example, PM10 and PM2.5 have the greatest influence on the air quality index (AQI). Python, which is commonly used in artificial intelligence, is used as the main software to establish two models, a support vector machine (SVM) and linear regression. By comparing the two models under the same conditions, it is concluded that the SVM has higher accuracy in pollution prediction. The results of this study provide a valuable reference for pollution control and prediction in developing countries.
Keywords: Air pollution, particulate matter, AQI, correlation coefficient, air pollution prediction.
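A minimal Python sketch of the kind of comparison described, fitting a support vector regressor and a linear regression to the same pollutant features and comparing their accuracy, is shown below; the data are synthetic stand-ins for the London/Beijing measurements, and the hyperparameters are illustrative only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 500
# Synthetic features: [PM2.5, PM10, NO2, SO2, CO] concentrations.
X = rng.uniform([10, 20, 10, 2, 0.2], [150, 200, 80, 30, 2.0], size=(n, 5))
# Synthetic AQI dominated by particulate matter, plus noise (illustrative only).
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

svm = SVR(kernel="rbf", C=100, gamma="scale").fit(X_tr, y_tr)
lin = LinearRegression().fit(X_tr, y_tr)

print(f"SVM R^2:    {r2_score(y_te, svm.predict(X_te)):.3f}")
print(f"Linear R^2: {r2_score(y_te, lin.predict(X_te)):.3f}")
```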
500 Using A Hybrid Algorithm to Improve the Quality of Services in Multicast Routing Problem
Authors: Mohammad Reza Karami Nejad
Abstract:
A hybrid learning automata-genetic algorithm (HLGA) is proposed to solve the QoS routing optimization problem of next generation networks, which is NP-complete. The algorithm combines the advantages of the Learning Automata algorithm (LA) and the Genetic Algorithm (GA). It first uses the good global search capability of LA to generate the initial population needed by the GA, and then uses the GA to improve the Quality of Service (QoS) and obtain the optimized multicast tree through new crossover and mutation operators. In the proposed algorithm, the connectivity matrix of edges is used for genotype representation. Some novel heuristics are also proposed for mutation, crossover, and the creation of random individuals. We evaluate the performance and efficiency of the proposed HLGA-based algorithm in comparison with other existing heuristic and GA-based algorithms through simulation. Simulation results demonstrate that the proposed algorithm not only has fast calculation speed and high accuracy but also improves the efficiency of QoS routing in Next Generation Networks, outperforming the previous algorithms in the literature.
Keywords: Routing, Quality of Service, Multicast, Learning Automata, Genetic, Next Generation Networks.
499 Advanced Geolocation of IP Addresses
Authors: Robert Koch, Mario Golling, Gabi Dreo Rodosek
Abstract:
Tracing and locating the geographical location of users (geolocation) is used extensively in today's Internet. Whenever we request, for example, a page from Google we are, unless a specific configuration has been made, automatically forwarded to the page in the relevant language, and, among other things, specific advertisements are presented depending on the location identified. Geolocation has a significant impact especially within the area of network security. Because of the way the Internet works, attacks can be executed from almost anywhere; therefore, for attribution, knowledge of the origin of an attack, and thus geolocation, is mandatory in order to be able to trace back an attacker. In addition, geolocation can also be used very successfully to increase the security of a network during operation, i.e., before an intrusion has actually taken place. Similar to greylisting for email, geolocation allows one to (i) correlate detected attacks with new connections and (ii) consequently classify traffic a priori as more suspicious, thus in particular allowing this traffic to be inspected in more detail. Although numerous techniques for geolocation exist, each strategy is subject to certain restrictions. Following the ideas of Endo et al., this publication tries to overcome these shortcomings with a combined solution of different methods to allow improved and optimized geolocation. We thus present our architecture for improved geolocation, designing a new algorithm which combines several geolocation techniques to increase accuracy.
Keywords: IP geolocation, prosecution of computer fraud, attack attribution, target-analysis.
498 Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
Current real-estate value estimation is difficult for laymen and usually is performed by specialists. This paper presents an automated estimation process, based on big data and machine-learning technology, that calculates the influence of building conditions on real-estate price measurement. The present study analyzed actual building sales sample data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. Following that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for the derivation of machine-learning prototypes. The prediction model is formulated by reference to previous examples; when new examples are applied, it analyzes and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculating weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.
Keywords: Big data, building-value analysis, machine learning, price prediction.
497 Array Signal Processing: DOA Estimation for Missing Sensors
Authors: Lalita Gupta, R. P. Singh
Abstract:
Array signal processing involves signal enumeration and source localization. It is centered on the ability to fuse temporal and spatial information, captured by sampling signals emitted from a number of sources at the sensors of an array, in order to carry out a specific estimation task: estimation of source characteristics (mainly localization of the sources) and/or array characteristics (mainly array geometry). Array signal processing is the part of signal processing that uses sensors organized in patterns, or arrays, to detect signals and to determine information about them. Beamforming is a general signal processing technique used to control the directionality of the reception or transmission of a signal; using beamforming, we can direct the majority of the signal energy received from a group of array elements. Multiple signal classification (MUSIC) is a highly popular eigenstructure-based estimation method of direction of arrival (DOA) with high resolution. This paper examines the effect of missing sensors on DOA estimation. The accuracy of MUSIC-based DOA estimation is degraded significantly both by the effects of missing sensors among the receiving array elements and by unequal channel gain and phase errors of the receiver.
Keywords: Array Signal Processing, Beamforming, ULA, Direction of Arrival, MUSIC
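To make the MUSIC procedure referenced above concrete, the sketch below estimates two DOAs with a uniform linear array using NumPy: form the sample covariance, take the noise subspace from its eigendecomposition, and scan the steering vectors. A "missing sensor" could be emulated by zeroing one row of the snapshots; the exact signal model of the paper is not reproduced here.

```python
import numpy as np

def music_spectrum(snapshots, n_sources, d=0.5, angles=np.arange(-90, 90.5, 0.5)):
    """MUSIC pseudo-spectrum for a ULA with element spacing d (in wavelengths)."""
    m = snapshots.shape[0]
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    eigval, eigvec = np.linalg.eigh(R)                        # ascending eigenvalues
    En = eigvec[:, : m - n_sources]                           # noise subspace
    p = np.empty(len(angles))
    for i, theta in enumerate(np.deg2rad(angles)):
        a = np.exp(-2j * np.pi * d * np.arange(m) * np.sin(theta))  # steering vector
        p[i] = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)
    return angles, p

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n_snap, true_doas = 8, 200, [-20.0, 30.0]
    a = lambda th: np.exp(-2j * np.pi * 0.5 * np.arange(m) * np.sin(np.deg2rad(th)))
    A = np.column_stack([a(th) for th in true_doas])
    S = rng.standard_normal((2, n_snap)) + 1j * rng.standard_normal((2, n_snap))
    X = A @ S + 0.1 * (rng.standard_normal((m, n_snap)) + 1j * rng.standard_normal((m, n_snap)))
    angles, p = music_spectrum(X, n_sources=2)
    # Simple peak picking: local maxima of the pseudo-spectrum, two largest.
    is_peak = (p[1:-1] > p[:-2]) & (p[1:-1] > p[2:])
    cand = np.where(is_peak)[0] + 1
    peaks = angles[cand[np.argsort(p[cand])[-2:]]]
    print("estimated DOAs (deg):", np.sort(peaks))
```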
496 A Study on Manufacturing of Head-Part of Pipes Using a Rotating Manufacturing Process
Authors: J. H. Park, S. K. Lee, Y. W. Kim, D. C. Ko
Abstract:
A large variety of pipe flanges is required in the marine and construction industries. Pipe flanges are usually welded or screwed to the pipe end and are connected with bolts. This approach is very simple and has been widely used for a long time; however, it results in high development cost and low productivity, and the products made by this approach usually have safety problems at the welding area. In this research, a new approach to forming pipe flanges, based on cold forging and the floating die concept, is presented. This innovative approach increases the effectiveness of material usage and saves time and cost compared with the conventional welding method. To ensure the dimensional accuracy of the final product, finite element analysis (FEA) was carried out to simulate the process of cold forging, and orthogonal experiment methods were used to investigate the influence of four manufacturing factors (pin die angle, pipe flange angle, rpm, pin die distance from the clamp jig) and to predict their best combination. The manufacturing factors were obtained by numerical and experimental studies, showing that the approach is very useful and effective for the forming of pipe flanges and can be widely used in the future.
Keywords: Cold forging, FEA, finite element analysis, Forge-3D, rotating forming, tubes.
495 Primer Design with Specific PCR Product using Particle Swarm Optimization
Authors: Cheng-Hong Yang, Yu-Huei Cheng, Hsueh-Wei Chang, Li-Yeh Chuang
Abstract:
Before performing polymerase chain reactions (PCR), a feasible primer set is required. Many primer design methods have been proposed for designing feasible primer sets. However, the majority of these methods require a relatively long time to obtain an optimal solution, since large quantities of template DNA need to be analyzed, and the designed primer sets usually do not provide a specific PCR product. In recent years, evolutionary computation has been applied to PCR primer design and has yielded promising results. In this paper, a particle swarm optimization (PSO) algorithm is proposed to solve primer design problems associated with providing a specific product for PCR experiments. A test set of the gene CYP1A1, associated with a heightened lung cancer risk, was analyzed, and the accuracy and running time were compared with those of the genetic algorithm (GA) and memetic algorithm (MA). A comparison of the results indicated that the proposed PSO method for primer design finds optimal or near-optimal primer sets and effective PCR products in a relatively short time.
Keywords: polymerase chain reaction (PCR), primer design, evolutionary computation, particle swarm optimization (PSO).
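The abstract does not give the PSO update equations or the primer scoring function, so the following is only a generic global-best PSO sketch minimizing a placeholder objective; in the paper this objective would instead encode primer constraints (length, melting temperature, GC content, product specificity).

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Generic global-best particle swarm optimization (minimization)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds[0], float), np.array(bounds[1], float)
    x = rng.uniform(lo, hi, (n_particles, lo.size))      # positions
    v = np.zeros_like(x)                                 # velocities
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()                 # global best
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

if __name__ == "__main__":
    # Placeholder objective standing in for a primer-pair fitness function.
    sphere = lambda p: float(np.sum((p - 3.0) ** 2))
    best, val = pso(sphere, bounds=([-10, -10, -10], [10, 10, 10]))
    print("best position:", np.round(best, 3), "value:", round(val, 6))
```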
494 A Neuro Adaptive Control Strategy for Movable Power Source of Proton Exchange Membrane Fuel Cell Using Wavelets
Authors: M. Sedighizadeh, A. Rezazadeh
Abstract:
Movable power sources based on proton exchange membrane fuel cells (PEMFC) are an important research topic in the current fuel cell (FC) field. The PEMFC system control greatly influences cell performance, and it is a control system for complex industrial problems, due to the imprecision, uncertainty, partial truth and intrinsic nonlinear characteristics of PEMFCs. In this paper an adaptive PI control strategy using a neural network with adaptive Morlet wavelets is proposed. It is based on a single-layer feedforward neural network whose hidden nodes are adaptive Morlet wavelet functions, combined with an infinite impulse response (IIR) recurrent structure. The IIR part is cascaded to the network to provide a double local structure, resulting in improved learning speed. The proposed method is applied to a typical 1 kW PEMFC system and the results show that the proposed method has better accuracy than the MLP (Multi-Layer Perceptron) method.
Keywords: Adaptive control, Morlet wavelets, PEMFC.
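For reference, the real-valued Morlet wavelet commonly used as an activation in such wavelet networks has the standard form below (a textbook expression with center frequency ω₀, often taken as 5, together with its dilated and translated version; not a formula quoted from the paper):

```latex
\psi(t) = \cos(\omega_0 t)\, e^{-t^{2}/2}, \qquad
\psi_{a,b}(t) = \psi\!\left(\frac{t-b}{a}\right)
```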