Search results for: Finite element method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9109

5089 Calculation of Inflation from Salaries Instead of Consumer Products: A Logical Exercise

Authors: E. Dahlen

Abstract:

Inflation can be calculated from either the prices of consumer products or from salaries. This paper presents a logical exercise which shows that it is easier to calculate inflation from salaries than from consumer products. While the prices of consumer products may change due to technological advancement, such as automation, which must be corrected for, salaries do not. If technological advancements are not accounted for within calculations based on consumer product prices, inflation can be confused with real wage changes, since both inflation and real wage changes affect the prices of consumer products. The method employed in this paper is a logical exercise. Logical arguments are presented that suggest the existence of many different feasible ways by which inflation can be determined. Then a short mathematical exercise is presented which shows that one of these methods, using salaries, contains the fewest unknown parameters and is hence the preferred method, since the risk of mistakes is lower. From the results, it can be concluded that salaries, rather than consumer products, should be used to calculate inflation.

Keywords: Inflation, logic, math, real wages.

PDF Downloads: 722
5088 Absorption of CO2 in EAF Reducing Slag from Stainless Steel Making Process by Wet Grinding

Authors: B.M.N. Nik Hisyamudin, S. Yokoyama, M. Umemoto

Abstract:

In the current study, we conducted an experimental investigation on the utilization of electric arc furnace (EAF) reducing slag for the absorption of CO2 via a wet grinding method, carried out under various grinding conditions. The slag was ground in a vibrating ball mill in the presence of CO2 and pure water at ambient temperature. The reaction behavior was monitored with a constant-pressure method, and the changes in the volume of the experimental system as a function of grinding time were measured. It was found that CO2 absorption began as soon as grinding started and stopped when grinding was stopped. The CO2 absorption was significantly higher in the case of wet grinding compared to dry grinding. Generally, the amount of CO2 absorbed increased as the amount of water, the weight of slag, and the initial pressure increased. However, it decreased when the amount of water exceeded 200 ml and when smaller balls were used. According to this research, the CO2 reacted with the CaO in the slag, forming CaCO3.

Keywords: CO2 absorption, EAF reducing slag, vibration ball mill, wet grinding.

PDF Downloads: 1789
5087 An Approach to Noise Variance Estimation in Very Low Signal-to-Noise Ratio Stochastic Signals

Authors: Miljan B. Petrović, Dušan B. Petrović, Goran S. Nikolić

Abstract:

This paper describes a method for AWGN (Additive White Gaussian Noise) variance estimation in noisy stochastic signals, referred to as Multiplicative-Noising Variance Estimation (MNVE). The aim was to develop an estimation algorithm with a minimal number of assumptions about the original signal structure. MATLAB simulations and analysis of the method applied to speech signals showed higher accuracy than the standard AR (autoregressive) modeling noise estimation technique. In addition, good performance was observed at very low signal-to-noise ratios, which in general represent the worst-case scenario for signal denoising methods. High execution time appears to be the only disadvantage of MNVE. After close examination of all the observed features of the proposed algorithm, it was concluded that the method is worth exploring further and that, with some adjustments and improvements, it can be notably powerful.
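
The paper's MNVE algorithm itself is not reproduced here. As a point of reference, the sketch below shows one common baseline for estimating AWGN variance from a noisy one-dimensional signal, assuming only that the clean signal varies slowly between adjacent samples; the function name and the test signal are illustrative.

```python
import numpy as np

def estimate_awgn_std(x):
    """Robust AWGN standard-deviation estimate from a noisy 1-D signal.

    Assumes the clean signal varies slowly between adjacent samples, so the
    first difference is dominated by the noise term, whose std is sigma*sqrt(2).
    The median absolute deviation (MAD) makes the estimate robust to outliers.
    """
    d = np.diff(x)
    mad = np.median(np.abs(d - np.median(d)))
    return mad / (0.6745 * np.sqrt(2.0))   # 0.6745 converts MAD to std for Gaussians

# Illustrative check at a very low SNR
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 16000)
clean = np.sin(2 * np.pi * 5 * t)          # slowly varying "signal"
sigma_true = 2.0                           # noise power well above the signal
noisy = clean + rng.normal(0.0, sigma_true, t.size)
print(estimate_awgn_std(noisy), "vs true", sigma_true)
```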

Keywords: Noise, signal-to-noise ratio, stochastic signals, variance estimation.

PDF Downloads: 2262
5086 A Constructive Analysis of the Formation of LGBTQ Families: Where Utopia and Reality Meet

Authors: Panagiotis Pentaris

Abstract:

The issue of social and legal recognition of LGBTQ families is of high importance when exploring the possibility of forming a family, as is the fact that both society and the individual contribute to the overall recognition of LGBTQ families. Methodologically, this paper is a conceptual discussion of both sides; it uses a method of constructive analysis to expound on this issue. The method's aim is to broaden conceptual theory and introduce a new relationship between concepts that were previously not associated by evidence. This exploration has found that LGBTQ realities may differ from an international perspective and that both legal and social rights are critical to self-consciousness and the formation of a family. This paper asserts that the internalised and historic oppression of LGBTQ individuals places them, not always and not in all places, in a disadvantageous position when it comes to engaging with the possibility of forming a family. The paper concludes that lack of social recognition and internalised oppression are key barriers to LGBTQ families.

Keywords: Family, gay, LGBTQ, self-worth, social rights.

PDF Downloads: 2146
5085 Edge Detection Using Multi-Agent System: Evaluation on Synthetic and Medical MR Images

Authors: A. Nachour, L. Ouzizi, Y. Aoura

Abstract:

Recent developments in multi-agent systems have opened a new research field in image processing. Several algorithms are used simultaneously and improved in different applications while new methods are investigated. This paper presents a new automatic method for edge detection using several agents and many different actions. The proposed multi-agent system is based on parallel agents that locally perceive their environment, that is, pixels and additional environmental information. This environment is built using Vector Field Convolution, which attracts free agents to the edges. Problems of partial or hidden edges and of edge linking are solved through cooperation between agents. The presented method was implemented and evaluated using several examples on different synthetic and medical images. The experimental results suggest that this approach confirms the efficiency and accuracy of the detected edges.
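
The following toy sketch illustrates the general idea of agents that locally perceive the image and move toward edges. It is not the authors' system: the environment here is a plain gradient-magnitude field rather than Vector Field Convolution, inter-agent cooperation for edge linking is omitted, and all parameters are illustrative.

```python
import numpy as np

def agent_edge_detection(img, n_agents=500, steps=50, threshold=0.2, seed=0):
    """Toy multi-agent edge marking: each agent hill-climbs the image's
    gradient-magnitude field and marks pixels above a threshold as edges."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    mag /= mag.max() + 1e-12                       # normalize to [0, 1]
    h, w = img.shape
    rng = np.random.default_rng(seed)
    pos = np.column_stack([rng.integers(0, h, n_agents),
                           rng.integers(0, w, n_agents)])
    edges = np.zeros_like(mag, dtype=bool)
    for _ in range(steps):
        for k, (r, c) in enumerate(pos):
            # Each agent perceives its 3x3 neighbourhood and moves to the
            # neighbour with the largest gradient magnitude, if any is larger.
            r0, r1 = max(r - 1, 0), min(r + 2, h)
            c0, c1 = max(c - 1, 0), min(c + 2, w)
            window = mag[r0:r1, c0:c1]
            if window.max() > mag[r, c]:
                dr, dc = np.unravel_index(np.argmax(window), window.shape)
                pos[k] = (r0 + dr, c0 + dc)
            if mag[pos[k][0], pos[k][1]] > threshold:
                edges[pos[k][0], pos[k][1]] = True
    return edges

# Synthetic step-edge image: agents near the boundary mark it as an edge
img = np.zeros((64, 64))
img[:, 32:] = 1.0
print(agent_edge_detection(img).sum(), "edge pixels marked")
```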

Keywords: Edge detection, medical MR images, multi-agent systems, vector field convolution.

PDF Downloads: 1910
5084 LabVIEW with Fuzzy Logic Controller Simulation Panel for Condition Monitoring of Oil and Dry Type Transformer

Authors: N. A. Muhamad, S.A.M. Ali

Abstract:

Condition monitoring of electrical power equipment has attracted considerable attention for many years. The aim of this paper is to use LabVIEW with a Fuzzy Logic controller to build a simulation system that diagnoses transformer faults and monitors their condition. The front panel of the system was designed in LabVIEW to enable the computer to act as a custom-designed instrument. The dissolved gas-in-oil analysis (DGA) method was used as the diagnosis technique for the oil-type transformer, while terminal voltage and current analysis was used for the dry-type transformer. Fuzzy Logic was used as an expert system that assesses all information keyed in at the front panel to diagnose and predict the condition of the transformer. The outcome of the Fuzzy Logic interpretation is displayed on the LabVIEW front panel to show the user the condition of the transformer at any time.
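
As a small illustration of the kind of fuzzy inference such an expert system performs, the Python sketch below evaluates one Mamdani-style rule with triangular membership functions. The rule, the temperature and gas-ratio inputs, and every breakpoint are illustrative placeholders, not values from the paper or from any DGA standard.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def overheating_degree(oil_temp_c, gas_ratio):
    """Toy fuzzy rule: IF oil temperature is HIGH AND a key-gas ratio is HIGH
    THEN the overheating-fault degree is their minimum (Mamdani-style AND).
    All breakpoints below are illustrative placeholders."""
    temp_high = tri(oil_temp_c, 70.0, 95.0, 120.0)
    ratio_high = tri(gas_ratio, 1.0, 3.0, 5.0)
    return min(temp_high, ratio_high)

print(overheating_degree(90.0, 2.5))   # degree of membership in "overheating fault"
```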

Keywords: LabVIEW, Fuzzy Logic, condition monitoring, oil transformer, dry transformer, DGA, terminal values.

PDF Downloads: 3237
5083 A Design of Elliptic Curve Cryptography Processor Based on SM2 over GF(p)

Authors: Shiji Hu, Lei Li, Wanting Zhou, Daohong Yang

Abstract:

Data encryption is the foundation of today's communication, and improving the speed of data encryption and decryption is an important goal for high-speed applications. This paper proposes an elliptic curve cryptographic processor architecture based on the SM2 prime field. Regarding the hardware implementation, we optimized the algorithms in different stages of the structure. For the modular operation on the finite field, we proposed an optimized improvement of the Karatsuba-Ofman multiplication algorithm and shortened the critical path through a pipeline structure in the algorithm implementation. Based on the SM2 recommended prime field, a fast modular reduction algorithm is used to reduce the 512-bit data obtained from the multiplication unit. The radix-4 extended Euclidean algorithm is used to realize the conversion between the affine coordinate system and the Jacobian projective coordinate system. In the parallel scheduling of point operations on elliptic curves, we propose a three-level parallel structure of point addition and point doubling based on the Jacobian projective coordinate system. Combined with the scalar multiplication algorithm, we added mutual pre-operations to the point addition and point doubling operations to improve the efficiency of scalar point multiplication. The proposed ECC hardware architecture was verified and implemented on Xilinx Virtex-7 and ZYNQ-7 platforms, and each 256-bit scalar multiplication operation took 0.275 ms. The performance for scalar multiplication is 32 times that of a CPU (dual-core ARM Cortex-A9).
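
The Karatsuba-Ofman step named above is a standard divide-and-conquer multiplier. The sketch below shows the recursion in plain Python for non-negative integers; the hardware pipelining, the SM2-specific modular reduction, and the coordinate conversions described in the abstract are not reproduced, and the cutoff value is an arbitrary illustrative choice.

```python
def karatsuba(x, y, cutoff=64):
    """Karatsuba-Ofman multiplication of non-negative integers.

    Splits each operand into high/low halves and replaces the four
    sub-multiplications of schoolbook multiplication with three.
    """
    if x < (1 << cutoff) or y < (1 << cutoff):
        return x * y                       # small operands: plain multiply
    n = max(x.bit_length(), y.bit_length())
    half = n // 2
    xh, xl = x >> half, x & ((1 << half) - 1)
    yh, yl = y >> half, y & ((1 << half) - 1)
    z2 = karatsuba(xh, yh, cutoff)         # high * high
    z0 = karatsuba(xl, yl, cutoff)         # low * low
    z1 = karatsuba(xh + xl, yh + yl, cutoff) - z2 - z0   # cross terms
    return (z2 << (2 * half)) + (z1 << half) + z0

a, b = 2**255 - 19, 2**256 - 189
assert karatsuba(a, b) == a * b
```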

Keywords: Elliptic curve cryptosystems, SM2, modular multiplication, point multiplication.

PDF Downloads: 279
5082 Walking Hexapod Robot in Disaster Recovery: Developing Algorithm for Terrain Negotiation and Navigation

Authors: Md. Masum Billah, Mohiuddin Ahmed, Soheli Farhana

Abstract:

In modern times, disaster recovery missions have become one of the top priorities in any natural disaster management regime. Smart autonomous robots may play a significant role in such missions, including the search for life under earthquake-hit rubble, on tsunami-hit islands, in de-mining of war-affected areas, and in many other situations. In this paper, the current state of many walking robots is compared and the advantages of hexapod systems over wheeled robots are described. In our research we have selected a hexapod spider robot, and we are focusing mainly on an efficient navigation method for different terrains using an appropriate gait of locomotion, which will make the robot faster and, at the same time, more energy efficient in navigating and negotiating difficult terrain. This paper describes the method of terrain negotiation and navigation in a hazardous field.
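
For illustration, the sketch below sequences a tripod gait, one common hexapod gait in which two sets of three legs alternate between swing and stance. The leg numbering and phase structure are illustrative assumptions, not the gait controller developed in the paper.

```python
# Legs numbered 0-5: left front/middle/rear = 0,1,2; right front/middle/rear = 3,4,5.
TRIPOD_A = (0, 4, 2)   # left-front, right-middle, left-rear
TRIPOD_B = (3, 1, 5)   # right-front, left-middle, right-rear

def tripod_gait(n_steps=4):
    """Alternate the two tripods: one set of three legs swings forward while
    the other stays planted and pushes, always keeping a stable triangle."""
    for i in range(n_steps):
        swing, stance = (TRIPOD_A, TRIPOD_B) if i % 2 == 0 else (TRIPOD_B, TRIPOD_A)
        yield {"step": i, "swing": swing, "stance": stance}

for phase in tripod_gait():
    print(phase)
```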

Keywords: Walking robots, locomotion, hexapod robot, gait, hazardous field.

PDF Downloads: 4443
5081 Infrastructure Change Monitoring Using Multitemporal Multispectral Satellite Images

Authors: U. Datta

Abstract:

The main objective of this study is to find a suitable approach to monitor land infrastructure growth over a period of time using multispectral satellite images. Bi-temporal change detection methods are unable to indicate the continuous change occurring over a long period of time. To achieve this objective, the approach used here estimates a statistical model from a series of multispectral images acquired over a long period, assuming there is no considerable change during that period, and then compares it with the multispectral image data obtained at a later time. The change is estimated pixel-wise. A statistical composite hypothesis technique is used for estimating pixel-based change detection in a defined region. The generalized likelihood ratio test (GLRT) is used to detect a changed pixel from the probabilistically estimated model of the corresponding pixel. The changed pixel is detected assuming that the images have been co-registered prior to estimation. To minimize error due to co-registration, the 8-neighborhood pixels around the pixel under test are also considered. Multispectral images from Sentinel-2 and Landsat-8 from 2015 to 2018 are used for this purpose. There are different challenges in this method. The first and foremost challenge is to obtain a sufficiently large number of datasets for multivariate distribution modelling, since a large number of images are always discarded due to cloud coverage. Due to imperfect modelling, there will be a high probability of false alarms. The overall conclusion that can be drawn from this work is that the probabilistic method described in this paper has given some promising results, which need to be pursued further.
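
Under a per-pixel multivariate Gaussian no-change model with parameters fitted from the image series, the GLRT statistic for a new observation reduces to a squared Mahalanobis distance, which can be thresholded with a chi-square quantile. The sketch below illustrates that core test only, under these assumptions; co-registration, the 8-neighbourhood handling, and cloud screening described in the abstract are omitted, and the synthetic data are placeholders.

```python
import numpy as np
from scipy.stats import chi2

def change_map(stack, new_img, alpha=0.01):
    """Pixel-wise change detection.

    stack   : (T, H, W, B) co-registered multitemporal multispectral images
              assumed to contain no change (used to fit the no-change model).
    new_img : (H, W, B) later acquisition to test.
    Fits a per-pixel Gaussian over the T samples and flags pixels whose
    squared Mahalanobis distance exceeds the chi-square threshold.
    """
    T, H, W, B = stack.shape
    mean = stack.mean(axis=0)                                  # (H, W, B)
    centered = stack - mean
    cov = np.einsum('thwi,thwj->hwij', centered, centered) / (T - 1)
    cov += 1e-6 * np.eye(B)                                    # regularize
    diff = new_img - mean                                      # (H, W, B)
    sol = np.linalg.solve(cov, diff[..., None])[..., 0]        # cov^-1 * diff
    d2 = np.einsum('hwi,hwi->hw', diff, sol)                   # Mahalanobis^2
    return d2 > chi2.ppf(1.0 - alpha, df=B)

# Synthetic example: 10 "no-change" frames, then one frame with a changed block
rng = np.random.default_rng(1)
stack = rng.normal(0.0, 1.0, (10, 32, 32, 4))
new = rng.normal(0.0, 1.0, (32, 32, 4))
new[8:16, 8:16] += 5.0
print(change_map(stack, new).sum(), "pixels flagged as changed")
```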

Keywords: Co-registration, GLRT, infrastructure growth, multispectral, multitemporal, pixel-based change detection.

PDF Downloads: 740
5080 The Effect of Saturates on Rheological and Aging Characteristics of Bitumen

Authors: Madi Hermadi, Kemas A. Zamhari, Ahmad T. bin A. Karim, Mohd. E. Abdullah, Ling Lloyd L.

Abstract:

According to the Rostler method (ASTM D 2006), the saturates content of bitumen is determined based on its reactivity to sulphuric acid, while the Corbett method (ASTM D 4124) determines it based on polarity level. This paper presents results from a study on the effect of saturates content, determined by the two different fractionation methods, on the rheological and aging characteristics of bitumen. The results indicated that increasing saturates content tended to reduce all the rheological characteristics concerned: the bitumen became less elastic, less viscous, and less resistant to plastic deformation, but more resistant to fatigue cracking. After the short- and long-term aging processes, the treatment effect coefficients of saturates decreased, indicating that the saturates became thicker due to aging. This study concludes that saturates are not really stable (i.e., they are reactive) during the aging process; therefore, the reactivity of saturates should be considered in the bitumen aging index.

Keywords: Aging index, bitumen, saturates, rheology.

PDF Downloads: 2384
5079 Reinforced Concrete Bridge Deck Condition Assessment Methods Using Ground Penetrating Radar and Infrared Thermography

Authors: Nicole M. Martino

Abstract:

Reinforced concrete bridge deck condition assessments primarily use visual inspection methods, where an inspector looks for and records locations of cracks, potholes, efflorescence and other signs of probable deterioration. Sounding is another technique used to diagnose the condition of a bridge deck; however, this method listens for damage within the subsurface as the surface is struck with a hammer or chain. Even though extensive procedures are in place for using these inspection techniques, neither one provides the inspector with a comprehensive understanding of the internal condition of a bridge deck, the location where damage originates. In order to make accurate estimates of repair locations and quantities, in addition to allocating the necessary funding, a total understanding of the deck's deteriorated state is key. The research presented in this paper collected infrared thermography and ground penetrating radar data from reinforced concrete bridge decks without an asphalt overlay. These decks were of various ages, and their condition varied from brand new to in need of replacement. The goals of this work were first to verify that these nondestructive evaluation methods could identify similar areas of healthy and damaged concrete, and then to see whether combining the results of both methods would provide higher confidence than a condition assessment completed using only one method. The results from each method were presented as plan-view color contour plots. The results from one of the decks assessed as part of this research, including these plan-view plots, are presented in this paper. Furthermore, in order to answer the interest of transportation agencies throughout the United States, this research developed a step-by-step guide which demonstrates how to collect and assess a bridge deck using these nondestructive evaluation methods. This guide addresses setup procedures on the deck during the day of data collection, system setups and settings for different bridge decks, data post-processing for each method, and data visualization and quantification.

Keywords: Bridge deck deterioration, ground penetrating radar, infrared thermography, NDT of bridge decks.

PDF Downloads: 918
5078 Distributed System Computing Resource Scheduling Algorithm Based on Deep Reinforcement Learning

Authors: Yitao Lei, Xingxiang Zhai, Burra Venkata Durga Kumar

Abstract:

As the quantity and complexity of computing in large-scale software systems increase, distributed computing becomes increasingly important. A distributed system realizes high-performance computing through collaboration between different computing resources. Without efficient resource scheduling, the misuse of distributed computing may cause resource waste and high costs. Resource scheduling is usually an NP-hard problem, so no general solution exists, although optimization algorithms such as genetic algorithms and ant colony optimization are available. The large scale of distributed systems makes these traditional optimization algorithms challenging to apply, so heuristic and machine learning algorithms are usually employed to ease the computing load. We therefore review traditional resource scheduling optimization algorithms and introduce a deep reinforcement learning method that utilizes the perceptual ability of neural networks and the decision-making ability of reinforcement learning. Using this machine learning method, we try to find the important factors that influence the performance of distributed computing and help the distributed system schedule computing resources efficiently. This paper surveys the application of deep reinforcement learning to distributed system computing resource scheduling. The research proposes a deep reinforcement learning method that uses a recurrent neural network to optimize resource scheduling. The paper concludes with the challenges of and improvement directions for deep reinforcement learning-based resource scheduling algorithms.
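
To make the reinforcement learning formulation concrete, the toy sketch below casts scheduling as: state = coarsely discretized node loads, action = which node receives the next job, reward = negative makespan growth. It uses tabular Q-learning as a stand-in for the recurrent-network agent proposed in the paper; the environment, discretization, and hyperparameters are all illustrative.

```python
import random
from collections import defaultdict

def schedule_q_learning(episodes=2000, n_nodes=2, n_jobs=20,
                        alpha=0.1, gamma=0.95, eps=0.1, seed=0):
    """Toy RL formulation of resource scheduling with tabular Q-learning."""
    random.seed(seed)
    Q = defaultdict(float)

    def disc(loads):                        # coarse state discretization
        return tuple(int(l // 5) for l in loads)

    for _ in range(episodes):
        jobs = [random.randint(1, 10) for _ in range(n_jobs)]
        loads = [0.0] * n_nodes
        for job in jobs:
            s = disc(loads)
            if random.random() < eps:       # epsilon-greedy exploration
                a = random.randrange(n_nodes)
            else:
                a = max(range(n_nodes), key=lambda x: Q[(s, x)])
            before = max(loads)
            loads[a] += job
            r = -(max(loads) - before)      # penalize makespan growth
            s2 = disc(loads)
            best_next = max(Q[(s2, x)] for x in range(n_nodes))
            Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
    return Q

Q = schedule_q_learning()
print(len(Q), "state-action values learned")
```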

Keywords: Resource scheduling, deep reinforcement learning, distributed system, artificial intelligence.

PDF Downloads: 505
5077 Forecasting Models for Steel Demand Uncertainty Using Bayesian Methods

Authors: Watcharin Sangma, Onsiri Chanmuang, Pitsanu Tongkhow

Abstract:

A forecasting model for steel demand uncertainty in Thailand is proposed. It consists of trend, autocorrelation, and outlier components in a hierarchical Bayesian framework. The proposed model uses a cumulative Weibull distribution function, latent first-order autocorrelation, and binary selection to account for trend, time-varying autocorrelation, and outliers, respectively. Gibbs sampling Markov chain Monte Carlo (MCMC) is used for parameter estimation. The proposed model is applied to steel demand index data in Thailand. The root mean square error (RMSE), mean absolute percentage error (MAPE), and mean absolute error (MAE) criteria are used for model comparison. The study reveals that the proposed model is more appropriate than the exponential smoothing method.
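
The three comparison criteria are standard and can be computed as in the short sketch below; the example series is a placeholder, and MAPE assumes no zero values in the observed series.

```python
import numpy as np

def forecast_errors(actual, predicted):
    """RMSE, MAPE (%) and MAE for a forecast against observed demand."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    err = actual - predicted
    rmse = np.sqrt(np.mean(err ** 2))
    mape = 100.0 * np.mean(np.abs(err / actual))   # assumes no zero actuals
    mae = np.mean(np.abs(err))
    return rmse, mape, mae

print(forecast_errors([100, 120, 130], [98, 125, 128]))
```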

Keywords: Forecasting model, Steel demand uncertainty, Hierarchical Bayesian framework, Exponential smoothing method.

PDF Downloads: 2540
5076 Mobile Collaboration Learning Technique on Students in Developing Nations

Authors: Amah Nnachi Lofty, Oyefeso Olufemi, Ibiam Udu Ama

Abstract:

New and more powerful communication technologies continue to emerge at a rapid pace, their uses in education are widespread, and their impact is remarkable in developing societies. This study investigates the effect of the Mobile Collaboration Learning Technique (MCLT) on learning outcomes among students in tertiary institutions of developing nations (a case of Nigerian students). It examines the significance of retention achievement scores of students taught using mobile collaboration versus the conventional method. The sample consisted of 120 students selected using a stratified random sampling method. Five research questions and hypotheses were formulated and tested at the 0.05 level of significance. A student achievement test (SAT) consisting of 40 multiple-choice objective items was developed and validated by professionals for data collection. The SAT was administered to students as a pre-test and post-test. The data were analyzed using the t-test statistic to test the hypotheses. The results indicated that students taught using MCLT performed significantly better than their counterparts taught using the conventional method of instruction. Also, there was no significant difference in the post-test performance scores of male and female students taught using MCLT. Based on the findings, the following submissions were made: mobile collaboration systems should be encouraged in institutions to boost knowledge sharing among learners; workshops and training should be organized to train teachers in the use of this technique; and schools and government should consistently align curriculum standards with technological trends and formulate policies and procedures for the responsible use of MCLT.
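
The hypothesis test used is an independent-samples t-test at the 0.05 level. The sketch below shows how such a comparison between the MCLT and conventional groups could be run; the score arrays are made-up placeholders, not the study's data.

```python
from scipy import stats

# Hypothetical post-test scores (placeholders, not the study's data)
mclt_scores = [72, 68, 75, 80, 77, 69, 74, 81, 73, 78]
conventional_scores = [61, 65, 58, 70, 63, 60, 66, 62, 59, 64]

t_stat, p_value = stats.ttest_ind(mclt_scores, conventional_scores)
alpha = 0.05
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("significant difference" if p_value < alpha else "no significant difference")
```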

Keywords: Education, communication, learning, mobile collaboration, technology.

PDF Downloads: 1816
5075 Optimal and Generalized Multiple Descriptions Image Coding Transform in the Wavelet Domain

Authors: Bahi brahim, El hassane Ibn Elhaj, Driss Aboutajdine

Abstract:

In this paper, we propose a Multiple Description Image Coding (MDIC) scheme that generates two compressed, balanced-rate descriptions in the wavelet domain (Daubechies biorthogonal (9, 7) wavelet) using an optimal pairwise correlating transform, together with an application of Generalized Multiple Description Coding (GMDC) to image coding in the wavelet domain. The GMDC produces statistically correlated streams such that lost streams can be estimated from the received data. Our performance tests show that the proposed method gives greater improvement and good quality of the reconstructed image when the wavelet coefficients are normalized by a Gaussian Scale Mixture (GSM) model rather than a Gaussian one.

Keywords: Multiple description coding (MDC), gaussian scale mixture (GSM) model, joint source-channel coding, pairwise correlating transform, GMDCT.

PDF Downloads: 1621
5074 Unsupervised Feature Selection Using Feature Density Functions

Authors: Mina Alibeigi, Sattar Hashemi, Ali Hamzeh

Abstract:

Since dealing with high-dimensional data is computationally complex and sometimes even intractable, several feature reduction methods have recently been developed to reduce the dimensionality of the data in order to simplify analysis in applications such as text categorization, signal processing, image retrieval, and gene expression analysis. Among feature reduction techniques, feature selection is one of the most popular methods due to its preservation of the original features. In this paper, we propose a new unsupervised feature selection method which removes redundant features from the original feature space by using the probability density functions of the features. To show the effectiveness of the proposed method, popular feature selection methods have been implemented and compared. Experimental results on several datasets from the UCI repository illustrate the effectiveness of our proposed method in comparison with the other methods in terms of both classification accuracy and the number of selected features.
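
The paper's exact redundancy criterion is not reproduced here. The sketch below only illustrates the general idea of comparing estimated feature densities: each feature's one-dimensional density is estimated with a Gaussian kernel density estimate, and a feature is dropped when its density is nearly identical to that of a feature already kept. The distance measure and threshold are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def select_features_by_density(X, threshold=0.05, grid_size=200):
    """Unsupervised filter: keep a feature only if its estimated density
    differs enough (mean absolute difference on a common grid) from every
    feature already kept. Purely illustrative redundancy criterion."""
    n_features = X.shape[1]
    grid = np.linspace(X.min(), X.max(), grid_size)
    densities = [gaussian_kde(X[:, j])(grid) for j in range(n_features)]
    kept = []
    for j in range(n_features):
        redundant = any(np.mean(np.abs(densities[j] - densities[k])) < threshold
                        for k in kept)
        if not redundant:
            kept.append(j)
    return kept

rng = np.random.default_rng(0)
a = rng.normal(0, 1, 300)
X = np.column_stack([a, a + rng.normal(0, 0.01, 300), rng.uniform(-3, 3, 300)])
print(select_features_by_density(X))   # the near-duplicate column should be dropped
```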

Keywords: Feature, Feature Selection, Filter, Probability Density Function

PDF Downloads: 2085
5073 On Problem of Parameters Identification of Dynamic Object

Authors: Kamil Aida-zade, C. Ardil

Abstract:

In this paper, some formulations of the problem of recovering the parameters of a dynamic object described by a non-autonomous system of ordinary differential equations with multipoint unshared edge conditions are investigated. Depending on the number of additional conditions, the problem is reduced to a system of algebraic equations or to a quadratic programming problem. For this purpose, the paper offers a new scheme for the edge-condition transfer method, called the conditions shift. The method makes it possible to eliminate the differential links and the multipoint unshared initial-edge conditions. The advantage of the proposed approach is that it reduces a parametric identification problem to the essentially simpler problems of solving an algebraic system or a quadratic program.

Keywords: dynamic objects, ordinary differential equations, multipoint unshared edge conditions, quadratic programming, conditions shift

PDF Downloads: 1464
5072 Method for Determining the Probing Points for Efficient Measurement of Freeform Surface

Authors: Yi Xu, Zexiang Li

Abstract:

In inspection and workpiece localization, the sampling of point data is an important issue. Since sampling devices measure only discrete points, not the complete surface, the number and locations of the sampled points must be taken into consideration. In this paper, a method is presented for determining the size and location of the sampled point set to achieve efficient sampling. Firstly, an uncertainty analysis of the localization parameters is carried out, and a localization uncertainty model is developed to predict the uncertainty of the localization process. Using this model, the minimum number of sampled points is predicted. Secondly, based on algebraic theory, an eigenvalue-optimal optimization is proposed. A freeform surface is then used in simulation, and the proposed optimization is implemented. The simulation results show its effectiveness.

Keywords: eigenvalue-optimal optimization, freeform surface inspection, sampling size and location, sampled points.

PDF Downloads: 1248
5071 The Different Roles between Sodium and Potassium Ions in Ion Exchange of WO3/SiO2 Catalysts

Authors: K. Pipitthapan, S. Maksasithorn, P. Praserthdam, J. Panpranot, K. Suriye, S. Kunjara Na Ayudhya

Abstract:

WO3/SiO2 catalysts were modified by an ion exchange method with sodium hydroxide or potassium hydroxide solution. The performance of the modified catalysts was tested in the metathesis of ethylene and trans-2-butene to propylene. During ion exchange, sodium and potassium ions played different roles. Sodium modified catalysts revealed constant trans-2-butene conversion and propylene selectivity when the concentrations of sodium in the solution were varied. In contrast, potassium modified catalysts showed reduction of the conversion and increase of the selectivity. From these results, potassium hydroxide may affect the transformation of tungsten oxide active species, resulting in the decrease in conversion whereas sodium hydroxide did not. Moreover, the modification of catalysts by this method improved the catalyst stability by lowering the amount of coke deposited on the catalyst surface.

Keywords: Acid sites, alkali metals, isomerization, metathesis.

PDF Downloads: 1721
5070 Planning Rigid Body Motions and Optimal Control Problem on Lie Group SO(2, 1)

Authors: Nemat Abazari, Ilgin Sager

Abstract:

In this paper, smooth trajectories are computed in the Lie group SO(2, 1) as a motion planning problem by assigning a Frenet frame to the rigid body system in order to optimize the cost function of the elastic energy spent to track a timelike curve in Minkowski space. A method is proposed to solve a motion planning problem that minimizes the integral of the Lorentz inner product of the Darboux vector of a timelike curve. This method uses the coordinate-free Maximum Principle of optimal control and results in the theory of integrable Hamiltonian systems. The presence of several conserved quantities inherent in these Hamiltonian systems aids in the explicit computation of the rigid body motions.

Keywords: Optimal control, Hamiltonian vector field, Darboux vector, maximum principle, lie group, rigid body motion, Lorentz metric.

PDF Downloads: 1576
5069 Silicon Application and Nitrogen on Yield and Yield Components in Rice (Oryza sativa L.) in Two Irrigation Systems

Authors: Abbas Ghanbari-Malidareh

Abstract:

Silicon is a beneficial element for plant growth. It helps plants to overcome multiple stresses, alleviates metal toxicity, and improves nutrient imbalance. A field experiment was conducted as a split-split plot arranged in a randomized complete block design with four replications. The irrigation systems, continuous flooding and deficit irrigation, served as main plots; nitrogen rates of N0, N46, N92, and N138 kg/ha as subplots; and silicon rates of Si0 and Si500 kg/ha as sub-subplots. The results indicate no significant difference in grain yield between the irrigation systems. Flooding irrigation had a higher biological yield than deficit irrigation, whereas there was no significant difference in grain and straw yield. Nitrogen application increased grain, biological, and straw yield. Silicon application increased grain, biological, and straw yield but decreased the harvest index. Flooding irrigation produced a higher number of total tillers per hill than deficit irrigation, but deficit irrigation produced a higher number of fertile tillers per hill than flooding irrigation. Silicon increased the number of filled spikelets and decreased the number of blank spikelets. High nitrogen application decreased the 1000-grain weight. It can be concluded that when nitrogen application is high and the water supply is adequate, silicon application can increase grain yield.

Keywords: Grain yield, Irrigation, Nitrogen, Rice, Silicon.

PDF Downloads: 3270
5068 Design and Implementation of Shared Memory based Parallel File System Logging Method for High Performance Computing

Authors: Hyeyoung Cho, Sungho Kim, SangDong Lee

Abstract:

I/O workload is a critical and important factor in analyzing I/O patterns and file system performance. However, tracing I/O operations on the fly in a distributed parallel file system is non-trivial due to collection overhead and the large volume of data. In this paper, we design and implement a parallel file system logging method for high performance computing using a shared-memory-based multi-layer scheme. It minimizes overhead with a reduced logging operation response time and provides an efficient post-processing scheme through shared memory. A separate logging server can collect sequential logs from multiple clients in a cluster through packet communication. Implementation and evaluation results show the low overhead and high scalability of this architecture for high performance parallel logging analysis.

Keywords: I/O workload, PVFS, I/O Trace.

PDF Downloads: 1565
5067 Automatic Detection and Classification of Diabetic Retinopathy Using Retinal Fundus Images

Authors: A. Biran, P. Sobhe Bidari, A. Almazroe V. Lakshminarayanan, K. Raahemifar

Abstract:

Diabetic Retinopathy (DR) is a severe retinal disease which is caused by diabetes mellitus. It leads to blindness when it progresses to the proliferative level. Early indications of DR are the appearance of microaneurysms, hemorrhages and hard exudates. In this paper, an automatic algorithm for the detection of DR is proposed. The algorithm is based on a combination of several image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering and thresholding. Also, a Support Vector Machine (SVM) classifier is used to classify retinal images as normal or abnormal, the latter including non-proliferative and proliferative DR. The proposed method has been tested on images selected from the Structured Analysis of the Retina (STARE) database using MATLAB code. The method is able to detect DR. The sensitivity, specificity and accuracy of this approach are 90%, 87.5%, and 91.4%, respectively.
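
The sketch below condenses a pipeline of the kind the abstract describes: CLAHE enhancement, a small Gabor filter bank, simple summary features, and an SVM classifier. It is illustrative only: the kernel parameters and the feature vector are assumptions, the CHT and thresholding stages are omitted, and the image paths and labels in the commented training flow are hypothetical placeholders.

```python
import cv2
import numpy as np
from sklearn.svm import SVC

def dr_features(gray):
    """Illustrative feature vector: CLAHE-enhanced image filtered with a
    small Gabor bank; per-orientation mean and std of the responses."""
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(gray)
    feats = []
    for theta in np.arange(0, np.pi, np.pi / 4):        # 4 orientations
        kernel = cv2.getGaborKernel((21, 21), 4.0, theta, 10.0, 0.5, 0,
                                    ktype=cv2.CV_32F)
        resp = cv2.filter2D(enhanced, cv2.CV_32F, kernel)
        feats += [resp.mean(), resp.std()]
    return np.array(feats)

# Hypothetical training flow (image paths and labels are placeholders)
# X = np.array([dr_features(cv2.imread(p, cv2.IMREAD_GRAYSCALE)) for p in image_paths])
# y = np.array(labels)          # 0 = normal, 1 = non-proliferative, 2 = proliferative
# clf = SVC(kernel='rbf', C=1.0).fit(X, y)
# prediction = clf.predict([dr_features(test_gray)])
```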

Keywords: Diabetic retinopathy, fundus images, STARE, Gabor filter, SVM.

PDF Downloads: 1679
5066 A Nonoblivious Image Watermarking System Based on Singular Value Decomposition and Texture Segmentation

Authors: Soroosh Rezazadeh, Mehran Yazdi

Abstract:

In this paper, a robust digital image watermarking scheme for copyright protection applications using the singular value decomposition (SVD) is proposed. In this scheme, an entropy masking model is applied to the host image for texture segmentation. Moreover, the local luminance and textures of the host image are considered in the watermark embedding procedure to increase the robustness of the watermarking scheme. In contrast to all existing SVD-based watermarking systems that have been designed to embed visual watermarks, our system uses a pseudo-random sequence as the watermark. We have tested the performance of our method using a wide variety of image processing attacks on different test images. A comparison is made between the results of our proposed algorithm and those of a wavelet-based method to demonstrate the superior performance of our algorithm.
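
A minimal sketch of the core SVD embedding and non-blind detection steps is given below, assuming a pseudo-random sequence is added into the host's singular values with a fixed strength alpha. The paper's entropy masking and luminance/texture-adaptive embedding are not reproduced; the function names and parameters are illustrative.

```python
import numpy as np

def embed_svd_watermark(host, alpha=0.05, seed=42):
    """Embed a pseudo-random sequence into the singular values of the host.

    host  : 2-D grayscale image as a float array.
    alpha : embedding strength (fixed here; an adaptive scheme would vary it
            with local luminance/texture, which this sketch does not do).
    """
    U, S, Vt = np.linalg.svd(host, full_matrices=False)
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(S.size)            # pseudo-random watermark
    S_marked = S * (1.0 + alpha * w)           # multiplicative embedding
    return U @ np.diag(S_marked) @ Vt, w

def detect_svd_watermark(marked, host, w, alpha=0.05):
    """Non-blind detection: correlate the recovered sequence with w."""
    S_host = np.linalg.svd(host, compute_uv=False)
    S_marked = np.linalg.svd(marked, compute_uv=False)
    w_hat = (S_marked / S_host - 1.0) / alpha
    return np.corrcoef(w, w_hat)[0, 1]

host = np.random.default_rng(0).uniform(0, 255, (64, 64))
marked, w = embed_svd_watermark(host)
print("correlation with embedded watermark:", detect_svd_watermark(marked, host, w))
```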

Keywords: Watermarking, copyright protection, singular value decomposition, entropy masking, texture segmentation.

PDF Downloads: 1769
5065 Automatic Detection and Spatio-temporal Analysis of Commercial Accumulations Using Digital Yellow Page Data

Authors: Yuki. Akiyama, Hiroaki. Sengoku, Ryosuke. Shibasaki

Abstract:

In this study, the locations and areas of commercial accumulations were detected using digital yellow page data. An original buffering method that can accurately create polygons of commercial accumulations is proposed in this paper; by using this method, the distribution of commercial accumulations can be easily created and monitored over a wide area. The locations, areas, and time-series changes of commercial accumulations in the South Kanto region can be monitored by integrating polygons of commercial accumulations with the time-series digital yellow page data. The circumstances of commercial accumulations were shown to vary according to area, that is, between highly urbanized regions such as the city center of Tokyo and prefectural capitals, suburban areas near large cities, and suburban and rural areas.

Keywords: Commercial accumulations, Spatio-temporal analysis, Urban monitoring, Yellow page data

PDF Downloads: 1270
5064 A Study on the Application of TRIZ to CAD/CAM System

Authors: Yuan L. Lai, Jian H. Chen, Jui P. Hung

Abstract:

This study created new graphical icons and operating functions in a CAD/CAM software system by analyzing icons in some of the popular systems, such as AutoCAD, AlphaCAM, Mastercam and the 1st edition of LiteCAM. These software systems all focus on geometric design and editing; thus, how to transmit messages intuitively from the icon itself to users is an important function of graphical icons. The primary purpose of this study is to design innovative icons and commands for new software. This study employed the TRIZ method, an innovative design method, to generate new concepts systematically. Through a literature review, it then investigated and analyzed the relationship between TRIZ and idea development. The Contradiction Matrix and the 40 Principles were used to develop an assisting tool suitable for icon design in software development. We first gathered icon samples from the selected CAD/CAM systems, then grouped these icons by meaningful functions and compared their useful and harmful properties. Finally, we developed new icons for new software systems in order to avoid intellectual property problems.

Keywords: Icon, TRIZ, CAD/CAM.

PDF Downloads: 1798
5063 Error Correction Method for 2D Ultra-Wideband Indoor Wireless Positioning System Using Logarithmic Error Model

Authors: Phornpat Chewasoonthorn, Surat Kwanmuang

Abstract:

Indoor positioning technologies have evolved rapidly. They augment the Global Positioning System (GPS), which requires line-of-sight to the sky, to track the location of people or objects. In this study, we developed an error correction method for an indoor real-time location system (RTLS) based on an ultra-wideband (UWB) sensor from Decawave. Multiple stationary nodes (anchors) were installed throughout the workspace. The distance between stationary and moving nodes (tags) can be measured using a two-way-ranging (TWR) scheme. The results show that the uncorrected ranging error from the sensor system can be as large as 1 m. To reduce the ranging error and thus increase positioning accuracy, we present an online correction algorithm using the Kalman filter. Experimental results show that the system can reduce the ranging error down to 5 cm.
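
The sketch below applies a scalar Kalman filter to noisy TWR range readings for a near-static tag, as a minimal illustration of the kind of online correction the abstract describes. The random-walk range model, the noise variances, and the simulated measurements are assumptions; the paper's logarithmic error model and full 2D positioning are not reproduced.

```python
import numpy as np

def kalman_filter_range(measurements, q=1e-4, r=0.01):
    """Scalar Kalman filter for a slowly varying anchor-tag range.

    q : process noise variance (how fast the true range may drift)
    r : measurement noise variance of the UWB two-way-ranging output
    """
    x, p = measurements[0], 1.0           # initial state estimate and variance
    out = []
    for z in measurements:
        p = p + q                         # predict (random-walk range model)
        k = p / (p + r)                   # Kalman gain
        x = x + k * (z - x)               # update with the new range reading
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)

# Example: true range 4.00 m, raw TWR readings with ~10 cm std noise
rng = np.random.default_rng(3)
raw = 4.0 + rng.normal(0.0, 0.10, 200)
filtered = kalman_filter_range(raw)
print(f"raw std: {raw.std():.3f} m, filtered std (last 100): {filtered[-100:].std():.4f} m")
```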

Keywords: Indoor positioning, ultra-wideband, error correction, Kalman filter.

PDF Downloads: 540
5062 Arriving at an Optimum Value of Tolerance Factor for Compressing Medical Images

Authors: Sumathi Poobal, G. Ravindran

Abstract:

Medical imaging takes advantage of digital technology in imaging and teleradiology. In teleradiology systems, a large amount of data is acquired, stored and transmitted. A major technology that may help to solve the problems associated with massive data storage and data transfer capacity is data compression and decompression. There are many methods of image compression available, classified as lossless and lossy compression methods. In lossy compression methods, the decompressed image contains some distortion. Fractal image compression (FIC) is a lossy compression method in which an image is coded as a set of contractive transformations in a complete metric space; this set of contractive transformations is guaranteed to produce an approximation to the original image. In this paper, FIC is achieved by partitioned iterated function systems (PIFS) using quadtree partitioning. PIFS is applied to different modalities of images: ultrasound, CT scan, angiogram, X-ray, and mammogram. For each modality, approximately twenty images are considered, and the average values of the compression ratio and PSNR are computed. In this method of fractal encoding, the tolerance factor Tmax is varied from 1 to 10, keeping the other standard parameters constant. For all modalities of images, the compression ratio and Peak Signal to Noise Ratio (PSNR) are computed and studied. The quality of the decompressed image is assessed by PSNR values. From the results, it is observed that the compression ratio increases with the tolerance factor, and mammograms have the highest compression ratio. The quality of the image is not degraded up to an optimum tolerance factor value of Tmax = 8, because of the properties of fractal compression.
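
The quadtree partitioning stage can be illustrated with the short sketch below, which splits a block whenever its RMS deviation from the block mean exceeds a tolerance. The split criterion, the minimum block size, and the test image are illustrative assumptions; the fractal range-domain matching and the exact role of Tmax in the paper's encoder are not reproduced.

```python
import numpy as np

def quadtree_partition(img, tol, min_size=4):
    """Recursively split a square image block into four quadrants until the
    block's RMS deviation from its mean is within `tol` (or the block reaches
    `min_size`). Returns a list of (row, col, size) blocks."""
    blocks = []

    def split(r, c, size):
        block = img[r:r + size, c:c + size]
        rms = np.sqrt(np.mean((block - block.mean()) ** 2))
        if rms <= tol or size <= min_size:
            blocks.append((r, c, size))
        else:
            half = size // 2
            for dr in (0, half):
                for dc in (0, half):
                    split(r + dr, c + dc, half)

    split(0, 0, img.shape[0])
    return blocks

# A larger tolerance keeps larger blocks, i.e. fewer blocks and more compression
img = np.add.outer(np.arange(64.0), np.arange(64.0))   # smooth ramp test image
for tmax in (2, 8):
    print(f"Tmax={tmax}: {len(quadtree_partition(img, tmax))} blocks")
```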

Keywords: Fractal image compression, IFS, PIFS, PSNR, Quadtree partitioning.

PDF Downloads: 1744
5061 Face Detection using Variance based Haar-Like feature and SVM

Authors: Cuong Nguyen Khac, Ju H. Park, Ho-Youl Jung

Abstract:

This paper proposes a new approach to the problem of real-time face detection. The proposed method combines the primitive Haar-Like feature and the variance value to construct a new feature, the so-called Variance based Haar-Like feature. A face in an image can be represented with a small number of features using this new feature. We used an SVM instead of AdaBoost for training and classification. We built a database containing 5,000 face samples and 10,000 non-face samples extracted from real images for learning purposes. The 5,000 face samples include many images captured under widely varying lighting conditions. Experiments showed that a face detection system using the Variance based Haar-Like feature and SVM can be much more efficient than a face detection system using the primitive Haar-Like feature and AdaBoost. We tested our method on two face databases and one non-face database. We obtained a 96.17% correct detection rate on the YaleB face database, which is 4.21% higher than that obtained using the primitive Haar-Like feature and AdaBoost.
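
The sketch below shows one plausible reading of a variance-normalized Haar-like feature: a classic two-rectangle feature computed with integral images and divided by the window's standard deviation. The exact way the paper combines the Haar-like response with the variance value may differ, so the formula, window, and test values here are illustrative assumptions.

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero row/column padding for easy lookups."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, r, c, h, w):
    return ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c]

def variance_haar_feature(img, r, c, h, w):
    """Two-rectangle (top minus bottom) Haar-like feature over an h x w window,
    normalized by the window's standard deviation. Illustrative combination."""
    ii = integral_image(img)
    ii2 = integral_image(img.astype(float) ** 2)
    n = h * w
    mean = rect_sum(ii, r, c, h, w) / n
    var = rect_sum(ii2, r, c, h, w) / n - mean ** 2
    std = np.sqrt(max(var, 1e-12))
    top = rect_sum(ii, r, c, h // 2, w)
    bottom = rect_sum(ii, r + h // 2, c, h - h // 2, w)
    return (top - bottom) / (n * std)

window = np.vstack([np.full((12, 24), 40.0), np.full((12, 24), 200.0)])
print(variance_haar_feature(window, 0, 0, 24, 24))
```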

Keywords: AdaBoost, Haar-Like feature, SVM, variance, Variance based Haar-Like feature.

PDF Downloads: 3740
5060 A Real Time Collision Avoidance Algorithm for Mobile Robot based on Elastic Force

Authors: Kyung Hyun, Choi, Minh Ngoc, Nong, M. Asif Ali, Rehmani

Abstract:

This paper proposes a modified Elastic Strip method for a mobile robot to avoid obstacles in real time in an uncertain environment. The method deals with the problem of a robot driving from an initial position to a target position based on elastic force and potential field force. To avoid obstacles, the robot has to modify its trajectory based on the signals received from the sensor system at each sampling time. It was evident that, by combining the modified Elastic Strip with a pseudomedian filter to process the nonlinear sensor data, uncertainties in the data received from the sensor system can be reduced. Simulations and experiments of these methods were carried out.
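
As a minimal illustration of the potential-field component that the trajectory-modification step combines with the elastic force, the sketch below computes the classic attractive-plus-repulsive force at one sampling step. The gains, influence radius, and the obstacle and goal positions are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def potential_field_force(pos, goal, obstacles, k_att=1.0, k_rep=5.0, d0=2.0):
    """Classic artificial potential field: an attractive force pulls the robot
    toward the goal, and a repulsive force pushes it away from any obstacle
    lying inside the influence radius d0."""
    pos, goal = np.asarray(pos, float), np.asarray(goal, float)
    f = k_att * (goal - pos)                         # attractive component
    for obs in obstacles:
        diff = pos - np.asarray(obs, float)
        d = np.linalg.norm(diff)
        if 1e-9 < d < d0:                            # inside influence region
            f += k_rep * (1.0 / d - 1.0 / d0) / d**3 * diff
    return f

# One sampling step: move a little along the combined force direction
pos, goal = [0.0, 0.0], [10.0, 0.0]
obstacles = [[1.0, 0.5]]
force = potential_field_force(pos, goal, obstacles)
step = 0.1 * force / np.linalg.norm(force)
print("force:", force, "-> next position:", np.asarray(pos) + step)
```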

Keywords: Collision avoidance, Avoidance obstacle, Elastic Strip, Real time collision avoidance.

PDF Downloads: 2014