Search results for: Quick reduct.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 145

145 A New Hybrid K-Mean-Quick Reduct Algorithm for Gene Selection

Authors: E. N. Sathishkumar, K. Thangavel, T. Chandrasekhar

Abstract:

Feature selection is the process of selecting the most informative features, and it is one of the important steps in knowledge discovery. The problem is that not all genes in gene expression data are important: some genes may be redundant, and others may be irrelevant and noisy. Here, a novel Hybrid K-Mean-Quick Reduct (KMQR) algorithm is proposed for gene selection from gene expression data. In this study, the entire dataset is divided into clusters of similar genes by applying the K-Means algorithm. The highly class-discriminative genes are then selected, based on their degree of dependency, by applying the Quick Reduct algorithm to each cluster. The Average Correlation Value (ACV) is calculated for the selected genes, and clusters with an ACV of 1 are identified as significant clusters, whose classification accuracy is equal to or higher than that of the entire dataset. The proposed algorithm is evaluated and compared using WEKA classifiers, and the proposed work achieves high classification accuracy.
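
A minimal sketch, in Python, of the KMQR pipeline as the abstract describes it: K-Means partitions the genes, then a greedy Quick Reduct selects genes within each cluster by rough-set dependency. The discretization, toy data and helper names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

def dependency(table, attrs, decision):
    """Rough-set dependency degree of the decision on the attribute subset."""
    if not attrs:
        return 0.0
    keys = [tuple(row) for row in table[:, attrs]]
    pos = 0
    for k in set(keys):
        idx = [i for i, key in enumerate(keys) if key == k]
        if len(set(decision[idx])) == 1:   # equivalence class consistent with decision
            pos += len(idx)
    return pos / len(table)

def quick_reduct(table, decision):
    """Greedy Quick Reduct: repeatedly add the attribute that raises dependency most."""
    all_attrs = list(range(table.shape[1]))
    reduct, full = [], dependency(table, all_attrs, decision)
    while dependency(table, reduct, decision) < full:
        best = max((a for a in all_attrs if a not in reduct),
                   key=lambda a: dependency(table, reduct + [a], decision))
        reduct.append(best)
    return reduct

rng = np.random.default_rng(0)
expr = rng.integers(0, 3, size=(60, 10))   # 60 genes x 10 samples, already discretized
labels = rng.integers(0, 2, size=10)       # class label of each sample
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(expr)
for c in range(4):                         # Quick Reduct inside each K-Means cluster
    genes = np.where(clusters == c)[0]
    table = expr[genes].T                  # decision table: samples x cluster genes
    print(f"cluster {c}: selected genes {genes[quick_reduct(table, labels)]}")
```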

Keywords: Clustering, Gene Selection, K-Mean-Quick Reduct, Rough Sets.

144 Pruning Algorithm for the Minimum Rule Reduct Generation

Authors: Şahin Emrah Amrahov, Fatih Aybar, Serhat Doğan

Abstract:

In this paper, we consider the rule reduct generation problem. The Rule Reduct Generation (RG) and Modified Rule Generation (MRG) algorithms, which are used to solve this problem, are well known. As an alternative to these algorithms, we develop the Pruning Rule Generation (PRG) algorithm and compare it with RG and MRG.

Keywords: Rough sets, Decision rules, Rule induction, Classification.

143 Simultaneous Clustering and Feature Selection Method for Gene Expression Data

Authors: T. Chandrasekhar, K. Thangavel, E. N. Sathishkumar

Abstract:

Microarrays have made it possible to simultaneously monitor the expression profiles of thousands of genes under various experimental conditions, and to identify the co-expressed genes in specific cells or tissues that are actively used to make proteins. Analyzing gene expression in this way is an important task in bioinformatics research, and cluster analysis of gene expression data has proved to be a useful tool for identifying co-expressed genes and biologically relevant groupings of genes and samples. In this work, the K-Means algorithm has been applied for clustering gene expression data. Further, the rough-set-based Quick Reduct algorithm has been applied to each cluster in order to select the most similar genes having high correlation. The ACV measure is then used to evaluate the refined clusters, and classification is used to evaluate the proposed method, which identifies compact clusters of selected genes.
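
As a companion to the abstract, here is a sketch of the ACV computation on a cluster. It follows one common formulation from the biclustering literature (maximum of the mean absolute row/column Pearson correlations, diagonal excluded); the authors' exact variant may differ.

```python
import numpy as np

def acv(matrix):
    """Average Correlation Value of a cluster (rows = genes, columns = samples)."""
    def mean_abs_corr(m):
        r = np.abs(np.corrcoef(m))            # pairwise Pearson correlations
        n = r.shape[0]
        return (r.sum() - n) / (n * n - n)    # drop the diagonal of ones
    return max(mean_abs_corr(matrix), mean_abs_corr(matrix.T))

# perfectly correlated rows give ACV = 1, the "significant cluster" criterion
cluster = np.array([[1.0, 2.0, 3.0], [2.0, 4.0, 6.0], [0.5, 1.0, 1.5]])
print(round(acv(cluster), 3))                 # -> 1.0
```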

Keywords: Clustering, Feature selection, Gene expression data, Quick reduct.

142 Survey on the Possibility of Post-Earthquake Quick Inspection of Damaged Building by Ordinary People Using the European Macro-Seismic Scale 1998 (EMS-98)

Authors: Douangmala Kousnana, Toru Takahashi

Abstract:

In recent years, natural disasters have occurred frequently around the world. After a strong earthquake, secondary disasters due to tsunami, strong aftershocks or heavy snow can occur. To prevent such secondary disasters and to save lives, quick inspection of damaged buildings is necessary. This paper investigates the possibility of post-earthquake quick inspection of damaged buildings by ordinary people using the European Macro-Seismic Scale 1998 (EMS-98).

Keywords: Quick Assessment, EMS-98, Ordinary People, Post-Earthquake

141 “Magnetic Cleansing” for the Provision of a ‘Quick Clean’ to Oiled Wildlife

Authors: Lawrence N. Ngeh, John D. Orbell, Stephen W. Bigger, Kasup Munaweera, Peter Dann

Abstract:

This research is part of a broad program aimed at advancing the science and technology involved in the rescue and rehabilitation of oiled wildlife. One aspect of this research involves the use of oil-sequestering magnetic particles for the removal of contaminants from plumage – so-called “magnetic cleansing”. This treatment offers a number of advantages over conventional detergent-based methods, including portability, which offers the possibility of providing a “quick clean” to the animal upon first encounter in the field. This could be particularly advantageous when the contaminant is toxic and/or corrosive, and/or where there is a delay in transporting the victim to a treatment centre. The method could also be useful as part of a stabilization protocol when large numbers of affected animals are awaiting treatment. This presentation describes the design, development and testing of a prototype field kit for providing a “quick clean” to contaminated wildlife in the field.

Keywords: Magnetic Particles, Oiled Wildlife, Quick Clean, Wildlife Rehabilitation.

140 Time and Cost Efficiency Analysis of Quick Die Change System on Metal Stamping Industry

Authors: Rudi Kurniawan Arief

Abstract:

Manufacturing cost and setup time are key improvement targets in the metal stamping industry, because material and component prices keep rising while customers require component prices to be cut year by year. The Single Minute Exchange of Die (SMED) methodology is one of many methods for reducing waste in the stamping industry, and the Japanese Quick Die Change (QDC) die system is one SMED approach that can reduce both setup time and manufacturing cost. However, this system is rarely used in stamping industries. This paper analyzes how far the QDC die system can reduce setup time and manufacturing cost. The research is conducted by direct observation and by simulating and comparing the QDC die system with a conventional die system. We found that the QDC die system can save up to 35% of manufacturing cost and reduce setup time by 70%. The simulation shows that the QDC die system is effective for cost reduction, but it must be applied across several parallel production processes.
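
For a feel of what the reported figures mean, here is a toy back-of-the-envelope calculation. The baseline numbers are hypothetical assumptions; only the 35%/70% savings come from the abstract.

```python
# assumed baseline figures for one stamping line
baseline_setup_min = 90.0        # conventional die-change time per setup (assumed)
baseline_cost_per_part = 0.40    # manufacturing cost per part in USD (assumed)
setups_per_day, parts_per_day = 4, 20_000

qdc_setup_min = baseline_setup_min * (1 - 0.70)      # 70% setup-time reduction
qdc_cost_per_part = baseline_cost_per_part * (1 - 0.35)  # 35% cost saving
print(f"setup time per day: {setups_per_day * baseline_setup_min:.0f} -> "
      f"{setups_per_day * qdc_setup_min:.0f} min")
print(f"cost per day: {parts_per_day * baseline_cost_per_part:,.0f} -> "
      f"{parts_per_day * qdc_cost_per_part:,.0f} USD")
```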

Keywords: Press die, metal stamping, quick die change, QDC system, single minute exchange die, manufacturing cost saving, SMED.

139 Approximation Incremental Training Algorithm Based on a Changeable Training Set

Authors: Yi-Fan Zhu, Wei Zhang, Xuan Zhou, Qun Li, Yong-Lin Lei

Abstract:

Quick training algorithms and the accurate solution procedure for incremental learning aim to improve the efficiency of training support vector regression (SVR), but each has disadvantages: the former do not converge on a changeable training set, and the latter is inefficient for a massive dataset. To handle these problems, a new training algorithm for a changeable training set, named the Approximation Incremental Training Algorithm (AITA), is proposed. This paper explores the reason for the nonconvergence theoretically, discusses the realization of AITA, and finally demonstrates the benefits of AITA in both precision and efficiency.

Keywords: support vector regression, incremental learning, changeable training set, quick training algorithm, accurate solution procedure

138 Quick Similarity Measurement of Binary Images via Probabilistic Pixel Mapping

Authors: Adnan A. Y. Mustafa

Abstract:

In this paper we present a quick technique to measure the similarity between binary images. The technique is based on a probabilistic mapping approach and is fast because only a minute percentage of the image pixels need to be compared to measure the similarity, and not the whole image. We exploit the power of the Probabilistic Matching Model for Binary Images (PMMBI) to arrive at an estimate of the similarity. We show that the estimate is a good approximation of the actual value, and the quality of the estimate can be improved further with increased image mappings. Furthermore, the technique is image size invariant; the similarity between big images can be measured as fast as that for small images. Examples of trials conducted on real images are presented.
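
A minimal sketch of the core idea: estimate similarity from a small random sample of mapped pixel positions. The uniform sampling and sample size here are assumptions standing in for PMMBI's actual mapping scheme.

```python
import numpy as np

def estimated_similarity(a, b, n_samples=500, rng=np.random.default_rng(0)):
    """Fraction of sampled (normalized) positions where the two images agree."""
    # sample positions in relative coordinates, so the two image sizes may differ
    u, v = rng.random(n_samples), rng.random(n_samples)
    ya, xa = (u * a.shape[0]).astype(int), (v * a.shape[1]).astype(int)
    yb, xb = (u * b.shape[0]).astype(int), (v * b.shape[1]).astype(int)
    return np.mean(a[ya, xa] == b[yb, xb])

rng = np.random.default_rng(1)
img = rng.integers(0, 2, size=(512, 512), dtype=np.uint8)
noisy = img ^ (rng.random(img.shape) < 0.10)   # flip ~10% of the pixels
print(estimated_similarity(img, noisy))        # ~0.90, from 500 of 262144 pixels
```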

Keywords: Big images, binary images, similarity, matching.

137 An Efficient and Generic Hybrid Framework for High Dimensional Data Clustering

Authors: Dharmveer Singh Rajput, P. K. Singh, Mahua Bhattacharya

Abstract:

Clustering in high-dimensional space is a difficult problem which is recurrent in many fields of science and engineering, e.g., bioinformatics, image processing, pattern recognition and data mining. In high-dimensional space some of the dimensions are likely to be irrelevant, thus hiding the possible clustering. In very high dimensions it is common for all the objects in a dataset to be nearly equidistant from each other, completely masking the clusters; hence, the performance of the clustering algorithm decreases. In this paper, we propose an algorithmic framework which combines the reduct concept of rough set theory with the k-means algorithm to remove the irrelevant dimensions in a high-dimensional space and obtain appropriate clusters. Our experiment on test data shows that this framework increases the efficiency of the clustering process and the accuracy of the results.
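
To make the borrowed rough-set machinery concrete, here is a small sketch of a discernibility matrix (mentioned in the keywords) and the core attributes it implies; the toy decision table is hypothetical.

```python
import numpy as np
from itertools import combinations

def discernibility_matrix(table, decision):
    """Entry (i, j): attributes that differ, for object pairs with different decisions."""
    mat = {}
    for i, j in combinations(range(len(table)), 2):
        if decision[i] != decision[j]:
            mat[(i, j)] = {a for a in range(table.shape[1]) if table[i, a] != table[j, a]}
    return mat

def core_attributes(mat):
    """Attributes appearing as singleton entries are indispensable (the core)."""
    return {next(iter(e)) for e in mat.values() if len(e) == 1}

table = np.array([[1, 0, 2], [1, 1, 2], [1, 1, 0], [0, 0, 0]])  # toy decision table
decision = np.array([0, 0, 1, 1])
print(core_attributes(discernibility_matrix(table, decision)))  # -> {2}
```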

Keywords: High dimensional clustering, sub-space, k-means, rough set, discernibility matrix.

136 Challenges to Enable Quick Start of an Environmental Monitoring with Wireless Sensor Network Technology

Authors: Masaki Ito, Hideyuki Tokuda, Takao Kawamura, Kazunori Sugahara

Abstract:

With the advancement of wireless sensor network technology, its practical utilization is becoming an important challenge. This paper gives an overview of our past environmental monitoring project and discusses the process of starting such monitoring by classifying it into four steps. The steps to start environmental monitoring can be complicated, but they are not well discussed by researchers of wireless sensor network technology. This paper describes our activities and the challenges in each of the four steps to ease the process, and discusses future challenges in enabling a quick start of environmental monitoring.

Keywords: Environmental Monitoring, Wireless Sensor Network, Field Experiment and Research Challenges.

135 Quick Spatial Assessment of Drought Information Derived from MODIS Imagery Using Amplitude Analysis

Authors: Meng-Lung Lin, Qiubing Wang, Fujun Sun, Tzu-How Chu, Yi-Shiang Shiu

Abstract:

The normalized difference vegetation index (NDVI) and normalized difference moisture index (NDMI) derived from the moderate resolution imaging spectroradiometer (MODIS) have been widely used to identify spatial information on drought conditions. The relationship between NDVI and NDMI was analyzed using Pearson correlation analysis and showed a strong positive relationship. The drought indices detected drought conditions and identified the spatial extents of drought. A comparison between a normal year and a drought year demonstrates that amplitude analysis, which considers both vegetation and moisture conditions, is an effective method for identifying drought conditions. We propose that amplitude analysis is useful for the quick spatial assessment of drought information at a regional scale.
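
The two MODIS-derived indices the analysis rests on, with their standard band formulas (NIR = band 2, red = band 1, SWIR = band 6). The final "amplitude" combination shown is only an illustrative assumption about how vegetation and moisture conditions can be fused.

```python
import numpy as np

def ndvi(nir, red):
    return (nir - red) / (nir + red)     # standard NDVI formula

def ndmi(nir, swir):
    return (nir - swir) / (nir + swir)   # standard NDMI formula

nir, red, swir = 0.45, 0.08, 0.30        # sample surface reflectances
v, m = ndvi(nir, red), ndmi(nir, swir)
amplitude = np.hypot(v, m)               # one assumed way to fuse both indices
print(f"NDVI={v:.2f}  NDMI={m:.2f}  amplitude={amplitude:.2f}")
```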

Keywords: NDVI, NDMI, drought, remote sensing, spatial assessment.

134 Data Quality Enhancement with String Length Distribution

Authors: Qi Xiu, Hiromu Hota, Yohsuke Ishii, Takuya Oda

Abstract:

Recently, the amount of collectable manufacturing data has been increasing rapidly. At the same time, mega recalls are becoming a serious social problem. Under such circumstances, there is an increasing need to prevent mega recalls through defect analysis, such as root cause analysis and anomaly detection utilizing manufacturing data. However, the time taken to classify the strings in manufacturing data by traditional methods is too long to meet the requirements of quick defect analysis. Therefore, we present the String Length Distribution Classification (SLDC) method, which classifies strings correctly in a short time. This method learns character features, especially the string length distribution, from Product IDs and Machine IDs in BOMs and asset lists. By applying the proposal to strings in actual manufacturing data, we verified that the classification time for strings can be reduced by 80%. As a result, it can be estimated that the requirement of quick defect analysis can be fulfilled.
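
A minimal sketch of the core SLDC idea: classify strings by per-class length distributions. The add-one smoothing and the toy ID classes are assumptions, not the authors' implementation.

```python
from collections import Counter, defaultdict

class LengthDistributionClassifier:
    def fit(self, strings, labels):
        self.dists = defaultdict(Counter)     # per-class histogram of string lengths
        for s, y in zip(strings, labels):
            self.dists[y][len(s)] += 1
        return self

    def predict(self, s):
        # pick the class under which this string's length is most probable
        def likelihood(y):
            c = self.dists[y]
            return (c[len(s)] + 1) / (sum(c.values()) + 1)   # add-one smoothing
        return max(self.dists, key=likelihood)

train = ["PRD-000123", "PRD-004567", "M01", "M17", "PRD-9", "M02"]
labels = ["product_id", "product_id", "machine_id", "machine_id",
          "product_id", "machine_id"]
clf = LengthDistributionClassifier().fit(train, labels)
print(clf.predict("PRD-112233"), clf.predict("M99"))   # -> product_id machine_id
```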

Keywords: Data quality, feature selection, probability distribution, string classification, string length.

133 Low Sulfur Diesel Like Fuel Oil from Quick Remediation Process of Waste Oil Sludge

Authors: Isam A. H. Al Zubaidi

Abstract:

Low-sulfur diesel-like fuel oil was produced by a quick remediation process from waste oil sludge (WOS). This quick process reduces the volume of WOS in petroleum refineries and oil fields by converting the waste into a more beneficial product. The practice involves mixing WOS with commercial diesel fuel: different ratios of WOS to diesel fuel were prepared, ranging from 1:1 to 20:1 by mass. The mixture was continuously mixed for 10 minutes using a bench-type overhead stirrer, followed by filtration to separate the soil waste from the filtrate oil product. The quantity and physical properties of the oil filtrate were measured. It was found that up to 15% WOS could be added to diesel fuel without dramatic changes to the properties of the diesel fuel. The mass of WOS was decreased by about 60%; that is, about 60% of the mass of the sludge was recovered as light fuel oil. The physical properties of the fuel resulting from a 10% sludge mixing ratio showed that the specific gravity, ash content, carbon residue, asphaltene content, viscosity, diesel index, cetane number, and calorific value were only slightly affected. The color changed to light black, and the sulfur content increased, requiring a further process to reduce it to acceptable limits. Desulfurization was carried out by adsorption on activated biomaterial: adsorption on ZnCl2-activated date palm kernel powder was effective in improving the physical properties of the diesel-like fuel, and the final sulfur content was reduced to 0.185 wt%. This diesel-like fuel can be used in tractors, buses and trucks inside and outside refineries. The remaining solid appears smooth and can be mixed into asphalt mixtures for road paving, or used with other materials as an asphalt coating for constructed buildings. Through this process, valuable fuel is recovered and the amount of waste material is decreased.

Keywords: Oil sludge, diesel fuel, blending process, filtration process.

132 Quick Sequential Search Algorithm Used to Decode High-Frequency Matrices

Authors: Mohammed M. Siddeq, Mohammed H. Rasheed, Omar M. Salih, Marcos A. Rodrigues

Abstract:

This research proposes a data encoding and decoding method based on the Matrix Minimization algorithm, which is applied to high-frequency coefficients for compression/encoding. The algorithm starts by converting every three coefficients into a single value, based on three different keys. The decoding/decompression uses a search method called the Quick Sequential Search (QSS) decoding algorithm, presented in this research and based on sequential search, to recover the exact coefficients. In the next step, the decoded data are saved in an auxiliary array. The basic idea behind the auxiliary array is to save all possible decoded coefficients, so that another algorithm, such as conventional sequential search, can retrieve the encoded/compressed data independently of the proposed algorithm. The experimental results showed that our proposed decoding algorithm retrieves the original data faster than conventional sequential search algorithms.
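
A sketch of the triple-to-scalar encoding the abstract describes, with a plain sequential-search decoder as the baseline that QSS accelerates. The key values and the bounded coefficient range are illustrative assumptions.

```python
from itertools import product

KEYS = (1.0, 0.1, 0.01)     # three assumed keys
LO, HI = -5, 5              # assumed range of quantized high-frequency coefficients

def encode(triple):
    a, b, c = triple
    return KEYS[0] * a + KEYS[1] * b + KEYS[2] * c

def decode_sequential(value, eps=1e-9):
    # exhaustively try every (a, b, c) until the encoded value matches
    for cand in product(range(LO, HI + 1), repeat=3):
        if abs(encode(cand) - value) < eps:
            return cand
    return None

coeffs = (3, -4, 2)
v = encode(coeffs)                          # 3 - 0.4 + 0.02 = 2.62
print(round(v, 2), decode_sequential(v))    # -> 2.62 (3, -4, 2)
```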

Keywords: Matrix Minimization Algorithm, Decoding Sequential Search Algorithm, image compression, Discrete Cosine Transform, Discrete Wavelet Transform.

131 The Feasibility of Augmenting an Augmented Reality Image Card on a Quick Response Code

Authors: Alfred Chen, Shr Yu Lu, Cong Seng Hong, Yur-June Wang

Abstract:

This research studies the feasibility of augmenting an augmented reality (AR) image card on a Quick Response (QR) code. The authors have developed a new visual tag, which contains a QR code and an augmented AR image card, and which supports reading both the revealed data of the QR code and the instant data from the AR image card. A handheld communicating device is used to read and decode the new visual tag, so that the concealed data of the new visual tag can be revealed and read on its visual display. In general, the QR code is designed to store the corresponding data or, as a key, to access the corresponding data from a server through the internet; the data revealed from the QR code are represented as text. The AR image card, in contrast, is normally designed to store the corresponding data in 3D, animation or video form. By using the QR code's high fault tolerance, the new visual tag can access these two different types of data with a handheld communicating device, and it has the advantage of carrying much more data than an independent QR code or AR image card. The major findings of this research are: 1) the most efficient area for the AR image card augmented on the QR code is a 9% coverage area out of the total area of the new visual tag, and 2) the best location for the augmented AR image card on the QR code is the bottom-right corner of the new visual tag.

Keywords: Augmented reality, QR code, Visual tag, Handheld communicating device

130 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm

Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn

Abstract:

Feature selection and attribute reduction are crucial problems and widely used techniques in the fields of machine learning, data mining and pattern recognition, used to overcome the well-known phenomenon of the curse of dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we have employed the fuzzy-rough dependency degree (FRDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS) due to its effectiveness in feature selection. As the search strategy, a modified binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRDD. Nine classifiers have been employed to compare the proposed approach with several existing methods over twenty-two datasets from the UCI repository, including nine high-dimensional and large ones. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and the classification accuracy.
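
A compact sketch of a binary shuffled frog leaping search over feature subsets. The fitness function is a toy surrogate standing in for the fuzzy-rough dependency degree, and the memeplex handling is simplified for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
N_FEATURES, N_FROGS, N_MEMEPLEXES, ITERS = 12, 20, 4, 30
RELEVANT = {0, 3, 7}                    # hypothetical informative features

def fitness(frog):
    # reward covering the relevant features, penalize subset size (toy surrogate)
    hits = len(RELEVANT & set(np.flatnonzero(frog)))
    return hits - 0.05 * frog.sum()

frogs = rng.integers(0, 2, size=(N_FROGS, N_FEATURES))
for _ in range(ITERS):
    order = np.argsort([-fitness(f) for f in frogs])    # rank frogs, best first
    frogs = frogs[order]
    for m in range(N_MEMEPLEXES):
        plex = list(range(m, N_FROGS, N_MEMEPLEXES))    # interleaved memeplexes
        best, worst = plex[0], plex[-1]
        # worst frog copies each bit of the memeplex best with some probability
        mask = rng.random(N_FEATURES) < 0.5
        candidate = np.where(mask, frogs[best], frogs[worst])
        if fitness(candidate) > fitness(frogs[worst]):
            frogs[worst] = candidate
        else:                                           # otherwise jump randomly
            frogs[worst] = rng.integers(0, 2, N_FEATURES)

best = max(frogs, key=fitness)
print("selected features:", np.flatnonzero(best))
```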

Keywords: Binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct.

129 A Quick Prediction for Shear Behaviour of RC Membrane Elements by Fixed-Angle Softened Truss Model with Tension-Stiffening

Authors: X. Wang, J. S. Kuang

Abstract:

The Fixed-angle Softened Truss Model with Tension-stiffening (FASTMT) has superior performance in predicting the shear behaviour of reinforced concrete (RC) membrane elements, especially the post-cracking behaviour. Nevertheless, massive computational work is inevitable due to the multiple transcendental equations involved in the stress-strain relationship. In this paper, an iterative root-finding technique is introduced into FASTMT to quickly solve the transcendental equations of the tension-stiffening effect of RC membrane elements. This fast FASTMT, implemented in MATLAB, uses the bisection method to calculate the tensile stress of the membranes. With this simplification, the elapsed time of each loop is reduced significantly and the transcendental equations can be solved accurately. Owing to its high efficiency and good accuracy compared with FASTMT, the fast FASTMT can be further applied to the quick prediction of the shear behaviour of complex large-scale RC structures.
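
A minimal sketch of the bisection step at the heart of the speed-up, shown here in Python on a stand-in transcendental equation rather than FASTMT's tension-stiffening relations.

```python
import math

def bisect(f, lo, hi, tol=1e-10):
    """Find a root of f in [lo, hi] by repeatedly halving a bracketing interval."""
    assert f(lo) * f(hi) < 0, "root must be bracketed"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid            # root lies in the left half
        else:
            lo = mid            # root lies in the right half
    return 0.5 * (lo + hi)

# e.g. solve x = cos(x), a classic transcendental equation
root = bisect(lambda x: x - math.cos(x), 0.0, 1.0)
print(round(root, 6))           # -> 0.739085
```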

Keywords: Bisection method, fixed-angle softened truss model with tension-stiffening, iterative root-finding technique, reinforced concrete membrane.

128 Attribute Analysis of Quick Response Code Payment Users Using Discriminant Non-negative Matrix Factorization

Authors: Hironori Karachi, Haruka Yamashita

Abstract:

Recently, quick response (QR) code payment systems have become popular. Many companies have introduced new QR code payment services, and these services compete with each other to increase their number of users. To increase the number of users, it is necessary to grasp how the demographic information, usage information, and value of users differ between services. In this study, we analyze real-world data provided by Nomura Research Institute, including demographic data of users and usage information for two services, LINE Pay and PayPay. Non-negative Matrix Factorization (NMF) is widely used for analyzing such data and interpreting their features; however, the target data suffer from missing values. EM-algorithm NMF (EMNMF) completes the unknown values in order to understand the features of data presented in matrix form. Moreover, for comparing the results of NMF analyses of two matrices, Discriminant NMF (DNMF) shows the differences in user features between the two matrices. In this study, we combine EMNMF and DNMF to analyze the target data and, as the interpretation, we show the differences in user features between LINE Pay and PayPay.
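
A sketch of NMF with missing entries handled in an EM-like fashion: multiplicative updates are restricted to observed cells, and the reconstruction fills the gaps. This is a generic formulation, not the authors' exact EMNMF/DNMF combination.

```python
import numpy as np

def masked_nmf(X, mask, rank=2, iters=500, rng=np.random.default_rng(0)):
    """Factor X ~ W @ H using only the observed entries indicated by mask."""
    n, m = X.shape
    W, H = rng.random((n, rank)), rng.random((rank, m))
    Xf = np.where(mask, X, 0.0)
    for _ in range(iters):
        R = mask * (W @ H)                      # model, restricted to observed cells
        W *= (Xf @ H.T) / (R @ H.T + 1e-12)     # multiplicative updates
        H *= (W.T @ Xf) / (W.T @ R + 1e-12)
    return W, H

X = np.array([[5.0, 3.0, 0.0], [4.0, 0.0, 1.0], [1.0, 1.0, 5.0]])
mask = np.array([[1, 1, 0], [1, 0, 1], [1, 1, 1]], dtype=bool)   # 0 = missing
W, H = masked_nmf(X, mask)
print(np.round(W @ H, 2))   # observed entries fit, missing entries imputed
```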

Keywords: Data science, non-negative matrix factorization, missing data, quality of services.

127 The Distance between a Point and a Bezier Curve on a Bezier Surface

Authors: Wen-Haw Chen, Sheng-Gwo Chen

Abstract:

The distance between two objects is an important problem in CAGD, CAD, CG, etc. This paper presents a simple and quick method to estimate the distance between a point and a Bezier curve on a Bezier surface.
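
A sketch of the obvious quick baseline for this problem: de Casteljau evaluation at sampled parameters followed by a minimum-distance scan. A planar curve is used for brevity, whereas the paper treats curves lying on a Bezier surface.

```python
import numpy as np

def de_casteljau(ctrl, t):
    """Evaluate a Bezier curve at parameter t by repeated linear interpolation."""
    pts = np.array(ctrl, dtype=float)
    while len(pts) > 1:
        pts = (1 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

def distance_to_bezier(point, ctrl, samples=1000):
    ts = np.linspace(0.0, 1.0, samples)
    curve = np.array([de_casteljau(ctrl, t) for t in ts])
    d = np.linalg.norm(curve - point, axis=1)
    i = d.argmin()
    return d[i], ts[i]          # distance and parameter of the nearest sample

ctrl = [(0, 0), (1, 2), (3, 2), (4, 0)]   # cubic Bezier control polygon
dist, t = distance_to_bezier(np.array([2.0, 0.5]), ctrl)
print(f"min distance ~ {dist:.4f} at t ~ {t:.3f}")
```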

Keywords: Geodesic-like curve, distance, projection, Bezier.

126 Integration Methods and Processes of Product Design and Flexible Production for Direct Production within the iCIM 3000 System

Authors: Roman Ružarovský, Radovan Holubek, Daynier Rolando Delgado Sobrino

Abstract:

Production engineering is currently characterized by the integration of industrial automation and robotics, and by a very quick route from design to manufactured product. The production range is continuously changing and expanding, and producers have to be flexible in this regard: they need to offer production capabilities that can respond to quick changes. Engineering product development is supported by CAD software, and such systems are mainly used for product design. To remain competitive, manufacturers should keep their machines capable of responding flexibly to changing output. The response to this problem is the development of flexible manufacturing systems consisting of various automated subsystems. Integrating flexible manufacturing systems and their subunits with product design and engineering is a possible solution to this issue, and such integration is possible through the implementation of CIM systems. This research project and contribution address such a solution: finding a link between CAD and the iCIM 3000 production system from Festo Co. By developing methods and processes of integration, products can be designed in CAD systems and the manufacturing process can be monitored from order to shipping. This will improve support for product design parameters through monitoring of the production process and through the creation of production programs from CAD, and will therefore accelerate the overall process from design to implementation.

Keywords: CAD (Computer Aided Design), CAM (Computer Aided Manufacturing), CIM (Computer Integrated Manufacturing), iCIM 3000, integration, direct production from CAD.

125 Profile Calculation in Water Phantom of Symmetric and Asymmetric Photon Beam

Authors: N. Chegeni, M. J. Tahmasebi Birgani

Abstract:

Nowadays, in most radiotherapy departments, the commercial treatment planning systems (TPS) used to calculate dose distributions need to be verified; therefore, quick, easy-to-use and low-cost dose distribution algorithms are desirable for testing and verifying the performance of the TPS. In this paper, we put forth an analytical method to calculate the phantom scatter contribution and the depth dose on the central axis based on the equivalent square concept. This method was then generalized to calculate the profiles at any depth for several regular or irregular field shapes under symmetric and asymmetric photon beam conditions. Varian 2100 C/D and Siemens Primus Plus linacs with 6 and 18 MV photon beams were used for the irradiations. Percentage depth doses (PDDs) were measured for a large number of square fields for both energies, and for 45° wedges, which were employed to obtain the profiles at any depth. To assess the accuracy of the calculated profiles, several profile measurements were carried out for some treatment fields, and the calculated and measured profiles were compared by gamma-index calculation. All γ-index calculations were based on a 3% dose criterion and a 3 mm distance-to-agreement (DTA) acceptance criterion. The γ values were less than 1 at most points; however, the maximum γ observed was about 1.10, occurring in the penumbra region in most fields and in the central area for the asymmetric fields. This analytical approach provides a generally quick and fairly accurate algorithm for calculating the dose distribution for some treatment fields in conventional radiotherapy.
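
A sketch of the equivalent square concept the method builds on, using the common 4A/P rule (side = 2ab/(a+b)) for rectangular fields; whether the paper uses exactly this rule is an assumption.

```python
def equivalent_square_side(a_cm, b_cm):
    """Side of the equivalent square for an a x b rectangular field (4A/P rule)."""
    area, perimeter = a_cm * b_cm, 2 * (a_cm + b_cm)
    return 4 * area / perimeter          # reduces to 2ab/(a+b)

print(equivalent_square_side(10, 10))            # -> 10.0 (a square maps to itself)
print(round(equivalent_square_side(5, 20), 2))   # -> 8.0, not sqrt(100) = 10
```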

Keywords: Dose distribution, equivalent field, asymmetric field, irregular field.

124 Motion Capture Based Wizard of Oz Technique for Humanoid Robot

Authors: Rafal Stegierski, Krzysztof Dmitruk

Abstract:

The paper focuses on a robotic telepresence system built around a humanoid robot operated with a controller-less Wizard of Oz technique. The proposed solution makes it possible to quickly start acting as an operator with short, if any, initial training.

Keywords: Robotics, Motion Capture, Wizard of Oz, Humanoid Robots, Human Robot Interaction.

123 A Formatting Method for Transforming XML Data into HTML

Authors: Zhe JIN, Motomichi TOYAMA

Abstract:

In this paper, we propose a fixed formatting method for PPX (Pretty Printer for XML). PPX is a query language for XML databases with an extensive formatting capability that produces HTML as the result of a query. The fixed formatting method completely specifies the combination of variables and layout specification operators within the layout expression of the GENERATE clause of PPX. In our experiments, a quick comparison shows that PPX requires far less description than XSLT or XQuery programs performing the same tasks.

Keywords: PPX, XML, HTML, XSLT, XQuery, fixed formatting method.

122 Simulation of an Auto-Tuning Bicycle Suspension Fork with Quick Releasing Valves

Authors: Y. C. Mao, G. S. Chen

Abstract:

A bicycle is not as large as a motorcycle or an automobile, yet it constitutes a complicated dynamic system. People's requirements for comfort, controllability and safety grow higher as research and development technologies improve. The shock absorber affects the vehicle's suspension performance enormously: it takes up the vibration energy and releases it at a suitable time, keeping the wheel in proper contact with the road surface and maintaining the stability of the chassis. Suspension design for mountain bicycles is more difficult than for city bikes, since it encounters dynamic variations in road and loading conditions. Riders need a stiff damper when treading hard on the pedals while climbing, but a soft damper when descending. Various switchable shock absorbers are available on the market; however, riders have to switch them manually among soft, hard and locked positions. This study proposes a novel design for a bicycle shock absorber which provides automatic, smooth tuning of the damping coefficient, from a predetermined lower bound to a theoretically unlimited value. An automatic quick releasing valve is included in this design so that it can release the peak pressure when the suspension fork runs into a square-wave-type obstacle, preventing damage to the chassis and injury to the rider. The design achieves the automatic tuning process through innovative plunger valve and fluidic passage arrangements, without any electronic devices. Theoretical models of the damper and spring are established in this study, design parameters of the valves and fluidic passages are determined, and the relations between the design parameters and the shock absorber's performance are discussed. The analytical results give directions for shock absorber manufacture.

Keywords: Modelling, Simulation, Bicycle, Shock Absorber, Damping, Releasing Valve

121 Rapid Monitoring of Earthquake Damages Using Optical and SAR Data

Authors: Saeid Gharechelou, Ryutaro Tateishi

Abstract:

Earthquakes are inevitable catastrophic natural disasters, and damage to buildings and man-made structures, where most human activities occur, is the major cause of earthquake casualties. A comparison of optical and SAR data is presented for the case of the Kathmandu valley, which was severely shaken by the 2015 Nepal earthquake. Although many existing studies have estimated damage from optical data, or have suggested the combined use of optical and SAR data for improved accuracy, cloud-free optical images are not assured when they are urgently needed. Therefore, this research specializes in developing a SAR-based technique targeting rapid and accurate geospatial reporting. Considering the limited time available in a post-disaster situation, it offers quick computation based exclusively on two pairs of pre-seismic and co-seismic single look complex (SLC) images. Pre-seismic, co-seismic and post-seismic InSAR coherence was used to detect the changes in the damaged area. In addition, ground truth data from the field were applied to the optical data through random forest classification for detection of the damaged area, and the ground truth data collected in the field were also used to assess the accuracy of the supervised classification approach. A higher accuracy was obtained from the optical data than from the optical-SAR integration; however, since cloud-free images are not assured when urgently needed after an earthquake event, further research on improving SAR-based damage detection is suggested. Quick reporting of the post-disaster damage situation, quantified by rapid earthquake assessment, is expected to assist in channelling the rescue and emergency operations and in informing the public about the scale of the damage.
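
A sketch of the coherence-change idea on synthetic data: ground that is coherent in the pre-seismic pair but decorrelates in the co-seismic pair is flagged as likely damage. The thresholds and the synthetic coherence maps are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)
pre_coherence = rng.uniform(0.6, 0.9, shape)   # pre-seismic pair: stable scene
co_coherence = pre_coherence.copy()            # co-seismic pair spans the event
damage_zone = (slice(40, 60), slice(30, 70))
co_coherence[damage_zone] = rng.uniform(0.1, 0.3, (20, 40))  # collapse decorrelates

# coherence difference, flagged where previously stable ground turned incoherent
diff = pre_coherence - co_coherence
damage_mask = (pre_coherence > 0.5) & (diff > 0.3)
print(f"flagged pixels: {damage_mask.sum()} (true damage zone: {20 * 40})")
```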

Keywords: Sentinel-1A data, Landsat-8, earthquake damage, InSAR, rapid monitoring, 2015-Nepal earthquake.

120 Plant Layout Analysis by Computer Simulation for Electronic Manufacturing Service Plant

Authors: Visuwan D., Phruksaphanrat B.

Abstract:

In this research, computer simulation is used for Electronic Manufacturing Service (EMS) plant layout analysis. The current layout of this manufacturing plant is a process layout, which is not suitable given the nature of an EMS: a high-volume, high-variety environment in which quick response and high flexibility are also needed. A cellular manufacturing layout was therefore designed for the selected group of products. Systematic layout planning (SLP) was used to analyze and design the possible cellular layouts for the factory, and the cellular layout was selected based on the main criteria of the plant. Computer simulation was used to analyze and compare the performance of the proposed cellular layout and the current layout. It was found that the proposed cellular layout can deliver better performance than the current layout.

Keywords: Layout, Electronic Manufacturing Service Plant (EMS), Computer Simulation, Cellular Manufacturing System (CMS).

119 A New Approach to Feedback Shift Registers

Authors: Myat Su Mon Win

Abstract:

Pseudorandom number generators based on linear feedback shift registers (LFSRs) are very quick and easy to implement in hardware and software, so they are very popular and widely used. But LFSRs lend themselves to fairly easy cryptanalysis due to their completely linear properties. In this paper, we propose a stochastic generator, called the Random Feedback Shift Register (RFSR), which uses a stochastic transformation (the Random block) with one-way and non-linear properties.
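
A minimal sketch of the LFSR the paper starts from: a 16-bit Fibonacci register with a known maximal-length tap set (taps 16, 14, 13, 11, period 65535). The final comment notes the linearity that the proposed RFSR is designed to break.

```python
def lfsr16(state):
    """Yield one pseudorandom bit per step; state must be a nonzero 16-bit seed."""
    while True:
        # feedback bit = XOR of tap positions 16, 14, 13, 11 (expressed as shifts)
        bit = (state ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
        state = (state >> 1) | (bit << 15)
        yield state & 1

gen = lfsr16(0xACE1)
bits = [next(gen) for _ in range(16)]
print("".join(map(str, bits)))
# the linearity that makes cryptanalysis easy: every output bit is a fixed
# XOR (linear function) of the seed bits, recoverable by Berlekamp-Massey
```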

Keywords: Linear Feedback Shift Register, Non Linearity, R_Block, Random Feedback Shift Register

118 A New Method for Rapid DNA Extraction from Artemia (Branchiopoda, Crustacea)

Authors: R. Manaffar, R. Maleki, S. Zare, N. Agh, S. Soltanian, B. Sehatnia, P. Sorgeloos, P. Bossier, G. Van Stappen

Abstract:

Artemia is one of the most conspicuous invertebrates associated with aquaculture. It can be considered a model organism, offering numerous advantages for comprehensive and multidisciplinary studies using morphological or molecular methods. Since DNA extraction is an important step in any molecular experiment, a new and rapid method of DNA extraction from adult Artemia is described in this study. In addition, the efficiency of this technique was compared with two widely used alternative techniques, namely the Chelex® 100 resin and SDS-chloroform methods. Data analysis revealed that the new method is the easiest and most cost-effective of the methods compared, allowing quick and efficient extraction of DNA from the adult animal.

Keywords: APD, Artemia, DNA extraction, Molecular experiments

117 SCM Challenges and Opportunities in the Timber Construction Sector

Authors: K. Reitner, F. Staberhofer, W. Ortner, M. Gerschberger

Abstract:

The purpose of this paper is to identify the main challenges faced by companies in the timber construction sector and to provide improvement opportunities that can be implemented on a short-, medium- and long-term basis. To identify the challenges and propose actions for each company, a literature review and multiple-case research were conducted using the Quick Scan Audit Methodology. Finally, the findings and outcomes are compared with each other to support companies in the timber construction sector when implementing and restructuring their day-to-day activities.

Keywords: Supply chain management, supply chain challenges and opportunities, timber construction sector.

116 A Design and Implementation Model for Web Caching Using Server “URL Rewriting”

Authors: Mostafa E. Saleh, A. Abdel Nabi, A. Baith Mohamed

Abstract:

In order to make surfing the internet faster, and to avoid redundant processing load with each request for the same web page, many caching techniques have been developed to reduce the latency of retrieving data on the World Wide Web. In this paper, we give a quick overview of existing web caching techniques used for dynamic web pages, and then introduce a design and implementation model that takes advantage of the “URL Rewriting” feature of some popular web servers, e.g. Apache, to provide an effective approach to caching dynamic web pages.
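
A toy sketch in Python of the idea behind rewrite-based caching: dynamic URLs are rewritten to a canonical cache key, and the rendered page is served from a cache keyed on that form. The session-parameter stripping is a hypothetical example; the paper's model relies on the web server's own URL-rewriting engine (e.g. Apache).

```python
from urllib.parse import urlparse, parse_qsl, urlencode

cache: dict[str, str] = {}

def rewrite(url: str) -> str:
    """Canonicalize a dynamic URL: sorted query string, session parameter dropped."""
    p = urlparse(url)
    params = sorted((k, v) for k, v in parse_qsl(p.query) if k != "sessionid")
    return f"{p.path}?{urlencode(params)}"

def handle(url: str, render) -> str:
    key = rewrite(url)
    if key not in cache:            # miss: generate the dynamic page once
        cache[key] = render(url)
    return cache[key]               # hit: skip the dynamic processing entirely

render = lambda u: f"<html>page for {rewrite(u)}</html>"
print(handle("/item?id=7&sessionid=abc", render))
# differently-ordered or session-tagged URLs hit the same cache entry
print(handle("/item?sessionid=xyz&id=7", render) is handle("/item?id=7", render))
```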

Keywords: Web Caching, URL Rewriting, Optimizing Web Performance, Dynamic Web Pages Loading Time.
