Search results for: distance detection
Paper Count: 2351


251 A Robust and Adaptive Unscented Kalman Filter for the Air Fine Alignment of the Strapdown Inertial Navigation System/GPS

Authors: Jian Shi, Baoguo Yu, Haonan Jia, Meng Liu, Ping Huang

Abstract:

To adapt to the flexibility of modern warfare, a large number of guided weapons are launched from aircraft. Therefore, the inertial navigation system loaded in the weapon needs to undergo an alignment process in the air. This article proposes the following methods to address inaccurate modeling of the system under large misalignment angles, the reduction of filtering accuracy caused by outliers, and noise changes in GPS signals: first, considering the large misalignment errors of the Strapdown Inertial Navigation System (SINS)/GPS, a more accurate model is built instead of making a small-angle approximation, and the Unscented Kalman Filter (UKF) algorithm is used to estimate the state; then, taking into account the impact of GPS noise changes on the fine alignment algorithm, an innovation-based adaptive filtering algorithm is introduced to estimate the GPS noise in real time; at the same time, in order to improve the anti-interference ability of the air fine alignment algorithm, a robust filtering algorithm based on outlier detection is combined with the air fine alignment algorithm to improve its robustness. Simulations verify that the algorithm improves alignment accuracy and robustness under interference conditions.
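As a rough illustration of the two filtering ideas mentioned (innovation-based adaptation of the measurement-noise covariance and outlier gating), a minimal sketch is given below; the variable names, window-based estimator and chi-square gate value are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def adaptive_robust_update(innovations, H, P, chi2_gate=16.27):
    """Sketch: estimate the measurement-noise covariance R from a sliding window of
    innovations, and flag the latest measurement as an outlier with a chi-square gate.
    chi2_gate ~ 99.9% chi-square quantile for a 3-dimensional measurement (illustrative)."""
    d = np.asarray(innovations)               # window of innovation vectors, shape (N, m)
    C = d.T @ d / d.shape[0]                  # sample innovation covariance
    R_hat = C - H @ P @ H.T                   # innovation-based estimate of R (clip to PSD in practice)
    S = H @ P @ H.T + R_hat                   # predicted innovation covariance
    nu = d[-1]                                # latest innovation
    m2 = float(nu @ np.linalg.solve(S, nu))   # squared Mahalanobis distance
    return R_hat, m2 > chi2_gate              # adapted R, outlier flag
```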

Keywords: Air alignment, fine alignment, inertial navigation system, integrated navigation system, UKF.

250 Graph Codes-2D Projections of Multimedia Feature Graphs for Fast and Effective Retrieval

Authors: Stefan Wagenpfeil, Felix Engel, Paul McKevitt, Matthias Hemmje

Abstract:

Multimedia Indexing and Retrieval is generally designed and implemented by employing feature graphs. These graphs typically contain a significant number of nodes and edges to reflect the level of detail in feature detection. A higher level of detail increases the effectiveness of the results but also leads to more complex graph structures. However, graph-traversal-based algorithms for similarity are quite inefficient and computation intensive, especially for large data structures. To deliver fast and effective retrieval, an efficient similarity algorithm, particularly for large graphs, is mandatory. Hence, in this paper, we define a graph-projection into a 2D space (Graph Code) as well as the corresponding algorithms for indexing and retrieval. We show that calculations in this space can be performed more efficiently than graph-traversals due to a simpler processing model and a high level of parallelisation. In consequence, we prove that the effectiveness of retrieval also increases substantially, as Graph Codes facilitate more levels of detail in feature fusion. Thus, Graph Codes provide a significant increase in efficiency and effectiveness (especially for Multimedia indexing and retrieval) and can be applied to images, videos, audio, and text information.
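To make the 2D-projection idea concrete, here is a toy sketch of encoding a tiny feature graph as a matrix and comparing two such codes element-wise; the encoding values and the similarity measure are illustrative assumptions, not the authors' exact definitions.

```python
import numpy as np

def graph_code(nodes, edges):
    """Toy 2D projection: diagonal cells encode node types, off-diagonal cells edge types."""
    idx = {name: i for i, (name, _) in enumerate(nodes)}
    M = np.zeros((len(nodes), len(nodes)), dtype=int)
    for name, node_type in nodes:
        M[idx[name], idx[name]] = node_type
    for a, b, edge_type in edges:
        M[idx[a], idx[b]] = edge_type
    return M

def similarity(A, B):
    """Fraction of matching cells; O(n^2) array comparison, no graph traversal."""
    return float(np.mean(A == B))

g1 = graph_code([("sky", 1), ("tree", 2)], [("sky", "tree", 7)])
g2 = graph_code([("sky", 1), ("car", 3)], [("sky", "car", 7)])
print(similarity(g1, g2))
```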

Keywords: indexing, retrieval, multimedia, graph code, graph algorithm

249 Satellite Sensing for Evaluation of an Irrigation System in Cotton - Wheat Zone

Authors: Sadia Iqbal, Faheem Iqbal, Furqan Iqbal

Abstract:

Efficient utilization of existing water is a pressing need for Pakistan, due to its rising population, the reduction in present storage capacity, and poor canal delivery efficiency of 30 to 40%. A study was conducted to evaluate an irrigation system in the cotton-wheat zone of Pakistan after watercourse lining. The evaluation is based on cropping pattern and salinity. The study employed an index-based approach using a Geographic Information System together with field data. Satellite images from different years were used to examine the effective area. Several combinations of the ratio of signals received in different spectral bands were tested for the development of this index. The Near Infrared and Thermal IR spectral bands proved to be most effective, as this combination allowed easy detection of salt-affected areas and of the cropping pattern of the study area. Results showed that 9.97% of the area was under salinity in 1992, 9.17% in 2000, and 2.29% in 2005. Similarly, 45% of the area was under vegetation in 1992, improving to 56% in 2000 and 65% in 2005. On the basis of these results, the evaluation shows a 30% increase in performance after the watercourse improvement.

Keywords: Salinity, remote sensing index, salinity index, cropping pattern.

248 Exploring Influence Range of Tainan City Using Electronic Toll Collection Big Data

Authors: Chen Chou, Feng-Tyan Lin

Abstract:

Big Data has attracted a lot of attention in many fields for analyzing research issues based on large volumes of data. Electronic Toll Collection (ETC) is one of the Intelligent Transportation System (ITS) applications in Taiwan, used to record the starting point, end point, distance and travel time of vehicles on the national freeways. This study, taking advantage of ETC big data combined with urban planning theory, attempts to explore various phenomena of inter-city transportation activities. ETC data, released as government open data, are voluminous, complete and frequently updated. Traditionally, a living area has been delimited by location, population, area and subjective consciousness. However, these factors cannot appropriately reflect people's daily movement paths. In this study, the concept of "Living Area" is replaced by "Influence Range" to capture how these areas vary dynamically with time and with the purposes of activities. This study uses data mining with Python and Excel, and visualizes the number of trips with GIS, to explore the influence range of Tainan City and the purposes of trips, and to discuss how living areas are currently delimited. It sets up a dialogue between the concepts of "Central Place Theory" and "Living Area", presents a new point of view, and integrates the application of big data, urban planning and transportation. The findings will be valuable for resource allocation and land apportionment in spatial planning.
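The abstract mentions data mining with Python; a minimal sketch of the kind of trip aggregation involved might look like the following. The CSV file and column names (entry_gantry, exit_gantry, etc.) are hypothetical placeholders, not the study's actual schema.

```python
import pandas as pd

# Hypothetical ETC extract: one row per vehicle trip on the national freeway.
trips = pd.read_csv("etc_trips.csv")  # assumed columns: entry_gantry, exit_gantry, distance_km, travel_min

# Count trips between every origin-destination pair to sketch an "influence range".
od_counts = (trips.groupby(["entry_gantry", "exit_gantry"])
                  .size()
                  .reset_index(name="n_trips")
                  .sort_values("n_trips", ascending=False))

# Keep OD pairs touching Tainan and export the table for GIS visualisation.
tainan = od_counts[od_counts.entry_gantry.str.contains("Tainan") |
                   od_counts.exit_gantry.str.contains("Tainan")]
tainan.to_csv("tainan_influence_range.csv", index=False)
```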

Keywords: Big Data, ITS, influence range, living area, central place theory, visualization.

247 Detection of Arcobacter and Helicobacter pylori Contamination in Organic Vegetables by Cultural and PCR Methods

Authors: Miguel García-Ferrús, Ana González, María A. Ferrús

Abstract:

The most demanded organic foods worldwide are those that are consumed fresh, such as fruits and vegetables. However, there is a knowledge gap regarding some aspects of the microbiological quality and safety of organic food. Organic fruits and vegetables are more exposed to pathogenic microorganisms due to surface contact with natural fertilizers such as animal manure, wastes and vermicompost used during farming. Therefore, the objective of this work was to study the contamination of organic fresh green leafy vegetables by two emergent pathogens, Arcobacter spp. and Helicobacter pylori. For this purpose, a total of 24 vegetable samples, 13 lettuce and 11 spinach, were acquired from 10 different ecological supermarkets and greengrocers and analyzed by culture and PCR. Arcobacter spp. was detected in five samples (20%) by PCR, four spinach and one lettuce. One spinach sample was also found to be positive by culture. For H. pylori, the H. pylori VacA gene-specific band was detected in 12 vegetable samples (50%), 10 lettuces and two spinach. Isolation in the selective medium did not yield any positive results, possibly because of low contamination levels together with the presence of the organism in its viable but non-culturable form. The results showed significant levels of H. pylori and Arcobacter contamination in organic vegetables that are generally consumed raw, which seems to confirm that these foods can act as transmission vehicles to humans.

Keywords: Arcobacter spp., Helicobacter pylori, organic vegetables, Polymerase Chain Reaction, PCR.

246 Fault and Theft Recognition Using Toro Dial Sensor in Programmable Current Relay for Feeder Security

Authors: R. Kamalakannan, N. Ravi Kumar

Abstract:

Feeder protection is important on the transmission and distribution side because, if a fault occurs in any feeder or transformer, manpower is needed to identify the problem and this takes considerable time. In the existing system, directional overcurrent elements, further secured by a load encroachment function, can be used to provide the necessary security and sensitivity for faults at remote points in a circuit. This has been validated only in renewable plant collector circuit protection applications over a wide range of operating conditions. In the proposed method, directional overcurrent feeder protection is developed by monitoring feeder sections over the Internet. In this web-based monitoring, faults and power theft are identified using a Toro dial sensor, and the information is received by SCADA (Supervisory Control and Data Acquisition) and controlled by an ARM microcontroller. The web-based monitoring is also used for feeder management, directional current detection, demand-side management, and overload fault detection. The monitoring system is capable of covering the distribution feeder over a large area, depending on the cost. It also reduces power theft, time and manpower. The simulation is done in MATLAB software.

Keywords: Current sensor, distribution feeder protection, directional overcurrent, power theft, protective relay.

245 An Efficient Architecture for Interleaved Modular Multiplication

Authors: Ahmad M. Abdel Fattah, Ayman M. Bahaa El-Din, Hossam M.A. Fahmy

Abstract:

Modular multiplication is the basic operation in most public key cryptosystems, such as RSA, DSA, ECC, and DH key exchange. Unfortunately, very large operands (on the order of 1024 or 2048 bits) must be used to provide sufficient security strength. The use of such big numbers dramatically slows down the whole cipher system, especially when running on embedded processors. So far, customized hardware accelerators, developed on FPGAs or ASICs, have been the best choice for accelerating modular multiplication in embedded environments. On the other hand, many algorithms have been developed to speed up such operations. Examples are the Montgomery modular multiplication and the interleaved modular multiplication algorithms. Combining customized hardware with an efficient algorithm is expected to provide a much faster cipher system. This paper introduces an enhanced architecture for computing the modular multiplication of two large numbers X and Y modulo a given modulus M. The proposed design is compared with three previous architectures that depend on carry-save adders and look-up tables. These look-up tables must be loaded with a set of pre-computed values. Our proposed architecture uses the same carry-save addition, but replaces both the look-up tables and the pre-computations with an enhanced version of sign detection techniques. The proposed architecture supports higher frequencies than other architectures. It also has a better overall absolute time for a single operation.
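As a reference for the operation being accelerated, a plain-software sketch of interleaved modular multiplication (bit-serial, MSB first) is shown below, assuming simple comparisons in place of the paper's hardware carry-save and sign-detection logic.

```python
def interleaved_modmul(x, y, m, k):
    """Compute (x * y) mod m by scanning the k bits of y from MSB to LSB.
    Software reference only; the hardware design keeps p in carry-save form and
    uses sign detection instead of full comparisons against m."""
    assert 0 <= x < m and 0 <= y < m
    p = 0
    for i in range(k - 1, -1, -1):
        p <<= 1                      # shift the partial product
        if (y >> i) & 1:
            p += x                   # conditionally add the multiplicand
        if p >= m:                   # at most two reductions keep p < m
            p -= m
        if p >= m:
            p -= m
    return p

print(interleaved_modmul(1234567, 7654321, 2**31 - 1, 31))
print((1234567 * 7654321) % (2**31 - 1))   # same result
```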

Keywords: Montgomery multiplication, modular multiplication, efficient architecture, FPGA, RSA

244 Effect of L-Arginine on Neuromuscular Transmission of the Chick Biventer Cervicis Muscle

Authors: S. Asadzadeh Vostakolaei

Abstract:

In this study, the effect of L-arginine was examined at the neuromuscular junction of the chick biventer cervicis muscle. L-Arginine at 500 μg/ml decreased the twitch response to electrical stimulation and produced a rightward shift of the dose-response curve for acetylcholine or carbachol. L-Arginine at 1000 μg/ml produced a strong rightward shift of the dose-response curve for acetylcholine or carbachol with a reduction in efficacy. The inhibitory effect of L-arginine on the twitch response was blocked by caffeine (200 μg/ml). NO levels were also measured in chick biventer cervicis muscle homogenates, using a spectrophotometric method for the direct detection of NO, nitrite and nitrate. Total nitrite (nitrite + nitrate) was measured by a spectrophotometer at 540 nm after the conversion of nitrate to nitrite by copperized cadmium granules. NO levels were found to be significantly increased at 500 and 1000 μg/ml of L-arginine in comparison with the control group (p<0.001). These findings indicate a possible role of increased NO levels in the suppressive action of L-arginine on the twitch response. In addition, the results indicate that the post-junctional antagonistic action of L-arginine is probably the result of impaired sarcoplasmic reticulum (SR) Ca2+ release.

Keywords: Chick, L-Arginine, Nitric Oxide, Skeletal muscle.

243 A Novel Neighborhood Defined Feature Selection on Phase Congruency Images for Recognition of Faces with Extreme Variations

Authors: Satyanadh Gundimada, Vijayan K Asari

Abstract:

A novel feature selection strategy to improve the recognition accuracy on faces that are affected by nonuniform illumination, partial occlusions and varying expressions is proposed in this paper. This technique is applicable especially in scenarios where the possibility of obtaining a reliable intra-class probability distribution is minimal due to a small number of training samples. Phase congruency features in an image are defined as the points where the Fourier components of that image are maximally in phase. These features are invariant to the brightness and contrast of the image under consideration. This property makes it possible to achieve lighting-invariant face recognition. Phase congruency maps of the training samples are generated and a novel modular feature selection strategy is implemented. Smaller sub-regions from a predefined neighborhood within the phase congruency images of the training samples are merged to obtain a large set of features. These features are arranged in the order of increasing distance between the sub-regions involved in merging. The assumption behind the proposed region merging and arrangement strategy is that local dependencies among the pixels are more important than global dependencies. The obtained feature sets are then arranged in decreasing order of discriminating capability using a criterion function, which is the ratio of the between-class variance to the within-class variance of the sample set, in the PCA domain. The results indicate a high improvement in classification performance compared to baseline algorithms.
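The criterion function used for ranking feature sets (the ratio of between-class to within-class variance computed in the PCA domain) can be sketched as below; this is a generic implementation of that ratio, not the authors' exact code.

```python
import numpy as np

def class_separability(X, y):
    """Ratio of between-class variance to within-class variance for a feature matrix
    X (n_samples, n_features) with integer class labels y. Higher values indicate a
    more discriminative feature set."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    mu = X.mean(axis=0)
    between, within = 0.0, 0.0
    for c in np.unique(y):
        Xc = X[y == c]
        between += len(Xc) * np.sum((Xc.mean(axis=0) - mu) ** 2)
        within += np.sum((Xc - Xc.mean(axis=0)) ** 2)
    return between / within
```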

Keywords: Discriminant analysis, intra-class probability distribution, principal component analysis, phase congruency.

242 A Study on the Developing Method of the BIM (Building Information Modeling) Software Based On Cloud Computing Environment

Authors: Byung-Kon Kim

Abstract:

As Architecture, Engineering and Construction (AEC) industry projects have grown larger and more complex, the use of BIM for 3D design and simulation has increased significantly. Accordingly, typical applications of BIM such as clash detection and alternative measures based on 3-dimensional planning have expanded to process management, cost and quantity management, structural analysis, regulation checking, and various other domains of virtual design and construction. At present, commercial BIM software is operated in a single-user environment, so the initial cost is high and the investment is frequently wasted. Cloud computing, a next-generation Internet technology, enables simple Internet devices (such as PCs, tablets, smartphones, etc.) to use the services and resources of BIM software. In this paper, we suggest a method of developing BIM software based on a cloud computing environment in order to expand the utilization of BIM and reduce the cost of BIM software. First, for benchmarking, we surveyed successful cases of BIM and cloud computing. Then we analyzed the needs and opportunities for BIM and cloud computing in the AEC industry. Finally, we suggest the main functions of BIM software based on a cloud computing environment and present a simple prototype of cloud computing BIM software for basic BIM model viewing.

Keywords: Construction IT, BIM (Building Information Modeling), Cloud Computing, BIM Service Based Cloud Computing, Viewer Based BIM Server, 3D Design.

241 Testing Loaded Programs Using Fault Injection Technique

Authors: S. Manaseer, F. A. Masooud, A. A. Sharieh

Abstract:

Fault tolerance is critical in many of today's large computer systems. This paper focuses on improving fault tolerance through testing. Moreover, it concentrates on memory faults: how to access the editable part of a process's memory space and how this part is affected. A special Software Fault Injection Technique (SFIT) is proposed for this purpose. This is done by sequentially scanning the memory of the target process and trying to edit the maximum number of bytes inside that memory. The technique was implemented and tested on a group of programs in software packages such as jet-audio, Notepad, Microsoft Word, Microsoft Excel, and Microsoft Outlook. The results from the test sample processes indicate that the size of the scanned area depends on several factors: process size, process type, and the virtual memory size of the machine under test. The results show that increasing the process size increases the scanned memory space. They also show that input-output processes have a larger scanned area than other processes. Increasing the virtual memory size also affects the size of the scanned area, but only up to a certain limit.

Keywords: Complex software systems, Error detection, Fault tolerance, Injection and testing methodology, Memory faults, Process and virtual memory.

240 Assessing the Effect of the Position of the Cavities on the Inner Plate of the Steel Shear Wall under Time History Dynamic Analysis

Authors: Masoud Mahdavi, Mojtaba Farzaneh Moghadam

Abstract:

The seismic forces caused by waves created in the depths of the earth during an earthquake strike the structure and cause the building to vibrate. Large seismic forces cause low-strength sections of the structure to suffer extensive damage. The use of new steel shear walls in steel structures increases the strength of the building and of its main members (columns) by reducing and dissipating seismic forces during earthquakes. In the present study, an attempt was made to evaluate a type of steel shear wall that has regular holes in the inner plate by building a finite element model in Abaqus software. A steel plate shear wall measuring 6000 × 3000 mm (one story) with 3 mm thickness was modeled with four different hole positions of equal cross-sectional area. The shear wall was subjected dynamically to 5-second time histories of three ground-motion records: El Centro, Imperial Valley and Kobe. The results showed that increasing the distance between the geometric center of the hole and the geometric center of the inner plate (increasing the RCS index) transfers the maximum total acceleration from the perimeter of the hole to the horizontal and vertical beams. The results also show that there is no direct relationship between the RCS index and the total acceleration in the steel shear wall, and that the RCS index is independent of the peak ground acceleration of the earthquake.

Keywords: Hollow Steel plate shear wall, time history analysis, finite element method, Abaqus Software.

239 The Role of Velocity Map Quality in Estimation of Intravascular Pressure Distribution

Authors: Ali Pashaee, Parisa Shooshtari, Gholamreza Atae, Nasser Fatouraee

Abstract:

Phase-Contrast MR imaging methods are widely used for the measurement of blood flow velocity components. There are also other tools, such as CT and ultrasound, for velocity map detection in intravascular studies. These data are used in deriving flow characteristics. Some clinical applications use the pressure distribution in the diagnosis of intravascular disorders such as vascular stenosis. In this paper, an approach to the problem of measuring the intravascular pressure field from the velocity field obtained from flow images is proposed. The method uses an algorithm to solve the nonlinear Navier-Stokes equations, assuming blood to be an incompressible Newtonian fluid. Flow images usually suffer from a lack of spatial resolution. Our attempt is to consider the effect of spatial resolution on the pressure distribution estimated by this method. In order to achieve this aim, the velocity map of a numerical phantom is derived at six different spatial resolutions. To determine the effects of vascular stenoses on the pressure distribution, a stenotic phantom geometry is considered. A comparison between the pressure distribution obtained from the phantom and the pressure resulting from the algorithm is presented. In this regard, we also compared the effects of collocated and staggered computational grids on the pressure distribution resulting from this algorithm.
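For context, the pressure gradient is recovered from the measured velocity field through the incompressible Navier-Stokes momentum equation (written here in standard notation for a Newtonian, incompressible fluid, as assumed in the abstract):

```latex
\nabla p = -\rho \left( \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u} \right) + \mu \nabla^{2}\mathbf{u},
\qquad \nabla\cdot\mathbf{u} = 0
```

The pressure field is then obtained by integrating this gradient (or solving the corresponding pressure Poisson problem) over the imaged domain, which is why the spatial resolution of the velocity map directly affects the accuracy of the estimate.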

Keywords: Flow imaging, pressure distribution estimation, phantom, resolution.

238 Hybrid Structure Learning Approach for Assessing the Phosphate Laundries Impact

Authors: Emna Benmohamed, Hela Ltifi, Mounir Ben Ayed

Abstract:

Bayesian Networks (BN) are among the most efficient classification methods. They are widely used in several fields (e.g., medical diagnostics, risk analysis, bioinformatics research). A BN is defined as a probabilistic graphical model that provides a formalism for reasoning under uncertainty. This classification method has a high performance rate in the extraction of new knowledge from data. The construction of the model consists of two phases: structure learning and parameter learning. For the structure learning problem, the K2 algorithm is one of the representative data-driven algorithms, based on a score-and-search approach. In addition, integrating expert knowledge into the structure learning process allows higher accuracy to be obtained. In this paper, we propose a hybrid approach combining an improvement of the K2 algorithm, called the K2 algorithm for Parents and Children search (K2PC), with an expert-driven method for learning the structure of the BN. The evaluation of the experimental results, using well-known benchmarks, shows that our K2PC algorithm has better performance in terms of correct structure detection. The real application of our model shows its efficiency in the analysis of the impact of phosphate laundry effluents on the watershed in the Gafsa area (southwestern Tunisia).
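For reference, the score maximised by K2-style search is the Cooper-Herskovits metric; a log-space sketch for scoring one node given a candidate parent set is shown below (generic K2 scoring over discrete data, not the K2PC-specific parents-and-children search).

```python
from math import lgamma
from collections import Counter
import itertools

def k2_log_score(data, child, parents, r):
    """Log Cooper-Herskovits (K2) score of `child` given `parents`.
    data: list of dicts {variable: state index}, r: dict variable -> number of states."""
    score = 0.0
    configs = itertools.product(*[range(r[p]) for p in parents]) if parents else [()]
    for cfg in configs:
        rows = [row for row in data if all(row[p] == v for p, v in zip(parents, cfg))]
        counts = Counter(row[child] for row in rows)
        n_ij = sum(counts.values())
        score += lgamma(r[child]) - lgamma(n_ij + r[child])                    # (r_i - 1)! / (N_ij + r_i - 1)!
        score += sum(lgamma(counts.get(k, 0) + 1) for k in range(r[child]))    # prod_k N_ijk!
    return score
```

A greedy K2-style search would call this score repeatedly, adding to each node the parent that most increases it until no candidate improves the score or the parent limit is reached.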

Keywords: Classification, Bayesian network, structure learning, K2 algorithm, expert knowledge, surface water analysis.

237 A Case Study of Applying Virtual Prototyping in Construction

Authors: Stephen C. W. Kong

Abstract:

The use of 3D computer-aided design (CAD) models to support construction project planning has been increasing in recent years. 3D CAD models reveal more planning ideas by visually showing the construction site environment at different stages of the construction process. Using 3D CAD models together with scheduling software to prepare a construction plan can identify errors in process sequence and spatial arrangement, which is vital to the success of a construction project. A number of 4D (3D plus time) CAD tools have been developed and utilized in different construction projects owing to the awareness of their importance. Virtual prototyping extends the idea of 4D CAD by integrating more features for simulating the real construction process. Virtual prototyping originates from the manufacturing industry, where the production of products such as cars and airplanes is virtually simulated in the computer before they are built in the factory. Virtual prototyping integrates 3D CAD, a simulation engine, analysis tools (such as structural analysis and collision detection), and a knowledge base to streamline the whole product design and production process. In this paper, we present the application of a virtual prototyping software package which has been used in a few construction projects in Hong Kong to support construction project planning. Specifically, the paper presents an implementation of virtual prototyping in a residential building project in Hong Kong. The applicability, difficulties and benefits of construction virtual prototyping are examined based on this project.

Keywords: construction project planning, prefabrication, simulation, virtual prototyping.

236 Adaptive Block State Update Method for Separating Background

Authors: Youngsuck Ji, Youngjoon Han, Hernsoo Hahn

Abstract:

In this paper, we propose a robust moving-object detection method for night street images that updates a block-based reference background model using block-state analysis, in order to handle lighting effects. The experimental images are a color video sequence acquired from a stationary camera. When artificial illumination such as street lights or sign lights suddenly appears, the reference background model is updated with this information. Natural illumination generally changes gradually over time, whereas artificial illumination appears suddenly. Therefore, to detect artificial illumination precisely, a two-step process is used. The first step compares the current image with the reference background on a block basis to identify changed blocks. The second step compares the edge map of the current image with the edge map of the reference background, which makes it possible to estimate the illumination in any block. This information makes it possible to accurately detect objects and artificial illumination, and to generate a cleaner reference background. Each block is classified by block-state analysis into one of four states: transient, stationary, background, or artificial illumination. Fig. 1 shows the characteristics of each block state [1]. Experimental results show that the presented approach works well in the presence of illumination variance.
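A minimal sketch of the block-based differencing described here might look like the following; the block size, thresholds and exact state rules are illustrative assumptions rather than the paper's tuned values.

```python
import numpy as np

def block_states(current, reference, cur_edges, ref_edges, block=16, t_int=15, t_edge=10):
    """Classify each block as 'background', 'artificial illumination' or 'transient'
    from intensity and edge-map differences (illustrative rules). Grayscale 2D arrays."""
    h, w = current.shape
    states = {}
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            sl = (slice(y, y + block), slice(x, x + block))
            d_int = np.abs(current[sl].astype(float) - reference[sl]).mean()
            d_edge = np.abs(cur_edges[sl].astype(float) - ref_edges[sl]).mean()
            if d_int < t_int:
                states[(y, x)] = "background"
            elif d_edge < t_edge:
                # intensity changed but structure did not: a sudden light source
                states[(y, x)] = "artificial illumination"
            else:
                states[(y, x)] = "transient"   # candidate moving object; would become
                                               # "stationary" if it persists across frames
    return states
```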

Keywords: Block-state, Edge component, Reference background, Artificial illumination.

235 Laser Registration and Supervisory Control of neuroArm Robotic Surgical System

Authors: Hamidreza Hoshyarmanesh, Hosein Madieh, Sanju Lama, Yaser Maddahi, Garnette R. Sutherland, Kourosh Zareinia

Abstract:

This paper illustrates the concept of an algorithm to register specified markers on the neuroArm surgical manipulators, an image-guided, MR-compatible, tele-operated robot for microsurgery and stereotaxy. Two range-finding approaches, namely time-of-flight and phase-shift, are evaluated for registration and supervisory control. The time-of-flight approach is implemented in a semi-field experiment to determine the precise position of a tiny retro-reflective moving object. The moving object simulates a surgical tool tip. The tool is a target that would be connected to the neuroArm end-effector during surgery inside the magnet bore of the MR imaging system. In order to apply the time-of-flight approach, a 905-nm pulsed laser diode and an avalanche photodiode are utilized as the transmitter and receiver, respectively. For the experiment, a high-frequency time-to-digital converter was designed using a field-programmable gate array. In the phase-shift approach, a continuous green laser beam with a wavelength of 530 nm was used as the transmitter. Results showed that a positioning error of 0.1 mm occurred when the scanner-target distance was set in the range of 2.5 to 3 meters. The effectiveness of this non-contact approach showed that the method could be employed as an alternative to a conventional mechanical registration arm. Furthermore, the approach is not limited by physical contact or by the extension of joint angles.
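The two range-finding principles compared here reduce to the usual relations, with c the speed of light, t the measured round-trip time, f_mod the modulation frequency and Δφ the measured phase shift:

```latex
d_{\text{ToF}} = \frac{c\,t}{2},
\qquad
d_{\text{phase}} = \frac{c\,\Delta\varphi}{4\pi f_{\text{mod}}}
```

At a range of 2.5 to 3 m, a 0.1 mm error corresponds to resolving the round-trip time to below a picosecond, which is why a high-frequency time-to-digital converter is needed for the pulsed 905-nm channel.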

Keywords: 3D laser scanner, intraoperative MR imaging, neuroArm, real time registration, robot-assisted surgery, supervisory control.

234 A Holographic Infotainment System for Connected and Driverless Cars: An Exploratory Study of Gesture Based Interaction

Authors: Nicholas Lambert, Seungyeon Ryu, Mehmet Mulla, Albert Kim

Abstract:

In this paper, an interactive in-car interface called HoloDash is presented. It is intended to provide information and infotainment in both autonomous vehicles and ‘connected cars’, vehicles equipped with Internet access via cellular services. The research focuses on the development of interactive avatars for this system and its gesture-based control system. This is a case study for the development of a possible human-centred means of presenting a connected or autonomous vehicle’s On-Board Diagnostics through a projected ‘holographic’ infotainment system. This system is termed a Holographic Human Vehicle Interface (HHIV), as it utilises a dashboard projection unit and gesture detection. The research also examines the suitability of gestures in an automotive environment, given that they might be used in both driver-controlled and driverless vehicles. Using Human Centred Design methods, questions were posed to test subjects and their preferences were discovered in terms of the gesture interface and the user experience for passengers within the vehicle. These affirm the benefits of this mode of visual communication for both connected and driverless cars.

Keywords: Holographic interface, human-computer interaction, user-centered design, Gesture.

233 End-to-End Spanish-English Sequence Learning Translation Model

Authors: Vidhu Mitha Goutham, Ruma Mukherjee

Abstract:

The low availability of well-trained, unlimited, dynamic-access models for specific languages makes it hard for corporate users to adopt quick translation techniques and incorporate them into product solutions. As translation tasks increasingly require a dynamic sequence learning curve, stable, cost-free open-source models are scarce. We survey and compare current translation techniques and propose a modified sequence-to-sequence model repurposed with attention techniques. Sequence learning using an encoder-decoder model is now paving the path to higher precision levels in translation. Using a Convolutional Neural Network (CNN) encoder and a Recurrent Neural Network (RNN) decoder, we use Fairseq tools to produce an end-to-end, bilingually trained Spanish-English machine translation model that includes source language detection. We obtain competitive results using a duo-lingo-corpus trained model, providing prospective, ready-made plug-in use for compound sentences and document translations. Our model serves as a decent system for large organizational data translation needs. While acknowledging its shortcomings and future scope, it also identifies itself as a well-optimized deep neural network model and solution.
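A stripped-down sketch of the encoder-decoder pairing described (convolutional encoder feeding a recurrent decoder) is given below in PyTorch; the layer sizes, vocabulary size and pooling choice are illustrative, and the actual system is built with Fairseq rather than this toy code.

```python
import torch
import torch.nn as nn

class ConvEncoder(nn.Module):
    def __init__(self, vocab, emb=128, hid=256):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.conv = nn.Conv1d(emb, hid, kernel_size=3, padding=1)
    def forward(self, src):                              # src: (batch, src_len)
        x = self.embed(src).transpose(1, 2)              # (batch, emb, src_len)
        h = torch.relu(self.conv(x)).max(dim=2).values   # pool to a sentence vector
        return h.unsqueeze(0)                            # (1, batch, hid) as decoder h0

class GRUDecoder(nn.Module):
    def __init__(self, vocab, emb=128, hid=256):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.gru = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, vocab)
    def forward(self, tgt, h0):                          # tgt: (batch, tgt_len)
        y, _ = self.gru(self.embed(tgt), h0)
        return self.out(y)                               # (batch, tgt_len, vocab)

# Toy forward pass to check shapes; real training shifts the target by one token.
enc, dec = ConvEncoder(8000), GRUDecoder(8000)
src = torch.randint(0, 8000, (4, 12))
tgt = torch.randint(0, 8000, (4, 10))
logits = dec(tgt, enc(src))
loss = nn.functional.cross_entropy(logits.reshape(-1, 8000), tgt.reshape(-1))
print(logits.shape, float(loss))
```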

Keywords: Attention, encoder-decoder, Fairseq, Seq2Seq, Spanish, translation.

232 Numerical Investigation of Nozzle Shape Effect on Shock Wave in Natural Gas Processing

Authors: Esam I. Jassim, Mohamed M. Awad

Abstract:

Natural gas flow contains undesirable solid particles, liquid condensation, and/or oil droplets and requires reliable removal equipment to perform filtration. Recent natural gas processing applications demand compactness and reliability of process equipment. Since conventional means are complicated in design, poor in efficiency, and lacking in robustness, a supersonic nozzle has been introduced as an alternative means to meet such demands. A 3-D convergent-divergent nozzle is simulated using a commercial code for nozzle pressure ratios (NPR) varying from 1.2 to 2. Six different nozzle shapes are numerically examined to locate the position of the shock wave, as this spot can be considered a benchmark for particle separation. Rectangular, triangular, circular, elliptical, pentagonal, and hexagonal nozzles, all with the same cross-sectional area, are simulated using the Fluent code. The simple one-dimensional inviscid theory does not describe the actual features of the fluid flow precisely, as it ignores the impact of nozzle configuration on the flow properties. The CFD simulation results, however, show that nozzle geometry influences the flow structure, including the location of the shock wave. The CFD analysis predicts shock appearance when p01/pa > 1.2 for almost all geometries, located at the lower area ratio (Ae/At). Simulation results showed that the shock wave in the elliptical nozzle lies farthest from the throat among the shapes at relatively small NPR. As NPR increases, the hexagonal nozzle has the farthest shock. The numerical results are compared with available experimental data and show good agreement in terms of shock location and flow structure.
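For context, the one-dimensional inviscid theory referred to is the isentropic area-Mach number relation (γ is the ratio of specific heats, A* the throat area):

```latex
\left(\frac{A}{A^{*}}\right)^{2} = \frac{1}{M^{2}}\left[\frac{2}{\gamma+1}\left(1+\frac{\gamma-1}{2}M^{2}\right)\right]^{\frac{\gamma+1}{\gamma-1}}
```

Because this relation depends only on the area ratio, it predicts the same shock location for all six cross-sections; the CFD results show where the real flow departs from it as the shape changes.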

Keywords: CFD, Particle Separation, Shock wave, Supersonic Nozzle.

231 Thermographic Tests of Curved GFRP Structures with Delaminations: Numerical Modelling vs. Experimental Validation

Authors: P. D. Pastuszak

Abstract:

The present work is devoted to thermographic studies of curved composite panels (unidirectional GFRP) with subsurface defects. Various artificial defects, created by inserting PTFE strips between individual layers of the laminate during the manufacturing stage, are studied. The analysis is conducted both with the finite element method and with experiments. To simulate transient heat transfer in a 3D model with embedded defects of various sizes, the ANSYS package is used. Pulsed thermography combined with an optical excitation source provides good results for flat surfaces. Composite structures, however, are mostly used in complex components, e.g., pipes, corners and stiffeners. A local decrease of mechanical properties in these regions can significantly reduce the strength of the entire structure. The application of active thermography procedures to defect detection and evaluation in this type of element seems more appropriate than other NDT techniques. Nevertheless, there are various uncertainties connected with the correct interpretation of the acquired data. In this paper, important factors concerning infrared thermography measurements of curved surfaces in the form of cylindrical panels are considered. In addition, temperature effects on the surface resulting from the complex geometry and from embedded and real defects are also presented.

Keywords: Active thermography, finite element analysis, composite, curved structures, defects.

230 Use of Caffeine and Human Pharmaceutical Compounds to Identify Sewage Contamination

Authors: Jingming Wu, Junqi Yue, Ruikang Hu, Zhaoguang Yang, Lifeng Zhang

Abstract:

Fecal coliform bacteria are widely used as indicators of sewage contamination in surface water. However, these microbial techniques have some disadvantages, including long turnaround times (18-48 h) and an inability to discriminate between human and animal fecal sources. Therefore, it is necessary to seek a more specific indicator of human sanitary waste. In this study, the feasibility of applying caffeine and human pharmaceutical compounds to identify human-source contamination was investigated. The correlation between caffeine and fecal coliform was also explored. Surface water samples were collected from upstream, middle-stream and downstream points along Rochor Canal, as well as 8 locations in Marina Bay. Results indicate that caffeine is a suitable chemical tracer in Singapore because of its easy detection (in the range of 0.30-2.0 ng/mL) compared with the other chemicals monitored. The relatively low concentrations of human pharmaceutical compounds (< 0.07 ng/mL) in the Rochor Canal and Marina Bay water samples make them hard to detect and difficult to use as chemical tracers. However, their presence can help to validate sewage contamination. In addition, a high correlation was found between caffeine concentration and fecal coliform density in the Rochor Canal water samples, demonstrating that caffeine is strongly associated with human-source contamination.

Keywords: Caffeine, Human Pharmaceutical Compounds, Chemical Tracer, Sewage Contamination.

229 Skin Lesion Segmentation Using Color Channel Optimization and Clustering-based Histogram Thresholding

Authors: Rahil Garnavi, Mohammad Aldeen, M. Emre Celebi, Alauddin Bhuiyan, Constantinos Dolianitis, George Varigos

Abstract:

Automatic segmentation of skin lesions is the first step towards the automated analysis of malignant melanoma. Although numerous segmentation methods have been developed, few studies have focused on determining the most effective color space for melanoma application. This paper proposes an automatic segmentation algorithm based on color space analysis and clustering-based histogram thresholding, a process which is able to determine the optimal color channel for detecting the borders in dermoscopy images. The algorithm is tested on a set of 30 high resolution dermoscopy images. A comprehensive evaluation of the results is provided, where borders manually drawn by four dermatologists, are compared to automated borders detected by the proposed algorithm, applying three previously used metrics of accuracy, sensitivity, and specificity and a new metric of similarity. By performing ROC analysis and ranking the metrics, it is demonstrated that the best results are obtained with the X and XoYoR color channels, resulting in an accuracy of approximately 97%. The proposed method is also compared with two state-of-the-art skin lesion segmentation methods.
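A generic sketch of histogram thresholding on the selected color channel is given below, using Otsu's between-class-variance criterion as the clustering rule; the paper's exact thresholding procedure may differ.

```python
import numpy as np

def histogram_threshold(channel):
    """Otsu-style threshold on one 8-bit color channel: choose the cut that
    maximises the between-class variance of the two resulting pixel clusters."""
    hist, _ = np.histogram(channel.ravel(), bins=256, range=(0, 256))
    p = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * p[:t]).sum() / w0
        mu1 = (levels[t:] * p[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t   # pixels on one side of the cut are labelled as lesion
```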

Keywords: Border detection, Color space analysis, Dermoscopy, Histogram thresholding, Melanoma, Segmentation.

228 Certain Data Dimension Reduction Techniques for application with ANN based MCS for Study of High Energy Shower

Authors: Gitanjali Devi, Kandarpa Kumar Sarma, Pranayee Datta, Anjana Kakoti Mahanta

Abstract:

Cosmic-ray showers, originating in space, generate cascades of secondary particles called Extensive Air Showers (EAS) after entering the Earth's atmosphere. Detection and analysis of EAS and similar high-energy particle showers involve a plethora of experimental setups with certain constraints, for which soft-computational tools like Artificial Neural Networks (ANNs) can be adopted. The optimality of ANN classifiers can be enhanced further by the use of a Multiple Classifier System (MCS) and certain data-dimension reduction techniques. This work describes the performance of data-dimension reduction techniques, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Self-Organizing Map (SOM) approximators, for application with an MCS formed using a Multi Layer Perceptron (MLP), a Recurrent Neural Network (RNN) and a Probabilistic Neural Network (PNN). The data inputs are obtained from an array of detectors placed in a circular arrangement resembling a practical detector grid, and have a high dimension and strong correlation among themselves. The PCA, ICA and SOM blocks reduce the correlation and generate a form suitable for real-time practical applications for the prediction of the primary energy and location of the EAS from density values captured using detectors in a circular grid.
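As an illustration of the first reduction stage, a PCA projection of the detector-density vectors might be set up as below; scikit-learn is used for brevity, and the numbers of showers, detectors and retained components are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
densities = rng.random((1000, 64))        # 1000 simulated showers, 64 detectors in the grid

pca = PCA(n_components=8)                 # keep 8 decorrelated components
features = pca.fit_transform(densities)   # (1000, 8) inputs for the MLP/RNN/PNN classifiers
print(pca.explained_variance_ratio_.sum())
```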

Keywords: EAS, Shower, Core, ANN, Location.

227 A Pairwise-Gaussian-Merging Approach: Towards Genome Segmentation for Copy Number Analysis

Authors: Chih-Hao Chen, Hsing-Chung Lee, Qingdong Ling, Hsiao-Jung Chen, Sun-Chong Wang, Li-Ching Wu, H.C. Lee

Abstract:

Segmentation, filtering out of measurement errors, and identification of breakpoints are integral parts of any analysis of microarray data for the detection of copy number variation (CNV). Existing algorithms designed for these tasks have had some success in the past, but they tend to be O(N²) in computation time, memory requirement, or both, and the rapid advance of microarray resolution has practically rendered such algorithms useless. Here we propose an algorithm, SAD, that is much faster and requires much less memory (O(N) in both computation time and memory requirement) and offers higher accuracy. The two key ingredients of SAD are the fundamental assumption in statistics that measurement errors are normally distributed and the mathematical relation that the product of two Gaussians is another Gaussian (function). We have produced a computer program for analyzing CNV based on SAD. In addition to being fast and small, it offers two important features: quantitative statistics for predictions and, with only two user-decided parameters, ease of use. Its speed shows little dependence on genomic profile. Running on an average modern computer, it completes CNV analyses for a 262 thousand-probe array in about 1 second and for a 1.8 million-probe array in 9 seconds.
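The "product of two Gaussians is another Gaussian" relation that SAD relies on gives the familiar merge formulas for two measurements (or segments) with means μ1, μ2 and variances σ1², σ2²:

```latex
\mu = \frac{\mu_{1}\sigma_{2}^{2} + \mu_{2}\sigma_{1}^{2}}{\sigma_{1}^{2} + \sigma_{2}^{2}},
\qquad
\sigma^{2} = \frac{\sigma_{1}^{2}\,\sigma_{2}^{2}}{\sigma_{1}^{2} + \sigma_{2}^{2}}
```

Merging neighbouring probes pairwise with these closed-form updates is what allows both the running time and the memory footprint to stay linear in the number of probes.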

Keywords: Cancer, pathogenesis, chromosomal aberration, copy number variation, segmentation analysis.

226 An Efficient Watermarking Method for MP3 Audio Files

Authors: Dimitrios Koukopoulos, Yiannis Stamatiou

Abstract:

In this work, we present, to the best of our knowledge for the first time, an efficient digital watermarking scheme for MPEG audio layer 3 files that operates directly in the compressed data domain while manipulating the time and subband/channel domain. In addition, it does not need the original signal to detect the watermark. Our scheme was implemented taking special care for the efficient usage of the two limited resources of computer systems: time and space. It offers the industrial user the capability of watermark embedding and detection in a time immediately comparable to the real playing time of the original audio file, depending on the MPEG compression, while the end user/audience does not face any artifacts or delays when hearing the watermarked audio file. Furthermore, it overcomes the disadvantage of algorithms operating in the PCM data domain, which are vulnerable to compression/recompression attacks, as it places the watermark in the scale factors domain and not in the digitized sound data. The strength of our scheme, which allows it to be used successfully for both authentication and copyright protection, relies on the fact that ownership of the audio file is established not simply by detecting the bit pattern that comprises the watermark itself, but by showing that the legal owner knows a hard-to-compute property of the watermark.

Keywords: Audio watermarking, mpeg audio layer 3, hard instance generation, NP-completeness.

225 High Accuracy ESPRIT-TLS Technique for Wind Turbine Fault Discrimination

Authors: Saad Chakkor, Mostafa Baghouri, Abderrahmane Hajraoui

Abstract:

The ESPRIT-TLS method appears to be a good choice for high-resolution fault detection in induction machines. It is highly effective in identifying frequencies and amplitudes. However, it has a high computational complexity, which hinders its implementation in real-time fault diagnosis. To avoid this problem, a Fast-ESPRIT algorithm that combines IIR band-pass filtering, decimation and the original ESPRIT-TLS method is employed to accurately extract frequencies and their magnitudes from the wind turbine stator current at a lower computational cost. The proposed algorithm addresses the wind turbine's need for online, fast, and proactive condition monitoring. This type of remote and periodic maintenance provides an acceptable machine lifetime, minimizes its downtime and maximizes its productivity. The developed technique has been evaluated by computer simulations under many fault scenarios. Study results demonstrate the performance of Fast-ESPRIT, offering rapid, high-resolution harmonic recognition with minimal computation time and memory cost.
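The pre-processing stage combined with ESPRIT-TLS (IIR band-pass filtering followed by decimation) can be sketched with SciPy as follows; the sampling rate, filter order, band edges and decimation factor are illustrative assumptions.

```python
import numpy as np
from scipy import signal

fs = 10_000.0                                   # assumed sampling rate of the stator current (Hz)
t = np.arange(0, 1.0, 1 / fs)
current = np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * 90 * t)  # fundamental + fault harmonic

# IIR band-pass around the band that contains the fault signatures of interest.
sos = signal.butter(4, [70, 110], btype="bandpass", fs=fs, output="sos")
band = signal.sosfiltfilt(sos, current)

# Decimation shrinks the data handed to ESPRIT-TLS, cutting its subspace-estimation cost.
decimated = signal.decimate(band, q=20)         # new effective rate: 500 Hz
print(decimated.shape)                          # far fewer samples for the high-resolution step
```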

Keywords: Spectral Estimation, ESPRIT-TLS, Real Time, Diagnosis, Wind Turbine Faults, Band-Pass Filtering, Decimation.

224 Net Regularity and Its Ethical Implications on Internet Stake Holders

Authors: Nourhan Elshenawi

Abstract:

Net Neutrality (NN) is the principle of treating all online data the same, without any prioritization of some over others. A research gap in current scholarship about “violations of NN” and the subsequent ethical concerns paves the way for the following research question: To what extent do violations of NN entail ethical concerns and implications for Internet stakeholders? To answer this question, NR is examined using the two major action-based ethical theories, Kantian and Utilitarian, across the relevant Internet stakeholders. First, the necessary IT background is provided on how the Internet works and who the key stakeholders are. Following the IT background, the relationship between the stakeholders, users, Internet Service Providers (ISPs) and content providers is discussed and illustrated. Then, violations of NN that are currently occurring without attracting any attention from the general public are covered from an ethical perspective and grouped under a new term, Net Regularity (NR). Afterwards, the current scholarship on NN and its violations is discussed; it is written mainly from economic and sociopolitical perspectives, which highlights the lack of ethical discussion of the issue. Before moving on to the ethical analysis, however, websites are presented as digital entities that are affected by NR, and their happiness is measured using functionalism. The analysis concludes that NR is prone to an unethical treatment of Internet stakeholders from the perspective of both theories. Finally, the current digital divide in the world is presented in order to better illustrate the implications of NR. The implications present a new Internet divide that will take place between individuals within society. By answering the research question using ethical analysis, the paper attempts to shed some light on the issue of NR and the kind of society it would lead to. NR would not just lead to a divided society, but to divided individuals that are separated by something greater than distance: the Internet.

Keywords: Digital divide, digital entities, digital ontology, net neutrality, internet ethics, internet law, internet service providers, websites as beings.

223 A Lactose-Free Yogurt Using Membrane Systems and Modified Milk Protein Concentrate: Production and Characterization

Authors: Shahram Naghizadeh Raeisi, Ali Alghooneh

Abstract:

Using membrane technology and modification of milk protein structural properties, a lactose-free yogurt was developed. The functional, textural and structural properties of the sample were evaluated and compared with commercial products. Results showed that protein modification in high-fat set yogurt resulted in 11.55%, 18%, 20.21% and 7.08% higher hardness, consistency, water holding capacity, and shininess values compared with the control. Furthermore, these indices of the modified low-fat set yogurt were 21.40%, 25.41%, 28.15% and 10.58% higher than the control, which could be related to the microstructural properties of the gel network in yogurt formulated with modified protein. In comparison with the control, the index of linkage strength (A), the number of linkages (z), and the time scale of linkages (λrel) of the high-fat modified yogurt were 22.10%, 50.68% and 21.82% higher, whereas the average linear distance between two adjacent crosslinks (ξ) was 16.77% lower. For the low-fat modified yogurt, the A, z and λrel indices were 34.30%, 61.70% and 42.60% higher, and ξ was 19.20% lower, than the control. The shelf life of the modified yogurt was extended to 10 weeks in the refrigerator, while the control set yogurt had a 3-week shelf life. The acidity of the high-fat and low-fat modified yogurts increased from 76 to 84 and from 72 to 80 Dornic degrees during 10 weeks of storage, respectively, whereas for the control high-fat and low-fat yogurts it increased from 82 to 122 and from 77 to 112 Dornic degrees, respectively. This behavior could be due to the elimination of the microorganisms' source of energy in the modified yogurt. Furthermore, the calorie contents of the high-fat and low-fat lactose-free yogurts were 25% and 40% lower than those of their control samples, respectively. Generally, results showed that the lactose-free yogurt with modified protein, despite a 1% lower protein content than the control, showed better functional properties, nutritional properties, network parameters, and shelf stability, which could be promising for the set yogurt industry.

Keywords: Lactose free, low calorie, network properties, protein modification.

222 Event Information Extraction System (EIEE): FSM vs HMM

Authors: Shaukat Wasi, Zubair A. Shaikh, Sajid Qasmi, Hussain Sachwani, Rehman Lalani, Aamir Chagani

Abstract:

Automatic extraction of event information from social text streams (emails, social network sites, blogs, etc.) is a vital requirement for many applications, such as event planning and management systems and security applications. The key information components needed from event-related text are the event title, location, participants, date and time. Emails are distinct from other social text streams in layout, format and conversation style, and they are the most commonly used communication channel for broadcasting and planning events. Therefore, we have chosen emails as our dataset. In our work, we have employed two statistical NLP methods, Finite State Machines (FSM) and the Hidden Markov Model (HMM), for the extraction of event-related contextual information. An application has been developed that compares the two methods on the event extraction task. It comprises two modules, one for each method, and works for both bulk and direct user input. The results are evaluated using precision, recall and F-score. Experiments show that both methods produce high performance and accuracy; however, HMM performed better on title extraction, while FSM proved better for venue, date, and time.

Keywords: Emails, Event Extraction, Event Detection, Finite state machines, Hidden Markov Model.
