Search results for: Canny Edge Detection
189 Investigating Student Behavior in Adopting Online Formative Assessment Feedback
Authors: Peter Clutterbuck, Terry Rowlands, Owen Seamons
Abstract:
In this paper we describe one critical research program within a complex, ongoing multi-year project (2010 to 2014 inclusive) with the overall goal of improving the learning outcomes of first-year undergraduate commerce/business students within an Information Systems (IS) subject with very large enrolment. The research program described here is the analysis of student attitudes and decision making in relation to the availability of formative assessment feedback via Web-based real-time conferencing and document exchange software (Adobe Connect). The formative assessment feedback between teaching staff and students concerns an authentic problem-based, team-completed assignment. Student attitudes and decision making are investigated via qualitative (first) and then quantitative (second) application of the Theory of Planned Behavior (TPB) with two statistically significant, separate trial samples of the enrolled students. The initial qualitative TPB investigation revealed that perceived self-efficacy, improved time management, and lecturer-student relationship building were the major factors shaping an overall favorable student attitude to online feedback, whilst some students expressed valid concerns about perceived control limitations identified within the online feedback protocols. The subsequent quantitative TPB investigation then confirmed that attitude towards usage, subjective norms surrounding usage, and perceived behavioral control of usage were all significant in shaping student intention to use the online feedback protocol, with these three variables explaining 63 percent of the variance in behavioral intention. The identification in this research of perceived behavioral control as a significant determinant of student usage of a specific technology component within a virtual learning environment (VLE) suggests that VLEs should be viewed not as a single, atomic entity, but as a spectrum of technology offerings ranging from the mature and simple (e.g., email, Web downloads) to the cutting-edge and challenging (e.g., Web conferencing and real-time document exchange). That is, not all VLEs should be considered the same. The results suggest that tertiary students have the technological sophistication to assess a VLE in this more selective manner.
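The quantitative TPB stage amounts to regressing behavioral intention on the three TPB predictors. A minimal sketch of that kind of analysis, using synthetic Likert-style scores (the variable names and data are illustrative, not the study's survey items):

```python
import numpy as np

# Hypothetical Likert-scale survey scores (rows = students).
rng = np.random.default_rng(0)
attitude, norms, control = rng.normal(5, 1, (3, 200))
intention = 0.4 * attitude + 0.3 * norms + 0.3 * control + rng.normal(0, 0.6, 200)

# Ordinary least squares: intention ~ attitude + subjective norm + PBC.
X = np.column_stack([np.ones(200), attitude, norms, control])
beta, *_ = np.linalg.lstsq(X, intention, rcond=None)

# R^2: share of variance in intention explained by the three TPB variables.
resid = intention - X @ beta
r2 = 1 - resid.var() / intention.var()
print(beta, r2)  # the paper reports R^2 of about 0.63 on its sample
```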
Keywords: Formative assessment feedback, virtual learning environment, theory of planned behavior, perceived behavioral control.
188 In Search of an SVD and QRcp Based Optimization Technique of ANN for Automatic Classification of Abnormal Heart Sounds
Authors: Samit Ari, Goutam Saha
Abstract:
Artificial Neural Networks (ANN) have been used extensively for classification of heart sounds because of their discriminative training ability and easy implementation. However, an ANN suffers from over-parameterization if the number of nodes is not chosen properly. In such cases, when the dataset contains redundancy, the ANN is trained along with this redundant information, which results in poor validation. A larger network also means more computational expense, and hence more hardware- and time-related cost. An optimum design of the neural network is therefore needed for real-time detection of pathological patterns, if any, from the heart sound signal. The aims of this work are to (i) select a set of input features that are effective for identification of heart sound signals and (ii) make an optimum selection of nodes in the hidden layer for a more effective ANN structure. Here, we present an optimization technique that involves Singular Value Decomposition (SVD) and QR factorization with column pivoting (QRcp) to optimize an empirically chosen, over-parameterized ANN structure. Input nodes in the ANN structure are optimized by SVD followed by QRcp, while only SVD is required to prune undesirable hidden nodes. Results are presented for classifying 12 common pathological cases and normal heart sound.
Keywords: ANN, classification of heart diseases, murmurs, optimization, phonocardiogram, QRcp, SVD.
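As a rough illustration of the pruning idea (not the authors' exact procedure): the numerical rank of a data matrix, estimated by SVD, bounds the number of useful nodes, and QR factorization with column pivoting then selects which input columns to retain.

```python
import numpy as np
from scipy.linalg import qr

# Hypothetical training data: 500 samples x 40 candidate input features,
# where half the features are linear combinations of the others (redundant).
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 40))
X[:, 20:] = X[:, :20] @ rng.normal(size=(20, 20))

# SVD estimates the effective (numerical) rank = useful node count.
s = np.linalg.svd(X, compute_uv=False)
rank = int((s > s[0] * 1e-8).sum())

# QR with column pivoting ranks columns by how much new information each
# adds; keep the first `rank` pivots as the selected input nodes.
_, _, piv = qr(X, pivoting=True, mode='economic')
selected_inputs = sorted(piv[:rank])
print(rank, selected_inputs)
```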
187 Research on Simulation and Validation of Airborne Enhanced Ground Proximity Warning System
Authors: Ma Shidong, He Yuncheng, Wang Zhong, Yang Guoqing
Abstract:
In this paper, an enhanced ground proximity warning simulation and validation system is designed and implemented. First, based on a square grid and sub-grid structure, the global digital terrain database is designed and constructed. Terrain data searching is implemented by querying the latitude and longitude bands and separated zones of the global terrain database with the current aircraft position. A combination of dynamic scheduling and hierarchical scheduling is adopted to schedule the terrain data, so that terrain data can be read into and deleted from memory dynamically. Second, based on the range, distance, and approach speed of dangerous terrain ahead, and using a security-profile calculation method, collision threat detection is executed in real time and provides caution and warning alarms. Following this scheme, the enhanced ground proximity warning simulation system is implemented. Simulations verify good real-time behavior in terrain display and alarm triggering, and the results show that the simulation system works correctly, reasonably, and stably.
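A toy sketch of the banded tile lookup with on-demand loading and distance-based eviction described above; the tile size, key scheme, and load_tile loader are assumptions for illustration, not the paper's database format:

```python
# Minimal sketch of banded terrain-tile lookup, assuming 1-degree tiles
# keyed by (lat_band, lon_zone); real EGPWS databases use finer sub-grids.
import math

tiles = {}  # (lat_band, lon_zone) -> elevation grid, loaded on demand

def tile_key(lat, lon, size_deg=1.0):
    return (math.floor(lat / size_deg), math.floor(lon / size_deg))

def get_tile(lat, lon):
    key = tile_key(lat, lon)
    if key not in tiles:                 # dynamic scheduling: load on demand
        tiles[key] = load_tile(key)      # hypothetical disk loader
    return tiles[key]

def evict_far_tiles(lat, lon, keep_radius=2):
    # Hierarchical scheduling: drop tiles far from the current position
    # so memory stays bounded as the aircraft moves.
    here = tile_key(lat, lon)
    for key in list(tiles):
        if max(abs(key[0] - here[0]), abs(key[1] - here[1])) > keep_radius:
            del tiles[key]
```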
Keywords: enhanced ground proximity warning system, digital terrain, look-ahead terrain alarm, terrain display, simulation and validation
186 The Comparison of Activation of Nuclear Factor Kappa Beta (NFκB) in Rattus norvegicus Strain Wistar Induced by Various Durations of High Fat Diet (HFD)
Authors: Titin Andri Wihastuti, Djanggan Sargowo
Abstract:
NFκB is a transcription factor regulating many functions of the vessel wall. In the normal condition, NFκB shows diffuse cytoplasmic expression, suggesting that the system is inactive. Activation of NFκB provides a potential pathway for the rapid transcription of a variety of genes encoding cytokines, growth factors, adhesion molecules, and procoagulatory factors. It is likely to play an important role in chronic inflammatory diseases, including atherosclerosis. There are many stimuli with the potential to activate NFκB, including hyperlipidemia. We used 24 rats, divided into 6 groups. The HFD was given ad libitum for 2, 4, and 6 months. The parameters in this study were the amount of NFκB activation, H2O2 as a reactive oxygen species (ROS), and VCAM-1 as a product of NFκB activation. The H2O2 colorimetric assay was performed directly using an Anti-Rat H2O2 ELISA Kit. NFκB and VCAM-1 were detected in the rat aorta, measured by ELISA kit and immunohistochemistry. There was a significant difference in H2O2, NFκB, and VCAM-1 levels after 2, 4, and 6 months of HFD. This suggests that HFD induces ROS formation and increases the activation of NFκB as an atherosclerosis marker, caused by hyperlipidemia as a classical atherosclerosis risk factor.
Keywords: High fat diet, NFκB, H2O2, atherosclerosis.
185 Urban Land Cover Change of Olomouc City Using LANDSAT Images
Authors: Miloš Marjanović, Jaroslav Burian, Jakub Miřijovský, Jan Harbula
Abstract:
This paper addresses the phenomena of intensive suburbanization and urbanization in Olomouc city, and in the Olomouc region in general, for the period 1986–2009. A remote sensing approach that involves tracking changes in land cover units is proposed to quantify the urbanization state and trends in temporal and spatial aspects. It consists of two approaches, Experiment 1 and Experiment 2, which apply two different image classification solutions in order to provide land cover maps for each 1986–2009 time split available in the Landsat image set. Experiment 1 deals with unsupervised classification, while Experiment 2 involves semi-supervised classification, using a combination of object-based and pixel-based classifiers. The resulting land cover maps are subsequently quantified for the proportion of the urban area unit and its trend through time, and also for the urban area unit stability, yielding the relation of spatial and temporal development of the urban area unit. Some outcomes seem promising, but there is indisputably room for improvement in the source data and also in processing and filtering.
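Experiment 1's unsupervised step corresponds to clustering pixels in spectral-band space. A minimal sketch with scikit-learn's k-means (the class count, band count, and urban-cluster mapping are placeholders, not the study's settings):

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical Landsat scene: rows x cols x 6 spectral bands.
rng = np.random.default_rng(2)
scene = rng.random((400, 400, 6))

# Unsupervised land-cover classification: cluster pixels in band space.
pixels = scene.reshape(-1, 6)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(pixels)
land_cover = labels.reshape(400, 400)

# Urban proportion per time split, assuming cluster 0 maps to 'urban'
# after visual inspection (that mapping is an analyst decision).
urban_share = (land_cover == 0).mean()
print(f"urban area proportion: {urban_share:.1%}")
```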
Keywords: Change detection, image classification, land cover, Landsat images, Olomouc city, urbanization.
184 Intrinsic Electromagnetic Fields and Atom-Field Coupling in Living Cells
Authors: Masroor H. S. Bukhari, Z. H. Shah
Abstract:
The possibility of intrinsic electromagnetic fields within living cells and their resonant self-interaction and interaction with ambient electromagnetic fields is suggested on the basis of a theoretical and experimental study. It is reported that intrinsic electromagnetic fields are produced in the form of radio-frequency and infra-red photons within atoms (which may be coupled or uncoupled) in cellular structures, such as the cell cytoskeleton and plasma membrane. A model is presented for the interaction of these photons among themselves or with atoms under a dipole-dipole coupling, induced by single-photon or two-photon processes. This resonance is manifested by conspicuous field amplification and it is argued that it is possible for these resonant photons to undergo tunnelling in the form of evanescent waves to a short range (of a few nanometers to micrometres). This effect, suggested as a resonant photon tunnelling mechanism in this report, may enable these fields to act as intracellular signal communication devices and as bridges between macromolecules or cellular structures in the cell cytoskeleton, organelles or membrane. A brief overview of an experimental technique and a review of some preliminary results are presented, in the detection of these fields produced in living cell membranes under physiological conditions.
Keywords: Bioelectromagnetism, cell membrane, evanescent waves, photon tunnelling, resonance.
183 Optical Fish Tracking in Fishways using Neural Networks
Authors: Alvaro Rodriguez, Maria Bermudez, Juan R. Rabuñal, Jeronimo Puertas
Abstract:
One of the main issues in Computer Vision is to extract the movement of one or several points or objects of interest in an image or video sequence to conduct any kind of study or control process. Different techniques to solve this problem have been applied in numerous areas such as surveillance systems, traffic analysis, motion capture, image compression, and navigation systems, where the specific characteristics of each scenario determine the approach to the problem. This paper puts forward a Computer Vision based algorithm to analyze fish trajectories in high-turbulence conditions in artificial structures called vertical slot fishways, designed to allow the upstream migration of fish through obstructions in rivers. The suggested algorithm calculates the position of the fish at every instant, starting from images recorded with a camera and using neural networks to perform fish detection on the images. Laboratory tests have been carried out in a full-scale fishway model with live fish, allowing the reconstruction of fish trajectories and the measurement of fish velocities and accelerations. These data can provide useful information to design more effective vertical slot fishways.
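The pipeline is per-frame detection followed by trajectory linking. A simplified sketch of that loop, substituting OpenCV background subtraction for the paper's neural network detector:

```python
import cv2
import numpy as np

# Simplified detect-and-link tracker; the paper uses a neural network for
# detection, replaced here by background subtraction for brevity.
subtractor = cv2.createBackgroundSubtractorMOG2(history=200)
trajectory = []

def process_frame(frame):
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        fish = max(contours, key=cv2.contourArea)  # assume one fish per frame
        m = cv2.moments(fish)
        if m["m00"] > 0:
            trajectory.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))

# Velocities and accelerations then follow from finite differences of
# `trajectory`, scaled by the frame rate and a pixel-to-metre calibration.
```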
Keywords: Computer Vision, Neural Network, Fishway, Fish Trajectory, Tracking
182 A Watermarking Scheme for MP3 Audio Files
Authors: Dimitrios Koukopoulos, Yiannis Stamatiou
Abstract:
In this work, we present, to the best of our knowledge for the first time, an efficient digital watermarking scheme for MPEG audio layer 3 files that operates directly in the compressed data domain, while manipulating the time and subband/channel domain. In addition, it does not need the original signal to detect the watermark. Our scheme was implemented taking special care for the efficient usage of the two limited resources of computer systems: time and space. It offers the industrial user the capability of watermark embedding and detection in time immediately comparable to the real music time of the original audio file, depending on the MPEG compression, while the end user/audience does not face any artifacts or delays when hearing the watermarked audio file. Furthermore, it overcomes the disadvantage of algorithms operating in the PCM-data domain of being vulnerable to compression/recompression attacks, as it places the watermark in the scale-factors domain and not in the digitized sound data. The strength of our scheme, which allows it to be used with success in both authentication and copyright protection, relies on the fact that ownership of the audio file is established not simply by detecting the bit pattern that comprises the watermark itself, but by showing that the legal owner knows a hard-to-compute property of the watermark.
Keywords: Audio watermarking, MPEG audio layer 3, hard instance generation, NP-completeness.
181 A Robust and Adaptive Unscented Kalman Filter for the Air Fine Alignment of the Strapdown Inertial Navigation System/GPS
Authors: Jian Shi, Baoguo Yu, Haonan Jia, Meng Liu, Ping Huang
Abstract:
To adapt to the flexibility of modern warfare, a large number of guided weapons are launched from aircraft, so the inertial navigation system loaded in the weapon must undergo an alignment process in the air. This article addresses inaccurate system modeling under large misalignment angles, reduced filtering accuracy caused by outliers, and noise changes in GPS signals, as follows: first, considering the large misalignment errors of the Strapdown Inertial Navigation System (SINS)/GPS, a more accurate model is built rather than making a small-angle approximation, and the Unscented Kalman Filter (UKF) algorithm is used to estimate the state; then, to account for the impact of GPS noise changes on the fine alignment algorithm, an innovation-adaptive filtering algorithm is introduced to estimate the GPS noise in real time; at the same time, to improve the anti-interference ability of the air fine alignment algorithm, a robust filtering algorithm based on outlier detection is combined with it. The algorithm improves alignment accuracy and robustness under interference conditions, which is verified by simulation.
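The innovation-adaptive ingredient can be sketched on its own: estimate the GPS measurement-noise covariance R online from a sliding window of filter innovations. A minimal sketch (the window length and floor value are assumptions, and the full UKF is omitted):

```python
import numpy as np
from collections import deque

# Innovation-based adaptive estimate of the measurement noise R:
# R_hat = mean(innovation @ innovation.T) - H P H^T over a sliding window.
window = deque(maxlen=50)

def adapt_R(innovation, H, P):
    window.append(np.outer(innovation, innovation))
    C = np.mean(window, axis=0)    # sample innovation covariance
    R_hat = C - H @ P @ H.T        # remove the predicted-state contribution
    # Crude guard against non-positive estimates when the window is short
    # or noisy; a production filter would project onto the PSD cone instead.
    return np.maximum(R_hat, 1e-9 * np.eye(len(innovation)))
```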
Keywords: Air alignment, fine alignment, inertial navigation system, integrated navigation system, UKF.
180 Graph Codes-2D Projections of Multimedia Feature Graphs for Fast and Effective Retrieval
Authors: Stefan Wagenpfeil, Felix Engel, Paul McKevitt, Matthias Hemmje
Abstract:
Multimedia indexing and retrieval is generally designed and implemented by employing feature graphs. These graphs typically contain a significant number of nodes and edges to reflect the level of detail in feature detection. A higher level of detail increases the effectiveness of the results but also leads to more complex graph structures. However, graph-traversal-based algorithms for similarity are quite inefficient and computation intensive, especially for large data structures. To deliver fast and effective retrieval, an efficient similarity algorithm, particularly for large graphs, is mandatory. Hence, in this paper, we define a graph projection into a 2D space (Graph Code) as well as the corresponding algorithms for indexing and retrieval. We show that calculations in this space can be performed more efficiently than graph traversals due to a simpler processing model and a high level of parallelisation. In consequence, we prove that the effectiveness of retrieval also increases substantially, as Graph Codes facilitate more levels of detail in feature fusion. Thus, Graph Codes provide a significant increase in efficiency and effectiveness (especially for multimedia indexing and retrieval) and can be applied to images, videos, audio, and text information.
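A minimal sketch of the projection idea under simplifying assumptions (integer-coded node and edge types, identical node ordering across graphs; the paper's actual encoding and metrics are richer): the graph becomes a 2D matrix, and similarity becomes one vectorized elementwise comparison instead of a traversal.

```python
import numpy as np

def graph_code(nodes, edges, vocab):
    """Encode a feature graph as an n x n matrix: diagonal = node-type ids,
    off-diagonal = edge-type ids (0 = no edge). Simplified assumption."""
    n = len(nodes)
    M = np.zeros((n, n), dtype=np.int32)
    for i, node in enumerate(nodes):
        M[i, i] = vocab[node]
    for (i, j), etype in edges.items():
        M[i, j] = vocab[etype]
    return M

def similarity(A, B):
    # No graph traversal: a single vectorized elementwise comparison,
    # trivially parallelisable across a large index of Graph Codes.
    return (A == B).mean()

vocab = {"person": 1, "car": 2, "sky": 3, "next_to": 10, "above": 11}
g1 = graph_code(["person", "car", "sky"], {(0, 1): "next_to", (2, 0): "above"}, vocab)
g2 = graph_code(["person", "car", "sky"], {(0, 1): "next_to"}, vocab)
print(similarity(g1, g2))
```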
Keywords: indexing, retrieval, multimedia, graph code, graph algorithm
179 Satellite Sensing for Evaluation of an Irrigation System in Cotton-Wheat Zone
Authors: Sadia Iqbal, Faheem Iqbal, Furqan Iqbal
Abstract:
Efficient utilization of existing water is a pressing need for Pakistan, owing to a rising population, reduction in present storage capacity, and poor canal delivery efficiency of 30 to 40%. A study to evaluate an irrigation system in the cotton-wheat zone of Pakistan after watercourse lining was conducted. The study evaluates the system on the basis of cropping pattern and salinity. It employed an index-based approach using a Geographic Information System (GIS) with field data. Satellite images from different years were used to examine the effective area. Several combinations of the ratio of signals received in different spectral bands were used for the development of this index. The near-infrared and thermal-IR spectral bands proved to be most effective, as this combination allowed easy detection of the salt-affected area and the cropping pattern of the study area. Results showed that 9.97% of the area was under salinity in 1992, 9.17% in 2000, and 2.29% in 2005. Similarly, 45% of the area was under vegetation in 1992, improving to 56% in 2000 and 65% in 2005. On the basis of these results, a 30% increase in performance after the watercourse improvement was found.
Keywords: Salinity, remote sensing index, salinity index, cropping pattern.
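The abstract does not give the exact index formula, so the following sketch assumes a generic normalized-difference band ratio as a stand-in for the study's NIR/thermal-IR salinity index:

```python
import numpy as np

# Generic band-ratio index sketch; the study's exact NIR/thermal-IR index
# formula is not given in the abstract, so a normalized difference is assumed.
def nd_index(band_a, band_b, eps=1e-6):
    a = band_a.astype(float)
    b = band_b.astype(float)
    return (a - b) / (a + b + eps)

rng = np.random.default_rng(3)
nir = rng.random((100, 100))   # synthetic near-infrared band
tir = rng.random((100, 100))   # synthetic thermal-IR band
salinity_idx = nd_index(tir, nir)

# Threshold the index to map salt-affected area, then report its share.
salt_mask = salinity_idx > 0.3   # threshold chosen by field calibration
print(f"salt-affected share: {salt_mask.mean():.2%}")
```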
178 Detection of Arcobacter and Helicobacter pylori Contamination in Organic Vegetables by Cultural and PCR Methods
Authors: Miguel García-Ferrús, Ana González, María A. Ferrús
Abstract:
The most demanded organic foods worldwide are those that are consumed fresh, such as fruits and vegetables. However, there is a knowledge gap about some aspects of organic food microbiological quality and safety. Organic fruits and vegetables are more exposed to pathogenic microorganisms due to surface contact with natural fertilizers such as animal manure, wastes, and vermicompost used during farming. Therefore, the objective of this work was to study the contamination of organic fresh green leafy vegetables by two emergent pathogens, Arcobacter spp. and Helicobacter pylori. For this purpose, a total of 24 vegetable samples, 13 lettuce and 11 spinach, were acquired from 10 different ecological supermarkets and greengrocers and analyzed by culture and PCR. Arcobacter spp. was detected in five samples (20%) by PCR: four spinach and one lettuce. One spinach sample was also found to be positive by culture. For H. pylori, the H. pylori VacA gene-specific band was detected in 12 vegetable samples (50%): 10 lettuces and two spinach. Isolation on the selective medium did not yield any positive result, possibly because of low contamination levels together with the presence of the organism in its viable but non-culturable form. Results showed significant levels of H. pylori and Arcobacter contamination in organic vegetables that are generally consumed raw, which seems to confirm that these foods can act as transmission vehicles to humans.
Keywords: Arcobacter spp., Helicobacter pylori, organic vegetables, Polymerase Chain Reaction, PCR.
177 Fault and Theft Recognition Using Toro Dial Sensor in Programmable Current Relay for Feeder Security
Authors: R. Kamalakannan, N. Ravi Kumar
Abstract:
Feeder protection is important on the transmission and distribution side because, if a fault occurs in any feeder or transformer, manpower is needed to identify the problem, which takes more time. In the existing system, directional overcurrent elements, further secured by a load encroachment function, can be used to provide the necessary security and sensitivity for faults at remote points in a circuit; this has been validated only in renewable plant collector circuit protection applications over a wide range of operating conditions. In the proposed method, directional overcurrent feeder protection is developed by monitoring the feeder section through the Internet. In this Web-based monitoring, faults and power theft are identified using a Toro dial sensor, whose information is received by SCADA (Supervisory Control and Data Acquisition) and controlled by an ARM microcontroller. The Web-based monitoring is also used for feeder management, directional current detection, demand-side management, and overload faults. The system is capable of monitoring the distribution feeder over a large area, depending upon the cost, and also reduces power theft, time, and manpower. The simulation is done in MATLAB.
Keywords: Current sensor, distribution feeder protection, directional overcurrent, power theft, protective relay.
176 An Efficient Architecture for Interleaved Modular Multiplication
Authors: Ahmad M. Abdel Fattah, Ayman M. Bahaa El-Din, Hossam M.A. Fahmy
Abstract:
Modular multiplication is the basic operation in most public key cryptosystems, such as RSA, DSA, ECC, and DH key exchange. Unfortunately, very large operands (on the order of 1024 or 2048 bits) must be used to provide sufficient security strength. The use of such big numbers dramatically slows down the whole cipher system, especially when running on embedded processors. So far, customized hardware accelerators, developed on FPGAs or ASICs, have been the best choice for accelerating modular multiplication in embedded environments. On the other hand, many algorithms have been developed to speed up such operations; examples are the Montgomery modular multiplication and the interleaved modular multiplication algorithms. Combining customized hardware with an efficient algorithm is expected to provide a much faster cipher system. This paper introduces an enhanced architecture for computing the modular multiplication of two large numbers X and Y modulo a given modulus M. The proposed design is compared with three previous architectures that depend on carry-save adders and look-up tables, where the look-up tables must be loaded with a set of pre-computed values. Our proposed architecture uses the same carry-save addition, but replaces both the look-up tables and the pre-computations with an enhanced version of sign detection techniques. The proposed architecture supports higher frequencies than the other architectures and also achieves a better overall absolute time for a single operation.
Keywords: Montgomery multiplication, modular multiplication, efficient architecture, FPGA, RSA.
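The interleaved algorithm itself scans the multiplier from the most significant bit, doubling the partial product, conditionally adding the multiplicand, and reducing at each step. A textbook software version (the paper's hardware replaces the magnitude comparisons with carry-save addition plus sign detection):

```python
def interleaved_modmul(x: int, y: int, m: int, k: int) -> int:
    """Textbook interleaved modular multiplication: scan the k bits of x
    from the MSB, doubling the partial product and adding y when the bit
    is set, reducing after each step."""
    p = 0
    for i in range(k - 1, -1, -1):
        p <<= 1                  # double the partial product
        if (x >> i) & 1:
            p += y               # conditionally add the multiplicand
        if p >= m:               # at most two subtractions suffice, since
            p -= m               # p < 3m holds before reduction
        if p >= m:
            p -= m
    return p

assert interleaved_modmul(123456789, 987654321, (1 << 61) - 1, 61) == \
       (123456789 * 987654321) % ((1 << 61) - 1)
```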
175 Effect of L-Arginine on Neuromuscular Transmission of the Chick Biventer Cervicis Muscle
Authors: S. Asadzadeh Vostakolaei
Abstract:
In this study, the effect of L-arginine was examined at the neuromuscular junction of the chick biventer cervicis muscle. L-Arginine at 500 μg/ml decreased the twitch response to electrical stimulation and produced a rightward shift of the dose-response curve for acetylcholine or carbachol. L-Arginine at 1000 μg/ml produced a strong rightward shift of the dose-response curve for acetylcholine or carbachol with a reduction in efficacy. The inhibitory effect of L-arginine on the twitch response was blocked by caffeine (200 μg/ml). NO levels were also measured in chick biventer cervicis muscle homogenates, using a spectrophotometric method for the direct detection of NO, nitrite, and nitrate. Total nitrite (nitrite + nitrate) was measured by spectrophotometer at 540 nm after the conversion of nitrate to nitrite by copperized cadmium granules. NO levels were found to be significantly increased at concentrations of 500 and 1000 μg/ml of L-arginine in comparison with the control group (p<0.001). These findings indicate a possible role of increased NO levels in the suppressive action of L-arginine on the twitch response. In addition, the results indicate that the post-junctional antagonistic action of L-arginine is probably the result of impaired sarcoplasmic reticulum (SR) Ca2+ release.
Keywords: Chick, L-Arginine, Nitric Oxide, Skeletal muscle.
174 A Study on the Developing Method of the BIM (Building Information Modeling) Software Based On Cloud Computing Environment
Authors: Byung-Kon Kim
Abstract:
As Architecture, Engineering and Construction (AEC) industry projects have grown larger and more complex, the use of BIM for 3D design and simulation has increased significantly. Typical applications of BIM, such as clash detection and alternative measures based on 3-dimensional planning, have therefore expanded into process management, cost and quantity management, structural analysis, regulation checking, and various domains of virtual design and construction. Presently, commercial BIM software operates in a single-user environment, so the initial cost is high and the investment is frequently wasted. Cloud computing, a next-generation Internet technology, enables simple Internet devices (such as PCs, tablets, and smartphones) to use the services and resources of BIM software. In this paper, we suggest a development method for BIM software based on a cloud computing environment, in order to expand the utilization of BIM and reduce the cost of BIM software. First, for benchmarking, we surveyed successful cases of BIM and cloud computing, and we analyzed the needs and opportunities of BIM and cloud computing in the AEC industry. Finally, we suggest the main functions of BIM software based on a cloud computing environment, and we developed a simple prototype of cloud computing BIM software for basic BIM model viewing.
Keywords: Construction IT, BIM(Building Information Modeling), Cloud Computing, BIM Service Based Cloud Computing, Viewer Based BIM Server, 3D Design.
173 Testing Loaded Programs Using Fault Injection Technique
Authors: S. Manaseer, F. A. Masooud, A. A. Sharieh
Abstract:
Fault tolerance is critical in many of today's large computer systems. This paper focuses on improving fault tolerance through testing. Moreover, it concentrates on memory faults: how to access the editable part of a process memory space and how this part is affected. A special Software Fault Injection Technique (SFIT) is proposed for this purpose. This is done by sequentially scanning the memory of the target process and trying to edit the maximum number of bytes inside that memory. The technique was implemented and tested on a group of programs in software packages such as jetAudio, Notepad, Microsoft Word, Microsoft Excel, and Microsoft Outlook. The results from the test sample process indicate that the size of the scanned area depends on several factors: process size, process type, and the virtual memory size of the machine under test. The results show that increasing the process size increases the scanned memory space. They also show that input-output processes have a larger scanned area than other processes. Increasing the virtual memory size also affects the size of the scanned area, but only up to a certain limit.
Keywords: Complex software systems, error detection, fault tolerance, injection and testing methodology, memory faults, process and virtual memory.
172 The Role of Velocity Map Quality in Estimation of Intravascular Pressure Distribution
Authors: Ali Pashaee, Parisa Shooshtari, Gholamreza Atae, Nasser Fatouraee
Abstract:
Phase-contrast MR imaging methods are widely used for measurement of blood flow velocity components, and other tools such as CT and ultrasound are also available for velocity map detection in intravascular studies. These data are used in deriving flow characteristics, and some clinical applications use the pressure distribution in the diagnosis of intravascular disorders such as vascular stenosis. In this paper, an approach to the measurement of the intravascular pressure field from the velocity field obtained from flow images is proposed. The method uses an algorithm to solve the nonlinear Navier-Stokes equations, assuming blood to be an incompressible, Newtonian fluid. Flow images usually suffer from a lack of spatial resolution, and our attempt is to consider the effect of spatial resolution on the pressure distribution estimated by this method. To achieve this aim, the velocity map of a numerical phantom is derived at six different spatial resolutions. To determine the effects of vascular stenoses on the pressure distribution, a stenotic phantom geometry is considered. A comparison between the pressure distribution obtained from the phantom and the pressure resulting from the algorithm is presented. We also compared the effects of collocated and staggered computational grids on the pressure distribution produced by this algorithm.
Keywords: Flow imaging, pressure distribution estimation, phantom, resolution.
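In one dimension, the idea reduces to evaluating the momentum balance pointwise from the measured velocity and integrating the resulting pressure gradient. A sketch with synthetic data (the geometry, velocity profile, and sampling are illustrative; the paper solves the full equations on 2D/3D grids):

```python
import numpy as np

# 1D sketch along a vessel centreline: recover pressure from a velocity
# profile via the steady Navier-Stokes momentum balance,
#   dp/dx = -rho * u * du/dx + mu * d2u/dx2   (incompressible, Newtonian).
rho, mu = 1060.0, 3.5e-3            # blood density (kg/m^3), viscosity (Pa.s)
x = np.linspace(0, 0.05, 256)       # 5 cm segment
u = 0.4 * (1 + 0.2 * np.sin(2 * np.pi * x / 0.05))   # synthetic velocity (m/s)

dudx = np.gradient(u, x)
d2udx2 = np.gradient(dudx, x)
dpdx = -rho * u * dudx + mu * d2udx2

# Trapezoidal integration gives pressure relative to the inlet; coarser
# sampling of x (lower spatial resolution) degrades this estimate, which
# is the effect the paper quantifies.
p = np.concatenate([[0.0], np.cumsum(0.5 * (dpdx[1:] + dpdx[:-1]) * np.diff(x))])
print(p[-1])   # pressure drop over the segment (Pa)
```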
171 Hybrid Structure Learning Approach for Assessing the Phosphate Laundries Impact
Authors: Emna Benmohamed, Hela Ltifi, Mounir Ben Ayed
Abstract:
The Bayesian Network (BN) is one of the most efficient classification methods and is widely used in several fields (e.g., medical diagnostics, risk analysis, bioinformatics research). A BN is a probabilistic graphical model that provides a formalism for reasoning under uncertainty, and this classification method has a high performance rate in extracting new knowledge from data. The construction of the model consists of two phases: structure learning and parameter learning. For the structure learning problem, the K2 algorithm is one of the representative data-driven algorithms, based on a score-and-search approach; in addition, integrating expert knowledge into the structure learning process yields higher accuracy. In this paper, we propose a hybrid approach combining an improvement of the K2 algorithm, called the K2 algorithm for Parents and Children search (K2PC), with an expert-driven method for learning the structure of a BN. Evaluation of the experimental results on well-known benchmarks shows that our K2PC algorithm performs better in terms of correct structure detection. A real application of our model shows its efficiency in analyzing the impact of phosphate laundry effluents on the watershed in the Gafsa area (southwestern Tunisia).
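The K2-style greedy parent search can be sketched compactly. The following uses a BIC score as a stand-in for the K2 metric; the node ordering, synthetic data, and max_parents cap are illustrative assumptions:

```python
import numpy as np
from itertools import product

def bic_score(data, child, parents):
    """BIC of `child` given `parents` (columns of a discrete integer data
    matrix); used here as a stand-in for the K2 metric."""
    n = len(data)
    states = [np.unique(data[:, v]) for v in parents]
    score, params = 0.0, 0
    for combo in (product(*states) if parents else [()]):
        mask = np.all(data[:, parents] == combo, axis=1) if parents else np.ones(n, bool)
        counts = np.bincount(data[mask, child], minlength=data[:, child].max() + 1)
        nz = counts[counts > 0]
        score += (nz * np.log(nz / nz.sum())).sum()
        params += len(counts) - 1
    return score - 0.5 * params * np.log(n)

def k2_parents(data, child, order, max_parents=3):
    # Greedy K2-style search: keep adding the predecessor (in the given
    # node ordering) that most improves the child's score.
    parents, best = [], bic_score(data, child, [])
    candidates = order[:order.index(child)]
    while len(parents) < max_parents:
        gains = [(bic_score(data, child, parents + [c]), c)
                 for c in candidates if c not in parents]
        if not gains or max(gains)[0] <= best:
            break
        best, chosen = max(gains)
        parents.append(chosen)
    return parents

rng = np.random.default_rng(9)
a = rng.integers(0, 2, 500)
b = np.where(rng.random(500) < 0.9, a, 1 - a)   # b depends strongly on a
data = np.column_stack([a, b])
print(k2_parents(data, child=1, order=[0, 1]))  # expected: [0]
```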
Keywords: Classification, Bayesian network, structure learning, K2 algorithm, expert knowledge, surface water analysis.
170 A Case Study of Applying Virtual Prototyping in Construction
Authors: Stephen C. W. Kong
Abstract:
The use of 3D computer-aided design (CAD) models to support construction project planning has been increasing in recent years. 3D CAD models reveal more planning ideas by visually showing the construction site environment at different stages of the construction process. Using 3D CAD models together with scheduling software to prepare a construction plan can identify errors in process sequence and spatial arrangement, which is vital to the success of a construction project. A number of 4D (3D plus time) CAD tools have been developed and utilized in different construction projects due to awareness of their importance. Virtual prototyping extends the idea of 4D CAD by integrating more features for simulating the real construction process. Virtual prototyping originates from the manufacturing industry, where the production of products such as cars and airplanes is virtually simulated in the computer before they are built in the factory. Virtual prototyping integrates 3D CAD, a simulation engine, analysis tools (like structural analysis and collision detection), and a knowledge base to streamline the whole product design and production process. In this paper, we present the application of virtual prototyping software which has been used in a few construction projects in Hong Kong to support construction project planning. Specifically, the paper presents an implementation of virtual prototyping in a residential building project in Hong Kong. The applicability, difficulties, and benefits of construction virtual prototyping are examined based on this project.
Keywords: Construction project planning, prefabrication, simulation, virtual prototyping.
169 A Holographic Infotainment System for Connected and Driverless Cars: An Exploratory Study of Gesture Based Interaction
Authors: Nicholas Lambert, Seungyeon Ryu, Mehmet Mulla, Albert Kim
Abstract:
In this paper, an interactive in-car interface called HoloDash is presented. It is intended to provide information and infotainment in both autonomous vehicles and ‘connected cars’, vehicles equipped with Internet access via cellular services. The research focuses on the development of interactive avatars for this system and its gesture-based control system. This is a case study for the development of a possible human-centred means of presenting a connected or autonomous vehicle’s On-Board Diagnostics through a projected ‘holographic’ infotainment system. This system is termed a Holographic Human Vehicle Interface (HHIV), as it utilises a dashboard projection unit and gesture detection. The research also examines the suitability of gestures in an automotive environment, given that they might be used in both driver-controlled and driverless vehicles. Using Human-Centred Design methods, questions were posed to test subjects and preferences were discovered in terms of the gesture interface and the user experience for passengers within the vehicle. These affirm the benefits of this mode of visual communication for both connected and driverless cars.
Keywords: Holographic interface, human-computer interaction, user-centered design, gesture.
168 End-to-End Spanish-English Sequence Learning Translation Model
Authors: Vidhu Mitha Goutham, Ruma Mukherjee
Abstract:
The low availability of well-trained, unlimited, dynamic-access models for specific languages makes it hard for corporate users to adopt quick translation techniques and incorporate them into product solutions. As translation tasks increasingly require a dynamic sequence learning curve, stable, cost-free open-source models are scarce. We survey and compare current translation techniques and propose a modified sequence-to-sequence model repurposed with attention techniques. Sequence learning using an encoder-decoder model is now paving the path for higher precision levels in translation. Using a Convolutional Neural Network (CNN) encoder and a Recurrent Neural Network (RNN) decoder, we use Fairseq tools to produce an end-to-end, bilingually trained Spanish-English machine translation model that includes source language detection. We obtain competitive results using a duo-lingo-corpus trained model that provides prospective, ready-made plug-in use for compound sentences and document translations. Our model serves as a decent system for large, organizational data translation needs. While acknowledging its shortcomings and future scope, it also identifies itself as a well-optimized deep neural network model and solution.
Keywords: Attention, encoder-decoder, Fairseq, Seq2Seq, Spanish, translation.
167 Thermographic Tests of Curved GFRP Structures with Delaminations: Numerical Modelling vs. Experimental Validation
Authors: P. D. Pastuszak
Abstract:
The present work is devoted to thermographic studies of curved composite panels (unidirectional GFRP) with subsurface defects. Various artificial defects, created by inserting a PTFE strip between individual layers of the laminate during the manufacturing stage, are studied. The analysis is conducted both with the finite element method and experimentally. The ANSYS package is used to simulate transient heat transfer in a 3D model with embedded defects of various sizes. Pulsed thermography combined with an optical excitation source provides good results for flat surfaces, but composite structures are mostly used in complex components, e.g., pipes, corners, and stiffeners. A local decrease of mechanical properties in these regions can significantly reduce the strength of the entire structure, so applying active thermography procedures to defect detection and evaluation in this type of element seems more appropriate than other NDT techniques. Nevertheless, there are various uncertainties connected with correct interpretation of the acquired data. In this paper, important factors concerning infrared thermography measurements of curved surfaces in the form of cylindrical panels are considered. In addition, temperature effects on the surface resulting from the complex geometry and from embedded and real defects are also presented.
Keywords: Active thermography, finite element analysis, composite, curved structures, defects.
166 Use of Caffeine and Human Pharmaceutical Compounds to Identify Sewage Contamination
Authors: Jingming Wu, Junqi Yue, Ruikang Hu, Zhaoguang Yang, Lifeng Zhang
Abstract:
Fecal coliform bacteria are widely used as indicators of sewage contamination in surface water. However, these microbial techniques have some disadvantages, including being time-consuming (18-48 h) and unable to discriminate between human and animal sources of fecal material. Therefore, it is necessary to seek a more specific indicator of human sanitary waste. In this study, the feasibility of applying caffeine and human pharmaceutical compounds to identify human-source contamination was investigated, and the correlation between caffeine and fecal coliform was explored. Surface water samples were collected from upstream, middle-stream, and downstream points along Rochor Canal, as well as 8 locations in Marina Bay, and analyzed by culture and chemical methods. Results indicate that caffeine is a suitable chemical tracer in Singapore because of its easy detection (in the range of 0.30-2.0 ng/mL) compared with the other chemicals monitored. The relatively low concentrations of human pharmaceutical compounds (< 0.07 ng/mL) in Rochor Canal and Marina Bay water samples make them hard to detect and difficult to use as chemical tracers; however, their existence can help to validate sewage contamination. In addition, a high correlation was discovered between caffeine concentration and fecal coliform density in the Rochor Canal water samples, demonstrating that caffeine is highly related to human-source contamination.
Keywords: Caffeine, human pharmaceutical compounds, chemical tracer, sewage contamination.
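The caffeine-coliform relationship reported above is a straightforward correlation analysis. A minimal sketch with hypothetical paired measurements (the values below are invented for illustration, not the study's data):

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired measurements from canal sampling points.
caffeine = np.array([0.35, 0.48, 0.90, 1.30, 1.75, 2.00])   # ng/mL
coliform = np.array([120, 180, 450, 800, 1300, 1600])       # CFU/100 mL

# Coliform counts are roughly log-normally distributed, so it is common
# to correlate the tracer against log densities.
r, p = pearsonr(caffeine, np.log10(coliform))
print(f"r = {r:.2f}, p = {p:.3f}")
```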
165 Skin Lesion Segmentation Using Color Channel Optimization and Clustering-based Histogram Thresholding
Authors: Rahil Garnavi, Mohammad Aldeen, M. Emre Celebi, Alauddin Bhuiyan, Constantinos Dolianitis, George Varigos
Abstract:
Automatic segmentation of skin lesions is the first step towards automated analysis of malignant melanoma. Although numerous segmentation methods have been developed, few studies have focused on determining the most effective color space for the melanoma application. This paper proposes an automatic segmentation algorithm based on color space analysis and clustering-based histogram thresholding, a process which is able to determine the optimal color channel for detecting borders in dermoscopy images. The algorithm is tested on a set of 30 high-resolution dermoscopy images. A comprehensive evaluation of the results is provided, in which borders manually drawn by four dermatologists are compared to the automated borders detected by the proposed algorithm, applying three previously used metrics of accuracy, sensitivity, and specificity and a new metric of similarity. By performing ROC analysis and ranking the metrics, it is demonstrated that the best results are obtained with the X and XoYoR color channels, resulting in an accuracy of approximately 97%. The proposed method is also compared with two state-of-the-art skin lesion segmentation methods.
Keywords: Border detection, color space analysis, dermoscopy, histogram thresholding, melanoma, segmentation.
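Clustering-based histogram thresholding splits the chosen channel's histogram into two classes; Otsu's method is the classic instance and is used here as a stand-in sketch (the paper's color channel optimization step is assumed already done):

```python
import numpy as np

def otsu_threshold(channel, bins=256):
    """Classic Otsu threshold: pick the cut that maximizes between-class
    variance of the channel histogram, one common form of clustering-based
    histogram thresholding."""
    hist, edges = np.histogram(channel, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                      # class-0 weight per candidate cut
    m = np.cumsum(p * centers)             # cumulative mean
    mt = m[-1]                             # total mean
    with np.errstate(divide='ignore', invalid='ignore'):
        between = (mt * w0 - m) ** 2 / (w0 * (1 - w0))
    return centers[np.nanargmax(between)]

# Synthetic bimodal channel: dark lesion pixels against brighter skin.
rng = np.random.default_rng(5)
lesion_channel = np.concatenate([rng.normal(60, 10, 5000),
                                 rng.normal(160, 15, 15000)])
t = otsu_threshold(lesion_channel)
mask = lesion_channel < t                  # darker pixels -> lesion
print(t, mask.mean())
```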
164 Certain Data Dimension Reduction Techniques for application with ANN based MCS for Study of High Energy Shower
Authors: Gitanjali Devi, Kandarpa Kumar Sarma, Pranayee Datta, Anjana Kakoti Mahanta
Abstract:
Cosmic showers, from their places of origin in space, generate secondary particles called Extensive Air Showers (EAS) after entering the Earth's atmosphere. Detection and analysis of EAS and similar high energy particle showers involve a plethora of experimental setups and constraints, for which soft-computational tools like Artificial Neural Networks (ANNs) can be adopted. The optimality of ANN classifiers can be enhanced further by the use of a Multiple Classifier System (MCS) and certain data-dimension reduction techniques. This work describes the performance of such techniques, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA), and Self Organizing Map (SOM) approximators, for application with an MCS formed using a Multi Layer Perceptron (MLP), a Recurrent Neural Network (RNN), and a Probabilistic Neural Network (PNN). The data inputs are obtained from an array of detectors placed in a circular arrangement resembling a practical detector grid; they have a higher dimension and greater correlation among themselves. The PCA, ICA, and SOM blocks reduce the correlation and generate a form suitable for real-time practical applications: prediction of the primary energy and location of the EAS from density values captured using detectors in a circular grid.
Keywords: EAS, shower, core, ANN, location.
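A minimal sketch of the PCA and ICA front-ends on a synthetic correlated detector array (dimensions and variance threshold are illustrative; the SOM block and the MLP/RNN/PNN classifiers are omitted):

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# Hypothetical detector grid: 2000 shower events x 64 correlated densities.
rng = np.random.default_rng(6)
latent = rng.normal(size=(2000, 8))
densities = latent @ rng.normal(size=(8, 64)) + 0.05 * rng.normal(size=(2000, 64))

# PCA front-end: keep the components covering 99% of the variance.
pca = PCA(n_components=0.99).fit(densities)
x_pca = pca.transform(densities)

# ICA front-end: statistically independent components of fixed dimension.
x_ica = FastICA(n_components=8, random_state=0, max_iter=1000).fit_transform(densities)

# Either reduced representation then feeds the MCS classifiers.
print(densities.shape, x_pca.shape, x_ica.shape)
```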
163 A Pairwise-Gaussian-Merging Approach: Towards Genome Segmentation for Copy Number Analysis
Authors: Chih-Hao Chen, Hsing-Chung Lee, Qingdong Ling, Hsiao-Jung Chen, Sun-Chong Wang, Li-Ching Wu, H.C. Lee
Abstract:
Segmentation, filtering out of measurement errors, and identification of breakpoints are integral parts of any analysis of microarray data for the detection of copy number variation (CNV). Existing algorithms designed for these tasks have had some success in the past, but they tend to be O(N^2) in either computation time or memory requirement, or both, and the rapid advance of microarray resolution has practically rendered such algorithms useless. Here we propose an algorithm, SAD, that is much faster and far less demanding of memory, O(N) in both computation time and memory requirement, and offers higher accuracy. The two key ingredients of SAD are the fundamental assumption in statistics that measurement errors are normally distributed and the mathematical relation that the product of two Gaussians is another Gaussian (function). We have produced a computer program for analyzing CNV based on SAD. In addition to being fast and small, it offers two important features: quantitative statistics for predictions and, with only two user-decided parameters, ease of use. Its speed shows little dependence on the genomic profile. Running on an average modern computer, it completes CNV analyses for a 262 thousand-probe array in about 1 second and for a 1.8 million-probe array in 9 seconds.
Keywords: Cancer, pathogenesis, chromosomal aberration, copy number variation, segmentation analysis.
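The product-of-Gaussians relation gives the merge rule directly: precisions add and means combine precision-weighted. A simplified single-pass reading of the idea (the z-threshold and noise model are assumptions, not the authors' exact algorithm):

```python
import numpy as np

def merge(m1, v1, m2, v2):
    """Product of two Gaussians is (up to scale) another Gaussian:
    precisions add, means combine precision-weighted."""
    v = 1.0 / (1.0 / v1 + 1.0 / v2)
    return v * (m1 / v1 + m2 / v2), v

def segment(probes, noise_var, z=3.0):
    """One O(N) pass: fold each probe into the current segment's Gaussian
    unless it deviates by more than z standard errors, which declares a
    breakpoint. A simplified sketch of the SAD idea."""
    segments, (m, v), start = [], (probes[0], noise_var), 0
    for i, x in enumerate(probes[1:], 1):
        if abs(x - m) > z * np.sqrt(v + noise_var):
            segments.append((start, i, m))
            start, (m, v) = i, (x, noise_var)
        else:
            m, v = merge(m, v, x, noise_var)
    segments.append((start, len(probes), m))
    return segments

rng = np.random.default_rng(7)
signal = np.concatenate([rng.normal(0.0, 0.1, 500), rng.normal(0.8, 0.1, 300)])
print(segment(signal, 0.01))   # expect a breakpoint near index 500
```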
162 An Efficient Watermarking Method for MP3 Audio Files
Authors: Dimitrios Koukopoulos, Yiannis Stamatiou
Abstract:
In this work, we present, to the best of our knowledge for the first time, an efficient digital watermarking scheme for MPEG audio layer 3 files that operates directly in the compressed data domain, while manipulating the time and subband/channel domain. In addition, it does not need the original signal to detect the watermark. Our scheme was implemented taking special care for the efficient usage of the two limited resources of computer systems: time and space. It offers the industrial user the capability of watermark embedding and detection in time immediately comparable to the real music time of the original audio file, depending on the MPEG compression, while the end user/audience does not face any artifacts or delays when hearing the watermarked audio file. Furthermore, it overcomes the disadvantage of algorithms operating in the PCM-data domain of being vulnerable to compression/recompression attacks, as it places the watermark in the scale-factors domain and not in the digitized sound data. The strength of our scheme, which allows it to be used with success in both authentication and copyright protection, relies on the fact that ownership of the audio file is established not simply by detecting the bit pattern that comprises the watermark itself, but by showing that the legal owner knows a hard-to-compute property of the watermark.
Keywords: Audio watermarking, MPEG audio layer 3, hard instance generation, NP-completeness.
161 High Accuracy ESPRIT-TLS Technique for Wind Turbine Fault Discrimination
Authors: Saad Chakkor, Mostafa Baghouri, Abderrahmane Hajraoui
Abstract:
The ESPRIT-TLS method appears to be a good choice for high-resolution fault detection in induction machines, with very high effectiveness in frequency and amplitude identification. However, it presents a high computational complexity, which hinders its implementation in real-time fault diagnosis. To avoid this problem, a Fast-ESPRIT algorithm combining an IIR band-pass filtering technique, a decimation technique, and the original ESPRIT-TLS method is employed to extract frequencies and their magnitudes from the wind turbine stator current accurately and at lower computational cost. The proposed algorithm has been applied to address the wind turbine's need for online, fast, and proactive condition monitoring. This type of remote and periodic maintenance provides an acceptable machine lifetime, minimizes downtime, and maximizes productivity. The developed technique has been evaluated by computer simulations under many fault scenarios. The results prove the performance of Fast-ESPRIT, offering rapid, high-resolution harmonic recognition with minimal computation time and memory cost.
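The front-end that makes Fast-ESPRIT fast is easy to sketch: band-pass around the band of interest, then decimate so the costly subspace step sees far fewer samples. A sketch with a synthetic stator current (frequencies, rates, and amplitudes are illustrative):

```python
import numpy as np
from scipy.signal import butter, filtfilt, decimate

fs = 10_000                                  # stator-current sampling rate (Hz)
t = np.arange(0, 1, 1 / fs)
current = (np.sin(2 * np.pi * 50 * t)        # fundamental
           + 0.02 * np.sin(2 * np.pi * 85 * t)   # hypothetical fault harmonic
           + 0.01 * np.random.default_rng(8).normal(size=t.size))

# 1) IIR band-pass around the band of interest (here 70-100 Hz).
b, a = butter(4, [70, 100], btype='bandpass', fs=fs)
band = filtfilt(b, a, current)

# 2) Decimate in stages: the narrow band allows a 40x lower rate, so the
#    costly ESPRIT-TLS eigendecomposition runs on 40x fewer samples.
reduced = decimate(decimate(band, 8), 5)

# 3) ESPRIT-TLS (not shown) is then applied to `reduced` to estimate the
#    fault frequency and amplitude at full resolution but reduced cost.
print(current.size, reduced.size)
```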
Keywords: Spectral Estimation, ESPRIT-TLS, Real Time, Diagnosis, Wind Turbine Faults, Band-Pass Filtering, Decimation.
160 Event Information Extraction System (EIEE): FSM vs HMM
Authors: Shaukat Wasi, Zubair A. Shaikh, Sajid Qasmi, Hussain Sachwani, Rehman Lalani, Aamir Chagani
Abstract:
Automatic extraction of event information from social text streams (emails, social network sites, blogs, etc.) is a vital requirement for many applications, such as event planning and management systems and security applications. The key information components needed from event-related text are the event title, location, participants, date, and time. Emails have very distinct characteristics compared with other social text streams in terms of layout, format, and conversation style, and they are the most commonly used communication channel for broadcasting and planning events; we have therefore chosen emails as our dataset. In our work, we have employed two statistical NLP methods, Finite State Machines (FSM) and the Hidden Markov Model (HMM), for the extraction of event-related contextual information. An application has been developed that compares the two methods on the event extraction task. It comprises two modules, one for each method, and works for both bulk and direct user input. The results are evaluated using precision, recall, and F-score. Experiments show that both methods produce high performance and accuracy; however, HMM was better at title extraction, while FSM proved better for venue, date, and time.
Keywords: Emails, event extraction, event detection, finite state machines, Hidden Markov Model.
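An HMM extractor of this kind tags each email token with a hidden state (e.g., TITLE, VENUE, DATE, or other) and decodes with the Viterbi algorithm. A toy sketch with illustrative, untrained probabilities:

```python
import numpy as np

# Toy HMM decoder for event-field labeling: hidden states tag each email
# token; the probabilities here are illustrative, not trained values.
states = ["O", "TITLE", "VENUE", "DATE"]
start = np.log([0.7, 0.1, 0.1, 0.1])
trans = np.log([[0.7, 0.1, 0.1, 0.1],      # states tend to persist, which
                [0.3, 0.6, 0.05, 0.05],    # captures multi-token fields
                [0.3, 0.05, 0.6, 0.05],
                [0.3, 0.05, 0.05, 0.6]])

def viterbi(emission_logprobs):
    """emission_logprobs: (n_tokens, n_states). Returns the best state path."""
    n, k = emission_logprobs.shape
    dp = start + emission_logprobs[0]
    back = np.zeros((n, k), dtype=int)
    for t in range(1, n):
        cand = dp[:, None] + trans + emission_logprobs[t]
        back[t] = cand.argmax(axis=0)
        dp = cand.max(axis=0)
    path = [int(dp.argmax())]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return [states[s] for s in reversed(path)]

emis = np.log(np.full((5, 4), 0.25))   # uninformative emissions, 5 tokens
print(viterbi(emis))                   # defaults to the 'O' state
```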