Search results for: Collision Detection
192 A Robust Diverged Localization and Recognition of License Registration Characters
Authors: M. Sankari, R. Bremananth, C. Meena
Abstract:
Localization and recognition of license registration characters from a moving vehicle is a computationally complex task in the field of machine vision and is of substantial interest because of its diverse applications such as cross-border security, law enforcement and various other intelligent transportation applications. Previous research used plate-specific details such as aspect ratio, character style, color or dimensions of the plate in the complex task of plate localization. In this paper, license registration characters are localized by an Enhanced Weight Based Density Map (EWBDM) method, which is independent of such constraints. In connection with our previous method, this paper proposes a method that relaxes constraints on lighting conditions, the different character fonts occurring in the plate, and plates with hand-drawn characters in various aspect ratios. The robustness of this method makes it well suited for applications where the appearance of plates varies widely. Experimental results show that this approach is suited for recognizing license plates in different external environments.
Keywords: Character segmentation, Connectivity checking, Edge detection, Image analysis, license plate localization, license number recognition, Quality frame selection
191 Optimization of Thermopile Sensor Performance of Polycrystalline Silicon Film
Authors: Li Long, Thomas Ortlepp
Abstract:
A theoretical model for the optimization of thermopile sensor performance is developed for thermoelectric-based infrared radiation detection. It is shown that the performance of a polycrystalline silicon film thermopile sensor can be optimized according to the thermoelectric quality factor, the sensor layer structure factor and the sensor layout shape factor. Based on the properties of electrons, phonons, grain boundaries and their interactions, the thermoelectric quality factor of polycrystalline silicon is analyzed with the relaxation time approximation of the Boltzmann transport equation. The model includes the effects of grain structure, grain boundary trap properties and doping concentration. The layer structure factor of the sensor is analyzed with respect to the infrared absorption coefficient. The effect of layout design is characterized with the shape factor, which is calculated for different sensor designs. Double-layer polycrystalline silicon thermopile infrared sensors on a suspended support membrane have been designed and fabricated with a CMOS-compatible process. The theoretical approach is confirmed with measurement results.
Keywords: Polycrystalline silicon film, relaxation time approximation, specific detectivity, thermal conductivity, thermopile infrared sensor.
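To make the quality-factor and detectivity notions above concrete, the following Python sketch evaluates the thermoelectric figure of merit Z = S²σ/κ and a Johnson-noise-limited specific detectivity; every numeric value (Seebeck coefficient, conductivities, responsivity, resistance, area) is an assumed placeholder, not a parameter from the paper.

```python
import math

# Illustrative material and device values for a polysilicon thermopile leg; the
# real optimum depends on grain structure, trap properties and doping (see abstract).
seebeck = 190e-6      # Seebeck coefficient S [V/K] (assumed)
sigma = 2.0e4         # electrical conductivity [S/m] (assumed)
kappa = 25.0          # thermal conductivity [W/(m K)] (assumed)
T = 300.0             # operating temperature [K]

# Thermoelectric quality (figure of merit) of the thermopile material
Z = seebeck**2 * sigma / kappa
print(f"Z = {Z:.3e} 1/K, ZT = {Z * T:.4f}")

# Johnson-noise-limited specific detectivity of the whole sensor, assuming a
# known voltage responsivity R_v, element resistance R and absorber area A.
k_B = 1.380649e-23    # Boltzmann constant [J/K]
R_v = 120.0           # responsivity [V/W] (assumed)
R = 50e3              # thermopile resistance [Ohm] (assumed)
A = (200e-6) ** 2     # absorber area [m^2] (assumed)
v_noise = math.sqrt(4.0 * k_B * T * R)       # noise density [V/sqrt(Hz)]
D_star = R_v * math.sqrt(A) / v_noise        # specific detectivity [m*sqrt(Hz)/W]
print(f"D* = {D_star:.3e} m*Hz^0.5/W")
```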
190 Harnessing the Power of AI: Transforming DevSecOps for Enhanced Cloud Security
Authors: Ashly Joseph, Jithu Paulose
Abstract:
The increased usage of cloud computing has revolutionized the IT landscape, but it has also raised new security concerns. DevSecOps emerged as a way of tackling these difficulties by integrating security into the software development process. However, the rising complexity and sophistication of cyber threats demand more advanced solutions. This paper looks into the usage of artificial intelligence (AI) techniques in the DevSecOps framework to increase cloud security. This study uses quantitative and qualitative techniques to assess the usefulness of AI approaches such as machine learning, natural language processing, and deep learning in reducing security issues. This paper thoroughly examines the symbiotic relationship between AI and DevSecOps, concentrating on how AI may be seamlessly integrated into the continuous integration and continuous delivery (CI/CD) pipeline, automated security testing, and real-time monitoring methods. The findings emphasize AI's huge potential to improve threat detection, risk assessment, and incident response capabilities. Furthermore, the paper examines the implications and challenges of using AI in DevSecOps workflows, considering factors such as scalability, interpretability, and adaptability. This paper adds to a better understanding of AI's revolutionary role in cloud security and provides valuable insights for practitioners and scholars in the field.
Keywords: Cloud Security, DevSecOps, Artificial Intelligence, AI, Machine Learning, Natural Language Processing, NLP, cybersecurity, AI-driven Security.
189 Analysis of Reflectance Photoplethysmograph Sensors
Authors: Fu-Hsuan Huang, Po-Jung Yuan, Kang-Ping Lin, Hen-Hong Chang, Cheng-Lun Tsai
Abstract:
Photoplethysmography is a simple measurement of the variation in blood volume in tissue. It detects the pulse signal of the heart beat as well as the low-frequency signal of vasoconstriction and vasodilation. The transmission-type measurement is limited to only a few specific positions, for example the index finger, that have a short path length for light. The reflectance-type measurement can be conveniently applied on most parts of the body surface. This study analyzed the factors that determine the quality of the reflectance photoplethysmograph signal, including the emitter-detector distance, wavelength, light intensity, and optical properties of skin tissue. Light emitting diodes (LEDs) with four different visible wavelengths were used as the light emitters. A phototransistor was used as the light detector. A micro translation stage adjusts the emitter-detector distance from 2 mm to 15 mm. The reflective photoplethysmograph signals were measured on different sites. The optimal emitter-detector distance was chosen to have a large dynamic range for low-frequency drifting without signal saturation and a high perfusion index. Among these four wavelengths, a yellowish green (571 nm) light with a proper emitter-detector distance of 2 mm is the most suitable for obtaining a steady and reliable reflectance photoplethysmograph signal.
Keywords: Reflectance photoplethysmograph, Perfusion index, Signal-to-noise ratio
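For readers unfamiliar with the perfusion index used above, here is a minimal Python sketch of one common definition (pulsatile AC amplitude over the DC level of the detector output); the band limits, sampling rate and synthetic signal are illustrative assumptions, not the study's measurement protocol.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def perfusion_index(ppg, fs, pulse_band=(0.5, 5.0)):
    """Perfusion index: pulsatile (AC) amplitude divided by the non-pulsatile
    (DC) level of the photoplethysmograph output, expressed in percent."""
    b, a = butter(2, [pulse_band[0] / (fs / 2), pulse_band[1] / (fs / 2)], btype="band")
    ac = filtfilt(b, a, ppg)                 # cardiac-band component
    dc = np.mean(ppg)                        # slowly varying baseline
    return 100.0 * (ac.max() - ac.min()) / dc

# Illustrative synthetic signal: a 1.2 Hz pulse riding on a large DC baseline.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
ppg = 1.0 + 0.02 * np.sin(2 * np.pi * 1.2 * t)
print(perfusion_index(ppg, fs))              # about 4 %
```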
188 Recognition Machine (RM) for On-line and Isolated Flight Deck Officer (FDO) Gestures
Authors: Deniz T. Sodiri, Venkat V S S Sastry
Abstract:
The paper presents an on-line recognition machine (RM) for continuous/isolated, dynamic and static gestures that arise in Flight Deck Officer (FDO) training. RM is based on a generic pattern recognition framework. Gestures are represented as templates using summary statistics. The proposed recognition algorithm exploits temporal and spatial characteristics of gestures via dynamic programming and a Markovian process. The algorithm predicts the corresponding index of incremental input data in the templates in an on-line mode. Accumulated consistency in the sequence of predictions provides a similarity measurement (Score) between the input data and the templates. The algorithm provides an intuitive mechanism for automatic detection of start/end frames of continuous gestures. In the present paper, we consider isolated gestures. The performance of RM is evaluated using four datasets: artificial (W TTest), hand motion (Yang) and FDO (tracker, vision-based). RM achieves comparable results which are in agreement with other on-line and off-line algorithms such as hidden Markov models (HMM) and dynamic time warping (DTW). The proposed algorithm has the additional advantage of providing timely feedback for training purposes.
Keywords: On-line recognition algorithm, isolated dynamic/static gesture recognition, on-line Markovian/dynamic programming, training in virtual environments.
187 In Search of an SVD and QRcp Based Optimization Technique of ANN for Automatic Classification of Abnormal Heart Sounds
Authors: Samit Ari, Goutam Saha
Abstract:
Artificial Neural Networks (ANN) have been extensively used for classification of heart sounds because of their discriminative training ability and easy implementation. However, they suffer from over-parameterization if the number of nodes is not chosen properly. In such cases, when the dataset has redundancy within it, the ANN is trained along with this redundant information, which results in poor validation. Also, a larger network means more computational expense, resulting in more hardware- and time-related cost. Therefore, an optimum design of the neural network is needed for real-time detection of pathological patterns, if any, from the heart sound signal. The aims of this work are to (i) select a set of input features that are effective for identification of heart sound signals and (ii) make an optimum selection of nodes in the hidden layer for a more effective ANN structure. Here, we present an optimization technique that involves Singular Value Decomposition (SVD) and QR factorization with column pivoting (QRcp) to optimize an empirically chosen over-parameterized ANN structure. Input nodes present in the ANN structure are optimized by SVD followed by QRcp, while only SVD is required to prune undesirable hidden nodes. The result is presented for classifying 12 common pathological cases and normal heart sound.
Keywords: ANN, Classification of heart diseases, murmurs, optimization, Phonocardiogram, QRcp, SVD.
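A generic sketch of the SVD-plus-QRcp selection idea described above, assuming nothing about the authors' exact pruning criterion: SVD estimates how many input directions carry the data's energy, and QR with column pivoting then ranks the original input features; the energy threshold and toy data are illustrative.

```python
import numpy as np
from scipy.linalg import qr, svd

def select_inputs(X, energy=0.99):
    """Rank input features of a training matrix X (samples x features) using
    SVD followed by QR factorization with column pivoting (QRcp)."""
    s = svd(X, compute_uv=False)                       # singular values
    cum = np.cumsum(s**2) / np.sum(s**2)               # cumulative energy
    r = int(np.searchsorted(cum, energy)) + 1          # effective rank
    _, _, piv = qr(X, mode="economic", pivoting=True)  # pivot order = feature ranking
    return piv[:r]                                     # indices of retained input nodes

# Toy example: feature 2 duplicates feature 0, so only two inputs survive.
rng = np.random.default_rng(1)
X = rng.random((200, 3))
X[:, 2] = X[:, 0]
print(select_inputs(X))
```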
186 The Comparation of Activation Nuclear Factor Kappa Beta (NFKB) at Rattus Novergicus Strain Wistar Induced by Various Duration High Fat Diet (HFD)
Authors: Titin Andri Wihastuti, Djanggan Sargowo
Abstract:
NFκB is a transcription factor regulating many functions of the vessel wall. In the normal condition, NFκB shows diffuse cytoplasmic expression, suggesting that the system is inactive. Activation of NFκB provides a potential pathway for the rapid transcription of a variety of genes encoding cytokines, growth factors, adhesion molecules and procoagulatory factors, and it is likely to play an important role in chronic inflammatory diseases such as atherosclerosis. There are many stimuli with the potential to activate NFκB, including hyperlipidemia. We used 24 animals, which were divided into 6 groups. The HFD was given ad libitum for 2, 4, and 6 months. The parameters in this study were the amount of NFκB activation, H2O2 as a marker of ROS, and VCAM-1 as a product of NFκB activation. The H2O2 colorimetric assay was performed directly using an Anti-Rat H2O2 ELISA Kit. NFκB and VCAM-1 were detected in aorta samples and measured by ELISA kit and immunohistochemistry. There was a significant difference in H2O2, NFκB and VCAM-1 levels after 2, 4 and 6 months of HFD. This suggests that HFD induces ROS formation and increases the activation of NFκB as an atherosclerosis marker, caused by hyperlipidemia as a classical atherosclerosis risk factor.
Keywords: High fat diet, NFκB, H2O2, atherosclerosis
185 Urban Land Cover Change of Olomouc City Using LANDSAT Images
Authors: Miloš Marjanović, Jaroslav Burian, Jakub Miřijovský, Jan Harbula
Abstract:
This paper examines the phenomena of intensive suburbanization and urbanization in Olomouc city, and in the Olomouc region in general, for the period 1986–2009. A Remote Sensing approach that involves tracking changes in Land Cover units is proposed to quantify the urbanization state and trends in temporal and spatial aspects. It consisted of two approaches, Experiment 1 and Experiment 2, which applied two different image classification solutions in order to provide Land Cover maps for each time split available in the 1986–2009 Landsat image set. Experiment 1 dealt with unsupervised classification, while Experiment 2 involved semi-supervised classification, using a combination of object-based and pixel-based classifiers. The resulting Land Cover maps were subsequently quantified for the proportion of the urban area unit and its trend through time, and also for the urban area unit stability, yielding the relation of spatial and temporal development of the urban area unit. Some outcomes seem promising, but there is indisputably room for improvement in the source data as well as in processing and filtering.
Keywords: Change detection, image classification, land cover, Landsat images, Olomouc city, urbanization.
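As a minimal sketch of the unsupervised route (the Experiment 1 style above), the following Python snippet clusters the spectral vectors of a Landsat band stack with k-means and then measures the share of one class, e.g. the urban unit, per time split; the class count is assumed, and mapping a cluster to "urban" is left to the analyst, as in any unsupervised classification.

```python
import numpy as np
from sklearn.cluster import KMeans

def land_cover_map(bands, n_classes=5, seed=0):
    """Unsupervised land-cover classification of a Landsat scene stored as an
    array of shape (rows, cols, n_bands); returns a per-pixel class label map
    produced by k-means clustering of the spectral vectors."""
    rows, cols, n_bands = bands.shape
    pixels = bands.reshape(-1, n_bands).astype(np.float32)
    labels = KMeans(n_clusters=n_classes, n_init=10, random_state=seed).fit_predict(pixels)
    return labels.reshape(rows, cols)

def class_proportion(label_map, class_id):
    """Proportion of the scene covered by one land-cover class (e.g. the urban
    unit), so its trend can be followed across the time splits."""
    return float(np.mean(label_map == class_id))
```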
184 Intrinsic Electromagnetic Fields and Atom-Field Coupling in Living Cells
Authors: Masroor H. S. Bukhari, Z. H. Shah
Abstract:
The possibility of intrinsic electromagnetic fields within living cells and their resonant self-interaction and interaction with ambient electromagnetic fields is suggested on the basis of a theoretical and experimental study. It is reported that intrinsic electromagnetic fields are produced in the form of radio-frequency and infra-red photons within atoms (which may be coupled or uncoupled) in cellular structures, such as the cell cytoskeleton and plasma membrane. A model is presented for the interaction of these photons among themselves or with atoms under a dipole-dipole coupling, induced by single-photon or two-photon processes. This resonance is manifested by conspicuous field amplification and it is argued that it is possible for these resonant photons to undergo tunnelling in the form of evanescent waves to a short range (of a few nanometres to micrometres). This effect, suggested as a resonant photon tunnelling mechanism in this report, may enable these fields to act as intracellular signal communication devices and as bridges between macromolecules or cellular structures in the cell cytoskeleton, organelles or membrane. A brief overview of an experimental technique and a review of some preliminary results are presented, in the detection of these fields produced in living cell membranes under physiological conditions.
Keywords: bioelectromagnetism, cell membrane, evanescent waves, photon tunnelling, resonance
183 Human Face Detection and Segmentation using Eigenvalues of Covariance Matrix, Hough Transform and Raster Scan Algorithms
Authors: J. Prakash, K. Rajesh
Abstract:
In this paper we propose a novel method for human face segmentation using the elliptical structure of the human head. It makes use of the information present in the edge map of the image. In this approach we use the fact that the eigenvalues of the covariance matrix represent the elliptical structure. The large and small eigenvalues of the covariance matrix are associated with the major and minor axial lengths of an ellipse. The other elliptical parameters are used to identify the centre and orientation of the face. Since an Elliptical Hough Transform requires a 5D Hough space, the Circular Hough Transform (CHT) is used to evaluate the elliptical parameters. A sparse matrix technique is used to perform the CHT, as it squeezes out zero elements and stores only the small number of non-zero elements, thereby having the advantage of less storage space and computational time. A neighborhood suppression scheme is used to identify the valid Hough peaks. The accurate position of the circumference pixels for occluded and distorted ellipses is identified using Bresenham's Raster Scan Algorithm, which uses geometrical symmetry properties. This method does not require the evaluation of tangents for curvature contours, which are very sensitive to noise. The method has been evaluated on several images with different face orientations.
Keywords: Circular Hough Transform, Covariance matrix, Eigenvalues, Elliptical Hough Transform, Face segmentation, Raster Scan Algorithm.
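To make the covariance-eigenvalue step above concrete, here is a small Python sketch that recovers centre, axes and orientation from a set of boundary edge points; it covers only this step (not the CHT or the Bresenham raster scan), and the synthetic contour is an illustrative assumption.

```python
import numpy as np

def ellipse_from_edge_points(points):
    """Estimate ellipse parameters from (N, 2) edge-pixel coordinates.

    For points spread along the boundary, the large and small eigenvalues of
    the covariance matrix are proportional to the squared major and minor
    semi-axes, the dominant eigenvector gives the orientation, and the mean
    of the points gives the centre."""
    centre = points.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov((points - centre).T))  # ascending order
    minor, major = np.sqrt(2.0 * eigvals)            # semi-axes for boundary samples
    angle = np.degrees(np.arctan2(eigvecs[1, -1], eigvecs[0, -1]))  # 180-deg ambiguity
    return centre, major, minor, angle

# Check on a synthetic contour: semi-axes 40 and 20, rotated by 30 degrees.
t = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
x, y = 40.0 * np.cos(t), 20.0 * np.sin(t)
rot = np.radians(30.0)
pts = np.c_[x * np.cos(rot) - y * np.sin(rot), x * np.sin(rot) + y * np.cos(rot)]
print(ellipse_from_edge_points(pts))             # ~((0, 0), 40, 20, ~30 or -150)
```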
182 Optical Fish Tracking in Fishways using Neural Networks
Authors: Alvaro Rodriguez, Maria Bermudez, Juan R. Rabuñal, Jeronimo Puertas
Abstract:
One of the main issues in Computer Vision is to extract the movement of one or several points or objects of interest in an image or video sequence to conduct any kind of study or control process. Different techniques to solve this problem have been applied in numerous areas such as surveillance systems, analysis of traffic, motion capture, image compression, navigation systems and others, where the specific characteristics of each scenario determine the approach to the problem. This paper puts forward a Computer Vision based algorithm to analyze fish trajectories in high-turbulence conditions in artificial structures called vertical slot fishways, designed to allow the upstream migration of fish through obstructions in rivers. The suggested algorithm calculates the position of the fish at every instant starting from images recorded with a camera, using neural networks to perform fish detection on the images. Different laboratory tests have been carried out in a full-scale fishway model and with live fish, allowing the reconstruction of the fish trajectory and the measurement of velocities and accelerations of the fish. These data can provide useful information to design more effective vertical slot fishways.
Keywords: Computer Vision, Neural Network, Fishway, Fish Trajectory, Tracking
181 A Watermarking Scheme for MP3 Audio Files
Authors: Dimitrios Koukopoulos, Yiannis Stamatiou
Abstract:
In this work, we present, to the best of our knowledge for the first time, an efficient digital watermarking scheme for MPEG audio layer 3 files that operates directly in the compressed data domain, while manipulating the time and subband/channel domain. In addition, it does not need the original signal to detect the watermark. Our scheme was implemented taking special care for the efficient usage of the two limited resources of computer systems: time and space. It offers the industrial user the capability of watermark embedding and detection in time immediately comparable to the real music time of the original audio file, depending on the MPEG compression, while the end user/audience does not face any artifacts or delays hearing the watermarked audio file. Furthermore, it overcomes the vulnerability of algorithms operating in the PCM data domain to compression/recompression attacks, as it places the watermark in the scale factors domain and not in the digitized sound audio data. The strength of our scheme, which allows it to be used with success in both authentication and copyright protection, relies on the fact that ownership of the audio file is established not simply by detecting the bit pattern that comprises the watermark itself, but by showing that the legal owner knows a hard-to-compute property of the watermark.
Keywords: Audio watermarking, MPEG audio layer 3, hard instance generation, NP-completeness.
180 A Robust and Adaptive Unscented Kalman Filter for the Air Fine Alignment of the Strapdown Inertial Navigation System/GPS
Authors: Jian Shi, Baoguo Yu, Haonan Jia, Meng Liu, Ping Huang
Abstract:
To adapt to the flexibility of modern warfare, a large number of guided weapons are launched from aircraft. Therefore, the inertial navigation system loaded in the weapon needs to undergo an alignment process in the air. This article proposes the following methods to address the problems of inaccurate modeling of the system under large misalignment angles, the reduction in filtering accuracy caused by outliers, and noise changes in GPS signals: first, considering the large misalignment errors of the Strapdown Inertial Navigation System (SINS)/GPS, a more accurate model is built rather than making a small-angle approximation, and the Unscented Kalman Filter (UKF) algorithm is used to estimate the state; then, taking into account the impact of GPS noise changes on the fine alignment algorithm, an innovation-based adaptive filtering algorithm is introduced to estimate the GPS noise in real time; at the same time, in order to improve the anti-interference ability of the air fine alignment algorithm, a robust filtering algorithm based on outlier detection is combined with the air fine alignment algorithm to improve its robustness. The algorithm can improve the alignment accuracy and robustness under interference conditions, which is verified by simulation.
Keywords: Air alignment, fine alignment, inertial navigation system, integrated navigation system, UKF.
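As background for the UKF mentioned above, here is a minimal Python sketch of its core building block, the unscented transform (sigma-point generation and propagation through a nonlinear model), which is what lets the filter avoid small-angle approximations; the scaling parameters are common defaults, and the sketch omits the paper's adaptive and robust extensions.

```python
import numpy as np

def sigma_points(mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
    """Generate the 2n+1 sigma points and weights of the unscented transform.

    mean: state vector (n,); cov: state covariance (n, n)."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    sqrt_cov = np.linalg.cholesky((n + lam) * cov)     # columns are the spread directions
    points = np.vstack([mean, mean + sqrt_cov.T, mean - sqrt_cov.T])
    w_m = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    w_c = w_m.copy()
    w_m[0] = lam / (n + lam)
    w_c[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    return points, w_m, w_c

def unscented_transform(points, w_m, w_c, f):
    """Propagate sigma points through a (possibly strongly nonlinear) model f,
    e.g. large-misalignment error dynamics, and recover mean and covariance."""
    y = np.array([f(p) for p in points])
    mean = w_m @ y
    diff = y - mean
    cov = (w_c[:, None] * diff).T @ diff
    return mean, cov
```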
179 Graph Codes-2D Projections of Multimedia Feature Graphs for Fast and Effective Retrieval
Authors: Stefan Wagenpfeil, Felix Engel, Paul McKevitt, Matthias Hemmje
Abstract:
Multimedia Indexing and Retrieval is generally designed and implemented by employing feature graphs. These graphs typically contain a significant number of nodes and edges to reflect the level of detail in feature detection. A higher level of detail increases the effectiveness of the results but also leads to more complex graph structures. However, graph-traversal-based algorithms for similarity are quite inefficient and computation intensive, especially for large data structures. To deliver fast and effective retrieval, an efficient similarity algorithm, particularly for large graphs, is mandatory. Hence, in this paper, we define a graph-projection into a 2D space (Graph Code) as well as the corresponding algorithms for indexing and retrieval. We show that calculations in this space can be performed more efficiently than graph-traversals due to a simpler processing model and a high level of parallelisation. In consequence, we prove that the effectiveness of retrieval also increases substantially, as Graph Codes facilitate more levels of detail in feature fusion. Thus, Graph Codes provide a significant increase in efficiency and effectiveness (especially for Multimedia indexing and retrieval) and can be applied to images, videos, audio, and text information.
Keywords: indexing, retrieval, multimedia, graph code, graph algorithm
178 Satellite Sensing for Evaluation of an Irrigation System in Cotton - Wheat Zone
Authors: Sadia Iqbal, Faheem Iqbal, Furqan Iqbal
Abstract:
Efficient utilization of existing water is a pressing need for Pakistan, due to its rising population, the reduction in present storage capacity and poor delivery efficiency of 30 to 40% from canals. A study was conducted to evaluate an irrigation system in the cotton-wheat zone of Pakistan after watercourse lining. The evaluation is made on the basis of cropping pattern and salinity. This study employed an index-based approach using a Geographic Information System with field data. Satellite images from different years were used to examine the affected area. Several combinations of the ratio of signals received in different spectral bands were used for the development of this index. The Near Infrared and Thermal IR spectral bands proved to be the most effective, as this combination allowed easy detection of the salt-affected area and the cropping pattern of the study area. Results showed that 9.97% of the area was under salinity in 1992, 9.17% in 2000, and only 2.29% remained in 2005. Similarly, in 1992, 45% of the area was under vegetation, which improved to 56% and 65% in 2000 and 2005, respectively. On the basis of these results, a 30% performance increase was found after the watercourse improvement.
Keywords: Salinity, remote sensing index, salinity index, cropping pattern.
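The index construction above boils down to per-pixel ratios of band signals; a minimal Python sketch of that ratio form and of the per-year affected-area percentage is given below, with the caveat that the exact band combination and threshold used by the study are its own and are treated here as assumptions.

```python
import numpy as np

def normalized_difference(band_a, band_b):
    """Generic normalized-difference ratio used to build per-pixel indices from
    two bands, e.g. a salinity index from the NIR and thermal-IR signals
    highlighted in the study; only the ratio form is shown here."""
    a = band_a.astype(np.float32)
    b = band_b.astype(np.float32)
    return (a - b) / np.clip(a + b, 1e-6, None)   # guard against zero denominators

def affected_fraction(index_map, threshold):
    """Percentage of the scene whose index exceeds a chosen threshold,
    e.g. the salt-affected share reported per year (9.97% in 1992, ...)."""
    return 100.0 * float(np.mean(index_map > threshold))
```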
177 Detection of Arcobacter and Helicobacter pylori Contamination in Organic Vegetables by Cultural and PCR Methods
Authors: Miguel García-Ferrús, Ana González, María A. Ferrús
Abstract:
The most demanded organic foods worldwide are those that are consumed fresh, such as fruits and vegetables. However, there is a knowledge gap about some aspects of organic food microbiological quality and safety. Organic fruits and vegetables are more exposed to pathogenic microorganisms due to surface contact with natural fertilizers such as animal manure, wastes and vermicompost used during farming. Therefore, the objective of this work was to study the contamination of organic fresh green leafy vegetables by two emergent pathogens, Arcobacter spp. and Helicobacter pylori. For this purpose, a total of 24 vegetable samples, 13 lettuce and 11 spinach, were acquired from 10 different ecological supermarkets and greengrocers and analyzed by culture and PCR. Arcobacter spp. was detected in five samples (20%) by PCR, four spinach and one lettuce. One spinach sample was also found to be positive by culture. For H. pylori, the H. pylori VacA gene-specific band was detected in 12 vegetable samples (50%), 10 lettuce and two spinach. Isolation in the selective medium did not yield any positive result, possibly because of low contamination levels together with the presence of the organism in its viable but non-culturable form. The results showed significant levels of H. pylori and Arcobacter contamination in organic vegetables that are generally consumed raw, which seems to confirm that these foods can act as transmission vehicles to humans.
Keywords: Arcobacter spp., Helicobacter pylori, organic vegetables, Polymerase Chain Reaction, PCR.
176 Fault and Theft Recognition Using Toro Dial Sensor in Programmable Current Relay for Feeder Security
Authors: R. Kamalakannan, N. Ravi Kumar
Abstract:
Feeder protection is important on the transmission and distribution side because, if any fault occurs in a feeder or transformer, manpower is needed to identify the problem, and this takes more time. In the existing system, directional overcurrent elements with load, further secured by a load encroachment function, can be used to provide the necessary security and sensitivity for faults at remote points in a circuit. This is validated only in renewable plant collector circuit protection applications over a wide range of operating conditions. In this method, directional overcurrent feeder protection is developed by monitoring the feeder section through the Internet. In this web-based monitoring, faults and power theft are identified using a Toro dial sensor, and its information is received by SCADA (Supervisory Control and Data Acquisition) and controlled by an ARM microcontroller. This web-based monitoring is also used to monitor feeder management, directional current detection, demand-side management and overload faults. This monitoring system is capable of monitoring the distribution feeder over a large area, depending upon the cost. It is also used to reduce power theft, time and manpower. The simulation is done with MATLAB software.
Keywords: Current sensor, distribution feeder protection, directional overcurrent, power theft, protective relay.
175 An Efficient Architecture for Interleaved Modular Multiplication
Authors: Ahmad M. Abdel Fattah, Ayman M. Bahaa El-Din, Hossam M.A. Fahmy
Abstract:
Modular multiplication is the basic operation in most public key cryptosystems, such as RSA, DSA, ECC, and DH key exchange. Unfortunately, very large operands (on the order of 1024 or 2048 bits) must be used to provide sufficient security strength. The use of such big numbers dramatically slows down the whole cipher system, especially when running on embedded processors. So far, customized hardware accelerators, developed on FPGAs or ASICs, have been the best choice for accelerating modular multiplication in embedded environments. On the other hand, many algorithms have been developed to speed up such operations. Examples are the Montgomery modular multiplication and the interleaved modular multiplication algorithms. Combining customized hardware with an efficient algorithm is expected to provide a much faster cipher system. This paper introduces an enhanced architecture for computing the modular multiplication of two large numbers X and Y modulo a given modulus M. The proposed design is compared with three previous architectures based on carry save adders and look-up tables. Look-up tables should be loaded with a set of pre-computed values. Our proposed architecture uses the same carry save addition, but replaces both look-up tables and pre-computations with an enhanced version of sign detection techniques. The proposed architecture supports higher frequencies than other architectures. It also has a better overall absolute time for a single operation.
Keywords: Montgomery multiplication, modular multiplication, efficient architecture, FPGA, RSA
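For reference, the classical interleaved (shift-and-add) modular multiplication that the architecture above accelerates can be written in a few lines of Python; the bit width and operand values in the check are illustrative, and the sketch says nothing about the paper's carry-save and sign-detection hardware.

```python
def interleaved_mod_mul(x: int, y: int, m: int, k: int) -> int:
    """Compute (x * y) mod m by interleaving shift-and-add with modular reduction.

    k is the operand bit width; x, y < m < 2**k is assumed."""
    p = 0
    for i in reversed(range(k)):   # scan multiplier bits, most significant first
        p <<= 1                    # shift the partial product
        if (y >> i) & 1:           # add the multiplicand when the bit is set
            p += x
        # at most two conditional subtractions keep p below m
        if p >= m:
            p -= m
        if p >= m:
            p -= m
    return p

# Illustrative check against Python's built-in modular arithmetic.
x, y, m = 0x12345678, 0x9ABCDEF0, (1 << 61) - 1
assert interleaved_mod_mul(x, y, m, 64) == (x * y) % m
```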
174 Effect of L-Arginine on Neuromuscular Transmission of the Chick Biventer Cervicis Muscle
Authors: S. Asadzadeh Vostakolaei
Abstract:
In this study, the effect of L-arginine was examined at the neuromuscular junction of the chick biventer cervicis muscle. L-Arginine at 500 μg/ml decreased the twitch response to electrical stimulation and produced a rightward shift of the dose–response curve for acetylcholine or carbachol. L-Arginine at 1000 μg/ml produced a strong rightward shift of the dose–response curve for acetylcholine or carbachol with a reduction in efficacy. The inhibitory effect of L-arginine on the twitch response was blocked by caffeine (200 μg/ml). NO levels were also measured in chick biventer cervicis muscle homogenates, using a spectrophotometric method for the direct detection of NO, nitrite and nitrate. Total nitrite (nitrite + nitrate) was measured by a spectrophotometer at 540 nm after the conversion of nitrate to nitrite by copperized cadmium granules. NO levels were found to be significantly increased at concentrations of 500 and 1000 μg/ml of L-arginine in comparison with the control group (p<0.001). These findings indicate a possible role of increased NO levels in the suppressive action of L-arginine on the twitch response. In addition, the results indicate that the post-junctional antagonistic action of L-arginine is probably the result of impaired sarcoplasmic reticulum (SR) Ca2+ release.
Keywords: Chick, L-Arginine, Nitric Oxide, Skeletal muscle.
173 A Study on the Developing Method of the BIM (Building Information Modeling) Software Based On Cloud Computing Environment
Authors: Byung-Kon Kim
Abstract:
As Architecture, Engineering and Construction (AEC) industry projects have grown larger and more complex, the use of BIM for 3D design and simulation has increased significantly. Therefore, typical applications of BIM, such as clash detection and alternative measures based on 3-dimensional planning, are being expanded to process management, cost and quantity management, structural analysis, regulation checking, and various other domains of virtual design and construction. Presently, commercial BIM software operates in a single-user environment, so the initial cost is high and the investment is frequently wasted. Cloud computing, a next-generation Internet technology, enables simple Internet devices (such as PCs, tablets, smart phones, etc.) to use the services and resources of BIM software. In this paper, we suggest a development method for BIM software based on a cloud computing environment in order to expand the utilization of BIM and reduce the cost of BIM software. First, as benchmarking, we surveyed successful cases of BIM and cloud computing. Then we analyzed the needs and opportunities of BIM and cloud computing in the AEC industry. Finally, we suggest the main functions of BIM software based on a cloud computing environment and develop a simple prototype of cloud computing BIM software for basic BIM model viewing.
Keywords: Construction IT, BIM(Building Information Modeling), Cloud Computing, BIM Service Based Cloud Computing, Viewer Based BIM Server, 3D Design.
172 Testing Loaded Programs Using Fault Injection Technique
Authors: S. Manaseer, F. A. Masooud, A. A. Sharieh
Abstract:
Fault tolerance is critical in many of today's large computer systems. This paper focuses on improving fault tolerance through testing. Moreover, it concentrates on memory faults: how to access the editable part of a process memory space and how this part is affected. A special Software Fault Injection Technique (SFIT) is proposed for this purpose. This is done by sequentially scanning the memory of the target process and trying to edit the maximum number of bytes inside that memory. The technique was implemented and tested on a group of programs in software packages such as jet-audio, Notepad, Microsoft Word, Microsoft Excel, and Microsoft Outlook. The results from the test sample process indicate that the size of the scanned area depends on several factors: process size, process type, and the virtual memory size of the machine under test. The results show that increasing the process size will increase the scanned memory space. They also show that input-output processes have a larger scanned area than other processes. Increasing the virtual memory size will also affect the size of the scanned area, but only up to a certain limit.
Keywords: Complex software systems, Error detection, Fault tolerance, Injection and testing methodology, Memory faults, Process and virtual memory.
171 The Role of Velocity Map Quality in Estimation of Intravascular Pressure Distribution
Authors: Ali Pashaee, Parisa Shooshtari, Gholamreza Atae, Nasser Fatouraee
Abstract:
Phase-contrast MR imaging methods are widely used for measurement of blood flow velocity components. There are also other tools, such as CT and ultrasound, for velocity map detection in intravascular studies. These data are used in deriving flow characteristics. Some clinical applications use the pressure distribution in the diagnosis of intravascular disorders such as vascular stenosis. In this paper an approach to the problem of measuring the intravascular pressure field using the velocity field obtained from flow images is proposed. The method presented in this paper uses an algorithm to solve the nonlinear Navier–Stokes equations, assuming blood to be an incompressible, Newtonian fluid. Flow images usually suffer from a lack of spatial resolution. Our attempt is to consider the effect of spatial resolution on the pressure distribution estimated with this method. In order to achieve this aim, the velocity map of a numerical phantom is derived at six different spatial resolutions. To determine the effects of vascular stenoses on the pressure distribution, a stenotic phantom geometry is considered. A comparison between the pressure distribution obtained from the phantom and the pressure resulting from the algorithm is presented. In this regard we also compared the effects of collocated and staggered computational grids on the pressure distribution resulting from this algorithm.
Keywords: Flow imaging, pressure distribution estimation, phantom, resolution.
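To show how a pressure distribution can be tied to a measured velocity map, the sketch below evaluates the steady, incompressible Navier–Stokes momentum balance with simple central differences to obtain the pressure-gradient field; it is a minimal 2D illustration, not the paper's collocated/staggered solver, and the density and viscosity values are typical blood properties assumed only for the example.

```python
import numpy as np

def pressure_gradient_2d(u, v, dx, dy, rho=1060.0, mu=3.5e-3):
    """Estimate the pressure-gradient field from a 2D velocity map using the
    steady, incompressible momentum balance:
        dp/dx = mu * lap(u) - rho * (u du/dx + v du/dy)
        dp/dy = mu * lap(v) - rho * (u dv/dx + v dv/dy)
    u, v: 2D velocity components [m/s]; dx, dy: grid spacing [m];
    rho, mu: assumed blood density [kg/m^3] and viscosity [Pa s]."""
    du_dy, du_dx = np.gradient(u, dy, dx)   # axis 0 varies in y, axis 1 in x
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    lap_u = np.gradient(du_dx, dx, axis=1) + np.gradient(du_dy, dy, axis=0)
    lap_v = np.gradient(dv_dx, dx, axis=1) + np.gradient(dv_dy, dy, axis=0)
    dp_dx = mu * lap_u - rho * (u * du_dx + v * du_dy)
    dp_dy = mu * lap_v - rho * (u * dv_dx + v * dv_dy)
    return dp_dx, dp_dy
```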
170 Hybrid Structure Learning Approach for Assessing the Phosphate Laundries Impact
Authors: Emna Benmohamed, Hela Ltifi, Mounir Ben Ayed
Abstract:
The Bayesian Network (BN) is one of the most efficient classification methods. It is widely used in several fields (e.g., medical diagnostics, risk analysis, and bioinformatics research). A BN is a probabilistic graphical model that represents a formalism for reasoning under uncertainty. This classification method has a high performance rate in the extraction of new knowledge from data. The construction of this model consists of two phases: structure learning and parameter learning. For solving this problem, the K2 algorithm is one of the representative data-driven algorithms, based on a score-and-search approach. In addition, integrating the expert's knowledge in the structure learning process allows higher accuracy to be obtained. In this paper, we propose a hybrid approach combining an improvement of the K2 algorithm, called the K2 algorithm for Parents and Children search (K2PC), and the expert-driven method for learning the structure of the BN. The evaluation of the experimental results, using well-known benchmarks, proves that our K2PC algorithm has better performance in terms of correct structure detection. The real application of our model shows its efficiency in the analysis of the phosphate laundry effluents' impact on the watershed in the Gafsa area (southwestern Tunisia).
Keywords: Classification, Bayesian network, structure learning, K2 algorithm, expert knowledge, surface water analysis.
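K2PC itself is the authors' contribution, but the score-and-search backbone it builds on is the standard K2 (Cooper–Herskovits) metric; a small Python sketch of that score is given below, under the assumption of discrete, zero-indexed data, purely as background for the structure-learning step.

```python
import numpy as np
from itertools import product
from scipy.special import gammaln

def k2_log_score(data, node, parents, cardinality):
    """Cooper-Herskovits (K2) log-score of `node` given a candidate parent set.

    data: int array (n_samples, n_vars) with states in {0..r_i-1};
    cardinality: list of state counts r_i per variable.
    Each parent configuration j contributes
        log (r-1)! - log (N_ij + r - 1)! + sum_k log N_ijk!,
    written here with gammaln for numerical stability."""
    r = cardinality[node]
    parent_states = [range(cardinality[p]) for p in parents]
    score = 0.0
    for config in product(*parent_states):   # yields one empty config if no parents
        mask = np.ones(len(data), dtype=bool)
        for p, s in zip(parents, config):
            mask &= data[:, p] == s
        counts = np.bincount(data[mask, node], minlength=r)   # N_ijk
        score += gammaln(r) - gammaln(counts.sum() + r) + gammaln(counts + 1).sum()
    return score

# Toy comparison: variable 1 copies variable 0, so {0} should outscore {} as its parent set.
rng = np.random.default_rng(0)
data = rng.integers(0, 2, size=(300, 2))
data[:, 1] = data[:, 0]
print(k2_log_score(data, 1, [], [2, 2]), k2_log_score(data, 1, [0], [2, 2]))
```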
169 Adaptive Block State Update Method for Separating Background
Authors: Youngsuck Ji, Youngjoon Han, Hernsoo Hahn
Abstract:
In this paper, we propose a robust moving object detection method for night street images that handles lighting effects by updating a block-based reference background model using block-state analysis. The experimental images are color video sequences acquired from a stationary camera. When artificial illumination, such as street lights or sign lights, suddenly appears, the reference background model is updated with this information. Natural illumination generally changes gradually over time, whereas artificial illumination appears suddenly. Therefore, the proposed method uses a two-stage process to detect artificial illumination exactly. The first stage compares the difference between the current image and the reference background block by block, identifying the changed blocks. The second stage computes the difference between the edge map of the current image and the edge map of the reference background, which makes it possible to estimate the illumination in any block. This information allows objects and artificial illumination to be detected exactly and the reference background to be generated more clearly. Blocks are classified by block-state analysis into four states (i.e., transient, stationary, background, artificial illumination); Fig. 1 shows the characteristics of each block state [1]. Experimental results show that the presented approach works well in the presence of illumination variance.
Keywords: Block-state, edge component, reference background, artificial illumination.
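The first-stage, block-wise comparison against the reference background described above can be sketched in a few lines of Python; the block size and threshold are illustrative assumptions, and the edge-map comparison and four block states of the full method are omitted.

```python
import numpy as np

def changed_blocks(current, background, block=16, threshold=12.0):
    """Flag blocks whose mean absolute difference from the reference background
    exceeds a threshold.

    current, background: grayscale frames as 2D arrays of equal size
    (height and width are assumed to be multiples of `block`).
    Returns a boolean map with one entry per block."""
    diff = np.abs(current.astype(np.float32) - background.astype(np.float32))
    h, w = diff.shape
    blocks = diff.reshape(h // block, block, w // block, block)
    mad = blocks.mean(axis=(1, 3))          # mean absolute difference per block
    return mad > threshold
```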
168 A Holographic Infotainment System for Connected and Driverless Cars: An Exploratory Study of Gesture Based Interaction
Authors: Nicholas Lambert, Seungyeon Ryu, Mehmet Mulla, Albert Kim
Abstract:
In this paper, an interactive in-car interface called HoloDash is presented. It is intended to provide information and infotainment in both autonomous vehicles and ‘connected cars’, vehicles equipped with Internet access via cellular services. The research focuses on the development of interactive avatars for this system and its gesture-based control system. This is a case study for the development of a possible human-centred means of presenting a connected or autonomous vehicle’s On-Board Diagnostics through a projected ‘holographic’ infotainment system. This system is termed a Holographic Human Vehicle Interface (HHVI), as it utilises a dashboard projection unit and gesture detection. The research also examines the suitability of gestures in an automotive environment, given that the system might be used in both driver-controlled and driverless vehicles. Using Human Centred Design methods, questions were posed to test subjects and preferences were discovered in terms of the gesture interface and the user experience for passengers within the vehicle. These affirm the benefits of this mode of visual communication for both connected and driverless cars.
Keywords: Holographic interface, human-computer interaction, user-centered design, Gesture.
167 End-to-End Spanish-English Sequence Learning Translation Model
Authors: Vidhu Mitha Goutham, Ruma Mukherjee
Abstract:
The low availability of well-trained, unlimited, dynamic-access models for specific languages makes it hard for corporate users to adopt quick translation techniques and incorporate them into product solutions. As translation tasks increasingly require a dynamic sequence learning curve, stable, cost-free open-source models are scarce. We survey and compare current translation techniques and propose a modified sequence-to-sequence model repurposed with attention techniques. Sequence learning using an encoder-decoder model is now paving the path for higher precision levels in translation. Using a Convolutional Neural Network (CNN) encoder and a Recurrent Neural Network (RNN) decoder background, we use Fairseq tools to produce an end-to-end, bilingually trained Spanish-English machine translation model including source language detection. We acquire competitive results using a duo-lingo-corpus trained model to provide for prospective, ready-made plug-in use for compound sentences and document translations. Our model serves as a decent system for large, organizational data translation needs. While acknowledging its shortcomings and future scope, it also identifies itself as a well-optimized deep neural network model and solution.
Keywords: Attention, encoder-decoder, Fairseq, Seq2Seq, Spanish, translation.
166 Thermographic Tests of Curved GFRP Structures with Delaminations: Numerical Modelling vs. Experimental Validation
Authors: P. D. Pastuszak
Abstract:
The present work is devoted to thermographic studies of curved composite panels (unidirectional GFRP) with subsurface defects. Various artificial defects, created by inserting PTFE strips between individual layers of the laminate during the manufacturing stage, are studied. The analysis is conducted both with the finite element method and with experiments. To simulate transient heat transfer in a 3D model with embedded defects of various sizes, the ANSYS package is used. Pulsed thermography combined with an optical excitation source provides good results for flat surfaces. Composite structures are mostly used in complex components, e.g., pipes, corners and stiffeners. A local decrease of mechanical properties in these regions can have a significant influence on the strength of the entire structure. Application of active thermography procedures to defect detection and evaluation in this type of element seems to be more appropriate than other NDT techniques. Nevertheless, there are various uncertainties connected with correct interpretation of the acquired data. In this paper, important factors concerning infrared thermography measurements of curved surfaces in the form of cylindrical panels are considered. In addition, temperature effects on the surface resulting from the complex geometry and from embedded and real defects are also presented.
Keywords: Active thermography, finite element analysis, composite, curved structures, defects.
165 Use of Caffeine and Human Pharmaceutical Compounds to Identify Sewage Contamination
Authors: Jingming Wu, Junqi Yue, Ruikang Hu, Zhaoguang Yang, Lifeng Zhang
Abstract:
Fecal coliform bacteria are widely used as indicators of sewage contamination in surface water. However, these microbial techniques have some disadvantages, including long analysis times (18-48 h) and an inability to discriminate between human and animal sources of fecal material. Therefore, it is necessary to seek a more specific indicator of human sanitary waste. In this study, the feasibility of applying caffeine and human pharmaceutical compounds to identify human-source contamination was investigated. The correlation between caffeine and fecal coliform was also explored. Surface water samples were collected from upstream, middle-stream and downstream points along Rochor Canal, as well as 8 locations in Marina Bay, and analyzed by culture and PCR-independent chemical methods. Results indicate that caffeine is a suitable chemical tracer in Singapore because of its easy detection (in the range of 0.30-2.0 ng/mL) compared with the other chemicals monitored. The relatively low concentrations of human pharmaceutical compounds (< 0.07 ng/mL) in Rochor Canal and Marina Bay water samples make them hard to detect and difficult to use as chemical tracers. However, their presence can help to validate sewage contamination. In addition, a high correlation was found between caffeine concentration and fecal coliform density in the Rochor Canal water samples, demonstrating that caffeine is highly related to human-source contamination.
Keywords: Caffeine, Human Pharmaceutical Compounds, Chemical Tracer, Sewage Contamination.
164 Skin Lesion Segmentation Using Color Channel Optimization and Clustering-based Histogram Thresholding
Authors: Rahil Garnavi, Mohammad Aldeen, M. Emre Celebi, Alauddin Bhuiyan, Constantinos Dolianitis, George Varigos
Abstract:
Automatic segmentation of skin lesions is the first step towards the automated analysis of malignant melanoma. Although numerous segmentation methods have been developed, few studies have focused on determining the most effective color space for melanoma application. This paper proposes an automatic segmentation algorithm based on color space analysis and clustering-based histogram thresholding, a process which is able to determine the optimal color channel for detecting the borders in dermoscopy images. The algorithm is tested on a set of 30 high-resolution dermoscopy images. A comprehensive evaluation of the results is provided, where borders manually drawn by four dermatologists are compared to automated borders detected by the proposed algorithm, applying three previously used metrics of accuracy, sensitivity, and specificity and a new metric of similarity. By performing ROC analysis and ranking the metrics, it is demonstrated that the best results are obtained with the X and XoYoR color channels, resulting in an accuracy of approximately 97%. The proposed method is also compared with two state-of-the-art skin lesion segmentation methods.
Keywords: Border detection, Color space analysis, Dermoscopy, Histogram thresholding, Melanoma, Segmentation.
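As an illustration of clustering-based histogram thresholding on a single color channel, the sketch below uses Otsu's criterion, which splits the histogram into two clusters by maximizing between-class variance; the choice of channel, the Otsu variant and the darker-than-skin assumption for lesions are illustrative, not the paper's optimized X/XoYoR procedure.

```python
import numpy as np

def otsu_threshold(channel):
    """Clustering-based histogram threshold (Otsu): pick the gray level that
    maximizes the between-class variance of the two resulting clusters.

    channel: 2D uint8 array (one color channel of a dermoscopy image)."""
    hist = np.bincount(channel.ravel(), minlength=256).astype(np.float64)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                   # class-0 probability
    mu = np.cumsum(prob * np.arange(256))     # class-0 cumulative mean
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)          # ignore empty-class levels
    return int(np.argmax(sigma_b))

def segment_lesion(channel):
    """Binary lesion mask under the assumption that the lesion is darker than
    the surrounding skin, so pixels below the threshold are labelled lesion."""
    return channel < otsu_threshold(channel)
```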
163 Certain Data Dimension Reduction Techniques for application with ANN based MCS for Study of High Energy Shower
Authors: Gitanjali Devi, Kandarpa Kumar Sarma, Pranayee Datta, Anjana Kakoti Mahanta
Abstract:
Cosmic ray showers, after entering the earth's atmosphere from their places of origin in space, generate secondary particles called Extensive Air Showers (EAS). Detection and analysis of EAS and similar high energy particle showers involve a plethora of experimental setups with certain constraints, for which soft-computational tools like Artificial Neural Networks (ANNs) can be adopted. The optimality of ANN classifiers can be enhanced further by the use of a Multiple Classifier System (MCS) and certain data-dimension reduction techniques. This work describes the performance of data-dimension reduction techniques such as Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Self Organizing Map (SOM) approximators for application with an MCS formed using a Multi Layer Perceptron (MLP), a Recurrent Neural Network (RNN) and a Probabilistic Neural Network (PNN). The data inputs are obtained from an array of detectors placed in a circular arrangement resembling a practical detector grid, and they have a high dimension and strong correlation among themselves. The PCA, ICA and SOM blocks reduce the correlation and generate a form suitable for real-time practical applications for prediction of the primary energy and location of an EAS from density values captured using detectors in a circular grid.
Keywords: EAS, Shower, Core, ANN, Location.
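The PCA-before-classifier arrangement described above can be sketched with off-the-shelf tools; the snippet below chains PCA with a single MLP member of such an ensemble on synthetic stand-in data (detector count, component count, hidden-layer size and labels are all assumptions made only for illustration).

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Hypothetical detector readings: 64 density values per event from a circular grid,
# labelled with a (binned) core location of the shower.
rng = np.random.default_rng(0)
X = rng.random((500, 64))           # stand-in for correlated detector densities
y = rng.integers(0, 4, size=500)    # stand-in for 4 core-location bins

# PCA removes the correlation between neighbouring detectors and reduces the
# input dimension before this MLP member of the classifier ensemble is trained.
model = make_pipeline(
    PCA(n_components=10),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500),
)
model.fit(X, y)
print(model.score(X, y))
```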