Search results for: Digital Detector
640 Joint Microstatistic Multiuser Detection and Cancellation of Nonlinear Distortion Effects for the Uplink of MC-CDMA Systems Using Golay Codes
Authors: Peter Drotar, Juraj Gazda, Pavol Galajda, Dusan Kocur
Abstract:
The study in this paper underlines the importance of the correct joint selection of the spreading codes at the transmitter side and the detector at the receiver side for the uplink of multicarrier code division multiple access (MC-CDMA) in the presence of nonlinear distortion due to the high power amplifier (HPA). The bit error rate (BER) of the system is compared by means of simulations for different spreading sequences (Walsh code, Gold code, orthogonal Gold code, Golay code and Zadoff-Chu code) and different kinds of receivers (the minimum mean-square error receiver (MMSE-MUD) and the microstatistic multi-user receiver (MSF-MUD)). The results of the analysis show that the application of MSF-MUD in combination with Golay codes can significantly outperform the other tested spreading codes and receivers for all commonly used HPA models.
Keywords: HPA, MC-CDMA, microstatistic filter, multi-user receivers, PAPR.
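As a side note to this entry, Golay complementary pairs can be built with a simple recursive construction. The following minimal Python sketch illustrates the general idea (not the authors' MC-CDMA simulator): it generates such a pair and uses it to spread one BPSK symbol per user onto the subcarriers. The code length of 32 and the two-user setup are arbitrary assumptions.

```python
import numpy as np

def golay_pair(n):
    """Recursively build a Golay complementary pair of length 2**n."""
    a, b = np.array([1]), np.array([1])
    for _ in range(n):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

# MC-CDMA uplink idea: each user multiplies its symbol by its own spreading
# sequence before the multicarrier (IFFT) modulation.
codes = np.vstack(golay_pair(5))          # two length-32 Golay sequences
symbols = np.array([1, -1])               # one BPSK symbol per user
chips = symbols[:, None] * codes          # spread chips, one row per user
tx_freq = chips.sum(axis=0)               # superposed uplink signal (ideal, no HPA)
tx_time = np.fft.ifft(tx_freq)            # multicarrier modulation via IFFT
```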
639 Multi Band Frequency Synthesizer Based on ISPD PLL with Adapted LC Tuned VCO
Authors: Bilel Gassara, Mahmoud Abdellaoui, Nouri Masmoud
Abstract:
The 4G front-end transceiver needs high performance, which can be obtained mainly with an optimal architecture and a multi-band local oscillator. In this study, we propose and present a new multi-band frequency synthesizer architecture based on an Inverse Sine Phase Detector Phase Locked Loop (ISPD PLL), without any filters or controlled-gain blocks, associated with an adapted multi-band LC tuned VCO that uses several numerically controlled (but not binary weighted) capacitive branches. The proposed architecture, based on a 0.35 μm CMOS process technology, supports multi-band GSM/DCS/DECT/UMTS/WiMax applications and gives good performance: a phase noise of -127 dBc at 1 MHz, a factor of merit (FOM) of -186 dB at 1 MHz, and a wide frequency range (from 0.83 GHz to 3.5 GHz), which makes the proposed architecture amenable to monolithic integration and 4G multi-band applications.
Keywords: GSM/DCS/DECT/UMTS/WiMax, ISPD PLL, keep and capture range, multi-band, synthesizer, wireless.
638 Transform-Domain Rate-Distortion Optimization Accelerator for H.264/AVC Video Encoding
Authors: Mohammed Golam Sarwer, Lai Man Po, Kai Guo, Q.M. Jonathan Wu
Abstract:
In H.264/AVC video encoding, rate-distortion optimization for mode selection plays a significant role in achieving outstanding compression efficiency and video quality. However, this mode selection process also makes encoding extremely complex, especially the computation of the rate-distortion cost function, which includes the sum of squared differences (SSD) between the original and reconstructed image blocks and the context-based entropy coding of the block. In this paper, a transform-domain rate-distortion optimization accelerator based on a fast SSD (FSSD) and a VLC-based rate estimation algorithm is proposed. This algorithm significantly simplifies the hardware architecture for the rate-distortion cost computation with only negligible performance degradation. An efficient hardware structure for implementing the proposed accelerator is also proposed. Simulation results demonstrate that the proposed algorithm reduces total encoding time by about 47% with negligible degradation of coding performance. The proposed method can be easily applied to many mobile video applications such as digital cameras and DMB (Digital Multimedia Broadcasting) phones.
Keywords: Context-adaptive variable length coding (CAVLC), H.264/AVC, rate-distortion optimization (RDO), sum of squared differences (SSD).
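For readers unfamiliar with the cost being accelerated here, the conventional mode decision minimizes J = SSD + lambda * R over candidate modes. A minimal Python sketch of that baseline computation follows (the accelerator in the paper replaces the SSD and rate terms with faster transform-domain estimates); the candidate list and the lambda value are illustrative.

```python
import numpy as np

def rd_cost(orig_block, recon_block, rate_bits, lam):
    """Conventional rate-distortion cost J = SSD + lambda * R for one candidate mode."""
    diff = orig_block.astype(np.int64) - recon_block.astype(np.int64)
    ssd = np.sum(diff * diff)          # sum of squared differences (distortion term)
    return ssd + lam * rate_bits       # rate_bits would come from CAVLC entropy coding

def best_mode(orig_block, candidates, lam=10.0):
    """candidates: list of (mode_name, reconstructed_block, estimated_rate_bits)."""
    return min(candidates, key=lambda c: rd_cost(orig_block, c[1], c[2], lam))[0]
```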
637 Timescape-Based Panoramic View for Historic Landmarks
Authors: H. Ali, A. Whitehead
Abstract:
Providing a panoramic view of famous landmarks around the world offers artistic and historic value for historians, tourists, and researchers. Exploring the history of famous landmarks by presenting a comprehensive view of a temporal panorama merged with geographical and historical information presents a unique challenge: dealing with images that span a long period, from the 1800s up to the present. This work presents the concept of a temporal panorama through a timeline display of aligned historic and modern images for many famous landmarks. Building such a panorama requires a collection of hundreds of thousands of landmark images from the Internet, comprising both historic images and modern images of the digital age. These images have to be classified for subset selection in order to keep the more suitable images that chronologically document a landmark's history. Processing historic images, captured with older analog technology under a wide variety of conditions, is a major challenge when they have to be combined with modern digital images. Successful processing of historic images to prepare them for the next steps of temporal panorama creation is an active contribution to cultural heritage preservation and to the fulfillment of one of UNESCO's goals: the preservation and display of famous worldwide landmarks.
Keywords: Cultural heritage, image registration, image subset selection, registered image similarity, temporal panorama, timescapes.
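The keywords mention image registration of historic and modern photographs. A minimal OpenCV sketch of one common way to align such image pairs (ORB features plus a RANSAC homography) is given below as an illustration; it is an assumption about the general approach, not the pipeline used by the authors.

```python
import cv2
import numpy as np

def align_to_modern(historic_gray, modern_gray):
    """Warp a historic photo onto a modern one using ORB features and a RANSAC homography."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(historic_gray, None)
    k2, d2 = orb.detectAndCompute(modern_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:200]
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = modern_gray.shape
    return cv2.warpPerspective(historic_gray, H, (w, h))
```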
636 Efficient Detection Using Sequential Probability Ratio Test in Mobile Cognitive Radio Systems
Authors: Yeon-Jea Cho, Sang-Uk Park, Won-Chul Choi, Dong-Jo Park
Abstract:
This paper proposes a smart design strategy for a sequential detector to reliably detect the primary user's signal, especially in fast fading environments. We study the computation of the log-likelihood ratio for coping with fast changing received signal and noise sample variances, which are treated as random variables. First, we analyze the detectability of the conventional generalized log-likelihood ratio (GLLR) scheme when the statistics of the unknown parameters change rapidly due to fast fading effects. Second, we propose an efficient sensing algorithm that performs the sequential probability ratio test in a robust and efficient manner when the channel statistics are unknown. Finally, the proposed scheme is compared to the conventional method by simulation, in terms of the average number of samples required to reach a detection decision.
Keywords: Cognitive radio, fast fading, sequential detection, spectrum sensing.
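As background for this entry, the sequential probability ratio test accumulates a log-likelihood ratio sample by sample and stops as soon as it crosses one of two thresholds set by the target false alarm and missed detection probabilities. A minimal Python sketch with a simple Gaussian energy-detection model follows; it is an illustrative model, not the paper's GLLR-based detector.

```python
import numpy as np

def sprt(samples, var_noise, var_signal, pfa=0.01, pmd=0.01):
    """Wald's SPRT for 'signal present' (H1) vs 'noise only' (H0).

    Samples are modeled as zero-mean Gaussian with variance var_noise under H0
    and var_noise + var_signal under H1 (illustrative assumption).
    """
    upper = np.log((1 - pmd) / pfa)      # accept H1 when the LLR exceeds this
    lower = np.log(pmd / (1 - pfa))      # accept H0 when the LLR drops below this
    v0, v1 = var_noise, var_noise + var_signal
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += 0.5 * np.log(v0 / v1) + 0.5 * x * x * (1.0 / v0 - 1.0 / v1)
        if llr >= upper:
            return "H1", n               # primary user detected after n samples
        if llr <= lower:
            return "H0", n               # spectrum declared free after n samples
    return "undecided", len(samples)
```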
635 Feature Point Detection by Combining Advantages of Intensity-based Approach and Edge-based Approach
Authors: Sungho Kim, Chaehoon Park, Yukyung Choi, Soon Kwon, In So Kweon
Abstract:
In this paper, a novel corner detection method is presented to stably extract geometrically important corners. Intensity-based corner detectors such as the Harris detector can find corners in noisy environments but give inaccurate corner positions and miss corners at obtuse angles. Edge-based corner detectors such as Curvature Scale Space can detect structural corners but show unstable corner detection due to incomplete edge detection in noisy environments. The proposed image-based direct curvature estimation can overcome the limitations of both: the inaccurate structural corner detection of the (intensity-based) Harris detector and the unstable corner detection of Curvature Scale Space caused by incomplete edge detection. Various experimental results validate the robustness of the proposed method.
Keywords: Feature, intensity, contour, hybrid.
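For reference, the intensity-based baseline discussed here scores each pixel with the Harris response R = det(M) - k * trace(M)^2 computed from smoothed gradient products. A minimal NumPy/SciPy sketch of that baseline (not the authors' curvature-based method) is shown below; sigma and k are typical illustrative values.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def harris_response(img, sigma=1.5, k=0.04):
    """Harris corner response R = det(M) - k*trace(M)^2 from smoothed gradient products."""
    img = img.astype(float)
    Iy, Ix = np.gradient(img)                 # gradients along rows (y) and columns (x)
    Sxx = gaussian_filter(Ix * Ix, sigma)
    Syy = gaussian_filter(Iy * Iy, sigma)
    Sxy = gaussian_filter(Ix * Iy, sigma)
    det_m = Sxx * Syy - Sxy * Sxy
    trace_m = Sxx + Syy
    return det_m - k * trace_m ** 2           # large positive values indicate corners

# Example: corners = np.argwhere(harris_response(image) > threshold)
```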
634 Mitigation of ISI for Next Generation Wireless Channels in Outdoor Vehicular Environments
Authors: Mohd. Israil, M. Salim Beg
Abstract:
In order to accommodate various multimedia services, next generation wireless networks are characterized by very high transmission bit rates. In such systems and networks, the received signal is limited not only by noise but, especially with increasing symbol rates, often more significantly by the intersymbol interference (ISI) caused by time-dispersive radio channels such as those used in this work. This paper studies the performance of detectors for high bit rate transmission over some worst-case models of frequency-selective fading channels for outdoor mobile radio environments. A number of different wireless channels with different power profiles and different numbers of resolvable paths are considered. All the radio channels generated in this paper are for outdoor vehicular environments with a Doppler spread of 100 Hz. A carrier frequency of 1800 MHz is used, and all the channels used in this work are relevant for next generation wireless systems. Schemes for the mitigation of ISI with adaptive equalizers of different types have been investigated, and their performance is reported in terms of BER measured as a function of SNR.
Keywords: Mobile channels, Rayleigh fading, equalization, NMLD.
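As a concrete illustration of the adaptive equalization idea investigated in this entry, the sketch below trains a linear transversal equalizer with the LMS rule on a known training sequence. It is a generic baseline under simple assumptions (complex baseband samples, a synchronized training sequence), not the specific NMLD detector referenced in the keywords.

```python
import numpy as np

def lms_equalizer(received, training, n_taps=11, mu=0.01):
    """Train a linear transversal equalizer with the LMS rule against known training symbols."""
    w = np.zeros(n_taps, dtype=complex)
    for n in range(n_taps, len(training)):
        x = received[n - n_taps:n][::-1]     # most recent received samples first
        y = np.vdot(w, x)                    # equalizer output, y = w^H x
        e = training[n] - y                  # error against the known training symbol
        w += mu * np.conj(e) * x             # LMS tap update
    return w
```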
633 Use of Fuzzy Edge Image in Block Truncation Coding for Image Compression
Authors: Amarunnishad T.M., Govindan V.K., Abraham T. Mathew
Abstract:
An image compression method has been developed using a fuzzy edge image together with the basic Block Truncation Coding (BTC) algorithm. The fuzzy edge image has been validated against classical edge detectors, using the results of the well-known Canny edge detector, prior to its use in the proposed method. The bit plane generated by the conventional BTC method is replaced with the fuzzy bit plane generated by the logical OR operation between the fuzzy edge image and the corresponding conventional BTC bit plane. The input image is encoded with the block mean, the standard deviation and the fuzzy bit plane. The proposed method has been tested with 8 bits/pixel test images of size 512×512 and found to be superior, with a better Peak Signal to Noise Ratio (PSNR), when compared to the conventional BTC and adaptive bit plane selection BTC (ABTC) methods. The raggedness, jagged appearance and ringing artifacts at sharp edges are greatly reduced in images reconstructed by the proposed method with the fuzzy bit plane.
Keywords: Image compression, edge detection, ground truth image, peak signal to noise ratio.
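For context, conventional BTC encodes each block with its mean, its standard deviation and a one-bit plane marking pixels above the mean; the paper replaces or ORs that plane with a fuzzy edge image. A minimal sketch of the conventional per-block step is given below (the fuzzy modification itself is not reproduced here).

```python
import numpy as np

def btc_block(block):
    """Conventional BTC for one image block: keep mean, std and a 1-bit plane."""
    m, s = block.mean(), block.std()
    bitplane = block >= m                    # the plane the paper ORs with the fuzzy edge image
    q = bitplane.sum()                       # number of 'high' pixels
    n = block.size
    if q in (0, n):                          # flat block: reconstruct with the mean only
        return np.full_like(block, m, dtype=float)
    low = m - s * np.sqrt(q / (n - q))       # reconstruction level for 0-bits
    high = m + s * np.sqrt((n - q) / q)      # reconstruction level for 1-bits
    return np.where(bitplane, high, low)
```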
632 The Traditional Malay Textile (TMT) Knowledge Model: Transformation towards Automated Mapping
Authors: Syerina Azlin Md Nasir, Nor Laila Md Noor, Suriyati Razali
Abstract:
The growing interest in national heritage preservation has led to intensive efforts in the digital documentation of cultural heritage knowledge. Encapsulated within this effort is a focus on ontology development, which helps facilitate the organization and retrieval of the knowledge. Ontologies in the cultural heritage domain are related to archive, museum and library information such as archaeology, artifacts, paintings, etc. The growth in the number and size of ontologies indicates the wide acceptance of their semantic enrichment in many emerging applications. Nowadays, there are many heritage information systems available for access; among them is the community-based e-museum designed to support digital cultural heritage preservation. This work extends a previous effort of developing the Traditional Malay Textile (TMT) Knowledge Model, where the model is designed with the intention of auxiliary mapping with CIDOC CRM. Due to its internal constraints, the model needs to be transformed in advance. This paper addresses the issue by reviewing previous harmonization works with CIDOC CRM as exemplars for refining the facets of the model, particularly those involving the TMT-Artifact class. The result is an extensible model which could lead to a common view for automated mapping with CIDOC CRM. Hence, it promotes the integration and exchange of textile information, especially batik-related information, between communities in e-museum applications.
Keywords: Automated mapping, cultural heritage, knowledge model, textile practice.
631 Towards Integrating Statistical Color Features for Human Skin Detection
Authors: Mohd Zamri Osman, Mohd Aizaini Maarof, Mohd Foad Rohani
Abstract:
Human skin detection is recognized as the primary step in most applications such as face detection, illicit image filtering, hand recognition and video surveillance. The performance of any skin detection application relies greatly on two components: feature extraction and the classification method. Skin color is the most vital information used for skin detection. However, color features alone sometimes cannot handle images whose color distribution is the same as that of skin. Pixel-based color features do not eliminate skin-like colors, because the intensities of skin and skin-like colors fall under the same distribution. Hence, statistical color measures, such as the mean and standard deviation, are exploited as additional features to increase the reliability of the skin detector. In this paper, we studied the effectiveness of statistical color features for human skin detection. Furthermore, the paper analyzed the integrated color and texture features using eight classifiers with three color spaces: RGB, YCbCr, and HSV. The experimental results show that integrating the statistical features using a Random Forest classifier achieved a significant performance, with an F1-score of 0.969.
Keywords: Color space, neural network, random forest, skin detection, statistical feature.
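A minimal sklearn sketch of the kind of pipeline described here (statistical color features fed to a Random Forest, evaluated with the F1-score) is shown below. The patch-based framing, the feature choice and the random placeholder arrays are illustrative assumptions; a real experiment would extract features from labeled skin and non-skin regions of annotated images.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score

def patch_features(patch):
    """Per-patch feature vector: mean and standard deviation of each color channel."""
    flat = patch.reshape(-1, 3)
    return np.concatenate([flat.mean(axis=0), flat.std(axis=0)])

# Placeholder feature matrices and 0/1 (non-skin/skin) labels for illustration only.
X_train, y_train = np.random.rand(200, 6), np.random.randint(0, 2, 200)
X_test, y_test = np.random.rand(50, 6), np.random.randint(0, 2, 50)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("F1-score:", f1_score(y_test, clf.predict(X_test)))
```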
630 Acoustic and Thermal Isolation Performance Comparison between Recycled and Ceramic Roof Tiles Using Digital Holographic Interferometry
Authors: A. Araceli Sánchez, I. Manuel H. De la Torre, S. Fernando Mendoza, R. Cesar Tavera, R. Manuel de J. Briones
Abstract:
Recycling, as part of any sustainable environment, is continuously evolving and having an impact on new materials in manufacturing. One example of this is the recycled solid waste of Tetra Pak™ packaging, which is a highly pollutant waste, as it is not biodegradable and is manufactured from different materials. The Tetra Pak™ container consists of thermally joined layers of paper, aluminum and polyethylene. Once disposed of, this packaging is recycled by completely separating the paperboard from the rest of the materials. The aluminum and the polyethylene remain together and are used to create poly-aluminum, which is widely used to manufacture roof tiles. These recycled tiles have different thermal and acoustic properties compared with traditionally manufactured ceramic and cement tiles. In this work, we compare a group of tiles using nondestructive optical testing to measure the superficial micro-deformations of the tiles under well controlled experiments. The results of the acoustic and thermal tests show remarkable differences between the recycled tile and the traditional ones. These results help to determine which tile could be better suited to the specific environmental conditions in countries with extreme climates, ranging from tropical and desert-like to very cold, throughout the year.
Keywords: Digital holographic interferometry, nondestructive testing, recycled, sustainable, thermal study.
629 Diagnosing Dangerous Arrhythmia of Patients by Automatic Detecting of QRS Complexes in ECG
Authors: Jia-Rong Yeh, Ai-Hsien Li, Jiann-Shing Shieh, Yen-An Su, Chi-Yu Yang
Abstract:
In this paper, an automatic QRS complex detection algorithm was applied to analyze ECG recordings, and five criteria for diagnosing dangerous arrhythmia were applied in a prototype automatic arrhythmia diagnosis system. The detection algorithm locates the distribution of QRS complexes in the ECG recordings and related information, such as heart rate and RR interval. In this investigation, twenty sampled ECG recordings of patients with different pathologic conditions were collected for off-line analysis. A combination of four digital filters was proposed as pre-processing to improve the ECG signals and the QRS detection rate. Both hardware filters and digital filters were applied to eliminate different types of noise mixed with the ECG recordings. Then, the automatic detection algorithm was applied to verify the distribution of QRS complexes. Finally, the quantitative clinical criteria for diagnosing arrhythmia were programmed as a post-processor for practical automatic arrhythmia diagnosis. The results of the automatic diagnosis of dangerous arrhythmia were compared with off-line diagnoses by experienced clinical physicians and showed a matching rate of 95%.
Keywords: Signal processing, electrocardiography (ECG), QRS complex, arrhythmia.
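As a compact illustration of the detection step described in this entry, the sketch below band-pass filters an ECG trace, picks R peaks, and derives RR intervals and heart rate. The filter band, threshold and refractory period are typical illustrative choices, not the four-filter chain and clinical criteria used by the authors.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_qrs(ecg, fs):
    """Locate QRS complexes and derive RR intervals and heart rate from a 1-D ECG trace."""
    b, a = butter(3, [5 / (fs / 2), 15 / (fs / 2)], btype="band")   # QRS energy band
    filtered = filtfilt(b, a, ecg)
    envelope = filtered ** 2                                         # emphasize QRS energy
    peaks, _ = find_peaks(envelope,
                          height=0.3 * envelope.max(),               # simple threshold
                          distance=int(0.25 * fs))                   # refractory period ~250 ms
    rr = np.diff(peaks) / fs                                         # RR intervals in seconds
    heart_rate = 60.0 / rr.mean() if rr.size else float("nan")       # beats per minute
    return peaks, rr, heart_rate
```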
628 An E-Maintenance IoT Sensor Node Designed for Fleets of Diverse Heavy-Duty Vehicles
Authors: George Charkoftakis, Panagiotis Liosatos, Nicolas-Alexander Tatlas, Dimitrios Goustouridis, Stelios M. Potirakis
Abstract:
E-maintenance is a relatively recent concept, generally referring to maintenance management by monitoring assets over the Internet. One of the key links in the chain of an e-maintenance system is data acquisition and transmission. Specifically, for the case of a fleet of heavy-duty vehicles, where the main challenge is the diversity of the vehicles and of the vehicle-embedded self-diagnostic/reporting technologies, the design of the data acquisition and transmission unit is a demanding task. This is clear if one takes into account that a heavy-vehicle fleet may range from vehicles with only a limited number of analog sensors monitored by dashboard light indicators and gauges to vehicles with a plethora of sensors monitored by a vehicle computer producing digital reports. The present work proposes an adaptable Internet of Things (IoT) sensor node that is capable of addressing this challenge. The proposed sensor node architecture is based on the increasingly popular single-board computer plus expansion boards approach. In the proposed solution, the expansion boards undertake the tasks of position identification, cellular connectivity, connectivity to the vehicle computer, and connectivity to analog and digital sensors by means of a specially designed expansion board. The latter offers a number of adaptability features to cope with the diverse sensor types employed in different vehicles. In standard mode, the IoT sensor node communicates with the data center through the cellular network, transmitting all digital/digitized sensor data, the IoT device identity and the position. Moreover, the proposed IoT sensor node offers connectivity, through WiFi and an appropriate application, to smartphones or tablets, allowing the registration of additional vehicle- and driver-specific information; these data are also forwarded to the data center. All control and communication tasks of the IoT sensor node are performed by dedicated firmware.
Keywords: IoT sensor nodes, e-maintenance, single-board computers, sensor expansion boards, on-board diagnostics
627 Ethereum Based Smart Contracts for Trade and Finance
Authors: Rishabh Garg
Abstract:
Traditionally, business parties build trust through a centralized operating mechanism, such as payment by letter of credit. However, the increase in cyber-attacks and malicious hacking has jeopardized business operations and finance practices. Emerging markets, due to their high banking risks and the large presence of digital financing, are looking for technology that enables transparency and traceability of any transaction in trade, finance or supply chain management. Blockchain systems, in the absence of any central authority, enable transactions across the globe with the help of decentralized applications. DApps consist of a front-end, a blockchain back-end, and middleware, that is, the code that connects the two. The front-end can be a sophisticated web or mobile app, which is used to invoke the functions/methods of the smart contract. Web apps can employ technologies such as HTML, CSS, React and Express. In this wake, fintech and blockchain products are popping up in brokerages, digital wallets, exchanges, post-trade clearance, settlement, middleware, infrastructure and base protocols. The present paper provides a technology-driven solution and an innovative working paradigm for business and finance that supports financial inclusion.
Keywords: Authentication, blockchain, channel, cryptography, DApps, data portability, Decentralized Public Key Infrastructure, Ethereum, hash function, Hashgraph, Privilege creep, Proof of Work algorithm, revocation, storage variables, Zero Knowledge Proof.
626 Ship Detection Requirements Analysis for Different Sea States: Validation on Real SAR Data
Authors: Jaime Martín-de-Nicolás, David Mata-Moya, Nerea del-Rey-Maestre, Pedro Gómez-del-Hoyo, María-Pilar Jarabo-Amores
Abstract:
Ship detection is nowadays quite an important issue in tasks related to sea traffic control, fishery management and ship search and rescue. Although it has traditionally been carried out by patrol ships or aircraft, coverage limitations, weather conditions and sea state can become a problem. Synthetic aperture radars can surpass these coverage limitations and work under any climatological condition. A fast CFAR ship detector based on a robust statistical modeling of sea clutter with respect to sea states in SAR images is used. In this paper, the minimum SNR required to obtain a given detection probability with a given false alarm rate for any sea state is determined. A Gaussian target model using real SAR data is considered. Results show that the required SNR does not depend heavily on the sea-state class considered. Provided there is some variation in the backscattering of targets in SAR imagery, the detection probability is limited, and a post-processing stage based on morphology would be suitable.
Keywords: SAR, generalized gamma distribution, detection curves, radar detection.
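For orientation, a CFAR detector of the kind used here thresholds each image cell against an estimate of the local clutter level, scaled for a target false alarm rate. The sketch below is a generic one-dimensional cell-averaging CFAR under an exponential-clutter assumption; the paper's detector instead models the sea clutter statistically per sea state.

```python
import numpy as np

def ca_cfar(power, n_train=16, n_guard=2, pfa=1e-4):
    """1-D cell-averaging CFAR: compare each cell with the mean of its training cells."""
    n = len(power)
    alpha = n_train * (pfa ** (-1.0 / n_train) - 1.0)   # scaling for the desired Pfa
    detections = np.zeros(n, dtype=bool)
    half = n_train // 2 + n_guard
    for i in range(half, n - half):
        left = power[i - half:i - n_guard]               # training cells before the guard band
        right = power[i + n_guard + 1:i + half + 1]      # training cells after the guard band
        noise = np.concatenate([left, right]).mean()
        detections[i] = power[i] > alpha * noise
    return detections
```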
625 Poincaré Plot for Heart Rate Variability
Authors: Mazhar B. Tayel, Eslam I. AlSaba
Abstract:
The heart is the most important organ in the body of living organisms. It affects and is affected by every factor in the body; therefore, it is a good indicator of all conditions in the body. The heart signal is a non-stationary signal; thus, it is of utmost importance to study its variability. Heart Rate Variability (HRV) has attracted considerable attention in psychology and medicine and has become an important dependent measure in psychophysiology and behavioral medicine. The standards of measurement, physiological interpretation and clinical use of HRV have been described in many research papers; however, they remain complex issues fraught with pitfalls. This paper presents one of the nonlinear techniques for analyzing HRV. It discusses what the Poincaré plot is, how it works, and its merits, especially for HRV. It also discusses the limitation of the Poincaré descriptors SD1 and SD2 and how to overcome this limitation by using the complex correlation measure (CCM). The CCM is more sensitive to changes in the temporal structure of the Poincaré plot than SD1 and SD2.
Keywords: Heart rate variability, chaotic system, Poincaré, variance, standard deviation, complex correlation measure.
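The SD1 and SD2 descriptors discussed in this entry are computed from successive RR intervals with standard variance identities. A minimal Python sketch follows; the CCM extension mentioned in the abstract is not reproduced here.

```python
import numpy as np

def poincare_sd1_sd2(rr):
    """SD1/SD2 of the Poincaré plot built from successive RR intervals (rr[n], rr[n+1])."""
    rr = np.asarray(rr, dtype=float)
    diff = np.diff(rr)
    sd1 = np.sqrt(0.5 * np.var(diff))                        # spread perpendicular to the identity line
    sd2 = np.sqrt(2.0 * np.var(rr) - 0.5 * np.var(diff))     # spread along the identity line
    return sd1, sd2

# Example: sd1, sd2 = poincare_sd1_sd2(rr_intervals_in_seconds)
```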
624 Mathematical Analysis of EEG of Patients with Non-fatal Nonspecific Diffuse Encephalitis
Authors: Mukesh Doble, Sunil K Narayan
Abstract:
Diffuse viral encephalitis may lack fever and other cardinal signs of infection, and hence its distinction from other acute encephalopathic illnesses is challenging. Often, the EEG changes seen routinely are nonspecific and reflect diffuse encephalopathic changes only. The aim of this study was to use nonlinear dynamic mathematical techniques to analyze the EEG data in order to look for any characteristic diagnostic patterns in diffuse forms of encephalitis. Encephalitis was diagnosed on clinical, imaging and cerebrospinal fluid criteria in three young male patients. Metabolic and toxic encephalopathies were ruled out through appropriate investigations. Digital EEGs were recorded on the 3rd to 5th day of onset. The digital EEGs of 5 male and 5 female age- and sex-matched healthy volunteers served as controls. A two-sample t-test indicated that there was no statistically significant difference in average amplitude between the two groups. However, the standard deviation (or variance) of the EEG signals at FP1-F7 and FP2-F8 was significantly higher for the patients than for the normal subjects. The regularisation dimension was significantly lower for the patients (average between 1.24 and 1.43) than for the normal subjects (average between 1.41 and 1.63) for the EEG signals from all locations except the Fz-Cz signal. Similarly, the wavelet dimension was significantly lower (P = 0.05) for the patients (1.122) than for the normal subjects (1.458). The patients' EEGs were subdued, with uniform patterns manifested in the values of the regularisation and wavelet dimensions, when compared with the normal subjects, indicating a decrease in chaotic nature.
Keywords: Chaos, Diffuse encephalitis, Electroencephalogram, Fractal dimension, Fourier spectrum.
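A minimal SciPy sketch of the statistical comparison described here (a two-sample t-test on mean amplitude and a comparison of per-subject signal variance for one channel) is shown below. The random arrays are hypothetical placeholders for the patient and control epochs, and the fractal-dimension estimates are not reproduced.

```python
import numpy as np
from scipy import stats

# Hypothetical placeholder EEG epochs: rows = subjects, columns = samples of one channel.
patients_fp1_f7 = np.random.randn(3, 2560)
controls_fp1_f7 = np.random.randn(10, 2560)

# Two-sample t-test on per-subject mean amplitude (no significant difference expected).
t, p = stats.ttest_ind(patients_fp1_f7.mean(axis=1), controls_fp1_f7.mean(axis=1))
print(f"mean amplitude: t = {t:.2f}, p = {p:.3f}")

# Compare per-subject signal variance, the quantity reported as higher in patients at FP1-F7/FP2-F8.
print("patient variance:", patients_fp1_f7.var(axis=1).mean())
print("control variance:", controls_fp1_f7.var(axis=1).mean())
```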
623 Negative Selection as a Means of Discovering Unknown Temporal Patterns
Authors: Wanli Ma, Dat Tran, Dharmendra Sharma
Abstract:
The temporal nature of negative selection is an under-exploited area. In a negative selection system, newly generated antibodies go through a maturing phase, and the survivors of this phase then wait to be activated by incoming antigens after a certain number of matches. Those without enough matches will age and die, while those with enough matches (i.e., those that are activated) become active detectors. A currently active detector may also age and die if it cannot find any match in a pre-defined (lengthy) period of time. Therefore, what matters in a negative selection system is the dynamics of the involved parties in the current time window, not the whole time duration, which may extend to eternity. This property has the potential to define the uniqueness of negative selection in comparison with other approaches. On the other hand, a negative selection system is only trained with "normal" data samples. It has to learn and discover unknown "abnormal" data patterns on the fly by itself. Consequently, it is more appropriate to utilize negative selection as a system for pattern discovery and recognition rather than just pattern recognition. In this paper, we study the potential of using negative selection in discovering unknown temporal patterns.
Keywords: Artificial Immune Systems, Computational Intelligence, Negative Selection, Pattern Discovery.
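For readers new to the mechanism, a minimal real-valued negative selection sketch is given below: random detectors that match any "normal" training sample are discarded, and the survivors later flag anomalies. It illustrates the classic (non-temporal) algorithm under simple assumptions (unit-hypercube data, a fixed matching radius), not the temporal extension studied in the paper.

```python
import numpy as np

def train_detectors(self_samples, n_detectors=100, radius=0.1, dim=2, seed=0):
    """Censoring phase: keep only random detectors that match no 'normal' (self) sample."""
    rng = np.random.default_rng(seed)
    detectors = []
    while len(detectors) < n_detectors:
        candidate = rng.random(dim)
        if np.min(np.linalg.norm(self_samples - candidate, axis=1)) > radius:
            detectors.append(candidate)      # survived maturation: it covers non-self space
    return np.array(detectors)

def is_anomalous(sample, detectors, radius=0.1):
    """Monitoring phase: a sample activating any detector is flagged as 'abnormal'."""
    return bool(np.any(np.linalg.norm(detectors - sample, axis=1) <= radius))
```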
622 Dynamic Reroute Modeling for Emergency Evacuation: Case Study of Brunswick City, Germany
Authors: Yun-Pang Flötteröd, Jakob Erdmann
Abstract:
Human behavior during evacuations is quite complex. One of the critical behaviors affecting the efficiency of an evacuation is route choice; therefore, the respective simulation modeling needs to function properly. In this paper, the current dynamic route modeling of Simulation of Urban Mobility (SUMO) during evacuation, i.e. its rerouting functions, is examined with a real case study, and the consistency of the simulation results with reality is checked as well. Four influencing factors are considered to analyze possible traffic impacts during the evacuation and to examine the rerouting functions in SUMO: (1) the time to receive information, (2) the probability of canceling a trip, (3) the probability of using navigation equipment, and (4) the rerouting and information updating period. Furthermore, some behavioral characteristics of the case study are analyzed with the corresponding detector data and applied in the simulation. The experimental results show that the dynamic route modeling in SUMO can handle the proposed scenarios properly. Some issues and functional needs related to route choice are discussed, and further improvements are suggested.
Keywords: Evacuation, microscopic traffic simulation, rerouting, SUMO.
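As a rough illustration of how a periodic rerouting update can be driven from SUMO's TraCI interface, a minimal Python loop is sketched below. The configuration file name and the 60-step update period are assumptions for illustration, not the Brunswick case-study settings.

```python
# Minimal TraCI loop sketching periodic rerouting during an evacuation scenario.
import traci

traci.start(["sumo", "-c", "evacuation.sumocfg"])    # "evacuation.sumocfg" is a placeholder name
step = 0
while traci.simulation.getMinExpectedNumber() > 0:   # run until all vehicles have left the network
    traci.simulationStep()
    if step % 60 == 0:                                # rerouting / information updating period
        for veh_id in traci.vehicle.getIDList():
            traci.vehicle.rerouteTraveltime(veh_id)   # re-plan the route on current travel times
    step += 1
traci.close()
```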
621 An Edge Detection and Filtering Mechanism of Two Dimensional Digital Objects Based on Fuzzy Inference
Authors: Ayman A. Aly, Abdallah A. Alshnnaway
Abstract:
The general idea behind the filter is to average a pixel using other pixel values from its neighborhood, while simultaneously taking care of important image structures such as edges. The main concern of the proposed filter is to distinguish between variations of the captured digital image due to noise and those due to image structure. Edges give the image its appearance of depth and sharpness; a loss of edges makes the image appear blurred or unfocused. However, noise smoothing and edge enhancement are traditionally conflicting tasks: since most noise filtering behaves like a low-pass filter, the blurring of edges and loss of detail seem a natural consequence, and techniques to remedy this inherent conflict often generate new noise during enhancement. In this work a new fuzzy filter is presented for the noise reduction of images corrupted with additive noise. The filter consists of three stages: (1) fuzzy sets are defined in the input space to compute a fuzzy derivative for eight different directions, (2) a set of IF-THEN rules is constructed to perform fuzzy smoothing according to the contributions of neighboring pixel values, and (3) fuzzy sets are defined in the output space to obtain the filtered and edge-preserved image. Experimental results show the feasibility of the proposed approach on two-dimensional objects.
Keywords: Additive noise, edge preserving filtering, fuzzy image filtering, noise reduction, two dimensional mechanical images.
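A compact sketch in the spirit of the first two stages (directional derivatives in eight directions and a fuzzy "small difference" membership that drives the smoothing) is given below. The triangular membership and the threshold K are illustrative assumptions; the full three-stage rule base of the paper is not reproduced.

```python
import numpy as np

OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def fuzzy_smooth(img, K=30.0):
    """One pass of fuzzy smoothing: small directional differences (high 'small' membership)
    contribute to the correction, large ones (likely edges) are mostly preserved."""
    img = img.astype(float)
    out = img.copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            num = den = 0.0
            for dy, dx in OFFSETS:
                d = img[y + dy, x + dx] - img[y, x]      # directional derivative
                mu = max(0.0, 1.0 - abs(d) / K)          # triangular membership of 'small'
                num += mu * d
                den += mu
            if den > 0:
                out[y, x] = img[y, x] + num / den        # defuzzified correction
    return out
```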
620 Separating Permanent and Induced Magnetic Signature: A Simple Approach
Authors: O. J. G. Somsen, G. P. M. Wagemakers
Abstract:
Magnetic signature detection provides sensitive detection of metal objects, especially in the natural environment. Our group is developing a tabletop setup for measuring the magnetic signatures of various small and model objects. A particular issue is the separation of permanent and induced magnetization. While the latter depends only on the composition and shape of the object, the former also depends on the magnetization history. With common deperming techniques, a significant permanent signature may still remain, which confuses measurements of the induced component. We investigate a basic technique for separating the two. Measurements were done by moving the object along an aluminum rail while the three field components were recorded by a detector attached near the center. This is done first with the rail parallel to the Earth's magnetic field and then in the anti-parallel orientation. The reversal changes the sign of the induced, but not the permanent, magnetization, so that the two can be separated. Our preliminary results on a small iron block show excellent reproducibility. A considerable permanent magnetization was indeed present, resulting in a complex asymmetric signature. After separation, a much more symmetric induced signature was obtained that can be studied in detail and compared with theoretical calculations.
Keywords: Magnetic signature, data analysis, magnetization, deperming techniques.
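The separation described here reduces to simple arithmetic on the two traverses: the parallel run records permanent plus induced, the anti-parallel run permanent minus induced. A minimal sketch follows, assuming the two recordings are aligned sample by sample.

```python
import numpy as np

def separate(m_parallel, m_antiparallel):
    """Split two aligned signature recordings into permanent and induced parts.

    m_parallel  = permanent + induced   (rail parallel to the Earth field)
    m_antiparallel = permanent - induced (rail anti-parallel: induced part flips sign)
    """
    m1 = np.asarray(m_parallel, dtype=float)
    m2 = np.asarray(m_antiparallel, dtype=float)
    permanent = 0.5 * (m1 + m2)
    induced = 0.5 * (m1 - m2)
    return permanent, induced
```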
619 An Evaluation of Digital Elevation Models to Short-Term Monitoring of a High Energy Barrier Island, Northeast Brazil
Authors: Venerando E. Amaro, Francisco Gabriel F. de Lima, Marcelo S.T. Santos
Abstract:
The short-term morphological evolution of Ponta do Tubarão Island (PTI) was investigated through highly accurate surveys based on post-processed kinematic (PPK) relative positioning with Global Navigation Satellite Systems (GNSS). PTI is part of a barrier island system in a high-energy northeastern Brazilian coastal environment and is also an area of high environmental sensitivity. Surveys were carried out quarterly over a two-year period from May 2010 to May 2012. This paper statistically assesses the performance of digital elevation models (DEMs) derived from different interpolation methods in representing morphologic features and quantifying volumetric changes; TIN models showed the best results for these purposes. The DEMs allowed quantifying surfaces and volumes in detail, identifying the segments of PTI most vulnerable to erosion and/or accumulation of sediments, and relating the alterations to climate conditions. The coastal setting and geometry of PTI protect a significant mangrove ecosystem and some oil and gas facilities installed in the vicinity from the damaging effects of strong ocean waves and currents. Thus, the maintenance of PTI is extremely important, but the prediction of its longevity is uncertain because the results indicate an irregular sedimentary balance and a substantial decline in sediment supply to this coastal area.
Keywords: DEM, GNSS, short-term monitoring, Brazil.
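As a simplified illustration of the DEM differencing behind volumetric change estimates, the sketch below interpolates two survey epochs onto a common grid (SciPy's linear method triangulates the points, much like a TIN) and integrates the elevation difference. The grid cell size and the use of griddata are assumptions for illustration, not the authors' workflow.

```python
import numpy as np
from scipy.interpolate import griddata

def volume_change(pts_t1, z_t1, pts_t2, z_t2, cell=1.0):
    """Interpolate two GNSS survey epochs onto one grid and integrate the elevation
    difference into a net volume change (same length units as the inputs, cubed)."""
    all_pts = np.vstack([pts_t1, pts_t2])
    xi = np.arange(all_pts[:, 0].min(), all_pts[:, 0].max(), cell)
    yi = np.arange(all_pts[:, 1].min(), all_pts[:, 1].max(), cell)
    gx, gy = np.meshgrid(xi, yi)
    dem1 = griddata(pts_t1, z_t1, (gx, gy), method="linear")   # TIN-like linear interpolation
    dem2 = griddata(pts_t2, z_t2, (gx, gy), method="linear")
    dz = dem2 - dem1                              # positive = accumulation, negative = erosion
    return np.nansum(dz) * cell * cell            # NaNs outside the surveyed hull are ignored
```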
618 Anomaly Detection with ANN and SVM for Telemedicine Networks
Authors: Edward Guillén, Jeisson Sánchez, Carlos Omar Ramos
Abstract:
In recent years, a wide variety of applications have been developed with Support Vector Machine (SVM) methods and Artificial Neural Networks (ANN). In general, these methods depend on intrusion knowledge databases such as KDD99, ISCX, and CAIDA, among others. New classes of detectors are generated by machine learning techniques, trained and tested over network databases. Thereafter, the detectors are employed to detect anomalies in network communication scenarios according to the behavior of users' connections. The first detector, based on the training dataset, is deployed in different real-world networks with mobile and non-mobile devices to analyze the performance and accuracy of static detection. The vulnerabilities are based on previous work on telemedicine apps developed in the research group. This paper presents the differences in detection results between several network scenarios obtained by applying traditional detectors deployed with artificial neural networks and support vector machines.
Keywords: Anomaly detection, back-propagation neural networks, network intrusion detection systems, support vector machines.
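A minimal sklearn sketch of the two detector families compared in this entry (an SVM and a feed-forward neural network trained on labeled connection features) is shown below. The random arrays are hypothetical placeholders for features extracted from datasets such as KDD99 or ISCX, and the hyperparameters are illustrative.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import classification_report

# Placeholder connection-feature matrices and labels (0 = normal, 1 = anomaly).
X_train, y_train = np.random.rand(500, 10), np.random.randint(0, 2, 500)
X_test, y_test = np.random.rand(100, 10), np.random.randint(0, 2, 100)

for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("ANN", MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500))]:
    clf.fit(X_train, y_train)               # train the detector on labeled connections
    print(name)
    print(classification_report(y_test, clf.predict(X_test)))
```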
617 Hiding Data in Images Using PCP
Authors: Souvik Bhattacharyya, Gautam Sanyal
Abstract:
In recent years, everything is trending toward digitalization, and with the rapid development of Internet technologies, digital media need to be transmitted conveniently over the network. Attacks, misuse and unauthorized access to information are of great concern today, which makes the protection of documents transmitted through digital media a priority. This urges us to devise new data hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is a process that involves hiding a message in an appropriate carrier such as an image or an audio file. It is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver while no one except the authenticated receiver knows of the existence of the information. A considerable amount of work has been carried out by different researchers on steganography. In this work the authors propose a novel steganographic method for hiding information within the spatial domain of a gray scale image. The proposed approach works by selecting the embedding pixels using a mathematical function, finding the 8-neighborhood of each selected pixel, and mapping each bit of the secret message to a neighbor pixel coordinate position in a specified manner. Before embedding, a check is performed to find out whether the selected pixel or its neighbors lie at the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
Keywords: Cover image, LSB, Pixel Coordinate Position (PCP), stego image.
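A rough sketch of the embedding idea (message bits written into the least significant bits of the 8-neighborhood of selected pixels, with boundary pixels skipped) is given below. The pixel selection rule shown (every step-th pixel) and the plain LSB write are assumptions standing in for the paper's mathematical selection function and specified mapping, which the abstract does not fully detail.

```python
import numpy as np

NEIGHBORS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def embed(cover, bits, step=97):
    """Hide message bits in the LSBs of the 8-neighborhood of selected pixels.

    `step` is a hypothetical stand-in for the paper's pixel-selection function.
    """
    stego = cover.copy()
    h, w = cover.shape
    idx = 0
    for flat in range(0, h * w, step):
        y, x = divmod(flat, w)
        if not (0 < y < h - 1 and 0 < x < w - 1):    # skip boundary pixels, as in the paper
            continue
        for dy, dx in NEIGHBORS:
            if idx >= len(bits):
                return stego
            stego[y + dy, x + dx] = (stego[y + dy, x + dx] & 0xFE) | bits[idx]
            idx += 1
    return stego
```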
616 Computational Model for Prediction of Soil-Gas Radon-222 Concentration in Soil-Depths and Soil Grain Size Particles
Authors: I. M. Yusuff, O. M. Oni, A. A. Aremu
Abstract:
The percentage of soil-gas radon-222 (222Rn) concentration from different soil depths contributing to the outdoor atmospheric radon level depends largely on some physical parameters of the soil. To determine this dependency on soil depth, survey tests were carried out on soil depths and grain size particles using an in-situ measurement method of soil-gas radon-222 concentration at different soil depths. The measurements were carried out with an electronic active radon detector (RAD-7) manufactured by Durridge Company, USA. Radon-222 concentrations in soil gas were measured at four different soil depths of 20, 40, 60 and 100 cm at five feasible locations. At each soil depth, soil samples were collected for grain size particle analysis using a soil grasp sampler. The results showed that the highest radon-222 concentration (24,680 ± 1960 Bq m-3) was measured at 100 cm depth with the largest grain size fraction of 17.64%, while the lowest concentration (7370 ± 1139 Bq m-3) was measured at 100 cm depth with the smallest grain size fraction of 10.75%. A computational model was derived using the SPSS regression package. This model could serve as a yardstick for predicting soil-gas radon concentration with reference to soil grain size particles at different soil depths.
Keywords: Concentration, radon, porosity, diffusion, colorectal, emanation, yardstick.
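The SPSS regression model referred to here is, in form, a multiple linear regression of the radon concentration on depth and grain size. A minimal sklearn sketch of that form is shown below; the data arrays are hypothetical placeholders, not the paper's measurements or fitted coefficients.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical placeholder data (not the paper's measurements):
# columns = [soil depth in cm, grain size fraction in %], target = 222Rn in Bq/m^3.
X = np.array([[20, 11.0], [40, 12.5], [60, 14.0], [100, 17.6], [100, 10.8]])
y = np.array([8000.0, 11000.0, 14000.0, 21000.0, 9000.0])

model = LinearRegression().fit(X, y)
print("Rn = {:.1f} + {:.1f} * depth + {:.1f} * grain_size".format(
    model.intercept_, *model.coef_))
print("prediction at 80 cm, 15 %:", model.predict([[80.0, 15.0]])[0])
```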
615 An Examination and Validation of the Theoretical Resistivity-Temperature Relationship for Conductors
Authors: Fred Lacy
Abstract:
Electrical resistivity is a fundamental parameter of metals or electrical conductors. Since resistivity is a function of temperature, a temperature-dependent theoretical model is needed in order to completely understand the behavior of metals. A model based on physics principles has recently been developed to obtain an equation that relates electrical resistivity to temperature. This equation depends on a parameter associated with the electron travel time before scattering, and a parameter that relates the energy of the atoms to their separation distance. Analysis of the energy parameter reveals that the equation is optimized if the proportionality term in the equation is not constant but varies over the temperature range. Additional analysis reveals that the theoretical equation can be used to determine the mean free path of conduction electrons, the number of defects in the atomic lattice, and the ‘equivalent’ charge associated with the metallic bonding of the atoms. All of this analysis provides validation for the theoretical model and provides insight into the behavior of metals in applications where performance is affected by temperature (e.g., integrated circuits and temperature sensors).
Keywords: Callendar–van Dusen, conductivity, mean free path, resistance temperature detector, temperature sensor.
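For comparison, the empirical baseline named in the keywords is the Callendar-Van Dusen relation used for platinum resistance temperature detectors. A minimal sketch with the standard IEC 60751 coefficients is given below; it shows the conventional industry formula, not the theoretical model examined in the paper.

```python
def pt_resistance(t_celsius, r0=100.0):
    """Callendar-Van Dusen resistance of a platinum RTD (standard IEC 60751 coefficients)."""
    A = 3.9083e-3
    B = -5.775e-7
    C = -4.183e-12                     # the C term applies only below 0 degC
    if t_celsius >= 0:
        return r0 * (1 + A * t_celsius + B * t_celsius ** 2)
    return r0 * (1 + A * t_celsius + B * t_celsius ** 2
                 + C * (t_celsius - 100.0) * t_celsius ** 3)

# Example: a Pt100 at 25 degC is roughly 109.7 ohms.
print(pt_resistance(25.0))
```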
614 Investigating Polynomial Interpolation Functions for Zooming Low Resolution Digital Medical Images
Authors: Maninder Pal
Abstract:
Medical digital images usually have low resolution because of the nature of their acquisition. Therefore, this paper focuses on zooming these images to obtain the better level of information required for medical diagnosis. For this purpose, a strategy for selecting pixels in the zooming operation is proposed. It is based on the principle of an analog clock and utilizes a combination of point and neighborhood image processing. In this approach, the hour hand of the clock covers the portion of the image to be processed. For alignment, the center of the clock points at the middle pixel of the selected portion of the image. The minute hand is longer and is used to gain information about the pixels of the surrounding area, called the neighborhood pixel region. This information is used to zoom the selected portion of the image. The proposed algorithm is implemented and its performance is evaluated for many medical images obtained from various sources such as X-ray, Computerized Tomography (CT) scans and Magnetic Resonance Imaging (MRI). However, for illustration and simplicity, the results obtained from a CT-scanned image of a head are presented. The performance of the algorithm is evaluated in comparison with various traditional algorithms in terms of peak signal-to-noise ratio (PSNR), maximum error, SSIM index, mutual information and processing time. The results show that the proposed algorithm gives better performance than the traditional algorithms.
Keywords: Zooming, interpolation, medical images, resolution.
613 Swarm Intelligence based Optimal Linear Phase FIR High Pass Filter Design using Particle Swarm Optimization with Constriction Factor and Inertia Weight Approach
Authors: Sangeeta Mandal, Rajib Kar, Durbadal Mandal, Sakti Prasad Ghoshal
Abstract:
In this paper, an optimal design of a linear phase digital high pass finite impulse response (FIR) filter using Particle Swarm Optimization with Constriction Factor and Inertia Weight Approach (PSO-CFIWA) is presented. In the design process, the filter length, the pass band and stop band frequencies, and the feasible pass band and stop band ripple sizes are specified. FIR filter design is a multi-modal optimization problem, and conventional gradient-based optimization techniques are not efficient for digital filter design. Given the filter specifications to be realized, the PSO-CFIWA algorithm generates a set of optimal filter coefficients that tries to meet the ideal frequency response characteristic. In this paper, optimal FIR high pass filters of different orders have been designed for the given problem. The simulation results have been compared to those obtained by well-accepted algorithms such as the Parks-McClellan (PM) algorithm and the genetic algorithm (GA). The results show that the proposed optimal filter design approach using PSO-CFIWA outperforms PM and GA, not only in the accuracy of the designed filter but also in convergence speed and solution quality.
Keywords: FIR filter, PSO-CFIWA, PSO, Parks and McClellan algorithm, evolutionary optimization technique, magnitude response, convergence, high pass filter.
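A minimal sketch of the two ingredients named in this entry is given below: a magnitude-response error used as the particle fitness, and one PSO velocity/position update that combines an inertia weight with the constriction factor. The band edges, swarm constants and inertia value are illustrative assumptions, not the paper's design specifications.

```python
import numpy as np
from scipy.signal import freqz

def fir_fitness(h, wp=0.5, ws=0.4, n_grid=256):
    """Squared error between the actual magnitude response and an ideal high-pass response
    (pass band above wp*pi, stop band below ws*pi); smaller is better."""
    w, H = freqz(h, worN=n_grid)
    mag = np.abs(H)
    ideal = (w >= wp * np.pi).astype(float)
    grid = (w >= wp * np.pi) | (w <= ws * np.pi)      # ignore the transition band
    return np.sum((mag[grid] - ideal[grid]) ** 2)

def pso_cfiwa_step(x, v, pbest, gbest, w_inertia=0.7, c1=2.05, c2=2.05):
    """One PSO velocity/position update with both inertia weight and constriction factor."""
    phi = c1 + c2
    chi = 2.0 / abs(2.0 - phi - np.sqrt(phi * phi - 4.0 * phi))   # ~0.729 for phi = 4.1
    r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
    v = chi * (w_inertia * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x))
    return x + v, v
```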
612 Preconcentration and Determination of Cyproheptadine in Biological Samples by Hollow Fiber Liquid Phase Microextraction Coupled with High Performance Liquid Chromatography
Authors: Najari Moghadam Sh., Qomi M., Raofie F., Khadiv J.
Abstract:
In this study, liquid phase microextraction by hollow fiber (HF-LPME) combined with high performance liquid chromatography with a UV detector was applied to preconcentrate and determine trace levels of Cyproheptadine in human urine and plasma samples. Cyproheptadine was extracted from 10 mL of alkaline aqueous solution (pH 9.81) into an organic solvent (n-octanol) immobilized in the wall pores of a hollow fiber, and then back-extracted into an acidified aqueous solution (pH 2.59) located inside the lumen of the hollow fiber. This method is simple, efficient and cost-effective, and it is based on a pH gradient and the differences between the two aqueous phases. In order to optimize the HF-LPME, affecting parameters including the pH of the donor and acceptor phases, the type of organic solvent, ionic strength, stirring rate, extraction time and temperature were studied and optimized. Under optimal conditions, the enrichment factor, limit of detection (LOD) and relative standard deviation (RSD %, n = 3) were up to 112, 15 μg L−1 and 2.7, respectively.
Keywords: Biological samples, Cyproheptadine, hollow fiber, liquid phase microextraction.
611 Sperm Identification Using Elliptic Model and Tail Detection
Authors: Vahid Reza Nafisi, Mohammad Hasan Moradi, Mohammad Hosain Nasr-Esfahani
Abstract:
The conventional assessment of human semen is highly subjective, with considerable intra- and inter-laboratory variability. Computer-Assisted Sperm Analysis (CASA) systems provide a rapid and automated assessment of sperm characteristics, together with improved standardization and quality control. However, the outcome of CASA systems is sensitive to the method of experimentation. While conventional CASA systems use digital microscopes with phase-contrast accessories, producing higher-contrast images, we have used raw semen samples (no staining materials) and a regular light microscope, with a digital camera directly attached to its eyepiece, to ensure cost benefits and simple assembly of the system. However, since accurately finding the sperms in the semen image is the first step in the examination and analysis of the semen, any error in this step can affect the outcome of the analysis. This article introduces and explains an algorithm for finding sperms in low contrast images. First, an image enhancement algorithm is applied to remove extra particles from the image. Then, the foreground particles (including sperms and round cells) are segmented from the background. Finally, based on certain features and criteria, sperms are separated from other cells.
Keywords: Computer-Assisted Sperm Analysis (CASA), sperm identification, tail detection, elliptic shape model.