Search results for: facial feature detection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2322

162 Detection of Linkages Between Extreme Flow Measures and Climate Indices

Authors: Mohammed Sharif, Donald Burn

Abstract:

Large scale climate signals and their teleconnections can influence hydro-meteorological variables on a local scale. Several extreme flow and timing measures, including high flow and low flow measures, from 62 hydrometric stations in Canada are investigated to detect possible linkages with several large scale climate indices. The streamflow data used in this study are derived from the Canadian Reference Hydrometric Basin Network and are characterized by relatively pristine and stable land-use conditions with a minimum of 40 years of record. A composite analysis approach was used to identify linkages between extreme flow and timing measures and climate indices. The approach involves determining the 10 highest and 10 lowest values of various climate indices from the data record. Extreme flow and timing measures for each station were examined for the years associated with the 10 largest values and the years associated with the 10 smallest values. In each case, a re-sampling approach was applied to determine if the 10 values of extreme flow measures differed significantly from the series mean. Results indicate that several stations are impacted by the large scale climate indices considered in this study. The results allow the determination of any relationship between stations that exhibit a statistically significant trend and stations for which the extreme measures exhibit a linkage with the climate indices.

Keywords: flood analysis, low-flow events, climate change, trend analysis, Canada
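
As a minimal illustration of the re-sampling step described above, the following Python sketch tests whether a 10-year composite mean differs significantly from the series mean (the synthetic flow series, the choice of composite years, and the 10,000 resamples are illustrative assumptions, not the authors' exact procedure):

    import numpy as np

    rng = np.random.default_rng(0)

    def composite_test(flow, composite_idx, n_resamples=10_000):
        """Test whether the mean flow of the composite years differs
        significantly from the series mean via random re-sampling."""
        obs_dev = abs(flow[composite_idx].mean() - flow.mean())
        null = np.array([
            rng.choice(flow, size=len(composite_idx), replace=False).mean()
            for _ in range(n_resamples)
        ])
        # two-sided p-value: how often a random 10-year composite deviates
        # from the series mean at least as much as the observed composite
        return np.mean(np.abs(null - flow.mean()) >= obs_dev)

    # example: 40 years of annual maximum flows; the composite is the
    # 10 years with the highest climate-index values (indices assumed)
    flows = rng.gamma(shape=3.0, scale=100.0, size=40)
    print(composite_test(flows, composite_idx=np.arange(10)))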

161 Vision-Based Collision Avoidance for Unmanned Aerial Vehicles by Recurrent Neural Networks

Authors: Yao-Hong Tsai

Abstract:

Advances in sensor technology have made video surveillance the primary means of security control in large cities around the world. Surveillance is usually used by governments for intelligence gathering, the prevention of crime, the protection of a process, person, group or object, or the investigation of crime. Many surveillance systems based on computer vision technology have been developed in recent years. Moving-target tracking, in which an Unmanned Aerial Vehicle (UAV) finds and follows objects of interest, is the most common task in mobile aerial surveillance for civilian applications. This paper focuses on vision-based collision avoidance for UAVs using recurrent neural networks. First, images from the cameras on the UAV are fused by a deep convolutional neural network. A recurrent neural network is then constructed to obtain high-level image features for object tracking and to extract low-level image features for noise reduction. The system distributes computation between local and cloud platforms to efficiently perform object detection, tracking, and collision avoidance across multiple UAVs. Experiments on several challenging datasets showed that the proposed algorithm outperforms state-of-the-art methods.

Keywords: Unmanned aerial vehicle, object tracking, deep learning, collision avoidance.

160 Through Biometric Card in Romania: Person Identification by Face, Fingerprint and Voice Recognition

Authors: Hariton N. Costin, Iulian Ciocoiu, Tudor Barbu, Cristian Rotariu

Abstract:

In this paper, three different approaches to person verification and identification, by means of fingerprint, face, and voice recognition, are studied. Face recognition uses parts-based representation methods and a manifold learning approach; the assessment criterion is recognition accuracy. The techniques under investigation are: a) Local Non-negative Matrix Factorization (LNMF); b) Independent Components Analysis (ICA); c) NMF with sparse constraints (NMFsc); d) Locality Preserving Projections (Laplacianfaces). Fingerprint detection was approached by classical minutiae (small graphical patterns) matching through image segmentation, using a structural approach and a neural network as the decision block. For voice/speaker recognition, mel-cepstral and delta-delta mel-cepstral analysis were used as the main methods to construct a supervised, speaker-dependent voice recognition system. The final decision (e.g., "accept/reject" for a verification task) is taken by a majority voting technique applied to the three biometrics. The preliminary results, obtained on medium-sized databases of fingerprints, faces, and voice recordings, indicate the feasibility of our study, with an overall recognition precision of about 92%, permitting the use of our system in a future complex biometric card.

Keywords: Biometry, image processing, pattern recognition, speech analysis.
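
The final decision stage lends itself to a very small sketch; the function below is a hypothetical Python rendering of the majority vote over the three verifiers, not code from the paper:

    def majority_vote(face_accept, fingerprint_accept, voice_accept):
        """Accept the identity claim if at least two of the three
        biometric verifiers (face, fingerprint, voice) accept it."""
        votes = int(face_accept) + int(fingerprint_accept) + int(voice_accept)
        return "accept" if votes >= 2 else "reject"

    print(majority_vote(True, False, True))   # -> accept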

159 Performance Analysis of HSDPA Systems Using Low-Density Parity-Check (LDPC) Coding as Compared to Turbo Coding

Authors: K. Anitha Sheela, J. Tarun Kumar

Abstract:

HSDPA is a new feature introduced in the Release 5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. The HSDPA concept offers an outstanding improvement in packet throughput and significantly reduces the packet call transfer delay compared to the Release 99 DSCH. To date, HSDPA systems have used turbo coding, a coding technique that approaches the Shannon limit. However, the main drawbacks of turbo coding are high decoding complexity and high latency, which make it unsuitable for some applications such as satellite communications, where the transmission distance itself introduces latency due to the finite speed of light. This paper therefore proposes using LDPC coding in place of turbo coding for the HSDPA system, which decreases latency and decoding complexity, although LDPC coding increases the encoding complexity. While the transmitter complexity at the NodeB increases, the end user benefits in terms of receiver complexity and bit error rate (BER). The LDPC encoder is implemented using a sparse parity-check matrix H to generate codewords, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that with LDPC coding the BER drops sharply as the number of iterations increases, for only a small increase in Eb/N0, which is not possible with turbo coding. The same BER was also achieved using fewer iterations, so both latency and receiver complexity are reduced with LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better quality and more reliable and robust data services: while realistic data rates are only a few Mbps, the achievable quality and number of users improve significantly.

Keywords: AMC, HSDPA, LDPC, WCDMA, 3GPP.
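
As a rough Python sketch of the decoding idea, the toy example below uses a tiny parity-check matrix and Gallager-style hard-decision bit flipping, the simplest member of the same message-passing family as the belief propagation decoder used in the paper; the (6, 3) code and single-error channel are illustrative assumptions:

    import numpy as np

    # toy LDPC-style parity-check matrix (edge-vertex incidence of K4):
    # every bit is in exactly 2 checks, any 2 bits share at most 1 check
    H = np.array([[1, 1, 1, 0, 0, 0],
                  [1, 0, 0, 1, 1, 0],
                  [0, 1, 0, 1, 0, 1],
                  [0, 0, 1, 0, 1, 1]])

    def bit_flip_decode(r, n_iter=10):
        """Repeatedly flip the bit taking part in the most unsatisfied
        parity checks; fewer iterations are needed as the channel
        improves, matching the trend reported in the abstract."""
        c = r.copy()
        for _ in range(n_iter):
            syndrome = H @ c % 2
            if not syndrome.any():        # all parity checks satisfied
                break
            unsat = syndrome @ H          # per-bit unsatisfied-check count
            c[np.argmax(unsat)] ^= 1
        return c

    c = np.array([1, 1, 0, 1, 0, 0])      # a valid codeword (H @ c % 2 == 0)
    r = c.copy(); r[4] ^= 1               # one hard-decision channel error
    print(np.array_equal(bit_flip_decode(r), c))   # -> True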

158 Discrete and Stationary Adaptive Sub-Band Threshold Method for Improving Image Resolution

Authors: P. Joyce Beryl Princess, Y. Harold Robinson

Abstract:

Image processing is a branch of signal processing in which the input is an image and the output is either an image or a set of parameters of the image. Resolution is frequently regarded as an important attribute of an image, and in image resolution enhancement, images are processed to obtain a high-resolution output, with a high PSNR value, from a low-resolution input. In the proposed method, the stationary wavelet transform (SWT) is used for edge detection and to minimize the loss that occurs during downsampling, while the inverse discrete wavelet transform (IDWT) reconstructs the highly resolved image. Since a noisy input would produce an output with a low PSNR value, adaptive sub-band thresholding is used for noise-robust resolution enhancement. Downsampling in each of the DWT subbands causes information loss in the respective subbands; SWT is employed to minimize this loss. The IDWT then converts the object downsampled by the DWT into a highly resolved one. Together, the image denoising and resolution enhancement techniques generate an image with a high PSNR value. The proposed method improves image resolution and reaches the optimal threshold.

Keywords: Image Processing, Inverse Discrete wavelet transform, PSNR.
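
A minimal Python sketch of the DWT+SWT enhancement idea, using the pywt package, might look as follows; the Haar wavelet, the spline interpolation of the subbands, and the random test image are illustrative assumptions rather than the paper's exact pipeline:

    import numpy as np
    import pywt
    from scipy.ndimage import zoom

    def enhance(lr, wavelet="haar"):
        """Double the resolution of an image: DWT detail subbands are
        interpolated back to the input size, the full-size SWT detail
        subbands compensate the loss caused by DWT downsampling, and
        IDWT with the input image as the approximation band upsamples."""
        _, (LH, HL, HH) = pywt.dwt2(lr, wavelet)
        [(_, (sLH, sHL, sHH))] = pywt.swt2(lr, wavelet, level=1)
        details = [zoom(d, 2.0) + s for d, s in
                   ((LH, sLH), (HL, sHL), (HH, sHH))]
        return pywt.idwt2((lr, tuple(details)), wavelet)

    def psnr(ref, img):
        """The quality metric the method is judged by."""
        mse = np.mean((ref - img) ** 2)
        return 10 * np.log10(ref.max() ** 2 / mse)

    lr = np.random.default_rng(0).random((128, 128))
    print(enhance(lr).shape)   # -> (256, 256); psnr() scores against a ground truth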

157 Accurate Position Electromagnetic Sensor Using Data Acquisition System

Authors: Z. Ezzouine, A. Nakheli

Abstract:

This paper presents a high-accuracy position electromagnetic sensor system (HPESS) that is applicable to moving-object detection. The authors have developed a high-performance position sensor prototype dedicated to students' laboratories. The challenge was to obtain a highly accurate, real-time sensor able to calculate position, length, or displacement. An electromagnetic solution based on a two-coil induction principle was adopted. The HPESS converts mechanical motion to electric energy through direct contact, and the output signal can then be fed to an electronic circuit. The voltage change at the sensor output is captured by a data acquisition system using LabVIEW software, from which the displacement of the moving object is determined. The measured data are transmitted to a PC in real time via a DAQ (NI USB-6281). This paper also describes the data acquisition analysis and the conditioning card developed specifically for sensor signal monitoring. The data are then recorded and viewed through a user interface written in National Instruments LabVIEW. On-line displays of the time and voltage of the sensor signal provide a user-friendly data acquisition interface. The sensor provides an uncomplicated, accurate, reliable, and inexpensive transducer for highly sophisticated control systems.

Keywords: Electromagnetic sensor, data acquisition, accuracy, position measurement.
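
The paper's acquisition chain is built in LabVIEW; a roughly equivalent sketch in Python with the nidaqmx package is shown below, where the channel name, sample rate, and the linear volts-to-millimetre calibration constant are assumptions:

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    VOLTS_PER_MM = 0.05    # hypothetical sensor calibration constant

    with nidaqmx.Task() as task:
        # sensor output wired to analog input ai0 of the NI USB-6281
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        task.timing.cfg_samp_clk_timing(
            rate=1000, sample_mode=AcquisitionType.CONTINUOUS)
        task.start()
        samples = task.read(number_of_samples_per_channel=1000)
        # convert the measured voltages to displacements
        displacement_mm = [v / VOLTS_PER_MM for v in samples]
        print(sum(displacement_mm) / len(displacement_mm))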

156 ADA Tool for Satellite InSAR-Based Ground Displacement Analysis: The Granada Region

Authors: M. Cuevas-González, O. Monserrat, A. Barra, C. Reyes-Carmona, R. M. Mateos, J. P. Galve, R. Sarro, M. Cantalejo, E. Peña, M. Martínez-Corbella, J. A. Luque, J. M. Azañón, A. Millares, M. Béjar, J. A. Navarro, L. Solari

Abstract:

Geohazard-prone areas require continuous monitoring to detect risks, understand the phenomena occurring in those regions, and prevent disasters. Satellite interferometry (InSAR) has become a trustworthy technique for ground movement detection and monitoring in recent years. InSAR-based techniques allow large areas to be processed, providing a high number of displacement measurements at low cost. However, the results provided by such techniques are usually not easy for non-experienced users to interpret, which hampers their use by decision makers. This work presents a set of tools developed in the framework of different projects (Momit, Safety, U-Geohaz, Riskcoast), and an example of their use in the Granada coastal area (Spain) is shown. The ADA (Active Displacement Areas) tool has been developed with the aim of easing the management, use, and interpretation of InSAR-based results. It provides semi-automatic extraction of the most significant ADAs through the ADAFinder application. This tool aims to support the exploitation of the European Ground Motion Service (EU-GMS), which will provide reliable and systematic information on natural and anthropogenic ground motion phenomena across Europe.

Keywords: Ground displacements, InSAR, natural hazards, satellite imagery.

155 Palestine Smart Tourism Augmented Reality Mobile Application

Authors: Murad Al-Rajab, Sherin Hazboun, Azhar Al-Hamamreh, Nirmeen Odeh, Siham Halaseh

Abstract:

Tourism is an important sector for most countries, and maintaining good tourism attractions can promote national economic development. The State of Palestine is historically rich in archaeological sites. In the city of Bethlehem, for example, the Church of the Nativity is the most important tourist site, but it lacks the technological development needed to attract tourists. In this paper, we propose a smart mobile application named "Pal-STAR" (Palestine Smart Tourist Augmented Reality) as an innovative solution that targets tourists and assists them during their visit inside the Church of the Nativity. The application uses augmented reality and features a virtual tourist guide showing views of the church while providing historical information in a smart, easy, effective, and user-friendly way. It is compatible with multiple mobile platforms. The findings show that this application will improve the practice of the tourism sector in the Holy Land, increase the number of tourists visiting the Church of the Nativity, and facilitate access to historical data that have been difficult to obtain through traditional tourism guidance. The value that tourism adds to a country cannot be denied, and the more technological advances are incorporated in this sector, the better the country's tourism sector can be served. Palestine's economy is heavily dependent on tourism in many of its main cities, despite several limitations, and technological development is needed to enable this sector to flourish. The proposed mobile application would have a positive impact on the development of the tourism sector by creating an augmented reality environment for tourists inside the church, helping them to navigate and learn about holy places in a non-traditional way, guided by a virtual tourist guide.

Keywords: Smartphones, tourism, tourist guide, augmented reality, Palestine.

154 Texture Based Weed Detection Using Multi Resolution Combined Statistical and Spatial Frequency (MRCSF)

Authors: R. S. Sabeenian, V. Palanisamy

Abstract:

Texture classification is a popular and attractive technology within texture analysis. Textures, as repeated patterns, have different frequency components along different orientations, and texture classification finds applications in fields such as medical image classification, computer vision, remote sensing, agriculture, and the textile industry. Weed control has a major effect on agriculture: large amounts of herbicide are used to control weeds in agricultural fields, lawns, golf courses, sports fields, etc. Random spraying of herbicides does not meet the exact requirements of the field, since certain areas have more weed patches than estimated. A visual system that can discriminate weeds in an image of the field would therefore reduce, or even eliminate, the amount of herbicide used, allowing farmers to apply herbicides only where they are needed. A machine-vision precision automated weed control system could thus reduce the use of chemicals in crop fields. In this paper, an intelligent system for automatic weeding based on Multi Resolution Combined Statistical and Spatial Frequency (MRCSF) is used to discriminate weeds from crops and to classify them as narrow, little, or broad weeds.

Keywords: Crop weed discrimination, MRCSF, MRFM, weed detection, spatial frequency.
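
For illustration, texture descriptors of the kind used for weed/crop discrimination can be computed from gray-level co-occurrence statistics; the scikit-image sketch below is a stand-in for the paper's MRCSF features, and the random patch is synthetic:

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def texture_features(patch):
        """Co-occurrence texture descriptors for one grayscale patch
        (uint8); weed and crop patches can then be separated by any
        standard classifier trained on these feature vectors."""
        glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        return np.array([graycoprops(glcm, prop).mean() for prop in
                         ("contrast", "homogeneity", "energy", "correlation")])

    patch = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
    print(texture_features(patch))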

153 Applying the Regression Technique for Prediction of the Acute Heart Attack

Authors: Paria Soleimani, Arezoo Neshati

Abstract:

Myocardial infarction is one of the leading causes of death in the world, and some of these deaths occur even before the patient reaches the hospital. Myocardial infarction occurs as a result of impaired blood supply, and because most of these deaths are due to coronary artery disease, awareness of the warning signs of a heart attack is essential. Some heart attacks are sudden and intense, but most start slowly, with mild pain or discomfort, so early detection and successful treatment of these symptoms is vital. A system designed to assist physicians in the early diagnosis of acute heart attacks is therefore clearly important and useful. The main purpose of this study is to enable patients to become better informed about their condition and to encourage them to seek professional care at an earlier stage in appropriate situations. For this purpose, data were collected on 711 heart patients in Iranian hospitals, and 28 clinical attributes that can be reported by patients themselves were studied. Three logistic regression models were built on the basis of the 28 features to predict the risk of a heart attack. The best-performing logistic regression model had a C-index of 0.955 and an accuracy of 94.9%. The variables severe chest pain, back pain, cold sweats, shortness of breath, and nausea and vomiting were selected as the main features.

Keywords: Coronary heart disease, acute heart attacks, prediction, logistic regression.
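
A minimal scikit-learn sketch of the modeling step is shown below, on synthetic stand-in data with the study's dimensions (711 patients, 28 attributes); for a binary outcome, the C-index equals the area under the ROC curve:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score, roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(711, 28))     # synthetic stand-in for the 28 attributes
    y = (X[:, 0] + X[:, 1] + rng.normal(size=711) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

    print("accuracy:", accuracy_score(y_te, model.predict(X_te)))
    # C-index for a binary outcome = area under the ROC curve
    print("C-index :", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))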

152 Two-Level Identification of HVAC Consumers for Demand Response Potential Estimation Based on Setpoint Change

Authors: M. Naserian, M. Jooshaki, M. Fotuhi-Firuzabad, M. Hossein Mohammadi Sanjani, A. Oraee

Abstract:

In recent years, the development of communication infrastructure and smart meters has facilitated the utilization of demand-side resources, which can enhance the stability and economic efficiency of power systems. Direct load control programs can play an important role in the utilization of demand-side resources in the residential sector. However, the investments required for installing control equipment can be a limiting factor in the development of such demand response programs. Thus, the selection of consumers with higher potential is crucial to the success of a direct load control program. Heating, ventilation, and air conditioning (HVAC) systems, which, due to the heat capacity of buildings, feature relatively high flexibility, make up a major part of household consumption. Considering that the consumption of HVAC systems depends highly on the ambient temperature, and bearing in mind the high investments required for the control systems enabling direct load control demand response programs, this paper presents a solution to uncover consumers with high air conditioner demand among a large number of consumers and to measure the demand response potential of such consumers. This can pave the way for estimating the investments needed for the implementation of direct load control programs for residential HVAC systems and for estimating the demand response potentials in a distribution system. In doing so, we first cluster consumers into several groups based on the correlation coefficients between hourly consumption data and hourly temperature data, using the K-means algorithm. Then, by applying a recent algorithm to the hourly consumption and temperature data, consumers with high air conditioner consumption are identified. Finally, the demand response potential of such consumers is estimated based on the equivalent desired temperature setpoint changes.

Keywords: Data-driven analysis, demand response, direct load control, HVAC system.
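
The first, clustering step can be sketched as follows; the synthetic load and temperature series and the choice of four clusters are assumptions for illustration:

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    hours = 24 * 30
    temp = 20 + 10 * np.sin(np.linspace(0, 8 * np.pi, hours))   # hourly temperature
    # 100 consumers: random base load plus a random temperature sensitivity
    loads = rng.random((100, hours)) + rng.random((100, 1)) * temp

    # correlation of each consumption profile with ambient temperature
    corr = np.array([np.corrcoef(load, temp)[0, 1] for load in loads])

    # group consumers by that correlation; high-correlation clusters are
    # the candidates for heavy air-conditioner usage
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
        corr.reshape(-1, 1))
    print(np.bincount(labels))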

151 Combating Money Laundering in the Banking Industry: Malaysian Experience

Authors: Aspalella A. Rahman

Abstract:

Money laundering has been described by many as the lifeblood of crime and is a major threat to the economic and social well-being of societies. It has been recognized that the banking system has long been a central element of money laundering, in part due to the complexity and confidentiality of the banking system itself. It is generally accepted that effective anti-money laundering (AML) measures adopted by banks will make it tougher for criminals to get their "dirty money" into the financial system. In fact, for law enforcement agencies, banks are considered an important source of valuable information for the detection of money laundering. However, from the banks' perspective, the main reason for their existence is to make as much profit as possible, so their cultural and commercial interests are totally distinct from those of the law enforcement authorities. Undoubtedly, AML laws create a major dilemma for banks, as they produce a significant shift in the way banks interact with their customers. Furthermore, the implementation of the laws not only creates significant compliance problems for banks but also has the potential to adversely affect their operations. As such, it is legitimate to ask whether these laws are effective in preventing money launderers from using banks, or whether they simply put an unreasonable burden on banks and their customers. This paper attempts to address these issues and analyze them against the background of the Malaysian AML laws. Effective coordination between the AML regulator and the banking industry is vital to minimize the problems faced by banks and thereby ensure effective implementation of the laws in combating money laundering.

Keywords: Banking industry, Bank Negara, money laundering, Malaysia.

150 Performance Assessment of Computational Grid on Weather Indices from HOAPS Data

Authors: Madhuri Bhavsar, Anupam K Singh, Shrikant Pradhan

Abstract:

Long-term rainfall analysis and prediction is a challenging task, especially in the modern world, where the impact of global warming is creating complications in environmental issues. These data-intensive factors require high-performance computational modeling for accurate prediction. This research paper describes a prototype designed and developed in a grid environment using a number of coupled software infrastructural building blocks. This grid-enabled system provides the demanded computational power, efficiency, resources, a user-friendly interface, secure job submission, and high throughput. The results obtained using sequential execution and grid-enabled execution show that computational performance improved by 36% to 75% for a decade of climate parameters. The large variation in performance can be attributed to the varying degree of computational resources available for job execution. Grid computing enables the dynamic runtime selection, sharing, and aggregation of distributed and autonomous resources, which plays an important role not only in business but also in science and society. This paper explores grid-enabled computing capabilities on weather indices from HOAPS data for climate impact modeling and change detection.

Keywords: Climate model, computational grid, grid application, heterogeneous grid.

149 An Intelligent Combined Method Based on Power Spectral Density, Decision Trees and Fuzzy Logic for Hydraulic Pumps Fault Diagnosis

Authors: Kaveh Mollazade, Hojat Ahmadi, Mahmoud Omid, Reza Alimardani

Abstract:

Recently, the issue of machine condition monitoring and fault diagnosis as part of a maintenance system has attracted global attention due to the potential advantages to be gained from reduced maintenance costs, improved productivity, and increased machine availability. The aim of this work is to investigate the effectiveness of a new fault diagnosis method based on the power spectral density (PSD) of vibration signals in combination with decision trees and a fuzzy inference system (FIS). To this end, a series of studies was conducted on an external gear hydraulic pump. After a test under normal conditions, a number of machine defect conditions were introduced at three pump speeds (1000, 1500, and 2000 rpm): (i) journal bearing with inner face wear (BIFW), (ii) gear with tooth face wear (GTFW), and (iii) journal bearing with inner face wear plus gear with tooth face wear (B&GW). Features were extracted from the PSD values of the vibration signals using descriptive statistical parameters. The J48 algorithm was used as a feature selection procedure to pick pertinent features from the data set, and its output was employed to produce the crisp if-then rules and membership function sets. The structure of the FIS classifier was then defined based on the crisp sets. To evaluate the proposed PSD-J48-FIS model, the data sets obtained from the pump's vibration signals were used. Results showed that the total classification accuracy for the 1000, 1500, and 2000 rpm conditions was 96.42%, 100%, and 96.42%, respectively. These results indicate that the combined PSD-J48-FIS model has potential for the fault diagnosis of hydraulic pumps.

Keywords: Power spectral density, machine condition monitoring, hydraulic pump, fuzzy logic.
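
A rough Python outline of the feature-extraction and classification stages is given below; Welch's method stands in for the paper's PSD estimate, and scikit-learn's CART decision tree stands in for J48 (a C4.5 implementation), so this is a sketch of the pipeline rather than a reproduction:

    import numpy as np
    from scipy.signal import welch
    from sklearn.tree import DecisionTreeClassifier

    def psd_features(vibration, fs):
        """Descriptive statistics of the vibration PSD for one record."""
        f, pxx = welch(vibration, fs=fs, nperseg=1024)
        return [pxx.mean(), pxx.std(), pxx.max(), f[np.argmax(pxx)]]

    rng = np.random.default_rng(0)
    # synthetic stand-ins for the four pump conditions (normal, BIFW,
    # GTFW, B&GW), 10 records each, recorded at an assumed fs = 10 kHz
    X = [psd_features(rng.normal(size=8192), fs=10_000) for _ in range(40)]
    y = np.repeat([0, 1, 2, 3], 10)
    tree = DecisionTreeClassifier(random_state=0).fit(X, y)
    print(tree.score(X, y))   # training accuracy on the toy data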

148 Automatic Segmentation of Dermoscopy Images Using Histogram Thresholding on Optimal Color Channels

Authors: Rahil Garnavi, Mohammad Aldeen, M. Emre Celebi, Alauddin Bhuiyan, Constantinos Dolianitis, George Varigos

Abstract:

Automatic segmentation of skin lesions is the first step towards the development of a computer-aided diagnosis of melanoma. Although numerous segmentation methods have been developed, few studies have focused on determining the most discriminative and effective color space for the melanoma application. This paper proposes a novel automatic segmentation algorithm using color space analysis and clustering-based histogram thresholding, which is able to determine the optimal color channel for segmentation of skin lesions. To demonstrate the validity of the algorithm, it is tested on a set of 30 high-resolution dermoscopy images, and a comprehensive evaluation of the results is provided in which borders manually drawn by four dermatologists are compared to the automated borders detected by the proposed algorithm. The evaluation is carried out by applying three previously used metrics of accuracy, sensitivity, and specificity, and a new metric of similarity. Through ROC analysis and ranking of the metrics, it is shown that the best results are obtained with the X and XoYoR color channels, which yield an accuracy of approximately 97%. The proposed method is also compared with two state-of-the-art skin lesion segmentation methods, which demonstrates the effectiveness and superiority of the proposed segmentation method.

Keywords: Border detection, Color space analysis, Dermoscopy, Histogram thresholding, Melanoma, Segmentation.
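
The core operation, histogram thresholding of a single well-chosen color channel, can be sketched as follows; Otsu's method is used here as a stand-in for the paper's clustering-based threshold, and the random image and sRGB-to-X conversion are illustrative:

    import numpy as np
    from skimage.filters import threshold_otsu

    def segment_lesion(channel):
        """Binary lesion mask from one (optimally chosen) color channel
        via histogram thresholding; lesions are typically darker than
        the surrounding skin, hence the '<' comparison."""
        return channel < threshold_otsu(channel)

    rgb = np.random.default_rng(0).random((256, 256, 3))
    # X channel of CIE XYZ as a linear combination of R, G, B
    x_channel = rgb @ np.array([0.4124, 0.3576, 0.1805])
    mask = segment_lesion(x_channel)
    print(mask.mean())      # fraction of pixels labelled as lesion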

147 Support Vector Machine Prediction Model of Early-stage Lung Cancer Based on Curvelet Transform to Extract Texture Features of CT Image

Authors: Guo Xiuhua, Sun Tao, Wu Haifeng, He Wen, Liang Zhigang, Zhang Mengxia, Guo Aimin, Wang Wei

Abstract:

Purpose: To explore the use of the curvelet transform to extract texture features of pulmonary nodules in CT images, and of support vector machines to establish a prediction model for small solitary pulmonary nodules, in order to improve the detection and diagnosis rate of early-stage lung cancer. Methods: 2461 benign or malignant small solitary pulmonary nodules in CT images from 129 patients were collected. Fourteen curvelet-transform texture features were used as parameters to establish the support vector machine prediction model. Results: Compared with other methods, using 252 texture features as parameters to establish the prediction model proved more appropriate. The classification consistency, sensitivity, and specificity of the model were 81.5%, 93.8%, and 38.0%, respectively. Conclusion: Based on texture features extracted by the curvelet transform, the support vector machine prediction model is sensitive to lung cancer and can improve the diagnosis rate for early-stage lung cancer to some extent.

Keywords: CT image, Curvelet transform, Small pulmonary nodules, Support vector machines, Texture extraction.
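
The prediction-model step can be sketched with scikit-learn as below; since no standard Python library provides the curvelet transform, the texture features are assumed to be precomputed (one row per nodule), and the feature matrix here is synthetic:

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    feats = rng.normal(size=(2461, 14))      # 14 curvelet features per nodule (stand-in)
    labels = rng.integers(0, 2, size=2461)   # benign (0) / malignant (1), synthetic

    # scale features, then fit an RBF-kernel SVM, scored by cross-validation
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    print(cross_val_score(clf, feats, labels, cv=5).mean())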

146 A Real Time Development Study for Automated Centralized Remote Monitoring System at Royal Belum Forest

Authors: Amri Yusoff, Shahrizuan Shafiril, Ashardi Abas, Norma Che Yusoff

Abstract:

Nowadays, illegal logging has been causing many harmful effects, including flash floods, avalanches, and global warming. The purpose of this study was to help preserve the earth's ecosystem by protecting Malaysia's treasured rainforest, utilizing a new technology that assists in real-time alerting and gives the authorities a faster response to act on these illegal activities. The methodology of this research covers the design stages that were conducted, the system model and system architecture of the prototype, and the hardware and software mainly used, namely a microcontroller and sensors with an integrated GSM and GPS system. The prototype was deployed at the Royal Belum forest in December 2014 for phase 1 and April 2015 for phase 2, at 21 pinpoint locations. The findings of this research were the capture of data in real time, such as temperature, humidity, gas, fire, and rain detection, which indicate the current natural state and habitat of the forest. In addition, the device's current location can be detected via GPS and transmitted by SMS over the GSM system, and all of its readings are sent in real time for further analysis. The data, when compared to the meteorological department's, showed that the precision of this device was about 95%, and these findings proved that the system is acceptable and suitable to be used in the field.

Keywords: Remote monitoring system, forest data, GSM, GPS, wireless sensor.

145 Antibody-Conjugated Nontoxic Arginine-Doped Fe3O4 Nanoparticles for Magnetic Circulating Tumor Cells Separation

Authors: F. Kashanian, M. M. Masoudi, A. Akbari, A. Shamloo, M. R. Zand, S. S. Salehi

Abstract:

Nano-sized materials present new opportunities in biology and medicine, where they are used as biomedical tools for investigation and for the separation of molecules and cells. To achieve more effective cancer therapy, it is essential to select cancer cells precisely. This research suggests that antibody-functionalized, nontoxic, arginine-doped magnetic nanoparticles (A-MNPs) are successful in the detection, capture, and magnetic separation of circulating tumor cells (CTCs) in tumor tissue. In this study, A-MNPs were synthesized via a simple precipitation reaction, and Ep-CAM EBA-1 antibodies were directly immobilized on the superparamagnetic A-MNPs, targeting Mucin BCA-225 in breast cancer cells. The samples were characterized by vibrating sample magnetometry (VSM), FT-IR spectroscopy, transmission electron microscopy (TEM), and scanning electron microscopy (SEM). The antibody-functionalized, nontoxic A-MNPs were used to capture breast cancer cells, and by employing a strong permanent magnet, the magnetic separation was achieved within a few seconds. Antibody-conjugated, nontoxic, arginine-doped Fe3O4 nanoparticles have the potential in future studies to capture CTCs released from tumor tissue and to deliver drugs, and these results demonstrate that the antibody-conjugated A-MNPs can be used in magnetic hyperthermia techniques for cancer treatment.

Keywords: Tumor tissue, antibody, magnetic nanoparticle, CTCs capturing.

144 A Real Time Set Up for Retrieval of Emotional States from Human Neural Responses

Authors: Rashima Mahajan, Dipali Bansal, Shweta Singh

Abstract:

Real-time, non-invasive brain-computer interfaces play a significant and progressive role in restoring or maintaining quality of life for medically challenged people. This manuscript provides a comprehensive review of emerging research in the field of cognitive/affective computing in the context of human neural responses. The perspectives of different emotion assessment modalities, such as facial expressions, speech, text, gestures, and human physiological responses, are also discussed. Particular attention is paid to the ability of EEG (electroencephalogram) signals to portray thoughts, feelings, and unspoken words. An automated, workflow-based protocol is proposed for designing an EEG-based real-time brain-computer interface system for the analysis and classification of human emotions elicited by external audio/visual stimuli. The front-end hardware includes a cost-effective and portable Emotiv EEG Neuroheadset unit, a personal computer, and a set of external stimulators. Primary signal analysis and processing of the real-time acquired EEG is to be performed using the MATLAB-based advanced brain mapping toolboxes EEGLab/BCILab, followed by the development of a self-defined MATLAB algorithm to capture and characterize temporal and spectral variations in the EEG under emotional stimulation. The extracted hybrid feature set is then used to classify emotional states using artificial intelligence tools such as artificial neural networks. The final system would provide an inexpensive, portable, and more intuitive brain-computer interface for controlling prosthetic devices in real time by translating different brain states into operative control signals.

Keywords: Brain Computer Interface (BCI), Electroencephalogram (EEG), EEGLab, BCILab, Emotiv, Emotions, Interval features, Spectral features, Artificial Neural Network, Control applications.
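
One of the spectral features such a system would extract, band power in the classical EEG rhythms, can be sketched with SciPy; the 128 Hz rate matches the Emotiv headset, while the synthetic one-channel signal is a stand-in for real recordings:

    import numpy as np
    from scipy.signal import welch

    def band_power(eeg, fs, band):
        """Average power of one EEG channel inside a frequency band,
        integrated from the Welch PSD estimate."""
        f, pxx = welch(eeg, fs=fs, nperseg=2 * fs)
        sel = (f >= band[0]) & (f <= band[1])
        return np.trapz(pxx[sel], f[sel])

    fs = 128                                  # Emotiv headset sampling rate
    t = np.arange(0, 10, 1 / fs)
    # synthetic channel: a 10 Hz (alpha) rhythm buried in noise
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)

    for name, band in (("alpha", (8, 13)), ("beta", (13, 30))):
        print(name, band_power(eeg, fs, band))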

143 “Post-Industrial” Journalism as a Creative Industry

Authors: Lynette Sheridan Burns, Benjamin J. Matthews

Abstract:

The context of post-industrial journalism is one in which the material circumstances of mechanical publication have been displaced by digital technologies, increasing the distance between the orthodoxy of the newsroom and the culture of journalistic writing. Content is, with growing frequency, created for delivery via the internet, publication on web-based ‘platforms’ and consumption on screen media. In this environment, the question is not ‘who is a journalist?’ but ‘what is journalism?’ today. The changes bring into sharp relief new distinctions between journalistic work and journalistic labor, providing a key insight into the current transition between the industrial journalism of the 20th century and the post-industrial journalism of the present. In the 20th century, the work of journalists and journalistic labor went hand in hand, as most journalists were employees of news organizations, whilst in the 21st century evidence of a decoupling of ‘acts of journalism’ (work) and journalistic employment (labor) is beginning to appear. This 'decoupling' of the work and labor that underpins journalism practice is far-reaching in its implications, not least for institutional structures. Under these conditions we are witnessing the emergence of expanded ‘entrepreneurial’ journalism, based on smaller, more independent and agile - if less stable - enterprise constructs that are a feature of creative industries. Entrepreneurial journalism is realized in a range of organizational forms, from social enterprise through to profit-driven start-ups and hybrids of the two. In all instances, however, the primary motif of the organization is an ideological definition of journalism. An example is the Scoop Foundation for Public Interest Journalism in New Zealand, which owns and operates Scoop Publishing Limited, a not-for-profit company and social enterprise that publishes an independent news site claiming over 500,000 monthly users. Our paper demonstrates that this journalistic work meets the ideological definition of journalism: it is conducted within the creative industries using an innovative organizational structure that offers a new, viable post-industrial future for journalism.

Keywords: Creative industries, digital communication, journalism, post-industrial.

142 An Edge Detection and Filtering Mechanism of Two Dimensional Digital Objects Based on Fuzzy Inference

Authors: Ayman A. Aly, Abdallah A. Alshnnaway

Abstract:

The general idea behind the filter is to average a pixel using other pixel values from its neighborhood, while simultaneously taking care of important image structures such as edges. The main concern of the proposed filter is to distinguish between variations in the captured digital image that are due to noise and those due to image structure. Edges give an image its appearance of depth and sharpness; a loss of edges makes the image appear blurred or unfocused. However, noise smoothing and edge enhancement are traditionally conflicting tasks: since most noise filtering behaves like a low-pass filter, the blurring of edges and loss of detail seem a natural consequence, and techniques to remedy this inherent conflict often generate new noise through the enhancement itself. In this work, a new fuzzy filter is presented for the reduction of additive noise in images. The filter consists of three stages: (1) fuzzy sets are defined in the input space to compute a fuzzy derivative in eight different directions; (2) a set of IF-THEN rules is constructed to perform fuzzy smoothing according to the contributions of neighboring pixel values; and (3) fuzzy sets are defined in the output space to obtain the filtered, edge-preserved image. Experimental results show the feasibility of the proposed approach on two-dimensional objects.

Keywords: Additive noise, edge preserving filtering, fuzzy image filtering, noise reduction, two dimensional mechanical images.

141 Integrating Fast Karnough Map and Modular Neural Networks for Simplification and Realization of Complex Boolean Functions

Authors: Hazem M. El-Bakry

Abstract:

In this paper, a new fast simplification method is presented that realizes Karnough maps with a large number of variables. To accelerate the operation of the proposed method, a new approach for the fast detection of groups of ones is presented, implemented in the frequency domain. The search operation relies on performing cross correlation in the frequency domain rather than in the time domain. It is proved mathematically and practically that the number of computation steps required by the presented method is less than that needed by conventional cross correlation, and simulation results using MATLAB confirm the theoretical computations. Furthermore, a powerful solution for the realization of complex functions is given. The simplified functions are implemented using a new design for neural networks. Neural networks are used because they are fault tolerant and, as a result, can recognize signals even with noise or distortion, which is very useful for logic functions used in data and computer communications. Moreover, the implemented functions are realized with a minimum number of components, by using modular neural networks (MNNs) that divide the input space into several homogeneous regions. This approach is applied to implement the XOR function, 16 logic functions on the one-bit level, and a 2-bit digital multiplier. Compared to previous non-modular designs, a clear reduction in the order of computations and in hardware requirements is achieved.

Keywords: Boolean functions, simplification, Karnough map, implementation of logic functions, modular neural networks.
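
The frequency-domain search for groups of ones can be illustrated in a few lines of NumPy; the 4x4 map and 2x2 group below are toy assumptions, and the circular wrap-around of the FFT conveniently matches the Karnough map's toroidal adjacency:

    import numpy as np

    def xcorr_fft(grid, pattern):
        """Circular cross-correlation via the FFT: correlating the map
        with a block of ones counts, for every shift, how many ones the
        block covers, so peaks equal to pattern.sum() locate the groups."""
        G = np.fft.fft2(grid)
        P = np.fft.fft2(pattern, s=grid.shape)
        return np.real(np.fft.ifft2(G * np.conj(P)))

    kmap = np.array([[1, 1, 0, 0],
                     [1, 1, 0, 0],
                     [0, 0, 1, 1],
                     [0, 0, 1, 1]])
    corr = xcorr_fft(kmap, np.ones((2, 2)))
    print(np.argwhere(np.isclose(corr, 4)))   # top-left corners of 2x2 groups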

140 Energy Saving Suction Hood

Authors: I. Daut, N. Gomesh, M. Irwanto, Y. M. Irwan

Abstract:

Public awareness of green energy is on the rise, as evidenced by the many products manufactured or redesigned as energy-saving devices, mainly to save consumers from spending more on utility bills. Such schemes are popular nowadays, and many home appliances are being turned into energy-saving gadgets that attract the attention of consumers. Following public demand and purchasing patterns for home appliances, the idea of an "energy saving suction hood" (ESSH) is proposed. The ESSH can be used in many places that require smoke ventilation, or to reduce room temperature, as many conventional suction hoods (CSH) do, but this device works automatically, using sensors that detect smoke or temperature and spin the exhaust fan accordingly. As the fan turns, its mechanical rotation drives an AC generator coupled to it, which charges a battery. The innovations of this product are that, first, it does not rely on the utility supply, as it is also hooked up to a solar panel that likewise charges the battery; second, it generates energy as the exhaust fan mechanically rotates; and third, an energy loop-back feature is introduced that supplies the ventilator fan. Another major innovation is interfacing the device with an in-house-produced generator, whose stator and rotor are properly designed to reduce losses. A comparison made between the ESSH and a CSH shows that the ESSH saves the 172.8 kWh/year of utility supply used by the CSH. This amount of energy saves RM 3.14 on the monthly utility bill, a total of RM 37.67 per year. In fact, this product can generate 175 W of power from the generator (75 W) and solar panel (100 W), which can be used to supply other household appliances and/or be looped back to supply the fan's motor. The innovation of this system is essential for the future production of other equipment using the loop-back power method, turning most equipment into standalone systems.

Keywords: Energy saving suction hood (ESSH), conventional suction hood (CSH), energy, power.

138 Mathematical Study for Traffic Flow and Traffic Density in Kigali Roads

Authors: Kayijuka Idrissa

Abstract:

This work presents a mathematical study of traffic flow and traffic density on Kigali city roads, using data collected from the national police of Rwanda in 2012. Several mathematical models were used to analyze and compare traffic variables. The work was carried out on Kigali roads, specifically at roundabouts from the Kigali Business Center (KBC) to Prince House, our study sites. We used mathematical tools to analyze the collected data and to understand the relationships between traffic variables, and we applied the Poisson distribution to analyze the number of accidents that occurred on this section of road from KBC to Prince House. The results show that accidents occurred at very high rates in 2012 because this section has a very narrow single lane on each side, which leads to heavy congestion of vehicles; consequently, accidents occur very frequently. Using the speed and density data collected from this section of road, we found that an increase in density results in a decrease in vehicle speed, and at the point where the density equals the jam density the speed becomes zero. The approach is promising for capturing sudden changes in flow patterns and is open to being utilized in a series of intelligent management strategies, especially in the detection and control of nonrecurrent congestion effects.

Keywords: Statistical methods, Poisson distribution, car moving techniques, traffic flow.
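
The two models invoked here, a Poisson count model for accidents and a linear speed-density relation that vanishes at the jam density (Greenshields' model), can be sketched numerically; the free-flow speed, jam density, and accident rate below are assumed values, not the Kigali estimates:

    import numpy as np
    from scipy.stats import poisson

    def speed(k, v_free=50.0, k_jam=120.0):
        """Greenshields' model: speed falls linearly with density k
        (veh/km) and reaches zero at the jam density."""
        return v_free * (1.0 - k / k_jam)

    k = np.linspace(0, 120, 7)
    q = k * speed(k)                       # flow q = k * v, maximal at k_jam / 2
    print(dict(zip(k, q.round(1))))

    lam = 4.0                              # assumed mean accidents per month
    print(poisson.pmf(np.arange(6), lam))  # P(0..5 accidents in a month)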

137 Development of EPID-based Real time Dose Verification for Dynamic IMRT

Authors: Todsaporn Fuangrod, Daryl J. O'Connor, Boyd MC McCurdy, Peter B. Greer

Abstract:

An electronic portal imaging device (EPID) has become a means of patient-specific IMRT dose verification in radiotherapy. Research studies have focused on pre- and post-treatment verification; however, there are currently no interventional procedures using EPID dosimetry that measure the dose in real time, as a mechanism to ensure that overdoses do not occur and that underdoses are detected as soon as practically possible. Therefore, an EPID-based real-time dose verification system for dynamic IMRT was developed and implemented with MATLAB/Simulink. The EPID image acquisition was set to continuous acquisition mode at 1.4 images per second. The system defines a time constraint, or execution gap, at the image acquisition time, so that every calculation must be completed before the next image capture finishes. In addition, the γ-evaluation method was used for dose comparison, with two types of comparison process monitored: individual-image and cumulative-dose comparison. The outputs of the system are the γ-map, the percentage of points with γ < 1, and the mean γ versus time, all in real time. Two strategies were used to test the system: an error detection test and a clinical data test. The system can monitor the actual dose delivery against the treatment plan data or a previous treatment's dose delivery, meaning that a radiation therapist is able to switch off the machine when an error is detected.

Keywords: real-time dose verification, EPID dosimetry, simulation, dynamic IMRT
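
The γ-evaluation at the heart of the comparison can be sketched for 1-D dose profiles as below; the 3%/3 mm criteria are the common clinical choice, and the exhaustive search favors clarity over the speed a true real-time system needs:

    import numpy as np

    def gamma_1d(x, ref, meas, dose_tol=0.03, dist_mm=3.0):
        """Global gamma index of each measured point against a reference
        profile: gamma < 1 means some reference point is close enough in
        both dose (3% of max) and distance (3 mm)."""
        g = np.empty_like(meas)
        for i, (xi, di) in enumerate(zip(x, meas)):
            dose_term = (ref - di) / (dose_tol * ref.max())
            dist_term = (x - xi) / dist_mm
            g[i] = np.sqrt(dose_term**2 + dist_term**2).min()
        return g

    x = np.linspace(-50, 50, 201)                      # position in mm
    ref = np.exp(-(x / 20.0) ** 2)                     # reference dose profile
    meas = 1.02 * np.exp(-((x - 1.0) / 20.0) ** 2)     # shifted, scaled measurement
    g = gamma_1d(x, ref, meas)
    print((g < 1).mean())                              # pass rate, ideally ~1.0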

136 Fast Painting with Different Colors Using Cross Correlation in the Frequency Domain

Authors: Hazem M. El-Bakry

Abstract:

In this paper, a new technique for fast painting with different colors is presented. The idea of painting relies on applying masks with different colors to the background; fast painting is achieved by applying these masks in the frequency domain instead of the spatial (time) domain. New colors can be generated automatically as a result of the cross correlation operation. This idea was applied successfully to faster detection of specific data (faces, objects, patterns, and codes) using neural algorithms. Here, instead of performing cross correlation between the input data (e.g., an image or a stream of sequential data) and the weights of neural networks, the cross correlation is performed between the colored masks and the background. Furthermore, this approach is developed to reduce the computation steps required by the painting operation. The principle of the divide-and-conquer strategy is applied through background decomposition: each background is divided into small sub-backgrounds, and each sub-background is processed separately using a single fast painting algorithm. Moreover, the fastest painting is achieved by using parallel processing techniques to paint the resulting sub-backgrounds with the same number of fast painting algorithms. In contrast to using only the fast painting algorithm, the speed-up ratio increases with the size of the background when the fast painting algorithm is combined with background decomposition. Simulation results show that painting in the frequency domain is faster than painting in the spatial domain.

Keywords: Fast Painting, Cross Correlation, Frequency Domain, Parallel Processing

135 Device for 3D Analysis of Basic Movements of the Lower Extremity

Authors: Jiménez Villanueva Mayra Alejandra, Ortíz Casallas Diana Carolina, Luengas Contreras Lely Adriana

Abstract:

This document details the process of developing a wireless device that captures the basic movements of the foot (plantar flexion, dorsal flexion, abduction, adduction) and of the knee (flexion). It implements a motion capture system using hardware based on optical fiber sensors, chosen for their advantages in terms of range, noise immunity, and speed of data transmission and reception. The operating principle of the system is the detection and transmission of joint movement by mechanical elements and its measurement by optical ones (in this case, infrared). Visual Basic software is used for the reception, analysis, and processing of the data acquired by the device, generating a real-time 3D graphical representation of each movement. The result is a boot in charge of capturing the movement, a transmission module (implementing XBee technology), and a receiver module that receives the information and sends it to a PC for processing. The main idea of this device is to contribute to fields such as bioengineering and medicine by helping to improve quality of life and movement analysis.

Keywords: Abduction, adduction, A/D converter, Autodesk 3DMax, infrared diode, driver, extension, flexion, infrared LEDs, interface, OpenGL modeling, optical fiber, USB CDC (Communications Device Class), virtual reality.

134 Applied Actuator Fault Accommodation in Flight Control Systems Using Fault Reconstruction Based FDD and SMC Reconfiguration

Authors: A. Ghodbane, M. Saad, J.-F. Boland, C. Thibeault

Abstract:

Historically, actuator redundancy was used to deal with faults occurring suddenly in flight systems. This technique was generally expensive and time-consuming, and it involved increased weight and space in the system. Nowadays, therefore, the on-line fault diagnosis of actuators and fault accommodation play a major role in the design of avionic systems. These approaches, known as fault-tolerant flight control (FTFC) systems, are able to adapt to such sudden faults while keeping avionics systems lighter and less expensive. In this paper, an FTFC system based on the geometric approach and a reconfigurable flight control (RFC) scheme is presented. The geometric approach is used for the reconstruction of faults caused by cosmic rays, while sliding mode control (SMC) based on Lyapunov stability theory is designed for the reconfiguration of the controller in order to compensate for the fault effect. Matlab®/Simulink® simulations are performed to illustrate the effectiveness and robustness of the proposed flight control system against faulty actuator signals caused by cosmic rays. The results demonstrate the successful real-time implementation of the proposed FTFC system on a nonlinear 6-DOF aircraft model.

Keywords: Actuators’ faults, Fault detection and diagnosis, Fault tolerant flight control, Sliding mode control, Geometric approach for fault reconstruction, Lyapunov stability.
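
A minimal sketch of a sliding mode controller of the kind described, applied to a double-integrator toy plant with a constant actuator fault, is shown below; all gains, the reference trajectory, and the fault size are illustrative assumptions:

    import numpy as np

    def simulate(T=10.0, dt=1e-3, lam=2.0, k=5.0, fault=1.0):
        """Track x_d(t) = sin(t) with the plant x'' = u + fault.
        The switching term -k*sign(s) dominates the unknown fault
        (k > |fault|), so the state is driven onto the sliding surface
        s = e' + lam*e and the tracking error stays small."""
        x, dx = 0.0, 0.0
        for t in np.arange(0.0, T, dt):
            xd, dxd, ddxd = np.sin(t), np.cos(t), -np.sin(t)   # reference
            e, de = x - xd, dx - dxd
            s = de + lam * e                        # sliding surface
            u = ddxd - lam * de - k * np.sign(s)    # SMC law
            ddx = u + fault                         # faulty actuator adds a bias
            dx += ddx * dt
            x += dx * dt
        return abs(x - np.sin(T))                   # final tracking error

    print(simulate())    # small despite the un-modelled fault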

133 Milling Simulations with a 3-DOF Flexible Planar Robot

Authors: Hoai Nam Huynh, Edouard Rivière-Lorphèvre, Olivier Verlinden

Abstract:

Manufacturing technologies have become continuously more diversified over the years. The increasing use of robots for various applications such as assembling, painting, and welding has also affected the field of machining. Machining robots can deal with larger workspaces than conventional machine tools at a lower cost and thus represent a very promising alternative for machining applications. Furthermore, their inherent structure gives them great flexibility of motion to reach any location on the workpiece with the desired orientation. Nevertheless, machining robots suffer from a lack of stiffness at their joints, restricting their use to applications involving low cutting forces, especially finishing operations. Vibratory instabilities may also arise during machining and deteriorate the precision, leading to scrap parts. Some researchers are therefore concerned with the identification of optimal parameters in robotic machining. This paper continues the development of a virtual robotic machining simulator whose aim is to find optimized cutting parameters in terms of, for example, depth of cut or feed per tooth. The simulation environment combines an in-house milling routine (DyStaMill), which computes the cutting forces and material removal, with an in-house multibody library (EasyDyn), which is used to build a dynamic model of a 3-DOF planar robot with flexible links. The position of the robot end-effector submitted to milling forces is controlled through an inverse kinematics scheme, while the positions of its joints are controlled separately. Each joint is actuated by a servomotor whose transfer function has been computed in order to tune the corresponding controller. The output results feature the evolution of the cutting forces when the robot structure is deformable or not, and the tracking errors of the end-effector. Illustrations of the resulting machined surfaces are also presented. The consideration of link flexibility has highlighted an increase in the magnitude of the cutting forces. This proof of concept aims to enrich the database of results in robotic machining for potential improvements in production.

Keywords: Control, machining, multibody, robotic, simulation.
