Search results for: feature noise
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2638

928 Kinetic Model to Interpret Whistler Waves in Multicomponent Non-Maxwellian Space Plasmas

Authors: Warda Nasir, M. N. S. Qureshi

Abstract:

Whistler waves are right-hand circularly polarized waves and are frequently observed in space plasmas. The low-frequency branch of the whistler mode, with frequencies around 100 Hz and known as lion roars, is frequently observed in the magnetosheath. Another feature of the magnetosheath is the observation of flat-top electron distributions with single as well as two electron populations. In the past, lion roars were studied with a kinetic model based on the classical bi-Maxwellian distribution function, but the observations could not be justified on either quantitative or qualitative grounds. We studied whistler waves by employing kinetic theory with a non-Maxwellian distribution function, the generalized (r,q) distribution, which is the generalized form of the kappa and Maxwellian distribution functions, for single and two electron populations. We compared our results with Cluster observations and found good quantitative and qualitative agreement between them. At times when lion roars are observed (or not observed) in the data and the bi-Maxwellian could not provide sufficient growth (damping) rates, we showed that when the generalized (r,q) distribution function is employed, the resulting growth (damping) rates match the observations closely.
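
For reference, one commonly quoted isotropic form of the generalized (r,q) distribution is sketched below, where N_{r,q} is the normalization constant and psi an effective thermal speed; treat this as a sketch of the family's shape rather than the anisotropic form actually fitted to the data:

```latex
f_{(r,q)}(v) \;=\; N_{r,q}\left[\,1 + \frac{1}{q-1}\left(\frac{v^{2}}{\psi^{2}}\right)^{r+1}\right]^{-q}
```

For r = 0 this reduces to a kappa-type distribution (with q playing the role of kappa + 1), and in the further limit q -> infinity to the Maxwellian, which is what makes it a generalization of both; increasing r flattens the top of the distribution, consistent with the flat-top electron distributions observed in the magnetosheath.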

Keywords: kinetic model, whistler waves, non-Maxwellian distribution function, space plasmas

Procedia PDF Downloads 314
927 Multivariate Output-Associative RVM for Multi-Dimensional Affect Predictions

Authors: Achut Manandhar, Kenneth D. Morton, Peter A. Torrione, Leslie M. Collins

Abstract:

The current trends in affect recognition research are to consider continuous observations from spontaneous natural interactions between people using multiple feature modalities, to represent affect in terms of continuous dimensions, to incorporate spatio-temporal correlations among affect dimensions, and to provide fast affect predictions. These research efforts have been propelled by a growing drive to develop affect recognition systems that can be deployed to enable seamless real-time human-computer interaction in a wide variety of applications. Motivated by these desired attributes of an affect recognition system, in this work a multi-dimensional affect prediction approach is proposed by integrating the multivariate Relevance Vector Machine (MVRVM) with a recently developed Output-associative Relevance Vector Machine (OARVM) approach. The resulting approach can provide fast continuous affect predictions by jointly modeling the multiple affect dimensions and their correlations. Experiments on the RECOLA database show that the proposed approach performs competitively with the OARVM while providing faster predictions during testing.

Keywords: dimensional affect prediction, output-associative RVM, multivariate regression, fast testing

Procedia PDF Downloads 286
926 Highly Accurate Target Motion Compensation Using Entropy Function Minimization

Authors: Amin Aghatabar Roodbary, Mohammad Hassan Bastani

Abstract:

One of the defects of stepped-frequency radar systems is their sensitivity to target motion. In such systems, target motion causes range-cell shift, false peaks, Signal-to-Noise Ratio (SNR) reduction, and range-profile spreading, because the power spectrum of each range cell interferes with adjacent range cells; this distorts the High Resolution Range Profile (HRRP) and disrupts the target recognition process. Thus, compensation for the effects of the Target Motion Parameters (TMPs) should be employed. In this paper, such a method for estimating the TMPs (velocity and acceleration), and consequently eliminating or suppressing their unwanted effects on the HRRP, based on entropy minimization, is proposed. The method is carried out in two major steps: in the first step, a discrete search is performed over the whole acceleration-velocity lattice, within a specified interval, to find a coarse minimum of the entropy function. In the second step, a 1-D search over velocity is done in the vicinity of that minimum along several constant-acceleration lines, in order to refine the minimum found in the first step. The provided simulation results demonstrate the effectiveness of the proposed method.
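
As a toy illustration of the two-step search, the sketch below compensates a simulated stepped-frequency return and minimizes range-profile entropy over a coarse velocity-acceleration lattice, then refines velocity along constant-acceleration lines. The signal model, parameter values, and helper names (`hrrp_entropy`, `estimate_tmp`) are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

C = 3e8  # speed of light, m/s

def hrrp_entropy(sig, freqs, t, v, a):
    """Entropy of the range profile after compensating the candidate
    motion phase for velocity v and acceleration a (toy signal model)."""
    comp = np.exp(1j * 4 * np.pi * freqs / C * (v * t + 0.5 * a * t**2))
    prof = np.abs(np.fft.ifft(sig * comp))
    p = prof / prof.sum()
    return -(p * np.log(p + 1e-12)).sum()

def estimate_tmp(sig, freqs, t, v_grid, a_grid):
    # Step 1: coarse search over the whole acceleration-velocity lattice.
    _, v0, a0 = min((hrrp_entropy(sig, freqs, t, v, a), v, a)
                    for v in v_grid for a in a_grid)
    # Step 2: finer 1-D velocity search along constant-acceleration
    # lines around the coarse minimum.
    dv = v_grid[1] - v_grid[0]
    _, v1, a1 = min((hrrp_entropy(sig, freqs, t, v, a), v, a)
                    for a in a_grid
                    for v in np.linspace(v0 - dv, v0 + dv, 81))
    return v1, a1
```

A mismatched candidate velocity leaves residual phase across the frequency steps, spreading the profile and raising its entropy, so the entropy minimum locates the true motion parameters.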

Keywords: automatic target recognition (ATR), high resolution range profile (HRRP), motion compensation, stepped frequency waveform technique (SFW), target motion parameters (TMPs)

Procedia PDF Downloads 152
925 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering  

Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi

Abstract:

In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA (cfDNA). Accordingly, identifying the tissue of origin of these DNA fragments from plasma can enable more accurate and faster disease diagnosis and more precise treatment protocols. Open chromatin regions (OCRs) are important epigenetic features of DNA that reflect the cell type of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. Several studies in cancer liquid biopsy integrate distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which leads to substantial cost and time. To overcome these limitations, the idea of predicting OCRs directly from whole genome sequencing (WGS) data is of particular importance. In this regard, we propose a computational approach that predicts open chromatin regions, as an important epigenetic feature, from cell-free DNA whole genome sequencing data. To fulfill this objective, local sequencing depth is fed to our proposed algorithm, which then predicts the most probable open chromatin regions from the whole genome sequencing data. Our method integrates signal processing with sequencing-depth data and includes count normalization, Discrete Fourier Transform conversion, graph construction, graph-cut optimization by linear programming, and clustering.
To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions from human blood samples in the ATAC-DB database. The overlap between predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates meaningful prediction. OCRs are mostly located at the transcription start sites (TSS) of genes; accordingly, we compared the concordance between the predicted OCRs and human gene TSS regions obtained from refTSS, finding agreement of around 52.04% with all genes and roughly 78% with housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to several confounding factors, such as technical and biological variations. Although this approach is in its infancy, there has already been an attempt to apply it, which led to a tool named OCRDetector, with restrictions such as the need for high-depth cfDNA WGS data, prior information about the OCR distribution, and reliance on multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised learning manner, which resulted in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used along with other tools to investigate genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid biopsy-related analysis.
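
A much-simplified sketch of the single-depth-feature idea is shown below: the depth track is normalized, low-passed via the DFT, and split into OCR+/OCR- bins by a simple threshold that stands in for the paper's LP-based graph cut. The function name, the `keep_frac` parameter, and the thresholding rule are hypothetical simplifications:

```python
import numpy as np

def predict_ocrs(depth, keep_frac=0.1):
    """Toy single-feature pipeline: normalize the local sequencing-depth
    track, low-pass it via the DFT (keeping only the lowest-frequency
    coefficients), then split bins into two clusters (OCR+ / OCR-)
    around the smoothed track's mean. cfDNA coverage dips at
    nucleosome-depleted (open) regions, so low-depth bins are called OCR+."""
    x = (depth - depth.mean()) / (depth.std() + 1e-12)   # count normalization
    X = np.fft.rfft(x)
    cutoff = max(1, int(len(X) * keep_frac))
    X[cutoff:] = 0                                       # DFT low-pass
    smooth = np.fft.irfft(X, n=len(x))
    return smooth < smooth.mean()                        # boolean OCR+ mask
```

Real depth tracks are far noisier, and the clustering step in the paper operates on a graph built from the transformed signal rather than a plain mean threshold.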

Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering

Procedia PDF Downloads 150
924 Optimized Brain Computer Interface System for Unspoken Speech Recognition: Role of Wernicke Area

Authors: Nassib Abdallah, Pierre Chauvet, Abd El Salam Hajjar, Bassam Daya

Abstract:

In this paper, we propose an optimized brain-computer interface (BCI) system for unspoken speech recognition, based on the fact that the construction of unspoken words relies strongly on the Wernicke area, situated in the temporal lobe. Our BCI system has four modules: (i) the EEG acquisition module, based on a non-invasive headset with 14 electrodes; (ii) the preprocessing module, which removes noise and artifacts using the Common Average Reference method; (iii) the feature extraction module, using the Wavelet Packet Transform (WPT); and (iv) the classification module, based on a one-hidden-layer artificial neural network. The present study compares the recognition accuracy for 5 Arabic words when using all the headset electrodes versus only the 4 electrodes situated near the Wernicke area, as well as the effect of selecting subbands produced by the WPT module. After applying the artificial neural network to the produced database, we obtain, on the test dataset, an accuracy of 83.4% with all the electrodes and all the subbands of the 8-level WPT decomposition. However, by using only the 4 electrodes near the Wernicke area and the 6 middle subbands of the WPT, we obtain a large reduction of the dataset size, to approximately 19% of the total, with an accuracy of 67.5%. This reduction appears particularly important for the design of a low-cost, simple-to-use BCI trained for several words.
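
The WPT feature idea can be sketched with a hand-rolled Haar wavelet packet decomposition; this is a stand-in for the actual WPT and wavelet used in the study, subbands are left in natural filter-bank order, and the function names are illustrative:

```python
import numpy as np

def haar_wpt(sig, level):
    """Full Haar wavelet packet decomposition: at each level every band is
    split into an approximation (sum) and detail (difference) half,
    giving 2**level subbands. Signal length is assumed a power of two."""
    bands = [np.asarray(sig, float)]
    for _ in range(level):
        nxt = []
        for b in bands:
            even, odd = b[::2], b[1::2]
            nxt.append((even + odd) / np.sqrt(2))   # low-pass half
            nxt.append((even - odd) / np.sqrt(2))   # high-pass half
        bands = nxt
    return bands

def band_energies(sig, level):
    """Per-subband energies, a typical feature vector fed to a classifier
    such as a one-hidden-layer neural network."""
    return np.array([np.sum(b**2) for b in haar_wpt(sig, level)])
```

Selecting only a few middle subbands, as the abstract describes, amounts to keeping a subset of the returned bands as features, shrinking the input to the network.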

Keywords: brain-computer interface, speech recognition, artificial neural network, electroencephalography, EEG, wernicke area

Procedia PDF Downloads 272
923 Single Atom Manipulation with 4 Scanning Tunneling Microscope Technique

Authors: Jianshu Yang, Delphine Sordes, Marek Kolmer, Christian Joachim

Abstract:

Nanoelectronic devices, for example computing circuits that integrate molecule-scale logic gates and atomic-scale circuits, have recently been constructed and investigated. A major challenge is the characterization of their functional properties, because of the problem of connecting the atomic scale to the micrometer scale. New experimental instruments and new processes have therefore been proposed. To enable precise measurement at the atomic scale while connecting to micrometer-scale electrical control circuitry, instrumentation continues to be improved. Our new machine, a low-temperature, high-vacuum four-probe scanning tunneling microscope, a custom instrument constructed by Omicron GmbH, is designed to scale characterization down to the atomic level. Here, we present the first test results on the performance of this new instrument. The sample we selected is the Au(111) surface, and the measurements were taken at 4.2 K. The atomic-resolution surface structure was observed with each of the four scanners, with a noise level better than 3 pm. With the tip-sample distance calibrated by I-z spectra, the sample conductance was derived from locally acquired atomic-scale I-V spectra. Furthermore, the surface conductance was measured using two methods: (1) by landing two STM tips on the surface with the sample floating; and (2) with the sample floating and one of the landed tips grounded. In addition, single-atom manipulation was achieved with a modified tip design, with performance comparable to a conventional LT-STM.

Keywords: low temperature ultra-high vacuum four scanning tunneling microscope, nanoelectronics, point contact, single atom manipulation, tunneling resistance

Procedia PDF Downloads 280
922 The Ludic Exception and the Permanent Emergency: Understanding the Emergency Regimes with the Concept of Play

Authors: Mete Ulaş Aksoy

Abstract:

In contemporary politics, the state of emergency has become a permanent and salient feature of politics. This study aims to clarify the anthropological and ontological dimensions of the permanent state of emergency. It pays special attention to the structural relation between the exception and play. Focusing on the play in the context of emergency and exception enables the recognition of the difference and sometimes the discrepancy between the exception and emergency, which has passed into oblivion because of the frequency and normalization of emergency situations. This study coins the term “ludic exception” in order to highlight the difference between the exceptions in which exuberance and paroxysm rule over the socio-political life and the permanent emergency that protects the authority with a sort of extra-legality. The main thesis of the study is that the ludic elements such as risk, conspicuous consumption, sacrificial gestures, agonism, etc. circumscribe the exceptional moments temporarily, preventing them from being routine and normal. The study also emphasizes the decline of ludic elements in modernity as the main factor in the transformation of the exceptions into permanent emergency situations. In the introduction, the relationship between play and exception is taken into consideration. In the second part, the study elucidates the concept of ludic exceptions and dwells on the anthropological examples of the ludic exceptions. In the last part, the decline of ludic elements in modernity is addressed as the main factor for the permanent emergency.

Keywords: emergency, exception, ludic exception, play, sovereignty

Procedia PDF Downloads 90
921 Kinoform Optimisation Using Gerchberg-Saxton Iterative Algorithm

Authors: M. Al-Shamery, R. Young, P. Birch, C. Chatwin

Abstract:

Computer Generated Holography (CGH) is employed to create digitally defined coherent wavefronts. A CGH can be created using different techniques, such as the detour-phase technique or direct phase modulation to create a kinoform. The detour-phase technique was one of the first techniques used to generate holograms digitally. Its disadvantage is that the reconstructed image often has poor quality due to the limited dynamic range it is possible to record using a medium with reasonable spatial resolution. The kinoform (phase-only hologram) is an alternative technique. In this method, the phase of the original wavefront is recorded but the amplitude is constrained to be constant. The original object does not need to exist physically, so the kinoform can be used to reconstruct an almost arbitrary wavefront. However, the image reconstructed by this technique contains high levels of noise and is not identical to the reference image. To improve the reconstruction quality of the kinoform, iterative techniques such as the Gerchberg-Saxton (GS) algorithm are employed. In this paper, the GS algorithm is described for the optimisation of a kinoform used for the reconstruction of a complex wavefront. Iterations of the GS algorithm are applied to determine the phase at a plane (with known amplitude distribution, often taken as uniform) that satisfies given phase and amplitude constraints in a corresponding Fourier plane. The GS algorithm can be used in this way to enhance the reconstruction quality of the kinoform. Different images are employed as the reference object, and their kinoforms are synthesised using the GS algorithm. The quality of the reconstructed images is quantified to demonstrate the enhanced reconstruction quality achieved by this method.
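
A minimal sketch of the GS iteration for a kinoform, assuming a uniform (unit) hologram-plane amplitude and plain FFTs to move between the hologram and image planes (details such as padding, quantisation, and the optics-specific transform are omitted):

```python
import numpy as np

def gerchberg_saxton(target_amp, n_iter=50, seed=0):
    """GS iteration for a kinoform: find a phase-only hologram whose
    Fourier-transform magnitude approximates target_amp. At every
    iteration the hologram-plane amplitude is forced to be uniform and
    the image-plane amplitude is replaced by the target."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target_amp.shape)
    for _ in range(n_iter):
        field = np.exp(1j * phase)                       # unit-amplitude kinoform
        img = np.fft.fft2(field)
        img = target_amp * np.exp(1j * np.angle(img))    # impose target amplitude
        phase = np.angle(np.fft.ifft2(img))              # keep only the phase
    return phase
```

Each iteration is an error-reduction step, so the mismatch between the reconstructed and target amplitudes is non-increasing; this is the property that makes GS suitable for cleaning up kinoform noise.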

Keywords: computer generated holography, digital holography, Gerchberg-Saxton algorithm, kinoform

Procedia PDF Downloads 533
920 Recreating Old Gardens, a Dynamic and Sustainable Design Pattern for Urban Green Spaces, Case Study: Persian Garden

Authors: Mina Sarabi, Dariush Sattarzadeh, Mitra Asadollahi Oula

Abstract:

Historically, gardens reflected the identity and culture of each country. The Persian garden holds a high position in urban planning and architecture and is regarded as a kind of paradise in Iranian thought. Nowadays, however, gardens have been replaced by parks and urban open spaces. At the same time, due to the industrial development of cities and increasing air pollution in urban environments, living in these spaces has become problematic, and the need to improve ecological conditions is felt more than ever. The purposes of this study are to identify and reproduce the Persian garden pattern, to adapt it to the sustainability features of green spaces in contemporary cities, and to develop meaningful green spaces instead of designing aimless spaces in the urban environment. The research method in this article is analytical and descriptive: information about the Iranian garden pattern was collected from library documents and articles, and case studies were analyzed. The results reveal that the Persian garden was a main factor in the bond between man and nature, but in the last century this relationship has been in trouble. The garden also has a significant impact in reducing the adverse effects of urban air pollution and noise. A recreated pattern of Iranian gardens in urban green spaces would not only keep Iranian identity alive for future generations but also, through the principles of sustainability, play an important role in sustainable development and the spatial quality of a city.

Keywords: green open spaces, nature, Persian garden, urban sustainability

Procedia PDF Downloads 251
919 A Method for Evaluating the Mechanical Stress on Mandibular Advancement Devices

Authors: Tsung-yin Lin, Yi-yu Lee, Ching-hua Hung

Abstract:

Snoring, the lay term for obstructive breathing during sleep, is one of the most prevalent of obnoxious human habits. Loud snoring usually disturbs others and affects the sleep quality of snorers’ bed partners, who often cannot fall asleep easily because of the noise. Snoring reduces sleep quality, leading to several medical problems, such as excessive daytime sleepiness, high blood pressure, and increased risk of cardiovascular disease and cerebrovascular accident. There are many non-prescription devices offered for sale on the market, but very limited data are available to support a beneficial effect of these devices on snoring or their use in treating obstructive sleep apnea (OSA). Mandibular advancement devices (MADs), also termed mandibular repositioning devices (MRDs), are removable devices worn at night during sleep. Most devices require a dental impression, bite registration, and fabrication by a dental laboratory. These devices are fixed to the upper and lower teeth and are adjusted to advance the mandible. The amount of protrusion is adjusted to meet the therapeutic requirements, comfort, and tolerance. Many devices have a fixed degree of advancement; some are adjustable to a limited degree. This study focuses on the stress analysis of MADs, which are considered a standard treatment for snoring promoted by the American Academy of Sleep Medicine (AASM). This paper proposes a new MAD design and introduces finite element analysis (FEA) to perform the stress simulation for this MAD.

Keywords: finite element analysis, mandibular advancement devices, mechanical stress, snoring

Procedia PDF Downloads 356
918 Towards Human-Interpretable, Automated Learning of Feedback Control for the Mixing Layer

Authors: Hao Li, Guy Y. Cornejo Maceda, Yiqing Li, Jianguo Tan, Marek Morzynski, Bernd R. Noack

Abstract:

We propose an automated analysis of the flow control behaviour from an ensemble of control laws and associated time-resolved flow snapshots. The input may be the rich database of machine learning control (MLC) optimizing a feedback law for a cost function in the plant. The proposed methodology provides (1) insights into the control landscape, which maps control laws to performance, including extrema and ridge-lines, (2) a catalogue of representative flow states and their contribution to cost function for investigated control laws and (3) visualization of the dynamics. Key enablers are classification and feature extraction methods of machine learning. The analysis is successfully applied to the stabilization of a mixing layer with sensor-based feedback driving an upstream actuator. The fluctuation energy is reduced by 26%. The control replaces unforced Kelvin-Helmholtz vortices with subsequent vortex pairing by higher-frequency Kelvin-Helmholtz structures of lower energy. These efforts target a human interpretable, fully automated analysis of MLC identifying qualitatively different actuation regimes, distilling corresponding coherent structures, and developing a digital twin of the plant.

Keywords: machine learning control, mixing layer, feedback control, model-free control

Procedia PDF Downloads 223
917 Concubines, Handmaids Or Sister Wives: Polygamy In The Media, A Comparison Between The TV Dramas "The Legend of Zhen Huan", "The Handmaid’s Tale" And "Big Love"

Authors: Muriel Canas-Walker

Abstract:

Polygamy is a sensitive issue yet a surprisingly popular topic on television. In China, among other palace-intrigue dramas, "The Legend of Zhen Huan" stands out in its harsh portrayal of sequestered concubines in the Forbidden City. In the United States, the critically acclaimed "Big Love", set in the Mormon community, generated much discussion and controversy, both academically and on social media. More recently, "The Handmaid’s Tale", adapted from the famous novel by Canadian writer Margaret Atwood, also contributed to the topic. All three dramas feature the plight of women caught in a polygamous system and are particularly popular with female audiences. Using Foucault’s theory of power, visual anthropology, and a feminist perspective, this paper analyzes the treatment of this sensitive topic in the media and its reception. From the seemingly happy sister wives of "Big Love" to the fiercely competitive concubines of "The Legend of Zhen Huan" and the tragically coerced handmaids of "The Handmaid’s Tale", the lives of women in polygamous systems resonate with modern audiences. This paper’s objective is to understand how the treatment of polygamy is relevant to these audiences.

Keywords: polygamy, michel foucault, feminism, visual anthropology

Procedia PDF Downloads 93
916 Pilot-Assisted Direct-Current Biased Optical Orthogonal Frequency Division Multiplexing Visible Light Communication System

Authors: Ayad A. Abdulkafi, Shahir F. Nawaf, Mohammed K. Hussein, Ibrahim K. Sileh, Fouad A. Abdulkafi

Abstract:

Visible light communication (VLC) is a new approach to optical wireless communication proposed to relieve the congested radio frequency (RF) spectrum. VLC systems are combined with orthogonal frequency division multiplexing (OFDM) to achieve high-rate transmission and high spectral efficiency. In this paper, we investigate Pilot-Assisted Channel Estimation for DC-biased Optical OFDM (PACE-DCO-OFDM) systems to reduce the effects of distortion on the transmitted signal. Least-squares (LS) and linear minimum mean-squared error (LMMSE) estimators are implemented in MATLAB/Simulink to improve the bit-error-rate (BER) of PACE-DCO-OFDM. Simulation results show that the DCO-OFDM system based on the PACE scheme achieves better BER performance than the conventional system without pilot-assisted channel estimation, and that the proposed LMMSE-based PACE-DCO-OFDM estimates the channel more accurately and achieves better BER performance than the LS-based PACE-DCO-OFDM. For the same signal-to-noise ratio (SNR) of 25 dB, the achieved BER is about 5×10⁻⁴ for LMMSE-PACE and 4.2×10⁻³ with LS-PACE, while it is about 2×10⁻¹ for the system without the PACE scheme.
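
The two estimators can be sketched for a single OFDM symbol as follows; unit-power pilots and a known channel covariance `R_hh` are assumed, and this is the generic textbook form of pilot-based LS/LMMSE estimation rather than the authors' Simulink implementation:

```python
import numpy as np

def ls_estimate(y_pilot, x_pilot):
    """Least-squares channel estimate at the pilot subcarriers: H_ls = Y / X."""
    return y_pilot / x_pilot

def lmmse_estimate(y_pilot, x_pilot, R_hh, snr):
    """LMMSE smoothing of the LS estimate using the channel covariance:
    H_lmmse = R_hh (R_hh + (1/snr) I)^-1 H_ls  (unit-power pilots assumed)."""
    h_ls = ls_estimate(y_pilot, x_pilot)
    n = len(h_ls)
    W = R_hh @ np.linalg.inv(R_hh + np.eye(n) / snr)
    return W @ h_ls
```

The LMMSE filter exploits correlation between neighbouring subcarriers to average out noise, which is why it outperforms per-subcarrier LS, at the price of needing channel statistics and a matrix inverse.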

Keywords: channel estimation, OFDM, pilot-assist, VLC

Procedia PDF Downloads 180
915 Development of a Few-View Computed Tomographic Reconstruction Algorithm Using Multi-Directional Total Variation

Authors: Chia Jui Hsieh, Jyh Cheng Chen, Chih Wei Kuo, Ruei Teng Wang, Woei Chyn Chu

Abstract:

Compressed sensing (CS) based computed tomographic (CT) reconstruction algorithms utilize total variation (TV) to transform the CT image into a sparse domain and minimize the L1-norm of the sparse image for reconstruction. Different from traditional CS-based reconstruction, which only calculates the x-coordinate and y-coordinate TV to transform CT images into the sparse domain, we propose a multi-directional TV to transform the tomographic image into the sparse domain for low-dose reconstruction. Our method considers all possible directions of TV calculation around a pixel, so the sparse transform for CS-based reconstruction is more accurate. In 2D CT reconstruction, we use an eight-directional TV to transform the CT image into the sparse domain; for 3D reconstruction, we use a 26-directional TV. This multi-directional sparse transform makes the CS-based reconstruction algorithm more powerful at reducing noise and increasing image quality. To validate and evaluate the performance of this multi-directional sparse transform, we used both the Shepp-Logan phantom and a head phantom as reconstruction targets, with the corresponding simulated sparse projection data (angular sampling intervals of 5 deg and 6 deg, respectively). From the results, the multi-directional TV method reconstructs images with fewer artifacts than the traditional CS-based reconstruction algorithm that only calculates x-coordinate and y-coordinate TV. We also chose RMSE, PSNR, and UQI as metrics for quantitative analysis; on every metric, the proposed multi-directional TV method performs better.
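
For the 2-D case, the eight-directional TV can be sketched as below; each pixel is differenced against its eight neighbours, and the periodic boundary handling via `np.roll` is an assumption made for brevity, not necessarily what the paper uses:

```python
import numpy as np

def multidirectional_tv(img):
    """Eight-directional total variation of a 2-D image: the sum of
    absolute differences between each pixel and its 8 neighbours
    (horizontal, vertical, and both diagonals, in both senses)."""
    shifts = [(0, 1), (1, 0), (1, 1), (1, -1),
              (0, -1), (-1, 0), (-1, -1), (-1, 1)]
    return sum(np.abs(img - np.roll(img, s, axis=(0, 1))).sum()
               for s in shifts)
```

In a CS reconstruction this quantity (or a weighted variant) would be the sparsity-promoting penalty that the solver minimizes subject to the projection-data constraint.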

Keywords: compressed sensing (CS), low-dose CT reconstruction, total variation (TV), multi-directional gradient operator

Procedia PDF Downloads 256
914 Monitoring Energy Reduction through Applying Green Roofs to Residential Buildings in Dubai

Authors: Hanan M. Taleb

Abstract:

Since buildings are a major consumer of energy, their potential impact on the environment is considerable. Therefore, expanding the application of low-energy architecture is of the utmost importance. Designing with nature is also one of the most attractive methods of design for many architects and designers because it creates a pathway to sustainability. One feature of designing with nature is the use of green roofing, which aims to cover the roof with vegetation either partially or completely. Appreciably, a green roof has many advantages, including absorbing rainwater, providing thermal insulation, enhancing the ecology, creating a peaceful retreat for people and animals, improving air quality, and helping to offset the air temperature and heat island effect. The aim of this paper is to monitor the energy saving in residential buildings in Dubai after applying green roofing techniques. The paper also provides a thermal analysis after the application of green roofs. A villa in Dubai was chosen as a case study. With the aid of energy simulation software, namely Design Builder, as well as manual recording and calculations, the energy savings after applying the green roofing were quantified. Finally, the paper offers recommendations with regard to the types of green roofing that should be used in these particular climatic conditions, based on this real experiment, which took place over a one-year period.

Keywords: residential buildings, Dubai, energy saving, green roofing, CFD, thermal comfort

Procedia PDF Downloads 299
913 Setting the Baseline for a Sentinel System for the Identification of Occupational Risk Factors in Africa

Authors: Menouni Aziza, Chbihi Kaoutar, Duca Radu Corneliu, Gilissen Liesbeth, Bounou Salim, Godderis Lode, El Jaafari Samir

Abstract:

In Africa, environmental and occupational health risks are mostly underreported. The aim of this research is to develop and implement a sentinel surveillance system comprising the training and guidance of occupational physicians (OCs) who will report new work-related diseases in African countries. A group of 30 OCs is recruited and trained in each of the partner countries (Morocco, Benin and Ethiopia). Each committed OC is asked to recruit 50 workers during consultations within a time frame of 6 months (1,500 workers per country). Workers are asked to fill out an online questionnaire about their health status and work conditions, including exposure to 20 chemicals. Urine and blood samples are then collected for human biomonitoring of common exposures. Preliminary results showed that 92% of the employees surveyed are exposed to physical constraints, 44% to chemical agents, and 24% to biological agents. The most common physical constraints are manual handling of loads, noise pollution, and thermal pollution. The most frequent chemical risks are exposure to pesticides and fuels. This project will allow a better understanding of effective sentinel systems as a promising method to gather high-quality data, which can support policy-making in preventing emerging work-related diseases.

Keywords: sentinel system, occupational diseases, human biomonitoring, Africa

Procedia PDF Downloads 82
912 Detecting and Thwarting Interest Flooding Attack in Information Centric Network

Authors: Vimala Rani P, Narasimha Malikarjunan, Mercy Shalinie S

Abstract:

Named Data Networking was brought forth as an instantiation of information-centric networking. Attackers can send a colossal number of spoofed Interests to take hold of the Pending Interest Table (PIT), an attack named an Interest Flooding Attack (IFA), since incoming Interests are recorded in the PITs of intermediate routers until the corresponding Data packets arrive or the entries exceed their time limit. These attacks can be detrimental to network performance. Traditional IFA detection techniques are concerned with criteria such as the PIT expiration rate or the Interest satisfaction rate, which cannot reliably differentiate an IFA from legitimate traffic; moreover, threshold-based traditional methods are sensitive to casually chosen threshold values. This article proposes an accurate IFA detection mechanism based on a Multiple Feature-based Extreme Learning Machine (MF-ELM). The accuracy of attack detection is increased by presenting the entropy of Interest names, the Interest satisfaction rate, and PIT usage as features to the MF-ELM classifier. Furthermore, we deploy a queue-based hostile Interest prefix mitigation mechanism. The inference from this real-time test bed is that the mechanism helps the network resist IFA with higher accuracy and efficiency.
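
A minimal single-hidden-layer Extreme Learning Machine, the classifier family underlying MF-ELM, can be sketched as follows. In the paper's setting the feature columns would be the name entropy, Interest satisfaction rate, and PIT usage; this toy uses synthetic features, and the class and parameter names are illustrative:

```python
import numpy as np

class ELM:
    """Extreme Learning Machine: random, untrained input weights; a
    sigmoid hidden layer; output weights solved in closed form with
    the Moore-Penrose pseudo-inverse (no iterative training)."""
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))   # hidden activations
        self.beta = np.linalg.pinv(H) @ y                  # closed-form output weights
        return self

    def predict(self, X):
        H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))
        return (H @ self.beta > 0.5).astype(int)           # binary attack/benign call
```

Because only the output layer is solved (one pseudo-inverse), training is very fast, which is the property that makes ELM attractive for near-real-time attack detection.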

Keywords: information-centric network, pending interest table, interest flooding attack, MF-ELM classifier, queue-based mitigation strategy

Procedia PDF Downloads 206
911 How Unicode Glyphs Revolutionized the Way We Communicate

Authors: Levi Corallo

Abstract:

Typed language produced by humans on computers and cell phones is significantly distinct from previous modes of written language exchange. While acronyms remain one of the most predominant markings of typed language, another and perhaps more recent revolution in the way humans communicate has been the use of symbols or glyphs, primarily Emojis, globally introduced on the iPhone keyboard by Apple in 2008. This paper seeks to analyze the use of symbols in typed communication from both a linguistic and a machine learning perspective. The Unicode system is explored, and methods of encoding are juxtaposed with current machine and human perception. Topics in how typed symbol usage functions in conversation are explored, as well as current research methods dealing with Emojis, such as sentiment analysis and predictive text models. This study proposes that sequential analysis is a significant feature for analyzing Unicode characters in a corpus with machine learning. Current models that try to learn or translate the meaning of Emojis should learn from bi- and tri-grams of Emoji, and should observe the relationships among combinations of different Emoji used in tandem. The sociolinguistics of an entirely new vernacular, referred to here as ‘typed language’, is also delineated in this analysis of Unicode glyphs from both a semantic and a technical perspective.
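
Extracting Emoji bi-grams from typed text can be sketched as below. The Unicode ranges matched are a rough subset of the emoji code blocks, an assumption for illustration rather than a full check of the Unicode emoji property:

```python
import re
from collections import Counter

# Rough emoji matcher over a few common blocks (Misc Symbols and
# Pictographs, Emoticons, Transport, Supplemental Symbols) -- an
# assumption, not an exhaustive Unicode emoji-property test.
EMOJI = re.compile('[\U0001F300-\U0001F5FF\U0001F600-\U0001F64F'
                   '\U0001F680-\U0001F6FF\U0001F900-\U0001F9FF]')

def emoji_ngrams(text, n=2):
    """Count n-grams of consecutive emoji within each uninterrupted
    emoji run in the text."""
    counts = Counter()
    for run in re.findall(EMOJI.pattern + '+', text):
        for i in range(len(run) - n + 1):
            counts[run[i:i + n]] += 1
    return counts
```

Counts collected this way over a corpus are exactly the bi-/tri-gram statistics the abstract argues sequence-aware models should be trained on.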

Keywords: unicode, text symbols, emojis, glyphs, communication

Procedia PDF Downloads 194
910 Mitigation of Interference in Satellite Communications Systems via a Cross-Layer Coding Technique

Authors: Mario A. Blanco, Nicholas Burkhardt

Abstract:

An important problem in satellite communication systems operating in the Ka and EHF frequency bands is the overall degradation in link performance of mobile terminals due to various channel impairments, such as fading, blockage of the link to the satellite (especially in urban environments), and intentional as well as other types of interference. In this paper, we focus primarily on the interference problem, and we develop a very efficient and cost-effective solution based on the use of fountain codes. We first introduce a satellite communications (SATCOM) terminal uplink interference channel model of the kind classically used against communication systems that employ spread-spectrum waveforms. We then consider the use of fountain codes, with a focus on Raptor codes, as our main mitigation technique to combat the degradation in link/receiver performance due to the interference signal. Receiver performance is obtained in terms of the average probability of bit and message error as a function of the bit energy-to-noise density ratio, Eb/N0, and other parameters of interest, via a combination of analysis and computer simulation, and we show that fountain codes are extremely effective in overcoming the effects of intentional interference on the receiver and the associated communication links. We then show that this technique can be extended to mitigate other types of SATCOM channel degradation, such as those caused by channel fading, shadowing, and hard blockage of the uplink signal.
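The defining property of a fountain code, that the receiver can recover the k source blocks from essentially any sufficiently large subset of coded symbols, can be illustrated with a toy LT-style encoder and peeling decoder. This is a sketch only: real Raptor codes add a precode and draw symbol degrees from a robust soliton distribution rather than the uniform degrees assumed here.

```python
import random

def lt_encode(blocks, n_out, seed=0):
    """Emit n_out coded symbols, each the XOR of a random subset
    (degree 1-3 here, purely for illustration) of the source blocks."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_out):
        deg = rng.randint(1, 3)
        idx = rng.sample(range(len(blocks)), deg)
        val = 0
        for i in idx:
            val ^= blocks[i]
        out.append((frozenset(idx), val))
    return out

def lt_decode(symbols, k):
    """Peeling decoder: resolve degree-1 symbols, substitute the
    recovered block into the remaining equations, and repeat."""
    symbols = [(set(i), v) for i, v in symbols]
    recovered = {}
    progress = True
    while progress and len(recovered) < k:
        progress = False
        for idx, val in symbols:
            pending = idx - recovered.keys()
            if len(pending) == 1:
                i = pending.pop()
                v = val
                for j in idx - {i}:
                    v ^= recovered[j]
                recovered[i] = v
                progress = True
    return [recovered.get(i) for i in range(k)]
```

Because each coded symbol is an independent equation over the blocks, losing or corrupting some symbols (e.g., those hit by an interference burst) only requires collecting a few more, which is the mitigation property exploited in the paper.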

Keywords: SATCOM, interference mitigation, fountain codes, turbo codes, cross-layer

Procedia PDF Downloads 361
909 Effects of Nitroxin Fertilizer on Physiological Characters of Forage Millet under Drought Stress Conditions

Authors: Mohammad Darbani, Jafar Masoud Sinaki, Armaghan Abedzadeh Neyshaburi

Abstract:

An experiment was conducted in Damghan in 2012-2013 as a split-plot factorial arranged in a randomized complete block design in order to investigate the effects of irrigation cut-off (based on the phenological stages of the plants) on the physiological properties of forage millet cultivars. The treatments included three irrigation levels (control with full irrigation, irrigation cut-off at the start of flowering, and irrigation cut-off at the end of flowering) in the main plots, and application of nitroxin biofertilizer (+), no nitroxin biofertilizer (control), and Iranian forage millet cultivars (Bastan, Pishahang, and Isfahan) in the subplots. The highest ash and water-soluble carbohydrate contents were observed in the cultivar Bastan (8.22 and 8.91%, respectively), and the highest fiber and water contents (74.17 and 48.83%, respectively) as well as the largest proline concentration (µmol g⁻¹ FW) were seen in the treatment with irrigation cut-off at the start of flowering. The very rapid growth of millet, its short growing season, drought tolerance, unique harvest-time characteristics, and response to nitroxin biofertilizer can help expand its cultivation in the arid and semi-arid regions of Iran.

Keywords: irrigation cut off, forage millet, Nitroxin fertilizer, physiological properties

Procedia PDF Downloads 609
908 Classification of EEG Signals Based on Dynamic Connectivity Analysis

Authors: Zoran Šverko, Saša Vlahinić, Nino Stojković, Ivan Markovinović

Abstract:

In this article, the classification of target letters is performed using data from the EEG P300 Speller paradigm. Neural networks trained on the results of dynamic connectivity analysis between different brain regions are used for classification. Dynamic connectivity analysis is based on an adaptive window size and the imaginary part of the complex Pearson correlation coefficient. Brain dynamics are analysed using the relative intersection of confidence intervals for the imaginary component of the complex Pearson correlation coefficient (RICI-imCPCC) method. The RICI-imCPCC method overcomes the shortcomings of currently used dynamic connectivity analysis methods: the low reliability and low temporal precision for short connectivity intervals encountered in constant sliding-window analysis with a wide window, and the high susceptibility to noise encountered in constant sliding-window analysis with a narrow window. It does so by dynamically adjusting the window size using the RICI rule, extracting information about brain connections for each time sample. Seventy percent of the extracted brain connectivity information is used for training and thirty percent for validation. Classification of the target word is also performed based on the same analysis method. As far as we know, this research shows for the first time that dynamic connectivity can be used as a parameter for classifying EEG signals.
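The core quantity, the imaginary part of the complex Pearson correlation between two channels, can be computed from the analytic signals of the two time series. The sketch below (an illustration, not the RICI-imCPCC implementation, and without the adaptive windowing) builds the analytic signal with a frequency-domain Hilbert transform; its imaginary part vanishes for zero-lag coupling, which is why it is robust to volume conduction.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the frequency-domain Hilbert transform
    (the same construction scipy.signal.hilbert uses)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2
    else:
        h[1:(n + 1) // 2] = 2
    return np.fft.ifft(X * h)

def im_cpcc(x, y):
    """Imaginary part of the complex Pearson correlation coefficient
    between two signals; zero for purely zero-lag (in-phase) coupling."""
    ax, ay = analytic_signal(x), analytic_signal(y)
    ax = ax - ax.mean()
    ay = ay - ay.mean()
    r = np.sum(ax * np.conj(ay)) / (
        np.sqrt(np.sum(np.abs(ax) ** 2)) * np.sqrt(np.sum(np.abs(ay) ** 2)))
    return float(r.imag)
```

For two sinusoids a quarter cycle apart the magnitude of im_cpcc approaches 1, while a signal correlated with itself yields 0.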

Keywords: dynamic connectivity analysis, EEG, neural networks, Pearson correlation coefficients

Procedia PDF Downloads 215
907 MhAGCN: Multi-Head Attention Graph Convolutional Network for Web Services Classification

Authors: Bing Li, Zhi Li, Yilong Yang

Abstract:

Web service classification can promote the quality of service discovery and management in a service repository and is widely used to locate the services developers desire. Although traditional classification methods based on supervised learning can accomplish the task, developers need to tag web services manually, and the quality of these tags may not be sufficient to establish an accurate classifier. With the number of web services doubling, manual tagging has become unrealistic. In recent years, the attention mechanism has made remarkable progress in deep learning, and its huge potential has been demonstrated in various fields. This paper designs a multi-head attention graph convolutional network (MHAGCN) service classification method, which can assign different weights to neighborhood nodes without complicated matrix operations or reliance on knowledge of the entire graph structure. The framework combines the advantages of the attention mechanism and the graph convolutional neural network, classifying web services through automatic feature extraction. Comprehensive experimental results on a real dataset not only show the superior performance of the proposed model over existing models but also demonstrate its potentially good interpretability for graph analysis.
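The key idea, per-head attention weights computed only over a node's neighborhood, can be sketched as follows. This is a generic multi-head graph-attention aggregation under our own simplifying assumptions (scaled dot-product scores, a dense adjacency mask), not the MHAGCN architecture itself.

```python
import numpy as np

def multi_head_attention_agg(H, adj, Wq, Wk, Wv):
    """Aggregate each node's neighborhood with per-head attention weights.
    H: (n, d) node features; adj: (n, n) adjacency (self-loops recommended);
    Wq/Wk/Wv: per-head projection matrices. Scores depend only on node
    pairs, so no global matrix factorisation is needed."""
    outs = []
    for h in range(len(Wq)):
        Q, K, V = H @ Wq[h], H @ Wk[h], H @ Wv[h]
        scores = Q @ K.T / np.sqrt(K.shape[1])
        scores = np.where(adj > 0, scores, -1e9)  # attend to neighbors only
        e = np.exp(scores - scores.max(axis=1, keepdims=True))
        alpha = e / e.sum(axis=1, keepdims=True)  # row-wise softmax
        outs.append(alpha @ V)
    return np.concatenate(outs, axis=1)  # concatenate head outputs
```

Stacking such layers and feeding the final node representations to a softmax over service categories yields the automatic-feature-extraction classifier described in the abstract; the per-edge alpha values are what gives the model its interpretability.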

Keywords: attention mechanism, graph convolutional network, interpretability, service classification, service discovery

Procedia PDF Downloads 136
906 Spatio-Temporal Data Mining with Association Rules for Lake Van

Authors: Tolga Aydin, M. Fatih Alaeddinoğlu

Abstract:

People, throughout history, have made estimates and inferences about the future by using their past experiences. Developments in information technology and improvements in database management systems make it possible to extract useful information from the knowledge at hand for strategic decisions, and different methods have been developed for this purpose. Data mining by association rule learning is one such method. The Apriori algorithm, one of the best-known association rule learning algorithms, is not commonly applied to spatio-temporal data sets. However, it is possible to embed time and space features into the data and make Apriori a suitable technique for learning spatio-temporal association rules. Lake Van, the largest lake in Turkey, is a closed basin. This feature causes the volume of the lake to increase or decrease with changes in the amount of water it holds. In this study, evaporation, humidity, lake altitude, rainfall, and temperature parameters recorded in the Lake Van region over the years are processed with the Apriori algorithm, and a spatio-temporal data mining application is developed to identify overflows and newly formed soil regions (underflows) occurring on the coastal parts of Lake Van. Identifying the possible causes of overflows and underflows may alert experts to take precautions and make the necessary investments.
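The embedding trick the abstract describes, treating discretised time and space attributes as ordinary transaction items, can be shown with a minimal Apriori sketch. This is an illustration, not the study's implementation; item names such as 'rain=high' are invented for the example.

```python
def apriori(transactions, min_support):
    """Plain Apriori: grow frequent itemsets level by level, pruning by
    support. Spatio-temporal use simply embeds discretised attributes as
    items, e.g. 'month=May', 'region=coast', 'rain=high', 'overflow'."""
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)

    def support(s):
        return sum(s <= t for t in transactions) / n

    freq = {}
    level = {frozenset([i]) for t in transactions for i in t}
    level = {s for s in level if support(s) >= min_support}
    while level:
        for s in level:
            freq[s] = support(s)
        # candidate generation: join frequent sets differing by one item
        level = {a | b for a in level for b in level
                 if len(a | b) == len(a) + 1}
        level = {c for c in level if support(c) >= min_support}
    return freq
```

A rule such as rain=high → overflow then gets confidence freq[{rain=high, overflow}] / freq[{rain=high}], which is how the application would surface candidate causes of overflows.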

Keywords: apriori algorithm, association rules, data mining, spatio-temporal data

Procedia PDF Downloads 374
905 Characterization and Effect of Using Pumpkin Seeds Oil Methyl Ester (PSME) as Fuel in a LHR Diesel Engine

Authors: Hanbey Hazar, Hakan Gul, Ugur Ozturk

Abstract:

Thermal barrier coatings are applied to internal combustion engines in order to decrease hazardous emissions and to improve combustion and thermal efficiency. In this experimental study, the combustion chamber components (cylinder, piston, and exhaust and inlet valves) were coated with a ceramic material, giving the engine a low heat rejection (LHR) characteristic. The cylinder and the exhaust and inlet valves of the test diesel engine were coated to a thickness of 50 µm with ekabor-2 commercial powder, a ceramic material, using the boriding method; the piston was coated to a thickness of 300 µm with a boron-based powder using the plasma coating method. Pumpkin seeds oil methyl ester (PSME) was produced by transesterification. In addition, dimethoxymethane was used as an additive to improve the properties of diesel fuel, PSME, and their blends; it was blended with the test fuels, serving as a pilot fuel, at volumetric ratios of 4% and 8%. Owing to the thermal barrier coating, the diesel engine's CO, HC, and smoke density values decreased, while NOx and exhaust gas temperature (EGT) increased.

Keywords: boriding, diesel engine, exhaust emission, thermal barrier coating

Procedia PDF Downloads 477
904 Clinical Profile of Renal Diseases in Children in Tertiary Care Centre

Authors: Jyoti Agrawal

Abstract:

Introduction: Renal diseases in children and young adults can be difficult to diagnose early, as they may present with only a few symptoms, tend to follow a different course than in adults, and respond variably to treatment. The pattern of renal disease in children also differs between developing and developed countries. Methods: This was a hospital-based prospective observational study carried out from March 2014 to February 2015 at the B. P. Koirala Institute of Health Sciences. Patients with renal disease, both inpatient and outpatient, from birth to 14 years of age were enrolled. The diagnosis of renal disease was made on clinical and laboratory criteria. Results: A total of 120 patients were enrolled, contributing 3.74% of total admissions. The most common presenting feature was edema (75%), followed by fever (65%), hypertension (60%), decreased urine output (45%), and hematuria (25%). The most common diagnosis was acute glomerulonephritis (40%), followed by nephrotic syndrome (25%) and urinary tract infection (25%). Renal biopsy was done in 10% of cases, most of them steroid-dependent nephrotic syndrome. Five percent of our cases died of multiorgan dysfunction syndrome, sepsis, and acute kidney injury. Conclusion: Renal disease contributes a large share of pediatric hospital admissions as well as mortality and morbidity in children.

Keywords: glomerulonephritis, nephrotic syndrome, renal disease, urinary tract infection

Procedia PDF Downloads 427
903 Design and Analysis of Crankshaft Using Al-Al2O3 Composite Material

Authors: Palanisamy Samyraj, Sriram Yogesh, Kishore Kumar, Vaishak Cibi

Abstract:

This project concerns the design and analysis of a crankshaft using an Al-Al2O3 composite material and concentrates on two areas: designing and analyzing the composite material, and working on the practical model. Growing competition and growing concern for the environment have forced automobile manufacturers to meet conflicting demands such as increased power and performance, lower fuel consumption, lower pollutant emissions, and decreased noise and vibration. Metal matrix composites offer good properties for a number of automotive components, and this work reports studies on Al-Al2O3 as a possible alternative material for a crankshaft. These materials have been considered for various engine components due to their high strength-to-weight ratio, and are significant for their light weight, high strength, high specific modulus, low coefficient of thermal expansion, and good wear resistance. In addition, the high specific stiffness, superior high-temperature mechanical properties, and oxidation resistance of Al2O3 have led to advanced materials such as Al-Al2O3 composites. Crankshafts are used throughout the automobile industry: the crankshaft is connected to the connecting rod for the movement of the piston and is subjected to high stresses that cause wear. Hence, using a composite material in the crankshaft offers good fuel efficiency, low manufacturing cost, and less weight.

Keywords: metal matrix composites, Al-Al2O3, high specific modulus, strength to weight ratio

Procedia PDF Downloads 275
902 Robust Recognition of Locomotion Patterns via Data-Driven Machine Learning in the Cloud Environment

Authors: Shinoy Vengaramkode Bhaskaran, Kaushik Sathupadi, Sandesh Achar

Abstract:

Human locomotion recognition is important in a variety of sectors, such as robotics, security, healthcare, fitness tracking, and cloud computing. With the increasing pervasiveness of peripheral devices, particularly inertial measurement unit (IMU) sensors, researchers have attempted to exploit these advancements to identify and categorize human activities precisely and efficiently. This paper introduces a state-of-the-art methodology for the recognition of human locomotion patterns in a cloud environment, evaluated on a publicly available benchmark dataset. The investigation applies a denoising and windowing strategy to the unprocessed data; feature extraction then abstracts the main cues, and the SelectKBest strategy retains the optimal features. Furthermore, state-of-the-art ML classifiers, including logistic regression, random forest, gradient boosting, and SVM, are investigated to accomplish precise locomotion classification. Finally, a detailed comparative analysis of the results reveals the performance of the recognition models.
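The windowing, feature-extraction, and feature-selection stages of such a pipeline can be sketched as below. This is a generic illustration under our own assumptions (window statistics as features, a correlation-based scorer standing in for scikit-learn's SelectKBest), not the paper's code.

```python
import numpy as np

def windows(signal, size, step):
    """Slice a (time, channels) IMU stream into overlapping windows."""
    return np.stack([signal[i:i + size]
                     for i in range(0, len(signal) - size + 1, step)])

def extract_features(win):
    """Simple per-window statistics per channel: mean, std, peak-to-peak."""
    return np.concatenate([win.mean(axis=0), win.std(axis=0),
                           win.max(axis=0) - win.min(axis=0)])

def select_k_best(X, y, k):
    """Score each feature by squared correlation with the label and keep
    the top k; a pure-numpy stand-in for sklearn's SelectKBest."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    denom = np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum()) + 1e-12
    scores = ((Xc * yc[:, None]).sum(axis=0) / denom) ** 2
    idx = np.argsort(scores)[::-1][:k]
    return X[:, idx], idx
```

The reduced feature matrix would then be fed to the classifiers named in the abstract (logistic regression, random forest, gradient boosting, SVM).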

Keywords: artificial intelligence, cloud computing, IoT, human locomotion, gradient boosting, random forest, neural networks, body-worn sensors

Procedia PDF Downloads 11
901 Introducing α-Oxoester (COBz) as a Protecting Group for Carbohydrates

Authors: Atul Kumar, Veeranjaneyulu Gannedi, Qazi Naveed Ahmed

Abstract:

Oligosaccharides, which are essential to all cellular organisms, play vital roles in cell recognition and signaling and are involved in a broad range of biological processes. The chemical synthesis of carbohydrates is a powerful tool for providing homogeneous glycans. In carbohydrate synthesis, the major concern is the orthogonal protection of hydroxyl groups so that they can be unmasked independently. Classical protecting groups include benzyl ethers (Bn), which are normally cleaved through hydrogenolysis or by metal reduction, and acetate (Ac), benzoate (Bz), or pivaloate esters, which are removed by base-promoted hydrolysis. In the present work, a series of α-Oxoester (COBz) protected saccharides with divergent base-sensitivity profiles relative to benzoyl (Bz) and acetyl (Ac) were designed, and KHSO₅/CH₃COCl in methanol was identified as an easy, mild, selective, and efficient reagent for their removal in the context of carbohydrate synthesis. Timely monitoring of the latter reagent was advantageous in establishing both sequential and simultaneous deprotection of COBz, Bz, and Ac. The salient feature of our work is the ease with which different acceptors can be generated from the designed monosaccharides. In summary, we demonstrate the α-Oxoester (COBz) as a new protecting group for carbohydrates; the application of this group to the synthesis of glycosylphosphatidylinositol (GPI) anchors is in progress.

Keywords: α-Oxoester, oligosaccharides, new protecting group, acceptor synthesis, glycosylation

Procedia PDF Downloads 150
900 Functional Characterization of Transcriptional Regulator WhiB Proteins of Mycobacterium Tuberculosis

Authors: Sonam Kumari

Abstract:

Mycobacterium tuberculosis (Mtb), the causative agent of tuberculosis, possesses the remarkable ability to enter into and emerge from a persistent state. The mechanism by which Mtb switches from the dormant state to the replicative form is still poorly characterized. Proteome studies have given insight into the role of certain proteins in conferring Mtb's stupendous virulence, but numerous dots remain unconnected and unaccounted for. The WhiB family, associated with developmental processes in actinomycetes, is one such group of proteins; Mtb has seven (WhiB1 to WhiB7). WhiB proteins are transcriptional regulators whose conserved C-terminal HTH motif is involved in DNA binding, and they regulate various essential genes of Mtb by binding to their promoter DNA. The biophysical effect of DNA binding on WhiB proteins has not yet been properly characterized. Interaction with DNA induces conformational changes in the WhiB proteins, as confirmed by steady-state fluorescence and circular dichroism spectroscopy, and ITC was used to deduce the thermodynamic parameters and binding affinity of the interaction. Since these transcription factors are highly unstable in vitro, their stability and solubility were enhanced by co-expression of molecular chaperones. The present findings help determine the conditions under which the WhiB proteins interact with their binding partners and the factors that influence their binding affinity. This is crucial for understanding their role in regulating gene expression in Mtb and for targeting WhiB proteins as drug targets to cure TB.

Keywords: tuberculosis, WhiB proteins, mycobacterium tuberculosis, nucleic acid binding

Procedia PDF Downloads 104
899 Voice Liveness Detection Using Kolmogorov Arnold Networks

Authors: Arth J. Shah, Madhu R. Kamble

Abstract:

Voice biometric liveness detection aims to certify that the voice data presented during authentication is genuine and not a recording or a synthetic voice. With the rise of deepfakes and equally sophisticated spoof-generation techniques, it is becoming challenging to ensure that the person on the other end is a live speaker. A Voice Liveness Detection (VLD) system is a group of security measures that detect and prevent voice spoofing attacks. Motivated by the recent development of the Kolmogorov-Arnold Network (KAN), based on the Kolmogorov-Arnold representation theorem, we propose KAN for the VLD task. To date, multilayer perceptron (MLP) based classifiers have been used for such classification tasks; we aim to capture not only the compositional structure of the model but also to optimize the values of its univariate functions. This study presents both a mathematical and an experimental analysis of KAN for VLD tasks, thereby opening a new perspective for scientists working on speech and signal processing tasks. The study combines traditional signal processing with new deep learning models, which proves to be a better combination for VLD tasks. Experiments are performed on the POCO and ASVspoof 2017 V2 databases. We used constant-Q transform (CQT), Mel, and short-time Fourier transform (STFT) front-end features with CNN, BiLSTM, and KAN back-end classifiers. The best accuracy is 91.26% on the POCO database, using STFT features with the KAN classifier. On the ASVspoof 2017 V2 database, the lowest EER we obtained was 26.42%, using CQT features and KAN as the classifier.
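What distinguishes a KAN layer from an MLP layer, learnable univariate functions on the edges instead of scalar weights followed by a fixed activation, can be sketched in a few lines. This is a conceptual illustration under our own simplifications: the learnable function is a piecewise-linear interpolant on a fixed grid, whereas actual KANs use B-spline parameterisations.

```python
import numpy as np

class PiecewiseLinearEdge:
    """One KAN 'edge': a learnable univariate function phi(x), here a
    piecewise-linear interpolant over a fixed knot grid (real KANs use
    B-splines). The knot values are the trainable parameters."""
    def __init__(self, grid_min=-1.0, grid_max=1.0, n_knots=11, seed=0):
        self.grid = np.linspace(grid_min, grid_max, n_knots)
        self.values = np.random.default_rng(seed).normal(0, 0.1, n_knots)

    def __call__(self, x):
        return np.interp(x, self.grid, self.values)

def kan_layer(x, edges):
    """Kolmogorov-Arnold layer: each output is a sum of univariate
    functions of the inputs, y_j = sum_i phi_ij(x_i), rather than a
    weighted sum passed through one fixed nonlinearity."""
    return np.array([sum(edges[j][i](x[i]) for i in range(len(x)))
                     for j in range(len(edges))])
```

Training would fit the knot values of every edge by gradient descent; a VLD back end would stack such layers over the STFT/CQT feature vector in place of MLP layers.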

Keywords: Kolmogorov Arnold networks, multilayer perceptron, pop noise, voice liveness detection

Procedia PDF Downloads 41