Search results for: Signal Processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5008

1558 Integrating of Multi-Criteria Decision Making and Spatial Data Warehouse in Geographic Information System

Authors: Zohra Mekranfar, Ahmed Saidi, Abdellah Mebrek

Abstract:

This work aims to develop multi-criteria decision making (MCDM) and spatial data warehouse (SDW) methods and to integrate them into a GIS following a ‘GIS dominant’ approach, so that the GIS operating tools can be used to operate the SDW. MCDM methods can provide many solutions to problems involving multiple, heterogeneous criteria. When the problem is complex enough to involve a spatial dimension, it makes sense to combine the MCDM process with other approaches such as data mining and ascending (bottom-up) analyses. In this paper, we present an experiment illustrating a geo-decisional methodology for SDW construction. On-line analytical processing (OLAP) technology, which combines basic multidimensional analysis with the concepts of data mining, provides powerful tools to reveal inferences and information that traditional tools do not make obvious. However, these OLAP tools become more complex in the presence of a spatial dimension. The integration of OLAP with a GIS is the future of geographic and spatial information solutions. GIS offers advanced functions for the acquisition, storage, analysis, and display of geographic information, but its effectiveness for complex spatial analysis is questionable because of its determinism and decisional rigidity. A prerequisite for any analysis or exploration of spatial data is the construction and structuring of a spatial data warehouse (SDW), which must be easily usable by the GIS and by the tools offered by an OLAP system.
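
As an illustration of the kind of MCDM step the abstract describes, the sketch below computes a simple weighted-sum ranking over spatial units such as might be pulled from an SDW fact table. It is not taken from the paper; the criteria, weights, and district names are hypothetical, and a real SOLAP workflow would query the warehouse rather than build a DataFrame by hand.

```python
import numpy as np
import pandas as pd

# Hypothetical criteria values for a set of spatial units (e.g., districts),
# as they might be extracted from a spatial data warehouse fact table.
units = pd.DataFrame({
    "district": ["A", "B", "C", "D"],
    "accessibility": [0.8, 0.5, 0.9, 0.4],   # benefit criterion
    "land_cost":     [120, 80, 150, 60],     # cost criterion
    "flood_risk":    [0.2, 0.6, 0.1, 0.5],   # cost criterion
})

criteria = ["accessibility", "land_cost", "flood_risk"]
is_benefit = {"accessibility": True, "land_cost": False, "flood_risk": False}
weights = {"accessibility": 0.5, "land_cost": 0.3, "flood_risk": 0.2}

# Min-max normalise each criterion, inverting cost criteria so higher is better,
# then accumulate the weighted sum as the MCDM score of each spatial unit.
scores = np.zeros(len(units))
for c in criteria:
    col = units[c].astype(float)
    norm = (col - col.min()) / (col.max() - col.min())
    if not is_benefit[c]:
        norm = 1.0 - norm
    scores += weights[c] * norm

units["mcdm_score"] = scores
print(units.sort_values("mcdm_score", ascending=False))
```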

Keywords: data warehouse, GIS, MCDM, SOLAP

Procedia PDF Downloads 178
1557 The Effect of Common Daily Schedule on the Human Circadian Rhythms during the Polar Day on Svalbard: Field Study

Authors: Kamila Weissova, Jitka Skrabalova, Katerina Skalova, Jana Koprivova, Zdenka Bendova

Abstract:

Any Arctic visitor has to deal with extreme conditions, including constant light during the summer season or constant darkness during wintertime. The light/dark cycle is the most powerful synchronizing signal for the biological clock, and the absence of a daily dark period during the polar day can significantly alter the functional state of the internal clock. However, the inner clock can be synchronized by other zeitgebers such as physical activity, food intake, or social interactions. Here, we investigated the effect of the polar day on the circadian clock of 10 researchers attending the polar base station in the Svalbard region during July. The data obtained on Svalbard were compared with data obtained before the researchers left for the expedition (in the Czech Republic). To determine the state of the circadian clock, we used wrist actigraphy together with sleep diaries, as well as saliva and buccal mucosa samples, both collected every 4 hours over a 24-h interval, to detect melatonin by radioimmunoassay and clock gene (PER1, BMAL1, NR1D1, DBP) mRNA levels by RT-qPCR. The clock gene expression was analyzed using cosinor analysis. From our results, it is apparent that the constant sunlight delayed melatonin onset and postponed physical activity to a similar extent. Nevertheless, the clock gene expression displayed a higher amplitude on Svalbard than the amplitude detected in the Czech Republic. These results suggest that the common daily schedule of the Svalbard expedition can strengthen the circadian rhythm in an environment that lacks a light/dark cycle. In conclusion, constant sunlight delays melatonin onset, but melatonin secretion still remains rhythmic. The effect of constant sunlight on the circadian clock can be minimized by a common, scheduled daily activity.
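
For readers unfamiliar with cosinor analysis, the following is a minimal sketch of a least-squares cosinor fit of a 24-h rhythm, as one might apply to 4-hourly clock gene expression data; the sampling times and expression values shown are invented for illustration, not data from the study.

```python
import numpy as np

def cosinor_fit(t_hours, y, period=24.0):
    """Least-squares cosinor fit: y ~ mesor + amplitude*cos(2*pi*t/period + acrophase)."""
    w = 2 * np.pi / period
    # Linearised design matrix: y = M + beta*cos(wt) + gamma*sin(wt)
    X = np.column_stack([np.ones_like(t_hours), np.cos(w * t_hours), np.sin(w * t_hours)])
    mesor, beta, gamma = np.linalg.lstsq(X, y, rcond=None)[0]
    amplitude = np.hypot(beta, gamma)
    acrophase = np.arctan2(-gamma, beta)  # radians
    return mesor, amplitude, acrophase

# Hypothetical clock-gene expression sampled every 4 h over 24 h.
t = np.array([0, 4, 8, 12, 16, 20], dtype=float)
expr = np.array([1.0, 1.8, 2.4, 1.9, 1.1, 0.7])
print(cosinor_fit(t, expr))
```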

Keywords: actigraph, clock genes, human, melatonin, polar day

Procedia PDF Downloads 173
1556 Computer-Aided Classification of Liver Lesions Using Contrasting Features Difference

Authors: Hussein Alahmer, Amr Ahmed

Abstract:

Liver cancer is one of the common diseases that cause death. Early detection is important for diagnosis and for reducing mortality. Improvements in medical imaging and image processing techniques have significantly enhanced the interpretation of medical images. Computer-Aided Diagnosis (CAD) systems based on these techniques play a vital role in the early detection of liver disease and hence reduce the liver cancer death rate. This paper presents an automated CAD system consisting of three stages: first, automatic liver segmentation and lesion detection; second, feature extraction; and finally, classification of liver lesions into benign and malignant using the novel contrasting feature-difference approach. Several types of intensity and texture features are extracted from both the lesion area and its surrounding normal liver tissue. The difference between the features of the two areas is then used as the new lesion descriptor. Machine learning classifiers are then trained on the new descriptors to automatically classify liver lesions as benign or malignant. The experimental results show promising improvements. Moreover, the proposed approach can overcome the problems of varying ranges of intensity and texture across patients, demographics, and imaging devices and settings.
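
A minimal sketch of the contrasting feature-difference idea: describe each lesion by the difference between its features and those of its own surrounding normal tissue, then train a classifier on those descriptors. The synthetic features, labels, and the choice of a random forest classifier are placeholders, not the authors' pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical intensity/texture feature vectors: one per lesion ROI and one
# for the surrounding normal liver tissue of the same patient.
n_lesions, n_features = 200, 12
lesion_feats = rng.normal(size=(n_lesions, n_features))
surround_feats = rng.normal(size=(n_lesions, n_features))
labels = rng.integers(0, 2, size=n_lesions)  # 0 = benign, 1 = malignant

# The "contrasting feature difference": describe each lesion relative to its
# own surrounding tissue, which normalises away patient/scanner variability.
descriptors = lesion_feats - surround_feats

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, descriptors, labels, cv=5).mean())
```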

Keywords: CAD system, difference of feature, fuzzy c means, lesion detection, liver segmentation

Procedia PDF Downloads 325
1555 TerraEnhance: High-Resolution Digital Elevation Model Generation using GANs

Authors: Siddharth Sarma, Ayush Majumdar, Nidhi Sabu, Mufaddal Jiruwaala, Shilpa Paygude

Abstract:

Digital Elevation Models (DEMs) are digital representations of the Earth’s topography, which include information about elevation, slope, aspect, and other terrain attributes. DEMs play a crucial role in various applications, including terrain analysis, urban planning, and environmental modeling. In this paper, we propose TerraEnhance, a distinct approach to high-resolution DEM generation using Generative Adversarial Networks (GANs) combined with Real-ESRGAN. By learning from a dataset of low-resolution DEMs, the GANs are trained to upscale the data by a factor of 10, resulting in significantly enhanced DEMs with improved resolution and finer details. The integration of Real-ESRGAN further enhances visual quality, leading to more accurate representations of the terrain. A post-processing layer is introduced, employing high-pass filtering to refine the generated DEMs, preserving important details while reducing noise and artifacts. The results demonstrate that TerraEnhance outperforms existing methods, producing high-fidelity DEMs with intricate terrain features and exceptional accuracy. These advancements make TerraEnhance suitable for various applications, such as terrain analysis and precise environmental modeling.
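
The high-pass post-processing step could, for example, take the form of an unsharp-mask-style refinement, as in the sketch below; the filter type, sigma, and strength values are assumptions for illustration, not the paper's exact implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def sharpen_dem(dem, sigma=2.0, strength=0.5):
    """Unsharp-mask style high-pass refinement of an upscaled DEM.

    The low-pass component is estimated with a Gaussian blur; the residual
    (high-pass detail) is scaled and added back to emphasise fine terrain
    features while leaving the large-scale elevation intact.
    """
    low_pass = gaussian_filter(dem, sigma=sigma)
    high_pass = dem - low_pass
    return dem + strength * high_pass

# Hypothetical 256x256 upscaled DEM tile (elevation in metres).
dem_tile = np.random.rand(256, 256) * 100.0
refined = sharpen_dem(dem_tile)
```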

Keywords: DEM, ESRGAN, image upscaling, super resolution, computer vision

Procedia PDF Downloads 8
1554 A Comparative Study of Global Power Grids and Global Fossil Energy Pipelines Using GIS Technology

Authors: Wenhao Wang, Xinzhi Xu, Limin Feng, Wei Cong

Abstract:

This paper comprehensively investigates the current development status of global power grids and fossil energy pipelines (oil and natural gas) and proposes a standard visual platform for global power and fossil energy based on Geographic Information System (GIS) technology. In this visual platform, a series of systematic visual models is proposed using global spatial data and systematic energy and power parameters. Under this visual platform, the current Global Power Grids Map and Global Fossil Energy Pipelines Map are plotted, covering more than 140 countries and regions across the world. Using multi-scale fusion data processing and modeling methods, a basic information-system database of the world's fossil energy pipelines and power grids is established, which provides important data to support global fossil energy and electricity research. Finally, through a systematic and comparative study of global fossil energy pipelines and global power grids, the general status of global fossil energy and electricity development is reviewed, and the energy transition in key areas is evaluated and analyzed. Through the comparative analysis of fossil energy and clean energy, the direction of relevant research toward clean development and energy transition is pointed out.

Keywords: energy transition, geographic information system, fossil energy, power systems

Procedia PDF Downloads 151
1553 Alloy Design of Single Crystal Ni-base Superalloys by Combined Method of Neural Network and CALPHAD

Authors: Mehdi Montakhabrazlighi, Ercan Balikci

Abstract:

The neural network (NN) method is applied to the development of single-crystal Ni-base superalloys with low density and improved mechanical strength. A dataset of 1200 records, which includes the chemical composition of the alloys, applied stress, and temperature as inputs, and density and time to rupture as outputs, is used for training and testing the network. Thermodynamic phase diagram modeling of the screened alloys is performed with the Thermocalc software to model the equilibrium phases and also the microsegregation during solidification processing. The model is first trained on 80% of the data, and the remaining 20% is used to test it. Comparing the predicted values with the experimental ones showed that a well-trained network is capable of accurately predicting the density and time-to-rupture strength of the Ni-base superalloys. The modeling results are used to determine the effect of alloying elements, stress, temperature, and gamma-prime phase volume fraction on the rupture strength of the Ni-base superalloys. This approach is in line with the Materials Genome Initiative and integrated computational materials engineering approaches promoted recently with the aim of reducing the cost and time for the development of new alloys for critical aerospace components. This work has been funded by TUBITAK under grant number 112M783.
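
The train/test workflow described (80/20 split, composition plus stress and temperature as inputs, density and time to rupture as outputs) resembles the sketch below; the synthetic data, network size, and use of scikit-learn's MLPRegressor are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Hypothetical stand-in for the ~1200-record dataset: alloy composition (wt%),
# applied stress and temperature as inputs; density and time to rupture as outputs.
X = rng.uniform(size=(1200, 12))
y = rng.uniform(size=(1200, 2))

# 80% of the data for training, the remaining 20% for testing, as in the abstract.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=1),
)
model.fit(X_train, y_train)
print("R^2 on held-out 20%:", model.score(X_test, y_test))
```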

Keywords: neural network, rupture strength, superalloy, thermocalc

Procedia PDF Downloads 314
1552 Genome Sequencing of the Yeast Saccharomyces cerevisiae Strain 202-3

Authors: Yina A. Cifuentes Triana, Andrés M. Pinzón Velásco, Marío E. Velásquez Lozano

Abstract:

In this work, the sequencing and genome characterization of a natural isolate of the yeast Saccharomyces cerevisiae (strain 202-3), identified as having potential for the production of second-generation ethanol from sugarcane bagasse hydrolysates, is presented. This strain was selected because of its capability to consume xylose during the fermentation of sugarcane bagasse hydrolysates, taking into account that many strains of S. cerevisiae are incapable of processing this sugar. This advantage and other prominent positive aspects of its fermentation profiles evaluated in bagasse hydrolysates made strain 202-3 a candidate for improving the production of second-generation ethanol, and studying the strain at the genomic level was proposed as a first step. The molecular characterization was carried out by paired-end genome sequencing on the Illumina HiSeq 2000 platform; the assembly was performed with different programs, with the ABySS assembler (k-mer 89) finally chosen. Gene prediction was carried out with Augustus, which uses a hidden Markov model approach. The identified genes were scored based on similarity with public nucleotide and protein databases. Records were organized by ontological functions at different hierarchical levels, which identified central metabolic functions and roles of S. cerevisiae strain 202-3 and highlighted the presence of four possible new proteins, two of them probably associated with the positive consumption of xylose.

Keywords: cellulosic ethanol, Saccharomyces cerevisiae, genome sequencing, xylose consumption

Procedia PDF Downloads 320
1551 Cladding Technology for Metal-Hybrid Composites with Network-Structure

Authors: Ha-Guk Jeong, Jong-Beom Lee

Abstract:

The cladding process is a typical technology for manufacturing composite materials by hydrostatic extrusion. Because there is no friction between the metal and the container, a uniform flow can easily be obtained during deformation. In the general solid-state manufacturing process for a metal-matrix composite, metal powders and ceramic powders are mixed in a suitable volume ratio before being compressed or extruded, under cold or hot conditions, in a can. Because it requires a number of unit processing steps to physically mix and disperse materials with very different characteristics, the process is complicated and leads to non-uniform dispersion of the ceramics. It is also difficult to reach uniform, ideal properties because of coherence problems at the interface between the metal and the ceramic reinforcements. The metal hybrid composites presented in this report are manufactured through traditional plastic deformation processes such as hydrostatic extrusion, caliber rolling, and drawing. With these processes, the realization of uniform macro- and microstructures is certainly possible. In this study, aluminum, copper, and titanium were used as constituent materials; by adjusting the component ratio, it was possible to produce metal hybrid composites that maximize the excellent characteristics of each material. MgB₂ superconductor wire was also fabricated via the same process. Their unique electronic and thermal characteristics will be introduced.

Keywords: cladding process, metal-hybrid composites, hydrostatic extrusion, electronic/thermal characteristics

Procedia PDF Downloads 179
1550 Application of Supervised Deep Learning-based Machine Learning to Manage Smart Homes

Authors: Ahmed Al-Adaileh

Abstract:

Renewable energy sources, domestic storage systems, controllable loads, and machine learning technologies will be key components of future smart home management systems. An energy management scheme is presented that uses a Deep Learning (DL) approach to support smart home management systems consisting of a standalone photovoltaic system, a storage unit, a heating, ventilation and air-conditioning system, and a set of conventional and smart appliances. The objective of the proposed scheme is to apply DL-based machine learning to predict various running parameters within a smart home's environment in order to achieve maximum comfort levels for occupants, reduced electricity bills, and less dependency on the public grid. The problem is formulated using reinforcement learning, where decisions are taken by applying a continuous-time Markov decision process. The main contribution of this research is the proposed framework that applies DL to enhance the system's supervised dataset, offering ample opportunities to effectively support smart home systems. A case study involving a set of conventional and smart appliances with dedicated processing units in an inhabited building can demonstrate the validity of the proposed framework. A visualization graph can show 'before' and 'after' results.

Keywords: smart homes systems, machine learning, deep learning, Markov Decision Process

Procedia PDF Downloads 202
1549 A Comparative Analysis of Various Companding Techniques Used to Reduce PAPR in VLC Systems

Authors: Arushi Singh, Anjana Jain, Prakash Vyavahare

Abstract:

Recently, Li-Fi (light fidelity), based on the visible light communication (VLC) technique and up to 100 times faster than Wi-Fi, has been launched. The 5G mobile communication system has now been proposed to use VLC-OFDM as its transmission technique. The VLC system, which operates on visible light, is considered attractive for efficient spectrum use and easy intensity modulation through LEDs. The reason for the high speed of VLC is the LED, which can flicker incredibly fast (on the order of MHz). Another advantage of employing an LED is that it acts as a low-pass filter, resulting in no out-of-band emission. The VLC system falls under the category of ‘green technology’ for utilizing LEDs. In the present scenario, OFDM is used for its high data rates, interference immunity, and high spectral efficiency. In spite of these advantages, OFDM suffers from a large PAPR, ICI among carriers, and frequency offset errors. Since the data transmission technique used in the VLC system is OFDM, the system suffers the drawbacks of both OFDM and VLC: non-linearity due to the non-linear characteristics of the LED, and the high PAPR of OFDM, which drives the high-power amplifier into its non-linear region. The proposed paper focuses on the reduction of PAPR in VLC-OFDM systems. Many techniques have been applied to reduce PAPR: clipping introduces distortion in the carrier; the selective mapping technique wastes bandwidth; and the partial transmit sequence technique is very complex due to the exponentially increased number of sub-blocks. The paper discusses three companding techniques, namely µ-law, A-law, and advanced A-law companding. The analysis shows that the advanced A-law companding technique reduces the PAPR of the signal by adjusting the companding parameter within its range. VLC-OFDM systems are the future of wireless communication, but non-linearity in VLC-OFDM is a severe issue. This paper discusses techniques to reduce PAPR, one of the non-linearities of the system. The companding techniques discussed in this paper provide better results without increasing the complexity of the system.
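
To make the companding idea concrete, the sketch below applies standard µ-law compression to a real-valued OFDM frame and compares the PAPR before and after; the frame construction and µ value are illustrative assumptions, and the advanced A-law variant discussed in the paper is not reproduced here.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a baseband signal, in dB."""
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

def mu_law_compand(x, mu=255.0):
    """Standard mu-law compression applied to a normalised OFDM envelope."""
    x_max = np.max(np.abs(x))
    x_n = x / x_max
    return x_max * np.sign(x_n) * np.log1p(mu * np.abs(x_n)) / np.log1p(mu)

# Hypothetical real-valued OFDM frame of the kind used in intensity-modulated VLC.
rng = np.random.default_rng(0)
n_subcarriers = 256
symbols = rng.choice([-1, 1], size=n_subcarriers) + 1j * rng.choice([-1, 1], size=n_subcarriers)
ofdm_frame = np.fft.ifft(symbols).real

print("PAPR before companding: %.2f dB" % papr_db(ofdm_frame))
print("PAPR after  companding: %.2f dB" % papr_db(mu_law_compand(ofdm_frame)))
```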

Keywords: non-linear companding techniques, peak to average power ratio (PAPR), visible light communication (VLC), VLC-OFDM

Procedia PDF Downloads 285
1548 Unveiling the Detailed Turn Off-On Mechanism of Carbon Dots to Different Sized MnO₂ Nanosensor for Selective Detection of Glutathione

Authors: Neeraj Neeraj, Soumen Basu, Banibrata Maity

Abstract:

Glutathione (GSH) is one of the most important biomolecules of small molecular weight, which helps in various cellular functions such as gene regulation, xenobiotic metabolism, preservation of intracellular redox activities, and signal transduction. Therefore, the detection of GSH requires considerable attention using extremely selective and sensitive techniques. Herein, a rapid fluorometric nanosensor is designed by combining carbon dots (Cdots) and MnO₂ nanoparticles of different sizes for the detection of GSH. A bottom-up approach, i.e., the microwave method, was used to prepare the water-soluble and highly fluorescent Cdots, using ascorbic acid as the precursor. MnO₂ nanospheres of different sizes (large, medium, and small) were prepared by varying the concentration ratio of methionine and KMnO₄ at room temperature, which was confirmed by HRTEM analysis. The successive addition of MnO₂ nanospheres to the Cdots results in fluorescence quenching. From the fluorescence intensity data, Stern-Volmer quenching constant values (K_SV) were evaluated. From the fluorescence intensity and lifetime analyses, it was found that the degree of fluorescence quenching of the Cdots followed the order large > medium > small. Moreover, fluorescence recovery studies were also performed in the presence of GSH. The fluorescence restoration studies show that the turn-on follows the same order, i.e., large > medium > small, which was also confirmed by quantum yield and lifetime studies. The limits of detection (LOD) of GSH in the presence of Cdots@different-sized MnO₂ nanospheres were also evaluated. It was observed that the LOD values were in the μM region and lowest in the case of the large MnO₂ nanospheres. The separation distance (d) between the Cdots and the surface of the different MnO₂ nanospheres was determined. The d values increase with the size of the MnO₂ nanospheres. In summary, the synthesized Cdots@MnO₂ nanocomposites act as a rapid, simple, economical, and environmentally friendly nanosensor for the detection of GSH.
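
For reference, the quenching constants mentioned above are typically evaluated with the textbook Stern-Volmer relation shown below (general form only; the K_SV values themselves come from the paper's measurements):

```latex
% Standard Stern-Volmer relation for fluorescence quenching (textbook form):
% F_0 and F are the fluorescence intensities without and with quencher Q
% (here, the MnO2 nanospheres), and K_SV is the Stern-Volmer quenching constant.
\[
  \frac{F_0}{F} = 1 + K_{\mathrm{SV}}\,[Q]
\]
```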

Keywords: carbon dots, fluorescence, glutathione, MnO₂ nanospheres, turn off-on

Procedia PDF Downloads 152
1547 The Comparative Electroencephalogram Study: Children with Autistic Spectrum Disorder and Healthy Children Evaluate Classical Music in Different Ways

Authors: Galina Portnova, Kseniya Gladun

Abstract:

Twenty-seven children with ASD (mean age 6.13 years, mean CARS score 32.41) and 25 healthy children (mean age 6.35 years) participated in our EEG experiment. Six types of musical stimulation were presented, including Gluck, Javier-Naida, Kenny G, Chopin, and other classical compositions. Children with autism showed an orientation reaction to the music and gave behavioral responses to the different types of music; some of them were able to rate the stimulation on scales. The participants were instructed to remain calm. Brain electrical activity was recorded using a 19-channel EEG recording device, 'Encephalan' (Russia, Taganrog). EEG epochs lasting 150 s were analyzed using the EEGLab plugin for MATLAB (MathWorks Inc.). For the EEG analysis, we used the Fast Fourier Transform (FFT) and analyzed the peak alpha frequency (PAF), the correlation dimension D2, and the stability of rhythms. To express the dynamics of desynchronization of the different rhythms, we calculated the envelope of the EEG signal over the whole frequency range and through a set of narrowband filters, using the Hilbert transform. Our data showed that healthy children exhibited similar EEG spectral changes during musical stimulation and described the feelings induced by the musical fragments. The exception was the ‘Chopin. Prelude’ fragment (no. 6). This musical fragment induced different subjective feelings, behavioral reactions, and EEG spectral changes in children with ASD and in healthy children. The correlation dimension D2 was significantly lower in children with ASD than in healthy children during musical stimulation. The Hilbert envelope frequency was reduced in all groups of subjects during musical compositions 1, 3, 5, and 6 compared to the background. During musical fragments 2 and 4 (terrible), a lower Hilbert envelope frequency was observed only in children with ASD and correlated with the severity of the disorder. The alpha peak frequency was lower than the background during this musical composition in healthy children and, conversely, higher in children with ASD.
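
The narrowband-filter-plus-Hilbert-envelope step can be sketched as follows; the sampling rate, band edges, and synthetic signal are illustrative assumptions rather than the study's actual recordings or exact filter settings.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def band_envelope(eeg, fs, low, high):
    """Envelope of a narrowband-filtered EEG channel via the Hilbert transform."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg)
    return np.abs(hilbert(filtered))

# Hypothetical single-channel EEG segment (250 Hz sampling, 10 s, alpha-dominated).
fs = 250
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

alpha_env = band_envelope(eeg, fs, 8.0, 12.0)
print("mean alpha-band envelope amplitude:", alpha_env.mean())
```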

Keywords: electroencephalogram (EEG), emotional perception, ASD, musical perception, childhood Autism rating scale (CARS)

Procedia PDF Downloads 284
1546 Bakla Po Ako (I Am Gay): A Case Study on the Communication Styles of Selected Filipino Gays in Disclosing Their Sexual Orientation to Their Parents

Authors: Bryan Christian Baybay, M. Francesca Ronario

Abstract:

This study is intended to answer the question “What are the communication styles of selected Filipino gays in breaking their silence on their sexual orientation to their parents?” In this regard, six cases of Filipino gay disclosures were examined through in-depth interviews. The participants were selected through purposive sampling and snowball technique. The theories, Rhetorical Sensitivity of Roderick Hart and Communicator Style of Robert Norton were used to analyze the gathered data and to give support to the communication attitudes, message processing, message rendering and communication styles exhibited in each disclosure. As secondary data and validation, parents and experts in the field of communication, sociology, and psychology were also interviewed and consulted. The study found that Filipino gays vary in the communication styles they use during the disclosure with their parents. All communication styles: impression-leaving, contentious, open, dramatic, dominant, precise, relaxed, friendly, animated, and communicator image were observed by the gays depending on their motivation, relationship and thoughts contemplated. These results lend ideas for future researchers to look into the communication patterns and/or styles of lesbians, bisexuals, transgenders and queers or expand researches on the same subject and the utilization of Social Judgment and Relational Dialectics theories in determining and analyzing LGBTQ communication.

Keywords: communication attitudes, communication styles, Filipino gays, self-disclosure, sexual orientation

Procedia PDF Downloads 523
1545 A Moving Target: Causative Factors for Geographic Variation in a Handed Flower

Authors: Celeste De Kock, Bruce Anderson, Corneile Minnaar

Abstract:

Geographic variation in the floral morphology of a flower species has often been assumed to result from co-variation in the availability of regionally-specific functional pollinator types, giving rise to plant ecotypes that are adapted to the morphology of the main pollinator types in that area. Wachendorfia paniculata is a geographically variable enantiostylous (handed) flower with preliminary observations suggesting that differences in pollinator community composition might be driving differences in the degree of herkogamy (spatial separation of the stigma and anthers on the same flower) across its geographic range. This study aimed to determine if pollinator-related variables such as visitation rate and pollinator type could explain differences in floral morphology seen in different populations. To assess pollinator community compositions, pollinator visitation rates, and the degree of herkogamy and flower size, flowers from 13 populations were observed and measured across the Western Cape, South Africa. Multiple regression analyses indicated that pollinator-related variables had no significant effect on the degree of herkogamy between sites. However, the degree of herkogamy was strongly negatively associated with the time of measurement. It remains possible that pollinators have had an effect on the development of herkogamy throughout the evolutionary timeline of different W. paniculata populations, but not necessarily to the fine-scale degree, as was predicted for this study. Annual fluctuations in pollinator community composition, paired with recent disturbances such as urbanization and the overabundance of artificially introduced honeybee hives, might also result in the signal of pollinator adaptation getting lost. Surprisingly, differences in herkogamy between populations could largely be explained by the time of day at which flowers were measured, suggesting a significant narrowing of the distance between reproductive parts throughout the day. We propose that this floral movement could possibly be an adaptation to ensure pollination if pollinator visitation to a flower was not sufficient earlier in the day, and will be explored in subsequent studies.

Keywords: enantiostyly, floral movement, geographic variation, ecotypes

Procedia PDF Downloads 280
1544 The European Research and Development Project Improved Nuclear Site Characterization for Waste Minimization in Decommissioning under Constrained Environment: Focus on Performance Analysis and Overall Uncertainty

Authors: M. Crozet, D. Roudil, T. Branger, S. Boden, P. Peerani, B. Russell, M. Herranz, L. Aldave de la Heras

Abstract:

The EURATOM work program project INSIDER (Improved Nuclear Site Characterization for Waste minimization in Decommissioning under Constrained Environment) was launched in June 2017. This 4-year project has 18 partners and aims at improving the management of contaminated materials arising from decommissioning and dismantling (D&D) operations by proposing an integrated methodology of characterization. This methodology is based on advanced statistical processing and modelling, coupled with adapted and innovative analytical and measurement methods, with respect to sustainability and economic objectives. In order to achieve these objectives, the approaches will be then applied to common case studies in the form of Inter-laboratory comparisons on matrix representative reference samples and benchmarking. Work Package 6 (WP6) ‘Performance analysis and overall uncertainty’ is in charge of the analysis of the benchmarking on real samples, the organisation of inter-laboratory comparison on synthetic certified reference materials and the establishment of overall uncertainty budget. Assessment of the outcome will be used for providing recommendations and guidance resulting in pre-standardization tests.

Keywords: decommissioning, sampling strategy, research and development, characterization, European project

Procedia PDF Downloads 364
1543 Recommendations Using Online Water Quality Sensors for Chlorinated Drinking Water Monitoring at Drinking Water Distribution Systems Exposed to Glyphosate

Authors: Angela Maria Fasnacht

Abstract:

Detection of anomalies due to the presence of contaminants, also known as early detection in water treatment plants, has become a critical point that deserves in-depth study for improvement and adaptation to current requirements. The design of these systems requires detailed analysis and processing of the data in real time, so it is necessary to apply various statistical methods appropriate to the data generated, such as Spearman’s correlation, factor analysis, cross-correlation, and k-fold cross-validation. Statistical analysis and methods allow the evaluation of large data sets to model the behavior of variables; in this sense, statistical treatment or analysis can be considered a vital step toward developing advanced, machine-learning-focused models that allow optimized data management in real time, applied to early detection systems in water treatment processes. These techniques facilitate the development of new technologies used in advanced sensors. In this work, these methods were applied to identify possible correlations between the measured parameters and the presence of the glyphosate contaminant in the single-pass system. The interaction between the initial glyphosate concentration and the location of the sensors in the reading of the reported parameters was studied.
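
A minimal sketch of the statistical treatment described (Spearman correlation between each sensor parameter and glyphosate, plus k-fold cross-validation of a predictive model); the parameter names, the synthetic readings, and the linear model are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score, KFold

rng = np.random.default_rng(0)

# Hypothetical online water-quality sensor readings and the spiked glyphosate level.
df = pd.DataFrame({
    "conductivity":  rng.normal(500, 30, 300),
    "orp":           rng.normal(650, 25, 300),
    "free_chlorine": rng.normal(1.0, 0.2, 300),
    "glyphosate":    rng.normal(0.5, 0.1, 300),
})

# Spearman rank correlation between each measured parameter and glyphosate.
for col in ["conductivity", "orp", "free_chlorine"]:
    rho, p = spearmanr(df[col], df["glyphosate"])
    print(f"{col}: rho={rho:.2f}, p={p:.3f}")

# k-fold cross-validation of a simple model predicting glyphosate from the sensors.
X, y = df[["conductivity", "orp", "free_chlorine"]], df["glyphosate"]
scores = cross_val_score(LinearRegression(), X, y,
                         cv=KFold(n_splits=5, shuffle=True, random_state=0))
print("mean CV R^2:", scores.mean())
```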

Keywords: glyphosate, emergent contaminants, machine learning, probes, sensors, predictive

Procedia PDF Downloads 121
1542 Improving Perceptual Reasoning in School Children through Chess Training

Authors: Ebenezer Joseph, Veena Easvaradoss, S. Sundar Manoharan, David Chandran, Sumathi Chandrasekaran, T. R. Uma

Abstract:

Perceptual reasoning is the ability that incorporates fluid reasoning, spatial processing, and visual motor integration. Several theories of cognitive functioning emphasize the importance of fluid reasoning. The ability to manipulate abstractions and rules and to generalize is required for reasoning tasks. This study, funded by the Cognitive Science Research Initiative, Department of Science and Technology, Government of India, analyzed the effect of 1-year chess training on the perceptual reasoning of children. A pretest–posttest with control group design was used, with 43 (28 boys, 15 girls) children in the experimental group and 42 (26 boys, 16 girls) children in the control group. The sample was selected from children studying in two private schools from South India (grades 3 to 9), which included both the genders. The experimental group underwent weekly 1-hour chess training for 1 year. Perceptual reasoning was measured by three subtests of WISC-IV INDIA. Pre-equivalence of means was established. Further statistical analyses revealed that the experimental group had shown statistically significant improvement in perceptual reasoning compared to the control group. The present study clearly establishes a correlation between chess learning and perceptual reasoning. If perceptual reasoning can be enhanced in children, it could possibly result in the improvement of executive functions as well as the scholastic performance of the child.

Keywords: chess, cognition, intelligence, perceptual reasoning

Procedia PDF Downloads 356
1541 A Life Cycle Assessment (LCA) of Aluminum Production Process

Authors: Alaa Al Hawari, Mohammad Khader, Wael El Hasan, Mahmoud Alijla, Ammar Manawi, Abdelbaki Benamour

Abstract:

The production of aluminium alloys and ingots, starting from the processing of alumina into aluminium through to the final cast product, was studied using a Life Cycle Assessment (LCA) approach. The studied aluminium supply chain consisted of a carbon plant, a reduction plant, a casting plant, and a power plant. In the LCA model, the environmental loads of the different plants for the production of 1 ton of aluminium metal were investigated. The impact of the aluminium production was assessed in eight impact categories. The results showed that the power plant had the highest impact in all of the impact categories except Human Toxicity Potential (HTP), where the reduction plant had the highest impact, and Marine Aquatic Eco-Toxicity Potential (MAETP), where the carbon plant had the highest impact. Furthermore, the combined impact of the carbon plant and the reduction plant was almost the same as the impact of the power plant in the case of the Acidification Potential (AP). The carbon plant had a positive impact on the environment with respect to the Eutrophication Potential (EP) due to the production of clean water in the process. The natural-gas-based power plant used in the case study had 8.4 times less negative impact on the environment than the heavy-fuel-based power plant and 10.7 times less than the hard-coal-based power plant.

Keywords: life cycle assessment, aluminium production, supply chain, ecological impacts

Procedia PDF Downloads 532
1540 A Comprehensive Methodology for Voice Segmentation of Large Sets of Speech Files Recorded in Naturalistic Environments

Authors: Ana Londral, Burcu Demiray, Marcus Cheetham

Abstract:

Speech recording is a methodology used in many different studies related to cognitive and behaviour research. Modern advances in digital equipment brought the possibility of continuously recording hours of speech in naturalistic environments and building rich sets of sound files. Speech analysis can then extract from these files multiple features for different scopes of research in Language and Communication. However, tools for analysing a large set of sound files and automatically extract relevant features from these files are often inaccessible to researchers that are not familiar with programming languages. Manual analysis is a common alternative, with a high time and efficiency cost. In the analysis of long sound files, the first step is the voice segmentation, i.e. to detect and label segments containing speech. We present a comprehensive methodology aiming to support researchers on voice segmentation, as the first step for data analysis of a big set of sound files. Praat, an open source software, is suggested as a tool to run a voice detection algorithm, label segments and files and extract other quantitative features on a structure of folders containing a large number of sound files. We present the validation of our methodology with a set of 5000 sound files that were collected in the daily life of a group of voluntary participants with age over 65. A smartphone device was used to collect sound using the Electronically Activated Recorder (EAR): an app programmed to record 30-second sound samples that were randomly distributed throughout the day. Results demonstrated that automatic segmentation and labelling of files containing speech segments was 74% faster when compared to a manual analysis performed with two independent coders. Furthermore, the methodology presented allows manual adjustments of voiced segments with visualisation of the sound signal and the automatic extraction of quantitative information on speech. In conclusion, we propose a comprehensive methodology for voice segmentation, to be used by researchers that have to work with large sets of sound files and are not familiar with programming tools.
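
As a rough stand-in for the Praat voice-detection step, the sketch below runs a simple short-time-energy voice-activity detector over a folder of WAV files; the folder name, frame length, threshold, and the assumption of mono 16-bit files are all illustrative, and the actual methodology relies on Praat's detection algorithm rather than this heuristic.

```python
import os
import wave
import numpy as np

def voiced_segments(path, frame_ms=30, energy_factor=3.0):
    """Very simple energy-threshold voice-activity detection for one mono 16-bit WAV file.

    Returns (start_s, end_s) tuples for runs of frames whose short-time energy
    exceeds energy_factor times the median frame energy. This is only a
    placeholder for the Praat voice-detection step described in the paper.
    """
    with wave.open(path, "rb") as w:
        fs = w.getframerate()
        samples = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16).astype(float)
    frame_len = int(fs * frame_ms / 1000)
    n_frames = len(samples) // frame_len
    frames = samples[: n_frames * frame_len].reshape(n_frames, frame_len)
    energy = (frames ** 2).mean(axis=1)
    active = energy > energy_factor * np.median(energy)
    segments, start = [], None
    for i, flag in enumerate(active):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            segments.append((start * frame_ms / 1000, i * frame_ms / 1000))
            start = None
    if start is not None:
        segments.append((start * frame_ms / 1000, n_frames * frame_ms / 1000))
    return segments

# Walk a hypothetical folder structure of 30-second EAR samples and label each file.
for root, _, files in os.walk("ear_recordings"):
    for name in files:
        if name.lower().endswith(".wav"):
            segs = voiced_segments(os.path.join(root, name))
            print(name, "speech" if segs else "no speech", segs[:3])
```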

Keywords: automatic speech analysis, behavior analysis, naturalistic environments, voice segmentation

Procedia PDF Downloads 281
1539 Engineering of Reagentless Fluorescence Biosensors Based on Single-Chain Antibody Fragments

Authors: Christian Fercher, Jiaul Islam, Simon R. Corrie

Abstract:

Fluorescence-based immunodiagnostics are an emerging field in biosensor development and exhibit several advantages over traditional detection methods. While various affinity biosensors have been developed to generate a fluorescence signal upon sensing varying concentrations of analytes, reagentless, reversible, and continuous monitoring of complex biological samples remains challenging. Here, we aimed to genetically engineer biosensors based on single-chain antibody fragments (scFv) that are site-specifically labeled with environmentally sensitive fluorescent unnatural amino acids (UAA). A rational design approach resulted in quantifiable analyte-dependent changes in peak fluorescence emission wavelength and enabled antigen detection in vitro. Incorporation of a polarity indicator within the topological neighborhood of the antigen-binding interface generated a titratable wavelength blueshift with nanomolar detection limits. In order to ensure continuous analyte monitoring, scFv candidates with fast binding and dissociation kinetics were selected from a genetic library employing a high-throughput phage display and affinity screening approach. Initial rankings were further refined towards rapid dissociation kinetics using bio-layer interferometry (BLI) and surface plasmon resonance (SPR). The most promising candidates were expressed, purified to homogeneity, and tested for their potential to detect biomarkers in a continuous microfluidic-based assay. Variations of dissociation kinetics within an order of magnitude were achieved without compromising the specificity of the antibody fragments. This approach is generally applicable to numerous antibody/antigen combinations and currently awaits integration in a wide range of assay platforms for one-step protein quantification.

Keywords: antibody engineering, biosensor, phage display, unnatural amino acids

Procedia PDF Downloads 146
1538 Sliding Mode Power System Stabilizer for Synchronous Generator Stability Improvement

Authors: J. Ritonja, R. Brezovnik, M. Petrun, B. Polajžer

Abstract:

Many modern synchronous generators in power systems are extremely weakly damped. The reasons are cost optimization in machine construction and the introduction of additional control equipment into power systems. Oscillations of the synchronous generators and the related stability problems of the power systems are harmful and can lead to failures in operation and to damage. The only useful solution to increase the damping of the unwanted oscillations is the implementation of power system stabilizers. Power system stabilizers generate an additional control signal that changes the synchronous generator field excitation voltage. Modern power system stabilizers are integrated into the static excitation systems of the synchronous generators. Available commercial power system stabilizers are based on linear control theory. Due to the nonlinear dynamics of the synchronous generator, current stabilizers do not assure optimal damping of the synchronous generator’s oscillations over the entire operating range. For that reason, the use of robust power system stabilizers that are suitable for the entire operating range is reasonable. There are numerous robust techniques applicable to power system stabilizers. In this paper, the use of sliding mode control for synchronous generator stability improvement is studied. On the basis of sliding mode theory, a robust power system stabilizer was developed. The main advantages of the sliding mode controller are the simple realization of the control algorithm, robustness to parameter variations, and the elimination of disturbances. The advantage of the proposed sliding mode controller over a conventional linear controller was tested for damping of the synchronous generator oscillations over the entire operating range. The obtained results show improved damping over the entire operating range of the synchronous generator and an increase in power system stability. The proposed study contributes to progress in the development of advanced stabilizers, which will replace conventional linear stabilizers and improve the damping of synchronous generators.
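
A minimal sketch of the sliding mode idea on a toy second-order, swing-equation-like model: a sliding surface s = c·x1 + x2 and a saturated switching law drive the weakly damped state to zero. The model parameters, gains, and boundary layer are illustrative assumptions, not the stabilizer designed in the paper.

```python
import numpy as np

def simulate(K=5.0, c=2.0, dt=1e-3, t_end=5.0, boundary=0.05):
    """Simulate a weakly damped 2nd-order plant under a sliding mode control law."""
    x1, x2 = 0.3, 0.0              # initial disturbance: angle and speed deviation
    damping, stiffness = 0.2, 4.0  # weakly damped machine-like parameters (per unit)
    history = []
    for _ in range(int(t_end / dt)):
        s = c * x1 + x2                             # sliding surface
        u = -K * np.clip(s / boundary, -1.0, 1.0)   # saturated sign(s) to limit chattering
        x1_dot = x2
        x2_dot = -stiffness * x1 - damping * x2 + u
        x1 += dt * x1_dot
        x2 += dt * x2_dot
        history.append(x1)
    return np.array(history)

print("final angle deviation:", simulate()[-1])
```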

Keywords: control theory, power system stabilizer, robust control, sliding mode control, stability, synchronous generator

Procedia PDF Downloads 224
1537 Random Forest Classification for Population Segmentation

Authors: Regina Chua

Abstract:

To reduce the costs of re-fielding a large survey, a Random Forest classifier was applied to measure the accuracy of classifying individuals into their assigned segments with the fewest possible questions. Given a long survey, one needed to determine the most predictive ten or fewer questions that would accurately assign new individuals to custom segments. Furthermore, the solution needed to be quick in its classification and usable in non-Python environments. In this paper, a supervised Random Forest classifier was modeled on a dataset with 7,000 individuals, 60 questions, and 254 features. The Random Forest consisted of an iterative collection of individual decision trees that result in a predicted segment with robust precision and recall scores compared to a single tree. A random 70-30 stratified sampling for training the algorithm was used, and accuracy trade-offs at different depths for each segment were identified. Ultimately, the Random Forest classifier performed at 87% accuracy at a depth of 10 with 20 instead of 254 features and 10 instead of 60 questions. With an acceptable accuracy in prioritizing feature selection, new tools were developed for non-Python environments: a worksheet with a formulaic version of the algorithm and an embedded function to predict the segment of an individual in real-time. Random Forest was determined to be an optimal classification model by its feature selection, performance, processing speed, and flexible application in other environments.
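
A sketch of the workflow described above using scikit-learn (stratified 70-30 split, depth-limited forest, importance-based reduction to 20 features); the synthetic survey data and hyperparameters are placeholders, so the accuracy printed will not reproduce the 87% reported.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)

# Hypothetical survey data: 7,000 respondents, 254 derived features, and an
# already-assigned custom segment label for each respondent.
X = rng.integers(0, 5, size=(7000, 254)).astype(float)
y = rng.integers(0, 6, size=7000)

# 70-30 stratified split, as described in the abstract.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, stratify=y, random_state=42)

# Fit a depth-limited forest, then keep only the most important features so the
# model can be re-expressed with roughly ten questions.
rf = RandomForestClassifier(n_estimators=300, max_depth=10, random_state=42)
rf.fit(X_train, y_train)
top_idx = np.argsort(rf.feature_importances_)[::-1][:20]

rf_small = RandomForestClassifier(n_estimators=300, max_depth=10, random_state=42)
rf_small.fit(X_train[:, top_idx], y_train)
print("accuracy with 20 features:",
      accuracy_score(y_test, rf_small.predict(X_test[:, top_idx])))
```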

Keywords: machine learning, supervised learning, data science, random forest, classification, prediction, predictive modeling

Procedia PDF Downloads 94
1536 DNA Methylation Changes in Response to Ocean Acidification at the Time of Larval Metamorphosis in the Edible Oyster, Crassostrea hongkongensis

Authors: Yong-Kian Lim, Khan Cheung, Xin Dang, Steven Roberts, Xiaotong Wang, Vengatesen Thiyagarajan

Abstract:

The unprecedented rate of increase in CO₂ levels in the ocean and the subsequent changes in the carbonate system, including decreased pH, known as ocean acidification (OA), are predicted to disrupt not only the calcification process but also several other physiological and developmental processes in a variety of marine organisms, including edible oysters. Nonetheless, not all species are vulnerable to these OA threats; e.g., some species may be able to cope with OA stress through environmentally induced modifications of gene and protein expression. For example, external environmental stressors, including OA, can influence the addition and removal of methyl groups through epigenetic modification (e.g., DNA methylation) to turn gene expression “on or off” as part of a rapid adaptive mechanism for coping with OA. In this study, the above hypothesis was tested by examining the effect of OA, using decreased pH 7.4 as a proxy, on the DNA methylation pattern of an endemic and commercially important estuary oyster species, Crassostrea hongkongensis, at the time of larval habitat selection and metamorphosis. The larval growth rate did not differ between the control pH 8.1 and the treatment pH 7.4. The metamorphosis rate of the pediveliger larvae was higher at pH 7.4 than at the control pH 8.1; however, over one-third of the larvae raised at pH 7.4 failed to attach to an optimal substrate as defined by biofilm presence. During larval development, a total of 130 genes were differentially methylated across the two treatments. The differential methylation in the larval genes may have partially accounted for the higher metamorphosis success rate under decreased pH 7.4 but with poor substratum selection ability. Differentially methylated loci were concentrated in the exon regions and appear to be associated with cytoskeletal and signal transduction, oxidative stress, metabolic processes, and larval metamorphosis, which implies a high potential of C. hongkongensis larvae to acclimate and adapt through non-genetic means to OA threats within a single generation.

Keywords: adaptive plasticity, DNA methylation, larval metamorphosis, ocean acidification

Procedia PDF Downloads 139
1535 Camptothecin Promotes ROS-Mediated G2/M Phase Cell Cycle Arrest, Resulting from Autophagy-Mediated Cytoprotection

Authors: Rajapaksha Gedara Prasad Tharanga Jayasooriya, Matharage Gayani Dilshara, Yung Hyun Choi, Gi-Young Kim

Abstract:

Camptothecin (CPT) is a quinoline alkaloid that inhibits DNA topoisomerase I and induces cytotoxicity in a variety of cancer cell lines. We previously showed that CPT effectively inhibited the invasion of prostate cancer cells and that combined treatment with subtoxic doses of CPT and TNF-related apoptosis-inducing ligand (TRAIL) potently enhanced apoptosis in a caspase-dependent manner in hepatoma cancer cells. Here, we found that treatment with CPT caused an irreversible cell cycle arrest in the G2/M phase. CPT-induced cell cycle arrest was associated with a decrease in the protein level of cell division cycle 25C (Cdc25C) and increased levels of cyclin B and p21. The CPT-induced decrease in Cdc25C was blocked in the presence of the proteasome inhibitor MG132, which reversed the cell cycle arrest. In addition, the CPT-induced increase in Cdc25C phosphorylation resulted from activation of checkpoint kinase 2 (Chk2), which was associated with phosphorylation of ataxia telangiectasia-mutated. Interestingly, the CPT-induced G2/M phase cell cycle arrest is reactive oxygen species (ROS) dependent, since the ROS inhibitors NAC and GSH reversed the CPT-induced cell cycle arrest. These results were further confirmed by using transient knockdown of nuclear factor-erythroid 2-related factor 2 (Nrf2), since it regulates the production of ROS. Our data reveal that treatment with siNrf2 increased the ROS level and further increased the CPT-induced G2/M phase cell cycle arrest. Our data also indicate that CPT enhanced cell cycle arrest through the extracellular signal-regulated kinase (ERK) and c-Jun N-terminal kinase (JNK) pathways. Inhibitors of ERK and JNK further decreased Cdc25C expression and the protein expression of p21 and cyclin B. These findings indicate that Chk2-mediated phosphorylation of Cdc25C plays a major role in the G2/M arrest induced by CPT.

Keywords: camptothecin, cell cycle, checkpoint kinase 2, nuclear factor-erythroid 2-related factor 2, reactive oxygen species

Procedia PDF Downloads 441
1534 Production and Characterization of Ce3+: Si2N2O Phosphors for White Light-Emitting Diodes

Authors: Alparslan A. Balta, Hilmi Yurdakul, Orkun Tunckan, Servet Turan, Arife Yurdakul

Abstract:

Si2N2O (sinoite) is an inorganic oxynitride material that is a promising phosphor host for white light-emitting diodes (WLEDs). However, there is currently limited knowledge about the synthesis of Si2N2O for this purpose. Here, to the best of the authors’ knowledge, we report for the first time the production of Si2N2O-based phosphors from CeO2, SiO2, and Si3N4 as the main starting powders, with a Li2O sintering additive, through the spark plasma sintering (SPS) route. The processing parameters, e.g., pressure, temperature, and sintering time, were optimized to reach monophase Si2N2O-containing samples. The lattice parameter, crystallite size, and amounts of the phases formed were characterized in detail by X-ray diffraction (XRD). Grain morphology, particle size, and distribution were analyzed by scanning and transmission electron microscopy (SEM and TEM). Cathodoluminescence (CL) in the SEM and photoluminescence (PL) analyses were conducted on the samples to determine the excitation and emission characteristics of Ce3+-activated Si2N2O. Results showed that the Si2N2O phase was obtained in a maximum ratio of 90% by sintering for 15 minutes at 1650 °C under 30 MPa pressure. Based on the SEM-CL and PL measurements, the Ce3+: Si2N2O phosphor shows a broad emission band between 400 and 700 nm that corresponds to white light. The present research was supported by TUBITAK under project number 217M667.

Keywords: cerium, oxynitride, phosphors, sinoite, Si₂N₂O

Procedia PDF Downloads 107
1533 Brain-Computer Interfaces That Use Electroencephalography

Authors: Arda Ozkurt, Ozlem Bozkurt

Abstract:

Brain-computer interfaces (BCIs) are devices that output commands by interpreting data collected from the brain. Electroencephalography (EEG) is a non-invasive method of measuring the brain's electrical activity. Since it was invented by Hans Berger in 1929, it has led to many neurological discoveries and has become one of the essential non-invasive measurement methods. Although it has a low spatial resolution, meaning it can only detect when a group of neurons fires at the same time, it is a non-invasive method, making it easy to use without posing any risk. In EEG, electrodes are placed on the scalp, and the voltage difference between a minimum of two electrodes is recorded, which is then used to accomplish the intended task. The recordings of EEGs include, but are not limited to, the currents along dendrites from synapses to the soma, the action potentials along the axons connecting neurons, and the currents through the synaptic clefts connecting axons with dendrites. However, because it is a non-invasive method, there are some sources of noise that may affect the reliability of the EEG signals. For instance, noise from the EEG equipment and the leads, as well as signals coming from the subject, such as heart activity or muscle movements, affect the signals detected by the EEG electrodes. However, new techniques have been developed to differentiate between those signals and the intended ones. Furthermore, an EEG device alone is not enough to analyze the brain data used in a BCI application. Because the EEG signal is very complex, artificial intelligence algorithms are required to analyze it. These algorithms convert the complex data into meaningful and useful information that neuroscientists can use to design BCI devices. Even though invasive BCIs are needed for neurological diseases that require highly precise data, non-invasive BCIs, such as EEG-based systems, are used in many cases to help disabled people or simply to ease people's lives by assisting with basic tasks. For example, EEG is used to detect an impending seizure in epilepsy patients, so that the seizure can then be prevented with the help of a BCI device. Overall, EEG is a commonly used non-invasive BCI technique that has helped develop BCIs and will continue to be used to collect data that eases people's lives as more BCI techniques are developed in the future.

Keywords: BCI, EEG, non-invasive, spatial resolution

Procedia PDF Downloads 71
1532 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms

Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao

Abstract:

Earth's environment and its evolution can be observed through satellite images in near real time. Through satellite imagery, remote sensing data provide crucial information that can be used for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and monitoring climate change. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format, pre-processing the data, feeding the processed data into the proposed algorithm, and analyzing the obtained result. Some of the algorithms used in satellite imagery classification are U-Net, Random Forest, DeepLabv3, CNN, ANN, ResNet, etc. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification. The dataset used is the DeepGlobe land cover classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates in cascade or in parallel to determine the scale of segments.
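
A minimal sketch of running a DeepLabv3 model with a ResNet-50 backbone on a satellite tile, assuming torchvision (0.13+) and the 7 DeepGlobe land-cover classes; the input tile is random and no trained weights are loaded, so this only illustrates the input/output shapes, not the project's trained model.

```python
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

# DeepLabv3 with a ResNet-50 backbone; the DeepGlobe land-cover dataset defines
# 7 classes (urban, agriculture, rangeland, forest, water, barren, unknown).
num_classes = 7
model = deeplabv3_resnet50(weights=None, num_classes=num_classes)
model.eval()

# A dummy 3-channel satellite tile (batch of 1, 512x512 pixels).
tile = torch.rand(1, 3, 512, 512)
with torch.no_grad():
    logits = model(tile)["out"]          # shape: (1, num_classes, 512, 512)
predicted_mask = logits.argmax(dim=1)    # per-pixel land-cover class
print(predicted_mask.shape, predicted_mask.unique())
```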

Keywords: area calculation, atrous convolution, deep globe land cover classification, deepLabv3, land cover classification, resnet 50

Procedia PDF Downloads 140
1531 Segmentation of Liver Using Random Forest Classifier

Authors: Gajendra Kumar Mourya, Dinesh Bhatia, Akash Handique, Sunita Warjri, Syed Achaab Amir

Abstract:

Nowadays, medical imaging has become an integral part of modern healthcare. Abdominal CT images are an invaluable means of abdominal organ investigation and have been widely studied in recent years. Diagnosis of liver pathologies is one of the major areas of current interest in the field of medical image processing and is still an open problem. To study and diagnose the liver in depth, segmentation of the liver is performed to identify which part of the liver is most affected. Manual segmentation of the liver in CT images is time-consuming and suffers from inter- and intra-observer differences. However, automatic or semi-automatic computer-aided segmentation of the liver is a challenging task due to inter-patient variability in liver shape and size. In this paper, we present a technique for automatically segmenting the liver from CT images using a Random Forest classifier. Random forests, or random decision forests, are an ensemble learning method for classification that operates by constructing a multitude of decision trees at training time and outputting the class that is the mode of the classes of the individual trees. After comparison with various other techniques, it was found that the Random Forest classifier provides better segmentation results with respect to accuracy and speed. We validated our results using various techniques, and the method shows above 89% accuracy in all cases.

Keywords: CT images, image validation, random forest, segmentation

Procedia PDF Downloads 313
1530 Bioavailability of Iron in Some Selected Fiji Foods using In vitro Technique

Authors: Poonam Singh, Surendra Prasad, William Aalbersberg

Abstract:

Iron is the most essential trace element in human nutrition. Its deficiency has serious health consequences and is a major public health threat worldwide. The common deficiencies reported in the Fiji population are of Fe, Ca, and Zn. It has also been reported that 40% of women in Fiji are iron deficient. Therefore, we have been studying the bioavailability of iron in commonly consumed Fiji foods. To study the bioavailability, it is essential to assess the iron content of raw foods. This paper reports the iron contents and bioavailability in foods commonly consumed by the multicultural population of Fiji. The food samples (rice, breads, wheat flour, and breakfast cereals) were analyzed by atomic absorption spectrophotometry for total iron and its bioavailability. The white rice had the lowest total iron, 0.10±0.03 mg/100g, but a high bioavailability of 160.60±0.03%. The brown rice had 0.20±0.03 mg/100g total iron and was 85.00±0.03% bioavailable. The white and brown breads showed the highest iron bioavailability, at 428.30±0.11 and 269.35±0.02%, respectively. The Weetabix and the rolled oats had iron contents of 2.89±0.27 and 1.24±0.03 mg/100g, with bioavailability of 14.19±0.04 and 12.10±0.03%, respectively. The most commonly consumed normal wheat flour had 0.65±0.00 mg/100g iron, while the wholemeal and Roti flours had 2.35±0.20 and 0.62±0.17 mg/100g iron, showing bioavailability of 55.38±0.05, 16.67±0.08, and 12.90±0.00%, respectively. The low bioavailability of iron in certain foods may be due to the presence of phytates/oxalates, processing/storage conditions, cooking methods, or interaction with other minerals present in the food samples.

Keywords: iron, bioavailability, Fiji foods, in vitro technique, human nutrition

Procedia PDF Downloads 529
1529 Functional Gene Expression in Human Cells Using Linear Vectors Derived from Bacteriophage N15 Processing

Authors: Kumaran Narayanan, Pei-Sheng Liew

Abstract:

This paper adapts the bacteriophage N15 protelomerase enzyme to assemble linear chromosomes as vectors for gene expression in human cells. Phage N15 has the unique ability to replicate as a linear plasmid with telomeres in E. coli during the prophage stage of its life cycle. The virus-encoded protelomerase enzyme cuts its circular genome and caps its ends to form hairpin telomeres, resulting in a linear, human-chromosome-like structure in E. coli. In mammalian cells, however, no enzyme with TelN-like activities has been found. In this work, we show for the first time the transfer of the protelomerase from phage into human and mouse cells and demonstrate recapitulation of its activity in these hosts. The function of this enzyme is assayed by demonstrating cleavage of its target DNA, followed by detection of telomere formation based on resistance to recBCD enzyme digestion. We show that protelomerase expression persists for at least 60 days, which indicates limited silencing of its expression. Next, we show that an intact human β-globin gene delivered on this linear chromosome accurately retains its expression in the human cellular environment for at least 60 hours, demonstrating its stability and potential as a vector. These results demonstrate that the N15 protelomerase is able to function in mammalian cells to cut and heal DNA to create telomeres, which provides a new tool for creating novel structures by DNA resolution in these hosts.

Keywords: chromosome, beta-globin, DNA, gene expression, linear vector

Procedia PDF Downloads 192