Search results for: computer processing of large databases
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12642

12162 Image Segmentation Techniques: Review

Authors: Lindani Mbatha, Suvendi Rimer, Mpho Gololo

Abstract:

Image segmentation is the process of dividing an image into several sections, such as an object's foreground and background. It is a critical technique in both image processing and computer vision. Most image segmentation algorithms have been developed for gray-scale images, and comparatively little research has addressed color images. Segmentation algorithms also vary with the input data and the application, and nearly all of them struggle in noisy environments. Much of the existing work uses the Markov Random Field (MRF), which is computationally demanding but regarded as robust to noise. In recent years, image segmentation has been applied to problems such as simplifying the processing, interpretation, and analysis of image content. This article reviews and summarizes image segmentation techniques and algorithms developed over the past years, including convolutional neural networks (CNNs), edge-based techniques, region growing, clustering, and thresholding. The advantages and disadvantages of medical ultrasound image segmentation techniques are also discussed, along with applications and potential future developments. The review concludes that no single technique is suitable for segmenting all types of images, but that hybrid techniques yield more accurate and efficient results.
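As a concrete illustration of two of the reviewed families, the sketch below applies Otsu's global thresholding and k-means colour clustering. It is a minimal sketch, assuming scikit-image and scikit-learn are available; the file name and cluster count are hypothetical, not taken from the review.

```python
# Minimal sketch: two classical segmentation techniques from the review.
import numpy as np
from skimage import io, color
from skimage.filters import threshold_otsu
from sklearn.cluster import KMeans

rgb = io.imread("sample.png")[..., :3]      # hypothetical input; drop alpha
gray = color.rgb2gray(rgb)

# Thresholding: one global cut separates foreground from background.
mask = gray > threshold_otsu(gray)

# Clustering: group pixels by colour into k regions (works on colour images).
pixels = rgb.reshape(-1, 3).astype(float)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pixels)
segmented = labels.reshape(rgb.shape[:2])
```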

Keywords: clustering-based, convolution-network, edge-based, region-growing

Procedia PDF Downloads 90
12161 A Prediction Method for Large-Size Event Occurrences in the Sandpile Model

Authors: S. Channgam, A. Sae-Tang, T. Termsaithong

Abstract:

In this research, the occurrences of large-size events in various system sizes of the Bak-Tang-Wiesenfeld sandpile model are considered. The system sizes (square lattices) considered here are 25×25, 50×50, 75×75 and 100×100. The cross-correlation between the time series of the ratio of sites containing 3 grains and the time series of large-size events is analyzed for these four system sizes. Moreover, a prediction method for large-size events in the 50×50 system is introduced. Lastly, it is shown that this prediction method provides slightly higher efficiency than random prediction.
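For readers unfamiliar with the model, the following minimal Python sketch simulates the Bak-Tang-Wiesenfeld sandpile and records the two series whose cross-correlation the paper analyzes: avalanche size and the fraction of sites holding exactly 3 grains. The lattice size, number of drops and burn-in are illustrative choices, not the study's settings.

```python
# Minimal Bak-Tang-Wiesenfeld sandpile sketch on an L x L lattice.
import numpy as np

rng = np.random.default_rng(0)
L = 50
grid = np.zeros((L, L), dtype=int)

def drop_and_topple(grid):
    """Add one grain at a random site, relax the pile, return avalanche size."""
    r, c = rng.integers(0, L, size=2)
    grid[r, c] += 1
    size = 0
    while True:
        unstable = np.argwhere(grid >= 4)
        if len(unstable) == 0:
            return size
        for i, j in unstable:
            grid[i, j] -= 4               # topple: shed 4 grains
            size += 1
            for ni, nj in ((i+1, j), (i-1, j), (i, j+1), (i, j-1)):
                if 0 <= ni < L and 0 <= nj < L:
                    grid[ni, nj] += 1     # grains at the boundary fall off

avalanche, frac3 = [], []
for _ in range(20000):
    avalanche.append(drop_and_topple(grid))
    frac3.append(np.mean(grid == 3))      # ratio of sites containing 3 grains

# Zero-lag cross-correlation between the two series after a burn-in.
a, f = np.array(avalanche[5000:]), np.array(frac3[5000:])
print(np.corrcoef(a, f)[0, 1])
```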

Keywords: Bak-Tang-Wiesenfeld sandpile model, cross-correlation, avalanches, prediction method

Procedia PDF Downloads 377
12160 Social-Cognitive Aspects of Interpretation: Didactic Approaches in Language Processing and English as a Second Language Difficulties in Dyslexia

Authors: Schnell Zsuzsanna

Abstract:

Background: The interpretation of written texts, that is, language processing in the visual domain and the atypical reading abilities known as dyslexia, is an ever-growing concern in today's societies and educational communities. This much-researched condition affects cognitive abilities and, coupled with normal intelligence, typically manifests as difficulties in differentiating sounds and orthography and in the holistic processing of written words. The factors of susceptibility are varied: social, cognitive-psychological, and linguistic factors interact with each other. Methods: The research explains the psycholinguistics of dyslexia on the basis of several empirical experiments and demonstrates how the domain-general abilities of inhibition, retrieval from the mental lexicon, priming, phonological processing, and visual modality transfer affect successful language processing and interpretation. Interpretation of visual stimuli is hindered, and the problem seems to be embedded in a sociocultural, psycholinguistic, and cognitive background. This makes the picture even more complex, suggesting that understanding and resolving the issues of dyslexia has to be interdisciplinary, aided by several disciplines in the humanities and social sciences, and should be researched empirically, so that the practical, educational corollaries can be analyzed on an applied basis. Aim and applicability: The lecture sheds light on the applied, cognitive aspects of interpretation, the social-cognitive traits of language processing, and the mental underpinnings of cognitive interpretation strategies in different languages (namely, Hungarian and English), offering a few applied techniques for success in foreign language learning that can be useful advice for the developers of testing methodologies and measures across ESL teaching and testing platforms.

Keywords: dyslexia, social cognition, transparency, modalities

Procedia PDF Downloads 80
12159 Ice Load Measurements on Known Structures Using Image Processing Methods

Authors: Azam Fazelpour, Saeed R. Dehghani, Vlastimil Masek, Yuri S. Muzychka

Abstract:

This study employs a method based on image analysis and structure information to detect accumulated ice on known structures. The icing of marine vessels and offshore structures causes significant reductions in their efficiency and creates unsafe working conditions, so image processing methods are used to measure ice loads automatically. Most image processing methods are developed based on analyses of captured images. In this method, ice loads on structures are calculated by defining structure coordinates and processing captured images. A pyramidal structure with nine cylindrical bars is designed as the known structure of the experimental setup. Unsymmetrical ice accumulated on the structure in a cold room represents the actual case in the experiments. Camera intrinsic and extrinsic parameters are used to define structure coordinates in the image coordinate system according to the camera location and angle. A thresholding method is applied to the captured images to detect the iced structure in a binary image. The ice thickness of each element is calculated by combining the information from the binary image and the structure coordinates, and averaging ice diameters from different camera views yields the ice thicknesses of the structure elements. Comparison between ice load measurements using this method and the actual ice loads shows positive correlations within an acceptable range of error. The method can be applied to complex structures by defining structure and camera coordinates.
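The thresholding step described above can be sketched as follows. This is a minimal illustration, assuming a bright iced bar on a dark background; the file name, scan row, and calibration scale are hypothetical (the paper's pixel-to-structure mapping comes from the camera's intrinsic and extrinsic parameters).

```python
# Minimal sketch of the binarisation and width-measurement idea.
import numpy as np
from skimage import io, color
from skimage.filters import threshold_otsu

gray = color.rgb2gray(io.imread("iced_bar.png")[..., :3])
binary = gray > threshold_otsu(gray)     # True where the (bright) iced bar is

row = binary[240]                        # one scan line across a bar
cols = np.flatnonzero(row)
width_px = cols.max() - cols.min() + 1 if cols.size else 0

# With a pixel-to-millimetre scale from camera calibration, the ice
# thickness of an element is (iced width - bare width) / 2, and the paper
# averages diameters over several camera views.
scale_mm_per_px = 0.5                    # hypothetical calibration result
print(width_px * scale_mm_per_px, "mm apparent width")
```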

Keywords: camera calibration, ice detection, ice load measurements, image processing

Procedia PDF Downloads 366
12158 Genome Sequencing of the Yeast Saccharomyces cerevisiae Strain 202-3

Authors: Yina A. Cifuentes Triana, Andrés M. Pinzón Velásco, Marío E. Velásquez Lozano

Abstract:

In this work, the sequencing and genome characterization of a natural isolate of the yeast Saccharomyces cerevisiae (strain 202-3), identified as having potential for the production of second-generation ethanol from sugarcane bagasse hydrolysates, is presented. This strain was selected because of its capability to consume xylose during the fermentation of sugarcane bagasse hydrolysates, taking into account that many strains of S. cerevisiae are incapable of processing this sugar. This advantage, and other prominent positive aspects of its fermentation profile in bagasse hydrolysates, made strain 202-3 a candidate for improving the production of second-generation ethanol, and studying the strain at the genomic level was proposed as a first step. The molecular characterization was carried out by paired-end genome sequencing on the Illumina HiSeq 2000 platform; the assembly was performed with different programs, with the assembler ABySS (k-mer 89) finally chosen. Gene prediction was carried out with the hidden Markov model approach of Augustus. The identified genes were scored based on similarity with public nucleotide and protein databases. Records were organized by ontological function at different hierarchical levels, which identified the central metabolic functions and roles of S. cerevisiae strain 202-3, highlighting the presence of four possible new proteins, two of them probably associated with the positive consumption of xylose.
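As background to the assembly step, de Bruijn graph assemblers such as ABySS decompose reads into overlapping k-mers. The toy sketch below uses a placeholder read and a small k, whereas the study used k = 89 on real Illumina reads.

```python
# Minimal k-mer decomposition sketch (the unit de Bruijn assemblers build on).
from collections import Counter

def kmers(read, k):
    return [read[i:i + k] for i in range(len(read) - k + 1)]

read = "ACGTACGTGACG"                 # placeholder read
print(Counter(kmers(read, k=4)))
```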

Keywords: cellulosic ethanol, Saccharomyces cerevisiae, genome sequencing, xylose consumption

Procedia PDF Downloads 317
12157 Glucose Monitoring System Using Machine Learning Algorithms

Authors: Sangeeta Palekar, Neeraj Rangwani, Akash Poddar, Jayu Kalambe

Abstract:

Bio-medical analysis is an indispensable procedure for identifying health-related diseases such as diabetes. Regularly monitoring the glucose level in the body helps identify hyperglycemia and hypoglycemia, which can cause severe medical problems such as nerve damage or kidney disease. This paper presents a method for predicting the glucose concentration in blood samples using image processing and machine learning algorithms. The glucose solution is prepared by the glucose oxidase (GOD) and peroxidase (POD) method. An experimental database is generated based on the colorimetric technique. The image of the glucose solution is captured by a Raspberry Pi camera and analyzed using image processing by extracting the RGB, HSV and LUX color-space values. Regression algorithms such as multiple linear regression, decision tree, random forest, and XGBoost were used to predict the unknown glucose concentration. The multiple linear regression algorithm predicts the results with 97% accuracy. This image processing and machine learning-based approach reduces the hardware complexity of existing platforms.
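The regression stage can be sketched as below. It is a minimal illustration with placeholder feature values (mean colour-channel statistics per image) standing in for the paper's Raspberry Pi camera data, using scikit-learn's multiple linear regression.

```python
# Minimal sketch: colour features -> glucose concentration regression.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Columns: mean R, G, B, H, S, V per solution image (placeholder values).
X = rng.random((60, 6))
# Placeholder concentrations loosely tied to the first colour feature.
y = 80 + 120 * X[:, 0] + 10 * rng.standard_normal(60)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
print("R^2 on held-out samples:", model.score(X_te, y_te))
```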

Keywords: artificial intelligence, glucose detection, glucose oxidase, peroxidase, image processing, machine learning

Procedia PDF Downloads 198
12156 A Cloud Computing System Using Virtual Hyperbolic Coordinates for Services Distribution

Authors: Telesphore Tiendrebeogo, Oumarou Sié

Abstract:

Cloud computing technologies have attracted considerable interest in recent years and have become important for many existing database applications. They provide a new mode of use and delivery of IT resources in general: such resources can be used “on demand” by anybody who has access to the internet. In particular, the Cloud platform provides an easy-to-use interface between providers and users, allowing providers to develop and offer software and databases to users across locations. Currently, many Cloud platform providers support large-scale database services. However, most of these support only simple keyword-based queries and cannot answer complex queries efficiently, due to the lack of efficient multi-attribute indexing techniques; existing Cloud platform providers therefore seek to improve the performance of indexing techniques for complex queries. In this paper, we define a new cloud computing architecture based on a Distributed Hash Table (DHT) and design a prototype system. We then implement and evaluate our cloud computing indexing structure, which is based on a hyperbolic tree using virtual coordinates taken in the hyperbolic plane. Our experimental results, compared with other cloud systems, show that our solution ensures consistency and scalability for the Cloud platform.
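To make the virtual-coordinate idea concrete, here is a minimal sketch (not the authors' implementation) of two ingredients such hyperbolic addressing schemes typically rest on: the Poincaré-disk distance and greedy forwarding towards the node closest to a key. The node coordinates are hypothetical.

```python
# Minimal sketch: hyperbolic (Poincare disk) distance and greedy routing.
import math

def poincare_distance(u, v):
    """Hyperbolic distance between two points strictly inside the unit disk."""
    du = 1 - (u[0]**2 + u[1]**2)
    dv = 1 - (v[0]**2 + v[1]**2)
    euclid2 = (u[0] - v[0])**2 + (u[1] - v[1])**2
    return math.acosh(1 + 2 * euclid2 / (du * dv))

def greedy_next_hop(neighbours, target):
    """Forward to the neighbour hyperbolically closest to the target key."""
    return min(neighbours, key=lambda n: poincare_distance(n, target))

# Hypothetical virtual coordinates assigned by the overlay.
neighbours = [(0.4, 0.0), (0.2, 0.6), (0.6, -0.2)]
target = (0.7, -0.3)
print(greedy_next_hop(neighbours, target))
```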

Keywords: virtual coordinates, cloud, hyperbolic plane, storage, scalability, consistency

Procedia PDF Downloads 420
12155 The Amount of Information Processing and Balance Performance in Children: The Dual-Task Paradigm

Authors: Chin-Chih Chiou, Tai-Yuan Su, Ti-Yu Chen, Wen-Yu Chiu, Chungyu Chen

Abstract:

The purpose of this study was to investigate the effect on reaction time (RT) and balance performance as the number of stimulus-response choices increases, comparing amounts of information processing of 0 bits and 1 bit based on Hick’s law, using a dual-task design. Eighteen children (age: 9.38 ± 0.27 years) were recruited as participants and asked to perform RT and balance tasks separately and simultaneously under the following five conditions: simple RT (0-bit decision), choice RT (1-bit decision), single balance control, balance control with simple RT, and balance control with choice RT. A Biodex 950-300 balance system and a You-Shang response timer were used to record and analyze postural stability and information processing speed (RT), respectively. A repeated-measures one-way ANOVA with HSD post-hoc tests and a 2 (balance) × 2 (amount of information processing) repeated-measures two-way ANOVA were used to test the balance performance and RT parameters (α = .05). The results showed that the overall stability index in the 1-bit decision was lower than in the 0-bit decision, and the mean deflection in the 1-bit decision was lower than in single balance performance. Simple RTs were faster than choice RTs in both the single-task and dual-task conditions. This indicates that the chronometric approach of RT can be used to infer the attention requirement of the secondary task. However, this study did not find that balance performance in children is interfered with by an increase in the amount of information processing.
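For reference, the information loads behind the 0-bit and 1-bit conditions follow the standard Hick-Hyman relation (a textbook formulation, not quoted from the paper): for N equally likely stimulus-response alternatives,

```latex
H = \log_2 N \ \text{bits}, \qquad RT = a + b\,H
```

so N = 1 gives H = 0 bits (simple RT) and N = 2 gives H = 1 bit (choice RT), with a and b as empirically fitted intercept and slope.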

Keywords: capacity theory, reaction time, Hick’s law, balance

Procedia PDF Downloads 448
12154 Exploring the Intersection Between the General Data Protection Regulation and the Artificial Intelligence Act

Authors: Maria Jędrzejczak, Patryk Pieniążek

Abstract:

The European legal reality is on the eve of significant change. In European Union law there is talk of a “fourth industrial revolution”, driven by massive data resources linked to powerful algorithms and vast computing capacity. This is closely linked to technological developments in the area of artificial intelligence, which has prompted analysis covering the legal environment as well as the economic and social impact, also from an ethical perspective. The discussion on the regulation of artificial intelligence is one of the most serious currently held at both European Union and Member State level. The literature expects legal solutions to guarantee security for fundamental rights, including privacy, in artificial intelligence systems. There is no doubt that personal data have been increasingly processed in recent years; it would be impossible for artificial intelligence to function without processing large amounts of data, both personal and non-personal. The main driving forces behind the current development of artificial intelligence are advances in computing and the increasing availability of data. High-quality data are crucial to the effectiveness of many artificial intelligence systems, particularly when using techniques involving model training. The use of computers and artificial intelligence technology increases the speed and efficiency of the actions taken, but it also creates security risks of unprecedented magnitude for the data processed. The proposed regulation in the field of artificial intelligence therefore requires analysis in terms of its impact on the regulation of personal data protection. It is necessary to determine the mutual relationship between these regulations and which areas of personal data protection regulation are particularly important for the processing of personal data in artificial intelligence systems. The adopted axis of consideration is a preliminary assessment of two issues: 1) which data protection principles should apply during the processing of personal data in artificial intelligence systems, and 2) how liability for personal data breaches in such systems is regulated. The need to change the regulations regarding the rights and obligations of data subjects and of entities processing personal data cannot be excluded, and changes may be required in the provisions regarding the assignment of liability for a breach of personal data protection in artificial intelligence systems. The research process concerns the identification of areas of personal data protection that are particularly important, and may require re-regulation, due to the introduction of the proposed legal regulation on artificial intelligence. The main question the authors want to answer is how the European Union's regulation of data protection breaches in artificial intelligence systems is shaping up. The answer will include examples to illustrate the practical implications of these legal regulations.

Keywords: data protection law, personal data, AI law, personal data breach

Procedia PDF Downloads 56
12153 Rapid Soil Classification Using Computer Vision, Electrical Resistivity and Soil Strength

Authors: Eugene Y. J. Aw, J. W. Koh, S. H. Chew, K. E. Chua, Lionel L. J. Ang, Algernon C. S. Hong, Danette S. E. Tan, Grace H. B. Foo, K. Q. Hong, L. M. Cheng, M. L. Leong

Abstract:

This paper presents a novel rapid soil classification technique that combines computer vision with the four-probe soil electrical resistivity method and the cone penetration test (CPT) to improve the accuracy and productivity of on-site classification of excavated soil. In Singapore, excavated soils from local construction projects are transported to Staging Grounds (SGs) to be reused as fill material for land reclamation. Excavated soils are mainly categorized into two groups (“Good Earth” and “Soft Clay”) based on particle size distribution (PSD) and water content (w) from soil investigation reports and on-site visual surveys, so that proper treatment and usage can be exercised. However, this process is time-consuming and labour-intensive; thus, a rapid classification method is needed at the SGs. Computer vision, four-probe soil electrical resistivity and CPT were combined into an innovative, non-destructive and instantaneous classification method for this purpose. The computer vision technique comprises soil image acquisition using an industrial-grade camera; image processing and analysis via calculation of Grey Level Co-occurrence Matrix (GLCM) textural parameters; and decision-making using an Artificial Neural Network (ANN). Complementing the computer vision technique, the apparent electrical resistivity of the soil (ρ) is measured using a set of four probes arranged in Wenner’s array. It was found from a previous study that the ANN model coupled with ρ can classify soils into “Good Earth” and “Soft Clay” in less than a minute, with an accuracy of 85% based on selected representative soil images. To further improve the technique, the soil strength is measured using a modified mini cone penetrometer, and w is measured using a set of time-domain reflectometry (TDR) probes. Laboratory proof-of-concept was conducted through a series of seven tests with three types of soils – “Good Earth”, “Soft Clay” and an even mix of the two. Validation was performed against the PSD and w of each soil type obtained from conventional laboratory tests. The results show that ρ, w and CPT measurements can be collectively analyzed to classify soils into “Good Earth” or “Soft Clay”. It was also found that these parameters can be integrated with the computer vision technique on-site to complete the rapid soil classification in less than three minutes.
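The GLCM feature step can be sketched as follows. This is a minimal illustration using scikit-image and scikit-learn, with random patches and labels standing in for the paper's soil image dataset; the offsets, angles and network size are chosen arbitrarily.

```python
# Minimal sketch: GLCM texture features feeding a small ANN classifier.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neural_network import MLPClassifier

def glcm_features(patch_8bit):
    glcm = graycomatrix(patch_8bit, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Hypothetical labelled patches: 0 = "Good Earth", 1 = "Soft Clay".
rng = np.random.default_rng(0)
patches = rng.integers(0, 256, size=(20, 64, 64), dtype=np.uint8)
labels = rng.integers(0, 2, size=20)

X = np.array([glcm_features(p) for p in patches])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, labels)
```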

Keywords: computer vision technique, cone penetration test, electrical resistivity, rapid and non-destructive, soil classification

Procedia PDF Downloads 208
12152 Embedded Acoustic Signal Processing System Using OpenMP Architecture

Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad

Abstract:

In this paper, the Altera DE1-SoC FPGA board is utilized as a distinguished tool for the nondestructive characterization of an aluminum circular cylindrical shell of radius ratio b/a (a: outer radius; b: inner radius). The acoustic backscattered signal processing system has been developed using the OpenMP architecture. The design is built in three blocks and is implemented per functional block in a heterogeneous Intel-Altera system running under Linux. The reference data used to determine the performance of the SoC FPGA are computed by the analytical method. The exploitation of the SoC FPGA has led to obtaining the backscattering form function and resonance spectra, and the A0 and S0 modes of propagation in the tube are shown. The findings are then compared to those achieved from the Matlab simulation of the analytical method, and good agreement has been noted. Moreover, the SoC FPGA-based system computes the acoustic spectra up to 5 times faster than the Matlab implementation using almost the same data. This FPGA-based implementation of the processing algorithms achieves a correlation coefficient R of about 0.962 and an absolute error of about 5×10⁻⁵.

Keywords: OpenMP, signal processing system, acoustic backscattering, nondestructive characterization, thin tubes

Procedia PDF Downloads 88
12151 Massively-Parallel Bit-Serial Neural Networks for Fast Epilepsy Diagnosis: A Feasibility Study

Authors: Si Mon Kueh, Tom J. Kazmierski

Abstract:

About 1% of the world population suffers from the hidden disability known as epilepsy, and major developing countries are not fully equipped to counter this problem. In order to reduce the inconvenience and danger of epilepsy, different methods have been researched that use artificial neural network (ANN) classification to distinguish epileptic waveforms from normal brain waveforms. This paper outlines the aim of achieving massive ANN parallelization through dedicated hardware using bit-serial processing. The design of a bit-serial Neural Processing Element (NPE) is presented, which implements the functionality of a complete neuron with variable accuracy. The proposed design has been tested taking into consideration the non-idealities of a hardware ANN. The NPE consists of a bit-serial multiplier, which uses only 16 logic elements on an Altera Cyclone IV FPGA, a bit-serial ALU, and a look-up table. Arrays of NPEs can be driven by a single controller that executes the neural processing algorithm. In conclusion, the proposed compact NPE design allows the construction of complex hardware ANNs that can be implemented in portable equipment suited to the needs of an individual epileptic patient in his or her daily activities to predict the occurrence of impending tonic-clonic seizures.
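The bit-serial idea can be illustrated behaviourally. The Python sketch below is a software model, not the authors' HDL; it shows why a bit-serial multiplier needs so little logic: it consumes one multiplier bit per clock cycle and accumulates shifted copies of the multiplicand.

```python
# Minimal behavioural model of a bit-serial multiply (one bit per "cycle").
def bit_serial_multiply(a, b, width=8):
    acc = 0
    for cycle in range(width):        # one multiplier bit per clock cycle
        if (b >> cycle) & 1:
            acc += a << cycle         # add the shifted multiplicand
    return acc

assert bit_serial_multiply(13, 11) == 143
```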

Keywords: Artificial Neural Networks (ANN), bit-serial neural processor, FPGA, Neural Processing Element (NPE)

Procedia PDF Downloads 317
12150 Spatial-Temporal Clustering Characteristics of Dengue in the Northern Region of Sri Lanka, 2010-2013

Authors: Sumiko Anno, Keiji Imaoka, Takeo Tadono, Tamotsu Igarashi, Subramaniam Sivaganesh, Selvam Kannathasan, Vaithehi Kumaran, Sinnathamby Noble Surendran

Abstract:

Dengue outbreaks are affected by biological, ecological, socio-economic and demographic factors that vary over time and space. These factors have been examined separately and still require systematic clarification. The present study aimed to investigate the spatial-temporal clustering relationships between these factors and dengue outbreaks in the northern region of Sri Lanka. Remote sensing (RS) data gathered from several satellites were used to develop an index comprising rainfall, humidity and temperature data. RS data gathered by ALOS/AVNIR-2 were used to detect urbanization, and a digital land cover map was used to extract land cover information. Other data on relevant factors and dengue outbreaks were collected through institutions and existing databases. The analyzed RS data and databases were integrated into geographic information systems, enabling temporal analysis, spatial statistical analysis and space-time clustering analysis. The present results showed that increases in the number of ecological, socio-economic and demographic factors that were above average, or present, contribute to significantly high rates of space-time dengue clusters.

Keywords: ALOS/AVNIR-2, dengue, space-time clustering analysis, Sri Lanka

Procedia PDF Downloads 474
12149 The Impact of Artificial Intelligence on Food Industry

Authors: George Hanna Abdelmelek Henien

Abstract:

Quality and safety issues are common in Ethiopia's food processing industry, and they can negatively impact consumers' health and livelihoods. The country is known for various agricultural products that are important to its economy, yet weak food quality and safety policies and management practices in the food processing industry have led to many health problems, foodborne illnesses and economic losses. This article aims to show the causes and consequences of food safety and quality problems in Ethiopia's food processing industry and to discuss possible solutions. One of the main causes is the lack of adequate regulation and enforcement mechanisms: inadequate food safety and quality policies have led to inefficiencies in food production, and the failure to monitor and enforce existing regulations has created an opportunity for unscrupulous companies to engage in harmful practices that endanger the lives of citizens. The impact is significant, comprising loss of life, high medical costs, and loss of consumer confidence in the food processing industry. Foodborne diseases such as diarrhoea, typhoid and cholera are common in Ethiopia, and food quality and safety play an important role in their prevention. Additionally, food recalls due to contamination often cause significant economic losses in the food processing industry. To address these problems, the Ethiopian government has begun taking measures to improve food quality and safety in the food processing industry. One of the most prominent initiatives is the Ethiopian Food and Drug Administration (EFDA), established in 2010 to monitor and control the quality and safety of food and beverage products in the country. The EFDA has implemented many measures to improve food safety, such as carrying out routine inspections, monitoring the import of food products and implementing labeling requirements. Another solution that can improve food quality and safety in Ethiopia's food processing industry is the implementation of a food safety management system (FSMS), a set of procedures and policies designed to identify, assess and control food safety risks during food processing. Implementing an FSMS can help companies in the food processing industry identify and address potential risks before they harm consumers, and it can help them comply with current safety regulations. Consequently, improving food safety policy and management systems in Ethiopia's food processing industry is important for protecting people's health and improving the country's economy; this requires addressing the root causes of food quality and safety problems and implementing practical solutions, such as establishing regulatory bodies and implementing food safety management systems.

Keywords: food quality, food safety, policy, management system, food processing industry, food traceability, industry 4.0, internet of things, blockchain, best-worst method, MARCOS

Procedia PDF Downloads 56
12148 Toward Destigmatizing the Autism Label: Conceptualizing Celebratory Technologies

Authors: LouAnne Boyd

Abstract:

From the perspective of self-advocates, the biggest unaddressed problem is not the symptoms of an autism spectrum diagnosis but the social stigma that accompanies autism. This societal perspective stands in contrast to the focus of the majority of interventions. Autism interventions, and consequently most innovative technologies for autism, aim to improve deficits that occur within the person. For example, the most common Human-Computer Interaction research projects in assistive technology for autism target social skills from a normative perspective. The premise of these autism technologies is that difficulties occur inside the body; hence, the medical model focuses on ways to improve the ailment within the person. However, other technological approaches to supporting people with autism do exist. In the realm of Human-Computer Interaction, other modes of research provide critiques of the medical model. For example, critical design, whose intended audience is industry or other HCI researchers, provides products that are the opposite of interventionist work in order to draw attention to the misalignment between the lived experience and the societal perception of autism; parodies of interventionist work exist to provoke change, such as a recent project called Facesavr, a face covering that helps allistic adults be more independent in their emotional processing. Additionally, from a critical disability studies perspective, assistive technologies perpetuate harmful normalizing behaviors. These critical approaches, however, can feel far from the front line in terms of taking direct action to positively impact end users. From a critical yet more pragmatic perspective, projects such as Counterventions list ways to reduce the likelihood of perpetuating ableism in interventionist work by reflectively analyzing a series of evolving assistive technology projects through a societal lens, thus leveraging the momentum of the evolving ecology of technologies for autism. Therefore, all current paradigms fall short of addressing the largest need: the negative impact of social stigma. The current work introduces a new paradigm for technologies for autism, borrowing from a paradigm introduced two decades ago to change the narrative around eating disorders, namely the shift from reprimanding poor habits to celebrating positive aspects of eating. This work repurposes that approach as Celebratory Technology for Neurodiversity, intended to reduce social stigma by targeting the public at large. This presentation reviews how requirements were derived from current research on autism social stigma as well as from design sessions with autistic adults. Congruence between these two sources revealed three key design implications for technology: provide awareness of the autistic experience; generate acceptance of neurodivergence; and cultivate an appreciation for the talents and accomplishments of neurodivergent people. The current pilot work in Celebratory Technology offers a new paradigm for supporting autism by shifting the burden of change from the person with autism to society's biases at large. Shifting the focus of research outside the autistic body creates a new space for design that extends beyond the bodies of a few and calls on all to embrace humanity as a whole.

Keywords: neurodiversity, social stigma, accessibility, inclusion, celebratory technology

Procedia PDF Downloads 69
12147 Digital Revolution a Veritable Infrastructure for Technological Development

Authors: Osakwe Jude Odiakaosa

Abstract:

Today’s digital society is characterized by e-education or e-learning, e-commerce, and so on, all of which have been propelled by the digital revolution. Digital technologies such as computer technology, the Global Positioning System (GPS) and Geographic Information Systems (GIS) have had a tremendous impact on the field of technology. This development has positively affected the scope, methods and speed of data acquisition, data management, and the rate of delivery of the results (maps and other map products) of data processing. This paper addresses the impact of the revolution brought about by digital technology.

Keywords: digital revolution, internet, technology, data management

Procedia PDF Downloads 446
12146 A Quality Index Optimization Method for Non-Invasive Fetal ECG Extraction

Authors: Lucia Billeci, Gennaro Tartarisco, Maurizio Varanini

Abstract:

Fetal cardiac monitoring by fetal electrocardiogram (fECG) can provide significant clinical information about the health of the fetus. Despite this potential, the use of fECG in clinical practice has so far been quite limited due to the difficulties in measuring it. The recovery of fECG from signals acquired non-invasively using electrodes placed on the maternal abdomen is a challenging task, because abdominal signals are a mixture of several components and the fetal one is very weak. This paper presents an approach for fECG extraction from abdominal maternal recordings that exploits the pseudo-periodicity of the fetal ECG. It consists of devising a quality index (fQI) for the fECG and finding the linear combinations of preprocessed abdominal signals that maximize this fQI (quality index optimization, QIO). It aims at improving the performance of the most commonly adopted methods for fECG extraction, which are usually based on estimating and cancelling the maternal ECG (mECG). The procedure for fECG extraction and fetal QRS (fQRS) detection is completely unsupervised and based on the following steps: signal pre-processing; mECG extraction and maternal QRS detection; mECG component approximation and cancelling by weighted principal component analysis; and fECG extraction by fQI maximization followed by fetal QRS detection. The proposed method was compared with our previously developed procedure, which obtained the highest score at the PhysioNet/Computing in Cardiology Challenge 2013. That procedure was based on removing an mECG estimate, obtained by principal component analysis (PCA), from the abdominal signals and applying Independent Component Analysis (ICA) to the residual signals. Both methods were developed and tuned using 69 one-minute abdominal measurements with fetal QRS annotations from dataset A of the PhysioNet/Computing in Cardiology Challenge 2013. The QIO-based and ICA-based methods were then compared on two databases of abdominal maternal ECG available on the PhysioNet site: the Abdominal and Direct Fetal Electrocardiogram Database (ADdb), which contains fetal QRS annotations and thus allows a quantitative performance comparison, and the Non-Invasive Fetal Electrocardiogram Database (NIdb), which does not contain fetal QRS annotations, so that the comparison there can only be qualitative. On the annotated ADdb database, the QIO method provided the performance indexes Sens=0.9988, PPA=0.9991, F1=0.9989, overcoming the ICA-based one, which provided Sens=0.9966, PPA=0.9972, F1=0.9969. The comparison on NIdb was performed by defining an index of quality for the fetal RR series; this index was higher for the QIO-based method than for the ICA-based one in 35 out of 55 records of the NIdb. The QIO-based method gave very high performance with both databases. The results of this study point towards the application of the algorithm, in a fully unsupervised way, in wearable devices for self-monitoring of fetal health.
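The ICA-based baseline the authors compare against can be sketched on synthetic data. This is a minimal illustration with sinusoidal placeholders for the maternal and fetal components, using scikit-learn's FastICA; the real pipeline operates on preprocessed abdominal leads after mECG cancellation.

```python
# Minimal sketch: blind source separation of mixed "maternal"/"fetal" signals.
import numpy as np
from sklearn.decomposition import FastICA

fs = 500
t = np.arange(0, 10, 1 / fs)
maternal = np.sin(2 * np.pi * 1.2 * t)        # ~72 bpm placeholder rhythm
fetal = 0.2 * np.sin(2 * np.pi * 2.3 * t)     # ~138 bpm placeholder rhythm
noise = 0.05 * np.random.default_rng(0).standard_normal((3, t.size))

# Three abdominal channels = different mixtures of the two sources + noise.
mixing = np.array([[1.0, 0.3], [0.8, 0.5], [0.6, 0.4]])
channels = mixing @ np.vstack([maternal, fetal]) + noise

# One recovered column approximates the fetal component (up to sign/scale).
sources = FastICA(n_components=2, random_state=0).fit_transform(channels.T)
```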

Keywords: fetal electrocardiography, fetal QRS detection, independent component analysis (ICA), optimization, wearable

Procedia PDF Downloads 275
12145 Integrated Teaching of Hardware Courses for the Undergraduates of Computer Science and Engineering to Attain Focused Outcomes

Authors: Namrata D. Hiremath, Mahalaxmi Bhille, P. G. Sunitha Hiremath

Abstract:

Computer systems play an integral role in all facets of the engineering profession. This calls for an understanding of the processor-level components of computer systems, their design and operation, and their impact on overall system performance. System users are always in need of faster, more powerful, yet cheaper computer systems. The focus of computer science and engineering graduates is inclined towards software, but to be an efficient programmer one needs to understand the role of the underlying hardware architecture. It is essential for students of Computer Science and Engineering to know the basic building blocks of any computing device and how digital principles can be used to build them. Hence two courses, Digital Electronics (3 credits, with an associated 1.5-credit lab) and Computer Organization (5 credits), were introduced at the sophomore level. An activity was introduced with the objective of teaching hardware concepts to Computer Science and Engineering students through a structured lab. The students were asked to design and implement a component of a computing device using the MultiSim simulation tool and to build the same component using hardware components. The experience of the activity helped the students understand practical applications of SSI and MSI components. The impact of the activity was evaluated and the performance was measured. The paper explains the achievement of ABET outcomes a, c and k.

Keywords: digital, computer organization, ABET, structured enquiry, course activity

Procedia PDF Downloads 493
12144 Birth Path and the Vitality of Caring Models in the Continuity of Midwifery

Authors: Elnaz Lalezari, Ramin Ghasemi Shaya

Abstract:

The birth path is affected by fragmentation of the care process: the pregnant woman has to interact with many professionals during the pregnancy, the childbirth, and the puerperium. However, during the last ten years there has been an increase in pregnancy care led by the midwife, who is considered the practitioner with the right competences to take care of each pregnancy and who may draw on other professionals' contributions in order to improve maternal and neonatal outcomes. The aim is to verify whether there is evidence of effectiveness supporting the caseload midwifery care model, and whether it is possible to apply this model to the birth path in Italy. A literature review was conducted using several search engines (Google, Bing) and specific databases (MEDLINE, CINAHL, Embase, ClinicalTrials.gov), together with an examination of Italian regulations, national guidelines, and the recommendations of the WHO. Results: The search string, properly adapted to the three databases, returned the following results: MEDLINE 64 articles, CINAHL 94 articles, Embase 88 articles. From this selection, 14 articles were extracted: 1 systematic review, 3 randomized controlled trials, 7 observational studies, and 3 qualitative studies. Caseload midwifery appears to be an effective and reliable organisational and caring model. It meets criteria of quality and safety and responds to women's needs not only during pregnancy but also during the post-partum phase. For these reasons, it also seems very valuable for the birth path in the Italian setting.

Keywords: midwifery, care, caseload, maternity

Procedia PDF Downloads 127
12143 Rapid Soil Classification Using Computer Vision with Electrical Resistivity and Soil Strength

Authors: Eugene Y. J. Aw, J. W. Koh, S. H. Chew, K. E. Chua, P. L. Goh, Grace H. B. Foo, M. L. Leong

Abstract:

This paper presents the evaluation of soil testing methods, such as the four-probe soil electrical resistivity method and the cone penetration test (CPT), that can complement a newly developed rapid soil classification scheme using computer vision, to improve the accuracy and productivity of on-site classification of excavated soil. In Singapore, excavated soils from the local construction industry are transported to Staging Grounds (SGs) to be reused as fill material for land reclamation. Excavated soils are mainly categorized into two groups (“Good Earth” and “Soft Clay”) based on particle size distribution (PSD) and water content (w) from soil investigation reports and on-site visual surveys, so that proper treatment and usage can be exercised. However, this process is time-consuming and labor-intensive; thus, a rapid classification method is needed at the SGs. Four-probe soil electrical resistivity and CPT were evaluated for their feasibility as additions to the computer vision system, to further develop this innovative non-destructive and instantaneous classification method. The computer vision technique comprises soil image acquisition using an industrial-grade camera; image processing and analysis via calculation of Grey Level Co-occurrence Matrix (GLCM) textural parameters; and decision-making using an Artificial Neural Network (ANN). It was found from a previous study that the ANN model coupled with the apparent electrical resistivity of the soil (ρ) can classify soils into “Good Earth” and “Soft Clay” in less than a minute, with an accuracy of 85% based on selected representative soil images. To further improve the technique, the following three measurements were targeted for addition to the computer vision scheme: ρ, measured using a set of four probes arranged in Wenner’s array; the soil strength, measured using a modified mini cone penetrometer; and w, measured using a set of time-domain reflectometry (TDR) probes. Laboratory proof-of-concept was conducted through a series of seven tests with three types of soils – “Good Earth”, “Soft Clay” and a mix of the two. Validation was performed against the PSD and w of each soil type obtained from conventional laboratory tests. The results show that ρ, w and CPT measurements can be collectively analyzed to classify soils into “Good Earth” or “Soft Clay” and are feasible as complementary methods to the computer vision system.
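For reference, the four-probe measurement reduces to a textbook formula: for a Wenner array with equal probe spacing a, the apparent resistivity is ρ = 2πa·V/I. A minimal sketch with hypothetical readings:

```python
# Minimal sketch of the Wenner-array apparent resistivity calculation.
import math

def wenner_resistivity(spacing_m, voltage_v, current_a):
    """Apparent resistivity (ohm-m) for equally spaced Wenner probes."""
    return 2 * math.pi * spacing_m * voltage_v / current_a

# Hypothetical readings: 5 cm spacing, 0.8 V across the inner probes, 10 mA.
print(wenner_resistivity(spacing_m=0.05, voltage_v=0.8, current_a=0.01))
```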

Keywords: computer vision technique, cone penetration test, electrical resistivity, rapid and non-destructive, soil classification

Procedia PDF Downloads 231
12142 Cognitive Dysfunctioning and the Fronto-Limbic Network in Bipolar Disorder Patients: A Fmri Meta-Analysis

Authors: Rahele Mesbah, Nic Van Der Wee, Manja Koenders, Erik Giltay, Albert Van Hemert, Max De Leeuw

Abstract:

Introduction: Patients with bipolar disorder (BD), characterized by depressive and manic episodes, often suffer from cognitive dysfunction. An up-to-date meta-analysis of functional magnetic resonance imaging (fMRI) studies examining cognitive function in BD is lacking. Objective: The aim of the current fMRI meta-analysis is to investigate the brain functioning of bipolar patients compared with healthy controls (HCs) within three domains: emotion processing, reward processing, and working memory. Method: Differences in brain region activation were tested in a whole-brain analysis using the activation likelihood estimation (ALE) method, with separate analyses performed for each cognitive domain. Results: A total of 50 fMRI studies were included: 20 studies used an emotion processing task (316 BD and 369 HC), 9 studies a reward processing task (215 BD and 213 HC), and 21 studies a working memory task (503 BD and 445 HC). During emotion processing, BD patients hyperactivated parts of the left amygdala and hippocampus compared to HCs, but showed hypoactivation in the inferior frontal gyrus (IFG). Regarding reward processing, BD patients showed hyperactivation in part of the orbitofrontal cortex (OFC). During working memory, BD patients showed increased activity in the prefrontal cortex (PFC) and anterior cingulate cortex (ACC). Conclusions: This meta-analysis revealed evidence of activity disturbances in several brain areas involved in the cognitive functioning of BD patients. Furthermore, most of the identified regions are part of the so-called fronto-limbic network, which is hypothesized to be affected as a result of the expression of BD candidate genes.

Keywords: cognitive functioning, fMRI analysis, bipolar disorder, fronto-limbic network

Procedia PDF Downloads 456
12141 Analyzing the Perceptions of Emotions in Aesthetic Music

Authors: Abigail Wiafe, Charles Nutrokpor, Adelaide Oduro-Asante

Abstract:

The advancement of technology is rapidly making people more receptive to music, as computer-generated music requires minimal human intervention. Although algorithms are applied to generate music, the human experience of emotion is still being explored. Thus, this study investigates the emotions humans experience when listening to computer-generated music that possesses aesthetic qualities. Forty-two subjects participated in the survey; they were selected through convenience sampling. Subjects listened to and evaluated the emotions experienced from the computer-generated music through an online questionnaire, and a Likert scale was used to rate emotional levels after the listening experience. The findings suggest that computer-generated music possesses aesthetic qualities that do not affect subjects' emotions as long as they are pleased with the music. Furthermore, although computer-generated music has unique creativity and expression, the music produced is meaningless, and the computational models developed are unable to present emotional content in music as humans do.

Keywords: aesthetic, algorithms, emotions, computer-generated music

Procedia PDF Downloads 130
12140 Spatial Audio Player Using Musical Genre Classification

Authors: Jun-Yong Lee, Hyoung-Gook Kim

Abstract:

In this paper, we propose a smart music player that combines musical genre classification with spatial audio processing. The musical genre is classified based on content analysis of the musical segment detected from the audio stream. In parallel with the classification, spatial audio quality is achieved by adding artificial reverberation in a virtual acoustic space to the input mono sound. Thereafter, the spatial sound is boosted at playback with frequency gains chosen according to the musical genre. Experiments measured the accuracy of detecting the musical segment from the audio stream and of its musical genre classification. A listening test was performed on the spatial audio processing based on the virtual acoustic space.
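The playback chain can be sketched as follows. It is a minimal illustration, with a synthetic impulse response standing in for the artificial reverberation and a hypothetical genre-to-gain table; the paper's actual gains and acoustic space are not specified here.

```python
# Minimal sketch: artificial reverb + genre-dependent band boost on mono audio.
import numpy as np
from scipy.signal import fftconvolve, butter, sosfilt

fs = 44100
mono = np.random.default_rng(0).standard_normal(fs * 2)  # placeholder 2 s input

# Artificial reverberation: exponentially decaying noise as an impulse response.
n_ir = int(0.4 * fs)
decay = np.exp(-6 * np.linspace(0, 1, n_ir))
ir = np.random.default_rng(1).standard_normal(n_ir) * decay
wet = fftconvolve(mono, ir)[: mono.size]

# Genre-based boost (hypothetical table): (cutoff Hz, linear gain) per genre.
genre_gains = {"rock": (200, 2.0), "classical": (2000, 1.3)}
cutoff, gain = genre_gains["rock"]
sos = butter(2, cutoff, btype="low", fs=fs, output="sos")
output = wet + (gain - 1.0) * sosfilt(sos, wet)  # boost the selected band
```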

Keywords: automatic equalization, genre classification, music segment detection, spatial audio processing

Procedia PDF Downloads 424
12139 Study of Large-Scale Atmospheric Convection over the Tropical Indian Ocean and Its Association with Oceanic Variables

Authors: Supriya Manikrao Ovhal

Abstract:

In India, summer monsoon rainfall occurs owing to large-scale convection associated with the continental ITCZ. It has been found that convection over the tropical ocean increases with SST from 26 to 28 °C, and that when SST is above 29 °C it sharply decreases over the warm-pool areas of the Indian Ocean and the monsoon areas of the West Pacific Ocean. The reduction in convection can be influenced by large-scale subsidence forced by nearby or remotely generated deep convection; under the influence of strong large-scale rising motion, convection does not decrease but increases monotonically with SST, even when SST exceeds 29.5 °C. Convection is related to the SST gradient, which helps generate low-level moisture convergence and upward vertical motion in the atmosphere; strong wind fields, such as the cross-equatorial low-level jet stream on the equatorward side of the warm pool, are produced by convection initiated by the SST gradient. Areas with maximum SST have a low SST gradient, which results in feeble convection. Hence, the oceanic role (other than SST) may be prominent in influencing large-scale atmospheric convection. Since the warm oceanic surface allows heat radiation to penetrate to the subsurface of the ocean, and since no studies have related the oceanic subsurface to large-scale atmospheric convection, the present study concentrates on the contribution of the oceanic subsurface to large-scale atmospheric convection by considering the SST gradient, mixed layer depth (MLD), thermocline, and barrier layer. The study examines the probable role of subsurface ocean parameters in influencing convection.

Keywords: SST, D20, OLR, wind

Procedia PDF Downloads 90
12138 Implementation of Iterative Algorithm for Earthquake Location

Authors: Hussain K. Chaiel

Abstract:

Developments in digital signal processing (DSP) and microelectronics technology reduce the complexity of iterative algorithms that need a large number of arithmetic operations. Virtex Field Programmable Gate Arrays (FPGAs) are programmable silicon foundations that offer an important solution for addressing the needs of the high-performance DSP designer. In this work, Virtex-7 FPGA technology is used to implement an iterative algorithm to estimate the earthquake location. Simulation results show that an implementation based on RAMB36E1 blocks and DSP48E1 slices of the Virtex-7 reduces the number of clock cycles required. This enables the algorithm to be used for earthquake prediction.
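The class of iterative algorithm such implementations accelerate can be sketched as a Geiger-type epicentre estimate. This is a minimal illustration under a constant-velocity assumption, with synthetic stations and arrival times; the paper's exact formulation is not published here.

```python
# Minimal sketch: Gauss-Newton (Geiger-type) epicentre and origin-time fit.
import numpy as np

v = 6.0                                    # assumed P-wave velocity, km/s
stations = np.array([[0, 0], [40, 5], [10, 35], [30, 30]], dtype=float)
true_src, true_t0 = np.array([22.0, 14.0]), 1.0
t_obs = true_t0 + np.linalg.norm(stations - true_src, axis=1) / v

x = np.array([5.0, 5.0])                   # initial epicentre guess
t0 = 0.0
for _ in range(10):                        # iterative refinement
    d = np.linalg.norm(stations - x, axis=1)
    residual = t_obs - (t0 + d / v)
    # Jacobian of predicted arrival time w.r.t. (x, y, t0).
    J = np.column_stack([(x - stations) / (v * d[:, None]), np.ones(len(d))])
    dx, dy, dt0 = np.linalg.lstsq(J, residual, rcond=None)[0]
    x += [dx, dy]
    t0 += dt0

print(x, t0)                               # converges to (22, 14) and 1.0
```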

Keywords: DSP, earthquake, FPGA, iterative algorithm

Procedia PDF Downloads 381
12137 Polysorb® - A Versatile Monomer for Improving Thermoplastic and Thermosetting Properties: Case Study of Polyesters

Authors: R. Saint-Loup, H. Amedro, N. Jacquel, S. Legrand, F. Fenouillot, J. P. Pascault, A. Rousseau

Abstract:

Isosorbide, or 1,4:3,6-dianhydrohexitol, has been developed for several years as a new biobased monomer. It is commercially available as a starch derivative, more precisely obtained from sorbitol. Isosorbide can find several applications, directly as a monomer or after chemical modification, in different polymer fields, such as thermoplastics (obtained from polycondensation or from radical polymerization of unsaturated monomers) or thermosetting resins (such as cross-linked polyurethanes or, after modification, acrylate or epoxy coatings). Concerning aliphatic or semi-aromatic polyesters, the addition of isosorbide improves thermal stability and optical properties, allowing a large range of applications as semi-crystalline or amorphous polymers. The preparation of poly(ethylene-co-isosorbide) terephthalate with different ratios of isosorbide will be particularly detailed. The structure-property relationships allow a focus on obtaining polyesters with semi-crystalline or amorphous structures. The influence of isosorbide on the polymerization, on the processing of the resulting polyester, and on the modification of the final properties will be highlighted. The properties of poly(ethylene-co-isosorbide) terephthalate will be emphasized and related to their applications. Replacing ethylene glycol with cyclohexanedimethanol drastically changes the properties of the resulting polyester, widening the property range and opening new potential applications.

Keywords: modified PET, poly(ethylene-co-isosorbide) terephthalate, specialty polyester, poly(isosorbide-co-cyclohexanediol) terephthalate

Procedia PDF Downloads 70
12136 3D Numerical Modelling of a Pulsed Pumping Process of a Large Dense Non-Aqueous Phase Liquid Pool: In situ Pilot-Scale Case Study of Hexachlorobutadiene in a Keyed Enclosure

Authors: Q. Giraud, J. Gonçalvès, B. Paris

Abstract:

Remediation of dense non-aqueous phase liquids (DNAPLs) represents a challenging issue because of their persistent behaviour in the environment. This pilot-scale study investigates, by means of in situ experiments and numerical modelling, the feasibility of pulsed pumping of a large amount of DNAPL from an alluvial aquifer. The main compound of the DNAPL is hexachlorobutadiene, an emerging organic pollutant. A low-permeability keyed enclosure was built at the location of the DNAPL source zone in order to isolate a finite undisturbed volume of soil, and a 3-month pulsed pumping process was applied inside the enclosure to extract the DNAPL exclusively. The water/DNAPL interface elevation at both the pumping and observation wells and the cumulative pumped volume of DNAPL were recorded. A total volume of about 20 m³ of pure DNAPL was recovered, since no water was extracted during the process. The three-dimensional multiphase flow simulator TMVOC was used, and a conceptual model was elaborated and generated with the pre/post-processing tool mView. The numerical model consisted of 10 layers of variable thickness and 5060 grid cells. The numerical simulations reproduce the pulsed pumping process and show an excellent match between simulated and field data for the cumulative pumped volume of DNAPL, and a reasonable agreement between modelled and observed water/DNAPL interface elevations at the two wells. This study offers a new perspective for remediation, since DNAPL pumping system optimisation may now be performed where a large amount of DNAPL is encountered.

Keywords: dense non-aqueous phase liquid (DNAPL), hexachlorobutadiene, in situ pulsed pumping, multiphase flow, numerical modelling, porous media

Procedia PDF Downloads 172
12135 Advancements in Mathematical Modeling and Optimization for Control, Signal Processing, and Energy Systems

Authors: Zahid Ullah, Atlas Khan

Abstract:

This abstract focuses on advancements in mathematical modeling and optimization techniques that enhance the efficiency, reliability, and performance of control, signal processing, and energy systems. In an era of rapidly evolving technology, mathematical modeling and optimization offer powerful tools for tackling the complex challenges these systems face. The abstract presents the latest research and developments in mathematical methodologies, encompassing control theory, system identification, signal processing algorithms, and energy optimization, and highlights their interdisciplinary nature, with applications in power systems, communication networks, industrial automation, and renewable energy. It explores key mathematical techniques, such as linear and nonlinear programming, convex optimization, stochastic modeling, and numerical algorithms, that enable the design, analysis, and optimization of complex control and signal processing systems. Furthermore, the abstract emphasizes the importance of addressing real-world challenges through innovative mathematical approaches, discussing the integration of mathematical models with data-driven approaches, machine learning, and artificial intelligence to enhance system performance, adaptability, and decision-making capabilities. It also underscores the significance of bridging the gap between theoretical advancements and practical applications, recognizing the need for practical implementation of mathematical models and optimization algorithms in real-world systems, with attention to scalability, computational efficiency, and robustness. In summary, this abstract showcases advancements in mathematical modeling and optimization for control, signal processing, and energy systems, highlights their applications across domains, and emphasizes practical implementation and integration with emerging technologies to drive innovation and improve the performance of control, signal processing, and energy systems.
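As one concrete instance of the techniques named above, the sketch below solves a toy energy-dispatch problem by linear programming with SciPy; the cost figures, demand and capacities are illustrative only.

```python
# Minimal sketch: least-cost dispatch of two generators via linear programming.
from scipy.optimize import linprog

cost = [20.0, 35.0]                  # $/MWh for generators 1 and 2
demand = 90.0                        # MWh that must be supplied

res = linprog(c=cost,
              A_eq=[[1.0, 1.0]], b_eq=[demand],   # power balance constraint
              bounds=[(0, 60), (0, 80)],          # unit capacity limits
              method="highs")
print(res.x, res.fun)                # -> [60. 30.] at cost 2250.0
```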

Keywords: mathematical modeling, optimization, control systems, signal processing, energy systems, interdisciplinary applications, system identification, numerical algorithms

Procedia PDF Downloads 109
12134 Password Cracking on Graphics Processing Unit Based Systems

Authors: N. Gopalakrishna Kini, Ranjana Paleppady, Akshata K. Naik

Abstract:

Password authentication is one of the most widely used methods of authenticating legal users of computers and defending against attackers. There are many different ways to authenticate users of a system, and many password-cracking methods have been developed. This paper proposes how password cracking can best be performed on a CPU-GPGPU based system. The main objective of this work is to show how quickly a password can be cracked, given some knowledge of computer security and password cracking, if sufficient security is not incorporated into the system.
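The core loop that GPGPU crackers parallelise is a dictionary attack: hash each candidate and compare it with the stolen digest. Below is a minimal CPU-side sketch; the wordlist and target are illustrative, and on a GPU thousands of such hashes run concurrently.

```python
# Minimal sketch of a dictionary attack against an unsalted SHA-256 hash.
import hashlib

target = hashlib.sha256(b"sunshine").hexdigest()   # the "stolen" digest
wordlist = ["password", "letmein", "sunshine", "qwerty"]

for candidate in wordlist:
    if hashlib.sha256(candidate.encode()).hexdigest() == target:
        print("cracked:", candidate)
        break
```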

Keywords: GPGPU, password cracking, secret key, user authentication

Procedia PDF Downloads 287
12133 The Positive Effects of Processing Instruction on the Acquisition of French as a Second Language: An Eye-Tracking Study

Authors: Cecile Laval, Harriet Lowe

Abstract:

Processing Instruction is a psycholinguistic pedagogical approach drawing insights from the Input Processing Model, which establishes the initial innate strategies used by second language learners to connect the form and meaning of linguistic features. With the ever-growing use of technology in Second Language Acquisition research, the present study uses eye-tracking to measure the effectiveness of Processing Instruction in the acquisition of French and its effects on learners' cognitive strategies. The experiment was designed using a Tobii Pro TX300 eye-tracker to measure participants' default strategies when processing French linguistic input and any cognitive changes after receiving Processing Instruction treatment. Participants were drawn from lower-intermediate adult learners of French at the University of Greenwich and randomly assigned to two groups. The study used a pre-test/post-test methodology. The pre-tests (one per linguistic item) were administered via the eye-tracker to both groups one week prior to instructional treatment. One group received the full Processing Instruction treatment (explicit information on the grammatical item and on the processing strategies, and structured input activities) on the primary target linguistic feature (the French past tense imperfective aspect). The second group received the Processing Instruction treatment without the explicit information on the processing strategies. Three immediate post-tests on the three grammatical structures under investigation (the French past tense imperfective aspect, the French subjunctive used for the expression of doubt, and the French causative construction with faire) were administered with the eye-tracker. The eye-tracking data showed a positive change in learners' processing of the French target features after instruction, with improvement in the interpretation of all three linguistic features under investigation. 100% of participants in both groups made a statistically significant improvement (p=0.001) in the interpretation of the primary target feature after treatment; 62.5% of participants improved on the secondary target item (the French subjunctive used for the expression of doubt) and 37.5% improved on the cumulative target feature (the French causative construction with faire). Statistically, there was no significant difference between the pre-test and post-test scores for the cumulative target feature; however, the variance approximately tripled between the pre-test and the post-test (3.9 pre-test and 9.6 post-test). This suggests that the treatment does not affect participants homogeneously and implies a role for individual differences in the transfer-of-training effect of Processing Instruction. The use of eye-tracking provides an opportunity to study the unconscious processing decisions made during moment-by-moment comprehension, and the visual data demonstrate changes in participants' processing strategies: gaze plots from pre- and post-tests display participants' fixation points shifting from content words to the verb ending. This change in processing strategies can be clearly seen in the interpretation of sentences for both the primary and secondary target features. This paper will present the research methodology, design, and results of the experimental study using eye-tracking to investigate the primary effects and transfer-of-training effects of Processing Instruction. It will then provide evidence of the cognitive benefits of Processing Instruction in Second Language Acquisition and offer suggestions for the teaching of grammar in a second language.

Keywords: eye-tracking, language teaching, processing instruction, second language acquisition

Procedia PDF Downloads 278