Search results for: raw complex data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28338

25458 Applying Theory of Self-Efficacy in Intelligent Transportation Systems by Potential Usage of Vehicle as a Sensor

Authors: Aby Nesan Raj, Sumil K. Raj, Sumesh Jayan

Abstract:

The objective of this study is to formulate a self-regulation model that enhances the usage of Intelligent Transportation Systems by applying the theory of self-efficacy. The core logic of the self-regulation model monitors driver behavior based on situations related to the various sources of self-efficacy, namely enactive mastery, vicarious experience, verbal persuasion and physiological arousal, in addition to the vehicle data. For this study, four kinds of vehicle data are considered: speed, drowsiness, diagnostic data and surround camera views. These data are fed to the self-regulation model for evaluation. The oddness score, which is the output of the self-regulation model, is passed to the Intelligent Transportation Systems, where appropriate actions are taken. These actions include warnings to the user as well as inputs to the related transportation systems. It is also observed that using the vehicle as a sensor reduces wasted or duplicated resources. Altogether, this approach enhances the intelligence of transportation systems, especially in safety, productivity and environmental performance.

Keywords: emergency management, intelligent transportation system, self-efficacy, traffic management

Procedia PDF Downloads 238
25457 Airborne SAR Data Analysis for Impact of Doppler Centroid on Image Quality and Registration Accuracy

Authors: Chhabi Nigam, S. Ramakrishnan

Abstract:

This paper presents an analysis of airborne Synthetic Aperture Radar (SAR) data to study the impact of the Doppler centroid on image quality and geocoding accuracy from the perspective of Stripmap mode data acquisition. Although in Stripmap mode the radar beam points broadside at 90 degrees (side looking), a shift in the Doppler centroid is inevitable due to platform motion. Inaccurate estimation of the Doppler centroid leads to poor image quality and image misregistration. The effect of the Doppler centroid is analyzed using multiple data sets collected from an airborne platform. Occurrences of ghost (ambiguous) targets and their power levels have been analyzed, which affects the appropriate choice of PRF. The effect of aircraft attitude (roll, pitch and yaw) on the Doppler centroid is also analyzed with the collected data sets. The various stages of the Range Doppler Algorithm (RDA) used for image formation in Stripmap mode, namely range compression, Doppler centroid estimation, azimuth compression and range cell migration correction, are analyzed to find the performance limits and the dependence of the imaging geometry on the final image. The ability of Doppler centroid estimation to enhance the imaging accuracy for registration is also illustrated. The paper also addresses the processing of low-squint SAR data and the challenges and performance limits imposed by the imaging geometry and the platform dynamics on the final image quality metrics. Finally, the effect on various terrain types, including land, water and bright scatterers, is presented.
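
The abstract does not name the estimator used on the raw complex data; a common choice is the average cross-correlation coefficient (ACCC, Madsen) method, sketched below with a simulated azimuth signal whose centroid is known (the PRF and signal are illustrative only).

```python
import numpy as np

def estimate_doppler_centroid(raw, prf):
    """Estimate the Doppler centroid of raw complex SAR data (range x azimuth)
    with the average cross-correlation coefficient (ACCC) method.
    Returns one estimate per range line; 'raw' is a 2-D complex array."""
    # cross-correlate each azimuth sample with its successor
    accc = np.sum(np.conj(raw[:, :-1]) * raw[:, 1:], axis=1)
    # the phase of the ACCC is 2*pi*f_dc / PRF (sign depends on convention)
    return prf * np.angle(accc) / (2.0 * np.pi)

# illustrative use on a simulated azimuth signal with a known 120 Hz centroid
prf, n_az = 1500.0, 4096
t = np.arange(n_az) / prf
signal = np.exp(2j * np.pi * 120.0 * t) \
         + 0.1 * (np.random.randn(n_az) + 1j * np.random.randn(n_az))
print(estimate_doppler_centroid(signal[np.newaxis, :], prf))  # ~[120.]
```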

Keywords: ambiguous target, Doppler Centroid, image registration, Airborne SAR

Procedia PDF Downloads 212
25456 The Relationship between Class Attendance and Performance of Industrial Engineering Students Enrolled for a Statistics Subject at the University of Technology

Authors: Tshaudi Motsima

Abstract:

Class attendance is key at all levels of education. At tertiary level, many students develop a tendency not to attend all classes without being aware of the repercussions. It is important for students to attend all classes, as they receive first-hand information and benefit more. A student who attends classes is likely to perform better academically than one who does not. The aim of this paper is to assess the relationship between class attendance and the academic performance of industrial engineering students. The attendance data for this study were collected through the class attendance register, and the remaining data were accessed from the Integrated Tertiary Software and the Higher Education Data Analyzer Portal. Data analysis was conducted on a sample of 93 students. The results revealed that students with medium predicate scores (OR = 3.8; p = 0.027) and students with low predicate scores (OR = 21.4; p < 0.001) were significantly more likely to attend less than 80% of the classes than students with high predicate scores. Students with examination performance below 50% were more likely to attend less than 80% of classes than students with examination performance of 50% and above, but the difference was not statistically significant (OR = 1.3; p = 0.750).
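
The reported odds ratios correspond to a logistic regression of the attendance category on the predicate-score category; a minimal sketch of such a model is shown below, with hypothetical counts for a 93-student sample (the variable names and numbers are invented for illustration).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical records: predicate-score category and whether the student
# attended less than 80% of classes (counts are illustrative only)
df = pd.DataFrame({
    "predicate": ["high"] * 30 + ["medium"] * 30 + ["low"] * 33,
    "low_attendance": [0] * 26 + [1] * 4 + [0] * 15 + [1] * 15 + [0] * 6 + [1] * 27,
})

# logistic regression with 'high' predicate scores as the reference category
model = smf.logit("low_attendance ~ C(predicate, Treatment(reference='high'))",
                  data=df).fit(disp=False)
print(np.exp(model.params))   # odds ratios for medium and low vs high
print(model.pvalues)
```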

Keywords: class attendance, examination performance, final outcome, logistic regression

Procedia PDF Downloads 130
25455 Multimodal Optimization of Density-Based Clustering Using Collective Animal Behavior Algorithm

Authors: Kristian Bautista, Ruben A. Idoy

Abstract:

A bio-inspired metaheuristic algorithm based on the theory of collective animal behavior (CAB) was integrated into density-based clustering modeled as a multimodal optimization problem. The algorithm was tested on synthetic, Iris, Glass, Pima and Thyroid data sets in order to measure its effectiveness relative to the CDE-based clustering algorithm. Preliminary testing showed that one of the parameter settings was ineffective for clustering when applied to the algorithm, prompting further investigation. It was revealed that fine-tuning the distance δ3, which determines the extent to which a given data point will be clustered, helped improve the quality of the cluster output. Even though the modification of δ3 significantly improved the solution quality and cluster output of the algorithm, results suggest that there is no significant difference between the population means of the solutions obtained using the original and modified parameter settings for any data set. This implies that using either the original or the modified parameter setting has no effect on obtaining the best global and local animal positions. Results also suggest that the CDE-based clustering algorithm outperforms the CAB-density clustering algorithm on all data sets. Nevertheless, CAB-density clustering remains a good clustering algorithm, because it correctly identified the number of classes of some data sets more frequently over thirty trial runs with a much smaller standard deviation, showing potential for clustering high-dimensional data sets. Further investigation of the post-processing stage of the algorithm is therefore recommended.
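
The comparison of population means over the thirty trial runs suggests a two-sample test on the best solutions per run; a minimal sketch of such a check, with made-up fitness values for the original and modified δ3 settings, is given below.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# hypothetical best fitness values from thirty independent runs with the
# original and the modified delta_3 setting (values are illustrative only)
original = rng.normal(loc=0.82, scale=0.05, size=30)
modified = rng.normal(loc=0.83, scale=0.02, size=30)

# Welch's t-test on the population means of the two parameter settings
t_stat, p_value = stats.ttest_ind(original, modified, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
print("std (original) =", original.std(ddof=1),
      "std (modified) =", modified.std(ddof=1))
```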

Keywords: clustering, metaheuristics, collective animal behavior algorithm, density-based clustering, multimodal optimization

Procedia PDF Downloads 225
25454 Multiphase Coexistence for Aqueous System with Hydrophilic Agent

Authors: G. B. Hong

Abstract:

Liquid-Liquid Equilibrium (LLE) data are measured for the ternary mixtures of water + 1-butanol + butyl acetate and quaternary mixtures of water + 1-butanol + butyl acetate + glycerol at atmospheric pressure at 313.15 K. In addition, isothermal Vapor–Liquid–Liquid Equilibrium (VLLE) data are determined experimentally at 333.15 K. The region of heterogeneity is found to increase as the hydrophilic agent (glycerol) is introduced into the aqueous mixtures. The experimental data are correlated with the NRTL model. The predicted results from the solution model with the model parameters determined from the constituent binaries are also compared with the experimental values.
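
The correlation relies on the NRTL activity-coefficient model; a minimal sketch of its binary form is given below (the interaction parameters are invented for illustration and are not the values fitted in this work).

```python
import numpy as np

def nrtl_binary(x1, tau12, tau21, alpha=0.3):
    """Activity coefficients of a binary mixture from the NRTL model.
    tau12, tau21 are dimensionless interaction parameters; alpha is the
    non-randomness parameter (values used below are illustrative only)."""
    x2 = 1.0 - x1
    G12, G21 = np.exp(-alpha * tau12), np.exp(-alpha * tau21)
    ln_g1 = x2**2 * (tau21 * (G21 / (x1 + x2 * G21))**2
                     + tau12 * G12 / (x2 + x1 * G12)**2)
    ln_g2 = x1**2 * (tau12 * (G12 / (x2 + x1 * G12))**2
                     + tau21 * G21 / (x1 + x2 * G21)**2)
    return np.exp(ln_g1), np.exp(ln_g2)

# activity coefficients of water (1) + 1-butanol (2) at one composition,
# with made-up parameters purely to illustrate the form of the model
print(nrtl_binary(x1=0.6, tau12=1.8, tau21=0.9))
```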

Keywords: LLE, VLLE, hydrophilic agent, NRTL

Procedia PDF Downloads 239
25453 ISMARA: Completely Automated Inference of Gene Regulatory Networks from High-Throughput Data

Authors: Piotr J. Balwierz, Mikhail Pachkov, Phil Arnold, Andreas J. Gruber, Mihaela Zavolan, Erik van Nimwegen

Abstract:

Understanding the key players and interactions in the regulatory networks that control gene expression and chromatin state across different cell types and tissues in metazoans remains one of the central challenges in systems biology. Our laboratory has pioneered a number of methods for automatically inferring core gene regulatory networks directly from high-throughput data by modeling gene expression (RNA-seq) and chromatin state (ChIP-seq) measurements in terms of genome-wide computational predictions of regulatory sites for hundreds of transcription factors and micro-RNAs. These methods have now been completely automated in an integrated webserver called ISMARA that allows researchers to analyze their own data by simply uploading RNA-seq or ChIP-seq data sets and provides results in an integrated web interface as well as in downloadable flat form. For any data set, ISMARA infers the key regulators in the system, their activities across the input samples, the genes and pathways they target, and the core interactions between the regulators. We believe that by empowering experimental researchers to apply cutting-edge computational systems biology tools to their data in a completely automated manner, ISMARA can play an important role in developing our understanding of regulatory networks across metazoans.
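
ISMARA's core inference step models (mean-centred) expression across samples as an approximately linear combination of predicted binding-site counts and unknown motif activities, estimated with a ridge-like regularized regression; the sketch below illustrates that idea on synthetic site counts (ISMARA's actual priors, normalizations and error models differ).

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_promoters, n_motifs, n_samples = 500, 20, 6

# N[p, m]: predicted number of binding sites for motif m in promoter p
N = rng.poisson(2.0, size=(n_promoters, n_motifs)).astype(float)
true_activity = rng.normal(0, 1, size=(n_motifs, n_samples))
# E[p, s]: (mean-centred) expression of promoter p in sample s
E = N @ true_activity + rng.normal(0, 0.5, size=(n_promoters, n_samples))

# ridge regression recovers one activity profile per motif across samples
fit = Ridge(alpha=1.0, fit_intercept=True).fit(N, E)
activities = fit.coef_.T            # shape (n_motifs, n_samples)
print(np.corrcoef(activities.ravel(), true_activity.ravel())[0, 1])
```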

Keywords: gene expression analysis, high-throughput sequencing analysis, transcription factor activity, transcription regulation

Procedia PDF Downloads 60
25452 The Power of the Proper Orthogonal Decomposition Method

Authors: Charles Lee

Abstract:

The Proper Orthogonal Decomposition (POD) technique has been used as a model reduction tool for many applications in engineering and science. In principle, one begins with an ensemble of data, called snapshots, collected from an experiment or from laboratory results. The beauty of the POD technique is that, when it is applied, the entire data set can be represented by the smallest number of orthogonal basis elements. It is this capability that allows us to reduce the complexity and dimension of many physical applications. Mathematical formulations and numerical schemes for the POD method are discussed, along with applications in NASA’s Deep Space Large Antenna Arrays, satellite image reconstruction, cancer detection with DNA microarray data, maximizing stock return, and medical imaging.
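
In practice the POD basis is typically computed from the singular value decomposition of the snapshot matrix; a minimal sketch on a synthetic low-rank ensemble follows (the energy criterion and data are illustrative).

```python
import numpy as np

def pod_basis(snapshots, energy=0.99):
    """Compute a POD basis from a snapshot matrix whose columns are states.
    Keeps the smallest number of modes capturing the requested energy fraction."""
    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    cumulative = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cumulative, energy)) + 1
    return U[:, :r], mean, r

# 200 snapshots of a 1000-dimensional state that really lives on ~3 modes
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3)) @ rng.normal(size=(3, 200))
basis, mean, r = pod_basis(X)
reduced = basis.T @ (X - mean)            # r x 200 reduced coordinates
reconstruction = basis @ reduced + mean
print(r, np.linalg.norm(X - reconstruction) / np.linalg.norm(X))
```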

Keywords: reduced-order methods, principal component analysis, cancer detection, image reconstruction, stock portfolios

Procedia PDF Downloads 77
25451 A Reflection of the Contemporary Life of Urban People Through Mixed Media Art

Authors: Van Huong Mai, Kanokwan Nithiratphat, Adool Booncham

Abstract:

The Movement of Contemporary Life had two purposes: to study the movement and development of modern life, and to create visual artworks, namely paintings expressed through the form of apartment buildings, using mixed media (digital printing and acrylic painting on canvas) to convey the rapid pace of modern life and evoke diverse movements in the viewer’s feelings. The creative process drew on collected field data, documentary data, and influences from earlier creative work. The data were analyzed with respect to theme, form, technique, and process in order to satisfy the concept and the special character of the pieces.

Keywords: movement, contemporary life, visual art, acrylic painting, digital art, urban space

Procedia PDF Downloads 93
25450 Mining Educational Data to Support Students’ Major Selection

Authors: Kunyanuth Kularbphettong, Cholticha Tongsiri

Abstract:

This paper aims to create a model to help students choose an emphasized track when majoring in computer science at Suan Sunandha Rajabhat University. The objective of this research is to develop a suggestion system that uses data mining techniques to analyze knowledge and construct decision rules. Such relationships can be used to demonstrate the reasonableness of a student choosing a track as well as to support his or her decision, and the system is verified by experts in the field. The sample consists of computer science students who used the system and completed a satisfaction questionnaire. The system results are found to be satisfactory by both the experts and the students.
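
The abstract does not specify which data mining technique produces the decision rules; decision-tree induction is one common choice, and the hedged sketch below shows how human-readable rules can be extracted for expert review (the course features, grades and track labels are invented).

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# hypothetical student records: grades in core courses and the chosen track
# (feature names and tracks are illustrative, not the actual questionnaire items)
data = pd.DataFrame({
    "programming":  [85, 60, 90, 55, 78, 92, 48, 70],
    "mathematics":  [70, 80, 65, 85, 60, 75, 90, 55],
    "networking":   [60, 85, 55, 90, 70, 50, 88, 65],
    "track":        ["software", "networks", "software", "networks",
                     "software", "software", "networks", "networks"],
})

tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(data[["programming", "mathematics", "networking"]], data["track"])

# human-readable decision rules that can be reviewed by domain experts
print(export_text(tree, feature_names=["programming", "mathematics", "networking"]))
```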

Keywords: data mining technique, the decision support system, knowledge and decision rules, education

Procedia PDF Downloads 419
25449 Fire Risk Information Harmonization for Transboundary Fire Events between Portugal and Spain

Authors: Domingos Viegas, Miguel Almeida, Carmen Rocha, Ilda Novo, Yolanda Luna

Abstract:

Forest fires along the more than 1200 km of the Spanish-Portuguese border are increasingly frequent, currently reaching around 2000 fire events per year. Some of these events develop into large international wildfires requiring concerted operations based on information shared between the two countries. The fire event of Valencia de Alcantara (2003), which caused several fatalities and more than 13000 ha burnt, is a reference example of these international events. Currently, Portugal and Spain have a specific cross-border cooperation protocol on wildfire response for a strip of about 30 km (15 km on each side). Public authorities recognize the success of this collaboration, but it is also accepted that the cooperation should include more functionalities, such as the development of a common risk information system for transboundary fire events. Since the Portuguese and Spanish authorities use different approaches to determine the inputs of the fire risk indexes and different methodologies to assess fire risk, joint firefighting operations are sometimes jeopardized because the information is not harmonized and the civil protection agents of the two countries do not share a common understanding of the situation. Thus, a methodology aiming at the harmonization of fire risk calculation and perception by the Portuguese and Spanish civil protection authorities is presented here, together with the final results. The fire risk index used in this work is the Canadian Fire Weather Index (FWI), which is based on meteorological data. The FWI is limited in its application because it does not take into account other important factors with a strong effect on fire occurrence and development. The combination of these factors is very complex since, besides meteorology, it involves several parameters from different fields, namely sociology, topography, vegetation and soil cover. Therefore, the meaning of FWI values differs from region to region, according to the specific characteristics of each region. In this work, a methodology for FWI calibration based on the number of fire occurrences and on the burnt area in the transboundary regions of Portugal and Spain is proposed, in order to assess fire risk based on calibrated FWI values. As mentioned above, cooperative firefighting operations require a common perception of the shared information. Therefore, a common classification of fire risk for the fire events occurring in the transboundary strip is proposed, with the objective of harmonizing this type of information. This work is part of the ECHO project SpitFire - Spanish-Portuguese Meteorological Information System for Transboundary Operations in Forest Fires, which aims at the development of a web platform for sharing information and decision-support tools to be used in international fire events involving Portugal and Spain.
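
One simple way to realize such a calibration is to derive region-specific FWI class thresholds from the area-weighted distribution of FWI values on historical fire days, as in the sketch below (the data are synthetic and the project's actual calibration procedure may differ).

```python
import numpy as np

def calibrate_fwi_classes(fwi_on_fire_days, burnt_area, quantiles=(0.25, 0.5, 0.75, 0.9)):
    """Derive region-specific FWI thresholds for risk classes from historical
    fire days, weighting each day by its burnt area (a sketch of the
    calibration idea, not the project's exact procedure)."""
    order = np.argsort(fwi_on_fire_days)
    fwi, w = np.asarray(fwi_on_fire_days)[order], np.asarray(burnt_area)[order]
    cdf = np.cumsum(w) / np.sum(w)          # area-weighted empirical CDF of FWI
    return [float(np.interp(q, cdf, fwi)) for q in quantiles]

def classify(fwi_value, thresholds,
             labels=("low", "moderate", "high", "very high", "extreme")):
    return labels[int(np.searchsorted(thresholds, fwi_value))]

# illustrative calibration for one side of the border strip
rng = np.random.default_rng(0)
fwi_days = rng.gamma(shape=4.0, scale=8.0, size=300)   # FWI on historical fire days
area = rng.lognormal(mean=2.0, sigma=1.0, size=300)    # burnt area per fire day (ha)
thresholds = calibrate_fwi_classes(fwi_days, area)
print(thresholds, classify(45.0, thresholds))
```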

Keywords: data harmonization, FWI, international collaboration, transboundary wildfires

Procedia PDF Downloads 247
25448 SPBAC: A Semantic Policy-Based Access Control for Database Query

Authors: Aaron Zhang, Alimire Kahaer, Gerald Weber, Nalin Arachchilage

Abstract:

Access control is an essential safeguard for the security of enterprise data: it controls users’ access to information resources and ensures the confidentiality and integrity of those resources [1]. Research shows that the more common types of access control have shortcomings [2]. To improve existing access control, we have studied current technologies in the field of data security, investigated previous data access control policies and their problems, identified existing deficiencies, and proposed a new extension structure called SPBAC. The SPBAC extension proposed in this paper combines Policy-Based Access Control (PBAC) with semantics to provide logically connected, real-time data access by establishing associations between enterprise data through semantics. Our design combines policies with linked data through semantics to create a "semantic link", so that access control is no longer per-database and users in each role are granted access based on an instance policy. The SPBAC implementation is improved by constructing policies and defined attributes through the XACML specification, designed as an extension of the original XACML model. Alongside the design solutions presented here, we plan to study the feasibility and implementation of the related work at a later stage.

Keywords: access control, semantic policy-based access control, semantic link, access control model, instance policy, XACML

Procedia PDF Downloads 81
25447 A Regression Analysis Study of the Applicability of Side Scan Sonar based Safety Inspection of Underwater Structures

Authors: Chul Park, Youngseok Kim, Sangsik Choi

Abstract:

This study developed an electric jig for underwater structure inspection in order to solve the problems of applying side scan sonar to underwater inspection, and analyzed correlations in empirical data in order to enhance sonar data resolution. For the application of tow-type sonar to underwater structure inspection, an electric jig was developed; in practice, it was difficult to inspect a cross-section with tow-type equipment at the time of inspection. With the development of the electric jig for underwater structure inspection, it was possible to shorten the inspection time by more than 20% compared to conventional tow-type side scan sonar, and to inspect the desired cross-section through accurate angle control. An indoor test conducted to enhance sonar data resolution showed that water depth, the distance from the underwater structure, and the imaging angle influenced resolution and data quality. Based on the data accumulated through field experience, multiple regression analysis was conducted on the correlations between these three variables. As a result, a relational equation for sonar operation as a function of water depth was derived.
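
The multiple regression step can be sketched as an ordinary least squares fit of resolution on water depth, distance and imaging angle; the records below are invented for illustration and are not the indoor-test measurements.

```python
import numpy as np
import statsmodels.api as sm

# hypothetical indoor-test records: water depth (m), distance to the structure (m),
# imaging angle (deg) and a resolution score (all values are illustrative only)
depth    = np.array([3, 3, 5, 5, 7, 7, 9, 9, 11, 11], dtype=float)
distance = np.array([2, 4, 2, 4, 2, 4, 2, 4, 2, 4], dtype=float)
angle    = np.array([30, 45, 30, 45, 30, 45, 30, 45, 30, 45], dtype=float)
resolution = 10.0 - 0.4 * depth - 0.8 * distance - 0.05 * angle \
             + np.random.default_rng(0).normal(0, 0.2, size=10)

X = sm.add_constant(np.column_stack([depth, distance, angle]))
fit = sm.OLS(resolution, X).fit()
# relational equation: resolution ~ b0 + b1*depth + b2*distance + b3*angle
print(fit.params)
print(fit.rsquared)
```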

Keywords: underwater structure, SONAR, safety inspection, resolution

Procedia PDF Downloads 260
25446 CMMI Key Process Areas and FDD Practices

Authors: Rituraj Deka, Nomi Baruah

Abstract:

The development of information technology during the past few years has resulted in the design of more and more complex software. The outsourcing of software development places higher requirements on the management of software development projects. Software enterprises follow various paths in their pursuit of excellence, applying various principles, methods and techniques along the way. Recent research shows that CMMI and Agile methodologies can benefit from being used together within organizations, with the potential to dramatically improve business performance. This paper describes a mapping between CMMI key process areas (KPAs) and Feature-Driven Development (FDD) practices from a communication perspective, so as to increase the understanding of how improvements can be made in the software development process.

Keywords: Agile, CMMI, FDD, KPAs

Procedia PDF Downloads 453
25445 Enhanced Imperialist Competitive Algorithm for the Cell Formation Problem Using Sequence Data

Authors: S. H. Borghei, E. Teymourian, M. Mobin, G. M. Komaki, S. Sheikh

Abstract:

The imperialist competitive algorithm (ICA) is a recent metaheuristic, inspired by socio-political evolution, for solving NP-hard problems. ICA is a population-based algorithm that has achieved strong performance in comparison with other metaheuristics. This study develops an enhanced ICA approach to solve the cell formation problem (CFP) using sequence data. In addition to the conventional ICA, an enhanced version, EICA, applies local search techniques to add intensification capability and to embed the features of exploration and intensification more effectively. Suitable performance measures are used to compare the proposed algorithms with other powerful solution approaches from the literature. To check the proficiency of the algorithms, forty test problems are presented: five benchmark problems have sequence data, and the others are based on 0-1 matrices modified into sequence-based problems. Computational results demonstrate the efficiency of EICA in solving CFP instances.

Keywords: cell formation problem, group technology, imperialist competitive algorithm, sequence data

Procedia PDF Downloads 450
25444 Establishment of Bit Selective Mode Storage Covert Channel in VANETs

Authors: Amarpreet Singh, Kimi Manchanda

Abstract:

To provide security in the VANET (Vehicular Ad hoc Network) scenario, a covert storage channel is implemented through the data transmitted between the sender and the receiver. Covert channels are logical links used for communication while hiding secure data from intruders. This paper addresses the establishment of bit-selective-mode covert storage channels in VANETs. In this scenario, data are transmitted in two modes, the normal mode and the covert mode. During communication between vehicles, the bits can be controlled through the optional bits of the IPv6 header format. The implementation is carried out with the help of a network simulator.

Keywords: covert mode, normal mode, VANET, OBU, on-board unit

Procedia PDF Downloads 360
25443 Enhancing Temporal Extrapolation of Wind Speed Using a Hybrid Technique: A Case Study in West Coast of Denmark

Authors: B. Elshafei, X. Mao

Abstract:

The demand for renewable energy is increasing significantly, and major investments are being directed to the wind power generation industry as a leading source of clean energy. The wind energy sector is strongly dependent on, and driven by, the prediction of wind speed, which by the nature of wind is highly stochastic. This study employs deep multi-fidelity Gaussian process regression to predict wind speeds over medium-term time horizons. Data from the RUNE experiment on the west coast of Denmark, covering the period between December 2015 and March 2016, were provided by the Technical University of Denmark and represent the wind speed across the study area. The study investigates the effect of pre-processing the data by denoising the signal using the empirical wavelet transform (EWT) and of including the vector components of wind speed to increase the number of input data layers for data fusion using deep multi-fidelity Gaussian process regression (GPR). The outcomes were compared using the root mean square error (RMSE), and the results demonstrated a significant increase in prediction accuracy: strategies that use the vector components of wind speed as additional predictors yield more accurate predictions than strategies that ignore them, reflecting the importance of including all sub-data and of pre-processing the signals for wind speed forecasting models.
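
A stripped-down, single-fidelity version of the comparison (time-only predictors versus time plus the u and v components, scored by RMSE) can be sketched as follows; the EWT denoising and the multi-fidelity structure of the study are omitted and the data are synthetic.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 400)
u = 5 + 2 * np.sin(0.8 * t) + rng.normal(0, 0.3, t.size)    # east-west component
v = 1 + 1.5 * np.cos(0.5 * t) + rng.normal(0, 0.3, t.size)  # north-south component
speed = np.hypot(u, v)

# compare: time only vs time + lagged vector components as predictors
X_scalar = t[:-1, None]
X_vector = np.column_stack([t[:-1], u[:-1], v[:-1]])
y = speed[1:]
split = 300

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
for name, X in [("time only", X_scalar), ("time + u,v components", X_vector)]:
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X[:split], y[:split])
    rmse = np.sqrt(mean_squared_error(y[split:], gpr.predict(X[split:])))
    print(f"{name}: RMSE = {rmse:.3f}")
```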

Keywords: data fusion, Gaussian process regression, signal denoise, temporal extrapolation

Procedia PDF Downloads 132
25442 The Effect of Hydrogen on the Magnetic Properties of ZnO: A Density Functional Tight Binding Study

Authors: M. A. Lahmer, K. Guergouri

Abstract:

The ferromagnetic properties of carbon-doped ZnO (ZnO:CO) and hydrogenated carbon-doped ZnO (ZnO:CO+H) are investigated using the density functional tight binding (DFTB) method. Our results reveal that CO-doped ZnO is a ferromagnetic material with a magnetic moment of 1.3 μB per carbon atom. The presence of hydrogen in the material, in the form of a CO-H complex, decreases the total magnetism of the material without suppressing ferromagnetism. However, in this case the system quickly becomes antiferromagnetic when the C-C separation distance is increased.

Keywords: ZnO, carbon, hydrogen, ferromagnetism, density functional tight binding

Procedia PDF Downloads 280
25441 Deadline Missing Prediction for Mobile Robots through the Use of Historical Data

Authors: Edwaldo R. B. Monteiro, Patricia D. M. Plentz, Edson R. De Pieri

Abstract:

Mobile robotics is gaining an increasingly important role in modern society. Several tasks that are potentially dangerous or laborious for humans are assigned to mobile robots, which are increasingly capable. Many of these tasks need to be performed within a specified period, i.e., they must meet a deadline. Missing the deadline can result in financial and/or material losses. Mechanisms for predicting deadline misses are fundamental because corrective actions can then be taken to avoid or minimize the resulting losses. In this work we propose a simple but reliable deadline-miss prediction mechanism for mobile robots based on historical data, and we use the Pioneer 3-DX, one of the most popular robots in academia, for the experiments and simulations.
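
The paper's predictor is not detailed in the abstract; the underlying idea can be sketched as estimating the remaining travel time of each route leg from historical averages and flagging a miss when the estimated completion time exceeds the deadline (the waypoint structure and numbers below are illustrative).

```python
from statistics import mean

def predict_deadline_miss(remaining_legs, elapsed_time, deadline, history):
    """Predict whether a mobile robot will miss its deadline.
    'history' maps a waypoint pair to travel times observed in past missions;
    the structure and names here are an illustrative sketch only."""
    estimated_remaining = sum(mean(history[leg]) for leg in remaining_legs)
    estimated_total = elapsed_time + estimated_remaining
    return estimated_total > deadline, estimated_total

# historical travel times (seconds) for each leg of the route
history = {("A", "B"): [42.0, 45.5, 44.1], ("B", "C"): [60.2, 58.7, 61.0]}
miss, eta = predict_deadline_miss([("A", "B"), ("B", "C")],
                                  elapsed_time=30.0, deadline=120.0, history=history)
print(miss, round(eta, 1))   # True if the estimated completion time exceeds the deadline
```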

Keywords: deadline missing, historical data, mobile robots, prediction mechanism

Procedia PDF Downloads 397
25440 The Intention to Use Telecare in People of Fall Experience: Application of Fuzzy Neural Network

Authors: Jui-Chen Huang, Shou-Hsiung Cheng

Abstract:

This study examined the willingness to use telecare among people in Taiwan who had experienced a fall in the previous three months. The study adopted convenience sampling and a structured questionnaire to collect data, based on the definitions and constructs of the Health Belief Model (HBM). The HBM comprises seven constructs: perceived benefits (PB), perceived disease threat (PDT), perceived barriers to taking action (PBTA), external cues to action (ECUE), internal cues to action (ICUE), attitude toward using (ATT), and behavioral intention to use (BI). This study adopted a Fuzzy Neural Network (FNN) as an effective modeling method. The model captures the dependence of ATT on PB, PDT, PBTA, ECUE, and ICUE: the training and testing RMSE (root mean square error) are 0.028 and 0.166 for the FNN, respectively, compared with 0.828 and 0.578 for the regression model. For the relationship between ATT and BI, the training and testing RMSE of the FNN are 0.050 and 0.109, respectively, compared with 0.529 and 0.571 for the regression model. The results show that the FNN method outperforms regression analysis and is an effective and viable approach.

Keywords: fall, fuzzy neural network, health belief model, telecare, willingness

Procedia PDF Downloads 192
25439 On the Utility of Bidirectional Transformers in Gene Expression-Based Classification

Authors: Babak Forouraghi

Abstract:

A genetic circuit is a collection of interacting genes and proteins that enable individual cells to implement and perform vital biological functions such as cell division, growth, death, and signaling. In cell engineering, synthetic gene circuits are engineered networks of genes specifically designed to implement functionalities that are not evolved by nature. These engineered networks enable scientists to tackle complex problems such as engineering cells to produce therapeutics within the patient's body, altering T cells to target cancer-related antigens for treatment, improving antibody production using engineered cells, tissue engineering, and the production of genetically modified plants and livestock. Constructing computational models to realize genetic circuits is an especially challenging task, since it requires discovering the flow of genetic information in complex biological systems. Building synthetic biological models is also a time-consuming process with relatively low prediction accuracy for highly complex genetic circuits. The primary goal of this study was to investigate the utility of a pre-trained bidirectional encoder transformer that can accurately predict gene expression in genetic circuit designs. The main reason for using transformers is their innate ability (the attention mechanism) to take into account the semantic context present in long DNA chains, which depends heavily on the spatial arrangement of their constituent genes. Previous approaches to gene circuit design, such as CNN and RNN architectures, are unable to capture semantic dependencies in long contexts, as required in most real-world applications of synthetic biology. For instance, RNN models (LSTM, GRU), although able to learn long-term dependencies, suffer greatly from vanishing gradients and low efficiency when they sequentially process past states and compress contextual information into a bottleneck over long input sequences. In other words, these architectures are not equipped with the attention mechanisms necessary to follow a long chain of genes spanning thousands of tokens. To address the above limitations, a transformer model was built in this work as a variation of the existing DNA Bidirectional Encoder Representations from Transformers (DNABERT) model. It is shown that the proposed transformer is capable of capturing contextual information from long input sequences with an attention mechanism. In previous work on genetic circuit design, traditional approaches to classification and regression, such as Random Forest, Support Vector Machine, and Artificial Neural Networks, were able to achieve reasonably high R² accuracy levels of 0.95 to 0.97. However, the transformer model utilized in this work, with its attention-based mechanism, was able to achieve a perfect accuracy level of 100%. Further, it is demonstrated that the efficiency of the transformer-based gene expression classifier is not dependent on the presence of large amounts of training examples, which may be difficult to compile in many real-world gene circuit designs.
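
The attention mechanism referred to above can be made concrete with a single scaled dot-product self-attention head over a toy sequence of token embeddings, as in the NumPy sketch below; a full DNABERT-style encoder stacks many such heads with feed-forward layers, and the dimensions and weights here are arbitrary.

```python
import numpy as np

def scaled_dot_product_attention(X, Wq, Wk, Wv):
    """One self-attention head over a sequence of gene/DNA-token embeddings X
    (sequence_length x d_model). Every position attends to every other position,
    which is what lets a transformer relate distant genes in a long circuit."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])               # pairwise relevance
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)        # softmax over positions
    return weights @ V, weights

rng = np.random.default_rng(0)
seq_len, d_model, d_head = 12, 16, 8                     # tiny illustrative sizes
X = rng.normal(size=(seq_len, d_model))                  # embedded sequence tokens
Wq, Wk, Wv = (rng.normal(size=(d_model, d_head)) for _ in range(3))
context, attn = scaled_dot_product_attention(X, Wq, Wk, Wv)
print(context.shape, attn.shape)                         # (12, 8) and (12, 12)
```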

Keywords: machine learning, classification and regression, gene circuit design, bidirectional transformers

Procedia PDF Downloads 56
25438 Second Generation Mozambican Migrant Youth’s Identity and Sense of Belonging: The Case of Hluvukani Village in Bushbuckridge, Mpumalanga

Authors: Betty Chiyangwa

Abstract:

This is a work-in-progress project focused on exploring the complexities surrounding second-generation Mozambican migrant youth's experiences of constructing their identity and developing a sense of belonging in post-apartheid Bushbuckridge, South Africa. Established in 1884, Bushbuckridge is one of the earliest districts to accommodate Mozambicans who migrated to South Africa in the 1970s. Bushbuckridge as a destination for Mozambican migrants is crucial to their search for social freedom and space to “belong to.” The action of deliberately seeking freedom is known as an act of agency. Four major objectives govern the paper. The first objective observes how second-generation Mozambican migrant youth living in South Africa negotiate and construct their own identities. Secondly, it explores second-generation Mozambican migrant youth narratives regarding their sense of belonging in South Africa. Thirdly, the study intends to understand how social processes of identity and belonging influence second-generation Mozambican migrant youth experiences and future aspirations in South Africa. The last objective examines how Sen's Capability approach is relevant to understanding second-generation Mozambican migrant youth identity and belonging in South Africa. This is a single case study informed by data from semi-structured interviews and narratives with youth between the ages of 18 and 34 who were born and raised in South Africa to at least one former Mozambican refugee parent living in Bushbuckridge. Drawing from Crenshaw's Intersectionality and Sen's Capability approaches, this study makes a significant contribution to the existing body of knowledge on South-South migration by demonstrating how both approaches can be operationalized to understand the complex experiences and capabilities of a disadvantaged group. The subject of second-generation migrants is often under-researched in South African migration studies; thus, their perspectives have been marginalized in social science research.

Keywords: second-generation, Mozambican, migrant, youth, bushbuckridge

Procedia PDF Downloads 212
25437 Inverse Mapping of Weld Bead Geometry in Shielded Metal Arc-Welding: Genetic Algorithm Approach

Authors: D. S. Nagesh, G. L. Datta

Abstract:

In the field of welding, various studies have been made by previous investigators to predict as well as optimize weld bead geometric descriptors. Modeling of weld bead shape is important for predicting the quality of welds. In most cases, the design of experiments technique has been used to postulate multiple linear regression equations. In this work, a Genetic Algorithm (GA), an intelligent information-processing technique capable of handling the complex relationships seen in welding processes, is used as a tool for inverse mapping/optimization of the process.
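
A compact illustration of such an inverse mapping is a GA that searches for process parameters whose predicted bead geometry, given by a forward regression model, matches a target; the coefficients, bounds and targets below are invented and are not the equations fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical forward regression model: process parameters -> bead geometry
# (coefficients are illustrative, not the equations fitted in the study)
def bead_geometry(params):
    current, voltage, speed = params.T
    width       = 1.2 + 0.04 * current + 0.15 * voltage - 0.30 * speed
    penetration = 0.3 + 0.02 * current + 0.05 * voltage - 0.10 * speed
    return np.column_stack([width, penetration])

target = np.array([9.5, 3.2])                    # desired width and penetration (mm)
lower, upper = np.array([100, 18, 2]), np.array([220, 32, 8])

def fitness(pop):                                # negative squared deviation from target
    return -np.sum((bead_geometry(pop) - target) ** 2, axis=1)

pop = rng.uniform(lower, upper, size=(60, 3))
for _ in range(200):
    f = fitness(pop)
    parents = pop[np.argsort(f)[-30:]]           # selection: keep the fitter half
    a, b = parents[rng.integers(0, 30, 60)], parents[rng.integers(0, 30, 60)]
    w = rng.random((60, 1))
    children = w * a + (1 - w) * b               # arithmetic crossover
    children += rng.normal(0, 0.02, children.shape) * (upper - lower)  # mutation
    pop = np.clip(children, lower, upper)

best = pop[np.argmax(fitness(pop))]
print(best, bead_geometry(best[np.newaxis, :]))  # parameters whose predicted bead is closest to the target
```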

Keywords: SMAW, genetic algorithm, bead geometry, optimization/inverse mapping

Procedia PDF Downloads 448
25436 Genetic Algorithm Approach for Inverse Mapping of Weld Bead Geometry in Shielded Metal Arc-Welding

Authors: D. S. Nagesh, G. L. Datta

Abstract:

In the field of welding, various studies have been made by previous investigators to predict as well as optimize weld bead geometric descriptors. Modeling of weld bead shape is important for predicting the quality of welds. In most cases, the design of experiments technique has been used to postulate multiple linear regression equations. In this work, a Genetic Algorithm (GA), an intelligent information-processing technique capable of handling the complex relationships seen in welding processes, is used as a tool for inverse mapping/optimization of the process.

Keywords: SMAW, genetic algorithm, bead geometry, optimization/inverse mapping

Procedia PDF Downloads 418
25435 Effect of Viscous Dissipation on 3-D MHD Casson Flow in Presence of Chemical Reaction: A Numerical Study

Authors: Bandari Shanker, Alfunsa Prathiba

Abstract:

The influence of viscous dissipation on three-dimensional MHD Casson fluid flow in two perpendicular directions past a linearly stretching sheet in the presence of a chemical reaction is explored in this work. For special cases, self-similar solutions are obtained and compared with the available data. As the Eckert number increases, the thermal boundary layer thickens. Further, the current findings are observed to be in good agreement with the existing data. Non-dimensional velocities and stress distributions are obtained in both directions. The relevant data are plotted and discussed quantitatively in relation to changes in the Casson fluid parameter as well as the other flow parameters.

Keywords: viscous dissipation, 3-D Casson flow, chemical reaction, Eckert number

Procedia PDF Downloads 188
25434 Breast Cancer Metastasis Detection and Localization through Transfer-Learning Convolutional Neural Network Classification Based on Convolutional Denoising Autoencoder Stack

Authors: Varun Agarwal

Abstract:

Introduction: With the advent of personalized medicine, histopathological review of whole slide images (WSIs) for cancer diagnosis presents an exceedingly time-consuming, complex task. Specifically, detecting metastatic regions in WSIs of sentinel lymph node biopsies necessitates a full-scanned, holistic evaluation of the image. Thus, digital pathology, low-level image manipulation algorithms, and machine learning provide significant advancements in improving the efficiency and accuracy of WSI analysis. Using Camelyon16 data, this paper proposes a deep learning pipeline to automate and improve breast cancer metastasis localization and WSI classification. Methodology: The model broadly follows five stages: region of interest detection, WSI partitioning into image tiles, convolutional neural network (CNN) image-segment classification, probabilistic mapping of tumor localizations, and further processing for whole-WSI classification. Transfer learning is applied to the task through Inception-ResNetV2, an effective CNN classifier that uses residual connections to enhance feature representation by adding the convolved outputs of the inception unit to its input data. Moreover, in order to augment the performance of the transfer learning CNN, a stack of convolutional denoising autoencoders (CDAE) is applied to produce embeddings that enrich the image representation. Through a saliency-detection algorithm, visual training segments are generated, which are then processed through a denoising autoencoder, primarily consisting of convolutional, leaky rectified linear unit, and batch normalization layers, and subsequently a contrast-normalization function. A spatial pyramid pooling algorithm extracts the key features from the processed image, creating a viable feature map for the CNN that minimizes spatial resolution and noise. Results and Conclusion: The simplified and effective architecture of the fine-tuned transfer learning Inception-ResNetV2 network enhanced with the CDAE stack yields state-of-the-art performance in WSI classification and tumor localization, achieving AUC scores of 0.947 and 0.753, respectively. The convolutional feature retention and compilation with residual connections to the inception units, synergized with the input denoising algorithm, enable the pipeline to serve as an effective, efficient tool in the histopathological review of WSIs.
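
A minimal transfer-learning classifier of the kind described (frozen Inception-ResNetV2 features with a small sigmoid head for tumor-vs-normal tiles) can be sketched with tf.keras as below; the CDAE embedding stack, saliency-based tile selection and spatial pyramid pooling of the full pipeline are omitted, and the dataset objects are placeholders.

```python
import tensorflow as tf

# transfer-learning sketch for 299x299 RGB tissue tiles (tumor vs normal)
base = tf.keras.applications.InceptionResNetV2(
    include_top=False, weights="imagenet", input_shape=(299, 299, 3))
base.trainable = False                      # freeze the ImageNet features first

inputs = tf.keras.Input(shape=(299, 299, 3))
x = tf.keras.applications.inception_resnet_v2.preprocess_input(inputs)
x = base(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
x = tf.keras.layers.Dropout(0.3)(x)
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auc")])
model.summary()
# model.fit(tile_dataset, validation_data=val_dataset, epochs=5)  # hypothetical
# datasets of labeled tiles; afterwards unfreeze some layers and fine-tune
```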

Keywords: breast cancer, convolutional neural networks, metastasis mapping, whole slide images

Procedia PDF Downloads 125
25433 The Trumping of Science: Exploratory Study into Discrepancy between Politician and Scientist Sources in American Covid-19 News Coverage

Authors: Wafa Unus

Abstract:

Science journalism has been vanishing from America’s national newspapers for decades. Reportage on scientific topics is limited to a handful of newspapers, and of those, few employ dedicated science journalists to cover stories that require this specialized expertise. News organizations' lack of readiness to convey complex scientific concepts to a mass populace becomes particularly problematic when events like the Covid-19 pandemic occur. The lack of coverage of Covid-19 prior to its onset in the United States suggests something more troubling: that the deprioritization of reporting on hard science as an educational tool, in favor of political frames of coverage, places dangerous blinders on the American public. This research examines the disparity between the voices of health and science experts and the voices of political figures in news articles, in order to better understand the approach of American newspapers in conveying expert opinion on Covid-19. A content analysis of 300 articles on Covid-19 published by major newspapers in the United States between January 1st, 2020 and April 30th, 2020 informs this investigation. The Boston Globe, the New York Times, and the Los Angeles Times are included in the content analysis. Initial findings reveal a significant disparity between the number of articles that mention Anthony Fauci, the director of the National Institute of Allergy and Infectious Diseases, and the number that make reference to political figures. Covid-related articles in the New York Times that focused on health topics (as opposed to economic or social issues) contained the voices of 54 different politicians, mentioned a total of 608 times; only five members of the scientific community were mentioned, a total of 24 times (out of 674 articles). In the Boston Globe, 36 different politicians were mentioned a total of 147 times, and only two members of the scientific community, one being Anthony Fauci, were mentioned, a total of nine times (out of 423 articles). In the Los Angeles Times, 52 different politicians were mentioned a total of 600 times, and only six members of the scientific community were included, mentioned a total of 82 times, with Fauci mentioned 48 times (out of 851 articles). The results provide a better understanding of the frames within which American journalists in Covid hotspots conveyed expert analysis of Covid-19 during one of the most pressing news events of the century. Ultimately, the objective of this study is to use the exploratory data to evaluate the nature, extent and impact of Covid-19 reporting in the context of trustworthiness and scientific expertise. Secondarily, the data illuminate the degree to which Covid-19 reporting focused on politics over science.

Keywords: science reporting, science journalism, covid, misinformation, news

Procedia PDF Downloads 210
25432 Improving Fine Motor Skills in the Hands of Children with ASD with Applying the Fine Motor Activities in Montessori Method of Education

Authors: Yeganeh Faraji, Ned Faraji

Abstract:

The aim of the present study is to examine the effects of training on improving fine hand skills in children with autism spectrum disorder, using a case study statistical method. The sample group was selected by convenience sampling and included four participants. The methodology of this research was a single-subject quasi-experimental AB design. The data were gathered by naturalistic observation, recorded on data record sheets, and then presented in diagrams. The sample group was evaluated in pre-test and post-test phases with an assessment the researcher created based on the Lincoln-Oseretsky motor development scale. To promote fine finger movement, the Montessori method was applied. Analysis of the collected data, presented through diagrams, showed that the training had no significant effect on improving fine finger movement. Therefore, based on the current research findings, it is suggested that future researchers apply different teaching methods and different tests for improving fine hand skills, or increase the training period.

Keywords: autism spectrum disorder, Montessori method, fine motor skills, Lincoln-Oseretsky assessment

Procedia PDF Downloads 90
25431 Application of Public Access Two-Dimensional Hydrodynamic and Distributed Hydrological Models for Flood Forecasting in Ungauged Basins

Authors: Ahmad Shayeq Azizi, Yuji Toda

Abstract:

In Afghanistan, floods are the most frequent and recurrent events among natural disasters. At the same time, the lack of monitoring data is a severe problem, which increases the difficulty of devising appropriate flood countermeasures and of flood forecasting. This study simulates flood inundation in the Harirud River Basin by applying the distributed hydrological model Integrated Flood Analysis System (IFAS) and the 2D hydrodynamic model International River Interface Cooperative (iRIC), based on satellite rainfall combined with historical peak discharge and globally accessible data. The simulation results can predict the inundation area, depth and velocity, and hardware (structural) countermeasures, such as the impact of levee installation, can be discussed using the present method. The methodology proposed in this study is suitable for areas where hydrological and geographical data, including river survey data, are poorly observed.

Keywords: distributed hydrological model, flood inundation, hydrodynamic model, ungauged basins

Procedia PDF Downloads 163
25430 FlexPoints: Efficient Algorithm for Detection of Electrocardiogram Characteristic Points

Authors: Daniel Bulanda, Janusz A. Starzyk, Adrian Horzyk

Abstract:

The electrocardiogram (ECG) is one of the most commonly used medical tests, essential for the correct diagnosis and treatment of the patient. While ECG devices generate a huge amount of data, only a small part of it carries valuable medical information. To deal with this problem, many compression algorithms and filters have been developed over the past years. However, the rapid development of new machine learning techniques poses new challenges. To address this class of problems, we created the FlexPoints algorithm, which searches for characteristic points on the ECG signal and ignores all other points that do not carry relevant medical information. The conducted experiments proved that the presented algorithm can significantly reduce the number of data points representing an ECG signal without losing valuable medical information. These sparse but essential characteristic points (flex points) can be a perfect input for modern machine learning models, which work much better using flex points as input instead of raw data or data compressed by many popular algorithms.
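
FlexPoints itself is not specified in the abstract; a crude stand-in for the idea, keeping only samples where the local slope changes sharply plus the endpoints, is sketched below on a synthetic trace (this is illustrative and not the actual algorithm).

```python
import numpy as np

def characteristic_points(signal, curvature_threshold=0.05):
    """Keep only samples where the local slope changes sharply (a crude proxy
    for ECG characteristic points such as QRS onsets and peaks). This is an
    illustrative sketch, not the FlexPoints algorithm itself."""
    second_diff = np.abs(np.diff(signal, n=2))
    idx = np.flatnonzero(second_diff > curvature_threshold) + 1  # re-centre index
    idx = np.concatenate(([0], idx, [len(signal) - 1]))          # keep the endpoints
    return idx, signal[idx]

# synthetic "ECG-like" trace: slow baseline with one sharp spike
fs = 250
t = np.arange(0, 2, 1 / fs)
ecg = 0.05 * np.sin(2 * np.pi * 1.0 * t)
ecg[200:205] += np.array([0.2, 0.8, 1.2, 0.7, 0.1])              # crude QRS-like spike

idx, values = characteristic_points(ecg, curvature_threshold=0.05)
print(f"kept {idx.size} of {ecg.size} samples ({100 * idx.size / ecg.size:.1f}%)")
```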

Keywords: characteristic points, electrocardiogram, ECG, machine learning, signal compression

Procedia PDF Downloads 159
25429 Comprehensive Machine Learning-Based Glucose Sensing from Near-Infrared Spectra

Authors: Bitewulign Mekonnen

Abstract:

Context: This paper focuses on the use of near-infrared (NIR) spectroscopy to determine glucose concentration in aqueous solutions accurately and rapidly. The study compares six different machine learning methods for predicting glucose concentration and also explores the development of a deep learning model for classifying NIR spectra. The objective is to optimize the detection model and improve the accuracy of glucose prediction. This research is important because it provides a comprehensive analysis of various machine learning techniques for estimating aqueous glucose concentrations. Research Aim: The aim of this study is to compare and evaluate different machine learning methods for predicting glucose concentration from NIR spectra. Additionally, the study aims to develop and assess a deep learning model for classifying NIR spectra. Methodology: The research methodology involves the use of machine learning and deep learning techniques. Six machine learning regression models, including support vector machine regression, partial least squares regression, extra trees regression, random forest regression, extreme gradient boosting, and principal component analysis-neural network, are employed to predict glucose concentration. The NIR spectral data are randomly divided into training and test sets, and the process is repeated ten times to increase generalization ability. In addition, a convolutional neural network is developed for classifying NIR spectra. Findings: The study reveals that the SVMR, ETR, and PCA-NN models exhibit excellent performance in predicting glucose concentration, with correlation coefficients (R) > 0.99 and determination coefficients (R²) > 0.985. The deep learning model achieves high macro-averaged scores for precision, recall, and F1-measure. These findings demonstrate the effectiveness of machine learning and deep learning methods in optimizing the detection model and improving glucose prediction accuracy. Theoretical Importance: This research contributes to the field by providing a comprehensive analysis of various machine learning techniques for estimating glucose concentrations from NIR spectra. It also explores the use of deep learning for the classification of otherwise indistinguishable NIR spectra. The findings highlight the potential of machine learning and deep learning in enhancing the prediction accuracy of glucose-relevant features. Data Collection and Analysis Procedures: The NIR spectra and corresponding glucose concentration references are measured in increments of 20 mg/dl. The data are randomly divided into training and test sets, and the models are evaluated using regression analysis and classification metrics. The performance of each model is assessed based on correlation coefficients, determination coefficients, precision, recall, and F1-measure. Question Addressed: The study addresses the question of whether machine learning and deep learning methods can optimize the detection model and improve the accuracy of glucose prediction from NIR spectra. Conclusion: The research demonstrates that machine learning and deep learning methods can effectively predict glucose concentration from NIR spectra. The SVMR, ETR, and PCA-NN models exhibit superior performance, while the deep learning model achieves high classification scores. These findings suggest that machine learning and deep learning techniques can be used to improve the prediction accuracy of glucose-relevant features. Further research is needed to explore their clinical utility in analyzing complex matrices, such as blood glucose levels.
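
A condensed version of the regression comparison can be sketched as follows, with synthetic spectra standing in for the measured NIR data and R² reported on one held-out split (extreme gradient boosting and the PCA-NN model are omitted; the split would be repeated ten times as in the study).

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.ensemble import ExtraTreesRegressor, RandomForestRegressor
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# synthetic stand-in for NIR spectra: 200 samples x 100 wavelengths whose
# absorbance depends on a glucose level in 20 mg/dl steps (illustrative only)
rng = np.random.default_rng(0)
glucose = rng.choice(np.arange(20, 420, 20), size=200).astype(float)
wavelength_weights = rng.normal(0, 1, size=100)
spectra = np.outer(glucose / 400.0, wavelength_weights) + rng.normal(0, 0.05, size=(200, 100))

X_train, X_test, y_train, y_test = train_test_split(spectra, glucose,
                                                    test_size=0.3, random_state=0)
models = {
    "SVMR": SVR(C=10.0),
    "PLSR": PLSRegression(n_components=5),
    "ETR": ExtraTreesRegressor(n_estimators=200, random_state=0),
    "RFR": RandomForestRegressor(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    pred = np.ravel(model.predict(X_test))     # PLSR returns a column vector
    print(f"{name}: R^2 = {r2_score(y_test, pred):.3f}")
```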

Keywords: machine learning, signal processing, near-infrared spectroscopy, support vector machine, neural network

Procedia PDF Downloads 87