Search results for: robust ANOVA
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2327

2087 Robust Method for Evaluation of Catchment Response to Rainfall Variations Using Vegetation Indices and Surface Temperature

Authors: Revalin Herdianto

Abstract:

Recent climate change increases uncertainty in vegetation conditions, such as health and biomass, both globally and locally. Detection is difficult, however, because of the spatial and temporal scale of vegetation coverage. Because vegetation responds uniquely to environmental conditions such as water availability, the interplay between vegetation dynamics and hydrologic conditions leaves a signature in their feedback relationship. Vegetation indices (VI) depict vegetation biomass and photosynthetic capacity and thus indicate vegetation dynamics as a response to variables including hydrologic conditions and microclimate factors such as rainfall characteristics and land surface temperature (LST). It is hypothesized that this signature may be depicted by VI in its relationship with other variables. To study this signature, several catchments in Asia, Australia, and Indonesia were analysed to assess how hydrologic characteristics vary with vegetation type. The methods used in this study include geographic identification and pixel marking for the studied catchments, time-series analysis of VI and LST for the marked pixels, and smoothing with the Savitzky-Golay filter, which is effective for large areas and extensive data. Time series of VI, LST, and rainfall from satellites and ground stations, coupled with digital elevation models, were analysed and presented. This study found that the hydrologic response of vegetation to rainfall variations may appear within one hydrologic year, such that a drought event can be detected a year later as suppressed growth. However, above-average annual rainfall does not promote above-average growth as shown by VI. The technique is a robust and tractable approach for assessing catchment dynamics in a changing climate.
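
The abstract names the Savitzky-Golay filter as its smoothing step. As a minimal illustration of that step only (the catchment data are not reproduced here; the series below is synthetic), scipy's implementation can be applied to a noisy NDVI-like series:

```python
import numpy as np
from scipy.signal import savgol_filter

# synthetic stand-in for a one-year NDVI series (23 16-day composites)
t = np.arange(23)
rng = np.random.default_rng(0)
ndvi = 0.5 + 0.25 * np.sin(2 * np.pi * t / 23) + rng.normal(0, 0.05, 23)

# Savitzky-Golay: a local polynomial fit that smooths noise while
# preserving the shape and timing of the growth curve
smoothed = savgol_filter(ndvi, window_length=7, polyorder=2)
```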

Keywords: vegetation indices, land surface temperature, vegetation dynamics, catchment

Procedia PDF Downloads 271
2086 Medical Diagnosis of Retinal Diseases Using Artificial Intelligence Deep Learning Models

Authors: Ethan James

Abstract:

Over one billion people worldwide suffer from some degree of vision loss or blindness caused by progressive retinal diseases. Many patients, particularly in developing areas, are incorrectly diagnosed or not diagnosed at all because of unconventional diagnostic tools and screening methods. Artificial intelligence (AI) based on deep learning (DL) convolutional neural networks (CNNs) has recently gained considerable interest in ophthalmology for computer-aided imaging diagnosis, disease prognosis, and risk assessment. Optical coherence tomography (OCT) is a popular imaging technique used to capture high-resolution cross-sections of retinas. In ophthalmology, DL has been applied to fundus photographs, OCT, and visual fields, achieving robust classification performance in the detection of various retinal diseases, including macular degeneration, diabetic retinopathy, and retinitis pigmentosa. However, there is no complete diagnostic model for analyzing these retinal images that provides diagnostic accuracy above 90%. The purpose of this project was therefore to develop an AI model that uses machine learning techniques to automatically diagnose specific retinal diseases from OCT scans. The algorithm consists of a residual neural network with cyclic pooling, trained on a dataset of over 20,000 real-world OCT images. This DL model can ultimately aid ophthalmologists in diagnosing patients with these retinal diseases more quickly and more accurately, thereby facilitating earlier treatment and improving post-treatment outcomes.
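
The abstract specifies a residual network trained on OCT scans but gives no architectural details; the PyTorch sketch below is a generic residual classifier for grayscale scans. Layer sizes, class count, and standard pooling are chosen here purely for illustration (the paper's cyclic pooling is not reproduced):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)          # skip connection

class OCTClassifier(nn.Module):
    def __init__(self, n_classes=4):        # e.g. normal/CNV/DME/drusen
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(1, 32, 7, stride=2, padding=3),
                                  nn.BatchNorm2d(32), nn.ReLU(),
                                  nn.MaxPool2d(2))
        self.blocks = nn.Sequential(ResidualBlock(32), ResidualBlock(32))
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(32, n_classes))

    def forward(self, x):
        return self.head(self.blocks(self.stem(x)))

model = OCTClassifier()
logits = model(torch.randn(8, 1, 224, 224))  # batch of grayscale OCT scans
```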

Keywords: artificial intelligence, deep learning, imaging, medical devices, ophthalmic devices, ophthalmology, retina

Procedia PDF Downloads 159
2085 The Effects of Billboard Content and Visible Distance on Driver Behavior

Authors: Arsalan Hassan Pour, Mansoureh Jeihani, Samira Ahangari

Abstract:

Distracted driving has been one of the central concerns surrounding daily vehicle use since the invention of the automobile. While much recent attention has been given to cell-phone-related distraction, commercial billboards along roads are also candidates for drivers' visual and cognitive distraction, as they may take drivers' eyes off the road and their minds off the driving task to see, perceive, and think about the billboard's content. Using a driving simulator and a head-mounted eye-tracking system, this study collected speed change, acceleration, deceleration, throttle response, collision, lane-changing, and lane-offset data, along with gaze fixation duration and frequency. Some 92 participants from a fairly diverse sociodemographic background drove on a simulated freeway in the Baltimore, Maryland area and were exposed to three different billboards to investigate the effects of billboards on driver behavior. Participants glanced at the billboards several times with different frequencies, with the maximum occurring at the billboard with the highest cognitive load. About 74% of the participants did not look at a billboard for more than two seconds in any glance, except for the billboard with a short visible area. Analysis of variance (ANOVA) was performed to test for variations in driving behavior across the zones where billboards were invisible, readable, and already passed. The results show slight differences in speed, throttle, brake, steering velocity, and lane changing among these zones. Brake force and deviation from the lane center increased in the readable zone compared with the visible zone, and speed increased immediately after each billboard. The results indicate that billboards have a significant effect on driving performance and visual attention, depending on their content and visibility status. A generalized linear model (GLM) analysis showed no association of participants' age or driving experience with gaze duration; however, the billboard's visible distance, gender, and billboard content had significant effects on gaze duration.
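
As a sketch of the kind of GLM analysis described for gaze duration (the study's data are not available here, so the predictors' values and coefficients below are invented placeholders), one might fit a Gamma GLM with a log link in statsmodels:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 92                                     # matches the study's sample size
age = rng.uniform(20, 60, n)
experience = rng.uniform(1, 30, n)
visible_dist = rng.uniform(50, 400, n)     # hypothetical visible distance (m)
gaze = 0.3 + 0.002 * visible_dist + rng.gamma(2.0, 0.2, n)  # gaze duration (s)

# a Gamma GLM with log link suits a positive, right-skewed response
X = sm.add_constant(np.column_stack([age, experience, visible_dist]))
fit = sm.GLM(gaze, X, family=sm.families.Gamma(sm.families.links.Log())).fit()
print(fit.summary())
```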

Keywords: ANOVA, billboards, distracted driving, drivers' behavior, driving simulator, eye-tracking system, GLM

Procedia PDF Downloads 115
2084 Detection of Clipped Fragments in Speech Signals

Authors: Sergei Aleinik, Yuri Matveev

Abstract:

In this paper, a novel method for the detection of clipping in speech signals is described. It is shown that the new method performs better than known clipping detection methods, is easy to implement, and is robust to changes in signal amplitude, data size, etc. Statistical simulation results are presented.
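
The paper's own detection method is not described in the abstract; the snippet below is only a common histogram-based baseline, shown to make the problem concrete: clipped signals concentrate probability mass in the extreme amplitude bins.

```python
import numpy as np

def clipping_score(signal, n_bins=200):
    # fraction of samples in the two extreme amplitude bins;
    # clipped audio piles up probability mass at the rails
    hist, _ = np.histogram(signal, bins=n_bins)
    return (hist[0] + hist[-1]) / hist.sum()

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 0.2, 16000)          # one second at 16 kHz
clipped = np.clip(clean, -0.25, 0.25)        # hard clipping at the rails
print(clipping_score(clean), clipping_score(clipped))
```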

Keywords: clipping, clipped signal, speech signal processing, digital signal processing

Procedia PDF Downloads 379
2083 Development of an Interactive and Robust Image Analysis and Diagnostic Tool in R for Early Detection of Cervical Cancer

Authors: Kumar Dron Shrivastav, Ankan Mukherjee Das, Arti Taneja, Harpreet Singh, Priya Ranjan, Rajiv Janardhanan

Abstract:

Cervical cancer is one of the most common cancers among women worldwide and can be cured if detected early. Manual pathology, which is typically used at present, has many limitations. The current gold standard for cervical cancer diagnosis is exhaustive and time-consuming because it relies heavily on the subjective knowledge of oncopathologists, which leads to misdiagnosis and missed diagnoses, resulting in false negatives and false positives. To reduce the time and complexity associated with early diagnosis, an interactive diagnostic tool is required for early detection, particularly in developing countries where cervical cancer incidence and related mortality are high. Replacing manual pathology with digital pathology for cervical cancer screening and diagnosis can increase precision and strongly reduce the chance of error in a time-specific manner. We therefore propose a robust and interactive cervical cancer image analysis and diagnostic tool, which can process both histopathological and cytopathological images to identify abnormal cells in the least amount of time and in settings with minimum resources. A major highlight of the tool is its incorporation of the set of parameters typically used to identify abnormal cells, implemented with the open-source software R. The software can automatically identify and quantify morphological features, color intensity, sensitivity, and other parameters to differentiate abnormal from normal cells, which may improve and accelerate screening and early diagnosis, ultimately leading to timely treatment of cervical cancer.
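
The authors implement their tool in R; purely as an illustration of the kind of morphological and intensity features described, here is a Python/scikit-image sketch (the feature choices and the synthetic image are placeholders, not the authors' parameter set):

```python
import numpy as np
from skimage import filters, measure

def cell_features(gray):
    # Otsu threshold -> connected components -> per-region descriptors
    mask = gray > filters.threshold_otsu(gray)
    labels = measure.label(mask)
    return [{"area": r.area,
             "eccentricity": r.eccentricity,
             "mean_intensity": r.mean_intensity}
            for r in measure.regionprops(labels, intensity_image=gray)]

rng = np.random.default_rng(0)
img = rng.random((64, 64)) * 0.2
img[10:25, 10:25] += 0.7          # bright blob standing in for a nucleus
print(cell_features(img))
```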

Keywords: cervical cancer, early detection, digital pathology, screening

Procedia PDF Downloads 157
2082 Impact of Gaming Environment in Education

Authors: Md. Ataur Rahman Bhuiyan, Quazi Mahabubul Hasan, Md. Rifat Ullah

Abstract:

In this research, we explored the effectiveness of a gaming environment in education and compared it with the traditional education system. We conducted several workshops in both learning environments and measured students' performance through grading scores, assigned by professional academics, on their attitudes across different criteria. We also collected data from survey questionnaires to understand students' experiences of education and study. Finally, we examined the impact of the different learning environments by applying statistical hypothesis tests: the t-test and ANOVA.
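
A minimal sketch of the two tests named, with invented placeholder scores rather than the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
traditional = rng.normal(68, 10, 40)   # placeholder grading scores
gamified = rng.normal(74, 10, 40)

t, p = stats.ttest_ind(gamified, traditional)   # two-sample t-test
print(f"t = {t:.2f}, p = {p:.4f}")

# one-way ANOVA generalises the comparison to three or more groups
f, p = stats.f_oneway(traditional, gamified, rng.normal(71, 10, 40))
print(f"F = {f:.2f}, p = {p:.4f}")
```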

Keywords: gamification, game-based learning, education, statistical analysis, human-computer interaction

Procedia PDF Downloads 204
2081 A Novel Method for Face Detection

Authors: H. Abas Nejad, A. R. Teymoori

Abstract:

Facial expression recognition is one of the open problems in computer vision. Robust neutral-face recognition in real time is a major challenge for supervised-learning-based facial expression recognition methods, because supervised methods cannot accommodate, within a limited amount of training data, all the appearance variability across faces with respect to race, pose, lighting, facial biases, and so on. Moreover, processing every frame to classify emotions is unnecessary, since the user stays neutral for the majority of the time in typical applications such as video chat or photo album/web browsing. Detecting the neutral state at an early stage and bypassing those frames in emotion classification saves computational power. In this work, we propose a lightweight neutral vs. emotion classification engine that acts as a preprocessor to traditional supervised emotion classification approaches. It dynamically learns neutral appearance at Key Emotion (KE) points using a textural statistical model constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the textural statistical model. Robustness to dynamic shifts of KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information about the directionality of the specific facial action units acting on that KE point. As a result, the proposed method improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
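
A rough sketch of the textural-comparison idea using local binary pattern histograms (the paper's KE-point selection, affine correction, and decision thresholds are not reproduced; the patch size and chi-square measure are assumptions):

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(patch, P=8, R=1):
    # uniform LBP codes summarised as a normalised histogram
    lbp = local_binary_pattern(patch, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

def chi2_distance(h1, h2, eps=1e-9):
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

rng = np.random.default_rng(0)
ref = (rng.random((24, 24)) * 255).astype("uint8")   # reference neutral patch
print(chi2_distance(lbp_histogram(ref), lbp_histogram(ref)))  # 0.0 -> neutral
```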

Keywords: neutral vs. emotion classification, Constrained Local Model, Procrustes analysis, Local Binary Pattern Histogram, statistical model

Procedia PDF Downloads 325
2080 Combined Mindfulness and Exercise Intervention for Depressive and Insomnia Symptoms in Chinese Students: A Pilot Randomized Controlled Trial

Authors: Xinli Chi, Xiaoqi Wei

Abstract:

Background: Body-mind theory holds that the mind and body are interconnected; combining aerobic exercise with mindfulness-based training may therefore benefit mind-body health. However, there is limited evidence on the effects and potential mechanisms of such combined training among Chinese university students. The current study therefore examines the preliminary effects and feasibility of the combined intervention on depressive and insomnia symptoms and explores the underlying mechanisms. Methods: This is a two-arm pilot randomized controlled trial. Sixty-one Chinese university students were randomly allocated to an 8-week combined intervention group (aerobic exercise plus mindfulness, N = 36) or a control group (N = 36). In addition, eight participants in the combined intervention group later volunteered to take part in semi-structured interviews. The Self-Rating Depression Scale (SDS) and the Youth Self-Rating Insomnia Scale (YSIS) were used to measure depressive and insomnia symptoms, respectively. Intervention outcomes and feasibility were tested by repeated-measures ANOVA, a mediation model, and qualitative analysis. Results: The study included 31 participants in the intervention group and 30 in the control group, all of whom completed pre-test and post-test questionnaires. Repeated-measures ANOVA showed that the combined intervention was effective in reducing depressive and insomnia symptoms among university students. Moreover, the mediation analysis suggested that improvement in insomnia symptoms might be a significant mechanism of the combined intervention. Qualitative analysis identified two main themes: 'Helpful aspects of mind-body state' (including 7 sub-themes) and 'Factors that influence the training effects' (including 3 sub-themes). Conclusions: The study confirmed the preliminary effect and feasibility of the combined mindfulness and aerobic exercise intervention and explored the potential mechanisms underlying this effect. Additionally, the qualitative data provide valuable insights for optimizing future protocols.
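
A minimal sketch of the within-subject part of the repeated-measures ANOVA in statsmodels (scores are synthetic; the group-by-time interaction and the mediation model are beyond this snippet, since statsmodels' AnovaRM handles within-subject factors only):

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(4)
n = 31                                   # intervention-group size in the study
pre = rng.normal(55, 8, n)               # placeholder SDS scores
post = rng.normal(48, 8, n)

# long format: one row per participant per time point
df = pd.DataFrame({"id": np.repeat(np.arange(n), 2),
                   "time": ["pre", "post"] * n,
                   "SDS": np.column_stack([pre, post]).ravel()})
print(AnovaRM(df, depvar="SDS", subject="id", within=["time"]).fit())
```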

Keywords: combined intervention, mindfulness, aerobic exercise, depressive symptoms, insomnia symptoms

Procedia PDF Downloads 76
2079 The Resistance of Fish Outside of Water Medium

Authors: Febri Ramadhan

Abstract:

Water is a vital necessity for the survival of fish, yet fish can survive outside the water medium for a certain time. By knowing the survival rate of fish outside the water medium, one can transport fish more efficiently. Transport of live fish from one place to another can be done with wet or dry media systems. In this experiment, the treatment difference studied was fish species, and the experiment tested the degree of resilience of fish outside the water medium. From the ANOVA table obtained, it can be concluded that fish species affects the level of resilience outside water (Fhit > Ftab, i.e., the calculated F exceeds the critical value).

Keywords: fish, transport, retention rate, fish resilience

Procedia PDF Downloads 316
2078 A Corpus Output Error Analysis of Chinese L2 Learners From America, Myanmar, and Singapore

Authors: Qiao-Yu Warren Cai

Abstract:

With the rise of big data, building corpora and using them to analyze Chinese L2 learners' language output has become a trend. Various empirical studies have been conducted using Chinese corpora built by different academic institutes. However, most of this research analyzed the corpus data through corpus-based qualitative content analysis with descriptive statistics. Descriptive statistics can summarize the subjects or samples a study has actually measured, but the collected data cannot be generalized to the population. Comte, the French positivist, argued in the 19th century that human knowledge, whether in the humanities and social sciences or the natural sciences, should be verified scientifically so as to construct universal theories that explain truth and human behavior. Inferential statistics, which can judge the probability that a difference observed between groups is dependable rather than caused by chance (Free Geography Notes, 2015) and can infer from the subjects or samples what the population might think or do, is just the right method to support Comte's argument in the field of TCSOL. Inferential statistics is also a core of quantitative research, yet little research has been conducted combining corpora with inferential statistics. Few studies analyze the differences in Chinese L2 learners' corpus output errors using one-way ANOVA, so the findings of previous research are limited in inferring the population's Chinese errors from the given samples' corpora. To fill this knowledge gap in the professional development of Taiwanese TCSOL, the present study uses one-way ANOVA to analyze corpus output errors of Chinese L2 learners from America, Myanmar, and Singapore. The results show no significant difference in 'shì (是) sentence' and word-order errors, but, compared with American and Singaporean learners, learners from Myanmar are significantly more likely to produce 'sentence blends.' Based on these results, the present study provides an instructional approach and contributes to further exploration of how Chinese L2 learners can have (and use) learning strategies to reduce errors.

Keywords: Chinese corpus, error analysis, one-way analysis of variance, Chinese L2 learners, Americans, Myanmar, Singaporeans

Procedia PDF Downloads 86
2077 A Hybrid Watermarking Scheme Using Discrete and Discrete Stationary Wavelet Transformation for Color Images

Authors: Bülent Kantar, Numan Ünaldı

Abstract:

This paper presents a new method for robust and invisible digital watermarking of color images, using a color image as the watermark. Watermarking is performed in the frequency domain, using the discrete wavelet transform (DWT) and the discrete stationary wavelet transform (DSWT) for the frequency-domain transformation. Low-, medium-, and high-frequency coefficients are obtained by applying a two-level DWT to the original image. A one-level DSWT is then applied separately to each frequency band of the two-level DWT, and a watermark is added to each of the resulting low-frequency coefficients, so that in total four watermarks are embedded in the original image. To recover the watermark, the original and watermarked images are both decomposed with the two-level DWT and the one-level DSWT, and the watermark is obtained from the difference of the DSWT low-frequency coefficients. Four watermarks are thus recovered, one from each band of the two-level DWT. The recovered watermarks are compared with the true watermark to obtain similarity scores, and the watermark with the highest similarity is selected. The proposed method was tested against geometric and image-processing attacks, and the results show that it is robust and invisible. All four frequency bands of the two-level DWT are combined to recover the watermark from the watermarked image, and the watermark is embedded after conversion to a binary image; these choices yield better watermark recovery under geometric and image-processing attacks.
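
A minimal sketch of one branch of the decomposition chain (two-level DWT, one-level stationary transform, additive embedding in the approximation band), using PyWavelets; the wavelet, the embedding strength, and the sizes are assumptions, and the paper's four-band embedding and similarity selection are not reproduced:

```python
import numpy as np
import pywt

def embed(host, mark, alpha=0.05):
    # two-level DWT of the host, one-level SWT of the LL band,
    # additive watermark in the SWT approximation coefficients
    cA2, d2, d1 = pywt.wavedec2(host, "haar", level=2)
    (sA, sDetails), = pywt.swt2(cA2, "haar", level=1)
    sA = sA + alpha * mark                    # mark must match cA2's shape
    cA2_marked = pywt.iswt2([(sA, sDetails)], "haar")
    return pywt.waverec2([cA2_marked, d2, d1], "haar")

rng = np.random.default_rng(0)
host = rng.random((256, 256))
mark = rng.integers(0, 2, (64, 64)).astype(float)   # binary watermark
marked = embed(host, mark)
print(np.abs(marked - host).max())                  # perturbation stays small
```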

Keywords: watermarking, DWT, DSWT, copyright protection, RGB

Procedia PDF Downloads 514
2076 A Comparison of Methods for Estimating Dichotomous Treatment Effects: A Simulation Study

Authors: Jacqueline Y. Thompson, Sam Watson, Lee Middleton, Karla Hemming

Abstract:

Introduction: The odds ratio (estimated via logistic regression) is a well-established and common approach for estimating covariate-adjusted binary treatment effects when comparing a treatment and a control group with dichotomous outcomes. Its popularity stems primarily from its stability and robustness to model misspecification. The situation is different, however, for the relative risk and the risk difference, which are arguably easier to interpret and better suited to specific designs such as non-inferiority studies. So far, there is no equivalent, widely accepted approach for estimating an adjusted relative risk or risk difference in clinical trials, partly because of the lack of a comprehensive evaluation of the available candidate methods. Methods/Approach: A simulation study is designed to evaluate the performance of relevant candidate methods for estimating relative risks, representing both conditional and marginal estimation approaches. We consider the log-binomial generalised linear model (GLM) with iteratively reweighted least squares (IWLS) and model-based standard errors (SEs); log-binomial GLM with convex optimisation and model-based SEs; log-binomial GLM with convex optimisation and permutation tests; modified-Poisson GLM with IWLS and robust SEs; log-binomial generalised estimating equations (GEE) with robust SEs; marginal standardisation with delta-method SEs; and marginal standardisation with permutation-test SEs. Independent and identically distributed datasets are simulated from a randomised controlled trial to evaluate these candidate methods. Simulations are replicated 10,000 times for each scenario across all possible combinations of sample sizes (200, 1000, and 5000), outcome rates (10%, 50%, and 80%), and covariate effects (ranging from -0.05 to 0.7), representing weak, moderate, or strong relationships. Treatment effects (0, -0.5, and 1 on the log scale) cover null (H0) and alternative (H1) hypotheses so that coverage and power can be evaluated in realistic scenarios. Performance measures (bias, mean square error (MSE), relative efficiency, and convergence rates) are evaluated across scenarios covering a range of sample sizes, event rates, covariate prognostic strength, and model misspecifications. Potential Results, Relevance & Impact: There are several methods for estimating unadjusted and adjusted relative risks. However, it is unclear which method(s) is the most efficient, preserves the type-I error rate, is robust to model misspecification, or is the most powerful when adjusting for non-prognostic and prognostic covariates. GEE estimates may be biased when the outcome distributions are not from marginal binary data. It also appears that marginal standardisation and convex optimisation may perform better than the GLM IWLS log-binomial approach.
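
As a sketch of one candidate on the list, the modified-Poisson estimator (a Poisson GLM fit to a binary outcome with robust sandwich standard errors) can be written in statsmodels as follows; the data-generating values are placeholders, not the simulation study's scenarios:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
treat = rng.integers(0, 2, n)
covar = rng.normal(size=n)
p = 1 / (1 + np.exp(-(-1.0 + 0.5 * treat + 0.3 * covar)))
y = rng.binomial(1, p)                      # dichotomous outcome

# modified Poisson: Poisson GLM (log link) on binary data, with robust
# sandwich standard errors to repair the misspecified variance
X = sm.add_constant(np.column_stack([treat, covar]))
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC1")
print("adjusted RR:", np.exp(fit.params[1]), "robust SE:", fit.bse[1])
```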

Keywords: binary outcomes, statistical methods, clinical trials, simulation study

Procedia PDF Downloads 97
2075 A Robust Visual Simultaneous Localization and Mapping for Indoor Dynamic Environment

Authors: Xiang Zhang, Daohong Yang, Ziyuan Wu, Lei Li, Wanting Zhou

Abstract:

Visual Simultaneous Localization and Mapping (VSLAM) uses cameras to collect information in unknown environments in order to perform localization and map construction simultaneously, and it has a wide range of applications in autonomous driving, virtual reality, and related fields. Existing VSLAM systems can maintain high accuracy in static environments. In dynamic environments, however, the movement of objects in the scene reduces the stability of the VSLAM system, resulting in inaccurate localization and mapping, or even failure. In this paper, a robust VSLAM method is proposed to deal effectively with dynamic environments, based on a dynamic-region removal scheme that combines semantic segmentation neural networks and geometric constraints. First, a semantic segmentation network extracts the prior active-motion regions, prior static regions, and prior passive-motion regions in the environment. A lightweight frame-tracking module then initializes the pose transform between the previous and current frames using the prior static regions. A motion-consistency detection module based on multi-view geometry and scene flow divides the environment into static and dynamic regions, so that the dynamic object regions are eliminated. Finally, only the static regions are used by the tracking thread. Our work builds on ORBSLAM3, one of the most effective VSLAM systems available. We evaluated our method on the TUM RGB-D benchmark, and the results demonstrate that the proposed VSLAM method improves the accuracy of the original ORBSLAM3 by 70% to 98.5% in highly dynamic environments.
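
A simplified sketch of the geometric part of the motion-consistency check (an epipolar-distance test after RANSAC fundamental-matrix estimation with OpenCV); the semantic-segmentation and scene-flow stages and the ORBSLAM3 integration are not reproduced, and the scene below is synthetic:

```python
import cv2
import numpy as np

def static_mask(pts1, pts2, thresh=1.0):
    # RANSAC fundamental matrix, then flag matches whose distance to the
    # epipolar line exceeds `thresh` px as violating static-scene geometry
    F, _ = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
    lines = cv2.computeCorrespondEpilines(pts1.reshape(-1, 1, 2), 1, F)
    lines = lines.reshape(-1, 3)
    d = np.abs(np.sum(lines[:, :2] * pts2, axis=1) + lines[:, 2]) \
        / np.linalg.norm(lines[:, :2], axis=1)
    return d < thresh

# synthetic two-view scene: 60 static 3-D points, first 5 moved independently
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(-1, 1, 60), rng.uniform(-1, 1, 60),
                     rng.uniform(4, 8, 60)])
K = np.array([[500, 0, 320], [0, 500, 240], [0, 0, 1.0]])
project = lambda P: ((K @ P.T).T[:, :2] / (K @ P.T).T[:, 2:]).astype(np.float32)
X2 = X + np.array([0.2, 0.0, 0.0])          # rigid shift = camera translation
X2[:5] += np.array([0.0, 0.5, 0.0])         # independently moving objects
print(static_mask(project(X), project(X2))[:10])   # first 5 come out False
```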

Keywords: dynamic scene, dynamic visual SLAM, semantic segmentation, scene flow, VSLAM

Procedia PDF Downloads 92
2074 Classifying Turbomachinery Blade Mode Shapes Using Artificial Neural Networks

Authors: Ismail Abubakar, Hamid Mehrabi, Reg Morton

Abstract:

Currently, extensive signal analysis is performed to evaluate the structural health of turbomachinery blades, an approach constrained by time and the availability of qualified personnel. New approaches to blade dynamics identification that provide faster and more accurate results are therefore sought. Generally, modal analysis is employed to acquire the dynamic properties of a vibrating turbomachinery blade and is widely adopted in blade condition monitoring. The analysis provides useful information on the different modes of vibration and their natural frequencies by exploring the different shapes the blade can take up during vibration, since every mode shape has a corresponding natural frequency. Experimental modal testing and finite element analysis are the traditional methods for evaluating mode shapes, but they have limited application to real-life scenarios and thus to a robust condition monitoring scheme. Real-time mode-shape evaluation requires rapid evaluation at low computational cost, for which the traditional techniques are unsuitable. In this study, an artificial neural network is developed to evaluate the mode shapes of a lab-scale rotating blade assembly, using results from finite element modal analysis as training data. The network performance evaluation shows that an artificial neural network (ANN) is capable of mapping the correlation between natural frequencies and mode shapes, without the need for extensive signal analysis. The approach offers the advantages that the network can classify mode shapes in real time, is simple to implement, and predicts accurately. The work paves the way for further development of a robust condition monitoring system that incorporates real-time mode-shape evaluation.
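
A toy sketch of the mapping the abstract describes, with a scikit-learn MLP standing in for the authors' network and synthetic frequencies standing in for the finite element modal results (all numbers are placeholders):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
base = np.array([120.0, 480.0, 950.0, 1600.0])   # nominal frequencies (Hz)
y = rng.integers(0, 3, 300)                      # mode-shape class labels
X = base + rng.normal(0, 5, (300, 4))
X[np.arange(300), y] += 25.0    # each class shifts one frequency, a stand-in
                                # for the FE-derived frequency/mode correlation

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32, 16),
                                  max_iter=2000, random_state=0))
clf.fit(X_tr, y_tr)
print("hold-out accuracy:", clf.score(X_te, y_te))
```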

Keywords: modal analysis, artificial neural network, mode shape, natural frequencies, pattern recognition

Procedia PDF Downloads 138
2073 Innovative Screening Tool Based on Physical Properties of Blood

Authors: Basant Singh Sikarwar, Mukesh Roy, Ayush Goyal, Priya Ranjan

Abstract:

This work combines two bodies of knowledge: the biomedical basis of blood stain formation and the fluid-dynamics insight that blood stain formation depends heavily on physical properties. Biomedical research further shows that different blood stain patterns are robust indicators of the donor's health or lack thereof. Based on these insights, an innovative screening tool is proposed that can aid in the diagnosis of diseases such as anemia, hyperlipidaemia, tuberculosis, blood cancer, leukemia, malaria, etc., with enhanced confidence in the proposed analysis. To realize this technique, simple, robust, and low-cost microfluidic devices, a micro-capillary viscometer, and a pendant-drop tensiometer are designed, and their fabrication is proposed, to measure the viscosity, surface tension, and wettability of various blood samples. Once prognosis and diagnosis data have been generated, automated linear and nonlinear classifiers are applied for automated reasoning and presentation of results. A support vector machine (SVM) classifies data in a linear fashion, while discriminant analysis and nonlinear embeddings are coupled with nonlinear manifold detection in the data, and decisions are made accordingly. In this way, physical properties can be used, with linear and nonlinear classification techniques, for screening of various diseases in humans and cattle. Experiments are carried out to validate the physical-property measurement devices. This framework can be further developed into a portable, real-life disease screening and diagnostic tool, and small-scale production of such devices is proposed for independent testing.
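
As a sketch of the linear-classification step on the three measured properties (the values below are illustrative placeholders, not measured blood data):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# columns: viscosity (mPa*s), surface tension (mN/m), contact angle (deg)
X = np.array([[3.8, 55.0, 30.0], [5.2, 48.0, 42.0], [4.0, 54.0, 31.0],
              [5.6, 47.5, 45.0], [3.6, 56.0, 29.0], [5.0, 49.0, 40.0]])
y = np.array([0, 1, 0, 1, 0, 1])   # 0 = healthy, 1 = screened-positive

# standardise the three properties, then fit a linear-margin SVM
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(X, y)
print(clf.predict([[4.9, 49.5, 41.0]]))
```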

Keywords: blood, physical properties, diagnostic, nonlinear, classifier, device, surface tension, viscosity, wettability

Procedia PDF Downloads 363
2072 Infrastructure Development – Stages in Development

Authors: Seppo Sirkemaa

Abstract:

Information systems infrastructure is the basis of business systems and processes in a company. It should be a reliable platform for business processes and activities, yet also have the flexibility to accommodate changing business needs. Developing an infrastructure that is robust, reliable, and flexible is a challenge, and understanding technological capabilities and business needs is a key element in the development of successful information systems infrastructure.

Keywords: development, information technology, networks, technology

Procedia PDF Downloads 96
2071 Rhythm-Reading Success Using Conversational Solfege

Authors: Kelly Jo Hollingsworth

Abstract:

Conversational Solfege, a research-based, 12-step music literacy instructional method using the sound-before-sight approach, was used to teach rhythm-reading to 128 second-grade students at a public school in the southeastern United States. For each step, multiple scripted techniques are supplied to teach each skill. Unit one, which covers quarter-note and barred eighth-note rhythms, was the focus of this study. During regular weekly music instruction, students completed method steps one through five, which include aural discrimination, decoding familiar and unfamiliar rhythm patterns, and improvising rhythmic phrases using quarter notes and barred eighth notes. Intact classes were randomly assigned to two treatment groups for teaching steps six through eight, which cover the visual presentation and identification of quarter notes and barred eighth notes, visually presenting and decoding familiar patterns, and visually presenting and decoding unfamiliar patterns using said notation. For three weeks, students practiced steps six through eight during regular weekly music class. One group spent five minutes of class time on technique work for steps six through eight, while the other group spent ten minutes of class time practicing the same techniques. A pretest and posttest were administered, and ANOVA results reveal that both the five-minute (p < .001) and ten-minute groups (p < .001) reached statistical significance, suggesting Conversational Solfege is an efficient, effective approach to teach rhythm-reading to second-grade students. After two weeks of no instruction, students were retested to measure retention. Using a repeated-measures ANOVA, both groups reached statistical significance (p < .001) on the second posttest, suggesting both the five-minute and ten-minute groups retained rhythm-reading skill after two weeks of no instruction. Statistical significance was not reached between groups (p = .252), suggesting five minutes is as effective as ten minutes of rhythm-reading practice using Conversational Solfege techniques. Future research includes replicating the study with other grades and units in the text.

Keywords: conversational solfege, length of instructional time, rhythm-reading, rhythm instruction

Procedia PDF Downloads 143
2070 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review

Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha

Abstract:

Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. The use of natural language processing techniques for text classification and unbiased decision-making is not far-fetched, yet proper classification of this textual information in a given context has been very difficult. We therefore conducted a systematic review of previous literature on sentiment classification and the AI-based techniques used for it, in order to better understand how to design and develop a robust, more accurate sentiment classifier that can correctly distinguish, in a given context, hate speech from inverted compliments in social media text. We evaluated over 250 articles from digital sources such as ScienceDirect, ACM, Google Scholar, and IEEE Xplore and whittled the set down to 31 studies. The findings revealed that deep learning approaches such as CNN, RNN, BERT, and LSTM outperformed classical machine learning techniques in accuracy. A large dataset is also necessary for developing a robust sentiment classifier and can be obtained from sources such as Twitter, movie reviews, Kaggle, SST, and SemEval Task 4. Hybrid deep learning techniques such as CNN+LSTM, CNN+GRU, and CNN+BERT outperformed both single deep learning techniques and machine learning techniques. Python outperformed Java for sentiment analyzer development owing to its simplicity and AI library support. Based on these findings, we make recommendations for future research.
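
A minimal Keras sketch of one of the hybrid architectures the review highlights (CNN+LSTM); the vocabulary size, sequence length, and layer widths are arbitrary choices for illustration:

```python
from tensorflow.keras import layers, models

# hypothetical sizes: 20k-word vocabulary, 100-token padded sequences
model = models.Sequential([
    layers.Input(shape=(100,)),
    layers.Embedding(20000, 128),
    layers.Conv1D(64, 5, activation="relu"),  # CNN stage: local n-gram features
    layers.MaxPooling1D(2),
    layers.LSTM(64),                          # LSTM stage: long-range context
    layers.Dense(1, activation="sigmoid"),    # binary sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```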

Keywords: artificial intelligence, natural language processing, sentiment analysis, social network, text

Procedia PDF Downloads 100
2069 Development of a Robust Protein Classifier to Predict EMT Status of Cervical Squamous Cell Carcinoma and Endocervical Adenocarcinoma (CESC) Tumors

Authors: Zhenlin Ju, Christopher P. Vellano, Rehan Akbani, Yiling Lu, Gordon B. Mills

Abstract:

The epithelial-mesenchymal transition (EMT) is a process by which epithelial cells acquire mesenchymal characteristics, such as profound disruption of cell-cell junctions, loss of apical-basolateral polarity, and extensive reorganization of the actin cytoskeleton to induce cell motility and invasion. A hallmark of EMT is its capacity to promote metastasis, which is due in part to activation of several transcription factors and subsequent downregulation of E-cadherin. Unfortunately, current approaches have yet to uncover robust protein marker sets that can classify tumors as possessing strong EMT signatures. In this study, we utilize reverse phase protein array (RPPA) data and consensus clustering methods to successfully classify a subset of cervical squamous cell carcinoma and endocervical adenocarcinoma (CESC) tumors into an EMT protein signaling group (EMT group). The overall survival (OS) of patients in the EMT group is significantly worse than that of patients in the Hormone and PI3K/AKT signaling groups. Using a shrinkage and selection method for linear regression (LASSO) together with training/test-set and Monte Carlo resampling approaches, we identified a set of protein markers that predicts the EMT status of CESC tumors. We fit a logistic model to these protein markers and developed a classifier, which was fixed in the training set and validated in the testing set. The classifier robustly predicted the EMT status of the testing set with an area under the curve (AUC) of 0.975 by receiver operating characteristic (ROC) analysis. This method not only identifies a core set of proteins underlying an EMT signature in cervical cancer patients, but also provides a tool for examining protein predictors that drive molecular subtypes in other diseases.
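
A rough analogue of the LASSO-plus-ROC workflow in scikit-learn (synthetic features stand in for RPPA protein levels; the paper's consensus clustering and resampling scheme are not reproduced):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# synthetic stand-in for RPPA data: 200 proteins, few truly informative
X, y = make_classification(n_samples=300, n_features=200, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# L1 penalty (LASSO) shrinks most protein coefficients to exactly zero
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
clf.fit(X_tr, y_tr)
print("proteins retained:", int(np.sum(clf.coef_ != 0)))
print("AUC:", roc_auc_score(y_te, clf.decision_function(X_te)))
```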

Keywords: consensus clustering, TCGA CESC, Silhouette, Monte Carlo LASSO

Procedia PDF Downloads 447
2068 Comparison of Cyclone Design Methods for Removal of Fine Particles from Plasma Generated Syngas

Authors: Mareli Hattingh, I. Jaco Van der Walt, Frans B. Waanders

Abstract:

A waste-to-energy plasma system was designed by Necsa for commercial use to create electricity from unsorted municipal waste. Fly ash particles must be removed from the syngas stream at operating temperatures of 1000 °C and recycled back into the reactor for complete combustion. A 2D2D high-efficiency cyclone separator was chosen for this purpose. In this study, two cyclone design methods were explored: the Classic Empirical Method (smaller cyclone) and the Flow Characteristics Method (larger cyclone). These designs were optimized for efficiency, so as to remove at minimum 90% of the fly ash particles of average size 10 μm by 50 μm. Wood was used as the feed source at a concentration of 20 g/m³ of syngas. The two designs were then compared at room temperature, using Perspex test units and three feed gases of different densities, namely nitrogen, helium, and air. System conditions were imitated by adapting the gas feed velocity and particle load for each gas. Helium, the least dense of the three gases, simulates higher temperatures, whereas air, the densest gas, simulates a lower temperature. The average cyclone efficiencies ranged between 94.96% and 98.37%, reaching up to 99.89% in individual runs; the lowest efficiency attained was 94.00%. Furthermore, the design of the smaller cyclone proved to be more robust, while the larger cyclone demonstrated a stronger correlation between its separation efficiency and the feed temperature and can be assumed to achieve slightly higher efficiencies at elevated temperatures. Both design methods, however, led to good designs: at room temperature, the difference in efficiency between the two cyclones was almost negligible. At higher temperatures, these general tendencies are expected to be amplified, so that the difference between the two design methods will become more obvious. Though the design specifications were met by both designs, the smaller cyclone is recommended as the default particle separator for the plasma system due to its robustness.

Keywords: cyclone, design, plasma, renewable energy, solid separation, waste processing

Procedia PDF Downloads 196
2067 A Tale of Seven Districts: Reviewing the Past, Present and Future of Patent Litigation Filings to Form a Two-Step Burden-Shifting Framework for 28 U.S.C. § 1404(a)

Authors: Timothy T. Hsieh

Abstract:

Current patent venue transfer law under 28 U.S.C. § 1404(a), e.g., the Gilbert factors from Gulf Oil Corp. v. Gilbert, 330 U.S. 501 (1947), is too malleable, often leading to mandamus orders from the U.S. Court of Appeals for the Federal Circuit ("Federal Circuit") overturning district court rulings on venue transfer motions. This paper therefore proposes a more robust two-step burden-shifting framework to replace the eight Gilbert factors. It also covers a brief history of venue transfer patterns in the seven most active federal patent district courts, with special focus on the venue transfer orders of Judge Alan D. Albright of the U.S. District Court for the Western District of Texas. A comprehensive data summary of 45 case sets in which the Federal Circuit ruled on writs of mandamus involving Judge Albright's transfer orders is then provided, with coverage summaries of certain cases, including four precedential ones from the Federal Circuit. The proposed two-step burden-shifting framework is applied to these venue transfer cases, as well as to the Federal Circuit mandamus orders ruling on those decisions. Finally, alternative approaches to remedying the frequent reversals on venue transfer are discussed, including potential legislative solutions, adjustments to common-law approaches to venue transfer, deference to the inherent powers of Article III U.S. district judges, and a unified federal patent district court. Overall, this paper seeks to offer a more robust and consistent two-step burden-shifting framework for venue transfer and for the Federal Circuit to follow in administering mandamus orders, which might change somewhat in light of Western District of Texas Chief Judge Orlando Garcia's order redistributing Judge Albright's patent cases.

Keywords: patent law, venue, Judge Alan Albright, minimum contacts, Western District of Texas

Procedia PDF Downloads 83
2066 Habermas: A Unity of the Law and Democracy

Authors: Qi Jing

Abstract:

This paper examines and defends Jürgen Habermas's claim that law is the other side of democracy. For Habermas, law and democracy are related through the mediation of communicative rationality and discourse ethics. These ground a procedural conception of democracy, which legitimizes and rationalizes legal codes in a robust public sphere, linking the exercise of democratic political power to the form of law. The strengths of Habermas's approach lie, it is claimed, in its overcoming of relativism, its combination of democratically enacted law with post-conventional morality, and its correction of the one-sided emphases on private and public autonomy in Kant and Rousseau, respectively.

Keywords: Habermas, law, democracy, reason, public sphere

Procedia PDF Downloads 52
2065 Measuring Self-Regulation and Self-Direction in Flipped Classroom Learning

Authors: S. A. N. Danushka, T. A. Weerasinghe

Abstract:

The diverse necessities of instruction can be addressed effectively with the support of new dimensions of ICT-integrated learning, such as blended learning, a combination of face-to-face and online instruction that ensures greater flexibility in student learning and congruity of course delivery. As blended learning has become the 'new normality' in education, many experimental and quasi-experimental studies provide ample evidence of its successful implementation in many fields of study, but it is hard to justify whether blended learning can work similarly in the delivery of technology-teacher development programmes (TTDPs). The present study addresses this research uncertainty; having considered existing research approaches, the methodology was designed to identify efficient instructional strategies for flipped classroom learning in TTDPs. In a quasi-experimental pre-test and post-test design with a mixed-method research approach, the major study objective was tested with two heterogeneous samples (N = 135) identified in a virtual learning environment at a Sri Lankan university. A non-randomized informal 'before-and-after without control group' design was employed, and two data collection methods, identical pre- and post-tests and Likert-scale questionnaires, were used. Two selected instructional strategies, self-directed learning (SDL) and self-regulated learning (SRL), were tested in an appropriate instructional framework with two heterogeneous samples (pre-service and in-service teachers). Data were statistically analyzed, and the more efficient instructional strategy was determined via t-tests, ANOVA, and ANCOVA. The effectiveness of the two instructional-strategy implementation models was determined via multiple linear regression analysis. ANOVA (p < 0.05) shows that age, prior educational qualifications, gender, and work experience do not affect the learning achievements of the two diverse groups of learners when the instructional strategy is changed. ANCOVA (p < 0.05) shows that SDL is more efficient than SRL for the two diverse groups of technology-teachers. Multiple linear regression (p < 0.05) shows that the staged self-directed learning (SSDL) model and the four-phased model of motivated self-regulated learning (COPES model) are efficient in delivering course content in flipped classroom learning.

Keywords: COPES model, flipped classroom learning, self-directed learning, self-regulated learning, SSDL model

Procedia PDF Downloads 171
2062 Domestic LED Lighting Designs Using Internet of Things

Authors: Gouresh Singhal, Rajib Kumar Panigrahi

Abstract:

In this paper, we examine historical and technological changes in the lighting industry and propose a prototype technical solution at the block-diagram and circuit level. Untapped and upcoming technologies such as cloud computing and 6LoWPAN are further explored. The paper presents a robust, realistic hardware design, and a mobile application provides the last-mile user interface. The paper highlights the current challenges to be faced and concludes with a pragmatic view of the lighting industry.

Keywords: 6LoWPAN, internet of things, mobile application, LED

Procedia PDF Downloads 563
2063 Stabilization of Metastable Skyrmion Phase in Polycrystalline Chiral β-Mn Type Co₇Zn₇Mn₆ Alloy

Authors: Pardeep, Yugandhar Bitla, A. K. Patra, G. A. Basheed

Abstract:

Topologically protected, nanosized, particle-like swirling spin textures, 'skyrmions,' have been observed in various ferromagnets with chiral crystal structures such as MnSi, FeGe, and Cu₂OSeO₃ alloys; however, the magnetic ordering in these systems takes place at very low temperatures. For skyrmion-based spintronics devices, the skyrmion phase must be stabilized over a wide temperature-field (T-H) region. The equilibrium skyrmion phase (SkX) in Co₇Zn₇Mn₆ alloy exists in a narrow T-H region just below the transition temperature (TC ~ 215 K) and can be quenched by field cooling into a metastable skyrmion phase (MSkX) below the SkX region. To realize a robust MSkX at 110 K, field-sweep ac susceptibility χ(H) measurements were performed after zero-field-cooling (ZFC) and field-cooling (FC) processes. In the ZFC process, the sample was cooled from 320 K to 110 K in zero applied magnetic field, and a field-sweep measurement was then performed (up to 2 T) in the positive direction (black curve). The real part of the ac susceptibility, χ′(H), at 110 K in the positive field direction after ZFC confirms the helical-to-conical phase transition at a low field HC₁ (= 42 mT) and the conical-to-ferromagnetic (FM) transition at a higher field HC₂ (= 300 mT). After ZFC, FC measurements were performed: the sample was first cooled in zero field from 320 K to 206 K and then field-cooled in a 15 mT field down to 110 K. After the FC process, isothermal χ(H) was measured in the positive (+H, red curve) and negative (-H, blue curve) field directions with increasing and decreasing field up to 2 T. The hysteresis in χ′(H) measured after the ZFC and FC processes indicates the stabilization of the MSkX at 110 K, in close agreement with the literature. The asymmetry between the field-increasing curves measured in the two directions after FC also confirms the stabilization of the MSkX. On returning from the high-field polarized FM state, the helical state below HC₁ is destroyed and only the conical state is observed. Thus, the robust MSkX state is stabilized below its SkX phase over a much wider T-H region by field cooling in polycrystalline Co₇Zn₇Mn₆ alloy.

Keywords: skyrmions, magnetic susceptibility, metastable phases, topological phases

Procedia PDF Downloads 92
2062 Robust Numerical Solution for Flow Problems

Authors: Gregor Kosec

Abstract:

A simple and robust numerical approach for solving flow problems is presented, in which the physical fields involved are represented through local approximation functions, i.e., each field is approximated over a local support domain. The approximation functions are then used to evaluate the partial differential operators. The type of approximation, the size of the support domain, and the type and number of basis functions can be general. The solution procedure is formulated entirely through local computational operations: besides the local numerical method itself, the pressure-velocity coupling is also performed locally, while retaining the correct temporal transient. The complete locality of the introduced numerical scheme has several beneficial effects. One of the most attractive is simplicity, since the method can be understood as a generalized finite difference method, though a much more powerful one. The presented methodology offers many possibilities for treating challenging cases, e.g., nodal adaptivity to address regions with sharp discontinuities, or p-adaptivity to treat obscure anomalies in the physical field. The trade-off between stability, computational complexity, and accuracy can be regulated by changing the number of support nodes, etc., and all these features can be controlled on the fly during the simulation. The presented methodology is relatively simple to understand and implement, which makes it a potentially powerful tool for engineering simulations. Besides simplicity and straightforward implementation, there are many opportunities to fully exploit modern computer architectures through different parallel computing strategies. The performance of the method is presented on the lid-driven cavity problem, the backward-facing step problem, and the de Vahl Davis natural convection test, extended also to a low-Prandtl fluid and Darcy porous flow. Results are presented in terms of velocity profiles, convergence plots, and stability analyses, and results of all cases are compared against published data.
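
As a sketch of the core local operation (evaluating a differential operator from scattered neighbors by a local polynomial least-squares fit, in the generalized-finite-differences spirit the abstract invokes); the monomial basis, the neighborhood, and the unweighted fit are simplifying assumptions:

```python
import numpy as np

def laplace_weights(center, neighbors):
    # local least-squares fit of a quadratic monomial basis
    # (generalized finite differences); returns nodal weights whose dot
    # product with nodal values approximates the Laplacian at `center`
    dx = neighbors - center                     # local coordinates, (n, 2)
    A = np.column_stack([np.ones(len(dx)), dx[:, 0], dx[:, 1],
                         dx[:, 0]**2, dx[:, 0]*dx[:, 1], dx[:, 1]**2])
    Ainv = np.linalg.pinv(A)                    # values -> polynomial coeffs
    return 2.0 * Ainv[3] + 2.0 * Ainv[5]        # Laplacian of basis at center

# sanity check: the five-point stencil recovers the classical FD weights
center = np.array([0.0, 0.0])
neigh = np.array([[0, 0], [1, 0], [-1, 0], [0, 1], [0, -1]], dtype=float)
print(laplace_weights(center, neigh))           # ~[-4, 1, 1, 1, 1]
```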

Keywords: fluid flow, meshless, low Pr problem, natural convection

Procedia PDF Downloads 215
2061 Hybridization of Mathematical Transforms for Robust Video Watermarking Technique

Authors: Harpal Singh, Sakshi Batra

Abstract:

Widespread, easy access to multimedia content and the possibility of making numerous copies without significant loss of fidelity have raised the need for digital rights management. This problem can be effectively addressed by digital watermarking technology: the concept of embedding data or a special pattern (the watermark) in multimedia content, so that the information can later prove ownership in case of a dispute, trace the marked document's dissemination, identify a misappropriating person, or simply inform the user about the rights-holder. The primary motive of digital watermarking is to embed the data imperceptibly and robustly in the host information. Numerous watermarking techniques have been developed to embed copyright marks or data in digital images, video, audio, and other multimedia objects. With the development of digital video-based innovations, the copyright dilemma for the multimedia industry grows. Video watermarking has been proposed in recent years to address the illicit copying and distribution of videos; it is the process of embedding copyright information in video bit streams. In practice, video watermarking schemes face serious challenges compared with image watermarking: real-time requirements in video broadcasting, the large volume of inherently redundant data between frames, the imbalance between moving and motionless regions, etc., and they are particularly vulnerable to attacks such as frame swapping, statistical analysis, rotation, noise, median filtering, and cropping. In this paper, an effective, robust, and imperceptible video watermarking algorithm is proposed, based on the hybridization of powerful mathematical transforms: the fractional Fourier transform (FrFT), the discrete wavelet transform (DWT), and singular value decomposition (SVD) using a redundant wavelet. This scheme utilizes the various transforms for embedding watermarks on different layers through hybrid systems. For this purpose, the video frames are partitioned into layers (RGB), and the watermark is embedded in two forms using SVD partitioning of the watermark and DWT sub-band decomposition of the host video, to provide copyright protection as well as reliability. The FrFT orders are used as the encryption key, which makes the watermarking method more robust against various attacks. The fidelity of the scheme is enhanced by introducing key generation and a wavelet-based key-embedding scheme; the same key is required for watermark embedding and extraction and must therefore be shared between the owner and the verifier over a secure channel. Performance is demonstrated using different quality metrics, namely peak signal-to-noise ratio, structural similarity index, and correlation values, and several attacks are applied to prove robustness. The experimental results demonstrate that the proposed scheme can withstand a variety of video-processing attacks while remaining imperceptible.
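
A minimal sketch of the DWT+SVD embedding step only (PyWavelets plus NumPy); the FrFT encryption stage, the RGB layering, and the key scheme of the proposed method are not reproduced, and the wavelet and strength alpha are assumptions:

```python
import numpy as np
import pywt

def svd_embed(host, mark, alpha=0.1):
    # DWT of the host frame, then additive embedding in the singular values
    # of the LL band; `mark` is assumed to match the LL band's shape
    ll, details = pywt.dwt2(host, "haar")
    U, S, Vt = np.linalg.svd(ll, full_matrices=False)
    Uw, Sw, Vtw = np.linalg.svd(mark, full_matrices=False)
    ll_marked = (U * (S + alpha * Sw)) @ Vt
    key = (S, Uw, Vtw)            # side information needed for extraction
    return pywt.idwt2((ll_marked, details), "haar"), key

rng = np.random.default_rng(1)
host = rng.random((128, 128))
mark = rng.random((64, 64))       # same shape as the LL band of the host
marked, key = svd_embed(host, mark)
print(np.abs(marked - host).max())   # inspect the embedding perturbation
```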

Keywords: discrete wavelet transform, robustness, video watermarking, watermark

Procedia PDF Downloads 212
2060 Hierarchical Clustering Algorithms in Data Mining

Authors: Z. Abdullah, A. R. Hamdan

Abstract:

Clustering is a process of grouping objects and data into clusters so that data objects from the same cluster are similar to each other. Clustering is one of the areas of data mining, and clustering algorithms can be classified as partitional, hierarchical, density-based, or grid-based. In this paper, we therefore survey and review four major hierarchical clustering algorithms: CURE, ROCK, CHAMELEON, and BIRCH. The resulting state of the art of these algorithms will help in eliminating current problems, as well as in deriving more robust and scalable clustering algorithms.
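
Of the four algorithms surveyed, BIRCH has a stock implementation in scikit-learn, which makes the flavor of hierarchical clustering easy to demonstrate:

```python
import numpy as np
from sklearn.cluster import Birch
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=1000, centers=4, random_state=0)
# BIRCH summarises the data into a CF-tree in a single pass, then
# clusters the leaf entries -- hence its scalability to large datasets
model = Birch(threshold=0.5, n_clusters=4)
labels = model.fit_predict(X)
print(np.bincount(labels))        # cluster sizes
```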

Keywords: clustering, unsupervised learning, algorithms, hierarchical

Procedia PDF Downloads 862
2059 Clinician's Perspective of Common Factors of Change in Family Therapy: A Cross-National Exploration

Authors: Hassan Karimi, Fred Piercy, Ruoxi Chen, Ana L. Jaramillo-Sierra, Wei-Ning Chang, Manjushree Palit, Catherine Martosudarmo, Angelito Antonio

Abstract:

Background: The two psychotherapy camps, randomized clinical trials (RCTs) and the common factors model, have competitively claimed specific explanations for therapy effectiveness. Recently, scholars have called for empirical evidence showing the role of common factors in therapeutic outcomes in marriage and family therapy. Purpose: This cross-national study aims to explore how clinicians, across different nations and theoretical orientations, attribute the contribution of common factors to therapy outcome. Method: A brief common factors questionnaire (CFQ; Cronbach's alpha = 0.77) was developed and administered in seven nations. A series of statistical analyses (paired-samples t-test, independent-samples t-test, ANOVA) was conducted to compare clinicians' perceived contributions of total common factors versus model-specific factors, to compare each pair of common-factor categories, and to compare clinicians from collectivistic nations with clinicians from an individualistic nation. Results: Clinicians across the seven nations attributed 86% of therapeutic change to common factors versus 14% to model-specific factors; specifically, 34% to client factors, 26% to therapist factors, 26% to relationship factors, and 14% to model-specific techniques. The ANOVA indicated that each of the three categories of common factors (client 34%, therapist 26%, relationship 26%) showed a higher contribution to therapeutic outcome than the category of model-specific factors (techniques 14%). Clinicians with psychology degrees attributed more of the contribution to model-specific factors, whereas clinicians with MFT and counseling degrees attributed more to client factors. Clinicians from collectivistic nations attributed larger contributions to therapist factors (M = 28.96, SD = 12.75) than the US clinicians (M = 23.22, SD = 7.73), while the US clinicians attributed a larger contribution to client factors (M = 39.02, SD = 15.04) than clinicians from the collectivistic nations (M = 28.71, SD = 15.74). Conclusion: The findings indicate that clinicians across the globe attributed more than two-thirds of therapeutic change to common factors (CFs), underscoring the importance of training in the common factors model in the field. CFs, like model-specific factors, vary in their contribution to therapy outcome in relation to the specific client, therapist, problem, treatment model, and sociocultural context. Sociocultural expectations and norms should be considered as a context in which both CFs and model-specific factors function toward therapeutic goals. Clinicians need to foster cultural competency, specifically regarding the divergent ways that CFs can be activated by specific sociocultural values.

Keywords: common factors, model-specific factors, cross-national survey, therapist cultural competency, enhancing therapist efficacy

Procedia PDF Downloads 271
2058 Robust Decision Support Framework for Addressing Uncertainties in Water Resources Management in the Mekong

Authors: Chusit Apirumanekul, Chayanis Krittasudthacheewa, Ratchapat Ratanavaraha, Yanyong Inmuong

Abstract:

Rapid economic development in the Lower Mekong region is leading to changes in water quantity and quality. Changes in land and forest use, infrastructure development, increasing urbanization, migration patterns, and climate risks are increasing demands for water within various sectors, placing pressure on scarce water resources, so appropriate policies, strategies, and planning are urgently needed for improved water resource management. Over the last decade, Thailand has experienced more frequent and intense droughts, with low water storage in reservoirs and insufficient water allocation for agriculture during the dry season. The Huay Saibat River Basin, one of the well-known water-scarce areas in northeastern Thailand, is experiencing ongoing water scarcity that affects both farming livelihoods and household consumption. Drought management in Thailand mainly focuses on emergency responses rather than advance preparation and mitigation for long-term solutions, and despite many efforts by local authorities to mitigate drought, there is as yet no long-term comprehensive water management strategy that integrates climate risks alongside other uncertainties. This paper assesses the application of the Robust Decision Support framework in the Huay Saibat River Basin to explore the feasibility of multiple drought-management policies, including shifts in cropping season, crop changes, infrastructure operations, and groundwater use, under a wide range of uncertainties, including climate and land-use change. A series of consultative meetings was organized with relevant agencies and experts at the local level to understand and explore plausible water resources strategies and to identify thresholds for evaluating the performance of those strategies. Three different climate conditions were identified (dry, normal, and wet). Other non-climatic factors influencing water allocation were further identified, including changes from sugarcane to rubber, delayed rice planting, increased natural retention storage, and the use of groundwater to supply household consumption and small-scale gardening. Water allocation and water use in various sectors, such as agriculture, domestic, industry, and the environment, were estimated using the Water Evaluation And Planning (WEAP) system under various scenarios developed from combinations of the climatic and non-climatic factors mentioned earlier. Water coverage (i.e., the percentage of water demand successfully supplied) was defined as the threshold for water resource strategy assessment, and thresholds for the different sectors (agriculture, domestic, industry, and environment) were specified during multi-stakeholder engagements. Plausible water strategies (e.g., increasing natural retention storage, changing crop type, and using groundwater as an alternative source) were evaluated against the specified thresholds in the four sectors under the three climate conditions, with 'business as usual' evaluated for comparison. A strategy is considered robust when its performance is assessed as successful under a wide range of uncertainties across the river basin. Without adopting any strategy, the water scarcity situation is likely to escalate in the future. Among the strategies identified, the use of groundwater as an alternative source was considered a potential option for combating water scarcity in the basin. Further studies are needed to explore the feasibility of groundwater use as a sustainable source.
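
As a toy illustration of the threshold test described (coverage, the percentage of demand supplied, compared per sector against stakeholder thresholds across the climate scenarios); all numbers below are invented placeholders, not WEAP outputs:

```python
# coverage = % of demand met; a strategy is "robust" when coverage stays
# above the sector threshold in every climate scenario (values invented)
thresholds = {"agriculture": 70, "domestic": 95,
              "industry": 90, "environment": 80}
coverage = {                      # sector -> coverage under (dry, normal, wet)
    "agriculture": (62, 78, 91),
    "domestic":    (96, 99, 100),
    "industry":    (88, 95, 99),
    "environment": (75, 85, 94),
}
for sector, runs in coverage.items():
    robust = all(c >= thresholds[sector] for c in runs)
    print(f"{sector:12s} robust across scenarios: {robust}")
```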

Keywords: climate change, robust decision support, scenarios, water resources management

Procedia PDF Downloads 153