Search results for: discriminate analysis

27020 Probing Language Models for Multiple Linguistic Information

Authors: Bowen Ding, Yihao Kuang

Abstract:

In recent years, large-scale pre-trained language models have achieved state-of-the-art performance on a variety of natural language processing tasks. The word vectors produced by these language models can be viewed as dense encoded representations of natural language text. However, it is unknown how much linguistic information is encoded in them, or how. In this paper, we construct probing tasks for multiple types of linguistic information to clarify the encoding capabilities of different language models, and we visualize the results. We first obtain word representations in vector form from different language models, including BERT, ELMo, RoBERTa, and GPT. Classifiers with a small number of parameters, together with unsupervised tasks, are then applied to these word vectors to assess their capacity to encode the corresponding linguistic information. The constructed probing tasks cover both semantic and syntactic aspects. The semantic aspect includes the ability of the model to understand semantic entities such as numbers, time, and characters, and the syntactic aspect includes the ability of the language model to understand grammatical structures such as dependency relations and coreference relations. We also compare the encoding capabilities of different layers within the same language model to infer how linguistic information is encoded in the model.
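
As an illustration of the probing setup described in this abstract, the sketch below trains a small linear classifier on frozen word vectors to predict a linguistic label. It is a minimal, hypothetical example: the embeddings, dimensionality, and label are random placeholders, not the authors' actual models or data.

```python
# Minimal probing-classifier sketch (assumption: vectors are pre-extracted from a
# frozen language model such as BERT; random data stands in for real embeddings).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 768))      # placeholder for per-token embeddings
y = rng.integers(0, 2, size=2000)     # placeholder linguistic label (e.g., "is a number")

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# A probe with few parameters: a single linear layer (logistic regression).
probe = LogisticRegression(max_iter=1000)
probe.fit(X_tr, y_tr)
print("probing accuracy:", accuracy_score(y_te, probe.predict(X_te)))
```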

Keywords: language models, probing task, text representation, linguistic information

Procedia PDF Downloads 82
27019 Correlations and Impacts of Optimal Rearing Parameters on Nutritional Value of Mealworm (Tenebrio molitor)

Authors: Fabienne Vozy, Anick Lepage

Abstract:

Insects display high nutritional value, low greenhouse gas emissions, low land use requirements, and high food conversion efficiency. They can contribute to the food chain and be one of many solutions to protein shortages. Currently, nutritional entomology is under-developed in North America, and a better understanding of its benefits is still needed to convince large-scale producers and consumers (for both human and agricultural needs). As such, large-scale production of mealworms offers a promising alternative to traditional sources of protein and fatty acids. To proceed in an orderly manner, more data on the nutritional value of insects are required, namely to: a) evaluate insect diets to improve their dietary value; b) test breeding conditions to optimize yields; and c) evaluate the use of by-products and organic residues as food sources. Among the featured technical parameters, relative humidity (RH) percentage and temperature, optimal substrates, and hydration sources are critical elements, thus establishing potential benchmarks to optimize conversion rates of protein and fatty acids. This research aims to establish the combination of the most influential rearing parameters with local food residues and to correlate the findings with the nutritional value of the harvested larvae. For each replicate, 125 adults of the same monthly age cohort are randomly selected from the mealworm breeding pool and placed to oviposit in growth chambers preset at 26°C and 65% RH. Adults are removed after 7 days. Larvae are harvested upon the appearance of the first signs of nymphosis, and batches are analyzed for their nutritional value using wet chemistry analysis. The first sample analyses include the total weight of both fresh and dried larvae, residual humidity, crude protein (CP%), and crude fat (CF%). Further analyses are scheduled to include soluble proteins and fatty acids. Although consistent with previously published data, the preliminary results show no significant differences between treatments for any type of analysis. The nutritional properties of each substrate combination have not yet allowed the most effective residue recipe to be identified. Technical issues such as the particle size of the various substrate combinations and its compatibility with larval screening are to be investigated, since they induced a variable percentage of larvae lost at harvesting. Addressing these methodological issues is key to developing a standardized, efficient procedure. The aim is to provide producers with easily reproducible conditions without incurring excessive additional expenditure on equipment and workforce.

Keywords: entomophagy, nutritional value, rearing parameters optimization, Tenebrio molitor

Procedia PDF Downloads 91
27018 Biopsy or Biomarkers: Which Is the Sample of Choice in Assessment of Liver Fibrosis?

Authors: S. H. Atef, N. H. Mahmoud, S. Abdrahman, A. Fattoh

Abstract:

Background: The aim of this study is to assess the diagnostic value of FibroTest and hyaluronic acid in discriminating between insignificant and significant fibrosis, and to find out whether these parameters could replace liver biopsy, which is currently used to select chronic hepatitis C patients eligible for antiviral therapy. Study design: This study was conducted on 52 patients with HCV RNA detected by polymerase chain reaction (PCR) who had undergone liver biopsy and were attending the internal medicine clinic at Ain Shams University Hospital. Liver fibrosis was evaluated according to the METAVIR scoring system on a scale of F0 to F4. The biochemical markers assessed were alpha-2 macroglobulin (α2-MG), apolipoprotein A1 (Apo-A1), haptoglobin, gamma-glutamyl transferase (GGT), total bilirubin (TB), and hyaluronic acid (HA). The FibroTest score was computed after adjusting for age and gender. Predictive values and ROC curves were used to assess the accuracy of the FibroTest and HA results. Results: For FibroTest, the observed area under the curve for discriminating between minimal or no fibrosis (F0-F1) and significant fibrosis (F2-F4) was 0.6736 at a cutoff value of 0.19, with a sensitivity of 84.2% and a specificity of 85.7%. For HA, the sensitivity was 89.5%, the specificity 85.7%, and the area under the curve 0.540 at the best cutoff value of 71 mg/dL. Combined use of both parameters, HA at 71 mg/dL with a FibroTest score of 0.22, gave a sensitivity of 89.5%, a specificity of 100%, and an efficacy of 92.3% (AUC 0.895). Conclusion: The combined use of the FibroTest score and HA could serve as an alternative to biopsy in most patients with chronic hepatitis C, taking into consideration some limitations of the proposed markers in evaluating liver fibrosis.
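
To make the ROC evaluation described here concrete, the sketch below computes an AUC plus sensitivity and specificity at a fixed cutoff. The marker values are synthetic placeholders, not the study data; only the cutoff value 0.19 is borrowed from the abstract for illustration.

```python
# Minimal sketch of the ROC evaluation described above, using synthetic marker
# values (not the study data); 1 = significant fibrosis (F2-F4), 0 = F0-F1.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
y = np.repeat([0, 1], [26, 26])                        # fibrosis group labels
marker = rng.normal(loc=0.30 + 0.20 * y, scale=0.15)   # e.g., a FibroTest-like score

auc = roc_auc_score(y, marker)

cutoff = 0.19                                          # example cutoff from the abstract
pred = (marker >= cutoff).astype(int)
sensitivity = ((pred == 1) & (y == 1)).sum() / (y == 1).sum()
specificity = ((pred == 0) & (y == 0)).sum() / (y == 0).sum()
print(f"AUC={auc:.3f}  sensitivity={sensitivity:.1%}  specificity={specificity:.1%}")
```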

Keywords: fibrotest, liver fibrosis, HCV RNA, biochemical markers

Procedia PDF Downloads 266
27017 Application of Subversion Analysis in the Search for the Causes of Cracking in a Marine Engine Injector Nozzle

Authors: Leszek Chybowski, Artur Bejger, Katarzyna Gawdzińska

Abstract:

Subversion analysis is a tool used in the TRIZ (Theory of Inventive Problem Solving) methodology. This article introduces its history and describes the process of subversion analysis, as well as the function analysis and resource analysis used at the design stage when generating possible undesirable situations. The article charts the course of subversion analysis when applied to the fuel injection nozzle of a marine engine. The work describes the fuel injector nozzle as a technological system and presents the principles of analyzing the causes of a cracked tip of the nozzle body. The system is modelled with functional analysis. A search for potential causes of the damage is undertaken, and a cause-and-effect analysis for various hypotheses concerning the damage is drawn up. The importance of particular hypotheses is evaluated and the most likely causes of damage are identified.

Keywords: complex technical system, fuel injector, function analysis, importance analysis, resource analysis, sabotage analysis, subversion analysis, TRIZ (Theory of Inventive Problem Solving)

Procedia PDF Downloads 594
27016 Comparative Diagnostic Performance of Diffusion-Weighted Imaging Combined With Microcalcifications on Mammography for Discriminating Malignant From Benign BI-RADS 4 Lesions With the Kaiser Score

Authors: Wangxu Xia

Abstract:

BACKGROUND: BI-RADS 4 lesions raise the possibility of malignancy and warrant further clinical and radiologic work-up. This study aimed to evaluate the performance of diffusion-weighted imaging (DWI) and microcalcifications on mammography for predicting malignancy of BI-RADS 4 lesions. In addition, the predictive performance of DWI combined with microcalcifications was compared with the Kaiser score. METHODS: Between January 2021 and June 2023, 144 patients with 178 BI-RADS 4 lesions who underwent conventional MRI, DWI, and mammography were included. The lesions were dichotomized into benign or malignant according to the pathological results from core needle biopsy or surgical mastectomy. DWI was performed with b values of 0 and 800 s/mm² and analyzed using the apparent diffusion coefficient, and a Kaiser score > 4 was considered to suggest malignancy. The diagnostic performance of the various tests was evaluated with the receiver operating characteristic (ROC) curve. RESULTS: The area under the curve (AUC) for DWI was significantly higher than that of mammography (0.86 vs 0.71, P<0.001) but comparable with that of the Kaiser score (0.86 vs 0.84, P=0.58). However, the AUC for DWI combined with mammography was significantly higher than that of the Kaiser score (0.93 vs 0.84, P=0.007). The sensitivity for discriminating malignant from benign BI-RADS 4 lesions was highest at 89% for the Kaiser score, but the highest specificity of 83% was achieved with DWI combined with mammography. CONCLUSION: DWI combined with microcalcifications on mammography could discriminate malignant BI-RADS 4 lesions from benign ones with a high AUC and specificity. However, the Kaiser score had better sensitivity for discrimination.
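
The sketch below illustrates one generic way two findings (a DWI-derived ADC value and a mammographic microcalcification flag) can be fused into a single score and their AUCs compared. It uses simulated values and a plain logistic regression as an assumed combination rule; it is not the authors' model or data.

```python
# Combining two predictors into one score and comparing AUCs (simulated data;
# the logistic-regression combination is an illustrative assumption).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 178
malignant = rng.integers(0, 2, size=n)
adc = rng.normal(1.3 - 0.3 * malignant, 0.2)              # assumed: lower ADC if malignant
calcifications = rng.binomial(1, 0.2 + 0.4 * malignant)   # assumed calcification prevalence

X = np.column_stack([adc, calcifications])
combined = LogisticRegression().fit(X, malignant).predict_proba(X)[:, 1]

print("AUC, ADC alone:      ", round(roc_auc_score(malignant, -adc), 3))
print("AUC, calcifications: ", round(roc_auc_score(malignant, calcifications), 3))
print("AUC, combined model: ", round(roc_auc_score(malignant, combined), 3))
```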

Keywords: MRI, DWI, mammography, breast disease

Procedia PDF Downloads 37
27015 SPPO-Based Cation Exchange Membranes with a Positively Charged Layer for Cation Fractionation

Authors: Noor Ul Afsar, Wengen Ji, Bin Wu, Muhammad A. Shehzad, Liang Ge, Tongwen Xu

Abstract:

The synthesis of monovalent cation perm-selective membranes (MCPMs) that efficiently discriminate amongst cations from seawater is of great importance for several industrial applications. However, a technical approach is still needed to construct MCPMs that achieve high ionic flux and sustained perm-selectivity simultaneously. In the present work, the thickness of the quaternized poly(2,6-dimethyl-1,4-phenylene oxide) (QPPO) layer on the surface of the SPPO-PVA (SPVA) composite membrane was adjusted using a facile procedure to achieve high perm-selectivity without sacrificing ionic flux. The thickness of the selective layer was precisely controlled using various concentrations of the QPPO solution. By introducing the cationic layer on the SPVA membrane, monovalent cations can be separated from divalent cations through their difference in charge density. The selective barrier thickness endows the MCPMs with a high perm-selectivity of up to 12.7 for a 0.1 mol L⁻¹ Li⁺/Mg²⁺ system, which is very satisfactory for polymeric membranes. The fabricated membranes have low electrical resistance and a high limiting current density (iₗᵢₘ). In view of the electrodialysis (ED) results, the prepared membranes with selective surface layers could be viable candidates for Li⁺-selective separation from the divalent cation Mg²⁺.
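
For context, the perm-selectivity figure quoted above (12.7 for the Li⁺/Mg²⁺ system) is conventionally computed from the ion fluxes and feed concentrations measured during electrodialysis. The expression below is the commonly used definition, given here as an assumed convention rather than taken from the paper itself.

```latex
P^{\mathrm{Li^{+}}}_{\mathrm{Mg^{2+}}} \;=\;
\frac{J_{\mathrm{Li^{+}}} / J_{\mathrm{Mg^{2+}}}}{C_{\mathrm{Li^{+}}} / C_{\mathrm{Mg^{2+}}}}
```

Here J denotes the measured flux of each cation through the membrane and C its concentration in the feed solution.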

Keywords: monovalent cation perm-selective membranes, cation fractionation, perm-selectivity, ionic flux, electrodialysis

Procedia PDF Downloads 44
27014 Estimating the Receiver Operating Characteristic Curve from Clustered Data and Case-Control Studies

Authors: Yalda Zarnegarnia, Shari Messinger

Abstract:

Receiver operating characteristic (ROC) curves have been widely used in medical research to illustrate the performance of a biomarker in correctly distinguishing diseased from non-diseased groups. Correlated biomarker data arise in study designs that include subjects who share genetic or environmental factors. Information about this correlation might help to identify family members at increased risk of disease development and may lead to initiating treatment to slow or stop progression to disease. Approaches appropriate to a case-control design matched by family identification must accommodate the correlation inherent in the design when estimating the biomarker's ability to differentiate between cases and controls, as well as handle estimation from a matched case-control design. This talk will review methods developed for ROC curve estimation in settings with correlated data from case-control designs and will discuss the limitations of current methods for analyzing correlated familial paired data. An alternative approach using conditional ROC curves will be demonstrated to provide appropriate ROC curves for correlated paired data. The proposed approach will use the information about the correlation among biomarker values, producing conditional ROC curves that evaluate the ability of a biomarker to discriminate between diseased and non-diseased subjects in a familial paired design.

Keywords: biomarker, correlation, familial paired design, ROC curve

Procedia PDF Downloads 215
27013 Production of Camel Nanobodies against Morphine-3-Glucuronide for the Development of a Biosensor for Detecting Illicit Drugs

Authors: Shirin Jalili, Sadegh Hasannia, Hadi Shirzad, Afshin Khara

Abstract:

Morphine is one of the most medicinally important analgesics and narcotics. Structurally, it is classified as an alkaloid because of the presence of nitrogen, and its structure is similar to that of codeine, thebaine, and heroin. An immunoassay that accurately discriminates between these analogous alkaloids would be highly beneficial. A key factor for such an assay is specificity with high sensitivity, which depends entirely on the antibody employed. However, most antibodies against haptens are polyclonal serum antibodies that exhibit significant cross-reactivities with closely related compounds. Camel-derived single-domain antibody fragments (VHH) are the smallest molecules with antigen-binding capacity and possess unique properties compared with conventional antibodies. In this study, a library containing the VHH genes of a camel immunized with morphine-conjugated BSA was generated using phage display technology. By screening this camel-derived heavy-chain variable region cDNA phage display library for the ability to bind the desired hapten, we obtained several nanobodies that recognize it. Phage display expression of the nanobodies from this library and panning against this hapten resulted in clear enrichment of four distinct nanobody-displaying phages with specificity for morphine, which could serve as a starting point for new strategies for developing a biosensor for detecting this illicit drug.

Keywords: phage display, nanobody, morphine-3-glucuronide, ELISA, biosensor

Procedia PDF Downloads 404
27012 Investigation of the EEG Signal Parameters during Epileptic Seizure Phases in Consequence to the Application of External Healing Therapy on Subjects

Authors: Karan Sharma, Ajay Kumar

Abstract:

An epileptic seizure is a condition in which electrical charge in the brain flows abruptly, resulting in abnormal activity by the subject. One percent of the total world population experiences epileptic seizure attacks. Due to the abrupt flow of charge, EEG (electroencephalogram) waveforms change, and many spikes and sharp waves appear in the EEG signals. Detection of epileptic seizures using conventional methods is time-consuming, and many methods have evolved to detect them automatically. The initial part of this paper reviews techniques used to detect epileptic seizures automatically. Automatic detection is based on feature extraction and classification, and for better accuracy, decomposition of the signal is required before feature extraction. A number of parameters are calculated by researchers using different techniques, e.g., approximate entropy, sample entropy, fuzzy approximate entropy, intrinsic mode functions, and cross-correlation, to discriminate between a normal signal and an epileptic seizure signal. The main objective of this review is to present the variations in the EEG signals at both stages, (i) interictal (recorded between epileptic seizure attacks) and (ii) ictal (recorded during the epileptic seizure), using the most appropriate methods of analysis to provide better healthcare diagnosis. This paper then investigates the effects of a noninvasive healing therapy on subjects by studying their EEG signals using the latest signal processing techniques. The study has been conducted with Reiki as the healing technique, beneficial for restoring balance in cases of body-mind alterations associated with an epileptic seizure. Reiki is practiced around the world and is recommended in different health services as a treatment approach. Reiki is an energy medicine, specifically a biofield therapy developed in Japan in the early 20th century. It is a system involving the laying on of hands to stimulate the body's natural energetic system. Earlier studies have shown an apparent connection between Reiki and the autonomic nervous system. The Reiki sessions are applied by an experienced therapist. EEG signals are measured at baseline, during the session, and post-intervention to bring about effective epileptic seizure control or its elimination altogether.
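
One of the discriminative features listed above is sample entropy. The naive sketch below computes it for a single synthetic channel; the signal, parameters (m=2, r=0.2·std), and the "spiky" example are illustrative assumptions, not patient recordings or the reviewed papers' exact settings.

```python
# Naive sample-entropy sketch for one EEG-like channel (synthetic signal).
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn(m, r) = -ln(A/B), with tolerance r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    N = len(x)

    def count_matches(length):
        # Count template pairs of the given length whose Chebyshev distance <= r.
        templates = np.array([x[i:i + length] for i in range(N - length)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(3)
t = np.arange(0, 4, 1 / 256)                                   # 4 s at 256 Hz
background = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)
spiky = background + 2.0 * (rng.random(t.size) < 0.02)         # sporadic spike-like events

print("SampEn (background):", round(sample_entropy(background), 3))
print("SampEn (spiky):     ", round(sample_entropy(spiky), 3))
```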

Keywords: EEG signal, Reiki, time consuming, epileptic seizure

Procedia PDF Downloads 384
27011 Prospective Validation of the FibroTest Score in Assessing Liver Fibrosis in Hepatitis C Infection with Genotype 4

Authors: G. Shiha, S. Seif, W. Samir, K. Zalata

Abstract:

FibroTest (FT) is a non-invasive score of liver fibrosis that combines the quantitative results of five serum biochemical markers (alpha-2-macroglobulin, haptoglobin, apolipoprotein A1, gamma-glutamyl transpeptidase (GGT), and bilirubin), adjusted for the patient's age and sex in a patented algorithm, to generate a measure of fibrosis. FT has been validated in patients with chronic hepatitis C (CHC) (Halfon et al., Gastroenterol. Clin. Biol. (2008), 32, 6 suppl 1, 22-39). The validation of FT in genotype 4 has not been well studied. Our aim was to evaluate the performance of FibroTest in an independent prospective cohort of hepatitis C patients with genotype 4. The subjects were 122 patients with CHC. All liver biopsies were scored using the METAVIR system. The FT score was measured, and the performance of its cutoff was assessed using a ROC curve. Among patients with advanced fibrosis, the FT matched the liver biopsy exactly in 18.6% of cases, overestimated the stage of fibrosis in 44.2%, and underestimated it in 37.7%. In patients with no/mild fibrosis, an exact match was detected in 39.2% of cases, with overestimation in 48.1% and underestimation in 12.7%. Overall, the test showed exact matching, overestimation, and underestimation in 32%, 46.7%, and 21.3% of cases, respectively. Using the ROC curve, it was found that FT at a cutoff point of 0.555 could discriminate early from advanced stages of fibrosis with an area under the ROC curve (AUC) of 0.72, a sensitivity of 65%, a specificity of 69%, a PPV of 68%, an NPV of 66%, and an accuracy of 67%. As the FibroTest score overestimates the stage of advanced fibrosis, it should not be considered a reliable surrogate for liver biopsy in hepatitis C infection with genotype 4.

Keywords: fibrotest, chronic Hepatitis C, genotype 4, liver biopsy

Procedia PDF Downloads 393
27010 A Real-Time Snore Detector Using Neural Networks and Selected Sound Features

Authors: Stelios A. Mitilineos, Nicolas-Alexander Tatlas, Georgia Korompili, Lampros Kokkalas, Stelios M. Potirakis

Abstract:

Obstructive Sleep Apnea Hypopnea Syndrome (OSAHS) is a widespread chronic disease that mostly remains undetected, mainly because it is diagnosed via polysomnography, a time- and resource-intensive procedure. Screening the disease's symptoms at home could be used as an alternative approach to alert individuals who potentially suffer from OSAHS without compromising their everyday routine. Since snoring is usually linked to OSAHS, developing a snore detector is appealing as an enabling technology for screening OSAHS at home using ubiquitous equipment like commodity microphones (included in, e.g., smartphones). In this context, this study developed a snore detection tool; herein we present the approach, the selection of specific sound features that discriminate snoring from environmental sounds, and the performance of the proposed tool. Furthermore, a Real-Time Snore Detector (RTSD) is built upon the snore detection tool and applied to whole-night sleep sound recordings, resulting in a large dataset of snoring sound excerpts that is made freely available to the public. The RTSD may be used either as a stand-alone tool that offers insight into an individual's sleep quality or as an independent component of OSAHS screening applications in future developments.
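
A minimal sketch of the feature-extraction-plus-classifier pipeline described here is shown below. The three features (frame energy, zero-crossing rate, spectral centroid), the synthetic "snore-like" signals, and the small MLP are illustrative assumptions, not the authors' selected feature set or trained network.

```python
# Toy snore-vs-noise pipeline: simple spectral features + a small neural network.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

FS = 16000  # assumed sampling rate (Hz)

def frame_features(frame, fs=FS):
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), 1 / fs)
    energy = float(np.mean(frame ** 2))
    zcr = float(np.mean(np.abs(np.diff(np.sign(frame)))) / 2)
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))
    return [energy, zcr, centroid]

rng = np.random.default_rng(4)

def synth(snore, n=400, length=1024):
    frames = []
    for _ in range(n):
        t = np.arange(length) / FS
        if snore:   # low-frequency, amplitude-modulated tone as a crude snore stand-in
            sig = np.sin(2 * np.pi * 120 * t) * (1 + 0.5 * np.sin(2 * np.pi * 3 * t))
            sig = sig + 0.1 * rng.normal(size=length)
        else:       # broadband environmental noise
            sig = rng.normal(size=length)
        frames.append(frame_features(sig))
    return np.array(frames)

X = np.vstack([synth(True), synth(False)])
y = np.array([1] * 400 + [0] * 400)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("snore-vs-noise accuracy (toy data):", clf.score(X_te, y_te))
```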

Keywords: obstructive sleep apnea hypopnea syndrome, apnea screening, snoring detection, machine learning, neural networks

Procedia PDF Downloads 185
27009 Effects of Wind Load on the Tank Structures with Various Shapes and Aspect Ratios

Authors: Doo Byong Bae, Jae Jun Yoo, Il Gyu Park, Choi Seowon, Oh Chang Kook

Abstract:

There are several wind load provisions, such as API and Eurocode, for evaluating the wind response of tank structures. The assessment of wind action applying these provisions is made by performing finite element analysis using both linear bifurcation analysis and geometrically nonlinear analysis. By comparing the pressure patterns obtained from the analysis with the results of wind tunnel tests, the most appropriate wind load criteria will be recommended.

Keywords: wind load, finite element analysis, linear bifurcation analysis, geometrically nonlinear analysis

Procedia PDF Downloads 609
27008 Reduction of False Positives in Head-Shoulder Detection Based on Multi-Part Color Segmentation

Authors: Lae-Jeong Park

Abstract:

The paper presents a method that utilizes figure-ground color segmentation to extract an effective global feature for reducing false positives in head-shoulder detection. Conventional detectors that rely on local features such as HOG, chosen for real-time operation, suffer from false positives. The color cue in an input image provides salient information on a global characteristic, which is necessary to alleviate the false positives of local-feature-based detectors. An effective approach that uses figure-ground color segmentation has previously been presented in an effort to reduce false positives in object detection. In this paper, an extended version of that approach is presented which adopts separate multi-part foregrounds instead of a single prior foreground and performs figure-ground color segmentation with each of the foregrounds. The multi-part foregrounds include the parts of the head-shoulder shape and additional auxiliary foregrounds optimized by a search algorithm. A classifier is constructed with a feature that consists of the set of the resulting segmentations. Experimental results show that the presented method can reject more false positives than the single-prior shape-based classifier as well as detectors based on local features. The improvement is possible because the presented approach can reduce the false positives that have the same colors in the head and shoulder foregrounds.

Keywords: pedestrian detection, color segmentation, false positive, feature extraction

Procedia PDF Downloads 258
27007 Photocatalysis with Fe/Ti-Pillared Clays for the Oxofunctionalization of Alkylaromatics by O2

Authors: Houria Rezala, Jose Luis Valverde, Amaya Romero, Alessandra Molinari, Andrea Maldotti

Abstract:

A pillared montmorillonite containing iron-doped titania (Fe/Ti-PILC) has been prepared from a natural clay. The material has been characterized by X-ray diffraction, nitrogen adsorption, temperature-programmed desorption of ammonia, inductively coupled plasma atomic emission spectroscopy, atomic absorption, and diffuse reflectance UV-Vis spectroscopy. The layer structure of Fe/Ti-PILC proved to be ordered, with an insertion of pillars that caused a slight increase in the basal spacing of the clay. Its specific surface area was about three times larger than that of the parent Na-montmorillonite, due principally to the creation of a remarkable microporous network. The doped material was a robust photocatalyst able to oxidize liquid alkyl aromatics to the corresponding carbonylic derivatives, using O2 as the oxidizing species, at mild pressure and temperature conditions. Accumulation of valuable carbonylic derivatives was possible since their over-oxidation to carbon dioxide was negligible. Fe/Ti-PILC was able to discriminate between toluene and cyclohexane in favor of the aromatic compound, with an efficiency about three times higher than that of titanium pillared clays (Ti-PILC). It is likely that the addition of iron favored the formation of new acid sites able to interact with the aromatic substrate. Iron doping caused significant TiO2 visible-light-induced activity (wavelength > 400 nm) with only minor negative effects on its performance under UV-light irradiation (wavelength > 290 nm).

Keywords: alkyl aromatics oxidation, heterogeneous photocatalysis, iron doping, pillared clays

Procedia PDF Downloads 427
27006 Bartlett Factor Scores in Multiple Linear Regression Equation as a Tool for Estimating Economic Traits in Broilers

Authors: Oluwatosin M. A. Jesuyon

Abstract:

In order to propose a simpler tool that eliminates the age-long problems associated with the traditional index method for the selection of multiple traits in broilers, the Bartlett factor regression equation is proposed as an alternative selection tool. 100 day-old chicks each of the Arbor Acres (AA) and Annak (AN) broiler strains were obtained from two rival hatcheries in Ibadan, Nigeria. These were raised in a deep litter system in a 56-day feeding trial at the University of Ibadan Teaching and Research Farm, located in south-west tropical Nigeria. Body weight and body dimensions were measured and recorded during the trial period. Eight zoometric measurements, namely live weight (g), abdominal circumference, abdominal length, breast width, leg length, height, wing length, and thigh circumference (all in cm), were recorded from 20 randomly selected birds within each strain, at a fixed time on the first day of each week, with a 5-kg capacity Camry scale. These records were analyzed and compared using the completely randomized design (CRD) procedures of the SPSS analytical software, with the means procedure and factor scores (FS) entered into a stepwise multiple linear regression (MLR) procedure for the initial live weight equations. Bartlett factor score (BFS) analysis extracted two factors for each strain, termed body-length and thigh-meatiness factors for AA, and breast-size and height factors for AN. These derived orthogonal factors assisted in deducing and comparing the trait combinations that best describe body conformation and meatiness in the experimental broilers. The BFS procedure yielded different body conformation traits for the two strains, thus indicating the different economic traits and advantages of the strains. These factors could be useful as selection criteria for improving desired economic traits. The final Bartlett factor regression equations for the prediction of body weight were highly significant, with P < 0.0001, R² of 0.92 and above, VIF of 1.00, and DW of 1.90 and 1.47 for Arbor Acres and Annak, respectively. These FSR equations could be used as a simple and potent tool for selection during poultry flock improvement; they could also be used to estimate the selection index of flocks in order to discriminate between strains and to evaluate consumer preference traits in broilers.
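
The sketch below illustrates the factor-score regression idea: extract latent factors from the body measurements, then regress live weight on the factor scores. The zoometric data are simulated, and scikit-learn's FactorAnalysis scores are used as a stand-in for Bartlett factor scores (a named substitution, since the exact Bartlett scoring used in SPSS is not reproduced here).

```python
# Factor-score regression sketch: latent factors from body measurements -> live weight.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
n = 160
size_factor = rng.normal(size=n)     # latent "body size"
meatiness = rng.normal(size=n)       # latent "thigh meatiness"

# Seven simulated zoometric measurements (cm) loading on the two latent factors.
measurements = np.column_stack([
    30 + 3 * size_factor + rng.normal(0, 0.5, n),   # abdominal length
    25 + 2 * size_factor + rng.normal(0, 0.5, n),   # abdominal circumference
    10 + 1 * size_factor + rng.normal(0, 0.3, n),   # breast width
    12 + 1 * size_factor + rng.normal(0, 0.3, n),   # leg length
    35 + 2 * size_factor + rng.normal(0, 0.5, n),   # height
    18 + 1 * meatiness + rng.normal(0, 0.3, n),     # wing length
    14 + 2 * meatiness + rng.normal(0, 0.3, n),     # thigh circumference
])
live_weight = 1800 + 250 * size_factor + 120 * meatiness + rng.normal(0, 50, n)

fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(measurements)             # factor scores per bird

mlr = LinearRegression().fit(scores, live_weight)
print("R^2 of factor-score regression:", round(mlr.score(scores, live_weight), 3))
```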

Keywords: alternative selection tool, Bartlet factor regression model, consumer preference trait, linear and body measurements, live body weight

Procedia PDF Downloads 183
27005 The Role of Environmental Analysis in Managing Knowledge in Small and Medium Sized Enterprises

Authors: Liu Yao, B. T. Wan Maseri, Wan Mohd, B. T. Nurul Izzah, Mohd Shah, Wei Wei

Abstract:

Effectively managing knowledge has become a vital weapon for businesses seeking to survive or succeed in an increasingly competitive market. But do they perform environmental analysis when managing knowledge? If so, at what level and with what significance? This paper establishes a conceptual framework covering the basic knowledge management activities (KMA) to examine their contribution towards organizational performance (OP). Environmental analysis (EA) was then investigated from both internal and external aspects to identify its effects on that contribution. Data were collected from 400 Chinese SMEs by questionnaire. Cronbach's α and factor analysis were conducted. Regression results show that external analysis is performed at a higher level than internal analysis. However, internal analysis mediates the effects of external analysis on the KMA-OP relation and plays a more significant role in that relation than external analysis. Thus, firms should improve environmental analysis, especially internal analysis, to enhance their KM practices.

Keywords: knowledge management, environmental analysis, performance, mediating, small sized enterprises, medium sized enterprises

Procedia PDF Downloads 593
27004 Efficient Human Motion Detection Feature Set by Using Local Phase Quantization Method

Authors: Arwa Alzughaibi

Abstract:

Human motion detection is a challenging task due to a number of factors, including variable appearance, posture, and a wide range of illumination conditions and backgrounds. The first requirement of such a model is therefore a reliable feature set that can discriminate between a human and a non-human form with a fair amount of confidence, even under difficult conditions. By having richer representations, the classification task becomes easier and improved results can be achieved. The aim of this paper is to investigate reliable and accurate human motion detection models that are able to detect human motion accurately under varying illumination levels and backgrounds. Different sets of features are tried and tested, including Histogram of Oriented Gradients (HOG), Deformable Parts Model (DPM), Local Decorrelated Channel Features (LDCF), and Aggregate Channel Features (ACF). We then propose an efficient and reliable human motion detection approach that combines histograms of oriented gradients (HOG) and local phase quantization (LPQ) as the feature set, and implements a search pruning algorithm based on optical flow to reduce the number of false positives. Experimental results show that combining the local phase quantization descriptor and the histogram of oriented gradients performs well over a large range of illumination conditions and backgrounds compared with state-of-the-art human detectors. The area under the ROC curve (AUC) of the proposed method reached 0.781 for the UCF dataset and 0.826 for the CDW dataset, which indicates that it performs better than the HOG, DPM, LDCF, and ACF methods.
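
The sketch below shows only the HOG half of the HOG+LPQ feature set, since LPQ has no widely available scikit-image implementation; the place where an LPQ descriptor would be concatenated is marked with a hypothetical call. The windows and labels are random placeholders, so the reported accuracy is not meaningful, only the pipeline shape is.

```python
# HOG feature extraction for detection windows + a linear classifier (toy data).
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

rng = np.random.default_rng(6)

def window_features(window):
    hog_vec = hog(window, orientations=9, pixels_per_cell=(8, 8),
                  cells_per_block=(2, 2), feature_vector=True)
    # lpq_vec = local_phase_quantization(window)   # hypothetical LPQ descriptor
    # return np.concatenate([hog_vec, lpq_vec])
    return hog_vec

# Random 128x64 grayscale windows stand in for human / non-human samples.
X = np.array([window_features(rng.random((128, 64))) for _ in range(60)])
y = rng.integers(0, 2, size=60)                    # placeholder labels

clf = LinearSVC().fit(X, y)
print("training accuracy on placeholder data:", clf.score(X, y))
```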

Keywords: human motion detection, histograms of oriented gradients, local phase quantization

Procedia PDF Downloads 235
27003 Spatial and Temporal Analysis of Forest Cover Change with Special Reference to Anthropogenic Activities in Kullu Valley, North-Western Indian Himalayan Region

Authors: Krisala Joshi, Sayanta Ghosh, Renu Lata, Jagdish C. Kuniyal

Abstract:

Throughout the world, monitoring and estimating the changing pattern of forests across diverse landscapes through remote sensing is instrumental in understanding the interactions of human activities and the ecological environment with the changing climate. Forest change detection using satellite imagery has emerged as an important means of gathering information on a regional scale. Kullu valley in Himachal Pradesh, India, is situated in a transitional zone between the lesser and the greater Himalayas, and thus presents a typical rugged mountainous terrain with moderate to high altitudes varying from 1,200 meters to over 6,000 meters. Due to changes in agricultural cropping patterns, urbanization, industrialization, hydropower generation, climate change, tourism, and anthropogenic forest fire, it has undergone a tremendous transformation in forest cover in the past three decades. The loss and degradation of forest cover result in soil erosion, loss of biodiversity (including damage to wildlife habitats), degradation of watershed areas, and deterioration of the overall quality of nature and life. Supervised classification of LANDSAT satellite data was performed to assess the changes in forest cover in Kullu valley over the years 2000 to 2020. The Normalized Burn Ratio (NBR) was calculated to discriminate between burned and unburned areas of the forest. Our study reveals that in Kullu valley, the number of forest fire incidents, specifically those due to anthropogenic activities, has risen each subsequent year. The main objective of the present study is, therefore, to estimate the change in the forest cover of Kullu valley, to address the various social aspects responsible for anthropogenic forest fires, and to assess their impact on significant changes in regional climatic factors, specifically temperature, humidity, and precipitation, over three decades with the help of satellite imagery and ground data. The main outcome of the paper, we believe, will help the administration make a quantitative assessment of forest cover changes due to anthropogenic activities and devise long-term measures for creating awareness among the local people of the area.
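
The NBR mentioned above is a simple band ratio; the sketch below computes it and a pre/post-fire difference (dNBR). The band arrays are random placeholders standing in for near-infrared and shortwave-infrared reflectance rasters (e.g., Landsat 8 bands 5 and 7), and the dNBR > 0.1 burned-area threshold is a commonly used rule of thumb, not the study's calibrated value.

```python
# Normalized Burn Ratio (NBR) and delta-NBR sketch on placeholder rasters.
import numpy as np

def nbr(nir, swir):
    nir = nir.astype(float)
    swir = swir.astype(float)
    return (nir - swir) / (nir + swir + 1e-10)

def delta_nbr(nbr_before, nbr_after):
    # Positive dNBR indicates a loss of healthy vegetation between the two dates.
    return nbr_before - nbr_after

rng = np.random.default_rng(7)
nir_2000, swir_2000 = rng.random((100, 100)), rng.random((100, 100))
nir_2020, swir_2020 = rng.random((100, 100)), rng.random((100, 100))

dnbr = delta_nbr(nbr(nir_2000, swir_2000), nbr(nir_2020, swir_2020))
print("pixels flagged as burned:", int((dnbr > 0.1).sum()))
```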

Keywords: anthropogenic activities, forest change detection, normalized burn ratio (NBR), supervised classification

Procedia PDF Downloads 153
27002 Improving Taint Analysis of Android Applications Using Finite State Machines

Authors: Assad Maalouf, Lunjin Lu, James Lynott

Abstract:

We present a taint analysis that can automatically detect when string operations result in a string that is free of taints, where all the tainted patterns have been removed. This is an improvement on the conservative behavior of previous taint analyzers, where a string operation on a tainted string always leads to a tainted string unless the operation is manually marked as a sanitizer. The taint analysis is built on top of a string analysis that uses finite state automata to approximate the sets of values that string variables can take during the execution of a program. The proposed approach has been implemented as an extension of FlowDroid and experimental results show that the resulting taint analyzer is much more precise than the original FlowDroid.

Keywords: android, static analysis, string analysis, taint analysis

Procedia PDF Downloads 156
27001 The Documentary Analysis of Meta-Analysis Research in Violence of Media

Authors: Proud Arunrangsiwed

Abstract:

The “future directions” section in the findings of a meta-analysis can provide valuable guidance for conducting future studies. This study, “The Documentary Analysis of Meta-Analysis Research in Violence of Media”, draws the “future directions” from 10 meta-analysis papers. The purpose of this research is to identify an appropriate research design or methodology for future research related to the topic of media violence. Further research needs to explore the topic using longitudinal and experimental designs, and also needs to give careful consideration to age effects, time-spent effects, enjoyment effects, and the ordinary lifestyle of each media consumer.

Keywords: aggressive, future direction, meta-analysis, media, violence

Procedia PDF Downloads 388
27000 Microfluidic Plasmonic Bio-Sensing of Exosomes by Using a Gold Nano-Island Platform

Authors: Srinivas Bathini, Duraichelvan Raju, Simona Badilescu, Muthukumaran Packirisamy

Abstract:

A bio-sensing method based on the plasmonic properties of gold nano-islands has been developed for the detection of exosomes in a clinical setting. The position of the gold plasmon band in the UV-visible spectrum depends on the size and shape of the gold nanoparticles as well as on the surrounding environment. When various chemical entities are adsorbed or bound, the gold plasmon band shifts toward longer wavelengths, and the shift is proportional to their concentration. Exosomes transport cargoes of molecules and genetic materials to proximal and distal cells. Presently, the standard method for their isolation and quantification from body fluids is ultracentrifugation, which is not a practical method to implement in a clinical setting. Thus, a versatile and cutting-edge platform is required to selectively detect and isolate exosomes for further analysis at the clinical level. The new sensing protocol makes use of a specially synthesized polypeptide (Vn96) instead of antibodies to capture and quantify exosomes from different media by binding the heat shock proteins of exosomes. The protocol has been established and optimized using a glass substrate in order to facilitate the next stage, namely the transfer of the protocol to a microfluidic environment. After each step of the protocol, the UV-Vis spectrum was recorded and the position of the gold Localized Surface Plasmon Resonance (LSPR) band was measured. The sensing process was modelled, taking into account the characteristics of the nano-island structure prepared by thermal convection and annealing. The optimal molar ratios of the most important chemical entities involved in the detection of exosomes were calculated as well. Indeed, it was found that the results of the sensing process depend on two major steps: the molar ratio of streptavidin to biotin-PEG-Vn96 and, in the final step, the capture of exosomes by the biotin-PEG-Vn96 complex. The microfluidic device designed for the sensing of exosomes consists of a glass substrate sealed by a PDMS layer that contains the channel and a collecting chamber. In the device, the solutions of linker, cross-linker, etc., are pumped over the gold nano-islands, and an Ocean Optics spectrometer is used to measure the position of the Au plasmon band at each step of the sensing. The experiments have shown that the shift of the Au LSPR band is proportional to the concentration of exosomes, and thereby exosomes can be accurately quantified. An important advantage of the method is the ability to discriminate between exosomes of different origins.

Keywords: exosomes, gold nano-islands, microfluidics, plasmonic biosensing

Procedia PDF Downloads 151
26999 Considering Partially Developed Artifacts in Change Impact Analysis Implementation

Authors: Nazri Kama, Sufyan Basri, Roslina Ibrahim

Abstract:

It is important to manage changes in software to meet the evolving needs of the customer. Accepting too many changes causes delays in completion and incurs additional cost. One type of information that helps in making such decisions is change impact analysis. Current impact analysis approaches assume that all classes in the class artifact are completely developed and use the class artifact as the source of analysis. However, these assumptions are impractical for impact analysis in the software development phase, as some classes in the class artifact are still under development or only partially developed, which leads to inaccuracy. This paper presents a novel impact analysis approach to be used in the software development phase. The significant achievements of the approach are demonstrated through an extensive experimental validation using three case studies.

Keywords: software development, impact analysis, traceability, static analysis

Procedia PDF Downloads 587
26998 On the Analysis of Pseudorandom Partial Quotient Sequences Generated from Continued Fractions

Authors: T. Padma, Jayashree S. Pillai

Abstract:

Random entities are an essential component of any cryptographic application. The suitability of a novel number-theory-based pseudorandom sequence, called the Pseudorandom Partial Quotient Sequence (PPQS), generated from the continued fraction expansion of irrational numbers, for cryptographic applications is analyzed in this paper. An approach that builds the algorithm around a hard mathematical problem has been considered. The PPQS is tested for randomness, and its suitability as a cryptographic key is established by performing randomness analysis, key sensitivity and key space analysis, precision analysis, and an evaluation of its correlation properties.
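
The sketch below extracts partial quotients from the continued fraction expansion of an irrational number (π, using mpmath for precision). Mapping the quotients to bits via parity is only an illustrative assumption, not the authors' exact PPQS construction.

```python
# Partial quotients of a continued fraction expansion, computed at high precision.
from mpmath import mp, floor

mp.dps = 200                       # working precision in decimal digits

def partial_quotients(x, n):
    """Return the first n partial quotients of the continued fraction of x."""
    x = mp.mpf(x)
    quotients = []
    for _ in range(n):
        a = int(floor(x))
        quotients.append(a)
        x = 1 / (x - a)            # precision limits how many terms stay reliable
    return quotients

pq = partial_quotients(mp.pi, 50)
print(pq[:8])                      # pi = [3; 7, 15, 1, 292, 1, 1, 1, ...]

bits = [a % 2 for a in pq]         # one simple (assumed) way to map quotients to bits
print(bits[:20])
```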

Keywords: pseudorandom sequences, key sensitivity, correlation, security analysis, randomness analysis, sensitivity analysis

Procedia PDF Downloads 565
26997 Development of Alpha Spectroscopy Method with Solid State Nuclear Track Detector Using Aluminium Thin Films

Authors: Nidal Dwaikat

Abstract:

This work presents the development of an alpha spectroscopy method based on solid-state nuclear track detectors and aluminum thin films. The resolution of this method is high, and it is able to discriminate between alpha particles at different incident energies. It can measure the exact number of alpha particles at a specific energy without needing a calibration of alpha track diameter versus alpha energy. The method was tested using a Cf-252 standard alpha source at energies of 5.11 MeV, 3.86 MeV, and 2.7 MeV, produced by varying the detector-to-source distance. On the front side, two detectors were covered with aluminum thin films and the third detector was kept uncovered. The thicknesses of the aluminum thin films were selected carefully (using SRIM 2013) such that the first film blocks the two lower-energy alpha particles (3.86 MeV and 2.7 MeV) while alpha particles at the higher energy (5.11 MeV) can penetrate the film and reach the detector surface. The second thin film blocks alpha particles at the lowest energy of 2.7 MeV and allows alpha particles at the two higher energies (5.11 MeV and 3.86 MeV) to penetrate and produce tracks. On the uncovered detector, alpha particles at all three energies can produce tracks. For quality assurance and accuracy, the detectors were mounted on sufficiently thick copper substrates to block exposure from the backside. The tracks on the first detector are due to alpha particles at an energy of 5.11 MeV. The difference between the number of tracks on the second detector and the number of tracks on the first detector is due to alpha particles at an energy of 3.86 MeV. Finally, by subtracting the number of tracks on the second detector from the number of tracks on the third (uncovered) detector, we can find the number of tracks due to alpha particles at an energy of 2.7 MeV. Once the efficiency calibration factor is known, the activity of the standard source can be calculated exactly.
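
The three-detector subtraction described above reduces to simple arithmetic, sketched below with invented track counts; the final activity line uses an assumed lumped efficiency factor and counting time only to show the form of the calculation.

```python
# Arithmetic sketch of the three-detector subtraction (placeholder counts).
counts_detector1 = 410    # film blocks 2.7 and 3.86 MeV -> only 5.11 MeV tracks
counts_detector2 = 820    # film blocks 2.7 MeV -> 5.11 + 3.86 MeV tracks
counts_detector3 = 1240   # uncovered -> 5.11 + 3.86 + 2.7 MeV tracks

tracks_5_11 = counts_detector1
tracks_3_86 = counts_detector2 - counts_detector1
tracks_2_70 = counts_detector3 - counts_detector2
print("tracks per energy:", tracks_5_11, tracks_3_86, tracks_2_70)

# With a known (assumed, lumped) efficiency calibration factor and counting time:
efficiency, live_time_s = 0.30, 3600.0
activity_bq = counts_detector3 / (efficiency * live_time_s)
print(f"estimated activity: {activity_bq:.2f} Bq")
```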

Keywords: aluminium thin film, alpha particles, copper substrate, CR-39 detector

Procedia PDF Downloads 343
26996 Impact of the Results of Sub-Group Analysis on the Performance of Recommender Systems

Authors: Ho Yeon Park, Kyoung-Jae Kim

Abstract:

The purpose of this study is to investigate whether friendship in social media can be an important factor in recommender systems, through a social scientific analysis of friendship in popular social media such as Facebook and Twitter. To this end, this study analyzes data on friendship in real social media using component analysis and clique analysis, two sub-group analysis techniques from social network analysis. We propose an algorithm that reflects the results of the sub-group analysis in the recommender system. The key to this algorithm is to ensure that items preferred by users in a friendship relation with a given user are more likely to be reflected in the recommendations made to that user. As a result of this study, the outcomes of various sub-group analyses were derived, and it was confirmed that the results differed from those of the existing recommender system. Therefore, it is considered that the results of the sub-group analysis affect the recommendation performance of the system. Future research will attempt to generalize these results through further analysis of various social data.
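
The sketch below shows the sub-group analysis step on a toy friendship graph, connected components and cliques via networkx, and one assumed way to up-weight items liked within a user's clique when scoring recommendations. The graph, ratings, and boost factor are illustrative, not real Facebook/Twitter data or the paper's algorithm.

```python
# Components, cliques, and a clique-weighted scoring sketch on a toy friendship graph.
import networkx as nx

friendships = [("ann", "bob"), ("bob", "cy"), ("ann", "cy"),   # a 3-clique
               ("cy", "dee"), ("eve", "fay")]                  # plus a separate pair
G = nx.Graph(friendships)

print("components:", [sorted(c) for c in nx.connected_components(G)])
print("cliques:   ", [sorted(c) for c in nx.find_cliques(G)])

ratings = {"bob": {"item1": 5}, "cy": {"item2": 4}, "dee": {"item2": 5}}

def friend_boosted_scores(user, graph, ratings, boost=1.5):
    """Sum friends' ratings per item, weighting members of the user's cliques."""
    cliques = [set(c) for c in nx.find_cliques(graph) if user in c]
    clique_members = set().union(*cliques) - {user} if cliques else set()
    scores = {}
    for friend in graph.neighbors(user):
        weight = boost if friend in clique_members else 1.0
        for item, r in ratings.get(friend, {}).items():
            scores[item] = scores.get(item, 0.0) + weight * r
    return scores

print("scores for ann:", friend_boosted_scores("ann", G, ratings))
```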

Keywords: sub-group analysis, social media, social network analysis, recommender systems

Procedia PDF Downloads 338
26995 Sentiment Analysis: Comparative Analysis of Multilingual Sentiment and Opinion Classification Techniques

Authors: Sannikumar Patel, Brian Nolan, Markus Hofmann, Philip Owende, Kunjan Patel

Abstract:

Sentiment analysis and opinion mining have become emerging topics of research in recent years, but most of the work is focused on data in the English language. Comprehensive research and analysis that consider multiple languages, machine translation techniques, and different classifiers are essential. This paper presents a comparative analysis of different approaches to multilingual sentiment analysis. These approaches are divided into two parts: the first classifies text without language translation, and the second translates the test data into a target language, such as English, before classification. The presented research and results are useful for understanding whether machine translation should be used for multilingual sentiment analysis or whether building language-specific sentiment classification systems is a better approach. The effects of language translation techniques, features, and the accuracy of various classifiers for multilingual sentiment analysis are also discussed in this study.

Keywords: cross-language analysis, machine learning, machine translation, sentiment analysis

Procedia PDF Downloads 691
26994 Chatbots and the Future of Globalization: Implications of Businesses and Consumers

Authors: Shoury Gupta

Abstract:

Chatbots are a rapidly growing technological trend that has revolutionized the way businesses interact with their customers. With the advancements in artificial intelligence, chatbots can now mimic human-like conversations and provide instant and efficient responses to customer inquiries. In this research paper, we aim to explore the implications of chatbots on the future of globalization for both businesses and consumers. The paper begins by providing an overview of the current state of chatbots in the global market and their growth potential in the future. The focus is on how chatbots have become a valuable tool for businesses looking to expand their global reach, especially in areas with high population density and language barriers. With chatbots, businesses can engage with customers in different languages and provide 24/7 customer service support, creating a more accessible and convenient customer experience. The paper then examines the impact of chatbots on cross-cultural communication and how they can help bridge communication gaps between businesses and consumers from different cultural backgrounds. Chatbots can potentially facilitate cross-cultural communication by offering real-time translations, voice recognition, and other innovative features that can help users communicate effectively across different languages and cultures. By providing more accessible and inclusive communication channels, chatbots can help businesses reach new markets and expand their customer base, making them more competitive in the global market. However, the paper also acknowledges that there are potential drawbacks associated with chatbots. For instance, chatbots may not be able to address complex customer inquiries that require human input. Additionally, chatbots may perpetuate biases if they are programmed with certain stereotypes or assumptions about different cultures. These drawbacks may have significant implications for businesses and consumers alike. To explore the implications of chatbots on the future of globalization in greater detail, the paper provides a thorough review of existing literature and case studies. The review covers topics such as the benefits of chatbots for businesses and consumers, the potential drawbacks of chatbots, and how businesses can mitigate any risks associated with chatbot use. The paper also discusses the ethical considerations associated with chatbot use, such as privacy concerns and the need to ensure that chatbots do not discriminate against certain groups of people. The ethical implications of chatbots are particularly important given the potential for chatbots to be used in sensitive areas such as healthcare and financial services. Overall, this research paper provides a comprehensive analysis of chatbots and their implications for the future of globalization. By exploring both the potential benefits and drawbacks of chatbot use, the paper aims to provide insights into how businesses and consumers can leverage this technology to achieve greater global reach and improve cross-cultural communication. Ultimately, the paper concludes that chatbots have the potential to be a powerful tool for businesses looking to expand their global footprint and improve their customer experience, but that care must be taken to mitigate any risks associated with their use.

Keywords: chatbots, conversational AI, globalization, businesses

Procedia PDF Downloads 74
26993 Sentiment Analysis in Social Network Sites Based on a Bibliometric Analysis: A Comprehensive Analysis and Trends for Future Research Planning

Authors: Jehan Fahim M. Alsulami

Abstract:

Academic research on sentiment analysis in social networks has advanced significantly over recent years and is flourishing thanks to the collection of knowledge provided by various academic disciplines. In the current study, the status and development trend of the field of sentiment analysis in social networks is evaluated through a bibliometric analysis of academic publications. In particular, the distributions of publications and citations, the distribution of subjects, and the predominant journals, authors, and countries are analyzed. The collaboration degree is applied to measure scientific connections from different aspects. Moreover, keyword co-occurrence analysis is used to identify the major research topics and their evolution throughout the time span. The area of sentiment analysis in social networks has gained growing attention in academia, with computer science and engineering as the top research subjects. China and the USA contribute the most to the development of the area. Authors prefer to collaborate with those within the same nation. Among the research topics, newly emerging topics such as COVID-19 and customer satisfaction were identified.

Keywords: bibliometric analysis, sentiment analysis, social networks, social media

Procedia PDF Downloads 186
26992 High School Gain Analytics From National Assessment Program – Literacy and Numeracy and Australian Tertiary Admission Rank Linkage

Authors: Andrew Laming, John Hattie, Mark Wilson

Abstract:

Nine Queensland independent high schools provided deidentified student-matched ATAR and NAPLAN data for all 1,217 ATAR graduates since 2020 who also sat NAPLAN at the school. Graduating cohorts from the nine schools contained a mean of 100 ATAR graduates with previous NAPLAN data from their school. Excluded were vocational students (mean=27) and any ATAR graduates without NAPLAN data (mean=20). Based on Index of Community Socio-Educational Advantage (ICSEA) prediction, all schools had larger than predicted proportions of their students graduating with ATARs. An additional 173 students (14%) did not release their ATARs to their school, requiring these data to be inferred by the schools. Gain was established by first converting each student's strongest NAPLAN domain to a statewide percentile, then subtracting this result from the final ATAR. The resulting ‘percentile shift’ was corrected for plausible ATAR participation at each NAPLAN level. The strongest NAPLAN domain had the highest correlation with ATAR (R²=0.58). RESULTS: School mean NAPLAN scores fitted ICSEA closely (R²=0.97). Schools achieved a mean cohort gain of two ATAR rankings, but only 66% of students gained. This ranged from 46% of top-NAPLAN-decile students gaining to 75% achieving gains outside the top decile. The 54% of top-decile students whose ATAR fell short of prediction lost a mean of 4.0 percentiles (or 6.2 percentiles prior to correction for regression to the mean). 71% of students in smaller schools gained, compared to 63% in larger schools. NAPLAN variability in each of the 13 ICSEA-1100 cohorts was 17%, with both intra-school and inter-school variation of these values extremely low (0.3% to 1.8%). Mean ATAR change between years in each school was just 1.1 ATAR ranks. This suggests that consecutive school cohorts and ICSEA-similar schools share very similar distributions and outcomes over time. Quantile analysis of the NAPLAN/ATAR relation revealed heteroscedasticity, but splines offered little additional benefit over simple linear regression. The NAPLAN/ATAR R² was 0.33. DISCUSSION: Standardised data like NAPLAN and ATAR offer educators a simple, no-cost progression metric to analyse performance in conjunction with their internal test results. Change is expressed in percentiles, or ATAR shift per student, which is intuitive for laypeople. Findings may also reduce ATAR/vocational stream mismatch, reveal the proportions of cohorts meeting or falling short of expectation, and demonstrate by how much. Finally, ‘crashed’ ATARs well below expectation are revealed, which schools can reasonably work to minimise. The percentile shift method is neither a value-added measure nor a growth percentile. In the absence of exit NAPLAN testing, this metric is unable to discriminate academic gain from legitimate ATAR-maximising strategies. But by controlling for ICSEA, ATAR proportion variation, and student mobility, it uncovers progression-to-ATAR metrics that are not currently publicly available. However it is achieved, ATAR maximisation is a sought-after private good. So long as standardised nationwide data are available, this analysis offers useful analytics for educators and reasonable predictivity when counselling subsequent cohorts about their ATAR prospects.
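
The core ‘percentile shift’ calculation described above is sketched below: convert each student's strongest NAPLAN domain to a statewide percentile and subtract it from the final ATAR. The scores, the assumed statewide distribution, and the omission of the participation correction are simplified placeholders, not the study's data or method details.

```python
# Percentile-shift sketch: strongest NAPLAN domain -> statewide percentile -> gain.
import numpy as np

rng = np.random.default_rng(8)
n_students = 100

# Strongest-domain NAPLAN scaled scores and final ATARs (simulated).
naplan_strongest = rng.normal(600, 60, n_students)
atar = np.clip(rng.normal(82, 10, n_students), 30, 99.95)

# Statewide percentile of the strongest NAPLAN domain, estimated here from an
# assumed statewide score distribution (published scales would be used in practice).
statewide = rng.normal(560, 70, 100000)
naplan_percentile = np.array(
    [100.0 * np.mean(statewide < s) for s in naplan_strongest])

percentile_shift = atar - naplan_percentile        # positive = gain
print("mean shift:", round(percentile_shift.mean(), 1),
      "| % of students gaining:", round(100 * np.mean(percentile_shift > 0), 1))
```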

Keywords: NAPLAN, ATAR, analytics, measurement, gain, performance, data, percentile, value-added, high school, numeracy, reading comprehension, variability, regression to the mean

Procedia PDF Downloads 43
26991 Design and Creation of a BCI Videogame for Training and Measure of Sustained Attention in Children with ADHD

Authors: John E. Muñoz, Jose F. Lopez, David S. Lopez

Abstract:

Attention Deficit Hyperactivity Disorder (ADHD) is a disorder that affects 1 out of 5 Colombian children, making it a real public health problem in the country. Conventional treatments such as medication and neuropsychological therapy have proved insufficient to decrease the high incidence of ADHD in the principal Colombian cities. This work presents the design and development of a videogame that uses a brain-computer interface not only as an input device but also as a tool to monitor neurophysiological signals. The videogame, named “The Harvest Challenge”, is set in the cultural context of a Colombian coffee grower, where the player can use his/her avatar in three mini-games created to reinforce four fundamental abilities: i) the ability to wait, ii) the ability to plan, iii) the ability to follow instructions, and iv) the ability to achieve objectives. The details of this collaborative design process of the multimedia tool, carried out according to exact clinical requirements, and the description of the interaction proposals are presented through the mental stages of attention and relaxation. The final videogame is presented as a tool for sustained attention training in children with ADHD, using as its action mechanism the neuromodulation of beta and theta waves through an electrode located over the central part of the frontal lobe. The electroencephalographic signal is processed automatically inside the videogame, allowing it to generate a report of the evolution of the theta/beta ratio, a biological marker that has been demonstrated to be a sufficient measure to discriminate between children with and without the deficit.
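
The theta/beta ratio referred to above can be computed from band powers of a single frontal channel; the sketch below uses Welch power spectral density estimation on a synthetic signal. The sampling rate, band edges (theta 4–8 Hz, beta 13–30 Hz), and the signal itself are assumptions for illustration, not the videogame's internal processing.

```python
# Theta/beta ratio from Welch band power on a synthetic single-channel EEG epoch.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

FS = 256                                   # assumed sampling rate (Hz)
rng = np.random.default_rng(9)
t = np.arange(0, 30, 1 / FS)               # 30 s epoch
eeg = (1.5 * np.sin(2 * np.pi * 6 * t)     # theta component (4-8 Hz)
       + 0.7 * np.sin(2 * np.pi * 20 * t)  # beta component (13-30 Hz)
       + rng.normal(0, 0.5, t.size))

freqs, psd = welch(eeg, fs=FS, nperseg=FS * 2)

def band_power(freqs, psd, lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return trapezoid(psd[mask], freqs[mask])

theta = band_power(freqs, psd, 4, 8)
beta = band_power(freqs, psd, 13, 30)
print("theta/beta ratio:", round(theta / beta, 2))
```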

Keywords: BCI, neuromodulation, ADHD, videogame, neurofeedback, theta/beta ratio

Procedia PDF Downloads 349