Search results for: lean literature classification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8892

7812 Contribution to the Study of Automatic Epileptiform Pattern Recognition in Long Term EEG Signals

Authors: Christine F. Boos, Fernando M. Azevedo

Abstract:

Electroencephalogram (EEG) is a record of the electrical activity of the brain that has many applications, such as monitoring alertness, coma and brain death; locating damaged areas of the brain after head injury, stroke and tumor; monitoring anesthesia depth; researching physiology and sleep disorders; and researching epilepsy and localizing the seizure focus. Epilepsy is a chronic condition, or rather a group of highly prevalent diseases, that is still poorly explained by science and whose diagnosis remains predominantly clinical. The EEG recording is considered an important test for epilepsy investigation, and its visual analysis is very often applied for clinical confirmation of the epilepsy diagnosis. Moreover, this EEG analysis can also be used to help define the type of epileptic syndrome, determine the epileptiform zone, assist in the planning of drug treatment and provide additional information about the feasibility of surgical intervention. In the context of diagnosis confirmation, the analysis is made using long-term EEG recordings at least 24 hours long and acquired by a minimum of 24 electrodes, in which the neurophysiologists perform a thorough visual evaluation of EEG screens in search of specific electrographic patterns called epileptiform discharges. Considering that the EEG screens usually display 10 seconds of the recording, the neurophysiologist has to evaluate 360 screens per hour of EEG, or a minimum of 8,640 screens per long-term EEG recording. Analyzing thousands of EEG screens in search of patterns that have a maximum duration of 200 ms is a very time-consuming, complex and exhaustive task. Because of this, over the years several studies have proposed automated methodologies that could facilitate the neurophysiologists' task of identifying epileptiform discharges, and a large number of these methodologies used neural networks for the pattern classification. One of the differences between all of these methodologies is the type of input stimuli presented to the networks, i.e., how the EEG signal is introduced to the network. Five types of input stimuli have been commonly found in the literature: the raw EEG signal, morphological descriptors (i.e., parameters related to the signal's morphology), the Fast Fourier Transform (FFT) spectrum, Short-Time Fourier Transform (STFT) spectrograms and Wavelet Transform features. This study evaluates the application of these five types of input stimuli and compares the classification results of neural networks that were implemented using each of these inputs. The efficiency obtained with the raw signal varied between 43 and 84%. The results of the FFT spectrum and the STFT spectrograms were quite similar, with average efficiencies of 73% and 77%, respectively. The efficiency of the Wavelet Transform features varied between 57 and 81%, while the descriptors presented efficiency values between 62 and 93%. After the simulations, we observed that the best results were achieved when either the morphological descriptors or the Wavelet features were used as input stimuli.
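
A minimal sketch of the comparison described above: the same labeled EEG epochs are turned into the five input representations and fed to a small neural network. The sampling rate, epoch length, wavelet choice, morphological descriptors and the placeholder data are assumptions for illustration, not the authors' exact setup; PyWavelets is assumed available.

```python
import numpy as np
import pywt                                    # PyWavelets, assumed available
from scipy.signal import stft
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 200                                       # sampling rate in Hz (assumed)
X_raw = rng.normal(size=(300, fs))             # 300 one-second EEG epochs (placeholder data)
y = rng.integers(0, 2, size=300)               # 1 = epileptiform discharge, 0 = background

def fft_spectrum(x):      return np.abs(np.fft.rfft(x))
def stft_spectrogram(x):  return np.abs(stft(x, fs=fs, nperseg=50)[2]).ravel()
def wavelet_features(x):  return np.concatenate(pywt.wavedec(x, "db4", level=4))
def descriptors(x):       # simple morphological descriptors: amplitude, sharpness, mean slope
    return np.array([x.max() - x.min(), np.abs(np.diff(x, 2)).max(), np.abs(np.diff(x)).mean()])

inputs = {
    "raw":         X_raw,
    "FFT":         np.array([fft_spectrum(x) for x in X_raw]),
    "STFT":        np.array([stft_spectrogram(x) for x in X_raw]),
    "wavelet":     np.array([wavelet_features(x) for x in X_raw]),
    "descriptors": np.array([descriptors(x) for x in X_raw]),
}
for name, X in inputs.items():
    acc = cross_val_score(MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000), X, y, cv=5)
    print(f"{name:11s} mean accuracy: {acc.mean():.2f}")
```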

Keywords: Artificial neural network, electroencephalogram signal, pattern recognition, signal processing

Procedia PDF Downloads 512
7811 Sustainable Business Model Archetypes – A Systematic Review and Application to the Plastic Industry

Authors: Felix Schumann, Giorgia Carratta, Tobias Dauth, Liv Jaeckel

Abstract:

In the last few decades, the rapid growth of the use and disposal of plastic items has led to their overaccumulation in the environment. As a result, plastic pollution has become a subject of global concern. Today plastics are used as raw materials in almost every industry. While the recognition of the ecological, social, and economic impact of plastics in academic research is on the rise, the potential role of the ‘plastic industry’ in dealing with such issues is still largely underestimated. Therefore, the literature on sustainable plastic management is still nascent and fragmented. Working towards sustainability requires a fundamental shift in the way companies employ plastics in their day-to-day business. For that reason, the applicability of the business model concept has recently gained momentum in environmental research. Business model innovation is increasingly recognized as an important driver to re-conceptualize the purpose of the firm and to readily integrate sustainability in their business. It can serve as a starting point to investigate whether and how sustainability can be realized under industry- and firm-specific circumstances. Yet, there is no comprehensive view in the plastic industry on how firms start refining their business models to embed sustainability in their operations. Our study addresses this gap, looking primarily at the industrial sectors responsible for the production of the largest amount of plastic waste today: plastic packaging, consumer goods, construction, textile, and transport. Relying on the archetypes of sustainable business models and applying them to the aforementioned sectors, we try to identify companies’ current strategies to make their business models more sustainable. Based on the thematic clustering, we can develop an integrative framework for the plastic industry. The findings are underpinned and illustrated by a variety of relevant plastic management solutions that the authors have identified through a systematic literature review and analysis of existing, empirically grounded research in this field. Using the archetypes, we can promote options for business model innovations for the most important sectors in which plastics are used. Moreover, by linking the proposed business model archetypes to the plastic industry, our research approach guides firms in exploring sustainable business opportunities. Likewise, researchers and policymakers can utilize our classification to identify best practices. The authors believe that the study advances the current knowledge on sustainable plastic management through its broad empirical industry analyses. Hence, the application of business model archetypes in the plastic industry will be useful for shaping companies’ transformation to create and deliver more sustainability and provides avenues for future research endeavors.

Keywords: business models, environmental economics, plastic management, plastic pollution, sustainability

Procedia PDF Downloads 80
7810 Preliminary Evaluation of Decommissioning Wastes for the First Commercial Nuclear Power Reactor in South Korea

Authors: Kyomin Lee, Joohee Kim, Sangho Kang

Abstract:

Kori Unit 1, the first commercial nuclear power reactor in South Korea, was a 587 MWe pressurized water reactor that started operation in 1978 and was permanently shut down in June 2017 without an additional operating license extension. Kori Unit 1 is scheduled to become the first nuclear power unit in South Korea to enter the decommissioning phase. In this study, a preliminary evaluation of the decommissioning wastes for Kori Unit 1 was performed based on the following series of steps: firstly, the plant inventory is investigated based on various documents (i.e., equipment/component lists, construction records, general arrangement drawings). Secondly, the radiological conditions of systems, structures and components (SSCs) are established to estimate the amount of radioactive waste by waste classification. Thirdly, the waste management strategies for Kori Unit 1, including waste packaging, are established. Fourthly, the proper decontamination and dismantling (D&D) technologies are selected considering various factors. Finally, the amount of decommissioning waste by classification for Kori Unit 1 is estimated using the DeCAT program, which was developed by KEPCO-E&C for decommissioning cost estimation. The preliminary evaluation results have shown that the expected amounts of decommissioning wastes were less than about 2% and 8% of the total wastes generated (i.e., the sum of clean wastes and radwastes) before and after waste processing, respectively, and it was found that the majority of the contaminated material was carbon or alloy steel and stainless steel. In addition, within the range of available information, the results of the evaluation were compared with data from various decommissioning experiences and international/national decommissioning studies. The comparison has shown that the radioactive waste amount from the Kori Unit 1 decommissioning was much less than that from the plants decommissioned in the U.S. and was comparable to that from the plants in Europe. This result comes from the differences in disposal costs and clearance criteria (i.e., free release levels) between the U.S. and other countries. The preliminary evaluation performed using the methodology established in this study will be useful as important input for decommissioning planning, including the decommissioning schedule and the waste management strategy covering the transportation, packaging, handling, and disposal of radioactive wastes.

Keywords: characterization, classification, decommissioning, decontamination and dismantling, Kori 1, radioactive waste

Procedia PDF Downloads 196
7809 Age and Gender Differences in the Language Deficits of Individuals with Asperger Syndrome (AS) and High Functioning Autism (HFA): Systematic Literature Review (SLR) and Meta-Analysis (MA)

Authors: Sadeq Al Yaari, Muhammad Alkhunayn, Montaha Al Yaari, Ayman Al Yaari, Aayah Al Yaari, Adham Al Yaari, Sajedah Al Yaari, Fatehi Eissa

Abstract:

Background: In spite of the fact that several language deficits, both internalizing and externalizing, have been documented in comorbidity with Asperger Syndrome (AS) and High Functioning Autism (HFA), there is a paucity of research on the continuity of these deficits across these individuals' life span. Furthermore, findings regarding differences in the occurrence of these language deficits in HFA and AS males and females are mixed. Aims: A Systematic Literature Review and meta-analysis (SLR & Meta-analysis) provides a more valid indicator; that is why it has been used here to distinguish HFA and AS individuals in terms of (a) when language deficits prevail in these individuals' lives and (b) in which gender the prevalence of these language deficits is greater. Materials and Method: In this SLR & Meta-analysis, the PubMed, ScienceDirect, SpringerLink, SAGE Journals Online, Wiley Online Library, Google Scholar, CINAHL, EMBASE, Scopus, and ERIC databases, in addition to unpublished literature, were systematically searched between 1st of January 1980 and 30th of May 2022. Interpretations: Although overall sample sizes were small, the combined results do permit the tentative conclusion that language deficits are prevalent in both AS and HFA children and adults, with phonological deficits more prevalent in HFA male children and pragmatic deficits more prevalent in AS male children. Further research should be undertaken separately in each linguistic branch to verify the conclusions of this study.

Keywords: high-functioning autism, Asperger syndrome, systematic literature review, meta-analysis

Procedia PDF Downloads 31
7808 Myanmar Character Recognition Using Eight Direction Chain Code Frequency Features

Authors: Kyi Pyar Zaw, Zin Mar Kyu

Abstract:

Character recognition is the process of converting a text image file into an editable and searchable text file. Feature extraction is the heart of any character recognition system, and the character recognition rate may be low or high depending on the extracted features. In this paper, 25 features per character are used for character recognition. Basically, there are three steps in character recognition: character segmentation, feature extraction and classification. In the segmentation step, a horizontal cropping method is used for line segmentation and a vertical cropping method is used for character segmentation. In the feature extraction step, features are extracted in two ways. The first way is that 8 features are extracted from the entire input character using eight-direction chain code frequency extraction. The second way is that the input character is divided into 16 blocks; for each block, although 8 feature values are obtained through the eight-direction chain code frequency extraction method, we define the sum of these 8 feature values as one feature for that block. Therefore, 16 features are extracted from these 16 blocks in the second way. We use the number-of-holes feature to cluster similar characters. We can recognize almost all common Myanmar characters with various font sizes by using these features. All 25 features are used in both the training part and the testing part. In the classification step, the characters are classified by matching all features of the input character with the already trained features of the characters.
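
A rough sketch of the 25-dimensional feature vector described above (8 global chain-code frequencies + 16 block sums + the number of holes). It assumes a binary uint8 character image, a 4x4 block grid, OpenCV 4 for contour extraction, and hole counting via the contour hierarchy; these are illustrative choices, not the authors' exact implementation.

```python
import cv2
import numpy as np

# direction code for a step (dx, dy) between consecutive boundary points
DIRS = {(1, 0): 0, (1, -1): 1, (0, -1): 2, (-1, -1): 3,
        (-1, 0): 4, (-1, 1): 5, (0, 1): 6, (1, 1): 7}

def chain_code_hist(binary_img):
    """8-bin frequency histogram of chain-code directions along outer contours
    of a binary uint8 image (0 background / 255 foreground)."""
    contours, _ = cv2.findContours(binary_img, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    hist = np.zeros(8)
    for c in contours:
        pts = c[:, 0, :]                        # ordered (x, y) boundary points
        steps = np.sign(np.diff(pts, axis=0))   # quantize each step to -1/0/1
        for dx, dy in steps:
            if (dx, dy) in DIRS:
                hist[DIRS[(dx, dy)]] += 1
    return hist

def character_features(binary_img, grid=4):
    h, w = binary_img.shape
    global_hist = chain_code_hist(binary_img)                   # 8 global features
    block_sums = []                                             # 16 block features
    for i in range(grid):
        for j in range(grid):
            block = binary_img[i*h//grid:(i+1)*h//grid, j*w//grid:(j+1)*w//grid]
            block_sums.append(chain_code_hist(np.ascontiguousarray(block)).sum())
    # number of holes = contours that have a parent in the two-level hierarchy
    _, hierarchy = cv2.findContours(binary_img, cv2.RETR_CCOMP, cv2.CHAIN_APPROX_NONE)
    holes = int(sum(1 for hrc in hierarchy[0] if hrc[3] != -1)) if hierarchy is not None else 0
    return np.concatenate([global_hist, block_sums, [holes]])   # 8 + 16 + 1 = 25 features
```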

Keywords: chain code frequency, character recognition, feature extraction, features matching, segmentation

Procedia PDF Downloads 300
7807 Partial Least Squares Regression for High-Dimensional and Highly Correlated Data

Authors: Mohammed Abdullah Alshahrani

Abstract:

The research focuses on investigating the use of partial least squares (PLS) methodology for addressing challenges associated with high-dimensional correlated data. Recent technological advancements have led to experiments producing data characterized by a large number of variables compared to observations, with substantial inter-variable correlations. Such data patterns are common in chemometrics, where near-infrared (NIR) spectrometer calibrations record chemical absorbance levels across hundreds of wavelengths, and in genomics, where thousands of genomic regions' copy number alterations (CNA) are recorded from cancer patients. PLS serves as a widely used method for analyzing high-dimensional data, functioning as a regression tool in chemometrics and a classification method in genomics. It handles data complexity by creating latent variables (components) from original variables. However, applying PLS can present challenges. The study investigates key areas to address these challenges, including unifying interpretations across three main PLS algorithms and exploring unusual negative shrinkage factors encountered during model fitting. The research presents an alternative approach to addressing the interpretation challenge of predictor weights associated with PLS. Sparse estimation of predictor weights is employed using a penalty function combining a lasso penalty for sparsity and a Cauchy distribution-based penalty to account for variable dependencies. The results demonstrate sparse and grouped weight estimates, aiding interpretation and prediction tasks in genomic data analysis. High-dimensional data scenarios, where predictors outnumber observations, are common in regression analysis applications. Ordinary least squares regression (OLS), the standard method, performs inadequately with high-dimensional and highly correlated data. Copy number alterations (CNA) in key genes have been linked to disease phenotypes, highlighting the importance of accurate classification of gene expression data in bioinformatics and biology using regularized methods like PLS for regression and classification.
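
An illustrative sketch (simulated data) of the core point above: with far more correlated predictors than observations, PLS built on a few latent components remains usable where ordinary least squares struggles. The component count, data dimensions and noise levels are arbitrary choices for the demonstration, not values from the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n, p = 60, 500                                   # far more predictors than observations
latent = rng.normal(size=(n, 3))                 # a few hidden factors drive everything
X = latent @ rng.normal(size=(3, p)) + 0.1 * rng.normal(size=(n, p))   # highly correlated columns
y = latent @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=n)

for name, model in [("OLS", LinearRegression()),
                    ("PLS (3 components)", PLSRegression(n_components=3))]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name:18s} mean CV R^2: {r2.mean():.2f}")
```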

Keywords: partial least square regression, genetics data, negative filter factors, high dimensional data, high correlated data

Procedia PDF Downloads 34
7806 Dimensional Investigation of Food Addiction in Individuals Who Have Undergone Bariatric Surgery

Authors: Ligia Florio, João Mauricio Castaldelli-Maia

Abstract:

Background: Food addiction (FA) emerged in the 1990s as a possible contributor to the increasing prevalence of obesity and overweight, in conjunction with changing food environments and mental health conditions. However, FA is not yet listed as one of the disorders in the DSM-5 and/or the ICD-11. Although there are controversies and debates in the literature about the classification and construct of FA, the most common approach to assessing it is the use of a research tool - the Yale Food Addiction Scale (YFAS) - which approximates the concept of FA to the diagnostic concept of dependence on psychoactive substances. There is a need to explore the dimensional phenotypes assessed by the YFAS in different population groups for a better understanding and scientific support of FA diagnoses. Methods: The primary objective of this project was to investigate the construct validity of the FA concept measured by the mYFAS 2.0 in individuals who underwent bariatric surgery (n = 100) at the Hospital Estadual Mário Covas since 2011. Statistical analyses were conducted using the STATA software. In this sense, structural or factor validity was the type of construct validity investigated, using exploratory factor analysis (EFA) and item response theory (IRT) techniques. Results: The EFA showed that the one-dimensional model was the most parsimonious. The IRT showed that all criteria contributed to the latent structure, presenting discrimination values greater than 0.5, with most presenting values greater than 2. Conclusion: This study reinforces a FA dimension in patients who underwent bariatric surgery. Within this dimension, we identified the most severe and discriminating criteria for the diagnosis of FA.
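
A rough sketch, on simulated item responses, of the kind of dimensionality check behind the EFA step: comparing one-, two- and three-factor models by cross-validated log-likelihood. The item count, sample size and loadings are hypothetical stand-ins, not the mYFAS 2.0 data, and the analysis here is a simplification of the EFA/IRT workflow reported above.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_items, n_subjects = 11, 100                      # roughly mirroring the study's sample size
trait = rng.normal(size=(n_subjects, 1))           # a single latent "food addiction" dimension
loadings = rng.uniform(0.5, 1.5, size=(1, n_items))
X = trait @ loadings + rng.normal(scale=0.7, size=(n_subjects, n_items))  # simulated item scores

for k in (1, 2, 3):
    ll = cross_val_score(FactorAnalysis(n_components=k, random_state=0), X, cv=5)
    print(f"{k}-factor model: mean held-out log-likelihood = {ll.mean():.2f}")
```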

Keywords: obesity, food addiction, bariatric surgery, regain

Procedia PDF Downloads 59
7805 Classification of Multiple Cancer Types with Deep Convolutional Neural Network

Authors: Nan Deng, Zhenqiu Liu

Abstract:

Thousands of patients with metastatic tumors are diagnosed with cancers of unknown primary sites each year. The inability to identify the primary cancer site may lead to inappropriate treatment and unexpected prognosis. Nowadays, a large amount of genomics and transcriptomics cancer data has been generated by next-generation sequencing (NGS) technologies, and The Cancer Genome Atlas (TCGA) database has accrued thousands of human cancer tumors and healthy controls, which provides an abundant resource to differentiate cancer types. Meanwhile, deep convolutional neural networks (CNNs) have shown high accuracy in classification among a large number of image object categories. Here, we utilize 25 cancer primary tumors and 3 normal tissues from TCGA and convert their RNA-Seq gene expression profiles to color images; we then train, validate and test a CNN classifier directly on these images. The performance results show that our CNN classifier can achieve >80% test accuracy on most of the tumors and normal tissues. Since the gene expression pattern of distant metastases is similar to that of their primary tumors, the CNN classifier may provide a potential computational strategy for identifying the unknown primary origin of metastatic cancer in order to plan appropriate treatment for patients.
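
A hypothetical sketch of the pipeline shape: expression vectors are reshaped into small "images" and a compact CNN is trained over 28 classes (25 tumor types plus 3 normal tissues). The image size, architecture and random data are placeholders; the study's own image-encoding scheme and network are not reproduced here.

```python
import numpy as np
from tensorflow.keras import layers, models

rng = np.random.default_rng(0)
n_samples, n_genes, n_classes = 500, 64 * 64, 28
X = rng.random((n_samples, n_genes)).astype("float32").reshape(-1, 64, 64, 1)  # expression -> image
y = rng.integers(0, n_classes, size=n_samples)                                 # tumor / tissue labels

model = models.Sequential([
    layers.Input(shape=(64, 64, 1)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, validation_split=0.2, verbose=0)   # train, validate; test set held out in practice
```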

Keywords: bioinformatics, cancer, convolutional neural network, deep learning, gene expression pattern

Procedia PDF Downloads 284
7804 Perceiving Interpersonal Conflict and the Big Five Personality Traits

Authors: Emily Rivera, Toni DiDona

Abstract:

The Big Five personality traits form a hierarchical classification of personality traits that applies factor analysis to personality survey data in order to describe human personality using five broad dimensions: Extraversion, Agreeableness, Conscientiousness, Neuroticism, and Openness (Fetvadjiev & Van de Vijer, 2015). Research shows that personality constructs underlie individual differences in processing conflict and interpersonal relations (Graziano et al., 1996). This research explores the understudied correlation between the Big Five personality traits and perceived interpersonal conflict in the workplace. It reviews social psychological literature on the Big Five personality traits within a social context and discusses organizational development journal articles on the perceived efficacy of conflict tactics and approaches to interpersonal relationships. The study also presents research undertaken on a survey group of 867 subjects over the age of 18 who were recruited by means of convenience sampling through social media, email, and text messaging. The central finding of this study is that only two of the Big Five personality traits had a significant correlation with perceiving interpersonal conflict in the workplace. Individuals who score higher on agreeableness and neuroticism perceive more interpersonal conflict in the workplace than those who score lower on each dimension. The relationship between both constructs is worthy of research due to its everyday frequency and unique individual psycho-social consequences. This multimethod research associated the Big Five personality dimensions with interpersonal conflict, and its findings can be utilized to further understand social cognition, person perception, complex social behavior and social relationships in the work environment.
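
A minimal sketch, on synthetic scores, of the kind of correlational test reported: Pearson correlations between each Big Five dimension and a perceived-conflict scale. The column names, sample generation and effect sizes are illustrative assumptions, not the study's instrument or data.

```python
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
n = 867                                            # sample size reported in the abstract
df = pd.DataFrame({t: rng.normal(size=n) for t in
                   ["extraversion", "agreeableness", "conscientiousness", "neuroticism", "openness"]})
# simulate a conflict scale loosely tied to two traits, purely for demonstration
df["perceived_conflict"] = 0.3 * df["agreeableness"] + 0.3 * df["neuroticism"] + rng.normal(size=n)

for trait in df.columns[:-1]:
    r, p = pearsonr(df[trait], df["perceived_conflict"])
    print(f"{trait:17s} r = {r:+.2f}, p = {p:.3g}")
```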

Keywords: five-factor model, interpersonal conflict, personality, The Big Five personality traits

Procedia PDF Downloads 140
7803 Semantic Differences between Bug Labeling of Different Repositories via Machine Learning

Authors: Pooja Khanal, Huaming Zhang

Abstract:

Labeling of issues/bugs, also known as bug classification, plays a vital role in software engineering. Some known labels/classes of bugs are 'User Interface', 'Security', and 'API'. Most of the time, when a reporter reports a bug, they try to assign some predefined label to it. Those issues are reported for a project, and each project is a repository in GitHub/GitLab, which contains multiple issues. There are many software project repositories, ranging from individual projects to commercial projects. The labels assigned in different repositories may depend on various factors like human instinct, generalization of labels, the label assignment policy followed by the reporter, etc. While the reporter of an issue may instinctively give that issue a label, another person reporting the same issue may label it differently. This way, it is not known mathematically whether a label in one repository is similar or different to the label in another repository. Hence, the primary goal of this research is to find the semantic differences between bug labeling of different repositories via machine learning. Independent optimal classifiers for individual repositories are built first using the text features from the reported issues. The optimal classifiers may include a combination of multiple classifiers stacked together. Then, those classifiers are used to cross-test other repositories, which allows the similarity of labels to be deduced mathematically. The product of this ongoing research includes a formalized open-source GitHub issues database that is used to deduce the similarity of the labels pertaining to the different repositories.
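
A sketch with toy data of the cross-repository idea: fit a text classifier on one repository's issues and score it on another's to quantify how well the labeling transfers. The issue texts, labels and single-model pipeline are invented placeholders; the study stacks multiple optimized classifiers and uses its own GitHub issues database.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# toy stand-ins for two repositories' reported issues and assigned labels
repo_a = (["crash when clicking the save button", "login page layout broken on mobile",
           "token leaks in debug logs", "null pointer in export dialog"],
          ["User Interface", "User Interface", "Security", "User Interface"])
repo_b = (["XSS possible through comment field", "misaligned toolbar icons after update"],
          ["Security", "User Interface"])

# train an "optimal" classifier for repo A, then cross-test it on repo B
clf_a = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000)).fit(*repo_a)
cross_acc = clf_a.score(*repo_b)     # agreement tells us how similar the two labelings are
print(f"repo A -> repo B label agreement: {cross_acc:.2f}")
```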

Keywords: bug classification, bug labels, GitHub issues, semantic differences

Procedia PDF Downloads 179
7802 Algerian Literature Written in English: A Comparative Analysis of Four Novels and Their Historical, Cultural, and Identity Themes

Authors: Wafa Nouari

Abstract:

This study compares four novels written in English by Algerian writers: Donkey Heart Monkey Mind by Djaffar Chetouane, Pebble in the River by Noufel Bouzeboudja, Sophia in the White City by Belkacem Mezghouchene, and The Inner Light of Darkness by Iheb Kharab. It applies comparative research methods and cultural studies as the literary theory to analyze how these novels depict Algeria’s culture, history, and identity through their genre, style, tone, perspective, and structure. It identifies some common themes shared by them, such as the quest for freedom and dignity in a context of oppression and colonialism and the use of storytelling, imagination, and creativity as coping mechanisms for trauma and adversity. It also highlights their differences in terms of style, genre, setting, period, and perspectives. It concludes that these novels offer rich and diverse insights into Algeria and its multifaceted reality. It also discusses some limitations and challenges related to Algerian literature in English and suggests some directions for future research.

Keywords: Algerian literature in English, comparative research methods, cultural studies, diversity and complexity

Procedia PDF Downloads 112
7801 Identifying Enablers and Barriers of Healthcare Knowledge Transfer: A Systematic Review

Authors: Yousuf Nasser Al Khamisi

Abstract:

Purpose: This paper presents a Knowledge Transfer (KT) framework for healthcare sectors by applying a systematic literature review process to the healthcare organizations domain to identify enablers and barriers of KT in healthcare. Methods: The paper conducted a systematic literature search of peer-reviewed papers that described key elements of KT using four databases (Medline, Cinahl, Scopus, and ProQuest) for a 10-year period (1/1/2008–16/10/2017). The results of the literature review were used to build a conceptual framework of KT in healthcare organizations. The author used a systematic review of the literature, as described by Barbara Kitchenham in Procedures for Performing Systematic Reviews. Findings: The paper highlighted the impacts of using the Knowledge Management (KM) concept in a healthcare organization on controlling infectious diseases in hospitals, improving family medicine performance and enhancing quality improvement practices. Moreover, it found that good coding performance is analytically linked with a knowledge-sharing network structure rich in brokerage and hierarchy rather than in density. The unavailability, or disregard, of the latest evidence on more cost-effective or more efficient delivery approaches increases healthcare costs and may lead to unintended results. Originality: The search procedure produced 12,093 results, of which 3,523 were general articles about KM and KT. The titles and abstracts of these articles were screened to segregate what is related and what is not. 94 articles were identified by the researchers for full-text assessment. The total number of eligible articles after removing unrelated articles was 22.

Keywords: healthcare organisation, knowledge management, knowledge transfer, KT framework

Procedia PDF Downloads 125
7800 Ecocriticism and Sustainable Development: A Study of Kamila Shamsie's a God in Every Stone

Authors: Shaista Maseeh

Abstract:

English literature, from the beginning itself, has had psychological, social and environmental concerns. Virgil, Shakespeare, John Milton and William Wordsworth, up to the most current Robert Hass, have shown and proved their environmental and ecological interests as well as distress at their loss. Pastoral literature is also one such genre that links literature with the environment. Thanks to contemporary literary theories, literature is now formally related to subjects other than the written text. One such literary theory is 'Ecocriticism.' It stands under the umbrella of the economics term 'Sustainable Development,' or it can also be understood as an ecological extension of it. Ecocriticism helps the reader to study the dynamic relation between literature and our degrading environment. It draws attention towards the ravaged condition of nature and animals, showing how nature is exploited by human beings for their own benefit, leaving nature at an irreparable loss. For instance, deforestation reduces the size of forests every year, permanently injuring flora and fauna and also the habitat of animals. This paper studies the ecological and environmental concerns in the latest novel by the Pakistani British writer Kamila Shamsie, A God in Every Stone (2014). The book is not only a literary masterpiece in elegant prose, but also a novel posing many questions about 'nature and environment' in general and 'animals' in particular. It gives glimpses of the interesting history of the Temple of Zeus in Greece and Ancient Caria, and covers many episodes from the history of the Indian freedom struggle. In the course of the novel's narrative, Kamila Shamsie poses disturbing questions about environmental abuse, about how human beings are more 'beasts' than the so-called beasts, the poor animals. She also glorifies the simplicity of the past. The novel has enough instances to prove Shamsie's positive stand on saving the earth, which is being more abused than used by human beings. This paper provides an ecocritical approach to the study of A God in Every Stone (2014).

Keywords: animals, ecocriticism, environment, nature

Procedia PDF Downloads 398
7799 An Overview of Electronic Waste as Aggregate in Concrete

Authors: S. R. Shamili, C. Natarajan, J. Karthikeyan

Abstract:

The rapid growth of the world population and widespread urbanization have remarkably increased the development of the construction industry, which has caused a huge demand for sand and gravel. Environmental problems occur when the rate of extraction of sand, gravel, and other materials exceeds the rate of generation of natural resources; therefore, an alternative source is essential to replace the materials used in concrete. Nowadays, electronic products have become an integral part of daily life, providing more comfort, security, and ease of exchange of information. These electronic waste (E-Waste) materials pose serious human health concerns and require extreme care in their disposal to avoid any adverse impacts. Disposal or dumping of these E-Wastes also causes major issues because they are highly complex to handle and often contain highly toxic chemicals such as lead, cadmium, mercury, beryllium, brominated flame retardants (BFRs), polyvinyl chloride (PVC), and phosphorus compounds. Hence, E-Waste can be incorporated in concrete to support a more sustainable environment. This paper deals with the composition, preparation, properties, and classification of E-Waste. All these processes avoid dumping to landfills whilst conserving natural aggregate resources and providing a better environmental option. This paper also provides a detailed literature review on the behaviour of concrete with the incorporation of E-Wastes. Much research shows the strong possibility of using E-Waste as a substitute for aggregates, which eventually reduces the use of natural aggregates in concrete.

Keywords: dumping, electronic waste, landfill, toxic chemicals

Procedia PDF Downloads 156
7798 Object-Scene: Deep Convolutional Representation for Scene Classification

Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang

Abstract:

Traditional image classification is based on encoding schemes (e.g., Fisher Vector, Vector of Locally Aggregated Descriptors) with low-level image features (e.g., SIFT, HOG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNNs) have richer information but lack geometric invariance. For scene classification, there are scattered objects with different sizes, categories, layouts, numbers and so on. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method to take advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs trained separately on the ImageNet and Places datasets are used as the pre-trained models to extract deep convolutional features at multiple scales. This produces dense local activations. By analyzing the performance of different CNNs at multiple scales, it is found that each CNN works better in different scale ranges. A scale-wise CNN adaptation is reasonable since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, which is then merged into a single vector by a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of the first- and second-order differences. Hence, the scale-wise normalization followed by average pooling balances the influence of each scale, since different amounts of features are extracted. Third, the Fisher Vector representation based on the deep convolutional features is followed by a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories. Experimental results show that the scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets can boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which proves that the representation can be applied to other visual recognition tasks.
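
A simplified sketch of the aggregation step: a Fisher-vector-like encoding of dense local activations at each scale, scale-wise normalization, and merging into one global descriptor for a linear SVM. The local descriptors are random placeholders, only first-order statistics are kept, and the GMM size is arbitrary, so this is an assumption-laden illustration rather than the paper's full Fisher Vector pipeline.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
K, D = 8, 64                                    # GMM components, local-descriptor dimension
gmm = GaussianMixture(n_components=K, covariance_type="diag", random_state=0)
gmm.fit(rng.normal(size=(2000, D)))             # "training" local descriptors (placeholder)

def fisher_encode(local_desc):
    """First-order Fisher-vector-style encoding of an (N, D) set of local descriptors."""
    gamma = gmm.predict_proba(local_desc)                        # (N, K) soft assignments
    diff = (local_desc[:, None, :] - gmm.means_[None]) / np.sqrt(gmm.covariances_)[None]
    fv = (gamma[:, :, None] * diff).sum(0) / (len(local_desc) * np.sqrt(gmm.weights_)[:, None])
    fv = np.sign(fv) * np.sqrt(np.abs(fv))                       # power normalization
    return fv.ravel() / (np.linalg.norm(fv) + 1e-12)             # scale-wise L2 normalization

# one image: dense activations extracted at three scales, encoded per scale, then averaged
per_scale = [fisher_encode(rng.normal(size=(n_patches, D))) for n_patches in (36, 100, 196)]
image_descriptor = np.mean(per_scale, axis=0)                    # fed to the linear SVM
print(image_descriptor.shape)                                    # (K * D,) = (512,)
```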

Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization

Procedia PDF Downloads 314
7797 A Framework Based on Dempster-Shafer Theory of Evidence Algorithm for the Analysis of the TV-Viewers’ Behaviors

Authors: Hamdi Amroun, Yacine Benziani, Mehdi Ammi

Abstract:

In this paper, we propose an approach for detecting the behavior of the viewers of a TV program in a non-controlled environment. The experiment we propose is based on the use of three types of connected objects (a smartphone, a smartwatch, and a connected remote control). 23 participants were observed while watching their TV programs during three phases: before, during and after watching a TV program. Their behaviors were detected using an approach based on the Dempster-Shafer Theory (DST) in two phases. The first phase is to approximate the mass functions dynamically using an approach based on the correlation coefficient. The second phase is to calculate the approximate mass functions. To approximate the mass functions, two approaches have been tested: the first approach was to divide each feature's data space into cells, each one having a specific probability distribution over the behaviors. The probability distributions were computed statistically (estimated by the empirical distribution). The second approach was to predict the TV-viewing behaviors through the use of classifier algorithms and add uncertainty to the prediction based on the uncertainty of the model. Results showed that mixing the fusion rule with the computation of the initial approximate mass functions using a classifier led to overall success rates of 96%, 95% and 96% for the first, second and third TV-viewing phases, respectively. The results were also compared to those found in the literature. This study aims to anticipate certain actions in order to maintain the attention of TV viewers towards the proposed TV programs with usual connected objects, taking into account the various uncertainties that can be generated.
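
A minimal sketch of Dempster's rule of combination, the fusion step at the core of the DST approach above. The frame of discernment and the mass values are invented for illustration; in the paper the masses come from the correlation-based approximation or from classifier outputs.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions defined over frozensets of hypotheses."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                       # mass falling on the empty set
    return {k: v / (1.0 - conflict) for k, v in combined.items()}   # normalize away conflict

# frame of discernment: viewer is watching (W) or not watching (N) the TV program
W, N, WN = frozenset("W"), frozenset("N"), frozenset("WN")
m_watch  = {W: 0.6, N: 0.1, WN: 0.3}     # evidence from the smartwatch (hypothetical masses)
m_remote = {W: 0.5, N: 0.2, WN: 0.3}     # evidence from the connected remote (hypothetical masses)
print(dempster_combine(m_watch, m_remote))
```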

Keywords: IoT, TV-viewing behavior identification, automatic classification, unconstrained environment

Procedia PDF Downloads 215
7796 Linking Corporate Entrepreneurship with Human Resources Management Practices

Authors: R. Maalej, I. Amami, S. Saadaoui

Abstract:

Within the growing body of literature on corporate entrepreneurship, there is a need to understand the relationship between human resource management and corporate entrepreneurship. This paper outlines the linkage between human resource management practices and corporate entrepreneurship. In response, we propose a review of the literature that is based on a conceptual reading of corporate entrepreneurship, human resource management practices and the relationship between them.

Keywords: human resource management, human resources management practices, corporate entrepreneurship, entrepreneur

Procedia PDF Downloads 398
7795 Exploring Gender-Based Violence in Indigenous Communities in Argentina and Costa Rica: A Review of the Current Literature

Authors: Jocelyn Jones

Abstract:

The objective of this literature review is to provide an assessment of the current literature concerning gender-based violence (GBV) within indigenous communities in Argentina and Costa Rica, and various public intervention strategies that have been implemented to counter the increasing rates of violence within these populations. The review will address some of the unique challenges and contextual factors influencing the prevalence and response to such violence, including the enduring impact of colonialism on familial structures, community dynamics, and the perpetuation of violence. Drawing on indigenous feminist perspectives, the paper critically assesses the intersectionality of gender, ethnicity, and socio-economic status in shaping the experiences of indigenous women, men, and gender-diverse individuals. In comparing the two nations, the literature review identifies commonalities and divergences in policy frameworks, legal responses, and grassroots initiatives aimed at addressing GBV. Regarding the assessment of the efficacy of existing interventions, the paper will consider the role of cultural revitalization, community engagement, and collaborative efforts between indigenous communities and external agencies in the development of future policies. Moreover, the review will highlight the importance of decolonizing methodologies in research and intervention strategies, and the need to emphasise culturally sensitive approaches that respect and integrate indigenous worldviews and traditional knowledge systems. Additionally, the paper will explore the potential impact of colonial legacies, resource extraction, and land dispossession on exacerbating vulnerabilities to GBV within indigenous communities. The aim of this paper is to contribute to a more in-depth understanding of GBV in indigenous contexts in order to promote cross-cultural learning and inform future research. Ultimately, this review will demonstrate the necessity of adopting a holistic and context-specific approach to address gender-based violence in indigenous communities.

Keywords: gender based violence, indigenous, colonialism, literature review

Procedia PDF Downloads 61
7794 Machine Learning Approach for Automating Electronic Component Error Classification and Detection

Authors: Monica Racha, Siva Chandrasekaran, Alex Stojcevski

Abstract:

Engineering programs focus on promoting students' personal and professional development by ensuring that students acquire technical and professional competencies during their four-year studies. The traditional engineering laboratory provides an opportunity for students to 'practice by doing,' and laboratory facilities aid them in obtaining insight and understanding of their discipline. Due to rapid technological advancements and the current COVID-19 outbreak, traditional labs are transforming into virtual learning environments. Aim: To better understand the limitations of the physical laboratory, this research study aims to use a Machine Learning (ML) algorithm that interfaces with the Augmented Reality HoloLens and analyzes the captured images to classify and detect electronic components. The automated electronic component error classification and detection system automatically detects and classifies the position of all components on a breadboard by using the ML algorithm. This research will assist first-year undergraduate engineering students in conducting laboratory practices without any supervision. With the help of the HoloLens and the ML algorithm, students will reduce component placement errors on a breadboard and increase the efficiency of simple laboratory practices conducted virtually. Method: Images of breadboards, resistors, capacitors, transistors, and other electrical components will be collected using the HoloLens 2 and stored in a database. The collected image dataset will then be used for training a machine learning model. The raw images will be cleaned, processed, and labeled to facilitate further analysis of component error classification and detection. For instance, when students conduct laboratory experiments, the HoloLens captures images of students placing different components on a breadboard. The images are forwarded to the server for detection in the background. A hybrid Convolutional Neural Network (CNN) and Support Vector Machine (SVM) algorithm will be used to train the dataset for object recognition and classification. The convolution layers extract image features, which are then classified using a Support Vector Machine (SVM). By adequately labeling and classifying the training data, the model will predict and categorize components and assess whether students place them correctly. As a result, the data acquired through the HoloLens includes images of students assembling electronic components. The system constantly checks whether students appropriately position components on the breadboard and connect the components so that the circuit functions. When students misplace any components, the HoloLens predicts the error before the user places the components in the incorrect position and prompts students to correct their mistakes. This hybrid CNN-SVM approach to automating electronic component error classification and detection eliminates component connection problems and minimizes the risk of component damage. Conclusion: These augmented reality smart glasses powered by machine learning provide a wide range of benefits to supervisors, professionals, and students. They help customize the learning experience, which is particularly beneficial in large classes with limited time. The study determines the accuracy with which machine learning algorithms can forecast whether students are making the correct decisions and completing their laboratory tasks.
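
A rough sketch of the hybrid idea described above: convolution layers act as the feature extractor and an SVM performs the final classification. The data, image size, class names and architecture are placeholders under stated assumptions; the actual system is trained on HoloLens photographs of breadboard components.

```python
import numpy as np
from tensorflow.keras import layers, models
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((200, 64, 64, 3)).astype("float32")      # stand-in for captured component images
y = rng.integers(0, 4, size=200)                        # e.g. resistor / capacitor / transistor / misplaced

cnn = models.Sequential([                               # small, untrained extractor kept short for brevity
    layers.Input(shape=(64, 64, 3)),
    layers.Conv2D(16, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"), layers.GlobalAveragePooling2D(),
])
features = cnn.predict(X, verbose=0)                    # convolutional image features
svm = SVC(kernel="rbf").fit(features[:150], y[:150])    # SVM does the final classification
print("held-out accuracy:", svm.score(features[150:], y[150:]))
```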

Keywords: augmented reality, machine learning, object recognition, virtual laboratories

Procedia PDF Downloads 120
7793 A Psychophysiological Evaluation of an Effective Recognition Technique Using Interactive Dynamic Virtual Environments

Authors: Mohammadhossein Moghimi, Robert Stone, Pia Rotshtein

Abstract:

Recording psychological and physiological correlates of human performance within virtual environments and interpreting their impacts on human engagement, 'immersion' and related emotional or 'affective' states is both academically and technologically challenging. By exposing participants to an affective, real-time (game-like) virtual environment, designed and evaluated in an earlier study, a psychophysiological database containing the EEG, GSR and heart rate of 30 male and female gamers, exposed to 10 games, was constructed. Some 174 features were subsequently identified and extracted from a number of windows, with 28 different timing lengths (e.g. 2, 3, 5, etc. seconds). After reducing the number of features to 30, using a feature selection technique, K-Nearest Neighbour (KNN) and Support Vector Machine (SVM) methods were subsequently employed for the classification process. The classifiers categorised the psychophysiological database into four affective clusters (defined based on a 3-dimensional space – valence, arousal and dominance) and eight emotion labels (relaxed, content, happy, excited, angry, afraid, sad, and bored). The KNN and SVM classifiers achieved average cross-validation accuracies of 97.01% (±1.3%) and 92.84% (±3.67%), respectively. However, no significant differences were found in the classification process based on affective clusters or emotion labels.
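
A compact sketch of the classification stage: cross-validating KNN and SVM on feature vectors extracted from windowed EEG/GSR/heart-rate signals. The feature matrix here is random and the fold count is an assumption; in the study the 30 selected psychophysiological features and the labeled affective clusters take its place.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
X = rng.normal(size=(300, 30))               # 300 windows x 30 selected features (placeholder)
y = rng.integers(0, 4, size=300)             # four affective clusters (valence/arousal/dominance space)

for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=5)), ("SVM", SVC(kernel="rbf"))]:
    acc = cross_val_score(make_pipeline(StandardScaler(), clf), X, y, cv=10)
    print(f"{name}: {acc.mean():.2%} ± {acc.std():.2%}")
```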

Keywords: virtual reality, affective computing, affective VR, emotion-based affective physiological database

Procedia PDF Downloads 212
7792 Hacking's 'Between Goffman and Foucault': A Theoretical Frame for Criminology

Authors: Tomás Speziale

Abstract:

This paper aims to analyse how Ian Hacking states the theoretical basis of his research on the classification of people. Although all his early philosophical education had been based in Foucault, it is also true that Erving Goffman’s perspective provided him with epistemological and methodological tools for understanding face-to-face relationships. Hence, all his works must be thought of as social science texts that combine the research on how the individuals are constituted ‘top-down’ (as in Foucault), with the inquiry into how people renegotiate ‘bottom-up’ the classifications about them. Thus, Hacking´s proposal constitutes a middle ground between the French Philosopher and the American Sociologist. Placing himself between both authors allows Hacking to build a frame that is expected to adjust to Social Sciences’ main particularity: the fact that they study interactive kinds. These are kinds of people, which imply that those who are classified can change in certain ways that prompt the need for changing previous classifications themselves. It is all about the interaction between the labelling of people and the people who are classified. Consequently, understanding the way in which Hacking uses Foucault’s and Goffman’s theories is essential to fully comprehend the social dynamic between individuals and concepts, what Bert Hansen had called dialectical realism. His theoretical proposal, therefore, is not only valuable because it combines diverse perspectives, but also because it constitutes an utterly original and relevant framework for Sociological theory and particularly for Criminology.

Keywords: classification of people, Foucault's archaeology, Goffman's interpersonal sociology, interactive kinds

Procedia PDF Downloads 325
7791 A Review of Routing Protocols for Mobile Ad-Hoc NETworks (MANET)

Authors: Hafiza Khaddija Saman, Muhammad Sufyan

Abstract:

The increase in availability and popularity of mobile wireless devices has led researchers to develop a wide variety of Mobile Ad-hoc Networking (MANET) protocols to exploit the unique communication opportunities presented by these devices. Devices are able to communicate directly using the wireless spectrum in a peer-to-peer fashion and route messages through intermediate nodes; however, the nature of shared wireless communication and mobile devices results in many routing and security challenges which must be addressed before deploying a MANET. In this paper, we investigate the range of MANET routing protocols available and discuss the functionalities of several, ranging from early protocols such as DSDV to more advanced ones such as MAODV; our protocol study focuses upon works by Perkins in developing and improving MANET routing. A range of literature relating to the field of MANET routing was identified and reviewed; we also reviewed literature on the topic of securing AODV-based MANETs, as this may be the most popular MANET protocol. The literature review identified a number of trends within research papers, such as exclusive use of the random waypoint mobility model, excluding key metrics from simulation results and not comparing protocol performance against available alternatives.

Keywords: protocol, MANET, ad-Hoc, communication

Procedia PDF Downloads 237
7790 A Review of Data Visualization Best Practices: Lessons for Open Government Data Portals

Authors: Bahareh Ansari

Abstract:

Background: The Open Government Data (OGD) movement in the last decade has encouraged many government organizations around the world to make their data publicly available to advance democratic processes. But current open data platforms have not yet reached their full potential in supporting all interested parties. To make the data useful and understandable for everyone, scholars have suggested that opening the data should be supplemented by visualization. However, different visualizations of the same information can dramatically change an individual's cognitive and emotional experience in working with the data. This study reviews the data visualization literature to create a list of the methods empirically tested to enhance users' performance and experience in working with a visualization tool. This list can be used in evaluating OGD visualization practices and informing future open data initiatives. Methods: Previous reviews of the visualization literature categorized visualization outcomes into four categories: recall/memorability, insight/comprehension, engagement, and enjoyment. To identify the papers, a search for these outcomes was conducted in the abstracts of the publications of top-tier visualization venues, including IEEE Transactions on Visualization and Computer Graphics, Computer Graphics, and the proceedings of the CHI Conference on Human Factors in Computing Systems. The search results are complemented with a search in the references of the identified articles, and a search for the 'open data visualization' and 'visualization evaluation' keywords in the IEEE Xplore and ACM digital libraries. Articles are included if they provide empirical evidence through controlled user experiments, or provide a review of these empirical studies. The qualitative synthesis of the studies focuses on identifying and classifying the methods, and the conditions under which they are examined to positively affect the visualization outcomes. Findings: The keyword search yields 760 studies, of which 30 are included after the title/abstract review. The classification of the included articles shows five distinct methods: interactive design, aesthetic (artistic) style, storytelling, decorative elements that do not provide extra information (including text, images, and embellishments on the graphs), and animation. Studies on decorative elements show consistency on the positive effects of these elements on user engagement and recall but are less consistent in their examination of user performance. This inconsistency could be attributable to the particular data type or specific design method used in each study. The interactive design studies are consistent in their findings of a positive effect on the outcomes. Storytelling studies show some inconsistencies regarding the design effect on user engagement, enjoyment, recall, and performance, which could be indicative of the specific conditions required for the use of this method. The last two methods, aesthetics and animation, have been less frequent in the included articles and provide consistent positive results on some of the outcomes. Implications for e-government: The review of visualization best-practice methods shows that each of these methods is beneficial under specific conditions. By using these methods in potentially beneficial conditions, OGD practices can encourage a wide range of individuals to engage with government data and ultimately take part in government policy-making procedures.

Keywords: best practices, data visualization, literature review, open government data

Procedia PDF Downloads 90
7789 Technological Information about Photovoltaics Applied in Urban Residences

Authors: Stephanie Fabris Russo, Daiane Costa Guimarães, Jonas Pedro Fabris, Maria Emilia Camargo, Suzana Leitão Russo, José Augusto Andrade Filho

Abstract:

Among renewable energy sources, solar energy is the one that has stood out. Solar radiation can be used as a thermal energy source and can also be converted into electricity by means of effects on certain materials, such as thermoelectric and photovoltaic panels. These panels are often used to generate energy in homes, buildings, arenas, etc., and have low pollution emissions. Thus, a technological prospecting study was performed to find patents related to the use of photovoltaic plates in urban residences. The patent search was based on ESPACENET, associating the keywords photovoltaic and home, where we found 136 patent documents in the period 1994-2015 in the title and abstract fields. Note that the years 2009, 2010, 2011, 2012, 2013 and 2014 had the highest numbers of applicants, with 11, 13, 23, 29, 15 and 21, respectively. Regarding the countries in which this technology was deposited, it is clear that China leads with 67 patent deposits, followed by Japan with 38 patent applications. It is important to note that most depositors (50%) are companies, 44% are individual inventors and only 6% are universities. Regarding the International Patent Classification (IPC) codes, we noted that the most frequent classification in the results was H02J3/38, which represents arrangements for the parallel feeding of a single network by two or more generators, converters or transformers. Among all categories, the H section, which stands for Electricity, accounts for 70% of the patents.

Keywords: photovoltaic, urban residences, technology forecasting, prospecting

Procedia PDF Downloads 277
7788 Movie Genre Preference Prediction Using Machine Learning for Customer-Based Information

Authors: Haifeng Wang, Haili Zhang

Abstract:

Most movie recommendation systems have been developed for customers to find items of interest. This work introduces a predictive model usable by small and medium-sized enterprises (SMEs) that are in need of a data-based and analytical approach to stock proper movies for local audiences and retain more customers. We used classification models to extract features from thousands of customers' demographic, behavioral and social information to predict their movie genre preference. In the implementation, a Gaussian kernel support vector machine (SVM) classification model and a logistic regression model were established to extract features from the sample data, and their in-sample test errors were compared. A comparison of out-of-sample error was also made under different Vapnik–Chervonenkis (VC) dimensions in the machine learning algorithm to find and prevent overfitting. The Gaussian kernel SVM prediction model can correctly predict movie genre preferences in 85% of positive cases. The accuracy of the algorithm increased to 93% with a smaller VC dimension and less overfitting. These findings advance our understanding of how to use a machine learning approach to predict customers' preferences with a small data set and design prediction tools for these enterprises.
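
A sketch of the model comparison described above: a Gaussian-kernel SVM versus logistic regression, checking in-sample against out-of-sample error to watch for overfitting. The customer features are simulated stand-ins, assumed here in place of the demographic, behavioral and social data the study uses.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 12))                          # customer feature vectors (placeholder)
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.5, size=1000) > 0.5).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("Gaussian SVM", SVC(kernel="rbf", gamma="scale")),
                    ("Logistic regression", LogisticRegression(max_iter=1000))]:
    model.fit(X_tr, y_tr)
    print(f"{name:19s} error in-sample: {1 - model.score(X_tr, y_tr):.2f}, "
          f"out-of-sample: {1 - model.score(X_te, y_te):.2f}")
```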

Keywords: computational social science, movie preference, machine learning, SVM

Procedia PDF Downloads 244
7787 An Improved Parallel Algorithm of Decision Tree

Authors: Jiameng Wang, Yunfei Yin, Xiyu Deng

Abstract:

Parallel optimization is one of the important research topics of data mining at this stage. Taking Classification and Regression Tree (CART) parallelization as an example, this paper proposes a parallel data mining algorithm based on SSP-OGini-PCCP. Aiming at the problem of choosing the best CART segmentation point, this paper designs an S-SP model without data association; and in order to calculate the Gini index efficiently, a parallel OGini calculation method is designed. In addition, in order to improve the efficiency of the pruning algorithm, a synchronous PCCP pruning strategy is proposed in this paper. In this paper, the optimal segmentation calculation, Gini index calculation, and pruning algorithm are studied in depth. These are important components of parallel data mining. By constructing a distributed cluster simulation system based on SPARK, data mining methods based on SSP-OGini-PCCP are tested. Experimental results show that this method can increase the search efficiency of the best segmentation point by an average of 89%, increase the search efficiency of the Gini segmentation index by 3853%, and increase the pruning efficiency by 146% on average; and as the size of the data set increases, the performance of the algorithm remains stable, which meets the requirements of contemporary massive data processing.
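
A serial reference sketch of the two computations the paper parallelizes: the Gini index of a label vector and the search for the best segmentation point of a single feature. The parallel SSP/OGini/PCCP machinery itself is not reproduced; this only illustrates what those components compute.

```python
import numpy as np

def gini(labels):
    """Gini impurity: 1 - sum_k p_k^2."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(feature, labels):
    """Return the threshold of `feature` minimizing the weighted Gini of the two children."""
    best_t, best_score = None, np.inf
    for t in np.unique(feature)[:-1]:                    # candidate segmentation points
        left, right = labels[feature <= t], labels[feature > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

x = np.array([2.0, 3.5, 1.0, 7.2, 6.8, 5.5])
y = np.array([0, 0, 0, 1, 1, 1])
print(best_split(x, y))                                  # clean split between 3.5 and 5.5 -> (3.5, 0.0)
```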

Keywords: classification, Gini index, parallel data mining, pruning ahead

Procedia PDF Downloads 106
7786 Theoretical Literature Review on Lack of Cardiorespiratory Fitness and Its Effects on Children

Authors: E. Abdi

Abstract:

The purpose of this theoretical literature review is to study the relevant academic literature on the lack of cardiorespiratory fitness and its effects on children. A total of thirty-eight relevant documents were identified and considered for this review, nineteen of which were original research articles published in peer-reviewed journals. The other nineteen articles were statistical documents. This document is structured to examine 4 effects of a deficiency in cardiorespiratory fitness in school-aged children: (a) obesity, (b) inadequate fitness level, (c) unhealthy lifestyle, and (d) academics. The categories provide a theoretical framework for future studies. The results are broken down into 6 sections: (a) academics, (b) healthy lifestyle, (c) low cost, (d) obesity, (e) Relative Age Effect (RAE), and (f) race/poverty. The study discusses how regular physical fitness assists children and adolescents in developing healthy physical activity behaviors which can be sustained throughout adult life. The conclusion suggests that advocacy for increasing physical activity and decreasing sedentary behaviors at school and at home is necessary.

Keywords: cardiorespiratory, endurance, physical activity, physical fitness

Procedia PDF Downloads 417
7785 Remote Sensing Application in Environmental Researches: Case Study of Iran Mangrove Forests Quantitative Assessment

Authors: Neda Orak, Mostafa Zarei

Abstract:

Environmental assessment is an important step in environmental management, and various methods and techniques have been produced and implemented for it. Remote sensing (RS) is widely used in many scientific and research fields such as geology, cartography, geography, agriculture, forestry, land use planning, environment, etc. It can show cyclical changes of earth surface objects, and it can also delineate the limits of earth phenomena on the basis of recorded changes and deviations in electromagnetic reflectance. This research assessed mangrove forests using RS techniques; the quantitative analysis of the mangrove forests in the Basatin and Bidkhoon estuaries was its aim. It was done using Landsat satellite images from 1975-2013 matched to ground control points. This part of the mangroves is the last distribution in the northern hemisphere, so the work can provide a good background for better management of this important ecosystem. Landsat has provided researchers with valuable images for detecting earth changes. This research used the MSS, TM, ETM+ and OLI sensors from 1975, 1990, 2000 and 2003-2013. Changes were studied, after essential corrections such as error fixing, band combination and georeferencing to the 2012 image as the base image, by maximum likelihood classification and the IPVI index. This was done by supervised classification. A 2004 Google Earth image and GPS ground points (2010-2012) were used to compare the changes obtained from the satellite images. Results showed that the mangrove area in Bidkhoon in 2012 was 1119072 m2 by GPS, 1231200 m2 by maximum likelihood supervised classification and 1317600 m2 by IPVI. The Basatin areas were, respectively, 466644 m2, 88200 m2 and 63000 m2. The final results show the forests have declined naturally; in Basatin, the decline is due to human activities. The loss was offset by planting over many years, although the trend has been declining again in recent years. Thus, satellite images have a high ability to estimate all environmental processes. This research showed a high correlation between image-derived indexes such as IPVI and NDVI and the ground control points.
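
A small sketch of the vegetation indexes used in the change analysis, computed from Landsat red and near-infrared bands: IPVI = NIR / (NIR + Red) and NDVI = (NIR - Red) / (NIR + Red). The arrays and the mangrove threshold are placeholders standing in for calibrated reflectance rasters, not values from the study.

```python
import numpy as np

red = np.array([[0.08, 0.12], [0.30, 0.25]])     # red band reflectance (placeholder pixels)
nir = np.array([[0.45, 0.40], [0.32, 0.28]])     # near-infrared band reflectance (placeholder pixels)

ipvi = nir / (nir + red)                          # ranges 0..1; dense vegetation tends toward 1
ndvi = (nir - red) / (nir + red)                  # ranges -1..1; note IPVI = (NDVI + 1) / 2
mangrove_mask = ipvi > 0.75                       # threshold chosen only for illustration
print(np.round(ipvi, 2), mangrove_mask.sum() * 900, "m2 at 30 m Landsat resolution")
```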

Keywords: IPVI index, Landsat sensor, maximum likelihood supervised classification, Nayband National Park

Procedia PDF Downloads 274
7784 The Involvement of Viruses and Fungi in the Pathogenesis of Dental Infections

Authors: Wael Khalil, Elias Rahal, Ghassan Matar

Abstract:

Tooth-related infections, commonly named dental infections, have been described as the most common causes of tooth loss in adults. These pathologies are mostly periodontitis, pericoronitis, and periapical infection. The involvement of various bacteria in the pathogenesis of these pathologies has been thoroughly documented and confirmed in the literature. However, the variability in the severity and prognosis of these lesions among patients suggests the involvement of other pathogens, like viruses and fungi, in their pathogenesis. Several studies in the literature investigated the association of multiple viruses and fungi with the above-mentioned lesions, yet vast controversy remains concerning this subject. Aim: Our study aims to fill the gap in the literature concerning the contribution of adenovirus, HPV-16, EBV, fungi, and Candida to the pathogenesis of periodontitis, pericoronitis, and periapical infection. For this purpose, we utilized quantitative PCR for pathogen detection in saliva, gingival, and lesion samples from the involved subjects. Results: Some of these pathogens appeared to be associated with the investigated dental pathologies, while others showed no contribution to the pathogenesis of these lesions. Further investigation is required in order to identify the subtypes of the pathogens involved in these tooth-related oral pathologies.

Keywords: periodontitis, pericoronitis, dental abscess, PCR, microbiology

Procedia PDF Downloads 79
7783 Recent Trends in Supply Chain Delivery Models

Authors: Alfred L. Guiffrida

Abstract:

A review of the literature on supply chain delivery models which use delivery windows to measure delivery performance is presented. The review herein serves to meet the following objectives: (i) provide a synthesis of previously published literature on supply chain delivery performance models, (ii) provide in one paper a consolidation of research that can serve as a single source to keep researchers up to date with the research developments in supply chain delivery models, and (iii) identify gaps in the modeling of supply chain delivery performance which could stimulate new research agendas.

Keywords: delivery performance, delivery window, supply chain delivery models, supply chain performance

Procedia PDF Downloads 399