Search results for: PDF to story feature

1638 A Conceptual Analysis of Right of Taxpayers to Claim Refund in Nigeria

Authors: Hafsat Iyabo Sa'adu

Abstract:

A salient feature of Nigerian tax law is the right of the taxpayer to demand a refund where excess tax is paid. Section 23 of the Federal Inland Revenue Service (Establishment) Act, 2007 vests the Federal Inland Revenue Service with the power to make tax refunds and to set guidelines and requirements for the refund process from time to time. In addition, Section 61 of the same Act empowers the Federal Inland Revenue Service to issue information circulars to acquaint stakeholders with the policy on the refund process. A circular was issued to that effect, correcting the earlier position that such excess could only be paid to the claimant/taxpayer after the annual audit of the Service. Surprisingly, however, no such circular exists under the states' laws, so the tax refund system in Nigeria is inconsistent. This study therefore sets out to examine the current concept of tax refund in Nigeria. To achieve this objective, a doctrinal study was undertaken in which both federal and state laws were consulted, along with journals and textbooks. The research revealed that the law should be specific as to the time frame within which a refund must be made. It further revealed that a legal framework is needed for the tax system to recognize excess payment as a debt due from the state. This would provide a foundational framework for the relationship between taxpayers and the Federal Inland Revenue Service and promote effective tax administration in all the states of the federation. Several recommendations were made, especially concerning the legislative passage of a 'Refund Circular Bill' at the state level pursuant to the Federal Inland Revenue Service (Establishment) Act, 2007.

Keywords: claim, Nigeria, refund, right

Procedia PDF Downloads 119
1637 Interlingual Translation of Manipuri Folktales with the Ideas of André Lefevere's Translation

Authors: Thoudam Jomita Devi

Abstract:

This paper is an attempt to analyze the problems of translating Manipuri folktales into English and the strategies deployed. In Manipuri, folktales are known as Fungawari/Phungawari, which is similar to a Western bedtime story. The work makes special reference to the folktales of the Meetei community, the majority ethnic group of Manipur, India. For this paper's purpose, two folktales, Shandrembi Cheisra and Pebet, are chosen for analysis and discussion. The translation of folktales can contribute to intercultural communication and bridge the gap between generations. Translating Manipuri folktales is problematic on both cultural and linguistic levels. Therefore, the aim of this analysis is to understand how André Lefevere's (1992) ideas on translation could be applied to translating Manipuri folktales.

Keywords: cultural, folktales, intercultural, interlingual, translation

Procedia PDF Downloads 187
1636 A New Method Separating Relevant Features from Irrelevant Ones Using Fuzzy and OWA Operator Techniques

Authors: Imed Feki, Faouzi Msahli

Abstract:

Selection of relevant parameters from a high-dimensional process operation setting space is a problem frequently encountered in industrial process modelling. This paper presents a method for selecting the most relevant fabric physical parameters for each sensory quality feature. The proposed relevancy criterion has been developed using two approaches. The first uses a fuzzy sensitivity criterion that exploits, from experimental data, the relationship between physical parameters and all the sensory quality features for each evaluator; an OWA aggregation procedure is then applied to aggregate the ranking lists provided by the different evaluators. In the second approach, another panel of experts provides ranking lists of physical features according to their professional knowledge. By again applying OWA and a fuzzy aggregation model, the data-sensitivity-based ranking list and the knowledge-based ranking list are combined using the proposed percolation technique to determine the final ranking list. The key issue of the percolation technique is to filter the relevant features automatically and objectively by creating a gap between the scores of relevant and irrelevant parameters. It permits automatic generation of a threshold and thereby reduces the human subjectivity and arbitrariness involved in choosing thresholds manually. For a specific sensory descriptor, the threshold is defined systematically by iteratively aggregating (n times) the ranking lists generated by the OWA and fuzzy models, according to a specific algorithm. The percolation technique was applied to a real example, a well-known finished textile product (stonewashed denim, usually considered the most important quality criterion in jeans' evaluation), to separate the relevant physical features from irrelevant ones for each sensory descriptor. The originality and performance of the proposed method are shown by the variability in the number of physical features in the set of selected relevant parameters. Instead of selecting identical numbers of features with a predefined threshold, the proposed method adapts to the specific nature of the complex relations between sensory descriptors and physical features, proposing lists of relevant features of different sizes for different descriptors. To obtain more reliable results, the percolation technique was also applied to combine the fuzzy global relevancy and OWA global relevancy criteria, clearly distinguishing the scores of relevant physical features from those of irrelevant ones.
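
As a rough illustration of the OWA aggregation step described above, the sketch below, assuming illustrative relevancy scores and weights that are not taken from the paper, aggregates the evaluators' scores for each physical parameter and derives a ranking list:

```python
import numpy as np

def owa_aggregate(scores, weights):
    """Ordered Weighted Averaging: sort scores in descending order, then take a weighted sum."""
    ordered = np.sort(np.asarray(scores, dtype=float))[::-1]
    weights = np.asarray(weights, dtype=float)
    assert np.isclose(weights.sum(), 1.0), "OWA weights must sum to 1"
    return float(np.dot(ordered, weights))

# Hypothetical relevancy scores of four physical parameters given by three evaluators
# (rows = parameters, columns = evaluators).
relevancy = np.array([
    [0.9, 0.8, 0.7],   # parameter A
    [0.4, 0.5, 0.3],   # parameter B
    [0.8, 0.9, 0.9],   # parameter C
    [0.2, 0.1, 0.3],   # parameter D
])
owa_weights = [0.5, 0.3, 0.2]   # weights biased toward the highest scores (an assumption)

global_relevancy = [owa_aggregate(row, owa_weights) for row in relevancy]
ranking = np.argsort(global_relevancy)[::-1]   # parameters from most to least relevant
print(global_relevancy, ranking)
```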

Keywords: data sensitivity, feature selection, fuzzy logic, OWA operators, percolation technique

Procedia PDF Downloads 605
1635 Volunteered Geographic Information Coupled with Wildfire Fire Progression Maps: A Spatial and Temporal Tool for Incident Storytelling

Authors: Cassandra Hansen, Paul Doherty, Chris Ferner, German Whitley, Holly Torpey

Abstract:

Wildfire is a natural and inevitable occurrence, yet changing climatic conditions have increased the severity, frequency, and risk to human populations in the wildland/urban interface (WUI) of the Western United States. Rapid dissemination of accurate wildfire information is critical to both the Incident Management Team (IMT) and the affected community. With the advent of increasingly sophisticated information systems, GIS can now be used as a web platform for sharing geographic information in new and innovative ways, such as virtual story map applications. Crowdsourced information can be extraordinarily useful when coupled with authoritative information. Information abounds in the form of social media, emergency alerts, radio, and news outlets, yet many of these resources lack a spatial component when first distributed. In this study, we describe how twenty-eight volunteer GIS professionals across nine Geographic Area Coordination Centers (GACC) sourced, curated, and distributed Volunteered Geographic Information (VGI) from authoritative social media accounts focused on disseminating information about wildfires and public safety. The combination of fire progression maps with VGI incident information helps answer three critical questions about an incident: where the fire started, how and why the fire behaved in an extreme manner, and how we can learn from the fire incident's story to respond to and prepare for future fires in the area. By adding a spatial component to that shared information, this team has been able to visualize shared information about wildfire starts in an interactive map that answers these three critical questions in a more intuitive way. Additionally, long-term social and technical impacts on communities are examined in relation to situational awareness of the disaster through map layers and agency links, the number of views in a particular region of a disaster, community involvement, and sharing of this critical resource. Combined with a GIS platform and disaster VGI applications, this workflow and information become invaluable to communities within the WUI and bring spatial awareness for disaster preparedness, response, mitigation, and recovery. This study highlights progression maps as the ultimate storytelling mechanism through incident case studies and demonstrates how the combination of VGI and sophisticated applied cartographic methodology makes this an indispensable resource for authoritative information sharing.

Keywords: storytelling, wildfire progression maps, volunteered geographic information, spatial and temporal

Procedia PDF Downloads 179
1634 Modern Detection and Description Methods for Natural Plants Recognition

Authors: Masoud Fathi Kazerouni, Jens Schlemper, Klaus-Dieter Kuhnert

Abstract:

'Green planet' is one of Earth's names; the Earth is a terrestrial planet and the fifth largest planet of the solar system. Plants do not have a constant and steady distribution around the world, and even the variation of plant species is not the same within one specific region. The presence of plants is not limited to one field such as botany; they appear in other fields such as literature and mythology, and they hold useful and inestimable historical records. No one can imagine the world without oxygen, which is produced mostly by plants. Their influence becomes even more manifest since no other living species can exist on Earth without plants, as they also form the basic food staples. Regulation of the water cycle and oxygen production are further roles of plants, and these roles affect the environment and climate. Plants are the main components of agricultural activities, from which many countries benefit; therefore, plants have an impact on the political and economic situation and future of countries. Due to the importance of plants and their roles, the study of plants is essential in various fields, and consideration of their different applications leads to a focus on their details as well. Automatic recognition of plants is a novel field that can contribute to other research and future studies. Moreover, plants survive in different places and regions by means of adaptations, which are special factors that help them in hard life situations. Weather condition is one of the parameters that affect plant life and existence in an area, and recognition of plants in different weather conditions is a new window of research in the field. Only natural images are usable for considering weather conditions as new factors; thus, the resulting system will be generalized and useful. In order to have a general system, the distance from the camera to the plants is considered as another factor, as is the change of light intensity in the environment during the day. Adding these factors poses a considerable challenge in building an accurate and robust system, so the development of an efficient plant recognition system is essential. One important component of a plant is the leaf, which can be used to implement automatic systems for plant recognition without any human interface and interaction. Due to the nature of the images used, a characteristic investigation of plants is carried out, and the leaves of plants are selected as the first, most reliable characteristic. Four plant species are specified with the goal of classifying them with an accurate system. The current paper is devoted to the principal directions of the proposed methods and implemented system, the image dataset, and the results. The procedure of the algorithm and classification is explained in detail. The first steps, feature detection and description of visual information, are performed using the Scale Invariant Feature Transform (SIFT), HARRIS-SIFT, and FAST-SIFT methods. The accuracy of the implemented methods is computed, and in addition to this comparison, the robustness and efficiency of the results in different conditions are investigated and explained.
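
The sketch below illustrates the kind of SIFT, HARRIS-SIFT, and FAST-SIFT detection/description pipeline named above, assuming OpenCV and a placeholder image file; it is not the authors' implementation:

```python
import cv2
import numpy as np

# 'leaf.jpg' is a hypothetical placeholder, not a file from the paper's dataset.
img = cv2.imread("leaf.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
keypoints_sift, desc_sift = sift.detectAndCompute(img, None)   # plain SIFT

# HARRIS-SIFT: Harris corners as keypoints, described with SIFT descriptors.
harris = cv2.cornerHarris(np.float32(img), blockSize=2, ksize=3, k=0.04)
ys, xs = np.nonzero(harris > 0.01 * harris.max())
harris_kps = [cv2.KeyPoint(float(x), float(y), 3) for x, y in zip(xs, ys)]
_, desc_harris = sift.compute(img, harris_kps)

# FAST-SIFT: FAST keypoints described with SIFT descriptors.
fast = cv2.FastFeatureDetector_create()
fast_kps = fast.detect(img, None)
_, desc_fast = sift.compute(img, fast_kps)

print(len(keypoints_sift), len(harris_kps), len(fast_kps))
```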

Keywords: SIFT combination, feature extraction, feature detection, natural images, natural plant recognition, HARRIS-SIFT, FAST-SIFT

Procedia PDF Downloads 278
1633 Tumor Size and Lymph Node Metastasis Detection in Colon Cancer Patients Using MR Images

Authors: Mohammadreza Hedyehzadeh, Mahdi Yousefi

Abstract:

Colon cancer is one of the most common cancers, and its prevalence is predicted to increase due to poor eating habits. Nowadays, because of people's busy lifestyles, the consumption of fast food is increasing; therefore, the diagnosis of this disease and its treatment are of particular importance. To determine the best treatment approach for each colon cancer patient, the oncologist needs to know the stage of the tumor. The most common method to determine the tumor stage is the TNM staging system, in which M indicates the presence of metastasis, N indicates the extent of spread to the lymph nodes, and T indicates the size of the tumor. Clearly, in order to determine all three of these parameters, an imaging method must be used, and the gold-standard imaging protocols for this purpose are CT and PET/CT. In CT imaging, due to the use of X-rays, the risk of cancer and the absorbed dose of the patient are high, while the PET/CT method suffers from a lack of access to the device due to its high cost. Therefore, in this study, we aimed to estimate the tumor size and the extent of its spread to the lymph nodes using MR images. More than 1300 MR images were collected from the TCIA portal, and in the first step (pre-processing), histogram equalization was performed to improve image quality and the images were resized to a uniform size. Two expert radiologists, who have worked on colon cancer cases for more than 21 years, segmented the images and extracted the tumor region. The next step is feature extraction from the segmented images and classification of the data into three classes: T0N0, T3N1, and T3N2. In this article, the VGG-16 convolutional neural network has been used to perform both of the above-mentioned tasks, i.e., feature extraction and classification. This network has 13 convolutional layers for feature extraction and three fully connected layers with a softmax activation function for classification. To validate the proposed method, 10-fold cross-validation was used in such a way that the data were randomly divided into three parts: training (70% of the data), validation (10% of the data), and the rest for testing. This was repeated 10 times; each time, the accuracy, sensitivity, and specificity of the model were calculated, and the average of the ten repetitions is reported as the result. The accuracy, specificity, and sensitivity of the proposed method on the test dataset were 89.09%, 95.8%, and 96.4%, respectively. Compared to previous studies, the use of a safe imaging technique (MRI) and the avoidance of predefined hand-crafted imaging features to determine the stage of colon cancer patients are among the study's advantages.
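
A minimal sketch of the VGG-16 feature-extraction-plus-classification setup described above, assuming PyTorch/torchvision; the three output classes mirror T0N0, T3N1, and T3N2, while the tensors stand in for pre-processed MR slices:

```python
import torch
import torch.nn as nn
from torchvision import models

num_classes = 3                                   # T0N0, T3N1, T3N2

# Pretrained ImageNet weights would normally be loaded here; weights=None keeps
# the sketch self-contained without a download.
vgg16 = models.vgg16(weights=None)

# Keep the 13 convolutional layers as the feature extractor and replace the last
# fully connected layer so it outputs three classes (softmax is folded into the loss).
vgg16.classifier[6] = nn.Linear(in_features=4096, out_features=num_classes)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(vgg16.parameters(), lr=1e-4)

# One hypothetical training step on a batch of 8 MR slices resized to 224x224.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_classes, (8,))

logits = vgg16(images)                            # feature extraction + classification
loss = criterion(logits, labels)
loss.backward()
optimizer.step()
print(float(loss))
```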

Keywords: colon cancer, VGG-16, magnetic resonance imaging, tumor size, lymph node metastasis

Procedia PDF Downloads 61
1632 A Neurofeedback Learning Model Using Time-Frequency Analysis for Volleyball Performance Enhancement

Authors: Hamed Yousefi, Farnaz Mohammadi, Niloufar Mirian, Navid Amini

Abstract:

Investigating possible capacities of visual functions, where adapted mechanisms can enhance the capability of sports trainees, is a promising area of research, not only from the cognitive viewpoint but also in terms of unlimited applications in sports training. In this paper, the visual evoked potential (VEP) and event-related potential (ERP) signals of amateur and trained volleyball players were processed in a pilot study. Two groups of amateur and trained subjects were asked to imagine themselves in the state of receiving a ball while they were shown a simulated volleyball field. The proposed method is based on a set of time-frequency features extracted from the VEP signals, using algorithms such as the Gabor filter, continuous wavelet transform, and a multi-stage wavelet decomposition, that can be indicative of a subject being amateur or trained. The linear discriminant classifier achieves an accuracy, sensitivity, and specificity of 100% when the average of the repetitions of the signal corresponding to the task is used. The main purpose of this study is to investigate the feasibility of a fast, robust, and reliable feature/model determination as a neurofeedback parameter to be utilized for improving volleyball players' performance. The proposed measure has potential applications in brain-computer interface technology, where a real-time biomarker is needed.
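
A minimal sketch of time-frequency feature extraction followed by linear discriminant classification, assuming PyWavelets and scikit-learn; the synthetic signals and sampling rate are placeholders, not the study's VEP recordings:

```python
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

fs = 250                           # sampling rate in Hz (assumption)
n_trials, n_samples = 40, 2 * fs

rng = np.random.default_rng(0)
signals = rng.standard_normal((n_trials, n_samples))   # stand-ins for averaged VEP epochs
labels = np.repeat([0, 1], n_trials // 2)               # 0 = amateur, 1 = trained

def tf_features(sig):
    """Per-scale energy of a continuous wavelet transform plus sub-band energies
    from a multi-stage discrete wavelet decomposition."""
    cwt_coeffs, _ = pywt.cwt(sig, scales=np.arange(1, 32), wavelet="morl")
    cwt_energy = (cwt_coeffs ** 2).mean(axis=1)          # one value per scale
    dwt_coeffs = pywt.wavedec(sig, "db4", level=4)        # multi-stage decomposition
    dwt_energy = np.array([(c ** 2).mean() for c in dwt_coeffs])
    return np.concatenate([cwt_energy, dwt_energy])

X = np.vstack([tf_features(s) for s in signals])
clf = LinearDiscriminantAnalysis()
print(cross_val_score(clf, X, labels, cv=5).mean())
```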

Keywords: visual evoked potential, time-frequency feature extraction, short-time Fourier transform, event-related spectrum potential classification, linear discriminant analysis

Procedia PDF Downloads 138
1631 Spectrogram Pre-Processing to Improve Isotopic Identification to Discriminate Gamma and Neutrons Sources

Authors: Mustafa Alhamdi

Abstract:

An industrial application to classify gamma-ray and neutron events is investigated in this study using deep machine learning. Identification using a convolutional neural network and a recursive neural network showed a significant improvement in prediction accuracy in a variety of applications. The ability to identify the isotope type and activity from spectral information depends on feature extraction methods, followed by classification. The features extracted from the spectrum profiles aim to find patterns and relationships that represent the actual spectrum energy in a low-dimensional space. Increasing the level of separation between classes in feature space improves the possibility of enhancing classification accuracy. The nonlinear extraction of features by a neural network involves a variety of transformations and mathematical optimization, while principal component analysis depends on linear transformations to extract features and subsequently improve the classification accuracy. In this paper, the isotope spectrum information has been preprocessed by finding the frequency components relative to time and using them as a training dataset. The Fourier transform implementation used to extract the frequency components has been optimized with a suitable windowing function. Training and validation samples of different isotope profiles interacting with a CdTe crystal have been simulated using Geant4, and the readout electronic noise has been simulated by optimizing the mean and variance of a normal distribution. Ensemble learning, by combining the votes of many models, managed to improve the classification accuracy of the neural networks. The ability to discriminate gamma and neutron events in a single prediction approach has shown high accuracy using deep learning. The findings show that classification accuracy can be improved by applying the spectrogram preprocessing stage to the gamma and neutron spectra of different isotopes. Tuning the deep machine learning models by hyperparameter optimization enhanced the separation in the latent space and provided the ability to extend the number of detected isotopes in the training database. Ensemble learning contributed significantly to improving the final prediction.
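
A minimal sketch of the spectrogram preprocessing stage, assuming SciPy; the synthetic pulse train, sampling rate, and window settings are illustrative stand-ins for the simulated CdTe detector readout:

```python
import numpy as np
from scipy.signal import spectrogram

fs = 1_000_000                                   # sampling rate in Hz (assumption)
t = np.arange(0, 0.01, 1 / fs)
rng = np.random.default_rng(1)
signal = rng.normal(0.0, 0.05, t.size)           # electronic noise (mean/variance assumed)
signal[::2000] += 1.0                            # injected detector pulses

# Short-time Fourier transform with a Hann window, producing a time-frequency
# image that could be fed to a CNN classifier.
freqs, times, Sxx = spectrogram(signal, fs=fs, window="hann",
                                nperseg=256, noverlap=128)
log_spec = np.log1p(Sxx)                         # compress the dynamic range
print(log_spec.shape)                            # (frequency bins, time frames)
```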

Keywords: machine learning, nuclear physics, Monte Carlo simulation, noise estimation, feature extraction, classification

Procedia PDF Downloads 151
1630 Unveiling Comorbidities in Irritable Bowel Syndrome: A UK BioBank Study utilizing Supervised Machine Learning

Authors: Uswah Ahmad Khan, Muhammad Moazam Fraz, Humayoon Shafique Satti, Qasim Aziz

Abstract:

Approximately 10-14% of the global population experiences a functional disorder known as irritable bowel syndrome (IBS). The disorder is defined by persistent abdominal pain and an irregular bowel pattern, and it significantly impairs work productivity and disrupts patients' daily lives and activities. Although IBS is widespread, there is still an incomplete understanding of its underlying pathophysiology. This study aims to help characterize the phenotype of IBS patients by differentiating the comorbidities found in IBS patients from those in non-IBS patients using machine learning algorithms. We extracted samples coding for IBS from the UK BioBank cohort and randomly selected patients without a code for IBS to create a total sample size of 18,000. We selected the codes for comorbidities of these cases from two years before and after their IBS diagnosis and compared them to the comorbidities in the non-IBS cohort. Machine learning models, including Decision Trees, Gradient Boosting, Support Vector Machine (SVM), AdaBoost, Logistic Regression, and XGBoost, were employed to assess their accuracy in predicting IBS. The most accurate model was then chosen to identify the features associated with IBS; in our case, XGBoost feature importance was used as the feature selection method. We applied the different models to the top 10% of features, which numbered 50. The Gradient Boosting, Logistic Regression, and XGBoost algorithms yielded a diagnosis of IBS with optimal accuracies of 71.08%, 71.427%, and 71.53%, respectively. The comorbidities most closely associated with IBS included gut diseases (haemorrhoids, diverticular diseases), atopic conditions (asthma), and psychiatric comorbidities (depressive episodes or disorder, anxiety). This finding emphasizes the need for a comprehensive approach when evaluating the phenotype of IBS, suggesting the possibility of identifying new subsets of IBS rather than relying solely on the conventional classification based on stool type. Additionally, our study demonstrates the potential of machine learning algorithms to predict the development of IBS based on comorbidities, which may enhance diagnosis and facilitate better management of modifiable risk factors for IBS. Further research is necessary to confirm our findings and establish cause and effect. Alternative feature selection methods and even larger and more diverse datasets may lead to more accurate classification models. Despite these limitations, our findings highlight the effectiveness of Logistic Regression and XGBoost in predicting IBS diagnosis.
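
A minimal sketch of XGBoost-importance feature selection followed by classification, assuming the xgboost and scikit-learn packages; the random comorbidity matrix is a placeholder for the restricted UK Biobank data:

```python
import numpy as np
import xgboost as xgb
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n_samples, n_features = 18_000, 500
X = rng.integers(0, 2, size=(n_samples, n_features)).astype(float)  # binary comorbidity codes
y = rng.integers(0, 2, size=n_samples)                               # 1 = IBS, 0 = control

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Step 1: rank features by XGBoost importance and keep the top 10% (50 features here).
ranker = xgb.XGBClassifier(n_estimators=100, max_depth=4)
ranker.fit(X_tr, y_tr)
top_idx = np.argsort(ranker.feature_importances_)[::-1][: n_features // 10]

# Step 2: train the final classifier on the selected comorbidity features only.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_tr[:, top_idx], y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te[:, top_idx])))
```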

Keywords: comorbidities, disease association, irritable bowel syndrome (IBS), predictive analytics

Procedia PDF Downloads 119
1629 Natural Language News Generation from Big Data

Authors: Bastian Haarmann, Likas Sikorski

Abstract:

In this paper, we introduce an NLG application for the automatic creation of ready-to-publish texts from big data. The fully automatically generated stories bear a strong resemblance to the style in which a human writer would draw up a news story. Topics may include soccer games, stock exchange market reports, weather forecasts, and many more. The generation of the texts follows human language production, and each generated text is unique. Ready-to-publish stories written by a computer application can help humans quickly grasp the outcomes of big data analyses, save time-consuming pre-formulation work for journalists, and cater to rather small audiences by offering stories that would otherwise not exist.
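
As a rough, hedged illustration of data-to-text generation of the kind described (a simple template approach, not the authors' NLG pipeline), a sketch might look like this:

```python
# Hypothetical structured record for one soccer match; names and fields are illustrative.
match = {"home": "FC Example", "away": "SC Sample", "home_goals": 3, "away_goals": 1,
         "scorer": "J. Doe", "minute": 57}

def soccer_report(m: dict) -> str:
    """Turn one structured match record into a short, publishable sentence pair."""
    outcome = "beat" if m["home_goals"] > m["away_goals"] else "lost to"
    return (f"{m['home']} {outcome} {m['away']} {m['home_goals']}-{m['away_goals']}. "
            f"{m['scorer']} sealed the result with a goal in the {m['minute']}th minute.")

print(soccer_report(match))
```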

Keywords: big data, natural language generation, publishing, robotic journalism

Procedia PDF Downloads 431
1628 Women's Contemporary Dystopias: Feminist Protagonists Taking Back Control

Authors: Natalia Fontes De Oliveira

Abstract:

The Canadian author Margaret Atwood deconstructs the tainted dichotomies between women and men by embracing disorder throughout her dystopias. In Atwood's The Testaments, nature can be seen as a background to the story as well as a metaphorical expression of the characters' states of mind; nevertheless, the protagonists' nature writing conveys a curiosity about the pre-established sanctions of a docile garden, viewing nature as an autonomous entity, especially when they are away from the confinements of Gilead's regime. The three narrating protagonists, Agnes, Aunt Lydia, and Nicole, use nature writing subversively as a form of rebellion. This paper investigates how the three protagonists narrate nature from an intimist point of view, with sensibility to observe the multiple relationships among humanity, nature, and the impositions of a theocratic, ultraconservative patriarchal society.

Keywords: contemporary literature, dystopias, feminism, women’s writing

Procedia PDF Downloads 171
1627 A Robust and Efficient Segmentation Method Applied for Cardiac Left Ventricle with Abnormal Shapes

Authors: Peifei Zhu, Zisheng Li, Yasuki Kakishita, Mayumi Suzuki, Tomoaki Chono

Abstract:

Segmentation of the left ventricle (LV) from cardiac ultrasound images provides a quantitative functional analysis of the heart for diagnosing disease. The Active Shape Model (ASM) is a widely used approach for LV segmentation, but it suffers from the drawback that the initialization of the shape model is not sufficiently close to the target, especially when dealing with the abnormal shapes found in disease. In this work, a two-step framework is proposed to improve the accuracy and speed of model-based segmentation. First, a robust and efficient detector based on a Hough forest is proposed to localize cardiac feature points, and these points are used to predict the initial fit of the LV shape model. Second, to achieve more accurate and detailed segmentation, ASM is applied to further fit the LV shape model to the cardiac ultrasound image. The performance of the proposed method is evaluated on a dataset of 800 cardiac ultrasound images, most of which show abnormal shapes. The proposed method is compared to several combinations of ASM and existing initialization methods. The experimental results demonstrate that the accuracy of feature point detection for initialization was improved by 40% compared to existing methods. Moreover, the proposed method significantly reduces the number of necessary ASM fitting loops, thus speeding up the whole segmentation process. Therefore, the proposed method achieves more accurate and efficient segmentation results and is applicable to unusual heart shapes associated with cardiac disease, such as left atrial enlargement.

Keywords: hough forest, active shape model, segmentation, cardiac left ventricle

Procedia PDF Downloads 340
1626 Code Embedding for Software Vulnerability Discovery Based on Semantic Information

Authors: Joseph Gear, Yue Xu, Ernest Foo, Praveen Gauravaran, Zahra Jadidi, Leonie Simpson

Abstract:

Deep learning methods have seen increasing application to the long-standing security research goal of automatic vulnerability detection in source code. Attention, however, must still be paid to the task of producing vector representations of source code (code embeddings) as input for these deep learning models. Graphical representations of code, most predominantly Abstract Syntax Trees and Code Property Graphs, have received some use in this task of late; however, for very large graphs representing very large code snippets, learning becomes prohibitively computationally expensive. This expense may be reduced by intelligently pruning the input to only vulnerability-relevant information; however, little research in this area has been performed. Additionally, most existing work comprehends code based solely on the structure of the graph, at the expense of the information contained in the graph's nodes. This paper proposes Semantic-enhanced Code Embedding for Vulnerability Discovery (SCEVD), a deep learning model which uses semantic-based feature selection for its vulnerability classification model. It uses information from the nodes as well as the structure of the code graph in order to select the features most indicative of the presence or absence of vulnerabilities. The model is implemented and experimentally tested using the SARD Juliet vulnerability test suite to determine its efficacy. It improves on existing code graph feature selection methods, as demonstrated by its improved ability to discover vulnerabilities.
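
As a rough, hedged illustration of turning code structure into a vector representation (using Python's ast module as a stand-in for the paper's graph tooling, not the SCEVD implementation), one might write:

```python
import ast
from collections import Counter

# A toy snippet with a deliberately unchecked write, purely for illustration.
snippet = """
def copy(dst, src, n):
    for i in range(n):        # no bounds check on dst: a toy 'vulnerability'
        dst[i] = src[i]
"""

tree = ast.parse(snippet)
node_counts = Counter(type(node).__name__ for node in ast.walk(tree))

# A fixed vocabulary turns the per-node-type counts into a dense vector
# that a downstream classifier could consume.
vocab = sorted(node_counts)
embedding = [node_counts[name] for name in vocab]
print(list(zip(vocab, embedding)))
```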

Keywords: code representation, deep learning, source code semantics, vulnerability discovery

Procedia PDF Downloads 161
1625 Design and Development of Bar Graph Data Visualization in 2D and 3D Space Using Front-End Technologies

Authors: Sourabh Yaduvanshi, Varsha Namdeo, Namrata Yaduvanshi

Abstract:

This study delves into the design and development of detailed 2D bar charts with d3.js, recognizing its limitations in generating 3D visuals within the Document Object Model (DOM). The study combines three.js with d3.js, facilitating a smooth evolution from 2D to immersive 3D representations. This fusion illustrates the synergy between front-end technologies and expands the possibilities of data visualization. The paper describes the methodologies used, explains how the two libraries are integrated, and serves as a guide for practitioners seeking to move beyond 2D constraints and bring data visualization into interactive three-dimensional space.

Keywords: design, development, front-end technologies, visualization

Procedia PDF Downloads 38
1624 Comparison of Wind Fragility for Window System in the Simplified 10 and 15-Story Building Considering Exposure Category

Authors: Viriyavudh Sim, WooYoung Jung

Abstract:

Window systems in high-rise buildings are occasionally subjected to excessive wind intensity, particularly during typhoons. The failure of a window system does not affect the overall safety of the structural performance; however, it can endanger the safety of the residents. In this paper, fragility curves for the window systems of two residential buildings were compared. The probability of failure for each individual window was determined with the Monte Carlo simulation method, and a lognormal cumulative distribution function was then used to represent the fragility. The results showed that windows located on the edge of the leeward wall were more susceptible to wind load and that the probability of failure for each window panel increased at higher floors.
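
A minimal sketch of deriving a wind fragility curve by Monte Carlo simulation and fitting a lognormal CDF, assuming NumPy/SciPy and illustrative capacity parameters rather than the paper's wind-load model:

```python
import numpy as np
from scipy.stats import norm, lognorm

rng = np.random.default_rng(7)
wind_speeds = np.arange(20, 81, 5)       # 3-s gust wind speeds in m/s (assumption)
n_sim = 10_000

# Window panel capacity expressed as an equivalent wind speed, lognormally distributed.
capacity = lognorm(s=0.2, scale=55).rvs(size=n_sim, random_state=rng)

# Probability of failure at each wind speed from Monte Carlo simulation.
pf = np.array([(capacity < v).mean() for v in wind_speeds])

# Fit a lognormal CDF P(v) = Phi((ln v - ln theta) / beta) to the fragility points.
mask = (pf > 0) & (pf < 1)
slope, intercept = np.polyfit(np.log(wind_speeds[mask]), norm.ppf(pf[mask]), 1)
beta = 1.0 / slope                        # dispersion
theta = np.exp(-intercept * beta)         # median capacity
print(f"median capacity ~{theta:.1f} m/s, dispersion beta ~{beta:.2f}")
```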

Keywords: wind fragility, window system, high rise building, wind disaster

Procedia PDF Downloads 314
1623 Hybrid Approach for Face Recognition Combining Gabor Wavelet and Linear Discriminant Analysis

Authors: A. Annis Fathima, V. Vaidehi, S. Ajitha

Abstract:

Face recognition systems find many applications in surveillance and human-computer interaction systems. As the applications using face recognition are of much importance and demand more accuracy, more robustness is expected from the face recognition system with less computation time. In this paper, a hybrid approach for face recognition combining Gabor Wavelets and Linear Discriminant Analysis (HGWLDA) is proposed. The normalized input grayscale image is approximated and reduced in dimension to lower the processing overhead of the Gabor filters. This image is convolved with a bank of Gabor filters of varying scales and orientations. LDA, a subspace analysis technique, is used to reduce the intra-class scatter and maximize the inter-class separation. The techniques used are 2-dimensional Linear Discriminant Analysis (2D-LDA), 2-dimensional bidirectional LDA ((2D)2LDA), and weighted 2-dimensional bidirectional Linear Discriminant Analysis (Wt (2D)2LDA). LDA reduces the feature dimension by extracting the features with the greatest variance. A k-Nearest Neighbour (k-NN) classifier is used to classify and recognize the test image by comparing its features with each of the training set features. The HGWLDA approach is robust against illumination conditions, as the Gabor features are illumination invariant. This approach also aims at a better recognition rate using a smaller number of features for varying expressions. The performance of the proposed HGWLDA approach is evaluated using the AT&T database, the MIT-India face database, and the faces94 database. The proposed HGWLDA approach is found to provide better results than the existing Gabor approach.
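
A minimal sketch of a Gabor filter bank followed by LDA and a k-NN classifier, assuming OpenCV and scikit-learn; random images stand in for the face databases, and standard LDA replaces the 2D-LDA variants for brevity:

```python
import numpy as np
import cv2
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

def gabor_features(img, scales=(7, 11, 15), orientations=4):
    """Convolve with a bank of Gabor kernels and pool each response into two statistics."""
    feats = []
    for ksize in scales:
        for k in range(orientations):
            theta = k * np.pi / orientations
            kernel = cv2.getGaborKernel((ksize, ksize), sigma=3.0, theta=theta,
                                        lambd=8.0, gamma=0.5)
            response = cv2.filter2D(img, cv2.CV_32F, kernel)
            feats.extend([response.mean(), response.std()])
    return np.array(feats)

rng = np.random.default_rng(3)
images = rng.random((60, 32, 32)).astype(np.float32)   # 60 toy grayscale "faces"
labels = np.repeat(np.arange(6), 10)                    # 6 subjects, 10 images each

X = np.vstack([gabor_features(im) for im in images])
model = make_pipeline(LinearDiscriminantAnalysis(n_components=5),
                      KNeighborsClassifier(n_neighbors=1))
model.fit(X, labels)
print("training accuracy:", model.score(X, labels))
```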

Keywords: face recognition, Gabor wavelet, LDA, k-NN classifier

Procedia PDF Downloads 467
1622 Fragility Assessment for Torsionally Asymmetric Buildings in Plan

Authors: S. Feli, S. Tavousi Tafreshi, A. Ghasemi

Abstract:

The present paper aims to evaluate the response of three-dimensional buildings with in-plan stiffness irregularities subjected simultaneously to two-way excitation ground motion records. This study is a broad-based fragility assessment with particular emphasis on the structural response at the in-plan flexible and stiff sides. To this end, three types of three-dimensional 5-story steel building structures with stiffness eccentricities were subjected to extensive nonlinear incremental dynamic analyses (IDA) utilizing Ibarra-Krawinkler deterioration models. Fragility assessment was implemented for different configurations of braces to investigate the losses in buildings with center of resisting (CR) eccentricities.

Keywords: Ibarra-Krawinkler, fragility assessment, flexible and stiff side, center of resisting

Procedia PDF Downloads 205
1621 Seismic Behavior of Existing Reinforced Concrete Buildings in California under Mainshock-Aftershock Scenarios

Authors: Ahmed Mantawy, James C. Anderson

Abstract:

Numerous cases of earthquakes (mainshocks) followed by aftershocks have been recorded in California. In 1992, a pair of strong earthquakes occurred within three hours of each other in Southern California: the first shock occurred near the community of Landers and was assigned a magnitude of 7.3, and the second occurred near the city of Big Bear, about 20 miles west of the initial shock, and was assigned a magnitude of 6.2. In the same year, a series of three earthquakes occurred over two days in the Cape Mendocino area of Northern California; the mainshock was assigned a magnitude of 7.0, while the second and third shocks were both assigned a value of 6.6. This paper investigates the effect of a mainshock accompanied by aftershocks of significant intensity on the nonlinear behavior of reinforced concrete (RC) frame buildings, using the PERFORM-3D software. A 6-story building in San Bruno and a 20-story building in North Hollywood were selected for the study, as both have RC moment-resisting frame systems. The buildings are also instrumented at multiple floor levels as part of the California Strong Motion Instrumentation Program (CSMIP). Both buildings have recorded responses during past events, such as the Loma Prieta and Northridge earthquakes, which were used to verify the response parameters of the numerical models in PERFORM-3D. The verification of the numerical models shows good agreement between the calculated and recorded response values. Different scenarios of a mainshock followed by a series of aftershocks from real cases in California were then applied to the building models in order to investigate the structural behavior of the moment-resisting frame system. The behavior was evaluated in terms of lateral floor displacements, ductility demands, and inelastic behavior at critical locations. The analysis results showed that permanent displacements may occur due to plastic deformation during the mainshock, which can lead to higher displacements during aftershocks. Also, the inelastic response at plastic hinges during the mainshock can change the hysteretic behavior during the aftershocks. Higher ductility demands can also occur when buildings are subjected to trains of ground motions compared to individual ground motions. A general conclusion is that the occurrence of aftershocks following an earthquake can lead to increased damage within the elements of an RC frame building. Current code provisions for seismic design do not consider the probability of significant aftershocks when designing a new building in zones of high seismic activity.

Keywords: reinforced concrete, existing buildings, aftershocks, damage accumulation

Procedia PDF Downloads 280
1620 Incorporating Lexical-Semantic Knowledge into Convolutional Neural Network Framework for Pediatric Disease Diagnosis

Authors: Xiaocong Liu, Huazhen Wang, Ting He, Xiaozheng Li, Weihan Zhang, Jian Chen

Abstract:

The utilization of electronic medical record (EMR) data to establish disease diagnosis models has become an important research topic in biomedical informatics. Deep learning can automatically extract features from massive data, which has brought about breakthroughs in the study of EMR data. The challenge is that deep learning lacks semantic knowledge, which limits its practicality in medical science. This research proposes a method of incorporating lexical-semantic knowledge from abundant entities into a convolutional neural network (CNN) framework for pediatric disease diagnosis. First, medical terms are vectorized into Lexical Semantic Vectors (LSV), which are concatenated with the embedded word vectors of word2vec to enrich the feature representation. Second, the semantic distribution of medical terms serves as a Semantic Decision Guide (SDG) for the optimization of the deep learning models. The study evaluates the performance of the LSV-SDG-CNN model on four Chinese EMR datasets. Additionally, CNN, LSV-CNN, and SDG-CNN are designed as baseline models for comparison. The experimental results show that the LSV-SDG-CNN model outperforms the baseline models on all four Chinese EMR datasets; the best configuration of the model yielded an F1 score of 86.20%. The results clearly demonstrate that the CNN has been effectively guided and optimized by lexical-semantic knowledge, and the LSV-SDG-CNN model improves disease classification accuracy by a clear margin.
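
A minimal sketch of the embedding-concatenation idea, assuming PyTorch; the LSV and word2vec embedding tables are random placeholders, and all dimensions are illustrative:

```python
import torch
import torch.nn as nn

vocab_size, w2v_dim, lsv_dim, n_classes, seq_len = 5000, 100, 50, 10, 64

word2vec_emb = nn.Embedding(vocab_size, w2v_dim)   # would be loaded from word2vec vectors
lsv_emb = nn.Embedding(vocab_size, lsv_dim)        # would be loaded from medical-term LSVs

class LSVCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv1d(w2v_dim + lsv_dim, 128, kernel_size=3, padding=1)
        self.pool = nn.AdaptiveMaxPool1d(1)
        self.fc = nn.Linear(128, n_classes)

    def forward(self, token_ids):
        # Concatenate the word2vec and lexical-semantic embeddings per token.
        x = torch.cat([word2vec_emb(token_ids), lsv_emb(token_ids)], dim=-1)
        x = x.transpose(1, 2)            # (batch, channels, seq_len) for Conv1d
        x = torch.relu(self.conv(x))
        x = self.pool(x).squeeze(-1)
        return self.fc(x)                # softmax is applied inside the loss

model = LSVCNN()
batch = torch.randint(0, vocab_size, (4, seq_len))
print(model(batch).shape)                # torch.Size([4, 10])
```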

Keywords: convolutional neural network, electronic medical record, feature representation, lexical semantics, semantic decision

Procedia PDF Downloads 126
1619 Psycholinguistic Analysis on Stuttering Treatment through Systemic Functional Grammar in Tom Hooper’s The King’s Speech

Authors: Nurvita Wijayanti

Abstract:

The movie The King's Speech is based on a true story, telling of an English king who suffers from stuttering and how he receives treatment from a therapist so that he can reduce the high frequency of his stuttering. The treatment uses a unique approach implying linguistic principles. This study shows how language works significantly in treating the stuttering sufferer through a psychological approach, and a linguistic study is therefore carried out to analyze the treatment activity. Halliday's Systemic Functional Grammar is used as the main approach in this study, along with a qualitative descriptive method. The study finds that the therapist, though using the orthodox approach, applies a psycholinguistic method to overcome the king's stuttering.

Keywords: psycholinguistics, stuttering, systemic functional grammar, treatment

Procedia PDF Downloads 252
1618 Translation and Sociolinguistics of Classical Books

Authors: Laura de Almeida

Abstract:

This paper presents research involving the translation of classical books originally written in English and translated into Portuguese. The objective is to analyze the linguistic varieties evident in the originals and how they appear in the language the works were translated into. We base our study on sociolinguistic theory, more specifically the study of Black English Vernacular. Our methodology is built on collecting data from the speech of characters who use Black English Vernacular in books such as The Adventures of Huckleberry Finn by Mark Twain. In doing so, we compare the two versions of a book and how they reflect the linguistic variety. Our purpose is to show that some translators do not take care when dealing with linguistic variety; in other words, they simply translate the story without taking into account important linguistic aspects that need attention, such as language variation.

Keywords: classical books, linguistic variation, sociolinguistics, translation

Procedia PDF Downloads 398
1617 Object-Based Image Analysis for Gully-Affected Area Detection in the Hilly Loess Plateau Region of China Using Unmanned Aerial Vehicle

Authors: Hu Ding, Kai Liu, Guoan Tang

Abstract:

The Chinese Loess Plateau suffers from serious gully erosion induced by natural and human causes. Gully feature detection, including the gully-affected area and its two-dimensional parameters (length, width, area, etc.), is a significant task not only for researchers but also for policy-makers. This study aims at gully-affected area detection in three catchments of the Chinese Loess Plateau, selected in Changwu, Ansai, and Suide, using an unmanned aerial vehicle (UAV). The methodology includes a sequence of UAV data generation, image segmentation, feature calculation and selection, and random forest classification. Two experiments were conducted to investigate the influence of the segmentation strategy and feature selection. Results showed that the vertical and horizontal root-mean-square errors were below 0.5 m and 0.2 m, respectively, which is ideal for the Loess Plateau region. The segmentation strategy adopted in this paper, which considers topographic information, together with the optimal parameter combination, can improve the segmentation results. Moreover, the overall extraction accuracies achieved in Changwu, Ansai, and Suide were 84.62%, 86.46%, and 93.06%, respectively, which indicates that the proposed method for detecting the gully-affected area is more objective and effective than traditional methods. This study demonstrates that UAVs can bridge the gap between field measurement and satellite-based remote sensing, obtaining a balance of resolution and efficiency for catchment-scale gully erosion research.
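
A minimal sketch of the random forest classification stage, assuming scikit-learn; the feature table stands in for per-segment spectral, textural, and topographic attributes derived from the UAV data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(11)
n_segments, n_features = 2000, 12
X = rng.normal(size=(n_segments, n_features))   # one row per image object (segment)
y = rng.integers(0, 2, size=n_segments)         # 1 = gully-affected, 0 = not affected

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=300, random_state=0)
rf.fit(X_tr, y_tr)

print("overall accuracy:", accuracy_score(y_te, rf.predict(X_te)))
print("top feature indices:", np.argsort(rf.feature_importances_)[::-1][:5])
```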

Keywords: unmanned aerial vehicle (UAV), object-based image analysis, gully erosion, gully-affected area, Loess Plateau, random forest

Procedia PDF Downloads 218
1616 Wind Fragility of Window Glass in 10-Story Apartment with Two Different Window Models

Authors: Viriyavudh Sim, WooYoung Jung

Abstract:

Damage due to high wind is not limited to load-resisting components such as beams and columns; the majority of damage is due to breaches in the building envelope, such as broken roofs, windows, and doors. In this paper, the wind fragility of window glass in a residential apartment was determined to compare the difference between two window configuration models. The Monte Carlo simulation method was used to derive damage data, and analytical fragilities were constructed. The fragility of the window system showed that windows located in the leeward wall had a higher probability of failure, especially those close to the edge of the structure. Between the two window models, Model 2 had the higher probability of failure, due to the number of panels in this configuration.

Keywords: wind fragility, glass window, high rise building, wind disaster

Procedia PDF Downloads 259
1615 KUCERIA: A Media to Increase Students’ Reading Interest and Nutrition Knowledge

Authors: Luthfia A. Eka, Bertri M. Masita, G. Indah Lestari, Rizka Ryanindya, Anindita D. Nur, Asih Setiarini

Abstract:

The preferred habit nowadays is to watch television or listen to the radio rather than read a newspaper or magazine. This low interest in reading is the reason the Indonesian government passed a regulation to foster interest in reading early among schoolchildren through literacy programs. Literacy programs are held for the first 10-15 minutes before classes begin, and children are asked to read books other than textbooks, such as storybooks or magazines. In addition, elementary school children have a tendency to buy less healthy snacks around the school and do not know the nutritional content of the food purchased, even though snacks contribute greatly to children's daily energy and nutrient intake. The purpose of this study was to increase reading interest as well as knowledge of nutrition and health among elementary school students. This study used a quantitative method with an experimental study design over four months, with two interventions per week, deepened by a qualitative method in the form of interviews. The participants were 130 students, consisting of 3rd and 4th graders in a selected elementary school in Depok City. The interventions used KUCERIA (child storybooks), illustrated storybooks consisting of 12 series about nutrition and health given during school literacy hours. Five questions were given using a crossword method to find out the students' understanding of the story content in each series. To maximize understanding and absorption of the information, two students were asked to retell the story in front of the class and one student to fill in the crossword on the board for each series. In addition, interviews were conducted asking about students' interest in reading books. The intervention involved not only students but also teachers and parents, in order to optimize students' reading habits. Analysis showed that more than 80% of students could answer 3 of the 5 questions correctly in each series, which showed they had an interest in what they read. Data on nutrition and health knowledge were analyzed using the Wilcoxon and chi-square tests. However, only 46% of students completed all 12 series; the rest were lost to follow-up due to incompatibility between the school schedule and the program. The results showed a significant increase in knowledge (p = 0.000), from a score of 66.53 before the intervention to 81.47 after the intervention. Retention of knowledge was measured one month after the last intervention, and the analysis showed no significant decrease in knowledge (p = 0.000), from a score of 79.17 to 75.48. There was also no relationship between sex or class and knowledge. Hence, the KUCERIA intervention proved successful in increasing elementary school students' reading interest and nutritional knowledge. These interventions may be replicated in other schools or learning communities.

Keywords: literacy, reading interest, nutrition knowledge, school children

Procedia PDF Downloads 148
1614 Comparative Study of R.C.C. Steel and Concrete Building

Authors: Mahesh Suresh Kumawat

Abstract:

Steel-concrete composite construction means that the concrete slab is connected to the steel beam with the help of shear connectors so that they act as a single unit. In the present work, steel-concrete composite and RCC options are considered for a comparative study of a G+9-story commercial building situated in earthquake zone III; for earthquake loading, the provisions of IS: 1893 (Part 1)-2002 are considered. Three-dimensional modeling and analysis of the structure are carried out with the help of SAP 2000 software. The Equivalent Static Method of analysis and the Response Spectrum analysis method are used for the analysis of both the composite and R.C.C. structures. The results are compared, and it was found that the composite structure is more economical.

Keywords: composite beam, column, RCC column, RCC beam, shear connector, SAP 2000 software

Procedia PDF Downloads 452
1613 Four Museums for One (Hi) Story

Authors: Sheyla Moroni

Abstract:

A number of scholars around the world have analyzed the great architectural and urban planning revolution proposed by Skopje 2014, but so far there are no readings of the parallels between the museums in the Balkan area (including Greece) that share the same name as the museum at the center of that political and cultural revolution. In the former FYROM (now renamed North Macedonia), a museum called the "Macedonian Struggle" was created during the reconstruction of the city of Skopje as the new "national" capital. This new museum was built under the "Skopje 2014" plan, which cost about 560 million euros (1/3 of the country's GDP). It was a "flagship" of the government of Nikola Gruevski, leader of the nationalist VMRO-DPMNE party. Until 2016, this museum was close to the motivations of the Macedonian nationalist movement (and later party) active, including through terrorist actions, during the 19th and 20th centuries. The museum served to narrate a new "nation-building" after "state-building" had already taken place. But there are three other museums that tell the story of the "Macedonian struggle" while understanding "Macedonia" as a territory other than present-day North Macedonia. The first one is located in Thessaloniki and primarily commemorates the "Greek battle" against the Ottoman Empire. While the first uses a new dark building and many reconstructed rooms and shows the bloody history of the quest for "freedom" for the Macedonian language and people (distinct from Greeks, Albanians, and Bulgarians), the second is located in an old building in Thessaloniki and, in its six ground-floor rooms, graphically illustrates the modern and contemporary history of Greek Macedonia. There are also third and fourth museums: in Kastoria (toward the Albanian border) and in Chromio (near the Greek-North Macedonian border). These two museums are smaller, but they mark two important borders for the (Greek) regions bordering Albania to the east, dividing them not only from the Ottoman past but also from two communities felt to be "foreign" (Albanians and former Yugoslav Macedonians). Each museum reconstructs a different "national edifice" and emphasizes the themes of language and religion. The objective of the research is to understand, through four museums bearing the same name, the main "mental boundaries" (religious, linguistic, cultural) of the different states (constructed between the late 19th century and 1991). Both classical historiographic methodology (very different between the Balkan and "Western" areas) and on-site observation and interaction with the different sites are used in this research. An attempt is made to highlight four different political focuses with respect to nation-building and the public history (and/or propaganda) approaches applied in the construction of these buildings and memorials, as well as the frequent tendency to "define" oneself by differences from "others" (even if close).

Keywords: nationalisms, museum, nation building, public history

Procedia PDF Downloads 86
1612 When Talk Is the Cure for the Morning After: Talking Therapy in Conor Mcpherson’s Dublin Carol and Shining City

Authors: Maha Hamoud Alatawi

Abstract:

Drawing on the work of John McLeod and Ariel Watson, this paper explains the relationship between narrative and psychotherapy in two plays by the Irish playwright Conor McPherson. Dublin Carol presents John's chequered past through his reminiscences of alcohol addiction, and Shining City tells the story of John, who is haunted by the ghost of his wife, recently killed in a car accident, and who seeks the help of Ian, a therapist. First, the significance of storytelling as an integral part of Irish culture is highlighted; such a tradition features prominently in contemporary Irish drama. The paper concludes that it is the power of narrative and its therapeutic impact, and not the act of psychotherapy and treatment itself, that brings signs of change to the characters' lives.

Keywords: Conor McPherson, drama, psychotherapy, storytelling

Procedia PDF Downloads 315
1611 Epilepsy Seizure Prediction by Effective Connectivity Estimation Using Granger Causality and Directed Transfer Function Analysis of Multi-Channel Electroencephalogram

Authors: Mona Hejazi, Ali Motie Nasrabadi

Abstract:

Epilepsy is a persistent neurological disorder that affects more than 50 million people worldwide. Hence, there is a need for an efficient prediction model that supports a correct diagnosis of epileptic seizures and an accurate prediction of their type. In this study, we consider how the effective connectivity (EC) patterns obtained from intracranial electroencephalographic (EEG) recordings reveal information about the dynamics of the epileptic brain and can be used to predict imminent seizures, enabling patients (and caregivers) to take appropriate precautions. We use this definition because we believe that effective connectivity begins to change near seizure onset, so seizures can be predicted from this feature. Results are reported on the standard Freiburg EEG dataset, which contains data from 21 patients suffering from medically intractable focal epilepsy. Six EEG channels from each patient are considered, and effective connectivity is estimated using the Directed Transfer Function (DTF) and Granger Causality (GC) methods. We concentrate on the standard deviation of effective connectivity over time, and feature changes in five brain frequency sub-bands (alpha, beta, theta, delta, and gamma) are compared. The performance obtained by the proposed scheme in predicting seizures is: an average prediction time of 50 minutes before seizure onset, a maximum sensitivity of approximately 80%, and a false positive rate of 0.33 FP/h. The DTF method is the more suitable of the two for predicting epileptic seizures, and in general the best results are observed in the gamma and beta sub-bands. The research in this paper is significantly helpful for clinical applications, especially for the exploitation of online portable devices.
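
A minimal sketch of estimating the directed transfer function from a multivariate autoregressive fit, assuming statsmodels and NumPy; random data stand in for the six-channel intracranial EEG, and the model order and sampling rate are illustrative:

```python
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(5)
fs, n_channels, n_samples, order = 256, 6, 5 * 256, 5
eeg = rng.standard_normal((n_samples, n_channels))      # (time, channels)

model = VAR(eeg).fit(order)                              # multivariate AR fit
A = model.coefs                                          # shape (order, ch, ch)

freqs = np.linspace(1, 45, 90)                           # frequencies of interest in Hz
dtf = np.zeros((freqs.size, n_channels, n_channels))
for fi, f in enumerate(freqs):
    Af = np.eye(n_channels, dtype=complex)
    for k in range(order):
        Af -= A[k] * np.exp(-2j * np.pi * f * (k + 1) / fs)
    H = np.linalg.inv(Af)                                # spectral transfer matrix
    # Row-normalized magnitude of H gives the DTF from channel j to channel i.
    dtf[fi] = np.abs(H) / np.sqrt((np.abs(H) ** 2).sum(axis=1, keepdims=True))

# Gamma-band DTF averaged over 30-45 Hz, one value per (target, source) channel pair.
gamma = dtf[(freqs >= 30) & (freqs <= 45)].mean(axis=0)
print(gamma.round(2))
```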

Keywords: effective connectivity, Granger causality, directed transfer function, epilepsy seizure prediction, EEG

Procedia PDF Downloads 469
1610 A New Approach to Retrofit Steel Moment Resisting Frame Structures after Mainshock

Authors: Amir H. Farivarrad, Kiarash M. Dolatshahi

Abstract:

During earthquake events, aftershocks can significantly increase the probability of collapse of buildings, especially those damaged during the mainshock. In this paper, a practical approach is proposed for the seismic rehabilitation of mainshock-damaged buildings that can be implemented within a few days after the mainshock. To show the efficacy of the proposed method, a case-study nine-story steel moment frame building, designed to pre-Northridge codes, is chosen. The collapse fragility curve under aftershocks is presented for both the retrofitted and non-retrofitted structures. Comparison of the collapse fragility curves shows that the proposed method is indeed applicable for reducing the seismic collapse risk.

Keywords: aftershock, collapse fragility curve, seismic rehabilitation, seismic retrofitting

Procedia PDF Downloads 433
1609 Predicting Football Player Performance: Integrating Data Visualization and Machine Learning

Authors: Saahith M. S., Sivakami R.

Abstract:

In the realm of football analytics, particularly the prediction of football player performance, the ability to forecast player success accurately is of paramount importance for teams, managers, and fans. This study introduces a detailed examination of predicting football player performance through the integration of data visualization methods and machine learning algorithms. The research entails compiling an extensive dataset of player attributes and conducting data preprocessing, feature selection, model selection, and model training to construct predictive models. The analysis involves examining feature significance using methodologies such as Select Best and Recursive Feature Elimination (RFE) to pinpoint the attributes most pertinent to predicting player performance. Various machine learning algorithms, including Random Forest, Decision Tree, Linear Regression, Support Vector Regression (SVR), and Artificial Neural Networks (ANN), are explored to develop predictive models. Each model's performance is evaluated using metrics such as Mean Squared Error (MSE) and R-squared to gauge its efficacy in predicting player performance. Furthermore, the investigation encompasses a top-player analysis to recognize the top-performing players based on the predicted overall performance scores. The nationality analysis scrutinizes the player distribution by nationality and investigates potential correlations between nationality and player performance. The positional analysis examines the player distribution across positions and assesses the average performance of players in each position. The age analysis evaluates the influence of age on player performance and identifies any discernible trends or patterns associated with player age groups. The primary objective is to predict a football player's overall performance accurately based on their individual attributes, leveraging data-driven insights to enrich the comprehension of player success on the field. By amalgamating data visualization and machine learning methodologies, the aim is to furnish valuable tools for teams, managers, and fans to analyze and forecast player performance effectively. This research contributes to the progression of sports analytics by showcasing the potential of machine learning in predicting football player performance and offering actionable insights for diverse stakeholders in the football industry.
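
A minimal sketch of the feature-selection and regression pipeline described above, assuming scikit-learn; the random attribute matrix and toy target stand in for the actual player dataset, and the printed metrics correspond to the MSE and R-squared evaluation mentioned in the abstract:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import RFE
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(9)
n_players, n_attrs = 1000, 30
X = rng.normal(size=(n_players, n_attrs))                            # player attributes
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=n_players)     # toy "overall" score

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Recursive Feature Elimination keeps the ten attributes most useful to a linear model.
selector = RFE(LinearRegression(), n_features_to_select=10).fit(X_tr, y_tr)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(selector.transform(X_tr), y_tr)
pred = model.predict(selector.transform(X_te))

print("MSE:", mean_squared_error(y_te, pred), "R2:", r2_score(y_te, pred))
```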

Keywords: football analytics, player performance prediction, data visualization, machine learning algorithms, random forest, decision tree, linear regression, support vector regression, artificial neural networks, model evaluation, top player analysis, nationality analysis, positional analysis

Procedia PDF Downloads 39