Search results for: scene classification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2366


1346 Cable Transport for a Smart City: Between Challenges and Opportunities, Case of the City of Algiers, Algeria

Authors: Ihaddadene Thanina, Haraoubia Imane, Baouni Tahar

Abstract:

Urban mobility is one of the foremost challenges facing cities; it is increasingly problematic because it is perceived as the cause of many dysfunctions, and the task is not only to facilitate accessibility but also to secure wider benefits. For this reason, several cities around the world have turned to smart mobility alternatives and sustainable transport. Today, the sustainable city has many cards at its disposal, and a new mode is entering the urban scene: aerial cable transport. It has established itself as an effective mode of public transport and a real solution for the future. This electric mobility brings a new dimension not only to collective daily travel but also to urban space. It has an excellent capacity to redevelop public space; it is a catalyst that allows one to appreciate the view from the sky and to discover large-scale projects that bring important attractiveness to the city. Among the countries that use these transport systems, Algeria is no exception: it has the greatest number of urban cable transport installations in the world, in cities such as Tlemcen, Constantine, Blida, Oran, Tizi-Ouzou, Annaba and Skikda. The following study explores the role of cable transport in the transformation of the city of Algiers into a smart city. The methodology is based on the development of a set of indicators using a questionnaire survey. The main objective of this work is to shed light on cable transport as a key issue in designing the sustainable city of tomorrow, to evaluate its role in the city of Algiers, and to assess its ability to integrate into the urban transport network.

Keywords: Algiers, cable transport, indicators, smart city

Procedia PDF Downloads 89
1345 Study and Analysis of the Factors Affecting Road Safety Using Decision Tree Algorithms

Authors: Naina Mahajan, Bikram Pal Kaur

Abstract:

The purpose of traffic accident analysis is to find the possible causes of an accident. Road accidents cannot be totally prevented, but suitable traffic engineering and management can reduce the accident rate to a certain extent. This paper discusses the classification techniques C4.5 and ID3 using the WEKA data mining tool, applied to the NH (National Highway) dataset. The C4.5 and ID3 techniques give the best results, with high accuracy, low error rates and little computation time.
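
The core of the ID3 technique discussed above is choosing the split attribute with the highest information gain. The following is a minimal stand-alone sketch of that criterion, not the authors' WEKA workflow; the toy accident attributes are invented for illustration:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Reduction in entropy from splitting on attribute index `attr` (the ID3 criterion)."""
    n = len(labels)
    splits = {}
    for row, label in zip(rows, labels):
        splits.setdefault(row[attr], []).append(label)
    remainder = sum(len(part) / n * entropy(part) for part in splits.values())
    return entropy(labels) - remainder

# Hypothetical accident records: (road_surface, light_condition) -> severity
rows = [("wet", "dark"), ("wet", "day"), ("dry", "dark"), ("dry", "day")]
labels = ["severe", "severe", "slight", "slight"]
print(information_gain(rows, labels, 0))  # road_surface separates the classes perfectly -> 1.0
print(information_gain(rows, labels, 1))  # light_condition gives no information -> 0.0
```

C4.5 refines the same idea by using the gain ratio and handling continuous attributes, but the entropy computation above is shared by both.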

Keywords: C4.5, ID3, NH(National highway), WEKA data mining tool

Procedia PDF Downloads 316
1344 Fault-Tolerant Control Study and Classification: Case Study of a Hydraulic-Press Model Simulated in Real-Time

Authors: Jorge Rodriguez-Guerra, Carlos Calleja, Aron Pujana, Iker Elorza, Ana Maria Macarulla

Abstract:

Society demands more reliable manufacturing processes capable of producing high-quality products in shorter production cycles. New control algorithms have been studied to satisfy this paradigm, in which Fault-Tolerant Control (FTC) plays a significant role: it is suited to detecting, isolating and adapting a system when a harmful or faulty situation appears. In this paper, a general overview of FTC characteristics is given, highlighting the properties a system must ensure to be considered fault-tolerant. In addition, the main FTC techniques are identified and classified by their characteristics into two main groups: Active Fault-Tolerant Controllers (AFTCs) and Passive Fault-Tolerant Controllers (PFTCs). AFTC encompasses the techniques capable of re-configuring the process control algorithm after the fault has been detected, while PFTC comprises algorithms robust enough to bypass the fault without further modification. The mentioned re-configuration requires two stages: one focused on detection, isolation and identification of the fault source, and the other in charge of re-designing the control algorithm through two approaches, fault accommodation and control re-design. From the algorithms studied, one has been selected and applied to a case study based on an industrial hydraulic press. The developed model has been embedded in a real-time validation platform, which allows testing the FTC algorithms and analysing how the system responds when a fault arises under conditions similar to those a machine faces on the factory floor. One AFTC approach has been chosen as the methodology the system follows in the fault recovery process. In a first instance, the fault is detected, isolated and identified by means of a neural network. In a second instance, the control algorithm is re-configured to overcome the fault and continue working without human interaction.
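
The detect-then-re-configure loop of an AFTC can be caricatured in a few lines. This is a deliberately simplified residual-threshold sketch, not the paper's neural-network-based detection stage, and every gain and threshold below is an invented placeholder:

```python
def detect_fault(measured, expected, threshold=0.5):
    """Flag a fault when the residual (measurement minus model prediction)
    exceeds a threshold; a stand-in for a trained fault-detection stage."""
    return abs(measured - expected) > threshold

def control_action(error, gain_nominal=2.0, gain_safe=0.8, fault=False):
    """Proportional control action; on a detected fault the controller
    re-configures to a conservative gain (the 'control re-design' step)."""
    return (gain_safe if fault else gain_nominal) * error

# One iteration of the loop: compare sensor reading against the model,
# then compute the action with the gain matching the detected state.
fault = detect_fault(measured=1.7, expected=1.0)
action = control_action(error=0.5, fault=fault)
print(fault, action)
```

A real AFTC would also isolate and identify the fault source before re-design; this sketch collapses those stages into a single boolean for clarity.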

Keywords: fault-tolerant control, electro-hydraulic actuator, fault detection and isolation, control re-design, real-time

Procedia PDF Downloads 162
1343 A Preliminary Study of the Effects of Abiotic Environmental Variables on Early Diptera Carrion Colonizers in Algiers, Algeria

Authors: M. Taleb, G. Tail, F. Z. Kara, B. Djedouani, T. Moussa

Abstract:

Necrophagous insects usually colonize cadavers within a short time after death. However, they are influenced by weather conditions, and their distribution and activity vary on different time scales, which can affect post-mortem interval (PMI) estimation. As no data have been published in Algeria on the necrophagous insects visiting corpses, two field surveys were conducted in July 2012 and March 2013 at the National Institute for Criminalistics and Criminology (INCC) using rabbit carcasses (Oryctolagus cuniculus L.). The trials were designed to identify the necrophagous Diptera fauna of Algiers, Algeria, and to examine its variation with environmental variables. Four hundred and eighteen adult Diptera belonging to five families were captured during this study. The species that had been identified on human corpses in different regions of Algeria were also observed on the rabbit carcasses. Although seasonal variations of the species were observed, their abundance did not vary significantly between the two seasons. In addition to seasonal effects, ambient temperature, wind speed and precipitation affected the number of trapped flies. These conclusions highlight the need to consider environmental factors at a scene in order to estimate the post-mortem interval accurately. It is hoped that these findings provide basic information on the necrophagous Diptera fauna of Algeria.

Keywords: forensic entomology, necrophagous diptera, post-mortem interval, abiotic factors, Algeria

Procedia PDF Downloads 371
1342 Sustainability Enhancement of Pedestrian Space Quality in Old Communities from the Perspective of Inclusiveness: Taking Cao Yang New Village, Shanghai as an Example

Authors: Feng Zisu

Abstract:

The community is the basic unit of the city, and community pedestrian space is an important part of urban public space; its quality is closely related to residents' happiness and sense of belonging. Domestic and international research on community pedestrian space has gradually shifted toward inclusive design for the whole population, paying more attention to the equitable accessibility of urban space and to the multiple, composite enhancement of spatial connotation. To realize the inclusive and sustainable development of pedestrian space in old communities, this article takes Cao Yang New Village in Shanghai as a practice case and, based on the connotation of inclusiveness, selects four dimensions (space, traffic, function and emotion) as the layers of analysis for the pedestrian space of old communities. The article identifies the objective social needs, dynamic activity characteristics and subjective feelings of multiple subjects, and reconstructs the structural hierarchy of “spatial perception - behavioral characteristics - subjective feelings” of walking. It then proposes a governance strategy of “reconfiguring the pedestrian network, optimizing street quality, integrating ecological space and reshaping the community scene”, covering both the quality of the physical environment and the quality of behavioral perception, aiming to provide new ideas for promoting the inclusive and sustainable development of pedestrian space in old communities.

Keywords: inclusivity, old community, pedestrian space, spatial quality, sustainable renovation

Procedia PDF Downloads 12
1341 Mondoc: Informal Lightweight Ontology for Faceted Semantic Classification of Hypernymy

Authors: M. Regina Carreira-Lopez

Abstract:

Lightweight ontologies seek to make concrete the union relationships between a parent node and a secondary node, also called a "child node". This logic relation (L) can be formally defined as a triple ontological relation (LO) equivalent to ⟨LN, LE, LC⟩, where LN represents a finite set of nodes (N); LE is a set of entities (E), each of which represents a relationship between nodes to form a rooted tree of ⟨LN, LE⟩; and LC is a finite set of concepts (C), encoded in a formal language (FL). Mondoc enables more refined searches on semantic and classified facets for retrieving specialized knowledge about Atlantic migrations, from the Declaration of Independence of the United States of America (1776) to the end of the Spanish Civil War (1939). The model aims to increase documentary relevance by applying an inverse frequency of co-occurrent hypernymy phenomena to a concrete dataset of textual corpora, with the RMySQL package. Mondoc profiles archival utilities implemented in SQL code and allows data export to XML schemas, enabling semantic and faceted analysis of speech by analyzing keywords in context (KWIC). The methodology applies random and unrestricted sampling techniques with RMySQL to verify the resonance phenomena of inverse documentary relevance between the number of co-occurrences of the same term (t) in more than two documents of a set of texts (D). Secondly, the research also evidences that the co-associations between (t) and its corresponding synonyms and antonyms (synsets) are likewise inverse. The results from grouping facets or polysemic words with synsets in more than two textual corpora within their syntagmatic context (nouns, verbs, adjectives, etc.) show how to proceed with the semantic indexing of hypernymy phenomena for subject-heading lists and authority lists for documentary and archival purposes.
Mondoc contributes to the development of web directories and appears to achieve a more selective search of e-documents (classification ontology). It can also foster the production of online catalogs for semantic authorities, or concepts, through XML schemas, because its applications could be used to implement data models after a prior adaptation of the base ontology to structured meta-languages such as OWL and RDF (descriptive ontology). Mondoc serves the classification of concepts and applies a semantic indexing approach to facets. It enables information retrieval, as well as quantitative and qualitative data interpretation. The model reproduces a tuple ⟨LN, LE, LT, LCF L, BKF⟩ where LN is a set of entities that connect with other nodes to form a rooted tree in ⟨LN, LE⟩; LT specifies a set of terms; and LCF acts as a finite set of concepts, encoded in a formal language, L. Mondoc only resolves partial problems of linguistic ambiguity (in the case of synonymy and antonymy); neither the pragmatic dimension of natural language nor the cognitive perspective is addressed. To achieve this goal, forthcoming development should target oriented meta-languages with structured documents in XML.
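
The inverse-frequency weighting applied to co-occurrent terms follows the classic inverse document frequency idea: the fewer documents of the set D a term t appears in, the higher its weight. A minimal sketch, in Python rather than the R/RMySQL environment the paper uses, with invented toy documents:

```python
from math import log

def inverse_document_frequency(term, documents):
    """IDF: log(|D| / df), where df is the number of documents containing the term.
    Rare terms score high; a term absent from the set scores 0 by convention here."""
    df = sum(1 for doc in documents if term in doc)
    return log(len(documents) / df) if df else 0.0

# Invented toy corpus: each document is its set of index terms.
docs = [
    {"migration", "atlantic", "independence"},
    {"migration", "war", "spain"},
    {"ontology", "facet"},
]
print(inverse_document_frequency("migration", docs))  # appears in 2 of 3 documents
print(inverse_document_frequency("facet", docs))      # appears in 1 of 3 documents
```

In the model described above, this kind of weight would then be combined with co-occurrence counts of hypernyms and synsets across the corpora to rank facets.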

Keywords: hypernymy, information retrieval, lightweight ontology, resonance

Procedia PDF Downloads 111
1340 Arboretum: Community Mixed Reality Nature Environment

Authors: Radek Richtr, Petr Paus

Abstract:

The connection to the primal environment of living, growing nature is disappearing for most residents of urban core areas. Most residents perceive scattered green mass as technical objects rather than as sentient living organisms. Arboretum is an application in the 'serious games' genre: a research experiment masked as a gaming environment. In the virtual and augmented reality environments used, every city district is represented by central objects, Pillars, created as a result of the residents' consensus. Every player can furthermore plant and grow virtual organic seeds wherever he wants. Seeds sprout, and their form is determined both by the player's choice and by the nearest Pillar. Every house, private room, and even workspace gets its new living, virtual, avatar-connected 'residents' growing from player-planted seeds. Every room or workspace is transformed into a (calming) nature scene, reflecting in some way both the players and the community spirit, and together they create a vicinity environment. The conceptual design phase of the project is crucial and allows the identification of fundamental problems through abstraction. A project that centers on wide community usage needs a clear and accessible interface. Simultaneously, conceptual design allows early sharing of project ideas and creates public concern. The paper discusses the current conceptual model of the Arboretum project (which is part of a broader project) and its validation.

Keywords: augmented reality, conceptual design, mixed reality, social engineering

Procedia PDF Downloads 214
1339 The Effects of Lithofacies on Oil Enrichment in Lucaogou Formation Fine-Grained Sedimentary Rocks in Santanghu Basin, China

Authors: Guoheng Liu, Zhilong Huang

Abstract:

For more than ten years, oil and gas have been produced from marine shales such as the Barnett shale. In recent years, major breakthroughs have also been made in lacustrine shale gas exploration, such as in the Yanchang Formation of the Ordos Basin in China. Lucaogou Formation shale, which is also lacustrine, has likewise yielded high production in recent years, with wells such as M1, M6 and ML2 yielding daily oil productions of 5.6 tons, 37.4 tons and 13.56 tons, respectively. Lithologic identification and classification of reservoirs are the basis and key to oil and gas exploration. Lithology and lithofacies obviously control the distribution of oil and gas in lithological reservoirs, so it is of great significance to describe the lithology and lithofacies of reservoirs in detail. Lithofacies is an intrinsic property of rock formed under certain conditions of sedimentation. Fine-grained sedimentary rocks such as shale formed under different sedimentary conditions display great particularity and distinctiveness. Hence, to our best knowledge, no constant and unified criteria or methods exist for defining and classifying the lithofacies of fine-grained sedimentary rocks. Consequently, multiple parameters and disciplines are necessary. A series of qualitative descriptions and quantitative analyses were used to determine the lithofacies characteristics and their effect on oil accumulation in the Lucaogou Formation fine-grained sedimentary rocks of the Santanghu basin. The qualitative descriptions include core description, petrographic thin-section observation, fluorescent thin-section observation, cathodoluminescence observation and scanning electron microscope observation. The quantitative analyses include X-ray diffraction, total organic content analysis, Rock-Eval II pyrolysis, Soxhlet extraction, porosity and permeability analysis and oil saturation analysis.
Three types of lithofacies are well developed in the study area: organic-rich massive shale lithofacies, organic-rich laminated and cloddy hybrid sedimentary lithofacies, and organic-lean massive carbonate lithofacies. The organic-rich massive shale lithofacies mainly includes massive shale and tuffaceous shale, of which quartz and clay minerals are the major components. The organic-rich laminated and cloddy hybrid sedimentary lithofacies contains laminae and cloddy structures; rocks from this lithofacies chiefly consist of dolomite and quartz. The organic-lean massive carbonate lithofacies mainly contains massively bedded fine-grained carbonate rocks, of which fine-grained dolomite accounts for the main part. The organic-rich massive shale lithofacies contains the highest content of free hydrocarbons and solid organic matter; moreover, more pores are developed in it. The organic-lean massive carbonate lithofacies contains the lowest content of solid organic matter and develops the fewest pores. The organic-rich laminated and cloddy hybrid sedimentary lithofacies develops the largest number of cracks and fractures. To sum up, the organic-rich massive shale lithofacies is the most favorable type of lithofacies, while the organic-lean massive carbonate lithofacies cannot support large-scale oil accumulation.

Keywords: lithofacies classification, tuffaceous shale, oil enrichment, Lucaogou formation

Procedia PDF Downloads 195
1338 Constructing a Semi-Supervised Model for Network Intrusion Detection

Authors: Tigabu Dagne Akal

Abstract:

While advances in computer and communications technology have made the network ubiquitous, they have also rendered networked systems vulnerable to malicious attacks devised from a distance. These attacks or intrusions start with attackers infiltrating a network through a vulnerable host and then launching further attacks on the local network or intranet. Nowadays, system administrators and network professionals can attempt to prevent such attacks by developing intrusion detection tools and systems using data mining technology. In this study, the experiments were conducted following the Knowledge Discovery in Databases process model, which starts with the selection of datasets. The dataset used in this study was taken from the Massachusetts Institute of Technology Lincoln Laboratory. The data were then pre-processed; the major pre-processing activities included filling in missing values, removing outliers, resolving inconsistencies, integrating labelled and unlabelled datasets, dimensionality reduction, size reduction and data transformation tasks such as discretization. A total of 21,533 intrusion records were used for training the models, and a separate set of 3,397 records was used as a testing set to validate the performance of the selected model. For building a predictive model for intrusion detection, the J48 decision tree and Naïve Bayes algorithms were tested as classification approaches, both with and without feature selection. The model created using 10-fold cross-validation with the J48 decision tree algorithm and default parameter values showed the best classification accuracy, with a prediction accuracy of 96.11% on the training dataset and 93.2% on the test dataset when classifying new instances into the normal, DOS, U2R, R2L and probe classes.
The findings of this study show that data mining methods generate interesting rules that are crucial for intrusion detection and prevention in the networking industry. Future research directions are suggested toward an applicable system in the area of study.
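
As a rough illustration of the Naïve Bayes side of the comparison described above (not the authors' WEKA setup; the toy connection records below are invented), a categorical Naïve Bayes classifier with Laplace smoothing can be sketched as:

```python
from collections import Counter, defaultdict

def train_nb(rows, labels):
    """Fit a categorical naive Bayes model: class priors plus per-class,
    per-feature value counts."""
    priors = Counter(labels)
    counts = defaultdict(Counter)  # (class, feature_index) -> Counter of values
    for row, y in zip(rows, labels):
        for i, v in enumerate(row):
            counts[(y, i)][v] += 1
    return priors, counts

def predict_nb(priors, counts, row, alpha=1.0):
    """Pick the class maximising prior * product of Laplace-smoothed likelihoods."""
    best, best_score = None, -1.0
    total = sum(priors.values())
    for y, ny in priors.items():
        score = ny / total
        for i, v in enumerate(row):
            c = counts[(y, i)]
            score *= (c[v] + alpha) / (ny + alpha * len(c))
        if score > best_score:
            best, best_score = y, score
    return best

# Invented toy records: (protocol, service) -> class
rows = [("tcp", "http"), ("tcp", "http"), ("icmp", "ecr_i"), ("icmp", "ecr_i")]
labels = ["normal", "normal", "dos", "dos"]
priors, counts = train_nb(rows, labels)
print(predict_nb(priors, counts, ("tcp", "http")))   # -> normal
print(predict_nb(priors, counts, ("icmp", "ecr_i"))) # -> dos
```

The study's J48 model is a decision tree rather than a probabilistic model, but both consume the same kind of discretized categorical features produced by the pre-processing steps described above.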

Keywords: intrusion detection, data mining, computer science

Procedia PDF Downloads 283
1337 Convergence of Media in New Era

Authors: Mohamad Reza Asariha

Abstract:

The development and expansion of modern communication technologies at extraordinary speed has caused crucial changes in all the economic, social, cultural and political areas of the world. The development of satellite and cable technologies, in addition to the growing production and distribution needs of worldwide programming, made the economic case ever more attractive. The shift from an industrial economy to an information and service economy in developed countries brought about unprecedented developments in the norms of world trade and, as a result, caused media organizations to expand internationally; the growth of economic investment in many Asian countries, along with worldwide demand for media goods, created new markets. The media, both on the domestic scene and in the international arena, are of great significance and have an effective presence in the equation of gaining, maintaining and increasing economic power and wealth in the world. Moreover, technological advances and technological integration are critical factors in structural media change. This structural change took place under the influence of digitalization: the process that broke the boundaries between electronic media services. Until now, the regulation of mass media was entirely dependent on the particular modes of data transmission in general use. Digitization made it possible for any content to be easily transmitted through different electronic transmission channels, and this media convergence has had clear effects on media policies and on the way mass media are controlled.

Keywords: media, digital era, digital ages, media convergence

Procedia PDF Downloads 56
1336 Recurrent Neural Networks for Classifying Outliers in Electronic Health Record Clinical Text

Authors: Duncan Wallace, M-Tahar Kechadi

Abstract:

In recent years, Machine Learning (ML) approaches have been successfully applied to the analysis of patient symptom data in the context of disease diagnosis, at least where such data are well codified. However, much of the data present in Electronic Health Records (EHR) is unlikely to prove suitable for classic ML approaches. Furthermore, as such data are widely spread across both hospitals and individuals, a decentralized, computationally scalable methodology is a priority. The focus of this paper is to develop a method to predict outliers in an out-of-hours healthcare provision center (OOHC). In particular, our research is based upon the early identification of patients who have underlying conditions that will cause them to repeatedly require medical attention. An OOHC acts as an ad-hoc provider of triage and treatment, where interactions occur without recourse to a full medical history of the patient in question. Medical histories relating to patients contacting an OOHC may reside in several distinct EHR systems in multiple hospitals or surgeries, which are unavailable to the OOHC in question. As such, although a local solution is optimal for this problem, it follows that the data under investigation are incomplete, heterogeneous, and comprised mostly of noisy textual notes compiled during routine OOHC activities. Through the use of Deep Learning methodologies, the aim of this paper is to provide the means to identify patient cases, upon initial contact, which are likely to relate to such outliers. To this end, we compare the performance of Long Short-Term Memory, Gated Recurrent Units, and combinations of both with Convolutional Neural Networks. A further aim of this paper is to elucidate the discovery of such outliers by examining the exact terms which provide a strong indication of positive and negative case entries. While free text is the principal data extracted from EHRs for classification, EHRs also contain normalized features.
Although the specific demographic features treated within our corpus are relatively limited in scope, we examine whether it is beneficial to include such features among the inputs to our neural network, or whether these features are more successfully exploited in conjunction with a different form of classifier. Here we compare the performance of randomly generated regression trees and support vector machines, and determine the extent to which our classification program can be improved by using either of these machine learning approaches in conjunction with the output of our Recurrent Neural Network application. The output of our neural network is also used to help determine the most significant lexemes present within the corpus for identifying high-risk patients. By combining the confidence of our classification program in relation to lexemes within true positive and true negative cases with an inverse document frequency of the lexemes related to these cases, we can determine which features act as the primary indicators of frequent-attender and non-frequent-attender cases, providing a human-interpretable appreciation of how our program classifies cases.
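
As a sketch of the Gated Recurrent Unit mechanics underlying the comparison above, here is a single-unit (scalar) GRU step in plain Python. Real models use weight matrices learned from data; the weights and inputs below are invented placeholders:

```python
from math import tanh, exp

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

def gru_step(h, x, W):
    """One GRU step with scalar input and hidden state for readability.
    W holds the six weights (input-to-gate and recurrent) for the update
    gate z, reset gate r and candidate state; biases are omitted."""
    z = sigmoid(W["zx"] * x + W["zh"] * h)            # update gate
    r = sigmoid(W["rx"] * x + W["rh"] * h)            # reset gate
    h_tilde = tanh(W["cx"] * x + W["ch"] * (r * h))   # candidate state
    return (1 - z) * h + z * h_tilde                  # interpolate old and new

# Invented weights and a toy token sequence standing in for embedded clinical text.
W = {"zx": 1.0, "zh": 0.5, "rx": 1.0, "rh": 0.5, "cx": 1.0, "ch": 1.0}
h = 0.0
for x in [0.2, -0.1, 0.4]:
    h = gru_step(h, x, W)
print(h)  # final hidden state summarising the sequence
```

An LSTM differs by carrying a separate cell state with input, forget and output gates; in both cases the final hidden state is what a classification head would consume.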

Keywords: artificial neural networks, data-mining, machine learning, medical informatics

Procedia PDF Downloads 113
1335 Application of Biosensors in Forensic Analysis

Authors: Shirin Jalili, Hadi Shirzad, Samaneh Nabavi, Somayeh Khanjani

Abstract:

Biosensors are ideal biological tools for forensic analysis that can be used for rapid and sensitive initial screening to detect suspicious components, such as biological and chemical agents, at crime scenes. The wide use of different biomolecules, such as proteins, nucleic acids, microorganisms, antibodies and enzymes, makes this possible. These biosensors have great advantages, such as rapidity, little sample manipulation and high sensitivity; because of their stability, specificity and low cost, they have also become very important tools for forensic analysis and crime detection. At crime scenes, different substances, such as semen, saliva, fingerprints and blood samples, act as detecting elements for biosensors. Moreover, successful fluid recovery via biosensors has the propensity to yield a highly valuable source of genetic material, which is important in finding a suspect. Current biological fluid testing techniques have disadvantages: for example, if used simultaneously, they often give false positive results. These limitations can negatively affect the outcome of a case through missed or misinterpreted evidence. Biosensors enable criminal investigators to perform highly sensitive, non-destructive detection of biological fluids through interaction with fluid-endogenous markers and other biological and chemical contamination at the crime scene. For this reason, biosensors for detecting the biological fluids found at crime scenes play an important role in identifying suspects and solving crimes.

Keywords: biosensors, forensic analysis, biological fluid, crime detection

Procedia PDF Downloads 1091
1334 A Kruskal-Based Heuristic for the Application of Spanning Trees

Authors: Anjan Naidu

Abstract:

In this paper, we first discuss the minimum spanning tree and then use Kruskal's algorithm to obtain it. Based on Kruskal's algorithm, we propose a heuristic and apply it to find a minimum-cost solution in an application of the spanning tree concept.
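
Kruskal's algorithm itself is short: sort the edges by weight and greedily add any edge that joins two different components, tracked with a union-find structure. A minimal sketch on an invented four-vertex graph:

```python
def kruskal(n, edges):
    """Minimum spanning tree of a graph with vertices 0..n-1.
    `edges` is a list of (weight, u, v) tuples; returns (mst_edges, total_weight)."""
    parent = list(range(n))

    def find(v):
        # Find the component root, with path halving for efficiency.
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    mst, total = [], 0
    for w, u, v in sorted(edges):          # process edges in weight order
        ru, rv = find(u), find(v)
        if ru != rv:                       # edge joins two components: keep it
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return mst, total

# Invented example graph.
edges = [(4, 0, 1), (1, 1, 2), (3, 0, 2), (2, 2, 3)]
print(kruskal(4, edges))  # -> ([(1, 2, 1), (2, 3, 2), (0, 2, 3)], 6)
```

The greedy choice is safe because the lightest edge crossing any cut always belongs to some minimum spanning tree (the cut property).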

Keywords: minimum spanning tree, algorithm, heuristic, application, subject classification 97K90

Procedia PDF Downloads 426
1333 Towards Learning Query Expansion

Authors: Ahlem Bouziri, Chiraz Latiri, Eric Gaussier

Abstract:

The steady growth in the size of textual document collections is a key progress driver for modern information retrieval techniques, whose effectiveness and efficiency are constantly challenged. Given a user query, the number of retrieved documents can be overwhelmingly large, hampering their efficient exploitation by the user. In addition, retaining only relevant documents in a query answer is of paramount importance for effectively meeting the user's needs. In this situation, the query expansion technique offers an interesting solution for obtaining a complete answer while preserving the quality of retained documents. This mainly relies on an accurate choice of the terms added to the initial query. Interestingly enough, query expansion takes advantage of large text volumes by extracting statistical information about index term co-occurrences and using it to make user queries better fit the real information needs. In this respect, a promising track consists in the application of data mining methods to extract dependencies between terms, namely a generic basis of association rules between terms. The key feature of our approach is a better trade-off between the size of the mining result and the conveyed knowledge. Thus, faced with the huge number of derived association rules, and in order to select the optimal combination of query terms from the generic basis, we propose to model the problem as a classification problem and to solve it using a supervised learning algorithm such as SVM. For this purpose, we first generate a training set using a genetic-algorithm-based approach that explores the association rules space in order to find an optimal set of expansion terms, improving the MAP of the search results. The experiments were performed on the SDA 95 collection, a data collection for information retrieval. The results were better in terms of both MAP and NDCG.
The main observation is that hybridizing text mining techniques and query expansion in an intelligent way allows us to incorporate the good features of each. As this is a preliminary attempt in this direction, there is large scope for enhancing the proposed method.
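
The generic basis of association rules between terms can be illustrated with single-antecedent rules mined from term co-occurrence, where a rule t1 → t2 is kept if its confidence exceeds a threshold. A toy sketch, not the authors' genetic-algorithm pipeline; the corpus and threshold are invented:

```python
from collections import Counter
from itertools import permutations

def mine_rules(docs, min_conf=0.6):
    """Single-antecedent association rules t1 -> t2 with
    confidence = support(t1, t2) / support(t1)."""
    support = Counter()
    pair_support = Counter()
    for doc in docs:
        for t in doc:
            support[t] += 1
        for a, b in permutations(doc, 2):
            pair_support[(a, b)] += 1
    return {(a, b): pair_support[(a, b)] / support[a]
            for (a, b) in pair_support
            if pair_support[(a, b)] / support[a] >= min_conf}

def expand(query, rules):
    """Add every consequent of a high-confidence rule whose antecedent is in the query."""
    return set(query) | {b for (a, b) in rules if a in query}

# Invented toy corpus: each document is its set of index terms.
docs = [{"car", "engine"}, {"car", "engine", "wheel"}, {"car", "road"}]
rules = mine_rules(docs)
print(expand({"engine"}, rules))  # engine -> car holds with confidence 1.0
```

In the approach above, a learned classifier would then filter candidate expansion terms instead of the flat confidence threshold used here.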

Keywords: supervised learning, classification, query expansion, association rules

Procedia PDF Downloads 309
1332 Spatial Object-Oriented Template Matching Algorithm Using Normalized Cross-Correlation Criterion for Tracking Aerial Image Scene

Authors: Jigg Pelayo, Ricardo Villar

Abstract:

Building on the development of aerial laser scanning in the Philippine geospatial industry, research on remote sensing and machine vision technology has become a trend. Object detection via template matching is one such application, characterized as fast and real-time. This paper provides an application of a robust pattern matching algorithm based on the normalized cross-correlation (NCC) criterion function within object-based image analysis (OBIA), utilizing high-resolution aerial imagery and low-density LiDAR data. The height information from laser scanning provides an effective partitioning order, improving the hierarchical class feature pattern and allowing unnecessary calculations to be skipped. Since detection is executed on an object-oriented platform, mathematical morphology and multi-level filter algorithms were established to effectively avoid the influence of noise, small distortions and fluctuating image saturation that affect the rate of feature recognition. Furthermore, the scheme is evaluated to assess its performance in different situations and to inspect the computational complexity of the algorithms. Its effectiveness is demonstrated in areas of Misamis Oriental province, achieving an overall accuracy above 91%. The results also portray the potential and efficiency of the implemented algorithm under different lighting conditions.
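
The NCC criterion at the heart of such a matcher scores a candidate image patch against the template after removing each one's mean brightness, which is why a uniform illumination shift does not change the score. A minimal sketch on flattened 1-D patches with invented pixel values:

```python
from math import sqrt

def ncc(patch, template):
    """Normalized cross-correlation between two equal-sized grayscale patches,
    given as flat lists of pixel values; 1.0 means a perfect match up to a
    brightness offset, -1.0 a perfectly inverted pattern."""
    n = len(patch)
    mp, mt = sum(patch) / n, sum(template) / n
    num = sum((p - mp) * (t - mt) for p, t in zip(patch, template))
    den = sqrt(sum((p - mp) ** 2 for p in patch) *
               sum((t - mt) ** 2 for t in template))
    return num / den if den else 0.0

template = [10, 20, 30, 40]
print(ncc([110, 120, 130, 140], template))  # same pattern, +100 brightness -> 1.0
print(ncc([40, 30, 20, 10], template))      # reversed pattern -> -1.0
```

In full template matching this score is computed at every candidate position (e.g. over a sliding window) and the maximum indicates the detected location.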

Keywords: algorithm, LiDAR, object recognition, OBIA

Procedia PDF Downloads 230
1331 Human Gait Recognition Using Moment with Fuzzy

Authors: Jyoti Bharti, Navneet Manjhi, M. K.Gupta, Bimi Jain

Abstract:

Reliable gait features are required to extract gait sequences from images. This paper suggests a simple method for gait identification based on moments. Moment values are extracted over different numbers of frames of gray-scale and silhouette images from the CASIA database and are treated as feature values. Fuzzy logic and a nearest-neighbour classifier are used for classification, and both achieved high recognition rates.
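
Moment features of this kind derive from the standard raw image moments m_pq: m00 gives the silhouette area, and m10/m00 and m01/m00 give its centroid. A minimal sketch on an invented 3×3 silhouette (not the CASIA data or the paper's exact feature set):

```python
def raw_moment(img, p, q):
    """Raw moment m_pq of an image given as a 2-D list of pixel values;
    y indexes rows and x indexes columns."""
    return sum((x ** p) * (y ** q) * v
               for y, row in enumerate(img)
               for x, v in enumerate(row))

# Invented binary silhouette (a plus sign).
silhouette = [
    [0, 1, 0],
    [1, 1, 1],
    [0, 1, 0],
]
m00 = raw_moment(silhouette, 0, 0)       # area: 5 foreground pixels
cx = raw_moment(silhouette, 1, 0) / m00  # centroid x
cy = raw_moment(silhouette, 0, 1) / m00  # centroid y
print(m00, cx, cy)
```

Central and normalized moments (and Hu's invariants) are built from these same sums after shifting by the centroid, yielding features invariant to translation and scale.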

Keywords: gait, fuzzy logic, nearest neighbour, recognition rate, moments

Procedia PDF Downloads 737
1330 The Romero-System Clarinet: A Milestone in the 19th Century Clarinet Manufacture

Authors: Pedro Rubio

Abstract:

Antonio Romero y Andía was one of the most active and interesting figures in 19th century Spanish music. He was not only an exceptional clarinetist but also a publisher, a brilliant oboist and a music critic, and he revitalized Madrid’s musical scene by promoting orchestras and a national opera. In 1849, Romero was appointed Professor of Clarinet at the Conservatory of Madrid. Shortly after, he introduced to Spain the Boehm-System clarinet, which had recently appeared in France. However, when initial interest in that system waned, he conceived his own system in 1853. The clarinet was manufactured in Paris by Lefêvre, who registered its first patent in 1862. In 1867 a second version was patented, and a year earlier, in 1866, the Romero clarinet had been adopted as an official instrument for teaching the clarinet at the Conservatory of Madrid. The Romero-System mechanism incorporated numerous additional devices and several extra keys; their skillful combination in a single instrument represents not only one of the pinnacles of 19th century musical instrument manufacture but also an authentic synthesis of knowledge and practice in an era in which woodwind instruments were shaped as we know them today. Through the description and analysis of data related to this historical period, this lecture will show a crucial time in the history of all woodwind instruments, a period of technological effervescence in which the Romero-System clarinet emerged. The different stages of the clarinet's conception will be described, as well as its manufacturing and marketing process. Romero played his clarinet system for over twenty-five years. The research has identified the repertoire associated with this instrument, and its conclusions will be presented at the Congress.

Keywords: Antonio Romero, clarinet, keywork, 19th century

Procedia PDF Downloads 106
1329 Adolescent-Parent Relationship as the Most Important Factor in Preventing Mood Disorders in Adolescents: An Application of Artificial Intelligence to Social Studies

Authors: Elżbieta Turska

Abstract:

Introduction: One of the most difficult times in a person’s life is adolescence. The experiences of this period may shape the person's future life to a large extent. This is why many young people experience sadness, dejection, hopelessness and a sense of worthlessness, as well as losing interest in various activities and social relationships, all of which are often classified as mood disorders. As many as 15-40% of adolescents experience depressed moods, and for most of them these resolve and are not carried into adulthood. However, 5-6% of those affected by mood disorders develop the depressive syndrome, and as many as 1-3% develop full-blown clinical depression. Materials: A large questionnaire was given to 2508 students aged 13–16, and one of its parts was the Burns checklist, i.e. the standard test for identifying depressed mood. The questionnaire asked about many aspects of the student's life; it included a total of 53 questions, most of which had subquestions. It is important to note that the data suffered from many problems, the most important of which were missing data and collinearity. Aim: In order to identify the correlates of mood disorders, we built predictive models which were then trained and validated. Our aim was not to predict which students suffer from mood disorders but rather to explore the factors influencing mood disorders. Methods: The problems with the data described above practically excluded using all classical statistical methods. For this reason, we used the following Artificial Intelligence (AI) methods: classification trees with surrogate variables, random forests and xgboost. All analyses were carried out with the mlr package for the R programming language. Results: The predictive model built by the classification trees algorithm outperformed the other algorithms by a large margin.
As a result, we were able to rank the variables (questions and subquestions from the questionnaire) from most to least influential as far as protection against mood disorders is concerned. Thirteen of the twenty most important variables reflect relationships with parents. This seems to be a significant result both from the cognitive point of view and from the practical point of view, i.e. as far as interventions to correct mood disorders are concerned.
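The study's analyses were run with the mlr package in R; as a loose Python analogue (scikit-learn assumed available), a classification tree's impurity-based importances can be ranked in the same spirit, here on synthetic data in which one hypothetical "parent-relationship" item drives the outcome:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 400
# Hypothetical survey responses: x0 stands in for a parent-relationship item
# that actually drives the outcome; x1 and x2 are unrelated noise items.
X = rng.normal(size=(n, 3))
y = (X[:, 0] > 0).astype(int)   # toy "mood disorder" label driven by x0

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
ranking = np.argsort(tree.feature_importances_)[::-1]   # most to least influential
```

The driving item ends up first in the ranking, which is the kind of variable ordering the study reports for the parent-relationship questions.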

Keywords: mood disorders, adolescents, family, artificial intelligence

Procedia PDF Downloads 86
1328 Detecting Covid-19 Fake News Using Deep Learning Technique

Authors: Anjali A. Prasad

Abstract:

Nowadays, social media plays an important role in spreading misinformation, or fake news. This study analyzes fake news related to the COVID-19 pandemic spread on social media. The paper aims to evaluate and compare different approaches used to mitigate this issue, including popular deep learning approaches such as CNN, RNN, LSTM, and the BERT algorithm, for classification. To evaluate the models’ performance, we used accuracy, precision, recall, and F1-score as the evaluation metrics and, finally, compared which of the four algorithms shows the better result.
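The four evaluation metrics can be computed from binary confusion counts as sketched below; the labels are toy data, and in practice y_pred would come from any of the compared CNN/RNN/LSTM/BERT models:

```python
def prf1(y_true, y_pred):
    """Accuracy, precision, recall and F1 for a binary fake/real labelling."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    acc = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return acc, prec, rec, f1

# 1 = fake, 0 = real; toy predictions standing in for a model's output.
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 1, 1, 0]
acc, prec, rec, f1 = prf1(y_true, y_pred)
```

Computing all four metrics per model, rather than accuracy alone, is what makes the cross-algorithm comparison meaningful when the fake/real classes are imbalanced.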

Keywords: BERT, CNN, LSTM, RNN

Procedia PDF Downloads 192
1327 Development of Latent Fingerprints on Non-Porous Surfaces Recovered from Fresh and Sea Water

Authors: A. Somaya Madkour, B. Abeer Sheta, C. Fatma Badr El Dine, D. Yasser Elwakeel, E. Nermine AbdAllah

Abstract:

Criminal offenders have a fundamental goal of leaving no traces at the crime scene. Some may suppose that items recovered underwater have no forensic value, and therefore try to destroy traces by throwing items into water, where they are subjected to destructive environmental effects. This can represent a challenge for forensic experts investigating finger marks. Accordingly, the present study was conducted to determine the optimal method for developing latent fingerprints on non-porous surfaces submerged in aquatic environments at different time intervals. The two factors analyzed in this study were the nature of the aquatic environment and the length of submersion. In addition, the quality of the developed finger marks was assessed for each method used. Latent fingerprints were deposited on metallic, plastic and glass objects and submerged in fresh or sea water for one, two, and ten days. After recovery, the items were subjected to cyanoacrylate fuming, black powder and small particle reagent processing, and the prints were examined. Each print was evaluated according to a fingerprint quality assessment scale. The study demonstrated that the duration of submersion affects the quality of finger marks: the longer the duration, the worse the quality. The best visualization results were achieved using cyanoacrylate in both fresh and sea water. The study also revealed that exposure to sea water had a more destructive influence on the quality of the detected finger marks.

Keywords: fingerprints, fresh water, sea, non-porous

Procedia PDF Downloads 440
1326 Assessing the Utility of Unmanned Aerial Vehicle-Borne Hyperspectral Image and Photogrammetry Derived 3D Data for Wetland Species Distribution Quick Mapping

Authors: Qiaosi Li, Frankie Kwan Kit Wong, Tung Fung

Abstract:

Lightweight unmanned aerial vehicles (UAVs) carrying novel sensors offer a low-cost approach to data acquisition in complex environments. This study established a framework for applying a UAV system to quick mapping in a complex environment and assessed the performance of UAV-based hyperspectral imagery and a digital surface model (DSM) derived from photogrammetric point clouds for classifying 13 species in the wetland area of the Mai Po Inner Deep Bay Ramsar Site, Hong Kong. The study area was part of a shallow bay with flat terrain; the major species included reedbed and four mangroves: Kandelia obovata, Aegiceras corniculatum, Acrostichum aureum and Acanthus ilicifolius. Other species included various graminaceous plants, arbor, shrub and the invasive species Mikania micrantha. In particular, the invasive species climbs up to the mangrove canopy, causing damage and morphological change, which may make the species harder to distinguish. Hyperspectral images were acquired by a Headwall Nano sensor with a spectral range of 400 nm to 1000 nm at 0.06 m spatial resolution. A sequence of multi-view RGB images was captured at 0.02 m spatial resolution with 75% overlap. The hyperspectral imagery was corrected for radiometric and geometric distortion, while the high-resolution RGB images were matched to generate maximally dense point clouds. Further, a 5 cm grid digital surface model (DSM) was derived from the dense point clouds. Multiple feature reduction methods were compared to identify the most efficient method and to explore the spectral bands most significant for distinguishing the species. The examined methods included stepwise discriminant analysis (DA), support vector machine (SVM) and minimum noise fraction (MNF) transformation.
Subsequently, spectral subsets composed of the 20 most important bands extracted by SVM, DA and MNF, and multi-source subsets adding the DSM to the 20 spectral bands, served as input to a maximum likelihood classifier (MLC) and an SVM classifier, and the classification results were compared. The results showed that the feature reduction methods, from best to worst, were MNF transformation, DA and SVM; MNF transformation accuracy was even higher than the all-bands result. The selected bands frequently lay along the green peak, red edge and near infrared. Additionally, DA found that the chlorophyll-absorption red band and the yellow band were also important for species classification. In terms of 3D data, the DSM enhanced the discriminant capacity among low plants, arbor and mangrove. Meanwhile, the DSM largely reduced misclassification due to shadow effects and inter-species morphological variation. With respect to classifiers, the nonparametric SVM outperformed MLC for the high-dimensional, multi-source data in this study. The SVM classifier tended to produce higher overall accuracy and fewer scattered patches, although it cost more time than MLC. The best result was obtained by combining the MNF components and the DSM in the SVM classifier. This study offers a precise species-distribution survey solution for inaccessible wetland areas at low cost in time and labour. In addition, the positive effect of the DSM and the identified spectral features indicate that UAV-borne hyperspectral imagery and photogrammetry-derived 3D data are promising for further research on wetland species, such as bio-parameter modelling and biological invasion monitoring.
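The multi-source input idea, stacking selected spectral bands with the DSM before feeding an SVM classifier, can be sketched as follows; the data are synthetic and scikit-learn is assumed, whereas the real pipeline used MNF-selected bands from the imagery:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n = 300
# Hypothetical per-pixel inputs: 20 selected spectral bands plus one DSM height.
bands = rng.normal(size=(n, 20))
height = rng.normal(size=(n, 1))
# Toy ground truth: two "species" separated mainly by canopy height,
# mimicking how the DSM helps separate low plants from arbor and mangrove.
y = (height[:, 0] > 0).astype(int)

X = np.hstack([bands, height])          # multi-source subset: spectrum + DSM
clf = SVC(kernel="rbf").fit(X, y)
acc = clf.score(X, y)                   # training accuracy on the toy data
```

Because the discriminating signal here sits in the height column, dropping it from X would sharply degrade the fit, which parallels the reported gain from adding the DSM.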

Keywords: digital surface model (DSM), feature reduction, hyperspectral, photogrammetric point cloud, species mapping, unmanned aerial vehicle (UAV)

Procedia PDF Downloads 243
1325 Text Localization in Fixed-Layout Documents Using Convolutional Networks in a Coarse-to-Fine Manner

Authors: Beier Zhu, Rui Zhang, Qi Song

Abstract:

Text contained within fixed-layout documents such as ID cards, invoices, cheques, and passports can be of great semantic value and so requires high localization accuracy. Recently, algorithms based on deep convolutional networks have achieved high performance on text detection tasks. However, for text localization in fixed-layout documents, such algorithms detect word bounding boxes individually, ignoring the layout information. This paper presents a novel architecture built on convolutional neural networks (CNNs): a global text localization network and a regional bounding-box regression network are introduced to tackle the problem in a coarse-to-fine manner. The text localization network simultaneously locates word bounding points, taking the layout information into account. The bounding-box regression network takes features pooled from arbitrarily sized RoIs and refines the localizations. The two networks share their convolutional features and are trained jointly. A typical type of fixed-layout document, the ID card, is selected to evaluate the effectiveness of the proposed system. The networks are trained on data cropped from natural scene images and on synthetic data produced by a synthetic text generation engine. Experiments show that our approach locates word bounding boxes with high accuracy and achieves state-of-the-art performance.
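Word-box localization accuracy of the kind reported here is commonly scored by intersection-over-union; a minimal sketch, where the coarse and refined boxes are illustrative rather than taken from the paper's experiments:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two word boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A coarse box from a global stage vs. a refined box from a regression stage.
coarse = (10, 10, 50, 30)
refined = (12, 11, 52, 31)
truth = (12, 11, 52, 31)
```

A coarse-to-fine pipeline succeeds exactly when the refinement stage pushes the IoU against ground truth above what the coarse stage alone achieves.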

Keywords: bounding box regression, convolutional networks, fixed-layout documents, text localization

Procedia PDF Downloads 179
1324 Language Inequalities in the Algerian Public Space: A Semiotic Landscape Analysis

Authors: Sarah Smail

Abstract:

Algeria has been subject to countless conquests and invasions that have resulted in a diverse linguistic repertoire. The sociolinguistic situation of the country makes linguistic landscape analysis pertinent, and this has led to a range of linguistic landscape studies that mainly identify the country's sociolinguistic situation through the analysis of shop names. The present research adds to the existing literature by offering another perspective on the analysis of signs, combining the physical and the digital semiotic landscape. The powerful oil, gas and agri-food industries in Algeria make it interesting to focus on the commodification of natural resources, in order to identify the language and semiotic resources deployed in the Algerian public scene as well as the visibility of linguistic inequalities and minorities in the business domain. The study discusses the semiotic landscape of three trade cities, Bejaia, Setif and Hassi-Messaoud, and draws on interviews with business owners and graphic designers and questionnaires with business employees. The study relies on Gorter’s multilingual inequalities in public space (MIPS) model (2021) and Irvine and Gal’s language ideology and linguistic differentiation (2000). The preliminary results demonstrate the sociolinguistic injustice existing in the business domain, e.g., the exclusion of the official languages, the dominance of foreign languages, and the excessive use of the Roman script.

Keywords: semiotic landscaping, digital scapes, language commodification, linguistic inequalities, business signage

Procedia PDF Downloads 88
1323 Flood Hazard Assessment and Land Cover Dynamics of the Orai Khola Watershed, Bardiya, Nepal

Authors: Loonibha Manandhar, Rajendra Bhandari, Kumud Raj Kafle

Abstract:

Nepal’s Terai region is part of the Ganges river basin, one of the most disaster-prone areas of the world, with recurrent monsoon flooding causing millions in damage and the death and displacement of hundreds of people and households every year. The vulnerability of human settlements to natural disasters such as floods is increasing, and mapping changes in land use practices and hydro-geological parameters is essential in developing resilient communities and strong disaster management policies. The objective of this study was to develop a flood hazard zonation map of the Orai Khola watershed and to map the decadal land use/land cover dynamics of the watershed. The watershed area was delineated using the SRTM DEM, and LANDSAT images were classified into five land use classes (forest, grassland, sediment and bare land, settlement area and cropland, and water body) using pixel-based semi-automated supervised maximum likelihood classification. Decadal changes in each class were then quantified using spatial modelling. Flood hazard mapping was performed by assigning weights to the factors slope, rainfall distribution, distance from the river and land use/land cover on the basis of their estimated influence in causing flood hazard, and performing weighted overlay analysis to identify highly vulnerable areas. Forest and grassland coverage increased by 11.53 km² (3.8%) and 1.43 km² (0.47%) from 1996 to 2016. Sediment and bare land areas decreased by 12.45 km² (4.12%) over the same period, whereas settlement and cropland areas showed a consistent increase, to 14.22 km² (4.7%). Waterbody coverage also increased, to 0.3 km² (0.09%), from 1996 to 2016. Of the total watershed area, 1.27% (3.65 km²) was categorized as a very low hazard zone, 20.94% (60.31 km²) as a low hazard zone, 37.59% (108.3 km²) as a moderate hazard zone and 29.25% (84.27 km²) as a high hazard zone, while 31 villages comprising 10.95% (31.55 km²) fell into the very high hazard zone.
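The weighted overlay step can be sketched as a per-cell weighted sum of reclassified factor rasters; the weights and the tiny 2x2 rasters below are illustrative, not the study's estimated values:

```python
import numpy as np

# Hypothetical factor rasters rescaled to a common 1 (low) to 5 (high) scale.
slope      = np.array([[1, 2], [4, 5]])
rainfall   = np.array([[2, 2], [5, 5]])
river_dist = np.array([[1, 3], [4, 5]])   # nearness to river, rescaled
landcover  = np.array([[2, 2], [3, 5]])

# Hypothetical influence weights summing to 1; the study estimates its own.
weights = {"slope": 0.3, "rainfall": 0.3, "river_dist": 0.25, "landcover": 0.15}

hazard = (weights["slope"] * slope + weights["rainfall"] * rainfall
          + weights["river_dist"] * river_dist + weights["landcover"] * landcover)
```

The resulting hazard surface is then binned into the very low to very high zones reported above.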

Keywords: flood hazard, land use/land cover, Orai river, supervised maximum likelihood classification, weighted overlay analysis

Procedia PDF Downloads 327
1322 Characterization of Agroforestry Systems in Burkina Faso Using an Earth Observation Data Cube

Authors: Dan Kanmegne

Abstract:

Africa will become the most populated continent by the end of the century, with around 4 billion inhabitants. Food security and climate change will become continental issues, since agricultural practices depend on climate but also contribute to global emissions and land degradation. Agroforestry has been identified as a cost-efficient and reliable strategy to address these two issues. It is defined as the integrated management of trees and crops/animals in the same land unit. Agroforestry provides benefits in terms of goods (fruits, medicine, wood, etc.) and services (windbreaks, fertility, etc.), and is acknowledged to have great potential for carbon sequestration; it can therefore be integrated into mechanisms for reducing carbon emissions. In sub-Saharan Africa in particular, the constraint is the lack of information, at the country level, about both the areas under agroforestry and the characterization (composition, structure, and management) of each agroforestry system. This study describes and quantifies "what is where?" as a prerequisite to quantifying the carbon stock in the different systems. Remote sensing (RS) is the most efficient approach to mapping a practice as dynamic as agroforestry, since it gives relatively adequate and consistent information over a large area at nearly no cost. RS data fulfill the good practice guidelines of the Intergovernmental Panel on Climate Change (IPCC) for use in carbon estimation. Satellite data are becoming more and more accessible, and the archives are growing exponentially. To retrieve useful decision-support information from this large amount of data, satellite data need to be organized so as to ensure fast processing, quick accessibility, and ease of use. A new solution is a data cube, which can be understood as a multi-dimensional stack (space, time, data type) of spatially aligned pixels used for efficient access and analysis.
A data cube for Burkina Faso has been set up through the cooperation project between the international service provider WASCAL and Germany, which provides an accessible exploitation architecture for multi-temporal satellite data. The aim of this study is to map and characterize agroforestry systems using the Burkina Faso earth observation data cube. The approach in its initial stage is based on an unsupervised image classification of a normalized difference vegetation index (NDVI) time series from 2010 to 2018, to stratify the country based on vegetation. Fifteen strata were identified, and four samples per location were randomly assigned to define the sampling units. For safety reasons, the northern part of the country will not be part of the fieldwork. A total of 52 locations will be visited by the end of the dry season in February-March 2020. The field campaigns will consist of identifying and describing the different agroforestry systems, together with qualitative interviews. A multi-temporal supervised image classification will then be done with a random forest algorithm, with the field data used both for training the algorithm and for accuracy assessment. The expected outputs are (i) map(s) of agroforestry dynamics; (ii) characteristics of the different systems (main species, management, area, etc.); and (iii) an assessment report on the Burkina Faso data cube.
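The initial stratification step, unsupervised clustering of NDVI time-series profiles, can be sketched with a plain k-means on toy pixel profiles; the real work of course classifies an entire national data cube rather than twenty pixels:

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: cluster pixels by their NDVI time-series profile."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.stack([X[labels == c].mean(0) if (labels == c).any()
                            else centers[c] for c in range(k)])
    return labels

# Toy NDVI series (pixels x dates): low, flat profiles vs. high ones,
# standing in for sparsely vegetated vs. densely vegetated strata.
bare = np.full((10, 6), 0.1) + 0.01 * np.random.default_rng(1).normal(size=(10, 6))
veg  = np.full((10, 6), 0.7) + 0.01 * np.random.default_rng(2).normal(size=(10, 6))
X = np.vstack([bare, veg])
labels = kmeans(X, k=2)
```

Each resulting cluster corresponds to one vegetation stratum, within which the field sampling locations are then drawn.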

Keywords: agroforestry systems, Burkina Faso, earth observation data cube, multi-temporal image classification

Procedia PDF Downloads 127
1321 Can the Intervention of SCAMPER Bring about Changes of Neural Activation While Taking Creativity Tasks?

Authors: Yu-Chu Yeh, WeiChin Hsu, Chih-Yen Chang

Abstract:

Substitution, combination, modification, putting to other uses, elimination, and rearrangement (SCAMPER) has been regarded as an effective technique that provides a structured way to help people produce creative ideas and solutions. Although some neuroscience studies of creativity training have been conducted, none has focused on SCAMPER. This study therefore aimed to examine whether learning SCAMPER through video tutorials would result in alterations of neural activation. Thirty college students were randomly assigned to an experimental group or a control group. The experimental group was asked to watch SCAMPER videos, whereas the control group watched natural-scene videos regarded as neutral stimulus materials. Each participant was scanned in a functional magnetic resonance imaging (fMRI) machine while undertaking a creativity test before and after watching the videos. A two-way ANOVA was used to analyze the interaction between group (experimental; control) and task (C task; M task; X task). The results revealed that the left precuneus was significantly activated in the interaction of group and task, as well as in the main effect of group. Furthermore, compared with the control group, the experimental group showed greater activation in the default mode network (left precuneus and left inferior parietal cortex) and the motor network (left postcentral gyrus and left supplementary motor area). The findings suggest that SCAMPER training may facilitate creativity through stimulation of the default mode network and the motor network.

Keywords: creativity, default mode network, neural activation, SCAMPER

Procedia PDF Downloads 90
1320 Fault Diagnosis of Manufacturing Systems Using AntTreeStoch with Parameter Optimization by ACO

Authors: Ouahab Kadri, Leila Hayet Mouss

Abstract:

In this paper, we present three diagnostic modules for complex and dynamic systems. These modules are based on three ant colony algorithms: AntTreeStoch, Lumer & Faieta and Binary ant colony. We chose these algorithms for their simplicity and wide application range. However, we cannot use these algorithms in their basic forms, as they have several limitations, so to use them in a diagnostic system we have proposed three variants. We tested these variants on datasets issued from two industrial systems: a clinkering system and a pasteurization system.

Keywords: ant colony algorithms, complex and dynamic systems, diagnosis, classification, optimization

Procedia PDF Downloads 287
1319 Vertical and Horizontal Distribution Patterns of Major and Trace Elements: Surface and Subsurface Sediments of the Endorheic Lake Acigol Basin, Denizli, Turkey

Authors: M. Budakoglu, M. Karaman

Abstract:

Although Lake Acıgöl is located in an area with limited influence from urban and industrial pollution sources, there is nevertheless a need to understand all potential lithological and anthropogenic sources of priority contaminants in this closed basin. This study discusses the vertical and horizontal distribution patterns of major and trace elements in recent lake sediments to better understand their current geochemical relation to the lithological units of the Lake Acıgöl basin. The study also provides reliable background levels for the region through detailed data on the surface lithological units. The detailed results for surface, subsurface and shallow core sediments from these relatively unperturbed ecosystems highlight the basin's importance as a conservation area, despite the high-scale industrial salt production activity. While the P2O5/TiO2 versus MgO/CaO classification diagram indicates the magmatic and sedimentary origin of the lake sediments, the Log(SiO2/Al2O3) versus Log(Na2O/K2O) classification diagram expresses lithological assemblages of shale, iron-shale, wacke and arkose. The plots of TiO2 vs. SiO2 and P2O5/TiO2 vs. MgO/CaO also support the origin of the primary magma source. The average compositions of the 20 different lithological units were used as a proxy for the geochemical background in the study area. As expected for weathered rock materials, there is a large variation in the major element content of all analyzed lake samples. The A-CN-K and A-CNK-FM ternary diagrams were used to deduce weathering trends; according to these diagrams, the surface and subsurface sediments display an intense weathering history. Most of the sediment samples plot around UCC and TTG, suggesting a low to moderate weathering history for the provenance. The sediments plot in a region clearly suggesting contents of Al2O3, CaO, Na2O, and K2O relatively similar to those of the lithological samples.

Keywords: Lake Acıgöl, recent lake sediment, geochemical speciation of major and trace elements, heavy metals, Denizli, Turkey

Procedia PDF Downloads 389
1318 Crossing Narrative Waters in World Cinema: Alamar (2009) and Kaili Blues (2015)

Authors: Dustin Dill

Abstract:

The physical movement of crossing over water points to both developing narrative tropes and innovative cinematography in World Cinema today. Two prime examples, Alamar (2009) by Pedro González-Rubio and Kaili Blues (2015) by Bi Gan, demonstrate how contemporary storytelling in film not only rests upon these water shots but also emerges from them. The range of symbolism that these episodes in the story provoke goes hand in hand with the diverse filming sequences found in the respective productions. While González-Rubio cuts the scene into long and longer shots, Gan uses a single take. The differing angles reflect equally distinctive directors and film projects: Alamar runs parallel to many definitions of the essay film, while Kaili Blues resonates much more with the mystery and art film. Nonetheless, the water-crossing scenes influence the narratives' subjects despite the generic consequences, and it is within the essay, mystery, and art film genres that a better understanding of World Cinema becomes possible. As Tiago de Luca explains, World Cinema's prerogative of giving form to a certain type of spectator does not always line up. The immense number of interpretations of crossing water (the escape from suffering to find nirvana, rebirth, and colonization) underlines the difficulty of categorizing it. If this type of cross-genre trait once defined World Cinema in its beginnings, this study observes that González-Rubio and Gan question the all-encompassing genre with their experimental shots of a universal narrative trope, the crossing of water.

Keywords: cinematography, genre, narrative, world cinema

Procedia PDF Downloads 261
1317 Spatial Patterns of Urban Expansion in Kuwait City between 1989 and 2001

Authors: Saad Algharib, Jay Lee

Abstract:

Urbanization is a complex phenomenon that occurs as a city develops from one form to another; in other words, it is the process by which land use/land cover changes from rural to urban. Since oil exploration began, Kuwait City has been growing rapidly due to urbanization and population growth, through both natural increase and inward immigration. The main objective of this study is to detect changes in urban land use/land cover and to examine the changing spatial patterns of urban growth in and around Kuwait City between 1989 and 2001. In addition, this study evaluates the spatial patterns of the detected changes and how they relate to the spatial configuration of the city. Remote sensing and geographic information systems have become very useful and important tools in urban studies, because their integration allows analysts and planners to detect, monitor and analyze urban growth in a region effectively. Moreover, with remotely sensed and GIS data, both planners and users can predict future trends of urban growth, because the data can be updated to the required precision levels. In order to identify the new urban areas between 1989 and 2001, the study uses satellite images of the study area and remote sensing technology to classify these images. An unsupervised classification method was applied to derive land use and land cover data layers. After the unsupervised classification, a GIS overlay function was applied to the classified images to detect the locations and patterns of the new urban areas that developed during the study period. GIS was also utilized to evaluate the distribution of the spatial patterns; for example, Moran’s index was applied to all data inputs to examine the urban growth distribution.
Furthermore, this study assesses whether the spatial patterns and processes of these changes took place in a random fashion or with certain identifiable trends. The results indicate that during the study period the urban area grew by 10 percentage points, from 32.4% in 1989 to 42.4% in 2001. The results also revealed that the largest increase of urban area occurred between the major highways beyond the fourth ring road from the center of Kuwait City. Moreover, the spatial distribution of urban growth occurred in a clustered manner.
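Global Moran's index, used above to examine the urban growth distribution, can be computed as sketched below on a toy one-dimensional strip of cells with a hypothetical contiguity weight matrix:

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: I = (n / sum(W)) * sum(w_ij z_i z_j) / sum(z_i^2)."""
    z = values - values.mean()
    num = (weights * np.outer(z, z)).sum()
    den = (z ** 2).sum()
    return len(values) / weights.sum() * num / den

# Four cells in a row; 1 = urban, 0 = non-urban, with urban cells clustered
# at one end. Weights encode chain contiguity (1 = neighbours, 0 otherwise).
values = np.array([1.0, 1.0, 0.0, 0.0])
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
I = morans_i(values, W)
```

A positive I, as here, indicates spatial clustering of like values; values near zero suggest randomness, and negative values suggest dispersion, which is how the clustered urban growth pattern is diagnosed.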

Keywords: geographic information systems, remote sensing, urbanization, urban growth

Procedia PDF Downloads 156