Search results for: feature expanding.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2111

1031 A Case Study on Expanding Access to Higher Education of Students with Hearing Impairment

Authors: Afaf Manzoor, Abdul Hameed

Abstract:

Children with hearing impairment face several challenges in accessing primary and secondary education in general, and higher education in particular, in Pakistan. A large number of these children are excluded from the formal education system through segregated special institutions. The enrollment rate of these children at school level is very low, and it continues to decrease as they move up the ladder of education. A negligible number of students with hearing impairment get any chance to be enrolled at tertiary or higher education institutes. The segregated system of education at primary and secondary level makes it even more difficult to adjust to an inclusive classroom at a higher level, not only for students with hearing impairment but for their teachers and peers as well. A false belief held by teachers and parents about the low academic profile of students with hearing impairment is one of the major challenges to overcome for their participation in higher education. This case study was conducted to document an innovative step taken by the Department of Special Education Needs, University of Management & Technology, Lahore, Pakistan. The prime objective of this study was to assess the satisfaction level of students with hearing impairment in the BS 4 Years and MA Special Education programs at the Lahore campus. Structured interviews were conducted with 40 students with hearing impairment to assess their satisfaction with service delivery (admission process, classroom pedagogy, content, assessment/results, access to other service centers, i.e., library, cafeteria, hostel, co-curricular activities) and campus life. Their peers without disabilities were also interviewed to assess their acceptance level. The findings of the study revealed positive results about their educational as well as social inclusion in the university. The students also shared the fears they had at the time of admission and how these fears eventually faded with the passage of time due to a proper academic support system. The findings of the study will be shared in detail with the audience during the presentation.

Keywords: students with hearing impairment, higher education, inclusive education, marginalization

Procedia PDF Downloads 305
1030 A Local Invariant Generalized Hough Transform Method for Integrated Circuit Visual Positioning

Authors: Wei Feilong

Abstract:

In this study, a local invariant generalized Hough transform (LI-GHT) method is proposed for integrated circuit (IC) visual positioning. The original generalized Hough transform (GHT) is robust to external noise; however, it is not suitable for visual positioning of IC chips due to the four-dimensional (4D) parameter space, which leads to substantial storage requirements and high computational complexity. The proposed LI-GHT method can reduce the dimensionality of the parameter space to 2D thanks to the rotational invariance of local invariant geometric features, and it can estimate the position and rotation angle of IC chips accurately and in real time under the influence of noise and blur. The experimental results show that the proposed LI-GHT can estimate the position and rotation angle of IC chips with high accuracy and at high speed. The proposed LI-GHT algorithm was implemented in the IC visual positioning system of radio frequency identification (RFID) packaging equipment.
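As background for readers unfamiliar with the voting principle that underlies the GHT family, the following Python sketch implements a minimal classical generalized Hough transform: an R-table of displacements is built from oriented template points, and each image feature point votes for candidate reference positions. It illustrates the baseline voting idea only; the local-invariant 2D parameter-space reduction and rotation estimation proposed in the paper are not reproduced, and all points and values below are hypothetical.

```python
import numpy as np
from collections import defaultdict

def build_r_table(template_points, reference):
    """Store displacement vectors from each template feature point to the
    reference point, indexed by the (quantized) local orientation."""
    r_table = defaultdict(list)
    for (x, y, theta) in template_points:          # theta: local orientation in radians
        key = int(np.degrees(theta)) % 360
        r_table[key].append((reference[0] - x, reference[1] - y))
    return r_table

def ght_vote(image_points, r_table, shape):
    """Each image feature point votes for possible reference positions."""
    accumulator = np.zeros(shape, dtype=np.int32)
    for (x, y, theta) in image_points:
        key = int(np.degrees(theta)) % 360
        for (dx, dy) in r_table.get(key, []):
            cx, cy = x + dx, y + dy
            if 0 <= cx < shape[0] and 0 <= cy < shape[1]:
                accumulator[cx, cy] += 1
    return accumulator

# Toy example: three oriented template points and a shifted copy in the "image".
template = [(10, 10, 0.0), (12, 15, np.pi / 4), (8, 20, np.pi / 2)]
image = [(30, 40, 0.0), (32, 45, np.pi / 4), (28, 50, np.pi / 2)]
table = build_r_table(template, reference=(10, 15))
acc = ght_vote(image, table, shape=(100, 100))
print("Estimated reference position:", np.unravel_index(acc.argmax(), acc.shape))
```

In the toy data the image points are the template shifted by (20, 30), so the accumulator peak recovers the shifted reference point at (30, 45).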

Keywords: integrated circuit visual positioning, generalized Hough transform, local invariant generalized Hough transform, IC packaging equipment

Procedia PDF Downloads 264
1029 High Purity Germanium Detector Characterization by Means of Monte Carlo Simulation through Application of Geant4 Toolkit

Authors: Milos Travar, Jovana Nikolov, Andrej Vranicar, Natasa Todorovic

Abstract:

Over the years, High Purity Germanium (HPGe) detectors have proved to be an excellent practical tool and, as such, have established their wide use today in low-background γ-spectrometry. One of the advantages of gamma-ray spectrometry is its easy sample preparation, as chemical processing and separation of the studied subject are not required. Thus, with a single measurement, one can simultaneously perform both qualitative and quantitative analysis. One of the most prominent features of HPGe detectors, besides their excellent efficiency, is their superior resolution. This feature virtually allows a researcher to perform a thorough analysis by discriminating photons of similar energies in the studied spectra where they would otherwise superimpose within a single-energy peak and, as such, could potentially hamper the analysis and produce wrongly assessed results. Naturally, this feature is of great importance when the identification of radionuclides, as well as their activity concentrations, is being practiced, where high precision comes as a necessity. In measurements of this nature, in order to be able to reproduce good and trustworthy results, one has to have initially performed an adequate full-energy peak (FEP) efficiency calibration of the used equipment. However, experimental determination of the response, i.e., efficiency curves for a given detector-sample configuration and its geometry, is not always easy and requires a certain set of reference calibration sources in order to account for and cover broader energy ranges of interest. With the goal of overcoming these difficulties, many researchers have turned towards the application of different software toolkits that implement the Monte Carlo method (e.g., MCNP, FLUKA, PENELOPE, Geant4, etc.), as it has proven time and time again to be a very powerful tool. In the process of creating a reliable model, one has to have well-established and described specifications of the detector. Unfortunately, the documentation that manufacturers provide alongside the equipment is rarely sufficient for this purpose. Furthermore, certain parameters tend to evolve and change over time, especially with older equipment. Deterioration of these parameters consequently decreases the active volume of the crystal and can thus affect the efficiencies by a large margin if it is not properly taken into account. In this study, the optimisation of models of two HPGe detectors through the implementation of the Geant4 toolkit developed by CERN is described, with the goal of further improving simulation accuracy in calculations of FEP efficiencies by investigating the influence of certain detector variables (e.g., crystal-to-window distance, dead layer thicknesses, inner crystal’s void dimensions, etc.). The detectors on which the optimisation procedures were carried out were a standard traditional co-axial extended range detector (XtRa HPGe, CANBERRA) and a broad energy range planar detector (BEGe, CANBERRA). Optimised models were verified through comparison with experimentally obtained data from measurements of a set of point-like radioactive sources. The acquired results for both detectors displayed good agreement with the experimental data, falling under an average statistical uncertainty of ∼4.6% for the XtRa and ∼1.8% for the BEGe detector within the energy ranges of 59.4−1836.1 keV and 59.4−1212.9 keV, respectively.

Keywords: HPGe detector, γ spectrometry, efficiency, Geant4 simulation, Monte Carlo method

Procedia PDF Downloads 119
1028 Improved Performance in Content-Based Image Retrieval Using Machine Learning Approach

Authors: B. Ramesh Naik, T. Venugopal

Abstract:

This paper presents a novel approach that improves the high-level semantic description of images based on machine learning. Contemporary approaches for image retrieval and object recognition include Fourier transforms, wavelets, SIFT, and HoG. Though these descriptors are helpful in a wide range of applications, they exploit only zero-order statistics, and this limits the descriptiveness of image features. These descriptors usually take advantage of primitive visual features such as shape, color, texture, and spatial location to describe images. Such features are not adequate to describe the high-level semantics of images. This leads to a semantic gap that causes unacceptable performance in image retrieval systems. A novel method, referred to as discriminative learning, has been proposed; derived from a machine learning approach, it efficiently discriminates image features. The proposed approach was validated thoroughly on the WANG and Caltech-101 databases. The results show that this approach is very competitive in content-based image retrieval.

Keywords: CBIR, discriminative learning, region weight learning, scale invariant feature transforms

Procedia PDF Downloads 181
1027 Design of a Service-Enabled Dependable Integration Environment

Authors: Fuyang Peng, Donghong Li

Abstract:

The aim of information systems integration is to make all the data sources, applications, and business flows integrated into the new environment so that unwanted redundancies are reduced and bottlenecks and mismatches are eliminated. Two issues have to be dealt with to meet such requirements: the software architecture that supports resource integration, and the adaptor development tool that helps with the integration and migration of legacy applications. In this paper, a service-enabled dependable integration environment (SDIE) is presented, which has two key components, i.e., a dependable service integration platform and a legacy application integration tool. For the dependable platform for service integration, the service integration bus, the service management framework, the dependable engine for service composition, and the service registry and discovery components are described. For the legacy application integration tool, its basic organization, functionality, and the dependability measures taken are presented. Due to its service-oriented integration model, its light-weight extensible container, its service component combination-oriented p-lattice structure, and other features, SDIE has advantages over commercial products in openness, flexibility, price-performance ratio, and feature support, and it is better than most open-source integration software in functionality, performance, and dependability support.

Keywords: application integration, dependability, legacy, SOA

Procedia PDF Downloads 360
1026 Leveraging Quality Metrics in Voting Model Based Thread Retrieval

Authors: Atefeh Heydari, Mohammadali Tavakoli, Zuriati Ismail, Naomie Salim

Abstract:

Seeking and sharing knowledge on online forums have made them popular in recent years. Although online forums are valuable sources of information, retrieving reliable threads with high-quality content is an issue due to the variety of message sources. The majority of existing information retrieval systems ignore the quality of retrieved documents, particularly in the field of thread retrieval. In this research, we present an approach that employs various quality features in order to investigate the quality of retrieved threads. Different aspects of content quality, including completeness, comprehensiveness, and politeness, are assessed using these features, which leads to finding not only textually but also conceptually relevant threads for a user query within a forum. To analyse the influence of the features, we used an adapted version of the voting model for thread search as a retrieval system. We equipped it with each feature individually, and also with various combinations of features in turn, during multiple runs. The results show that incorporating the quality features significantly enhances the effectiveness of the retrieval system.
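As a rough illustration of how quality features can be folded into a voting-model ranker, the sketch below sums message-level retrieval scores per thread (CombSUM-style voting) and blends that vote with a weighted quality score. The feature names, weights, data, and blending scheme are illustrative assumptions and do not reproduce the exact features or voting variant evaluated in the paper.

```python
from collections import defaultdict

# Hypothetical retrieved messages for one query: (thread_id, retrieval_score) pairs.
retrieved = [("t1", 2.1), ("t2", 1.9), ("t1", 1.2), ("t3", 1.8), ("t2", 0.4)]

# Hypothetical thread-level quality features, each in [0, 1].
quality = {
    "t1": {"completeness": 0.9, "comprehensiveness": 0.7, "politeness": 0.8},
    "t2": {"completeness": 0.4, "comprehensiveness": 0.5, "politeness": 0.9},
    "t3": {"completeness": 0.8, "comprehensiveness": 0.9, "politeness": 0.6},
}
weights = {"completeness": 0.4, "comprehensiveness": 0.4, "politeness": 0.2}

def rank_threads(messages, quality, weights, alpha=0.5):
    """CombSUM voting over messages, blended with a weighted quality score."""
    votes = defaultdict(float)
    for thread_id, score in messages:
        votes[thread_id] += score                      # each message "votes" for its thread
    ranked = {}
    for thread_id, vote in votes.items():
        q = sum(weights[f] * v for f, v in quality[thread_id].items())
        ranked[thread_id] = (1 - alpha) * vote + alpha * q
    return sorted(ranked.items(), key=lambda kv: kv[1], reverse=True)

print(rank_threads(retrieved, quality, weights))
```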

Keywords: content quality, forum search, thread retrieval, voting techniques

Procedia PDF Downloads 213
1025 English Learning Speech Assistant Speak Application in Artificial Intelligence

Authors: Albatool Al Abdulwahid, Bayan Shakally, Mariam Mohamed, Wed Almokri

Abstract:

Artificial intelligence has infiltrated every part of our life and every field we can think of. With technical developments, artificial intelligence applications are becoming more prevalent. We chose ELSA Speak because it is a prominent example of an artificial intelligence application. ELSA Speak is a smartphone application that is free to download on both iOS and Android smartphones. ELSA Speak utilizes artificial intelligence to help non-native English speakers pronounce words and phrases similarly to a native speaker, as well as enhance their English skills. It employs speech-recognition technology that helps the application improve the pronunciation of its users. This remarkable feature distinguishes ELSA from other voice recognition algorithms and increases the efficiency of the application. This study focused on evaluating the ELSA Speak application by testing its degree of effectiveness based on survey questions. The results of the questionnaire were variable. Most of the participants strongly agreed that ELSA has helped them enhance their pronunciation skills. However, a few participants were not confident about the application’s ability to assist them in their learning journey.

Keywords: ELSA Speak application, artificial intelligence, speech-recognition technology, language learning, English pronunciation

Procedia PDF Downloads 106
1024 Feasibility Study on the Application of Waste Materials for Production of Sustainable Asphalt Mixtures

Authors: Farzaneh Tahmoorian, Bijan Samali, John Yeaman

Abstract:

Road networks have been expanding all over the world during the past few decades to meet the increasing freight volumes created by population growth and industrial development. At the same time, the rate of generation of solid wastes in society is increasing with population growth, technological development, and changes in people's lifestyles. Thus, the management of solid wastes has become an acute problem. Accordingly, there is a need for greater efficiency in the construction and maintenance of road networks and for reducing the overall cost, especially the utilization of natural materials such as aggregates. An efficient means of reducing the construction and maintenance costs of road networks is to replace natural (virgin) materials with secondary, recycled materials. Recycling will also help to reduce pressure on landfills and the demand for extraction of virgin natural materials, thus ensuring sustainability. The application of solid wastes in the asphalt layer reduces not only the environmental issues associated with waste disposal but also the demand for virgin materials, which subsequently contributes to sustainability. Therefore, this research aims to investigate the feasibility of the application of waste materials such as glass, construction and demolition wastes, etc., as alternative materials in pavement construction, particularly flexible pavements. To this end, various combinations of different waste materials in certain percentages are considered in designing the asphalt mixture. One of the goals of this research is to determine the optimum percentage of all these materials in the mixture. This is done through a series of tests to evaluate the volumetric properties and resilient modulus of the mixture. The information and data collected from these tests are used to select adequate samples for further assessment through advanced tests, such as the triaxial dynamic test and the fatigue test, in order to investigate the asphalt mixture's resistance to permanent deformation and cracking. This paper presents the results of these investigations on the application of waste materials in asphalt mixtures for the production of a sustainable asphalt mix.

Keywords: asphalt, glass, pavement, recycled aggregate, sustainability

Procedia PDF Downloads 236
1023 Alienation in Some Contemporary Anglo-Arab Novels

Authors: Atef Abdallah Abouelmaaty

Abstract:

The aim of this paper is to study the theme of alienation in some contemporary novels by the most prominent Arab writers who live in Britain and write in English. The paper will focus on three female novelists of Arab origin who have won wide fame among the reading public, as well as international prizes for their literary creations. The first is the Egyptian Ahdaf Soueif (born in 1950), whose novel The Map of Love (1999) was shortlisted for the Man Booker Prize, has been translated into twenty-one languages, and has sold over a million copies. The second is the Jordanian Fadia Faqir (born in 1956), whose My Name is Salma (2007) was translated into thirteen languages and was a runner-up for the ALOA literary prize. The third is the Sudanese Leila Aboulela (born in 1964), whose The Translator was nominated for the Orange Prize and was chosen as a notable book of the year by the New York Times in 2006. The main reason for choosing the theme of alienation is that it is the defining feature of the above-mentioned novels. The theme is clearly projected, and we can see different kinds of alienation: alienation of man from himself, alienation of man from other men, and alienation of man from society. The paper is concerned with studying this central theme together with its different forms. Moreover, the paper will try to identify the main causes of this alienation, among which are frustrated love, the failure to adjust to change, and ethnic pride.

Keywords: alienation, Anglo-Arab, contemporary, novels

Procedia PDF Downloads 439
1022 Color Fusion of Remote Sensing Images for Imparting Fluvial Geomorphological Features of River Yamuna and Ganga over Doon Valley

Authors: P. S. Jagadeesh Kumar, Tracy Lin Huan, Rebecca K. Rossi, Yanmin Yuan, Xianpei Li

Abstract:

The fiscal growth of any country hinges on the prudent administration of water resources. The rivers Yamuna and Ganga are regarded as the lifeline of India, as they provide for the needs of life to endure. Earth observation through remote sensing images permits the precise description and identification of materials on the surface from space and airborne platforms. Multiple and heterogeneous image sources are accessible for the same geographical section: multispectral, hyperspectral, radar, multitemporal, and multiangular images. In this paper, a taxonomical study of the fluvial geomorphological features of the rivers Yamuna and Ganga over the Doon valley was performed using color fusion of multispectral remote sensing images. Experimental results showed that the segmentation-based colorization technique, grounded in pattern recognition and color mapping, produced more colorful and faithful colorized images for geomorphological feature extraction.
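As a minimal illustration of the kind of color fusion described above, the sketch below builds a false-color composite by stretching three multispectral bands and stacking them into the RGB channels with NumPy. The band order and percentile stretch are assumptions for illustration, not the segmentation-based colorization pipeline used by the authors, and the synthetic arrays stand in for co-registered imagery.

```python
import numpy as np

def percentile_stretch(band, low=2, high=98):
    """Linearly stretch a band between its low/high percentiles into [0, 1]."""
    lo, hi = np.percentile(band, [low, high])
    return np.clip((band - lo) / (hi - lo + 1e-9), 0.0, 1.0)

def false_color_composite(nir, red, green):
    """Classic NIR-red-green composite: vegetation appears bright red."""
    return np.dstack([percentile_stretch(b) for b in (nir, red, green)])

# Synthetic 64x64 "bands" standing in for co-registered multispectral imagery.
rng = np.random.default_rng(0)
nir, red, green = (rng.random((64, 64)) for _ in range(3))
rgb = false_color_composite(nir, red, green)
print(rgb.shape, rgb.min(), rgb.max())   # (64, 64, 3) with values in [0, 1]
```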

Keywords: color fusion, geomorphology, fluvial processes, multispectral images, pattern recognition

Procedia PDF Downloads 306
1021 Relocation of Livestock in Rural Areas of Canakkale Province Using Remote Sensing and GIS

Authors: Melis Inalpulat, Tugce Civelek, Unal Kizil, Levent Genc

Abstract:

Livestock production is one of the most important components of the rural economy. Due to urban expansion, rural areas close to expanding cities transform into urban districts over time. However, legislation places restrictions on livestock farming in such administrative units, since these operations tend to create environmental concerns like odor problems resulting from excessive manure production. Therefore, existing animal operations should be moved away from the settlement areas. This paper focused on the determination of suitable lands for livestock production in the Canakkale province of Turkey using remote sensing (RS) data and GIS techniques. To achieve this goal, Formosat 2 and Landsat 8 imagery, the ASTER DEM, 1:25000 scale soil maps, village boundaries, and village livestock inventory records were used. The study was conducted using a suitability analysis, which evaluates the land in terms of limitations and potentials, and the suitability range was categorized as Suitable (S) and Non-Suitable (NS). Limitations included the distances from main roads and crossroads, water resources, and settlements, while potentials were appropriate values for slope, land use capability, and land use/land cover (LULC) status. Village-based S land distribution results were presented and compared with livestock inventories. Results showed that approximately 44230 ha is inappropriate because of the distance limitations (NS). Moreover, according to the LULC map, 71052 ha consists of forests, olive and other orchards, and thus may not be suitable for building such structures (NS). In comparison, it was found that there is a total of 1228 ha of S lands within the study area. The village-based findings indicated that in some villages livestock production continues in NS areas. Finally, it was suggested that organized livestock zones may be constructed to serve more than one village after detailed analyses are completed, considering also political decisions, the opinions of the local people, etc.
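To make the suitability logic concrete, the sketch below combines a few raster layers with NumPy boolean algebra into a Suitable/Non-Suitable mask. The thresholds, layer names, and cell size are illustrative assumptions rather than the exact criteria used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (200, 200)                       # toy raster grid, one cell per pixel

slope = rng.uniform(0, 30, shape)        # percent slope
dist_road = rng.uniform(0, 5000, shape)  # metres to nearest main road / crossroad
dist_water = rng.uniform(0, 3000, shape) # metres to nearest water resource
dist_settlement = rng.uniform(0, 4000, shape)
lulc = rng.integers(1, 6, shape)         # 1=forest, 2=orchard, 3=pasture, 4=arable, 5=other

# Limitations: too close to roads, water resources, or settlements -> Non-Suitable.
limitations = (dist_road < 500) | (dist_water < 300) | (dist_settlement < 1000)

# Potentials: gentle slope and LULC classes assumed appropriate for livestock facilities.
potentials = (slope < 10) & np.isin(lulc, [3, 4, 5])

suitable = potentials & ~limitations     # True = Suitable (S), False = Non-Suitable (NS)
cell_area_ha = 0.01                      # e.g. a 10 m x 10 m cell
print(f"Suitable area: {suitable.sum() * cell_area_ha:.1f} ha")
```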

Keywords: GIS, livestock, LULC, remote sensing, suitable lands

Procedia PDF Downloads 298
1020 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies - it will kill an estimated 7 million people every year, costing world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (i.e. season, weekday/weekend), future weather forecasts, as well as past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall 92% model accuracy. The combined model is able to predict more accurately than any of the individual models, and it is able to reliably forecast season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
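A minimal sketch of the model-averaging step using scikit-learn's soft-voting ensemble over logistic regression, a random forest, and a small neural network. The synthetic data and hyperparameters below are placeholders and do not represent the RP4/EPA datasets or the tuned models from the study.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for (season, weekday, weather forecast, past pollutant levels, ...).
X, y = make_classification(n_samples=2000, n_features=8, n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("logreg", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
        ("forest", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("mlp", make_pipeline(StandardScaler(), MLPClassifier(hidden_layer_sizes=(32,),
                                                              max_iter=1000, random_state=0))),
    ],
    voting="soft",   # average predicted probabilities across the three models
)
ensemble.fit(X_train, y_train)
print(f"Combined model accuracy: {ensemble.score(X_test, y_test):.3f}")
```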

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 127
1019 Spatial Data Science for Data Driven Urban Planning: The Youth Economic Discomfort Index for Rome

Authors: Iacopo Testi, Diego Pajarito, Nicoletta Roberto, Carmen Greco

Abstract:

Today, a consistent segment of the world’s population lives in urban areas, and this proportion will vastly increase in the next decades. Therefore, understanding the key trends in urbanization likely to unfold over the coming years is crucial to the implementation of sustainable urban strategies. In parallel, the daily amount of digital data produced will be expanding at an exponential rate during the following years. The analysis of various types of data sets and their derived applications has incredible potential across different crucial sectors such as healthcare, housing, transportation, energy, and education. Nevertheless, in city development, architects and urban planners appear to rely mostly on traditional and analogue techniques of data collection. This paper investigates the prospects of the data science field, which appears to be a formidable resource to assist city managers in identifying strategies to enhance the social, economic, and environmental sustainability of our urban areas. The collection of different new layers of information would definitely enhance planners' capability to comprehend urban phenomena in more depth, such as gentrification, land use definition, mobility, or critical infrastructural issues. Specifically, the research correlates economic, commercial, demographic, and housing data with the purpose of defining a youth economic discomfort index. The statistical composite index provides insights regarding the economic disadvantage of citizens aged between 18 and 29 years, and the results clearly display that central urban zones are more disadvantaged than peripheral ones. The experimental setup selected the city of Rome as the testing ground of the whole investigation. The methodology aims at applying statistical and spatial analysis to construct a composite index supporting informed data-driven decisions for urban planning.
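The composite-index construction can be illustrated with a few lines of pandas: zone-level indicators are standardized (z-scores), oriented so that higher values mean more discomfort, and combined with equal weights. The indicator names, orientation, weights, and zone values below are assumptions for illustration, not the authors' exact specification or data.

```python
import pandas as pd

# Hypothetical zone-level indicators for residents aged 18-29.
data = pd.DataFrame(
    {
        "zone": ["Centro", "Prati", "EUR", "Ostia"],
        "youth_unemployment_rate": [0.28, 0.22, 0.18, 0.31],
        "median_rent_to_income": [0.55, 0.48, 0.35, 0.40],
        "business_density": [120, 95, 80, 40],   # higher density = less discomfort
    }
).set_index("zone")

# Orient every indicator so that larger values mean more economic discomfort.
oriented = data.copy()
oriented["business_density"] = -oriented["business_density"]

# Standardize (z-scores) and aggregate with equal weights into the composite index.
z = (oriented - oriented.mean()) / oriented.std()
discomfort_index = z.mean(axis=1).rename("youth_economic_discomfort")
print(discomfort_index.sort_values(ascending=False))
```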

Keywords: data science, spatial analysis, composite index, Rome, urban planning, youth economic discomfort index

Procedia PDF Downloads 135
1018 Referring to Jordanian Female Relatives in Public

Authors: Ibrahim Darwish, Noora Abu Ain

Abstract:

Referring to female relatives in public by male Jordanian speakers is governed by various linguistic and social constraints. Although Jordanian society is less conservative than it was a few decades ago, women are still considered the weaker link in society, and men still believe that they need to protect them. Conservative Jordanians often avoid referring to their female relatives overtly, i.e., using their real names. Instead, they use covert names, such as pseudonyms, nicknames, pet names, etc. The reason behind such language use has to do with how Arab men, in general, see women as part of their honor. This study intends to investigate to what extent Jordanian males hide their female relatives’ names in public domains. The data was collected from spontaneous informal voice-recorded interviews carried out in the village of Saham in the far north of Jordan. Saham’s dialect is part of the larger Horani dialect used by speakers across a wide area that stretches from Salt in the south to the Syrian border in the north of Jordan. The voice-recorded interviews were originally carried out as an audio record of some customs and traditions in the village of Saham in 2013. During most of these interviews, the researchers observed how the male participants indirectly referred to their female relatives. Instead of using real names, the male speakers used broad terms to refer to their female relatives, such as al-Beit ‘the home,’ al-ciyaal ‘the kids’, um-x ‘the mother of x,’ etc. All tokens related to the issue in question were collected, analyzed, and quantified across three age cohorts: young, middle-aged, and old speakers. The results show that young speakers are more direct in referring to their female relatives than the other two age groups. This can point to a possible change in progress in the speech community of Saham. It is argued that, due to contact with other urban speech communities, the young speakers in Saham do not feel the need to hide the real names of their female relatives, as they consider them equals. Indeed, the young generation is more open to the idea of women's rights and calls for expanding women’s roles in Jordanian society.

Keywords: gender differences, Horan, proper names, social constraints

Procedia PDF Downloads 141
1017 Fuzzy-Machine Learning Models for the Prediction of Fire Outbreak: A Comparative Analysis

Authors: Uduak Umoh, Imo Eyoh, Emmauel Nyoho

Abstract:

This paper compares fuzzy-machine learning algorithms, namely Support Vector Machine (SVM) and K-Nearest Neighbor (KNN), for predicting cases of fire outbreak. The paper uses a fire outbreak dataset with three features (temperature, smoke, and flame). The data is pre-processed using an Interval Type-2 Fuzzy Logic (IT2FL) algorithm, Min-Max normalization, and Principal Component Analysis (PCA), which are used to predict feature labels in the dataset, normalize the dataset, and select relevant features, respectively. The output of the pre-processing is a dataset with two principal components (PC1 and PC2). The pre-processed dataset is then used in the training of the aforementioned machine learning models. The K-fold (with K=10) cross-validation method is used to evaluate the performance of the models using the metrics ROC (Receiver Operating Characteristic curve), specificity, and sensitivity. The models are also tested with 20% of the dataset. The validation results show that KNN is the better model for fire outbreak detection, with an ROC value of 0.99878, followed by SVM with an ROC value of 0.99753.
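The pre-processing and comparison stages (omitting the IT2FL labelling step) can be sketched with scikit-learn as below: min-max normalization, PCA down to two principal components, and 10-fold cross-validated ROC AUC for SVM and KNN. The synthetic data merely stands in for the temperature/smoke/flame dataset, so the scores will not match those reported.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

# Synthetic stand-in for the (temperature, smoke, flame) fire-outbreak dataset.
X, y = make_classification(n_samples=600, n_features=3, n_informative=3,
                           n_redundant=0, random_state=42)

models = {
    "SVM": make_pipeline(MinMaxScaler(), PCA(n_components=2), SVC()),
    "KNN": make_pipeline(MinMaxScaler(), PCA(n_components=2), KNeighborsClassifier(n_neighbors=5)),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10, scoring="roc_auc")   # K-fold with K=10
    print(f"{name}: mean ROC AUC = {scores.mean():.5f} (+/- {scores.std():.5f})")
```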

Keywords: machine learning algorithms, interval type-2 fuzzy logic, fire outbreak, support vector machine, K-nearest neighbour, principal component analysis

Procedia PDF Downloads 181
1016 Introduction, Establishment, and Transformation: An Initial Exploration of the Cultural Shifts and Influence of Fa Yi Chong De, Yi-Kuan-Tao in Malaysian Chinese Community

Authors: Lim Pey Huan

Abstract:

Yi-Kuan-Tao has been developing in Malaysia for nearly 60 years. It was initially introduced from mainland China and later, starting from the 1970s, from Taiwan. Yi-Kuan-Tao was considered a 'new religion' by the local Chinese community in Malaysia in its early stages, as Chinese immigrants primarily practiced Taoism, Buddhism, Christianity, or Catholicism upon settling in the region. The overseas propagation and development of Yi-Kuan-Tao today primarily occur through Taiwanese temples, which began spreading abroad as early as 1949. Particularly since the 1970s, with the rapid economic growth of Taiwan, various branches of Taiwanese Yi-Kuan-Tao have gained the economic strength to propagate abroad, further expanding the influence of Yi-Kuan-Tao overseas. Southeast Asia is the region outside Taiwan where the propagation and development of Yi-Kuan-Tao are fastest and most concentrated. With over 6 million Chinese inhabitants, Malaysia's pursuit of traditional Chinese culture has led to a flourishing interest in Yi-Kuan-Tao, particularly its advocacy of the unity of Confucianism, Buddhism, and Taoism, with an emphasis on promoting Confucian thought. Moreover, Taiwan's rapid economic development since the 1970s has enabled Yi-Kuan-Tao to allocate significant human and financial resources to external propagation efforts. Additionally, Malaysia's government has adopted a relatively tolerant policy towards religion since that time, further fostering the flourishing development of Yi-Kuan-Tao in Malaysia. Furthermore, this thesis aims to strengthen the lineage and continuity of the Yi-Kuan-Tao tradition, particularly the Fa Yi Chong De branch, through the perspective of the Heavenly Mandate (天命). By examining the different origins and ethnic backgrounds involved, it investigates how the Malaysian Chinese community has experienced different changes through the cultural baptism of religion, thus delving into the religious influence of Yi-Kuan-Tao. Given that the Fa Yi Chong De Academy in Taiwan is currently in an active development and construction phase, academic works related to Yi-Kuan-Tao will lay a more solid academic foundation for the future establishment of the academy.

Keywords: initial exploration, cultural shifts, Yi-Kuan-Tao, Malaysian Chinese community

Procedia PDF Downloads 78
1015 A Simple Algorithm for Real-Time 3D Capturing of an Interior Scene Using a Linear Voxel Octree and a Floating Origin Camera

Authors: Vangelis Drosos, Dimitrios Tsoukalos, Dimitrios Tsolis

Abstract:

We present a simple algorithm for capturing a 3D scene (focused on the usage of mobile device cameras in the context of augmented/mixed reality) by using a floating origin camera solution and storing the resulting information in a linear voxel octree. Data is derived from cloud points captured by a mobile device camera. For the purposes of this paper, we assume a scene of fixed size (known to us or determined beforehand) and a fixed voxel resolution. The resulting data is stored in a linear voxel octree using a hashtable. We commence by briefly discussing the logic behind floating origin approaches and the usage of linear voxel octrees for efficient storage. Following that, we present the algorithm for translating captured feature points into voxel data in the context of a fixed origin world and storing them. Finally, we discuss potential applications and areas of future development and improvement to the efficiency of our solution.
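A minimal Python sketch of the storage side of this approach: captured points are expressed relative to a floating origin, quantized to voxel coordinates at a fixed resolution, encoded as linear-octree (Morton) keys, and stored in a hashtable (a Python dict here). The names, voxel size, and depth are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

VOXEL_SIZE = 0.05      # metres per voxel (fixed resolution, assumed)
DEPTH = 10             # octree depth -> 2**DEPTH voxels per axis

def morton_key(ix, iy, iz, depth=DEPTH):
    """Interleave the bits of the three voxel indices into one linear-octree key."""
    key = 0
    for bit in range(depth):
        key |= ((ix >> bit) & 1) << (3 * bit)
        key |= ((iy >> bit) & 1) << (3 * bit + 1)
        key |= ((iz >> bit) & 1) << (3 * bit + 2)
    return int(key)

def insert_points(points_world, camera_origin, voxel_map):
    """Quantize floating-origin points to voxels and store counts in a hashtable."""
    local = np.asarray(points_world) - np.asarray(camera_origin)   # floating-origin shift
    indices = np.floor(local / VOXEL_SIZE).astype(int) + (1 << (DEPTH - 1))
    for ix, iy, iz in indices:
        key = morton_key(ix, iy, iz)
        voxel_map[key] = voxel_map.get(key, 0) + 1   # feature points falling in this voxel

voxels = {}                                           # the linear octree: key -> payload
cloud = np.random.default_rng(0).uniform(-1.0, 1.0, size=(500, 3))  # fake captured points
insert_points(cloud, camera_origin=(0.2, 0.0, -0.1), voxel_map=voxels)
print(f"{len(voxels)} occupied voxels stored in the hashtable")
```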

Keywords: voxel, octree, computer vision, XR, floating origin

Procedia PDF Downloads 133
1014 Output-Feedback Control Design for a General Class of Systems Subject to Sampling and Uncertainties

Authors: Tomas Menard

Abstract:

The synthesis of output-feedback control laws has been investigated by many researchers since the last century. While many results exist for the case of Linear Time Invariant systems whose measurements are continuously available, control laws are nowadays usually implemented on micro-controllers, so the measurements are discrete-time by nature. This fact has to be taken into account explicitly in order to obtain satisfactory behavior of the closed-loop system. We consider here a general class of systems corresponding to an observability normal form which is subject to uncertainties in the dynamics and to sampling of the output. Indeed, in practice, the modeling of the system is never perfect; this results in unknown uncertainties in the dynamics of the model. We propose an output-feedback algorithm which is based on a linear state feedback and a continuous-discrete time observer. The main feature of the proposed control law is that only discrete-time measurements of the output are needed. Furthermore, it is formally proven that the state of the closed-loop system exponentially converges toward the origin despite the unknown uncertainties. Finally, the performance of this control scheme is illustrated with simulations.
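A minimal numerical sketch of the scheme, under assumptions not stated in the abstract: a second-order chain of integrators with a bounded unknown disturbance, a linear state feedback computed from the observer estimate, and a continuous-discrete observer that predicts with the nominal model between sampling instants and corrects only when a new output sample arrives. The gains and parameters are illustrative, not the ones proven in the paper.

```python
import numpy as np

dt, T_sample, t_end = 1e-3, 0.05, 5.0    # integration step, output sampling period, horizon
K = np.array([4.0, 4.0])                 # state-feedback gain, u = -K @ x_hat
Ld = np.array([0.5, 2.0])                # discrete observer correction gain

x = np.array([1.0, 0.0])                 # true state of the chain of integrators
x_hat = np.zeros(2)                      # observer estimate
next_sample = 0.0

for k in range(int(t_end / dt)):
    t = k * dt
    u = -K @ x_hat                       # output feedback: only the estimate is used
    d = 0.2 * np.sin(3 * t)              # unknown bounded uncertainty in the dynamics
    x = x + dt * np.array([x[1], u + d])         # plant: x1' = x2, x2' = u + d (Euler)
    x_hat = x_hat + dt * np.array([x_hat[1], u]) # observer prediction with nominal model
    if t >= next_sample:                 # a new sample of y = x1 becomes available
        x_hat = x_hat + Ld * (x[0] - x_hat[0])   # discrete-time correction step
        next_sample += T_sample

print("final true state:", x, "final estimate:", x_hat)
```

With these toy gains the estimation error and the state remain bounded near the origin despite the disturbance, which is the qualitative behavior the abstract describes.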

Keywords: dynamical systems, output feedback control law, sampling, uncertain systems

Procedia PDF Downloads 285
1013 Local Governments Supporting Environmentally Sustainable Meals to Protect the Planet and People

Authors: Magdy Danial Riad

Abstract:

Introduction: The ability of our world to support the expanding population after 2050 is at risk due to the food system's global role in poor health, climate change, and resource depletion. Healthy, equitable, and sustainable food systems must be achieved from the point of production through to consumption in order to meet several of the Sustainable Development Goal (SDG) targets. There is evidence that changing the local food environment can effectively change dietary habits in a community. The purpose of this article is to outline the policy initiatives taken by local governments to support environmentally friendly eating habits. Methods: Five databases were searched for peer-reviewed articles that described local government authorities' implementation of environmentally sustainable eating habits, were located in cities that had signed the Milan Urban Food Policy Pact, were published after 2015, were available in English, and described policy interventions. Data extraction was a two-step approach that started with extracting information from the included studies and ended with locating information unique to the policies in the grey literature. Results: 45 papers that described a variety of policy initiatives from low-, middle-, and high-income countries met the inclusion criteria. A variety of desired dietary behaviors were the focus of policy action, including reducing food waste, procuring food locally and in season, boosting breastfeeding, avoiding overconsumption, and consuming more plant-based meals and fewer items derived from animals. Conclusions: In order to achieve SDG targets, local governments are under pressure to implement evidence-based interventions. This study can help direct local governments toward evidence-based policy measures to improve regional food systems and support ecologically friendly eating habits.

Keywords: meals, planet, poor health, eating habits

Procedia PDF Downloads 52
1012 Machine Learning Data Architecture

Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap

Abstract:

Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and model output vending into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort into building the other components of the architecture, which we do not believe is the best use of scientists' bandwidth. We propose a system architecture created using AWS services that brings industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists' work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.

Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning

Procedia PDF Downloads 63
1011 DISGAN: Efficient Generative Adversarial Network-Based Method for Cyber-Intrusion Detection

Authors: Hongyu Chen, Li Jiang

Abstract:

Ubiquitous anomalies endanger the security of our systems constantly. They may bring irreversible damage to the system and cause leakage of privacy. Thus, it is of vital importance to promptly detect these anomalies. Traditional supervised methods such as Decision Trees and Support Vector Machines (SVM) are used to classify normality and abnormality. However, in some cases, abnormal statuses are far rarer than normal statuses, which leads to decision bias in these methods. The generative adversarial network (GAN) has been proposed to handle such cases. With its strong generative ability, it only needs to learn the distribution of normal statuses, and it identifies abnormal statuses through the gap between them and the learned distribution. Nevertheless, existing GAN-based models are not suitable for processing data with discrete values, leading to immense degradation of detection performance. To cope with discrete features, in this paper, we propose an efficient GAN-based model with a specifically designed loss function. Experimental results show that our model outperforms state-of-the-art models on discrete datasets and remarkably reduces the overhead.

Keywords: GAN, discrete feature, Wasserstein distance, multiple intermediate layers

Procedia PDF Downloads 129
1010 Classifications of Images for the Recognition of People’s Behaviors by SIFT and SVM

Authors: Henni Sid Ahmed, Belbachir Mohamed Faouzi, Jean Caelen

Abstract:

Behavior recognition has been studied for realizing driver-assistance systems and automated navigation, and is an important field of study for intelligent buildings. In this paper, a method for recognizing behavior extracted from real images was studied. Images were divided into several categories according to the actual weather, distance, angle of view, etc. SIFT (Scale-Invariant Feature Transform) was first used to detect key points and describe them, because SIFT features are invariant to image scale and rotation and are robust to changes in viewpoint and illumination. Our goal is to develop a robust and reliable system composed of two fixed cameras in every room of the intelligent building, connected to a computer for the acquisition of video sequences. With a program using these video sequences as inputs, we use SIFT to represent the different images of the video sequences and SVM-Light (support vector machine) as the classification tool in order to classify people's behaviors in the intelligent building, with the aim of giving maximum comfort with optimized energy consumption.
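A compact sketch of a SIFT-plus-SVM pipeline of the kind described above, using OpenCV's SIFT detector, a bag-of-visual-words vocabulary built with k-means, and scikit-learn's SVC in place of the SVM-Light tool. The frame paths, labels, and vocabulary size are placeholders; the example assumes OpenCV with SIFT support is installed and that each frame yields enough keypoints to build the vocabulary.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

sift = cv2.SIFT_create()

def sift_descriptors(path):
    """Detect SIFT keypoints and return their 128-D descriptors for one image."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, descriptors = sift.detectAndCompute(gray, None)
    return descriptors if descriptors is not None else np.empty((0, 128), np.float32)

def bow_histogram(descriptors, vocabulary):
    """Quantize descriptors against the visual vocabulary into a normalized histogram."""
    words = vocabulary.predict(descriptors)
    hist = np.bincount(words, minlength=vocabulary.n_clusters).astype(float)
    return hist / (hist.sum() + 1e-9)

# Hypothetical training frames extracted from the two room cameras, with behavior labels.
train_paths = ["frame_001.png", "frame_002.png", "frame_003.png", "frame_004.png"]
train_labels = ["sitting", "walking", "sitting", "walking"]

all_desc = [sift_descriptors(p) for p in train_paths]
vocabulary = KMeans(n_clusters=50, random_state=0).fit(np.vstack(all_desc))

X = np.array([bow_histogram(d, vocabulary) for d in all_desc])
classifier = SVC(kernel="rbf").fit(X, train_labels)
print(classifier.predict([bow_histogram(sift_descriptors("frame_new.png"), vocabulary)]))
```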

Keywords: video analysis, people behavior, intelligent building, classification

Procedia PDF Downloads 378
1009 Feature Evaluation and Applications of Various Advanced Conductors with High Conductivity and Low Flash in Overhead Lines

Authors: Atefeh Pourshafie, Homayoun Bakhtiari

Abstract:

In power transmission lines, conductors are the main means of carrying electric power. Thus, other devices such as shield wires, insulators, towers, foundations, etc. should be designed in such a way that the conductors are able to successfully perform their task, which is appropriate power delivery to the customers. The non-stop increase in energy demand has led to the saturated capacity of transmission lines, which, in turn, causes line flash to exceed acceptable limits at some points. An approach which may be used to solve this issue is the replacement of current conductors with new ones capable of withstanding higher heating, such that reduced flash is observed when heating increases. These novel conductors are able to transfer higher currents and operate under higher heating conditions while line flash remains within standard limits. In this paper, we introduce three types of advanced overhead conductors and analyze, technically and economically, the replacement of current conductors with the new ones in transmission lines. In this regard, advanced conductors for transmission lines are introduced, such as ACC (Aluminum Conductor Composite Core), AAAC-UHC (Ultra High Conductivity, All Aluminum Alloy Conductors), and G(Z)TACSR (Gap Type).

Keywords: ACC, AAAC-UHC, gap type, transmission lines

Procedia PDF Downloads 269
1008 Individualized Emotion Recognition Through Dual-Representations and Ground-Established Ground Truth

Authors: Valentina Zhang

Abstract:

While facial expression is a complex and individualized behavior, all facial emotion recognition (FER) systems known to us rely on a single facial representation and are trained on universal data. We conjecture that: (i) different facial representations can provide different, sometimes complementary views of emotions; (ii) when employed collectively in a discussion group setting, they enable more accurate emotion reading, which is highly desirable in autism care and other application contexts that are sensitive to errors. In this paper, we first study FER using pixel-based DL vs semantics-based DL in the context of deepfake videos. Our experiment indicates that while the semantics-trained model performs better with articulated facial feature changes, the pixel-trained model outperforms it on subtle or rare facial expressions. Armed with these findings, we have constructed an adaptive FER system that learns from both types of models for dyadic or small interacting groups and further leverages the synthesized group emotions as the ground truth for individualized FER training. Using a collection of group conversation videos, we demonstrate that FER accuracy and personalization can benefit from such an approach.

Keywords: neurodivergence care, facial emotion recognition, deep learning, ground truth for supervised learning

Procedia PDF Downloads 147
1007 High Frequency Memristor-Based BFSK and 8QAM Demodulators

Authors: Nahla Elazab, Mohamed Aboudina, Ghada Ibrahim, Hossam Fahmy, Ahmed Khalil

Abstract:

This paper presents memristor-based demodulators developed for eight-point circular Quadrature Amplitude Modulation (8QAM) and Binary Frequency Shift Keying (BFSK), operating at relatively high frequency. In our implementations, the experimentally based 'nonlinear' dopant drift model is adopted, along with the proposed circuits, providing incorporation of all known non-idealities of practically realized memristors and attaining a high operating frequency. The suggested designs leverage the distinctive characteristics of the memristor device, namely, its changeable average memristance versus the frequency, phase, and amplitude of the periodic excitation input. The proposed demodulators feature a small integration area, low power consumption, and easy implementation. Moreover, the proposed QAM demodulator precludes the need for carrier recovery circuits. The designs were validated by transient simulations using the nonlinear dopant drift memristor model. The simulation results show high agreement with the theory presented.
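For context, the sketch below integrates an HP-style memristor state equation with a Joglekar-type window (one common form of the nonlinear dopant drift model) under a sinusoidal drive and reports the average memristance at several frequencies, the frequency-dependent quantity that such demodulators exploit. The parameter values are textbook-style illustrations, not the paper's device; with these values the frequency dependence shows up at low drive frequencies rather than at the high frequencies targeted by the paper's circuits.

```python
import numpy as np

# Illustrative HP-style memristor parameters (textbook orders of magnitude only).
R_on, R_off = 100.0, 16e3     # ohm
D = 10e-9                     # device thickness (m)
mu_v = 1e-14                  # dopant mobility (m^2 s^-1 V^-1)
p = 2                         # Joglekar window exponent (nonlinear dopant drift)
k = mu_v * R_on / D**2        # state-equation constant

def average_memristance(freq, amplitude=1.0, cycles=3, steps_per_cycle=10000, x0=0.5):
    """Drive the memristor with v(t) = A*sin(2*pi*f*t) and return the mean memristance."""
    dt = 1.0 / (freq * steps_per_cycle)
    x, m_sum, n_steps = x0, 0.0, cycles * steps_per_cycle
    for n in range(n_steps):
        m = R_on * x + R_off * (1.0 - x)               # instantaneous memristance
        i = amplitude * np.sin(2 * np.pi * freq * n * dt) / m
        window = 1.0 - (2.0 * x - 1.0) ** (2 * p)      # Joglekar window: drift slows at edges
        x = float(np.clip(x + dt * k * i * window, 0.0, 1.0))   # Euler update of the state
        m_sum += m
    return m_sum / n_steps

for f in (1.0, 10.0, 100.0):
    print(f"f = {f:6.1f} Hz -> average memristance ~ {average_memristance(f):8.1f} ohm")
```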

Keywords: BFSK, demodulator, high frequency memristor applications, memristor based analog circuits, nonlinear dopant drift model, QAM

Procedia PDF Downloads 167
1006 Impacts of Filmmaking on Destinations: Perceptions of the Residents of Arcos de Valdevez

Authors: André Rafael Ferreira, Laurentina Vareiro, Raquel Mendes

Abstract:

This study’s main objective is to explore residents’ perceptions of film-induced tourism and the impacts of filmmaking on the development of a destination. Specifically, the research examines residents’ perceptions of the social, economic, and environmental impacts on a Portuguese municipality (Arcos de Valdevez), given its feature in a popular Portuguese television series. Data is collected by means of an Internet survey, in which residents’ perceptions of the impacts of filmmaking are solicited. Residents generally agree that the recording and exhibition of the television series is important to the municipality and contributes to the increased number of tourists. Given that residents consider the positive impacts to be more significant than the negative impacts, they supported the recording of another television series in the same municipality. Considering that destination managers and tourism development authorities aim to plan for optimal tourism development and, at the same time, wish to minimize the negative impacts of this development on the local communities, monitoring residents’ opinions of perceived impacts is a good way of incorporating their reactions into tourism planning and development. The results of this research may provide useful information in this sense.

Keywords: film-induced tourism, residents’ perceptions, tourism development, tourism impacts

Procedia PDF Downloads 453
1005 Automated Heart Sound Classification from Unsegmented Phonocardiogram Signals Using Time Frequency Features

Authors: Nadia Masood Khan, Muhammad Salman Khan, Gul Muhammad Khan

Abstract:

Cardiologists perform cardiac auscultation to detect abnormalities in heart sounds. Since accurate auscultation is a crucial first step in screening patients with heart diseases, there is a need to develop computer-aided detection/diagnosis (CAD) systems to assist cardiologists in interpreting heart sounds and provide second opinions. In this paper, different algorithms are implemented for automated heart sound classification using unsegmented phonocardiogram (PCG) signals. The support vector machine (SVM), artificial neural network (ANN), and cartesian genetic programming evolved artificial neural network (CGPANN) have been explored in this study without the application of any segmentation algorithm. The signals are first pre-processed to remove any unwanted frequencies. Both time- and frequency-domain features are then extracted for training the different models. The different algorithms are tested in multiple scenarios, and their strengths and weaknesses are discussed. Results indicate that SVM outperforms the rest with an accuracy of 73.64%.

Keywords: pattern recognition, machine learning, computer aided diagnosis, heart sound classification, feature extraction

Procedia PDF Downloads 262
1004 Application of Random Forest Model in The Prediction of River Water Quality

Authors: Turuganti Venkateswarlu, Jagadeesh Anmala

Abstract:

Excessive runoff from various non-point-source land uses and other point sources is rapidly contaminating the water quality of streams in the Upper Green River watershed, Kentucky, USA. It is essential to maintain the stream water quality, as the river basin is one of the major freshwater sources in this province. It is also important to understand the water quality parameters (WQPs) quantitatively and qualitatively, along with their important features, as stream water is sensitive to climatic events and land-use practices. In this paper, a model was developed for predicting one of the significant WQPs, Fecal Coliform (FC), from precipitation, temperature, the urban land use factor (ULUF), the agricultural land use factor (ALUF), and the forest land use factor (FLUF) using the Random Forest (RF) algorithm. The RF model, an ensemble learning algorithm, can also extract advanced feature importance characteristics from the given model inputs for different combinations. The model's outcomes showed a good correlation between FC and climate events and land use factors (R² = 0.94), and precipitation and temperature were found to be the primary influencing factors for FC.
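A minimal sketch of the modelling step with scikit-learn's RandomForestRegressor, including the impurity-based feature importances that play the role of the feature-importance characteristics mentioned above. The synthetic data merely mimics the five predictors and is not the Upper Green River dataset, so the R² and rankings are illustrative only.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 500
X = pd.DataFrame({
    "precipitation": rng.gamma(2.0, 10.0, n),
    "temperature": rng.normal(15.0, 8.0, n),
    "ULUF": rng.uniform(0, 1, n),
    "ALUF": rng.uniform(0, 1, n),
    "FLUF": rng.uniform(0, 1, n),
})
# Synthetic fecal-coliform response driven mainly by precipitation and temperature.
y = 50 * X["precipitation"] + 30 * X["temperature"] + 200 * X["ULUF"] + rng.normal(0, 50, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_train, y_train)

print(f"R^2 on held-out data: {rf.score(X_test, y_test):.2f}")
for name, importance in sorted(zip(X.columns, rf.feature_importances_),
                               key=lambda t: -t[1]):
    print(f"{name:>13s}: {importance:.3f}")
```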

Keywords: water quality, land use factors, random forest, fecal coliform

Procedia PDF Downloads 197
1003 Food for Thought: Preparing the Brain to Eat New Foods through “Messy” Play

Authors: L. Bernabeo, T. Loftus

Abstract:

Many children experience phases of picky eating, food aversions, and/or avoidance. For families with children who have special needs, these experiences are often exacerbated, which can lead to feelings that negatively impact a caregiver's relationship with their child. Within the scope of speech-language pathology practice, knowledge of both emotional and feeding development is key. This paper will explore the significance of "messy play" within typical feeding development, and the challenges that may arise if a child does not have the opportunity to engage in this type of exploratory play. This paper will consider several contributing factors that can result in a "picky eater." Further, research has shown that individuals with special needs, including autism, possess a neurological makeup that differs from that of a typical individual. Because autism is a disorder of relating and communicating due to differences in the limbic system, an individual with special needs may respond to a typical feeding experience as if it were a traumatic event. As a result, broadening one's dietary repertoire may seem an insurmountable challenge. This paper suggests that introducing new foods through exploratory play can help broaden and strengthen diets, as well as improve the feeding experience, of individuals with autism. The DIRFloortimeⓇ methodology stresses the importance of following a child's lead. Within this developmental model, there is a special focus on a person's individual differences, including the unique way they process the world around them, as well as the significance of therapy occurring within the context of a strong and motivating relationship. Using this child-centered approach, we can support our children in expanding their diets while simultaneously building upon their cognitive and creative development through playful and respectful interactions that include exposure to foods that differ in color, texture, and smell. Further, this paper explores the impact of exploration, self-feeding, and messy play on brain development, both in the context of typically developing individuals and those with disordered development.

Keywords: development, feeding, floortime, sensory

Procedia PDF Downloads 116
1002 Hybrid Inventory Model Optimization under Uncertainties: A Case Study in a Manufacturing Plant

Authors: E. Benga, T. Tengen, A. Alugongo

Abstract:

Periodic and continuous inventory models are the two classical management tools used to handle inventories. These models have advantages and disadvantages. The implementation of both the continuous (r, Q) inventory and periodic (R, S) inventory models in most manufacturing plants comes with higher costs. Such high inventory costs are due to the fact that most manufacturing plants are not flexible enough. Since demand and lead time are two important variables of every inventory model, their effect on the flexibility of the manufacturing plant matters most. Unfortunately, these effects are not clearly understood by managers. The reason is that the decision parameters of the continuous (r, Q) inventory and periodic (R, S) inventory models are not designed to effectively deal with the issues of uncertainties such as poor manufacturing performance, delivery performance, and supply performance. There is, therefore, a need to come up with a predictive, hybrid inventory model that can combine, in some sense, the features of the aforementioned inventory models. A linear combination technique is used to hybridize the continuous (r, Q) inventory and periodic (R, S) inventory models. The behavior of such a hybrid inventory model is described by a differential equation and then optimized. The results obtained after simulation show that the continuous (r, Q) inventory model is more effective than the periodic (R, S) inventory model in the short run, but this difference changes as time goes by. Because the hybrid inventory model is more cost-effective than the continuous (r, Q) inventory and periodic (R, S) inventory models in the long run, it should be implemented for strategic decisions.
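A highly simplified sketch of the linear-combination idea: at each period, the order placed is a weighted blend of what a continuous-review (r, Q) policy and a periodic-review (R, S) policy would order, simulated under random demand with zero lead time. The cost structure, parameter values, and blending weight are illustrative assumptions, not the differential-equation model optimized in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate(lam, periods=365, r=40, Q=120, R=7, S=200,
             holding=0.5, shortage=5.0, demand_mean=15):
    """Blend (r, Q) and (R, S) decisions with weight lam in [0, 1]; return avg daily cost."""
    inventory, cost = S, 0.0
    for t in range(periods):
        demand = rng.poisson(demand_mean)
        inventory -= demand
        # Continuous-review component: order Q whenever the level drops below r.
        order_rq = Q if inventory < r else 0
        # Periodic-review component: every R periods, order up to S.
        order_rs = max(S - inventory, 0) if t % R == 0 else 0
        inventory += lam * order_rq + (1 - lam) * order_rs   # hybrid (linear combination)
        cost += holding * max(inventory, 0) + shortage * max(-inventory, 0)
    return cost / periods

for lam in (0.0, 0.25, 0.5, 0.75, 1.0):   # lam=1 -> pure (r, Q); lam=0 -> pure (R, S)
    print(f"lambda = {lam:.2f} -> average daily cost ~ {simulate(lam):7.1f}")
```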

Keywords: periodic inventory, continuous inventory, hybrid inventory, optimization, manufacturing plant

Procedia PDF Downloads 382