Search results for: heterogeneous massive data
23665 SEM Image Classification Using CNN Architectures
Authors: Güzin Tirkeş, Özge Tekin, Kerem Kurtuluş, Y. Yekta Yurtseven, Murat Baran
Abstract:
A scanning electron microscope (SEM) is a type of electron microscope used mainly in nanoscience and nanotechnology. Automatic image recognition and classification are among the general areas of application concerning SEM. In line with these usages, the present paper proposes a deep learning algorithm that classifies SEM images into nine categories by means of an online application to simplify the process. The NFFA-EUROPE - 100% SEM data set, containing approximately 21,000 images, was used to train and test the algorithm with an 80%/20% split. Validation was carried out using a separate data set obtained from the Middle East Technical University (METU) in Turkey. To increase accuracy, the Inception ResNet-V2 model was used with a fine-tuning approach. A confusion matrix showed that the coated-surface category degrades accuracy because it overlaps other categories in the data set, confusing the model when it detects category-specific patterns. For this reason, the coated-surface category was removed from the training data set, increasing accuracy to up to 96.5%.
Keywords: convolutional neural networks, deep learning, image classification, scanning electron microscope
Procedia PDF Downloads 125
23664 Geo-Visualization of Crimes against Children: An India Level Study 2001-2012
Authors: Ritvik Chauhan, Vijay Kumar Baraik
Abstract:
Crime is a rare event on the earth's surface, yet not a simple one: it is a complex event occurring in a spatio-temporal environment. Crime is one of the most serious security threats to human environments, as it may harm individuals through loss of property and physical and psychological injury. Conventional studies of crimes of different natures have mostly been concerned with legal, psychological, social, and political themes. Geographical areas, however, are heterogeneous in their environmental conditions and in the associations between structural conditions and social organization that contribute to specific crimes. Crime pattern analysis rests on theories in which criminal events occur in persistent, identifiable patterns in a particular space and time; it combines the analysis of spatial factors and rational factors behind the crime. In this study, we analyze these combined factors for the origin of crime against children. Children have always been especially vulnerable to victimization because they are silent victims, both physically and mentally, who often do not even realize what is happening to them; their trusting nature and innocence are misused by criminals. The nature of crime against children has also changed in recent years: offences such as child rape, kidnapping and abduction, the selling and buying of girls, foeticide, infanticide, prostitution, and child marriage have become more cruel and inhuman. This study focuses on understanding the space-time pattern of crime against children during the period 2001-2012. It also attempts to explore and ascertain the association of the categorised crimes against children, and their rates, with various geographical and socio-demographic factors through causal analysis using selected indicators (child sex ratio, education, literacy rate, employment, income, etc.) obtained from the Census of India and other government sources.
The outcome of the study will help identify high-crime regions and the specific nature of the crimes committed there. It will also review existing efforts and explore plausible new measures for tracking, monitoring, and minimizing the crime rate, with the end goal of protecting children from crimes committed against them.
Keywords: crime against children, geographic profiling, spatio-temporal analysis, hotspot
Procedia PDF Downloads 211
23663 Nearest Neighbor Investigate Using R+ Tree
Authors: Rutuja Desai
Abstract:
A search engine is fundamentally a framework used to retrieve data relevant to a client via the WWW. Searching for nearby places identified by keywords is an important concept in developing web technologies. For such searching, range search or nearest neighbor search is used. In range search, a prediction is made as to whether objects meet the query object; nearest neighbor search predicts the points closest to the query set by the client. Here, the nearest neighbor methodology is applied, with an information retrieval R+ tree used instead of an IR2 tree. The drawbacks of the IR2 tree are that the false-hit number can exceed the limit and that the signature in the information retrieval R-tree must carry a bit for each unique word in the W set; these issues are addressed by the information retrieval R+ tree. The query depends fundamentally on the keywords and the geometric coordinates.
Keywords: information retrieval, nearest neighbor search, keyword search, R+ tree
Procedia PDF Downloads 291
23662 Classical and Bayesian Inference of the Generalized Log-Logistic Distribution with Applications to Survival Data
Authors: Abdisalam Hassan Muse, Samuel Mwalili, Oscar Ngesa
Abstract:
A generalized log-logistic distribution with variable hazard rate shapes is introduced and studied, extending the log-logistic distribution by adding an extra parameter to the classical distribution and thereby offering greater flexibility in analysing and modeling various data types. The proposed distribution includes a large number of well-known lifetime distributions as special sub-models, such as the Weibull, log-logistic, exponential, and Burr XII distributions. Its basic mathematical and statistical properties are derived. The method of maximum likelihood is adopted for estimating the unknown parameters of the proposed distribution, and a Monte Carlo simulation study is carried out to assess the behavior of the estimators. The importance of this distribution lies in its ability to model both monotone (increasing and decreasing) and non-monotone (unimodal and bathtub-shaped, or reversed bathtub-shaped) hazard rate functions, which are quite common in survival and reliability data analysis. Furthermore, the flexibility and usefulness of the proposed distribution are illustrated on a real-life data set and compared with its sub-models (the Weibull, log-logistic, and Burr XII distributions) and with other three-parameter parametric survival distributions, such as the exponentiated Weibull, the three-parameter lognormal, the three-parameter gamma, the three-parameter Weibull, and the three-parameter (also known as shifted) log-logistic distributions. The proposed distribution provided a better fit than all of the competing distributions based on the goodness-of-fit tests, the log-likelihood, and information criterion values. Finally, Bayesian analysis and an assessment of the performance of Gibbs sampling for the data set are also carried out.
Keywords: hazard rate function, log-logistic distribution, maximum likelihood estimation, generalized log-logistic distribution, survival data, Monte Carlo simulation
Procedia PDF Downloads 202
23661 Fuzzy Inference-Assisted Saliency-Aware Convolution Neural Networks for Multi-View Summarization
Authors: Tanveer Hussain, Khan Muhammad, Amin Ullah, Mi Young Lee, Sung Wook Baik
Abstract:
The Big Data generated by distributed vision sensors installed on a large scale in smart cities creates hurdles for its efficient and beneficial exploration through browsing, retrieval, and indexing. This paper presents a three-fold framework for effective video summarization of such data, providing a compact and representative format for Big Video Data. In the first fold, input video data are acquired from the installed cameras, and clues such as the type and count of objects and the clarity of the view are collected from a chunk of a predefined number of frames from each view. In the second fold, the decision to select a representative view for a particular interval is made by a fuzzy inference system, yielding a precise, human-like decision reinforced by the known clues. In the third fold, the selected view frames are forwarded to the summary generation mechanism, which is supported by a saliency-aware convolutional neural network (CNN) model. The novel combination of fuzzy rules for view selection followed by a CNN architecture for saliency computation makes the multi-view video summarization (MVS) framework a suitable candidate for real-world practice in smart cities.
Keywords: big video data analysis, fuzzy logic, multi-view video summarization, saliency detection
Procedia PDF Downloads 188
23660 Relation between Pavement Roughness and Distress Parameters for Highways
Authors: Suryapeta Harini
Abstract:
Road surface roughness is one of the essential aspects of a road's functional condition, indicating riding comfort in both the transverse and longitudinal directions. The government of India has made maintaining good surface evenness a prerequisite for all highway projects. Pavement distress data were collected with a network survey vehicle (NSV) on a national highway; the survey determines the smoothness and frictional qualities of the pavement surface, which relate to driving safety and ease. Based on the data obtained in the field, a regression equation was developed relating the IRI value to the visually surveyed distresses. The proposed system can use wireless acceleration sensors and GPS to gather vehicle status and location data and to calculate the international roughness index (IRI). According to the present study, pavement roughness is affected by potholes, raveling, rut depth, cracked area, and repair work. The study was carried out at one location, and data collected using a bump integrator were used for validation: the bump integrator (BI) value obtained from the deflection measurements of the network survey vehicle was correlated with the distress parameters to establish an equation.
Keywords: roughness index, network survey vehicle, regression, correlation
Procedia PDF Downloads 176
23659 Structural Health Monitoring using Fibre Bragg Grating Sensors in Slab and Beams
Authors: Pierre van Tonder, Dinesh Muthoo, Kim Twiname
Abstract:
Many existing and newly built structures are constructed on the basis of the engineer's design and the workmanship of the construction company. However, for larger structures to which more people are exposed, structural integrity is of great importance for the safety of the occupants (Raghu, 2013). But how can the structural integrity of a building be monitored efficiently and effectively? This is where the fourth industrial revolution steps in: with minimal human interaction, data can be collected, analysed, and stored, and any inconsistencies in the collected data can be flagged. For this purpose, the fibre Bragg grating (FBG) monitoring system is introduced. This paper illustrates how data can be collected and converted to develop stress-strain behaviour and to produce bending moment diagrams for assessing and predicting the structure's integrity. Embedded fibre optic sensors, fibre Bragg grating sensors in particular, were used in this study. The procedure made use of the wavelength-shift demodulation technique, with the gratings inscribed using the phase mask technique. The fibre optic sensors considered were photosensitive and embedded in the slab and beams for data collection and analysis. Two sets of fibre cables were inserted: one purposely to collect temperature readings, the other to collect strain and temperature. The data were collected over a period of time, analysed, and used to produce bending moment diagrams from which predictions of the structure's integrity were made. The data indicated that the fibre Bragg grating sensing system is useful and can be used for structural health monitoring in any environment.
From the experimental data for the slab and beams, the moments were found to be 64.33 kN.m, 64.35 kN.m, and 45.20 kN.m (from the experimental bending moment diagram), whereas the idealistic (Ultimate Limit State) values were 133 kN.m and 226.2 kN.m. The difference in values gives room for an early warning system; in other words, a reserve capacity of approximately 50% to failure.
Keywords: fibre Bragg grating, structural health monitoring, fibre optic sensors, beams
Procedia PDF Downloads 139
23658 A Geographic Information System Mapping Method for Creating Improved Satellite Solar Radiation Dataset Over Qatar
Authors: Sachin Jain, Daniel Perez-Astudillo, Dunia A. Bachour, Antonio P. Sanfilippo
Abstract:
The future of solar energy in Qatar is evolving steadily; hence, high-quality spatial solar radiation data is of the utmost importance for any planning and commissioning of solar technology. Generally, two types of solar radiation data are available: satellite data and ground observations. Satellite solar radiation data are developed from physical and statistical models, while ground data are collected by solar radiation measurement stations. Ground data are of high quality but limited to distributed point locations, with high costs for the installation and maintenance of the ground stations. Satellite solar radiation data, on the other hand, are continuous and available across geographical locations, but are relatively less accurate than ground data. To exploit the advantages of both, a product has been developed here that provides spatial continuity and higher accuracy than either data source alone. The popular satellite database NSRDB (National Solar Radiation Data Base; PSM V3 model, spatial resolution 4 km) was chosen for merging with ground-measured solar radiation in Qatar, where the spatial distribution of ground measurement stations is comprehensive, with a network of 13 stations. The monthly average of the daily total Global Horizontal Irradiation (GHI) component from ground and satellite data was used for error analysis. Normalized root mean square error (NRMSE) values of 3.31%, 6.53%, and 6.63% were observed for October, November, and December 2019, respectively, when comparing in-situ and NSRDB data. The method is based on the Empirical Bayesian Kriging Regression Prediction model available in ArcGIS (ESRI), whose workflow combines regression and kriging methods. A regression model (OLS, ordinary least squares) was fitted between the ground and NSRDB data points.
A semi-variogram model was fitted to the experimental semi-variogram obtained from the residuals. The kriging residuals obtained after fitting the semi-variogram model were added to the NSRDB values predicted by the regression model to obtain the final predicted values. The NRMSE values obtained after merging are 1.84%, 1.28%, and 1.81% for October, November, and December 2019, respectively. One more explanatory variable, ground elevation, was incorporated into the regression and kriging methods to reduce the error and to provide higher spatial resolution (30 m). The final GHI maps were created after merging, and NRMSE values of 1.24%, 1.28%, and 1.28% were observed for October, November, and December 2019, respectively. The proposed merging method has thus proven highly accurate. An additional method is also proposed to generate calibrated maps using the regression and kriging model, and then to use the calibrated model to generate solar radiation maps from the explanatory variable alone when not enough historical ground data are available for long-term analysis. The NRMSE values obtained from comparing the calibrated maps with ground data are 5.60% and 5.31% for November and December 2019, respectively.
Keywords: global horizontal irradiation, GIS, empirical Bayesian kriging regression prediction, NSRDB
Procedia PDF Downloads 89
23657 Retail Strategy to Reduce Waste Keeping High Profit Utilizing Taylor's Law in Point-of-Sales Data
Authors: Gen Sakoda, Hideki Takayasu, Misako Takayasu
Abstract:
Waste reduction is a fundamental problem for sustainability. Methods for waste reduction with point-of-sales (POS) data are proposed, utilizing the knowledge of a recent econophysics study on a statistical property of POS data. Concretely, a non-stationary time series analysis method based on the particle filter is developed that accounts for the abnormal fluctuation scaling known as Taylor's law. The method is extended to handle sales data left incomplete by stock-outs, by introducing maximum likelihood estimation for censored data. A way of determining optimal stock levels while pricing the cost of waste reduction is also proposed. This study focuses on examining the methods for large sales numbers, where Taylor's law is evident. Numerical analysis using aggregated POS data shows that the methods effectively reduce food waste while maintaining a high profit for large sales numbers. Moreover, pricing the cost of waste reduction reveals that a small profit loss yields substantial waste reduction, especially when the proportionality constant of Taylor's law is small. Specifically, around 1% profit loss halves disposal at a proportionality constant of 0.12, which is the actual value for the processed food items used in this research. The methods provide practical and effective solutions for reducing waste while keeping a high profit, especially with large sales numbers.
Keywords: food waste reduction, particle filter, point-of-sales, sustainable development goals, Taylor's law, time series analysis
Procedia PDF Downloads 131
23656 Aesthetic Analysis and Socio-Cultural Significance of Eku Idowo and Anipo Masquerades of the Anetuno (Ebira Chao)
Authors: Lamidi Lawal Aduozava
Abstract:
The masquerade tradition is an indigenous culture of the Anetuno, an extraction of the Ebira referred to as Ebira Chao. This paper makes an aesthetic analysis of the masquerades in terms of their costumes and socio-cultural significance. To this end, the study examined and documented the functions and roles of the Anipo and Idowo masquerades, covering their therapeutic, economic, prophetic and divinatory, entertainment, and funeral functions for the owner community (the Eziobe group of families) in Igarra, Edo State, Nigeria, West Africa. For data collection, focus group discussion and participatory, visual, and observatory methods were used. All the data collected were analyzed aesthetically, descriptively, and historically.
Keywords: aesthetics, costume, masquerades, significance
Procedia PDF Downloads 163
23655 Rejoinders to the Expression of Reprimand among Jordanian Youth: A Pragmatic Study
Authors: Nisreen Al-Khawaldeh
Abstract:
The study investigates the expressions voiced by Jordanian youth as rejoinders to expressions of reprimand. It also explores the impact that sociocultural variables exert on such rejoinders. To the best of our knowledge, this study is the first of its kind: despite the significance and sensitivity of this type of communicative act, research on it is scarce, and it has not been investigated in the Jordanian context. Data were collected from observation of naturally occurring speech and analyzed qualitatively and quantitatively in light of the rapport management approach (RMA). The analysis revealed different types of rejoinders, among which expressing apology, admitting responsibility, and trying to manage and fix the situation were the most used strategies. Variation in the types of strategies was attributed to the influence of the sociocultural variables. Promising ideas are recommended for future research.
Keywords: gender, rejoinder to reprimand, Jordanian youth, rapport management approach
Procedia PDF Downloads 196
23654 Linguistic Insights Improve Semantic Technology in Medical Research and Patient Self-Management Contexts
Authors: William Michael Short
Abstract:
‘Semantic Web’ technologies such as the Unified Medical Language System Metathesaurus, SNOMED-CT, and MeSH have been touted as transformational for the way users access online medical and health information, enabling both the automated analysis of natural-language data and the integration of heterogeneous health-related resources distributed across the Internet through the use of standardized terminologies that capture concepts and relationships between concepts that are expressed differently across datasets. However, the approaches that have so far characterized ‘semantic bioinformatics’ have not yet fulfilled the promise of the Semantic Web for medical and health information retrieval applications. This paper argues, from the perspective of cognitive linguistics and cognitive anthropology, that four features of human meaning-making must be taken into account before the potential of semantic technologies can be realized for this domain. First, many semantic technologies operate exclusively at the level of the word, but texts convey meanings in ways beyond lexical semantics. For example, transitivity patterns (distributions of active or passive voice) and modality patterns (configurations of modal constituents like may, might, could, would, should) convey experiential and epistemic meanings that are not captured by single words. Language users also naturally associate stretches of text with discrete meanings, so that whole sentences can be ascribed senses similar to the senses of words (so-called ‘discourse topics’). Second, natural language processing systems tend to operate according to the principle of ‘one token, one tag’. For instance, occurrences of the word sound must be disambiguated for part of speech: in context, is sound a noun, a verb, or an adjective? In syntactic analysis, deterministic annotation methods may be acceptable.
But because natural language utterances are typically characterized by polyvalency and ambiguities of all kinds (including intentional ambiguities), such methods leave the meanings of texts highly impoverished. Third, ontologies tend to be disconnected from everyday language use and so struggle in cases where single concepts are captured through complex lexicalizations that involve profile shifts or other embodied representations. More problematically, concept graphs tend to capture ‘expert’ technical models rather than ‘folk’ models of knowledge, and so may not match users’ common-sense intuitions, which organize concepts in prototypical structures rather than Aristotelian categories. Fourth, and finally, most ontologies do not recognize the pervasively figurative character of human language, even though the widespread use of metaphor by both medical professionals and lay persons has been recognized since the time of Galen. In particular, metaphor is a well-documented linguistic tool for communicating experiences of pain. Because semantic medical knowledge-bases are designed to capture variations within technical vocabularies rather than the kinds of conventionalized figurative semantics that practitioners as well as patients actually use in clinical description and diagnosis, they fail to capture this dimension of linguistic usage. The failure of semantic technologies in these respects degrades the efficiency and efficacy not only of medical research, where information retrieval inefficiencies can lead to direct financial costs to organizations, but also of care provision, especially in contexts of patients’ self-management of complex medical conditions.
Keywords: ambiguity, bioinformatics, language, meaning, metaphor, ontology, semantic web, semantics
Procedia PDF Downloads 132
23653 Wave Velocity-Rock Property Relationships in Shallow Marine Libyan Carbonate Reservoir
Authors: Tarek S. Duzan, Abdulaziz F. Ettir
Abstract:
Wave velocity, core, and log petrophysical data were collected from four recently drilled wells scattered throughout the Dahra/Jofra (PL-5) reservoir. The collected data were analyzed for relationships between wave velocities and rock properties such as porosity, permeability, and bulk density. A review of the literature reveals a number of differing results and conclusions regarding the relationships of wave velocities (compressional, Vp, and shear, Vs) with petrophysical rock properties, especially in carbonate reservoirs. In this paper, we focus on the relationships between the wave velocities (Vp, Vs), and their ratio Vp/Vs, and the rock properties of a shallow marine Libyan carbonate reservoir (a real case). Upon data analysis, relationships between the petrophysical properties and the wave velocities (Vp, Vs) and the ratio Vp/Vs were found: porosity and bulk density show exponential relationships with the wave velocities, while permeability shows a power relationship in the zone of interest. It is also clear that the wave velocities (Vp, Vs) appear to be a good indicator of lithology change with true vertical depth. It is therefore highly recommended to use the resulting relationships to predict the porosity, bulk density, and permeability of similar reservoir types using the most recent seismic data.
Keywords: conventional core analysis (porosity, permeability, bulk density) data, S-wave and P-wave velocities, shallow carbonate reservoir in D/J field
Procedia PDF Downloads 332
23652 Impact of Audit Committee on Earning Quality of Listed Consumer Goods Companies in Nigeria
Authors: Usman Yakubu, Muktar Haruna
Abstract:
The paper examines the impact of the audit committee on the earnings quality of listed consumer goods companies in Nigeria. The study used data collected from the annual reports and accounts of the 13 sampled companies for the period 2007 to 2018. Data were analyzed by means of descriptive statistics to provide summary statistics for the variables; correlation between the dependent and independent variables was assessed using the Pearson correlation technique; and regression was carried out using the generalized least squares technique, since the data have both time-series and cross-sectional attributes (panel data). It was found that the audit committee has a positive and significant influence on earnings quality in the listed consumer goods companies in Nigeria. The study therefore recommends that competency and personal integrity be the attributes considered when constituting the committee, as this could enhance the quality of accounting information; in addition, the majority of committee members should be independent directors, to allow a high level of independence to be exercised.
Keywords: earning quality, corporate governance, audit committee, financial reporting
Procedia PDF Downloads 173
23651 Ranking All of the Efficient DMUs in DEA
Authors: Elahe Sarfi, Esmat Noroozi, Farhad Hosseinzadeh Lotfi
Abstract:
One of the important issues in data envelopment analysis (DEA) is the ranking of decision making units (DMUs). In this paper, a method for ranking DMUs is presented in which the weights related to efficient units are chosen so that the other units preserve a certain percentage of their efficiency under those weights. To this end, a model is presented for ranking DMUs on the basis of their super-efficiency subject to the mentioned weight restrictions. The percentage can be determined by the decision maker; if a specific percentage is unsuitable, a suitable and feasible one can be found and the DMUs ranked accordingly. Furthermore, the presented model is capable of ranking all of the efficient units, including non-extreme efficient ones. Finally, the presented models are applied to two data sets, and the results are reported.
Keywords: data envelopment analysis, efficiency, ranking, weight
Procedia PDF Downloads 457
23650 Detecting the Palaeochannels Based on Optical Data and High-Resolution Radar Data for Periyar River Basin
Authors: S. Jayalakshmi, Gayathri S., Subiksa V., Nithyasri P., Agasthiya
Abstract:
Paleochannels are the buried parts of an active river system that were separated from the active channel by cutoff or abandonment during the dynamic evolution of the river. Over time, they are filled by young unconsolidated or semi-consolidated sediments and are further shaped by geomorphological influences, lineament alterations, and other factors. The primary goal of this study is to identify the paleochannels in the Periyar river basin for the year 2023. Such channels have a high probability of hosting natural resources, including gold, platinum, tin, and uranium. Numerous techniques are used to map paleochannels. For the optical data, satellite images were collected from various sources, comprising multispectral imagery from which indices such as the Normalized Difference Vegetation Index (NDVI), the Normalized Difference Water Index (NDWI), and the Soil Adjusted Vegetation Index (SAVI), together with thematic layers such as lithology, stream network, and lineament, were prepared. Weights were assigned to each layer based on its importance, and an overlay analysis was performed, which showed paleochannel patterns in the northwest region of the area. The results were cross-verified against results obtained from microwave data: a Synthetic Aperture Radar (SAR) image of Sentinel data was extracted from the European Space Agency (ESA) portal and pre-processed using SNAP 6.0. In addition, a polarimetric decomposition technique was incorporated to detect paleochannels based on their scattering properties, and principal component analysis was performed for enhanced output imagery. The results obtained from the optical and microwave radar data were compared, and the locations of the paleochannels were detected. Six paleochannels were found in the study area, three of which were validated against existing data published by the Department of Geology and Environmental Science, Kerala.
The other three paleochannels were newly detected with the help of the SAR image.
Keywords: paleochannels, optical data, SAR image, SNAP
Procedia PDF Downloads 92
23649 Flipping the Script: Opportunities, Challenges, and Threats of a Digital Revolution in Higher Education
Authors: James P. Takona
Abstract:
In a world experiencing sharp digital transformations guided by digital technologies, the potential of technology to drive transformation and evolution in higher education is apparent. Higher education is facing a paradigm shift that exposes susceptibilities and threats to fully online programs in the face of post-COVID-19 trends of commodification. This historical moment is likely to be remembered as a critical turning point from analog to digital degree-focused learning modalities, where the default became the pivot point of competition between higher education institutions. Fall 2020 marks a significant inflection point in higher education as students, educators, and government leaders scrutinize higher education's price and value propositions through the new lens of traditional lecture halls versus multiple digitized delivery modes. Online education has since paved the way for a pedagogical shift in how teachers teach and students learn. The incremental growth of online education in the West can now be attributed to increasing patronage among students, faculty, and institutional administrators. More often than not, college instructors assume facilitator roles in this learning mode, while students become active collaborators rather than passive learners. This paper offers valuable insights into the threats, challenges, and opportunities of a massive digital revolution in delivering degree programs. Digital instruction and learning demand instructional practices that revolve around collaborative work, engaging students in learning activities, and active efforts to forge strong connections between course activities and the expected learning pace for all students. Appropriate digital technologies demand solid prior skills from instructors and students alike.
Where digital technology is used to support instruction and learning, intelligent tutoring offers great promise, while poorly implemented digital learning may fail to improve outcomes for specific student populations. Digital learning benefits students differently depending on their circumstances and background and on those of the institution and/or program. Students gain alternative options, the convenience of learning anytime and anywhere, and the possibility of acquiring and developing new skills that lead to lifelong learning. Keywords: digitized learning, digital education, collaborative work, higher education, online education, digitized delivery
Procedia PDF Downloads 91
23648 Detection of Autistic Children's Voice Based on Artificial Neural Network
Authors: Royan Dawud Aldian, Endah Purwanti, Soegianto Soelistiono
Abstract:
In this research, we developed an automatic classifier that distinguishes the voices of normal children from those of autistic children using modern computation technology, namely computation based on an artificial neural network. The superiority of this computation technology lies in its capability for processing and saving data. Digital voice features are obtained from the coefficients of linear predictive coding (LPC) computed with the autocorrelation method and are transformed into the frequency domain using the fast Fourier transform; these features serve as the input to an artificial neural network trained with the back-propagation method, which separates normal from autistic children's voices automatically. The back-propagation classifier achieved 100% successful classification on the normal children's voice experiment data and 100% on the autistic children's voice experiment data; the success rate over the entire test set is therefore 100%. Keywords: autism, artificial neural network, backpropagation, linear predictive coding, fast Fourier transform
Procedia PDF Downloads 461
23647 Application of Improved Semantic Communication Technology in Remote Sensing Data Transmission
Authors: Tingwei Shu, Dong Zhou, Chengjun Guo
Abstract:
Semantic communication is an emerging form of communication that realizes intelligent communication by extracting the semantic information of data at the source, transmitting it, and recovering the data at the receiving end. It can effectively solve the problem of data transmission under conditions of large data volume, low SNR, and restricted bandwidth. With the development of deep learning, semantic communication has matured further and is gradually being applied in fields such as the Internet of Things, unmanned aerial vehicle cluster communication, and remote sensing scenarios. We propose an improved semantic communication system for situations where the data volume is huge and spectrum resources are limited during the transmission of remote sensing images. At the transmitting end, the semantic information of the remote sensing images must be extracted, but there are some problems: a traditional semantic communication system based on a convolutional neural network (CNN) cannot take into account both the global and the local semantic information of an image, which results in less-than-ideal image recovery at the receiving end. We therefore adopt an improved Vision-Transformer-based structure as the semantic encoder, instead of the mainstream CNN, to extract the image semantic features. In this paper, we first perform pre-processing operations on the remote sensing images to improve their resolution and thereby obtain images with more semantic information: we use the wavelet transform to decompose each image into high-frequency and low-frequency components, perform bilinear interpolation on the high-frequency components and bicubic interpolation on the low-frequency components, and finally apply the inverse wavelet transform to obtain the preprocessed image. We adopt the improved Vision-Transformer structure as the semantic encoder to extract and transmit the semantic information of the remote sensing images.
The Vision-Transformer structure trains better on huge data volumes and extracts better image semantic features, and its multi-layer self-attention mechanism better captures the correlation between semantic features while reducing redundant ones. Secondly, to improve coding efficiency, we reduce the quadratic complexity of the self-attention mechanism to linear, which improves the model's image-data processing speed. We conducted experimental simulations on the RSOD dataset and compared the designed system with a CNN-based semantic communication system and with image coding methods such as BPG and JPEG, verifying that the method effectively alleviates the problem of excessive data volume and improves the performance of image data communication. Keywords: semantic communication, transformer, wavelet transform, data processing
Procedia PDF Downloads 78
23646 Distributional and Developmental Analysis of PM2.5 in Beijing, China
Authors: Alexander K. Guo
Abstract:
PM2.5 poses a large threat to people's health and the environment and is an issue of great concern in Beijing, brought to the government's attention by the media. In addition, both the United States Embassy in Beijing and the government of China have increased PM2.5 monitoring in recent years and have made real-time data available to the public. This report utilizes hourly historical data (2008-2016) from the U.S. Embassy in Beijing for the first time. The first objective was to fit probability distributions to the data to better predict the number of days exceeding the standard; the second was to uncover any yearly, seasonal, monthly, daily, and hourly patterns and trends, toward a better understanding of air-control policy. In these data, 66,650 hours and 2,687 days provided valid readings. Lognormal, gamma, and Weibull distributions were fit to the data through parameter estimation, and the chi-squared test was employed to compare the actual data with the fitted distributions. The data were also used to uncover trends, patterns, and improvements in PM2.5 concentration over the valid period, and specific periods that received large amounts of media attention were analyzed to gain a better understanding of the causes of air pollution. The data clearly indicate that Beijing's air quality is unhealthy, with an average of 94.07 µg/m³ across all 66,650 valid hours. No distribution fit the entire dataset of 2,687 days well, but each of the three distribution types was optimal in at least one of the yearly data sets, with the lognormal distribution fitting recent years better. An improvement in air quality beginning in 2014 was discovered: the first five months of 2016 reported an average PM2.5 concentration 23.8% lower than the average of the same period across all years, perhaps the result of various new pollution-control policies.
It was also found that the winter and fall months contained more days in both the good and the extremely polluted categories, leading to a higher mean but a comparable median in those months. Additionally, the evening hours, especially in winter, reported much higher PM2.5 concentrations than the afternoon hours, possibly due to the daytime prohibition of trucks in the city and the increased use of coal for heating in the colder months when residents are home in the evening. Lastly, analysis of special intervals that attracted media attention for either unusually good or bad air quality shows that the government's temporary pollution-control measures, such as more intensive road-space rationing and factory closures, are effective. In summary, air quality in Beijing is improving steadily and follows standard probability distributions to an extent, but it still needs improvement. The analysis will be updated when new data become available. Keywords: Beijing, distribution, patterns, PM2.5, trends
Procedia PDF Downloads 245
23645 Anxiety and Depression in Caregivers of Autistic Children
Authors: Mou Juliet Rebeiro, S. M. Abul Kalam Azad
Abstract:
This study examined anxiety and depression in caregivers of autistic children. The objectives of the research were to assess depression and anxiety among these caregivers and to document their experiences. The research was conducted on a sample of 39 caregivers of autistic children, recruited from a special school. Each caregiver completed a questionnaire comprising scales measuring anxiety and depression, and responses from some participants were collected through interviews based on a topic guide. The quantitative data were analyzed statistically and the qualitative data thematically. The mean anxiety score (55.85) and mean depression score (108.33) are both above the cutoff points, showing that anxiety and depression are clinically present in caregivers of autistic children. Most caregivers reported behavioral, emotional, cognitive, and social problems in their children, which are linked with anxiety and depression. Keywords: anxiety, autism, caregiver, depression
Procedia PDF Downloads 303
23644 Design and Field Programmable Gate Array Implementation of Radio Frequency Identification for Boosting up Tag Data Processing
Authors: G. Rajeshwari, V. D. M. Jabez Daniel
Abstract:
Radio frequency identification (RFID) systems are used for automated identification in applications such as automobiles, health care, and security; RFID is also called automated data collection technology. RFID readers are placed in an area to scan large numbers of tags over a wide distance, and the placement of the RFID elements may result in several types of collisions. A major challenge in RFID systems is therefore collision avoidance. In previous works, collisions were avoided using algorithms such as ALOHA and the tree algorithm. This work proposes collision reduction and increased throughput through a reading-enhancement method combined with the tree algorithm. The reading enhancement is achieved by improving the interrogation procedure and increasing the data-handling capacity of the RFID reader with parallel processing. The work is simulated in Verilog using Xilinx ISE 14.5. By implementing this in the RFID system, we can achieve high throughput and avoid collisions at the reader at the same instant of time, increasing overall system efficiency. Keywords: antenna, anti-collision protocols, data management system, reader, reading enhancement, tag
Procedia PDF Downloads 306
23643 Design of Labview Based DAQ System
Authors: Omar A. A. Shaebi, Matouk M. Elamari, Salaheddin Allid
Abstract:
The Information Computing System of Monitoring (ICSM) for the research reactor of the Tajoura Nuclear Research Centre (TNRC) has not worked since early 1991. According to the regulations, the computer is necessary to operate the reactor up to its maximum power (10 MW). Funding was secured via the IAEA to develop a modern computer-based data acquisition system to replace the old computer. This paper presents the development of a LabVIEW-based data acquisition system that allows automated measurements using National Instruments hardware and its LabVIEW software. The developed system consists of an SCXI-1001 chassis housing four SCXI-1100 modules, each of which can handle 32 variables; the chassis is interfaced with the PC using an NI PCI-6023 DAQ card. LabVIEW, developed by National Instruments, is used to run and operate the DAQ system. LabVIEW is a graphical programming environment suited to high-level design; it allows different signal-processing components or subsystems to be integrated within a graphical framework. The results demonstrated the system's capabilities in monitoring variables and in acquiring and saving data, as well as LabVIEW's ability to control the DAQ hardware. Keywords: data acquisition, LabVIEW, signal conditioning, National Instruments
Procedia PDF Downloads 495
23642 An Analysis of Public Environmental Investment on the Sustainable Development in China
Authors: K. Y. Chen, Y. N. Jia, H. Chua, C. W. Kan
Abstract:
As the largest developing country in the world, China is facing problems arising from the environment, and the Chinese government increases its environmental investment yearly. In this study, we analyse the effect of public environmental investment on sustainable development in China. First, we review the current state of China's environmental issues; second, we collect yearly environmental data along with information on public environmental investment; finally, we use the collected data to analyse and present the strengths, weaknesses, opportunities, and threats (SWOT) of public environmental investment in China. The aim of this paper is thus to establish the relationship between public environmental investment and sustainable development in China. Based on the data collected, public environmental investment was revealed to have a positive impact on sustainable development in China as well as on GDP growth. Acknowledgment: The authors would like to thank the Hong Kong Polytechnic University for its financial support of this work. Keywords: China, public environmental investment, sustainable development, analysis
Procedia PDF Downloads 370
23641 TELUM Land Use Model: An Investigation of Data Requirements and Calibration Results for Chittenden County MPO, U.S.A.
Authors: Georgia Pozoukidou
Abstract:
TELUM is land use modelling software designed specifically to help metropolitan planning organizations (MPOs) prepare their transportation improvement programs and fulfill their numerous planning responsibilities. In this context, obtaining, preparing, and validating socioeconomic forecasts are fundamental tasks for an MPO, ensuring that consistent population and employment data are provided to travel demand models. The Chittenden County Metropolitan Planning Organization of the State of Vermont was used as a case study to test the applicability of the TELUM land use model. The technical insights and lessons learned from the application have transferable value for all MPOs faced with land use forecasting and transportation modelling. Keywords: calibration data requirements, land use models, land use planning, metropolitan planning organizations
Procedia PDF Downloads 293
23640 Artificial Intelligence Approach to Water Treatment Processes: Case Study of Daspoort Treatment Plant, South Africa
Authors: Olumuyiwa Ojo, Masengo Ilunga
Abstract:
The artificial neural network (ANN) has broken the bounds of conventional programming, which is in effect a function of garbage in, garbage out, through its ability to mimic the human brain. Its capacity to adopt, adapt, adjust, evaluate, learn, and recognize the relationships, behavior, and patterns in the data administered to it is modeled on human reasoning and learning mechanisms. The study therefore aimed at modeling the wastewater treatment process in order to accurately diagnose water-control problems for effective treatment. A staged ANN model development and evaluation methodology was employed: a source-data analysis stage involving statistical analysis of the modeling data, a model development stage in which candidate ANN architectures were designed, and an evaluation stage using a historical data set. The model was developed using historical data obtained from the Daspoort wastewater treatment plant, South Africa, and the resulting dimensions and model provided good results. The parameters considered were temperature, pH, colour, turbidity, amount of solids, acidity, total hardness, Ca hardness, Mg hardness, and chloride. This enables the ANN to handle and represent more complex problems than conventional programming can. Keywords: ANN, artificial neural network, wastewater treatment, model, development
Procedia PDF Downloads 149
23639 Using Artificial Intelligence Method to Explore the Important Factors in the Reuse of Telecare by the Elderly
Authors: Jui-Chen Huang
Abstract:
This research used an artificial intelligence method to explore the elderly's opinions on the reuse of telecare, examining the effects of service quality, satisfaction, and customer perceived value on the intention to reuse. The study surveyed the elderly by questionnaire, and a total of 124 valid copies were obtained. It adopted a back-propagation network (BPN), an effective and feasible analysis method that differs from the traditional approach. Two-thirds of the samples (82) were taken as training data and one-third (42) as testing data. The training and testing RMSE (root mean square error) values are 0.022 and 0.009 for the BPN, respectively; as shown, these errors are acceptable. By contrast, the training and testing RMSE values are 0.100 and 0.099 for the regression model. In addition, the results showed that service quality has the greatest effect on the intention to reuse, followed by satisfaction and perceived value. The back-propagation network method thus outperforms regression analysis, and this result can serve as a reference for future research. Keywords: artificial intelligence, backpropagation network (BPN), elderly, reuse, telecare
Procedia PDF Downloads 212
23638 Computer Server Virtualization
Authors: Pradeep M. C. Chand
Abstract:
Virtual infrastructure initiatives often spring from data center server consolidation projects, which focus on reducing the existing infrastructure "box count", retiring older hardware, or life-extending legacy applications. Server consolidation benefits result from a reduction in the overall number of systems and the related recurring costs (power, cooling, rack space, etc.), and it also reduces the heat released into the environment. Keywords: server virtualization, data center, consolidation, project
Procedia PDF Downloads 530
23637 Road Traffic Accidents Analysis in Mexico City through Crowdsourcing Data and Data Mining Techniques
Authors: Gabriela V. Angeles Perez, Jose Castillejos Lopez, Araceli L. Reyes Cabello, Emilio Bravo Grajales, Adriana Perez Espinosa, Jose L. Quiroz Fabian
Abstract:
Road traffic accidents are among the principal causes of traffic congestion, causing human losses, damage to health and the environment, economic losses, and material damage. Traditional studies of road traffic accidents in urban zones demand a very high investment of time and money, and their results are quickly outdated. Nowadays, however, crowdsourced GPS-based traffic and navigation apps have emerged in many countries as an important low-cost source of information for studying road traffic accidents and the urban congestion they cause. In this article, we identify the zones, roads, and specific times in Mexico City (CDMX) in which the largest numbers of road traffic accidents were concentrated during 2016. We built a database compiling information obtained from the social network known as Waze. The methodology employed was knowledge discovery in databases (KDD) for the discovery of patterns in the accident reports, using data mining techniques with the help of Weka. The selected algorithms were expectation maximization (EM), to obtain the ideal number of clusters for the data, and k-means as the grouping method. Finally, the results were visualized with the geographic information system QGIS. Keywords: data mining, k-means, road traffic accidents, Waze, Weka
Procedia PDF Downloads 418
23636 Change Point Analysis in Average Ozone Layer Temperature Using Exponential Lomax Distribution
Authors: Amjad Abdullah, Amjad Yahya, Bushra Aljohani, Amani Alghamdi
Abstract:
Change point detection is an important part of data analysis; the presence of a change point refers to a significant change in the behavior of a time series. In this article, we examine the detection of multiple change points in the parameters of the exponential Lomax distribution, which is broad and flexible compared with other distributions when fitting data. We used the Schwarz information criterion and binary segmentation to detect multiple change points in publicly available data on the average temperature in the ozone layer, and the change points were successfully located. Keywords: binary segmentation, change point, exponential Lomax distribution, information criterion
Procedia PDF Downloads 175