Search results for: real-time data acquisition and reporting
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26089


22489 Ontology-Based Backpropagation Neural Network Classification and Reasoning Strategy for NoSQL and SQL Databases

Authors: Hao-Hsiang Ku, Ching-Ho Chi

Abstract:

Big data applications have become imperative in many fields. Many researchers have devoted themselves to increasing correct classification rates and reducing time complexity. Hence, this study designs and proposes an ontology-based backpropagation neural network classification and reasoning strategy for NoSQL big data applications, called ON4NoSQL. ON4NoSQL is responsible for enhancing the performance of classification in NoSQL and SQL databases in order to build up mass behavior models. The mass behavior models are built with MapReduce techniques and the Hadoop Distributed File System on a Hadoop service platform. The reasoning engine of ON4NoSQL is the ontology-based backpropagation neural network classification and reasoning strategy itself. Simulation results indicate that ON4NoSQL can efficiently construct a high-performance environment for data storing, searching, and retrieving.
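As a rough, self-contained illustration of the classification core named in this abstract (and not of the paper's Hadoop/MapReduce pipeline or its ontology layer), the sketch below trains a one-hidden-layer network by plain backpropagation in NumPy; the layer sizes, learning rate, and synthetic feature vectors are assumptions.

```python
import numpy as np

# Minimal backpropagation classifier sketch (illustrative data and sizes).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                        # 200 records, 8 features
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

W1, b1 = rng.normal(size=(8, 16)) * 0.1, np.zeros(16)
W2, b2 = rng.normal(size=(16, 1)) * 0.1, np.zeros(1)
lr = 0.1
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(500):
    h = np.tanh(X @ W1 + b1)                         # forward pass
    p = sigmoid(h @ W2 + b2)
    grad_out = (p - y) / len(X)                      # backprop: output layer
    grad_W2 = h.T @ grad_out
    grad_h = grad_out @ W2.T * (1 - h ** 2)          # backprop: hidden layer
    grad_W1 = X.T @ grad_h
    W2 -= lr * grad_W2; b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * grad_W1; b1 -= lr * grad_h.sum(axis=0)

print("training accuracy:", ((p > 0.5) == y).mean())
```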

Keywords: Hadoop, NoSQL, ontology, backpropagation neural network, Hadoop distributed file system

Procedia PDF Downloads 262
22488 Biophysically Motivated Phylogenies

Authors: Catherine Felce, Lior Pachter

Abstract:

Current methods for building phylogenetic trees from gene expression data consider mean expression levels. With single-cell technologies, we can leverage more information about cell dynamics by considering the entire distribution of gene expression across cells. Using biophysical modeling, we propose a method for constructing phylogenetic trees from scRNA-seq data, building on Felsenstein's method of continuous characters. This method can highlight genes whose level of expression may be unchanged between species, but whose rates of transcription/decay may have evolved over time.

Keywords: phylogenetics, single-cell, biophysical modeling, transcription

Procedia PDF Downloads 51
22487 Open Educational Resource in Online Mathematics Learning

Authors: Haohao Wang

Abstract:

Technology, such as multimedia in Open Educational Resources, can contribute positively to student performance in an online instructional environment. Student performance data from the past four years were obtained from an online course entitled Applied Calculus (MA139). This paper examined the data to determine whether multimedia (independent variable) had any impact on student performance (dependent variable) in online math learning, and how students felt about the value of the technology. Two groups of student data were analyzed: group 1 (control), from the online applied calculus course that did not use multimedia instructional materials, and group 2 (treatment), from the same online applied calculus course that did use multimedia instructional materials. For the MA139 class, results indicate a statistically significant difference (p = .001) between the two groups, where group 1 had a final score mean of 56.36 (out of 100) and group 2 a mean of 70.68. Additionally, student testimonials are discussed in which students shared their experience of learning applied calculus online with multimedia instructional materials.
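The group comparison reported above is a standard independent-samples t-test. A minimal sketch of that calculation with SciPy follows; the score lists are placeholders, not the study's per-student data.

```python
from scipy import stats

# Illustrative final-score samples for the two groups (placeholder values).
control = [52, 61, 48, 58, 63, 55, 49, 60, 57, 59]     # no multimedia
treatment = [68, 74, 71, 65, 77, 70, 69, 73, 66, 72]    # with multimedia

# Welch's two-sample t-test (does not assume equal variances).
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```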

Keywords: online learning, open educational resources, multimedia, technology

Procedia PDF Downloads 377
22486 Design and Development of Fleet Management System for Multi-Agent Autonomous Surface Vessel

Authors: Zulkifli Zainal Abidin, Ahmad Shahril Mohd Ghani

Abstract:

Agent-based systems technology has been addressed as a new paradigm for conceptualizing, designing, and implementing software systems. Agents are sophisticated systems that act autonomously across open and distributed environments to solve problems. Nevertheless, it is impractical to rely on a single agent to do all the computing in solving complex problems, and an increasing number of applications lately require multiple agents to work together. A multi-agent system (MAS) is a loosely coupled network of agents that interact to solve problems that are beyond the individual capacities or knowledge of each problem solver. However, a MAS still requires a main system to govern or oversee the operation of the agents in order to achieve a unified goal. We developed a fleet management system (FMS) to manage the fleet of agents, plan routes for the agents, perform real-time data processing and analysis, and issue sets of general and specific instructions to the agents. This FMS is able to perform real-time data processing, communicate with the autonomous surface vehicle (ASV) agents, and generate a bathymetric map from the data received from each ASV unit. The first algorithm communicates with the ASVs via radio using standard National Marine Electronics Association (NMEA) protocol sentences. The second algorithm handles path planning, formation, and pattern generation, and is tested using various sample data. Lastly, the bathymetry map generation algorithm uses the data collected by the agents to create a bathymetry map in real time. The outcome of this research is expected to be applicable to various other multi-agent systems.
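The first algorithm is described as exchanging standard NMEA 0183 sentences over radio. The sketch below shows the generic part of that task: validating the XOR checksum of a sentence and splitting a GGA position fix into fields. The sentence type and field names used here are illustrative assumptions, not the authors' message set.

```python
from functools import reduce

def nmea_checksum_ok(sentence: str) -> bool:
    """Validate the XOR checksum of an NMEA 0183 sentence ($...*hh)."""
    body, _, given = sentence.strip().lstrip("$").partition("*")
    calc = reduce(lambda acc, ch: acc ^ ord(ch), body, 0)
    return given != "" and calc == int(given, 16)

def parse_gga(sentence: str) -> dict:
    """Extract time and position fields from a GGA fix sentence.
    Depth readings for bathymetry would come from other sentences (e.g. DPT)."""
    fields = sentence.split("*")[0].split(",")
    return {
        "talker_and_type": fields[0].lstrip("$"),
        "utc_time": fields[1],
        "lat": fields[2], "lat_hem": fields[3],
        "lon": fields[4], "lon_hem": fields[5],
        "fix_quality": fields[6],
    }

msg = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
if nmea_checksum_ok(msg):
    print(parse_gga(msg))
```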

Keywords: autonomous surface vehicle, fleet management system, multi agent system, bathymetry

Procedia PDF Downloads 271
22485 An AI-Based Dynamical Resource Allocation Calculation Algorithm for Unmanned Aerial Vehicle

Authors: Zhou Luchen, Wu Yubing, Burra Venkata Durga Kumar

Abstract:

As networks become larger and more complex than before, the density of user devices is also increasing. Unmanned Aerial Vehicle (UAV) networks can collect and transfer data efficiently by using software-defined networking (SDN) technology. This paper proposes a three-layer distributed and dynamic cluster architecture that manages UAVs with an AI-based resource allocation algorithm to address the network overloading problem. By separating the services of each UAV, the hierarchical UAV cluster system reduces the network load and transfers user requests through three sub-tasks: data collection, communication channel organization, and data relaying. In each cluster, a head node and a vice head node UAV are selected considering the Central Processing Unit (CPU), operational (RAM) and permanent (ROM) memory of the devices, battery charge, and capacity. The vice head node acts as a backup that stores all the data held by the head node. The k-means clustering algorithm is used to detect high-load regions and form the layered UAV clusters. The whole process of detecting high-load areas, forming and selecting UAV clusters, and moving the selected UAV cluster to that area is proposed as the traffic offloading algorithm.
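A minimal sketch of two steps described above, detecting the highest-load region with k-means over device positions and scoring UAVs on CPU/RAM/ROM/battery to pick a head and vice head node, is given below; the device positions, resource table, and weighting are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical device positions (metres) and UAV resource table.
device_xy = np.random.rand(500, 2) * 1000
kmeans = KMeans(n_clusters=4, n_init=10).fit(device_xy)

loads = np.bincount(kmeans.labels_)               # devices per region
hot_region = int(np.argmax(loads))                # highest-load region
target = kmeans.cluster_centers_[hot_region]      # where a cluster should move

uav_resources = np.array([
    # cpu, ram, rom, battery (normalised 0..1)
    [0.8, 0.7, 0.9, 0.95],
    [0.6, 0.9, 0.8, 0.60],
    [0.9, 0.8, 0.7, 0.80],
])
weights = np.array([0.3, 0.2, 0.1, 0.4])          # assumed weighting
scores = uav_resources @ weights
head, vice_head = np.argsort(scores)[::-1][:2]    # best and second-best UAV
print(target, head, vice_head)
```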

Keywords: k-means, resource allocation, SDN, UAV network, unmanned aerial vehicles

Procedia PDF Downloads 111
22484 Deep Learning with Noisy Labels: Learning True Labels as Discrete Latent Variable

Authors: Azeddine El-Hassouny, Chandrashekhar Meshram, Geraldin Nanfack

Abstract:

In recent years, learning from data with noisy labels (label noise) has been a major concern in supervised learning. This problem has become even more worrying in deep learning, where generalization capabilities have been questioned lately. Indeed, deep learning requires large amounts of data that are generally collected by search engines, which frequently return data with unreliable labels. In this paper, we investigate label noise in deep learning using variational inference. Our contributions are: (1) exploiting the label noise concept, where the true labels are learnt using reparameterized variational inference while the observed labels are learnt discriminatively; and (2) learning the noise transition matrix during training without any particular process, neither heuristics nor preliminary phases. The theoretical results show how the true label distribution can be learned by variational inference in any discriminative neural network, and the effectiveness of our approach is demonstrated on several target datasets, such as MNIST and CIFAR32.
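For orientation only, the sketch below shows a simpler, non-variational way to learn a noise transition matrix jointly with a classifier (forward correction), not the authors' reparameterized variational formulation; the network, optimizer, and MNIST-like input size are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# A classifier predicts the clean-label distribution p(y|x); a learnable
# row-stochastic matrix T maps it to the noisy-label distribution
# p(y_obs|x) = p(y|x) @ T, and both are trained from noisy labels only.
n_classes = 10
backbone = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 256), nn.ReLU(),
                         nn.Linear(256, n_classes))
T_logits = nn.Parameter(torch.eye(n_classes) * 3.0)   # start near identity

def noisy_log_probs(x):
    clean = F.softmax(backbone(x), dim=1)              # p(true label | x)
    T = F.softmax(T_logits, dim=1)                      # rows sum to 1
    return torch.log(clean @ T + 1e-8)                  # p(observed label | x)

opt = torch.optim.Adam(list(backbone.parameters()) + [T_logits], lr=1e-3)
# One training step on a batch (x, y_noisy) would be:
#   loss = F.nll_loss(noisy_log_probs(x), y_noisy)
#   opt.zero_grad(); loss.backward(); opt.step()
```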

Keywords: label noise, deep learning, discrete latent variable, variational inference, MNIST, CIFAR32

Procedia PDF Downloads 128
22483 Multimodal Database of Retina Images for Africa: The First Open Access Digital Repository for Retina Images in Sub Saharan Africa

Authors: Simon Arunga, Teddy Kwaga, Rita Kageni, Michael Gichangi, Nyawira Mwangi, Fred Kagwa, Rogers Mwavu, Amos Baryashaba, Luis F. Nakayama, Katharine Morley, Michael Morley, Leo A. Celi, Jessica Haberer, Celestino Obua

Abstract:

Purpose: The main aim of creating the Multimodal Database of Retinal Images for Africa (MoDRIA) was to provide a publicly available repository of retinal images for responsible researchers to conduct algorithm development, in a bid to curb the challenges of ophthalmic artificial intelligence (AI) in Africa. Methods: Data and retina images were ethically sourced from sites in Uganda and Kenya. Data on medical history, visual acuity, ocular examination, blood pressure, and blood sugar were collected. Retina images were captured using fundus cameras (Foru3-nethra and Canon CR-Mark-1). Images were stored on a secure online database. Results: The database consists of 7,859 retinal images in portable network graphics format from 1,988 participants. Images from patients with human immunodeficiency virus made up 18.9%, 18.2% of images were from hypertensive patients, 12.8% from diabetic patients, and the rest from 'normal' participants. Conclusion: Publicly available data repositories are a valuable asset in the development of AI technology. Therefore, there is a need to expand MoDRIA so as to provide larger datasets that are more representative of Sub-Saharan data.

Keywords: retina images, MoDRIA, image repository, African database

Procedia PDF Downloads 127
22482 Assessment of Environmental Quality of an Urban Setting

Authors: Namrata Khatri

Abstract:

The rapid growth of cities is transforming the urban environment and posing significant challenges for environmental quality. This study examines the urban environment of Belagavi in Karnataka, India, using geostatistical methods to assess the spatial pattern and land use distribution of the city and to evaluate the quality of the urban environment. The study is driven by the necessity to assess the environmental impact of urbanisation. Satellite data were utilised to derive information on land use and land cover. The investigation revealed that land use had changed significantly over time, with a drop in vegetation cover and an increase in built-up areas. High-resolution satellite data were also utilised to map the city's open areas and gardens. GIS-based analysis was used to assess public green space accessibility and to identify regions with inadequate waste management practices. The findings revealed that garbage collection and disposal techniques in specific areas of the city needed to be improved. Moreover, the study evaluated the city's thermal environment using Landsat 8 land surface temperature (LST) data. The investigation found that built-up regions had higher LST values than green areas, pointing to the city's urban heat island (UHI) effect. The study's conclusions have far-reaching ramifications for urban planners and policymakers in Belgaum and other similar cities. The findings may be utilised to create sustainable urban planning strategies that address the environmental effect of urbanisation while also improving the quality of life for city dwellers. Satellite data and high-resolution satellite imagery were gathered for the study, and remote sensing and GIS tools were utilised to process and analyse the data. Ground-truthing surveys were also carried out to confirm the accuracy of the remote sensing and GIS-based data. Overall, this study provides a complete assessment of Belgaum's environmental quality and emphasizes the potential of remote sensing and geographic information systems (GIS) approaches in environmental assessment and management.

Keywords: environmental quality, UEQ, remote sensing, GIS

Procedia PDF Downloads 80
22481 Parameter Estimation of Gumbel Distribution with Maximum-Likelihood Based on Broyden Fletcher Goldfarb Shanno Quasi-Newton

Authors: Dewi Retno Sari Saputro, Purnami Widyaningsih, Hendrika Handayani

Abstract:

Extreme values in observational data can occur due to unusual circumstances in the observation. Such data can provide important information that cannot be obtained from other data, so their occurrence needs to be investigated further. One method for obtaining extreme data is the block maxima method. The distribution of extreme data sets taken with the block maxima method is called the extreme value distribution; here it is the Gumbel distribution with two parameters. The maximum likelihood (ML) estimates of the Gumbel parameters cannot be determined exactly in closed form, so a numerical approach is necessary. The purpose of this study was to estimate the Gumbel distribution parameters with the quasi-Newton BFGS method. The quasi-Newton BFGS method is a numerical method for unconstrained nonlinear optimization, so it can be used for parameter estimation of the Gumbel distribution, whose distribution function has a double-exponential form. The quasi-Newton BFGS method is a development of Newton's method. Newton's method uses the second derivative to calculate the parameter updates at each iteration; it is then modified with the addition of a step length to guarantee convergence, since the exact second derivative requires complex calculations. In the quasi-Newton BFGS method, Newton's method is modified by updating the derivative information at each iteration. The parameter estimation of the Gumbel distribution by this numerical approach is done by calculating the parameter values that maximize the likelihood function; for this, the gradient vector and the Hessian matrix are needed. This research combines theory and application, drawing on several journals and textbooks. The results of this study are the quasi-Newton BFGS algorithm and the estimates of the Gumbel distribution parameters. The estimation method is then applied to daily rainfall data in Purworejo District to estimate the distribution parameters. The results indicate that the intensity of high rainfall in Purworejo District has decreased and the range of rainfall has narrowed.
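A minimal sketch of the estimation described above: minimizing the Gumbel negative log-likelihood with SciPy's BFGS optimizer. The block maxima here are synthetic; real use would substitute the Purworejo rainfall maxima.

```python
import numpy as np
from scipy.optimize import minimize

# For the Gumbel (maximum) distribution with location mu and scale beta,
# the negative log-likelihood is
#   n*log(beta) + sum(z_i) + sum(exp(-z_i)),  where z_i = (x_i - mu) / beta.
def neg_log_lik(params, x):
    mu, log_beta = params
    beta = np.exp(log_beta)            # keep the scale parameter positive
    z = (x - mu) / beta
    return x.size * np.log(beta) + np.sum(z) + np.sum(np.exp(-z))

# Illustrative block maxima (synthetic stand-in for rainfall maxima).
rng = np.random.default_rng(0)
maxima = rng.gumbel(loc=100.0, scale=25.0, size=40)

start = np.array([maxima.mean(), np.log(maxima.std())])
fit = minimize(neg_log_lik, start, args=(maxima,), method="BFGS")
mu_hat, beta_hat = fit.x[0], np.exp(fit.x[1])
print(mu_hat, beta_hat)
```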

Keywords: parameter estimation, Gumbel distribution, maximum likelihood, Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton

Procedia PDF Downloads 324
22480 U.S. Trade and Trade Balance with China: Testing for Marshall-Lerner Condition and the J-Curve Hypothesis

Authors: Anisul Islam

Abstract:

The U.S. has a very strong trade relationship with China but with a large and persistent trade deficit. Some have argued that the undervalued Chinese yuan is to blame for the persistent trade deficit. The empirical results are mixed at best. This paper empirically estimates the U.S. export function along with the U.S. import function for trade with China, with the purpose of testing for the existence of the Marshall-Lerner (ML) condition as well as for the possible existence of the J-curve hypothesis. Annual export and import data are utilized for as long a period as the time series exist. The export and import functions are estimated using advanced econometric techniques, along with appropriate diagnostic tests to examine the validity and reliability of the estimated results. The annual time series covers 1975 to 2022, a sample of 48 years, the longest period utilized in any previous study. The data are collected from several sources, such as the World Bank's World Development Indicators, IMF Financial Statistics, IMF Direction of Trade Statistics, and several other sources. The paper is expected to shed important light on the ongoing debate regarding the persistent U.S. trade deficit with China and the policies that may be useful in reducing such deficits over time. As such, the paper will be of great interest to academics, researchers, think tanks, global organizations, and policy makers in both China and the U.S.
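To make the test concrete, one common (illustrative) specification estimates log-log export and import demand functions and checks whether the absolute price elasticities sum to more than one; the file and column names below are assumptions, not the paper's actual data or model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical annual data, 1975-2022, with columns for trade values,
# incomes, and a real exchange rate index.
df = pd.read_csv("us_china_trade.csv")

# Export demand: ln(exports) on ln(real exchange rate) and ln(foreign income).
X_x = sm.add_constant(np.log(df[["real_exchange_rate", "china_income"]]))
exports = sm.OLS(np.log(df["us_exports"]), X_x).fit()

# Import demand: ln(imports) on ln(real exchange rate) and ln(domestic income).
X_m = sm.add_constant(np.log(df[["real_exchange_rate", "us_income"]]))
imports = sm.OLS(np.log(df["us_imports"]), X_m).fit()

e_x = exports.params["real_exchange_rate"]   # export price elasticity
e_m = imports.params["real_exchange_rate"]   # import price elasticity
print("Marshall-Lerner satisfied:", abs(e_x) + abs(e_m) > 1)
```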

Keywords: exports, imports, marshall-lerner condition, j-curve hypothesis, united states, china

Procedia PDF Downloads 64
22479 Analysis of Education Faculty Students’ Attitudes towards E-Learning According to Different Variables

Authors: Eyup Yurt, Ahmet Kurnaz, Ismail Sahin

Abstract:

The purpose of the study is to investigate education faculty students' attitudes towards e-learning according to different variables. In the current study, data were collected from 393 students of an education faculty in Turkey. The attitude towards e-learning scale and a demographic information form were used to collect data. The collected data were analyzed by t-test, ANOVA, and the Pearson correlation coefficient. It was found that there is a significant difference in students' tendency towards e-learning and avoidance of e-learning based on gender. Male students have more positive attitudes towards e-learning than female students. Also, students who used the internet less have higher levels of avoidance of e-learning. Additionally, there is a positive and significant relationship between the number of personal mobile learning devices and tendency towards e-learning, and a negative and significant relationship between the number of personal mobile learning devices and avoidance of e-learning. Suggestions are presented in line with these findings.

Keywords: education faculty students, attitude towards e-learning, gender, daily internet usage time, m-learning

Procedia PDF Downloads 308
22478 Physics-Informed Machine Learning for Displacement Estimation in Solid Mechanics Problem

Authors: Feng Yang

Abstract:

Machine learning (ML), especially deep learning (DL), has been extensively applied to many applications in recent years and has achieved great success in solving different problems, including scientific problems. However, conventional ML/DL methodologies are purely data-driven and have limitations such as the need for an ample amount of labelled training data, a lack of consistency with physical principles, and a lack of generalizability to new problems/domains. Recently, there is a growing consensus that ML models need to take further advantage of prior knowledge to deal with these limitations. Physics-informed machine learning, aiming at the integration of physics/domain knowledge into ML, has been recognized as an emerging area of research, especially in the last two to three years. In this work, physics-informed ML, specifically a physics-informed neural network (NN), is employed and implemented to estimate the displacements in the x, y, and z directions in a solid mechanics problem governed by equilibrium equations with boundary conditions. By incorporating the physics (i.e., the equilibrium equations) into the learning process of the NN, it is shown that the NN can be trained very efficiently with a small set of labelled training data. Experiments with different settings of the NN model and the amount of labelled training data were conducted, and the results show that very high accuracy can be achieved both in satisfying the equilibrium equations and in predicting the displacements; e.g., for an overall displacement of 0.1, a root mean square error (RMSE) of 2.09 × 10⁻⁴ was achieved.
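For readers unfamiliar with the approach, the sketch below shows the same idea reduced to one dimension: a network u(x) for a bar governed by E·A·u''(x) + b = 0 with prescribed end displacements, trained on the equilibrium residual plus boundary terms. The material values, architecture, and training setup are illustrative assumptions, not the authors' three-dimensional model.

```python
import torch
import torch.nn as nn

# Illustrative constants: stiffness E*A, body force b, bar length L, end displacement u_L.
E, A, b, L, u_L = 1.0, 1.0, 1.0, 1.0, 0.1

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    x = torch.rand(64, 1, requires_grad=True) * L        # collocation points
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    physics = ((E * A * d2u + b) ** 2).mean()             # equilibrium residual
    x0 = torch.zeros(1, 1); xL = torch.full((1, 1), L)
    bc = net(x0).pow(2).mean() + (net(xL) - u_L).pow(2).mean()
    loss = physics + bc                                    # physics + boundary loss
    opt.zero_grad(); loss.backward(); opt.step()
```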

Keywords: deep learning, neural network, physics-informed machine learning, solid mechanics

Procedia PDF Downloads 150
22477 Effect of Freight Transport Intensity on Firm Performance: Mediating Role of Operational Capability

Authors: Bonaventure Naab Dery, Abdul Muntaka Samad

Abstract:

During the past two decades, huge population growth has been recorded in developing countries. This has led to an increase in the demand for transport services for people and merchandise. The study sought to examine the effect of freight transport intensity on firm performance. Among others, it examines the link between freight transport intensity and firm performance, the link between operational capability and firm performance, and the mediating role of operational capability in the relationship between freight transport intensity and firm performance. The study used a descriptive research design and a quantitative research approach. Questionnaires were used for data collection, with snowball and purposive sampling. SPSS and Mplus are used to analyze the data. It is anticipated that the analysis will validate the hypotheses proposed by the researchers. Based on the findings, relevant recommendations will be made regarding managerial implications and future studies.

Keywords: freight transport intensity, freight economy transport intensity, freight efficiency transport intensity, operational capability, firm performance

Procedia PDF Downloads 148
22476 A Descriptive Study of the Characteristics of Introductory Accounting Courses Offered by Community Colleges

Authors: Jonathan Nash, Allen Hartt, Catherine Plante

Abstract:

In many nations, community colleges, or similar institutions, play a crucial role in higher education. For example, in the United States more than half of all undergraduate students enroll in a community college at some point during their academic career. Similar statistics have been reported for Australia and Canada. Recognizing the important role these institutions play in educating future accountants, the American Accounting Association has called for research that contributes to a better understanding of these members of the academic community. Although previous literature has shown that community colleges and 4-year institutions differ on many levels, the extant literature has provided data on the characteristics of introductory accounting courses for four-year institutions but not for community colleges. We fill a void in the literature by providing data on the characteristics of introductory accounting courses offered by community colleges in the United States. Data are collected on several dimensions including: course size and staffing, pedagogical orientation, standardization of course elements, textbook selection, and use of technology-based course management tools. Many of these dimensions have been used in previous research examining four-year institutions thereby facilitating comparisons. The resulting data should be of interest to instructors, regulators and administrators, researchers, and the accounting profession. The data provide information on the introductory accounting courses completed by the average community college student which can help instructors identify areas where transfer students’ experiences might differ from their contemporaries at four-year colleges. Regulators and administrators may be interested in the differences between accounting courses offered by two- and four-year institutions when implementing standardized transfer programs. Researchers might use the data to motivate future research into whether differences between two- and four-year institutions affect outcomes like the probability of students choosing to major in accounting and their performance within the major. Accounting professionals may use our findings as a springboard for facilitating discussions related to the accounting labor supply.

Keywords: accounting curricula, community college, descriptive study, introductory accounting

Procedia PDF Downloads 101
22475 Marketing Mixed Factors Affecting on Commercial Transactions Expectations through Social Networks

Authors: Ladaporn Pithuk

Abstract:

This study aims to investigate the marketing mix factors that affect expectations about commercial transactions through social networks. The research uses a quantitative method; data were collected by questionnaire from a purposive sample of 400 people with experience of trading over the internet. Data were analyzed with descriptive statistics, including percentage, mean, and standard deviation, and quality function deployment was used for hypothesis testing. The findings show that the most significant interrelationships between marketing mix factors and expectations about commercial transactions through social networks involve product and place: product and place (location) are involved in almost everything that makes the site a model that meets the needs of the visiting user. Price, promotion, privacy, personalization, and the provision of a technical process also matter; attending to these makes operations more efficient and reduces confusion, duplication, and delays in data transmission, including in the creation of different elements of products and services.

Keywords: commercial transactions expectations, marketing mixed factors, social networks, consumer behavior

Procedia PDF Downloads 237
22474 Future Housing Energy Efficiency Associated with the Auckland Unitary Plan

Authors: Bin Su

Abstract:

The draft Auckland Unitary Plan outlines the future land use for new housing and businesses as Auckland's population grows over the next thirty years. According to the Auckland Unitary Plan, over the next 30 years the population of Auckland is projected to increase by one million, and up to 70% of total new dwellings will occur within the existing urban area. Intensification will not only increase the number of medium- or higher-density houses, such as terraced houses and apartment buildings, within the existing urban area but also change the mean housing design data, which can impact building thermal performance under the local climate. Based on the mean energy consumption and building design data of a number of Auckland sample houses, and the relationships between them, this study estimates the future mean housing energy consumption associated with changes in mean housing design data and evaluates housing energy efficiency under the Auckland Unitary Plan.

Keywords: Auckland Unitary Plan, building thermal design, housing design, housing energy efficiency

Procedia PDF Downloads 386
22473 The Use of Video in Increasing Speaking Ability of the First Year Students of SMAN 12 Pekanbaru in the Academic Year 2011/2012

Authors: Elvira Wahyuni

Abstract:

This study is classroom action research. Its general objective was to assess students' speaking ability when English is taught using video and to determine how effective video is in improving that ability. The subjects were 34 first-year students of SMAN 12 Pekanbaru who were learning English as a foreign language (EFL). Students were given a pre-test before the treatment and a post-test after the treatment. Quantitative data were collected using a speaking test requiring the students to respond to recorded questions. Qualitative data were collected through observation sheets and field notes. The research findings reveal a significant improvement in the students' speaking ability through the use of video in the speaking class. The qualitative data give a description of, and additional information about, the learning process undertaken by the students. The findings indicate that the use of video in teaching and learning is effective in increasing learning outcomes.

Keywords: English teaching, fun learning, speaking ability, video

Procedia PDF Downloads 256
22472 Youth Involvement in Cybercrime in Nigeria: A Case Study of Ikeja Local Government Area

Authors: Niyi Adegoke, Saanumi Jimmy Omolou

Abstract:

The prevalence of youth involvement in cybercrime is alarming and calls for concern among government, parents, NGOs, and religious bodies; hence this paper examines youth involvement in cybercrime in Nigeria. Achievement motivation theory is used to explain the activities of cyber-criminals in Nigerian society. A descriptive survey method was adopted for the study. The sample was one hundred and fifty (150) respondents randomly selected from the study population. A questionnaire was used to gather information and data from the respondents. Data collected through the questionnaire were analyzed using percentages for the respondents' bio-data, while chi-square tests were employed to test the hypotheses. Findings reveal that parental negligence, unemployment, peer influence, and the quest for materialism are responsible for cyber-crimes in Nigeria. The study concludes with recommendations, among which are creating employment opportunities for the youth and ensuring good governance and accountability, which will go a long way towards solving the problem of cybercrime in our society.

Keywords: cybercrime, youth, Nigeria, unemployment, information communication technology

Procedia PDF Downloads 228
22471 R Software for Parameter Estimation of Spatio-Temporal Model

Authors: Budi Nurani Ruchjana, Atje Setiawan Abdullah, I. Gede Nyoman Mindra Jaya, Eddy Hermawan

Abstract:

In this paper, we propose an application package to estimate the parameters of a spatio-temporal model based on multivariate time series analysis using the open-source R software. We build packages mainly to estimate the parameters of the Generalized Space Time Autoregressive (GSTAR) model. GSTAR is a combination of time series and spatial models whose parameters vary by location. We use the method of Ordinary Least Squares (OLS) for estimation and the Mean Absolute Percentage Error (MAPE) to assess how well the model fits real spatio-temporal phenomena. For the case study, we use oil production data from the volcanic layer at Jatibarang, Indonesia, and climate data such as rainfall in Indonesia. R is very user-friendly; it makes calculation easier and data processing more accurate and faster. A limitation is that the R script built for estimating the spatio-temporal GSTAR model parameters is still restricted to stationary time series models. Therefore, the R program under Windows can be developed further for both theoretical studies and applications.

Keywords: GSTAR Model, MAPE, OLS method, oil production, R software

Procedia PDF Downloads 243
22470 Using Corpora in Semantic Studies of English Adjectives

Authors: Oxana Lukoshus

Abstract:

The methods of corpus linguistics, a well-established field of research, are being increasingly applied in cognitive linguistics. Corpus data are especially useful for quantitative studies of grammatical and other aspects of language. The main objective of this paper is to demonstrate how present-day corpora can be applied in semantic studies in general and in semantic studies of adjectives in particular. Polysemantic adjectives have been the subject of numerous studies, but most of them have been carried out on dictionaries. Undoubtedly, dictionaries are viewed as one of the basic data sources, but only at the initial steps of research: the author usually starts with the analysis of the lexicographic data, after which s/he comes up with a hypothesis. In the research conducted, three polysemantic synonyms, true, loyal, and faithful, have been analyzed in terms of differences and similarities in their semantic structure. A corpus-based approach to the study of these adjectives involves the following. After the analysis of the dictionary data, the following corpora were consulted to study the distributional patterns of the words under study: the British National Corpus (BNC) and the Corpus of Contemporary American English (COCA). These corpora are continually updated and contain thousands of examples of the words under research, which makes them a useful and convenient data source. For the purpose of this study, there were no special requirements regarding the genre, mode, or time of the texts included in the corpora. Out of the range of possibilities offered by corpus-analysis software (e.g., word lists, statistics of word frequencies, etc.), the most useful tool for the semantic analysis was extracting a list of co-occurrences for the given search words. Searching by lemmas, e.g., true, true to, and grouping the results by lemmas proved to be the most efficient corpus feature for the adjectives under study. Following the search process, the corpora provided a list of co-occurrences, which were then analyzed and classified. Not every co-occurrence was relevant for the analysis. For example, phrases like 'An enormous sense of responsibility to protect the minds and hearts of the faithful from incursions by the state was perceived to be the basic duty of the church leaders' or ''True,' said Phoebe, 'but I'd probably get to be a Union Official immediately'' were left out, as in the first example the faithful is a substantivized adjective and in the second example true is used alone with no other parts of speech. The subsequent analysis of the corpus data provided the grounds for the distribution groups of the adjectives under study, which were then investigated with the help of a semantic experiment. To sum up, the corpus-based approach has proved to be a powerful, reliable, and convenient tool for obtaining data for further semantic study.
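The co-occurrence extraction described above can be approximated outside the corpus interfaces with a few lines of code; the sketch below counts words within a ±4-token window of a search word and ranks them by frequency (the window size and sample sentence are assumptions, and this ignores the lemmatization that BNC/COCA provide).

```python
import re
from collections import Counter

def cooccurrences(text: str, target: str, window: int = 4) -> Counter:
    """Count words appearing within +/-window tokens of the target word."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == target:
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            counts.update(t for j, t in enumerate(tokens[lo:hi], lo) if j != i)
    return counts

sample = "He remained faithful to his principles, a faithful friend to the end."
print(cooccurrences(sample, "faithful").most_common(5))
```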

Keywords: corpora, corpus-based approach, polysemantic adjectives, semantic studies

Procedia PDF Downloads 314
22469 An Improved Transmission Scheme in Cooperative Communication System

Authors: Seung-Jun Yu, Young-Min Ko, Hyoung-Kyu Song

Abstract:

Recently developed cooperative diversity schemes enable a terminal to obtain transmit diversity through the support of other terminals. However, most of the introduced cooperative schemes share a common fault of decreased transmission rate, because the destination must receive decodable compositions of symbols from both the source and the relay. In order to achieve a high data rate, we propose a cooperative scheme that employs hierarchical modulation. This scheme is free from the rate loss and allows seamless cooperative communication.
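As an illustration of the modulation ingredient only (not of the full cooperative protocol), the sketch below maps two high-priority and two low-priority bits onto a hierarchical 16-QAM symbol, where a priority parameter d shrinks the inner offset to protect the high-priority stream; d and the bit patterns are assumptions.

```python
import numpy as np

def hierarchical_16qam(hp_bits, lp_bits, d=0.5):
    """Hierarchical 16-QAM: HP bits pick the quadrant, LP bits the point inside it."""
    hp = np.asarray(hp_bits, dtype=float)    # shape (n, 2)
    lp = np.asarray(lp_bits, dtype=float)    # shape (n, 2)
    i = (2 * hp[:, 0] - 1) + d * (2 * lp[:, 0] - 1)   # in-phase component
    q = (2 * hp[:, 1] - 1) + d * (2 * lp[:, 1] - 1)   # quadrature component
    return i + 1j * q

symbols = hierarchical_16qam([[1, 0], [0, 1]], [[1, 1], [0, 0]])
print(symbols)   # [ 1.5-0.5j  -1.5+0.5j ]
```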

Keywords: cooperative communication, hierarchical modulation, high data rate, transmission scheme

Procedia PDF Downloads 426
22468 Time Series Analysis on the Production of Fruit Juice: A Case Study of National Horticultural Research Institute (Nihort) Ibadan, Oyo State

Authors: Abiodun Ayodele Sanyaolu

Abstract:

The research was carried out to investigate time series analysis of the quarterly production of fruit juice at the National Horticultural Research Institute, Ibadan, from 2010 to 2018. A documentary method of data collection was used, and the methods of least squares and moving averages were used in the analysis. From the calculations and the graph, it was clear that there were increasing, decreasing, and uniform movements in both the graph of the original data and the tabulated quarterly values of the original data. Time series analysis was used to detect the trend in fruit juice production, which appears favourable over the period, and the models used to forecast are the additive and multiplicative models. Since it was observed that the production of fruit juice is usually high in January of every year, it is strongly advised that the National Horticultural Research Institute make more provision for fruit juice storage outside this period of the year.
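A minimal sketch of the two techniques named above, least-squares trend fitting and a centred four-quarter moving average with multiplicative seasonal indices, applied to synthetic quarterly figures (the numbers are illustrative, not the institute's production data):

```python
import numpy as np

# Three years of synthetic quarterly production figures.
y = np.array([120, 95, 80, 110, 135, 100, 85, 118, 150, 110, 95, 130],
             dtype=float)
t = np.arange(len(y))

# Least-squares trend line: y ≈ a + b*t
b, a = np.polyfit(t, y, 1)
trend = a + b * t

# Centred 4-quarter moving average (the usual smoother for quarterly data).
ma4 = np.convolve(y, np.ones(4) / 4, mode="valid")
centred_ma = (ma4[:-1] + ma4[1:]) / 2

# Multiplicative seasonal indices: observed / trend, averaged per quarter.
season = np.array([np.mean(y[q::4] / trend[q::4]) for q in range(4)])
forecast_next = (a + b * len(y)) * season[len(y) % 4]
print(round(forecast_next, 1))
```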

Keywords: fruit juice, least square, multiplicative models, time series

Procedia PDF Downloads 142
22467 Predicting the Solubility of Aromatic Waste Petroleum Paraffin Wax in Organic Solvents to Separate Ultra-Pure Phase Change Materials (PCMs) by Molecular Dynamics Simulation

Authors: Fathi Soliman

Abstract:

With the ultimate goal of developing the separation of n-paraffin as a phase change material (PCM) by means of molecular dynamics simulations, we attempt to predict the solubility of aromatic n-paraffin in two organic solvents: butyl acetate (BA) and methyl isobutyl ketone (MIBK). A simple model of an aromatic paraffin, 2-hexadecylanthracene, with an amorphous molecular structure and periodic boundary conditions was constructed. The results showed that MIBK is the better solvent for separating ultra-pure phase change materials, and these data were consistent with experimental data on separating ultra-pure n-paraffin from waste petroleum aromatic paraffin wax. The separated n-paraffin was characterized by XRD, TGA, GC, and DSC; moreover, the data revealed that the n-paraffin separated using MIBK performs better as a PCM than that separated using BA.

Keywords: molecular dynamics simulation, n-paraffin, organic solvents, phase change materials, solvent extraction

Procedia PDF Downloads 195
22466 Application of Association Rule Using Apriori Algorithm for Analysis of Industrial Accidents in 2013-2014 in Indonesia

Authors: Triano Nurhikmat

Abstract:

Along with the progress of science and technology, industrialization in Indonesia has taken place very rapidly. This has accelerated the industrialization of Indonesian society, with diverse companies and workplaces being established. Industrial development is tied to worker activity, and these work activities do not rule out the possibility of an accident befalling either the workers or a construction project. The causes of industrial accidents include electrical damage, faulty work procedures, and technical errors. Association rule mining is one of the main techniques in data mining and the most common form used to find patterns in data collections. This research seeks to discover the association relations between the incidences of industrial accidents. Using association rule analysis, the combination patterns obtained in the second iteration (2-large itemsets) show that, pairing each accident factor with West Jakarta, industrial accidents caused by electrical damage have a support value of 0.2 and a confidence value of 1, and the reverse pattern has a support value of 0.2 and a confidence of 0.75.
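A minimal sketch of the support/confidence arithmetic underlying the result above (not a full Apriori implementation with candidate pruning); the toy transactions are illustrative, not the study's accident records.

```python
from itertools import combinations

# Each transaction is the set of attributes recorded for one accident.
transactions = [
    {"west_jakarta", "electrical_damage"},
    {"west_jakarta", "electrical_damage"},
    {"work_procedure"},
    {"west_jakarta", "work_procedure"},
    {"technique_error"},
    {"electrical_damage"},
    {"west_jakarta"},
    {"work_procedure", "technique_error"},
    {"west_jakarta", "electrical_damage", "work_procedure"},
    {"technique_error"},
]

def support(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    return support(antecedent | consequent) / support(antecedent)

# Frequent 2-itemsets (the "2-large itemsets" step) at minimum support 0.2.
items = {i for t in transactions for i in t}
frequent_pairs = [set(p) for p in combinations(sorted(items), 2)
                  if support(set(p)) >= 0.2]
print(frequent_pairs)
print(confidence({"west_jakarta"}, {"electrical_damage"}))
```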

Keywords: association rule, data mining, industrial accidents, rules

Procedia PDF Downloads 299
22465 Automated Method Time Measurement System for Redesigning Dynamic Facility Layout

Authors: Salam Alzubaidi, G. Fantoni, F. Failli, M. Frosolini

Abstract:

The dynamic facility layout problem is a truly critical issue in the competitive industrial market; thus, solving it requires robust design and effective simulation systems. Sustainable simulation requires feeding reliable and accurate data into the system. This paper therefore describes an automated system, integrated into the real environment, that measures the duration of material handling operations, collects the data in real time, and determines the variances between the actual and estimated operation schedules in order to update the simulation software and redesign the facility layout periodically. The automated method-time measurement system collects the real data using Radio Frequency Identification (RFID) and Internet of Things (IoT) technologies. Attaching an RFID antenna reader and RFID tags enables the system to identify the location of objects and gather the time data. The gathered durations are processed by calculating the moving-average duration of the material handling operations, choosing the shortest material handling path, and then updating the simulation software to redesign the facility layout in line with the shortest/real operation schedule. Periodic simulation in real time is more sustainable and reliable than a simulation system relying on an analysis of historical data. The case study of this methodology is carried out in cooperation with a workshop team producing mechanical parts. Although there are some technical limitations, this methodology is promising and can be significantly useful in redesigning the manufacturing layout.
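A minimal sketch of one processing step described above: turning RFID read events into moving-average transport durations between stations. The event format (tag, station, timestamp) and the window size are assumptions.

```python
from collections import defaultdict, deque
from statistics import mean

WINDOW = 5                                             # assumed window size
history = defaultdict(lambda: deque(maxlen=WINDOW))     # (from, to) -> durations
last_seen = {}                                          # tag -> (station, time)

def on_read(tag, station, ts):
    """Record a transport duration whenever a tag moves to a new station."""
    if tag in last_seen and last_seen[tag][0] != station:
        prev_station, prev_ts = last_seen[tag]
        history[(prev_station, station)].append(ts - prev_ts)
    last_seen[tag] = (station, ts)

def moving_average(from_station, to_station):
    durations = history[(from_station, to_station)]
    return mean(durations) if durations else None

for event in [("pallet-7", "saw", 0), ("pallet-7", "lathe", 42),
              ("pallet-9", "saw", 10), ("pallet-9", "lathe", 58)]:
    on_read(*event)
print(moving_average("saw", "lathe"))   # -> 45.0
```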

Keywords: dynamic facility layout problem, internet of things, method time measurement, radio frequency identification, simulation

Procedia PDF Downloads 120
22464 The Influence of the Form of Grain on the Mechanical Behaviour of Sand

Authors: Mohamed Boualem Salah

Abstract:

The size and shape of soil particles reflect the formation history of the grains. In turn, the macro-scale behavior of the soil mass results from particle-level interactions, which are affected by particle shape. Sphericity, roundness, and smoothness characterize different scales associated with particle shape. New experimental data and data from previously published studies are gathered into two databases to explore the effects of particle shape on packing as well as on small- and large-strain properties of sandy soils. Data analysis shows that increased particle irregularity (angularity and/or eccentricity) leads to: an increase in emax and emin, a decrease in stiffness yet with increased sensitivity to the state of stress, an increase in compressibility under zero-lateral-strain loading, and an increase in the critical state friction angle φcs and intercept Γ, with a weak effect on slope λ. Therefore, particle shape emerges as a significant soil index property that needs to be properly characterized and documented, particularly in clean sands and gravels. The systematic assessment of particle shape will lead to a better understanding of sand behavior.

Keywords: angularity, eccentricity, shape particle, behavior of soil

Procedia PDF Downloads 414
22463 The Views of Teachers over the Father Involvement to Preschool Education Programs

Authors: Fatma Tezel Sahin, Zeynep Nur Aydin Kilic, Aysegul Akinci Cosgun

Abstract:

Family involvement activities have a significant place in increasing success in preschool education and sustaining that education. It is necessary that both parents take part in family involvement activities; however, while mother involvement is achieved, father involvement is neglected. For that reason, the current study aims at determining the views of teachers with regard to father involvement in preschool education programs. The working group of the study consisted of 23 preschool teachers, and the study is a descriptive survey. The data were obtained through individual interviews, using a 'Teacher Interview Form' as the data collection instrument, and were analysed through the content analysis method. The data regarding the views of the teachers are given as frequency and percentage values. At the end of the research, a great majority of the teachers stated that they were proficient in applying family involvement activities. They also pointed out that they held family meetings mainly in order to obtain family involvement and then implemented involvement activities for parents both in and out of the class. They expressed that they observed more mother involvement in these activities than father involvement. It was also expressed that the reasons why fathers were involved less than mothers were the fathers' working conditions and the view that such involvement is regarded as a task of mothers. Based on the results of the research, it is recommended that fathers be informed about involvement in family activities and that preschool education institutions provide practices and opportunities to encourage fathers to take part.

Keywords: preschool education, parent involvement, father involvement, teacher views

Procedia PDF Downloads 324
22462 Political Views and Information and Communication Technology (ICT) in Tertiary Institutions in Achieving the Millennium Development Goals (MDGS)

Authors: Perpetual Nwakaego Ibe

Abstract:

The Millennium Development Goals (MDGs) were an integrated project formed to eradicate many of the undesirable situations that citizens of third-world countries may find themselves in. For the MDGs to be a sustainable project for the future, they depend entirely on the actions of governments, multilateral institutions, and civil society. This paper first looks at political views on the MDGs and relates them to the current electoral situation around the country, underlining the drastic changes over the past few months. The second part of the paper presents ICT in tertiary institutions as one of the solutions for the success of the MDGs. ICT is vital in all phases of the educational process, and the development of cloud connectivity is an added advantage of Information and Communication Technology (ICT) for sharing a common data bank for research purposes among UNICEF, the Red Cross, NPS, INEC, NMIC, and WHO. Finally, the paper concludes with areas that need attention and recommendations for tertiary institutions committed to delivering an ambitious set of goals. A combination of observation and documentary materials for data gathering was employed as the methodology for carrying out this research.

Keywords: MDG, ICT, data bank, database

Procedia PDF Downloads 200
22461 Development of a Multi-User Country Specific Food Composition Table for Malawi

Authors: Averalda van Graan, Joelaine Chetty, Malory Links, Agness Mwangwela, Sitilitha Masangwi, Dalitso Chimwala, Shiban Ghosh, Elizabeth Marino-Costello

Abstract:

Food composition data are becoming increasingly important, as dealing with food insecurity and malnutrition in its persistent form of under-nutrition is now coupled with increasing over-nutrition and its related ailments in the developing world, of which Malawi is not spared. In the absence of a food composition database (FCDB) inherent to our dietary patterns, efforts were made to develop a country-specific FCDB for nutrition practice, research, and programming. The main objective was to develop a multi-user, country-specific food composition database and table from existing published and unpublished scientific literature. A multi-phased approach guided by the project framework was employed. Phase 1 comprised a scoping mission to assess the nutrition landscape for compilation activities. Phase 2 involved training a compiler and collecting data from various sources, primarily institutional libraries, online databases, and food industry nutrient data. Phase 3 covered the evaluation and compilation of data using FAO and INFOODS standards and guidelines. Phase 4 concluded the process with quality assurance. A total of 316 Malawian food items, categorized into eight food groups, were captured for 42 components. The majority were from the baby food group (27%), followed by the staple (22%) and animal (22%) food groups. Fats and oils comprised the fewest food items (2%), followed by fruits (6%). Proximate values are well represented; however, the percentage of missing data is large for some components, including Se (68%), I (75%), and vitamin A (42%), as well as the lipid profile: saturated fat (53%), monounsaturated fat (59%), polyunsaturated fat (59%), and cholesterol (56%). A multi-phased approach following the project framework led to the development of the first Malawian FCDB and table. The table reflects inherent Malawian dietary patterns and nutritional concerns, and the FCDB can be used by various professionals in nutrition and health. Rising over-nutrition, NCDs, and changing diets challenge us to provide nutrient profiles of processed foods and complete lipid profiles.

Keywords: analytical data, dietary pattern, food composition data, multi-phased approach

Procedia PDF Downloads 93
22460 Educational Leadership and Artificial Intelligence

Authors: Sultan Ghaleb Aldaihani

Abstract:

- The environment in which educational leadership takes place is becoming increasingly complex due to factors like globalization and rapid technological change.
- This is creating a "leadership gap" where the complexity of the environment outpaces the ability of leaders to effectively respond.
- Educational leadership involves guiding teachers and the broader school system towards improved student learning and achievement.

2. Implications of Artificial Intelligence (AI) in Educational Leadership:
- AI has great potential to enhance education, such as through intelligent tutoring systems and automating routine tasks to free up teachers.
- AI can also have significant implications for educational leadership by providing better information and data-driven decision-making capabilities.
- Computer-adaptive testing can provide detailed, individualized data on student learning that leaders can use for instructional decisions and accountability.

3. Enhancing Decision-Making Processes:
- Statistical models and data mining techniques can help identify at-risk students earlier, allowing for targeted interventions.
- Probability-based models can diagnose students likely to drop out, enabling proactive support.
- These data-driven approaches can make resource allocation and decision-making more effective.

4. Improving Efficiency and Productivity:
- AI systems can automate tasks and change processes to improve the efficiency of educational leadership and administration.
- Integrating AI can free up leaders to focus more on their role's human, interactive elements.

Keywords: education, leadership, technology, artificial intelligence

Procedia PDF Downloads 43