Search results for: data acquisition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25646

22766 Assessment of Environmental Quality of an Urban Setting

Authors: Namrata Khatri

Abstract:

The rapid growth of cities is transforming the urban environment and posing significant challenges for environmental quality. This study examines the urban environment of Belagavi in Karnataka, India, using geostatistical methods to assess the spatial pattern and land use distribution of the city and to evaluate the quality of the urban environment. The study is driven by the necessity of assessing the environmental impact of urbanisation. Satellite data were utilised to derive information on land use and land cover. The investigation revealed that land use had changed significantly over time, with a decline in vegetation cover and an increase in built-up areas. High-resolution satellite data were also utilised to map the city's open areas and gardens. GIS-based analysis was used to assess public green space accessibility and to identify regions with inadequate waste management practices. The findings revealed that garbage collection and disposal techniques in specific areas of the city needed to be improved. Moreover, the study evaluated the city's thermal environment using Landsat 8 land surface temperature (LST) data. The investigation found that built-up regions had higher LST values than green areas, pointing to the city's urban heat island (UHI) effect. The study's conclusions have far-reaching ramifications for urban planners and policymakers in Belagavi and other similar cities. The findings may be utilised to create sustainable urban planning strategies that address the environmental effects of urbanisation while also improving the quality of life for city dwellers. Satellite data and high-resolution satellite images were gathered for the study, and remote sensing and GIS tools were utilised to process and analyse the data. Ground-truthing surveys were also carried out to confirm the accuracy of the remote sensing and GIS-based data. Overall, this study provides a complete assessment of Belagavi's environmental quality and emphasises the potential of remote sensing and geographic information systems (GIS) approaches in environmental assessment and management.
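The built-up vs. green LST comparison is, in essence, a zonal-statistics operation over a land-cover classification. The sketch below is illustrative only: the toy grids and temperature values are invented, not the Landsat 8 or Belagavi data.

```python
def zonal_mean(landcover, lst):
    """Mean raster value (here LST, in deg C) per land-cover class."""
    sums, counts = {}, {}
    for lc_row, lst_row in zip(landcover, lst):
        for cls, temp in zip(lc_row, lst_row):
            sums[cls] = sums.get(cls, 0.0) + temp
            counts[cls] = counts.get(cls, 0) + 1
    return {cls: sums[cls] / counts[cls] for cls in sums}

# Toy 3x4 classified grid and co-registered LST raster (hypothetical values).
landcover = [["built", "built", "green", "green"],
             ["built", "open",  "green", "green"],
             ["built", "built", "open",  "green"]]
lst = [[41.2, 40.8, 33.5, 33.9],
       [40.1, 37.0, 34.2, 33.1],
       [42.0, 41.5, 36.8, 32.9]]

means = zonal_mean(landcover, lst)
print(means)  # built-up pixels come out several degrees warmer than green ones
```

On real rasters the same per-class averaging is what a GIS zonal-statistics tool computes; the warmer built-up mean is the numeric signature of the UHI effect described above.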

Keywords: environmental quality, UEQ, remote sensing, GIS

Procedia PDF Downloads 80
22765 Parameter Estimation of Gumbel Distribution with Maximum-Likelihood Based on Broyden Fletcher Goldfarb Shanno Quasi-Newton

Authors: Dewi Retno Sari Saputro, Purnami Widyaningsih, Hendrika Handayani

Abstract:

Extreme data in an observation series can occur due to unusual circumstances during observation. Such data can provide important information that cannot be provided by other data, so their existence needs to be investigated further. One method for obtaining extreme data is the block maxima method. The distribution of extreme data sets taken with the block maxima method is called the extreme value distribution; here it is the Gumbel distribution with two parameters. The exact values of the maximum likelihood (ML) parameter estimates of the Gumbel distribution are difficult to determine, so a numerical approach is necessary. The purpose of this study was to determine the parameter estimates of the Gumbel distribution with the quasi-Newton BFGS method. The quasi-Newton BFGS method is a numerical method for unconstrained nonlinear optimization, so it can be used for parameter estimation of the Gumbel distribution, whose distribution function has a double exponential form. The quasi-Newton BFGS method is a development of the Newton method. The Newton method uses the second derivative to calculate the parameter value changes in each iteration. Newton's method is then modified with the addition of a step length to guarantee convergence when the second derivative requires complex calculations. In the quasi-Newton BFGS method, Newton's method is modified by updating the Hessian approximation in each iteration. The parameter estimation of the Gumbel distribution by a numerical approach using the quasi-Newton BFGS method is done by calculating the parameter values that maximize the likelihood function; this requires the gradient vector and the Hessian matrix. This research combines theory and application, drawing on several journals and textbooks. The results of this study are the quasi-Newton BFGS algorithm and estimates of the Gumbel distribution parameters. The estimation method was then applied to daily rainfall data in Purworejo District to estimate the distribution parameters. The results indicate that the intensity of high rainfall occurring in Purworejo District has decreased and the range of rainfall has narrowed.
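As a concrete illustration, the procedure can be sketched in Python. This is a minimal, dependency-free sketch rather than the authors' implementation: it maximises the Gumbel log-likelihood (by minimising its negative) with a BFGS inverse-Hessian update, an Armijo backtracking line search, and a central-difference gradient in place of the analytic score and Hessian; the synthetic sample stands in for the Purworejo rainfall maxima.

```python
import math
import random

def gumbel_nll(params, data):
    """Negative log-likelihood of the two-parameter Gumbel (maxima) distribution."""
    mu, beta = params
    if beta <= 0:
        return float("inf")
    z = [(x - mu) / beta for x in data]
    if min(z) < -50:  # guard: exp(-z) would overflow
        return float("inf")
    return len(data) * math.log(beta) + sum(zi + math.exp(-zi) for zi in z)

def num_grad(f, p, h=1e-6):
    """Central-difference gradient, standing in for the analytic score vector."""
    g = []
    for i in range(len(p)):
        up, dn = list(p), list(p)
        up[i] += h
        dn[i] -= h
        g.append((f(up) - f(dn)) / (2 * h))
    return g

def bfgs(f, x0, max_iter=200, tol=1e-5):
    """Quasi-Newton BFGS with an Armijo backtracking line search."""
    n = len(x0)
    x = list(x0)
    H = [[float(i == j) for j in range(n)] for i in range(n)]  # inverse Hessian approx.
    g = num_grad(f, x)
    for _ in range(max_iter):
        d = [-sum(H[i][j] * g[j] for j in range(n)) for i in range(n)]
        gd = sum(gi * di for gi, di in zip(g, d))
        t, fx = 1.0, f(x)
        while f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * gd and t > 1e-12:
            t *= 0.5  # backtrack until sufficient decrease
        x_new = [xi + t * di for xi, di in zip(x, d)]
        g_new = num_grad(f, x_new)
        s = [t * di for di in d]
        y = [gn - gi for gn, gi in zip(g_new, g)]
        ys = sum(yi * si for yi, si in zip(y, s))
        if ys > 1e-12:  # curvature condition holds: apply the BFGS inverse update
            rho = 1.0 / ys
            A = [[float(i == k) - rho * s[i] * y[k] for k in range(n)] for i in range(n)]
            B = [[float(l == j) - rho * y[l] * s[j] for j in range(n)] for l in range(n)]
            AH = [[sum(A[i][k] * H[k][l] for k in range(n)) for l in range(n)]
                  for i in range(n)]
            H = [[sum(AH[i][l] * B[l][j] for l in range(n)) + rho * s[i] * s[j]
                  for j in range(n)] for i in range(n)]
        x, g = x_new, g_new
        if max(abs(gi) for gi in g) < tol:
            break
    return x

# Synthetic block maxima standing in for the rainfall data (true mu=30, beta=8).
random.seed(0)
data = [30.0 - 8.0 * math.log(-math.log(random.random())) for _ in range(500)]

# Method-of-moments starting values, refined by BFGS.
mean = sum(data) / len(data)
std = math.sqrt(sum((x - mean) ** 2 for x in data) / len(data))
beta0 = std * math.sqrt(6) / math.pi
mu0 = mean - 0.5772 * beta0
mu_hat, beta_hat = bfgs(lambda p: gumbel_nll(p, data), [mu0, beta0])
print(mu_hat, beta_hat)
```

Starting from method-of-moments values keeps the line search well behaved; in practice one would use analytic derivatives of the Gumbel log-likelihood rather than finite differences.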

Keywords: parameter estimation, Gumbel distribution, maximum likelihood, Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton

Procedia PDF Downloads 324
22764 U.S. Trade and Trade Balance with China: Testing for Marshall-Lerner Condition and the J-Curve Hypothesis

Authors: Anisul Islam

Abstract:

The U.S. has a very strong trade relationship with China, but with a large and persistent trade deficit. Some have argued that the undervalued Chinese yuan is to blame for the persistent deficit. The empirical results are mixed at best. This paper empirically estimates the U.S. export function along with the U.S. import function for its trade with China, with the purpose of testing for the existence of the Marshall-Lerner (ML) condition as well as for the possible existence of the J-curve hypothesis. Annual export and import data will be utilized for as long as time series data exist. The export and import functions will be estimated using advanced econometric techniques, with appropriate diagnostic tests performed to examine the validity and reliability of the estimated results. The annual time series covers 1975 to 2022, a sample of 48 years, the longest period utilized in any study of this question to date. The data are collected from several sources, such as the World Bank's World Development Indicators, IMF Financial Statistics, IMF Direction of Trade Statistics, and several others. The paper is expected to shed important light on the ongoing debate regarding the persistent U.S. trade deficit with China and the policies that may be useful to reduce such deficits over time. As such, the paper will be of great interest to academics, researchers, think tanks, global organizations, and policy makers in both China and the U.S.
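For reference, once the export and import demand elasticities are estimated, the ML condition itself is a one-line check. The elasticity values below are hypothetical, for illustration only, not estimates from the paper.

```python
def marshall_lerner_holds(eta_exports, eta_imports):
    """Marshall-Lerner condition: a depreciation improves the trade balance
    when the absolute export and import demand elasticities sum to more than one."""
    return abs(eta_exports) + abs(eta_imports) > 1.0

# Hypothetical elasticity estimates for illustration only.
print(marshall_lerner_holds(-0.7, -0.6))  # True: 1.3 > 1
print(marshall_lerner_holds(-0.3, -0.4))  # False: 0.7 < 1
```

The J-curve hypothesis concerns the short-run dynamics around this condition: the balance may worsen before the long-run elasticities take effect.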

Keywords: exports, imports, marshall-lerner condition, j-curve hypothesis, united states, china

Procedia PDF Downloads 63
22763 Analysis of Education Faculty Students’ Attitudes towards E-Learning According to Different Variables

Authors: Eyup Yurt, Ahmet Kurnaz, Ismail Sahin

Abstract:

The purpose of this study is to investigate education faculty students' attitudes towards e-learning according to different variables. In the current study, the data were collected from 393 students of an education faculty in Turkey. The attitude towards e-learning scale and a demographic information form were used to collect data. The collected data were analyzed by t-test, ANOVA, and the Pearson correlation coefficient. It was found that there is a significant gender difference in students' tendency towards e-learning and avoidance of e-learning: male students have more positive attitudes towards e-learning than female students. Also, students who used the internet less have higher levels of avoidance of e-learning. Additionally, there is a positive and significant relationship between the number of personal mobile learning devices and tendency towards e-learning. On the other hand, there is a negative and significant relationship between the number of personal mobile learning devices and avoidance of e-learning. Finally, suggestions are presented based on these findings.
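The device-count relationships reported above are Pearson product-moment correlations. The sketch below recomputes one on invented numbers (the device counts and attitude scores are illustrative, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented values: personal mobile devices vs. e-learning tendency score.
devices = [0, 1, 1, 2, 2, 3, 3, 4]
tendency = [2.1, 2.8, 2.5, 3.0, 3.4, 3.6, 3.9, 4.2]
print(round(pearson_r(devices, tendency), 2))  # strongly positive, as in the study
```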

Keywords: education faculty students, attitude towards e-learning, gender, daily internet usage time, m-learning

Procedia PDF Downloads 307
22762 Physics-Informed Machine Learning for Displacement Estimation in Solid Mechanics Problem

Authors: Feng Yang

Abstract:

Machine learning (ML), especially deep learning (DL), has been extensively applied to many applications in recent years and has achieved great success in solving different problems, including scientific ones. However, conventional ML/DL methodologies are purely data-driven and have limitations, such as the need for ample labelled training data, a lack of consistency with physical principles, and a lack of generalizability to new problems and domains. Recently, a consensus has been growing that ML models need to take further advantage of prior knowledge to deal with these limitations. Physics-informed machine learning, which aims to integrate physics and domain knowledge into ML, has been recognized as an emerging area of research, especially in the last two to three years. In this work, physics-informed ML, specifically a physics-informed neural network (NN), is employed and implemented to estimate the displacements in the x, y, and z directions in a solid mechanics problem governed by equilibrium equations with boundary conditions. By incorporating the physics (i.e., the equilibrium equations) into the learning process of the NN, it is shown that the NN can be trained very efficiently with a small set of labelled training data. Experiments with different settings of the NN model and different amounts of labelled training data were conducted, and the results show that very high accuracy can be achieved both in satisfying the equilibrium equations and in predicting the displacements; e.g., with an overall displacement of 0.1, a root mean square error (RMSE) of 2.09 × 10⁻⁴ was achieved.
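The core idea, a loss that combines a data-mismatch term with the residual of the equilibrium equation at collocation points, can be illustrated without a neural network. The sketch below is not the authors' model: it substitutes a polynomial trial function (with the boundary conditions built in) for the NN and solves a 1-D bar equilibrium problem, EA·u''(x) + b = 0 with u(0) = u(L) = 0, by linear least squares over physics rows plus a few "measured" data rows.

```python
def solve_2x2(M, v):
    """Cramer's rule for a 2x2 linear system."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(v[0] * M[1][1] - v[1] * M[0][1]) / det,
            (M[0][0] * v[1] - M[1][0] * v[0]) / det]

EA, b, L = 1.0, 1.0, 1.0                     # stiffness, body force, bar length

def exact(x):
    """Closed-form solution of EA*u'' + b = 0 with u(0)=u(L)=0."""
    return b * x * (L - x) / (2 * EA)

def basis(x):
    """Trial function u(x) = c0*x*(L-x) + c1*x^2*(L-x); the BCs are built in."""
    return (x * (L - x), x * x * (L - x))

rows = []  # each row: (coefficient of c0, coefficient of c1, target)
# Physics rows: EA*u''(x) + b = 0 at interior collocation points,
# i.e. (-2*EA)*c0 + EA*(2L - 6x)*c1 = -b.
for i in range(1, 20):
    x = i * L / 20
    rows.append((-2.0 * EA, EA * (2 * L - 6 * x), -b))
# Data rows: a small set of "measured" displacements (here taken from `exact`).
for x in (0.25, 0.75):
    a0, a1 = basis(x)
    rows.append((a0, a1, exact(x)))

# Normal equations of the combined physics + data least-squares problem.
M = [[sum(r[i] * r[j] for r in rows) for j in range(2)] for i in range(2)]
v = [sum(r[i] * r[2] for r in rows) for i in range(2)]
c0, c1 = solve_2x2(M, v)

def u_hat(x):
    p0, p1 = basis(x)
    return c0 * p0 + c1 * p1

print(c0, c1, u_hat(0.5))  # c0 -> b/(2*EA) = 0.5, c1 -> 0, u(0.5) -> 0.125
```

A physics-informed NN replaces the fixed basis with a trainable network and the normal equations with gradient descent, but the loss structure, few labelled points plus many physics residuals, is the same, which is why so little labelled data suffices.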

Keywords: deep learning, neural network, physics-informed machine learning, solid mechanics

Procedia PDF Downloads 150
22761 Effect of Freight Transport Intensity on Firm Performance: Mediating Role of Operational Capability

Authors: Bonaventure Naab Dery, Abdul Muntaka Samad

Abstract:

During the past two decades, huge population growth has been recorded in developing countries. This has led to an increase in the demand for transport services for people and merchandise. The study sought to examine the effect of freight transport intensity on firm performance. Among other objectives, it examines the link between freight transport intensity and firm performance, the link between operational capability and firm performance, and the mediating role of operational capability in the relationship between freight transport intensity and firm performance. The study used a descriptive research design and a quantitative research approach. Questionnaires were used for data collection, with respondents reached through snowball and purposive sampling. SPSS and Mplus are being used to analyze the data. It is anticipated that, when the data are analyzed, they will validate the hypotheses proposed by the researchers. Based on the findings, relevant recommendations will be made regarding managerial implications and future studies.

Keywords: freight transport intensity, freight economy transport intensity, freight efficiency transport intensity, operational capability, firm performance

Procedia PDF Downloads 148
22760 A Descriptive Study of the Characteristics of Introductory Accounting Courses Offered by Community Colleges

Authors: Jonathan Nash, Allen Hartt, Catherine Plante

Abstract:

In many nations, community colleges, or similar institutions, play a crucial role in higher education. For example, in the United States more than half of all undergraduate students enroll in a community college at some point during their academic career. Similar statistics have been reported for Australia and Canada. Recognizing the important role these institutions play in educating future accountants, the American Accounting Association has called for research that contributes to a better understanding of these members of the academic community. Although previous literature has shown that community colleges and four-year institutions differ on many levels, the extant literature has provided data on the characteristics of introductory accounting courses for four-year institutions but not for community colleges. We fill a void in the literature by providing data on the characteristics of introductory accounting courses offered by community colleges in the United States. Data are collected on several dimensions, including course size and staffing, pedagogical orientation, standardization of course elements, textbook selection, and use of technology-based course management tools. Many of these dimensions have been used in previous research examining four-year institutions, thereby facilitating comparisons. The resulting data should be of interest to instructors, regulators and administrators, researchers, and the accounting profession. The data provide information on the introductory accounting courses completed by the average community college student, which can help instructors identify areas where transfer students’ experiences might differ from those of their contemporaries at four-year colleges. Regulators and administrators may be interested in the differences between accounting courses offered by two- and four-year institutions when implementing standardized transfer programs.
Researchers might use the data to motivate future research into whether differences between two- and four-year institutions affect outcomes like the probability of students choosing to major in accounting and their performance within the major. Accounting professionals may use our findings as a springboard for facilitating discussions related to the accounting labor supply.

Keywords: accounting curricula, community college, descriptive study, introductory accounting

Procedia PDF Downloads 101
22759 Marketing Mixed Factors Affecting on Commercial Transactions Expectations through Social Networks

Authors: Ladaporn Pithuk

Abstract:

This study aims to investigate the marketing mix factors that affect expectations of commercial transactions through social networks. The research uses a quantitative method; data were collected by questionnaire from a purposive sample of 400 people with experience of trading over the internet. Data were analyzed by descriptive statistics, including percentage, mean, and standard deviation, and quality function deployment was used for hypothesis testing. The findings show that the marketing mix factors most significantly related to expectations of commercial transactions through social networks are product and place: product and place (location) are involved in almost everything that makes a site a model meeting the needs of visiting users. These are followed by price, promotion, privacy, personalization, and the provision of a technical process. Attention to these elements will make operations more efficient and reduce confusion, duplication, and delays in data transmission, while creating differentiation in products and services.

Keywords: commercial transactions expectations, marketing mixed factors, social networks, consumer behavior

Procedia PDF Downloads 237
22758 Future Housing Energy Efficiency Associated with the Auckland Unitary Plan

Authors: Bin Su

Abstract:

The draft Auckland Unitary Plan outlines the future land use for new housing and businesses as Auckland's population grows over the next thirty years. According to the plan, the population of Auckland is projected to increase by one million over that period, and up to 70% of new dwellings will be located within the existing urban area. Intensification will not only increase the number of medium- or higher-density houses, such as terraced houses and apartment buildings, within the existing urban area but will also change the mean housing design data that affect building thermal performance under the local climate. Based on the mean energy consumption and building design data of a number of sample Auckland houses, and the relationships between them, this study estimates future mean housing energy consumption associated with the change in mean housing design data and evaluates housing energy efficiency under the Auckland Unitary Plan.

Keywords: Auckland Unitary Plan, building thermal design, housing design, housing energy efficiency

Procedia PDF Downloads 386
22757 The Use of Video in Increasing Speaking Ability of the First Year Students of SMAN 12 Pekanbaru in the Academic Year 2011/2012

Authors: Elvira Wahyuni

Abstract:

This study is a classroom action research project. Its general objective was to determine students' speaking ability when English is taught using video and to find out how effective video is in improving students' speaking ability. The subjects were 34 first-year students of SMAN 12 Pekanbaru who were learning English as a foreign language (EFL). Students were given a pre-test before the treatment and a post-test after it. Quantitative data were collected using a speaking test requiring the students to respond to recorded questions. Qualitative data were collected through observation sheets and field notes. The findings reveal a significant improvement in the students' speaking ability through the use of video in the speaking class. The qualitative data provided a description of, and additional information about, the learning process of the students. The findings indicate that the use of video in teaching and learning is effective in increasing learning outcomes.

Keywords: English teaching, fun learning, speaking ability, video

Procedia PDF Downloads 256
22756 HIV and AIDS in Kosovo, Stigma Persist!

Authors: Luljeta Gashi, Naser Ramadani, Zana Deva, Dafina Gexha-Bunjaku

Abstract:

The official HIV/AIDS data in Kosovo are based on HIV case reporting from health-care services, the blood transfusion system, and Voluntary Counselling and Testing centres. Between 1986 and 2014, 95 HIV and AIDS cases were reported, of which 49 were AIDS and 46 HIV, with 40 deaths. The majority (69%) of cases were men, the most affected age group was 25 to 34 (37%), and the routes of transmission were: heterosexual (90%), MSM (7%), vertical transmission (2%), and IDU (1%). Based on existing data and the UNAIDS classification system, Kosovo is currently still categorised as having a low-level HIV epidemic. Despite the low HIV prevalence, Kosovo faces a number of threatening factors, including an increased number of drug users, a stigmatized and discriminated-against MSM community, and a high percentage of youth in the general population (57% of the population is under the age of 25), with changing social norms, especially sexual ones. Methods: Data were collected using self-administered structured questionnaires amongst 249 high school students and analysed using the Statistical Package for the Social Sciences (SPSS). Results: The findings revealed that 68% of students know that HIV transmission can be reduced by having sex with only one uninfected partner who has no other partners, 94% know that the risk of getting HIV can be reduced by using a condom every time they have sex, 68% know that a person cannot get HIV from mosquito bites, 81% know that they cannot get HIV by sharing food with someone who is infected, and 46% know that a healthy-looking person can have HIV. Conclusions: Seventy-one percent of high school students correctly identify ways of preventing the sexual transmission of HIV and reject the major misconceptions about HIV transmission. The findings indicate a need for more health education and promotion.

Keywords: Kosovo, KPAR, HIV, high school

Procedia PDF Downloads 478
22755 Youth Involvement in Cybercrime in Nigeria: A Case Study of Ikeja Local Government Area

Authors: Niyi Adegoke, Saanumi Jimmy Omolou

Abstract:

The prevalence of youth involvement in cybercrime is alarming and calls for concern among government, parents, NGOs, and religious bodies; hence this paper examines youth involvement in cybercrime in Nigeria. Achievement motivation theory was used to explain the activities of cyber-criminals in Nigerian society. A descriptive survey method was adopted. The sample was one hundred and fifty (150) respondents randomly selected from the study population. A questionnaire was used to gather information from the respondents. Data collected through the questionnaire were analyzed using percentages for the respondents' bio-data, while chi-square was employed to test the hypotheses. Findings from the study reveal that parental negligence, unemployment, peer influence, and a quest for materialism were responsible for cybercrime in Nigeria. The study concludes with recommendations, among which are creating employment opportunities for the youth and ensuring good governance and accountability; these would go a long way toward solving the problem of cybercrime in our society.
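The chi-square test used above compares observed counts in a contingency table with the counts expected under independence. The sketch below uses invented counts, not the study's survey data, and the standard 5% critical value for one degree of freedom:

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = rows[i] * cols[j] / total  # independence expectation
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical 2x2 table: employment status vs. cybercrime involvement.
table = [[40, 10],   # unemployed: involved, not involved
         [25, 75]]   # employed:   involved, not involved
stat = chi_square(table)
print(round(stat, 2), stat > 3.841)  # 3.841 = 5% critical value for df = 1
```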

Keywords: cybercrime, youth, Nigeria, unemployment, information communication technology

Procedia PDF Downloads 228
22754 R Software for Parameter Estimation of Spatio-Temporal Model

Authors: Budi Nurani Ruchjana, Atje Setiawan Abdullah, I. Gede Nyoman Mindra Jaya, Eddy Hermawan

Abstract:

In this paper, we propose an application package to estimate the parameters of a spatiotemporal model based on multivariate time series analysis, using the open-source software R. We built the package mainly to estimate the parameters of the Generalized Space-Time Autoregressive (GSTAR) model. GSTAR is a combination of time series and spatial models whose parameters vary by location. We use the method of Ordinary Least Squares (OLS) for estimation and the Mean Absolute Percentage Error (MAPE) to assess how well the model fits real spatiotemporal phenomena. As case studies, we use oil production data from the volcanic layer at Jatibarang, Indonesia, and climate data such as rainfall in Indonesia. R is very user-friendly; it makes calculation easier and data processing faster and more accurate. A limitation is that the R script built for estimating the parameters of the spatiotemporal GSTAR model is still restricted to stationary time series models. The R program under Windows can therefore be developed further, for both theoretical studies and applications.
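The OLS-plus-MAPE workflow can be sketched compactly. The sketch below is illustrative, not the authors' R package: it simulates a two-location GSTAR(1;1) series (with a hypothetical uniform weight matrix and an added intercept), re-estimates the parameters per location by OLS, and reports the in-sample MAPE.

```python
import random

def ols(X, y):
    """Ordinary least squares via the normal equations (Gaussian elimination)."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)]
         + [sum(r[i] * yi for r, yi in zip(X, y))] for i in range(k)]
    for col in range(k):                      # forward elimination, partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    beta = [0.0] * k
    for i in reversed(range(k)):              # back substitution
        beta[i] = (A[i][k] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta

def mape(actual, fitted):
    """Mean Absolute Percentage Error, in percent."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, fitted)) / len(actual)

# Simulate a two-location GSTAR(1;1) series (hypothetical parameters; the
# intercept keeps the series positive so MAPE is well defined).
random.seed(1)
alpha, phi0, phi1 = [1.0, 1.2], [0.5, 0.4], [0.3, 0.2]
W = [[0.0, 1.0], [1.0, 0.0]]                  # spatial weight matrix
z = [[5.0, 5.0]]
for _ in range(2000):
    prev = z[-1]
    wz = [sum(W[i][j] * prev[j] for j in range(2)) for i in range(2)]
    z.append([alpha[i] + phi0[i] * prev[i] + phi1[i] * wz[i] + random.gauss(0, 0.3)
              for i in range(2)])

# Per-location OLS: regress z_t(i) on [1, z_{t-1}(i), (W z_{t-1})(i)].
coeffs, mapes = [], []
for i in range(2):
    X, y = [], []
    for t in range(1, len(z)):
        wz_i = sum(W[i][j] * z[t - 1][j] for j in range(2))
        X.append([1.0, z[t - 1][i], wz_i])
        y.append(z[t][i])
    a, p0, p1 = ols(X, y)
    coeffs.append((a, p0, p1))
    fitted = [a + p0 * row[1] + p1 * row[2] for row in X]
    mapes.append(mape(y, fitted))
    print(f"location {i}: phi0={p0:.2f} phi1={p1:.2f} MAPE={mapes[i]:.2f}%")
```

With stationary parameters, the per-location estimates recover the simulated autoregressive and spatial coefficients, which mirrors what the R package does at larger scale.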

Keywords: GSTAR Model, MAPE, OLS method, oil production, R software

Procedia PDF Downloads 242
22753 Using Corpora in Semantic Studies of English Adjectives

Authors: Oxana Lukoshus

Abstract:

The methods of corpus linguistics, a well-established field of research, are being increasingly applied in cognitive linguistics. Corpus data are especially useful for quantitative studies of grammatical and other aspects of language. The main objective of this paper is to demonstrate how present-day corpora can be applied in semantic studies in general and in semantic studies of adjectives in particular. Polysemantic adjectives have been the subject of numerous studies, but most of them have been carried out on dictionaries. Undoubtedly, dictionaries are viewed as one of the basic data sources, but only at the initial steps of a research project: the author usually starts with an analysis of the lexicographic data, after which s/he comes up with a hypothesis. In the research conducted, three polysemantic synonyms, true, loyal, and faithful, were analyzed in terms of the differences and similarities in their semantic structure. The corpus-based approach to the study of these adjectives involves the following. After the analysis of the dictionary data, the following corpora were consulted to study the distributional patterns of the words under study: the British National Corpus (BNC) and the Corpus of Contemporary American English (COCA). These corpora are continually updated and contain thousands of examples of the words under research, which makes them a useful and convenient data source. For the purpose of this study, there were no special requirements regarding the genre, mode, or time of the texts included in the corpora. Out of the range of possibilities offered by corpus-analysis software (e.g., word lists, word frequency statistics, etc.), the most useful tool for the semantic analysis was extracting a list of co-occurrences for the given search words. Searching by lemmas, e.g. true, true to, and grouping the results by lemmas proved to be the most efficient corpus features for the adjectives under study. Following the search process, the corpora provided a list of co-occurrences, which were then analyzed and classified. Not every co-occurrence was relevant for the analysis. For example, phrases like An enormous sense of responsibility to protect the minds and hearts of the faithful from incursions by the state was perceived to be the basic duty of the church leaders or ‘True,’ said Phoebe, ‘but I'd probably get to be a Union Official immediately were left out, as in the first example the faithful is a substantivized adjective and in the second example true is used alone, with no other parts of speech. The subsequent analysis of the corpus data provided the grounds for the distribution groups of the adjectives under study, which were then investigated with the help of a semantic experiment. To sum up, the corpus-based approach has proved to be a powerful, reliable, and convenient way to obtain data for further semantic study.
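The co-occurrence extraction step amounts to windowed collocate counting. The sketch below runs it on a toy corpus of invented sentences (not BNC/COCA material); full-scale corpus interfaces return the same kind of ranked list.

```python
from collections import Counter

def collocates(tokens, node, window=2):
    """Count words co-occurring with `node` within +/- `window` tokens."""
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == node:
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            counts.update(t for j, t in enumerate(tokens[lo:hi], start=lo) if j != i)
    return counts

# A toy corpus; real queries would run over the BNC or COCA at scale.
corpus = ("he remained a faithful friend and a loyal supporter "
          "a faithful servant stays true to his word "
          "her loyal friend gave a true account").split()

for adj in ("true", "loyal", "faithful"):
    print(adj, collocates(corpus, adj).most_common(3))
```

In a real study the tokens would be lemmatized first, and irrelevant hits (like substantivized the faithful) filtered out by part-of-speech tags before classification.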

Keywords: corpora, corpus-based approach, polysemantic adjectives, semantic studies

Procedia PDF Downloads 314
22752 An Improved Transmission Scheme in Cooperative Communication System

Authors: Seung-Jun Yu, Young-Min Ko, Hyoung-Kyu Song

Abstract:

Recently developed cooperative diversity schemes enable a terminal to obtain transmit diversity through the support of other terminals. However, most of the cooperative schemes introduced so far share a common fault of decreased transmission rate, because the destination must receive decodable compositions of symbols from both the source and the relay. In order to achieve a high data rate, we propose a cooperative scheme that employs hierarchical modulation. This scheme is free from the rate loss and allows seamless cooperative communication.
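To make the idea concrete: hierarchical modulation overlays a highly protected base stream and a refinement stream in one constellation, so the relay path can carry extra information without sacrificing the base rate. The sketch below is a generic hierarchical 16-QAM mapper and hard-decision demapper, not the authors' scheme; the hypothetical spacings d1 and d2 set the protection ratio between the two layers.

```python
def hqam_mod(bits, d1=2.0, d2=0.5):
    """Hierarchical 16-QAM: two base bits choose the quadrant, two refinement
    bits choose the point inside it; d1/d2 sets the protection ratio."""
    b0, b1, r0, r1 = bits
    i = (1 - 2 * b0) * d1 + (1 - 2 * r0) * d2   # bit 0 -> +, bit 1 -> -
    q = (1 - 2 * b1) * d1 + (1 - 2 * r1) * d2
    return complex(i, q)

def hqam_demod(sym, d1=2.0):
    """Hard-decision demapper: quadrant first, then offset from quadrant centre."""
    b0 = 0 if sym.real > 0 else 1
    b1 = 0 if sym.imag > 0 else 1
    r0 = 0 if sym.real - (1 - 2 * b0) * d1 > 0 else 1
    r1 = 0 if sym.imag - (1 - 2 * b1) * d1 > 0 else 1
    return (b0, b1, r0, r1)

bits = (1, 0, 1, 1)
sym = hqam_mod(bits)
print(sym, hqam_demod(sym))   # the refinement layer rides on the base layer
```

A weak receiver can still recover the base bits (quadrant decision only), while a receiver aided by the relay decodes the refinement bits as well.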

Keywords: cooperative communication, hierarchical modulation, high data rate, transmission scheme

Procedia PDF Downloads 426
22751 Time Series Analysis on the Production of Fruit Juice: A Case Study of National Horticultural Research Institute (Nihort) Ibadan, Oyo State

Authors: Abiodun Ayodele Sanyaolu

Abstract:

The research investigated, through time series analysis, the quarterly production of fruit juice at the National Horticultural Research Institute, Ibadan, from 2010 to 2018. A documentary method of data collection was used, and the methods of least squares and moving averages were used in the analysis. From the calculations and the graph, it was clear that there were increasing, decreasing, and uniform movements both in the graph of the original data and in the tabulated quarterly values. Time series analysis was used to detect the trend in fruit juice production, which appears favourable over the period, and the models used to forecast are the additive and multiplicative models. Since production of fruit juice was observed to be usually high in January of every year, it is strongly advised that the National Horticultural Research Institute make more provision for fruit juice storage outside this period of the year.
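The two techniques named above, a least-squares trend line and a moving average, can be sketched for quarterly data. The production figures below are invented placeholders, not the NIHORT series.

```python
def linear_trend(y):
    """Least-squares line y = a + b*t for equally spaced observations t = 0..n-1."""
    n = len(y)
    tbar = (n - 1) / 2
    ybar = sum(y) / n
    b = (sum((t - tbar) * (yi - ybar) for t, yi in enumerate(y))
         / sum((t - tbar) ** 2 for t in range(n)))
    return ybar - b * tbar, b

def centered_ma4(y):
    """Centered four-quarter (2x4) moving average, standard for quarterly series."""
    return [(0.5 * y[i - 2] + y[i - 1] + y[i] + y[i + 1] + 0.5 * y[i + 2]) / 4
            for i in range(2, len(y) - 2)]

# Hypothetical quarterly production figures standing in for the NIHORT data
# (note the first-quarter peak each year, echoing the January observation).
prod = [120, 95, 88, 110, 130, 101, 92, 118, 141, 108, 97, 125]
a, b = linear_trend(prod)
trend = [a + b * t for t in range(len(prod))]
smooth = centered_ma4(prod)
print(f"trend: {a:.1f} + {b:.2f}*t; smoothed: {[round(s, 1) for s in smooth]}")
```

The additive model treats the series as trend + seasonal + irregular; the multiplicative model as their product. The centered moving average removes the seasonal swing so the trend component can be read off.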

Keywords: fruit juice, least square, multiplicative models, time series

Procedia PDF Downloads 142
22750 Predicting the Solubility of Aromatic Waste Petroleum Paraffin Wax in Organic Solvents to Separate Ultra-Pure Phase Change Materials (PCMs) by Molecular Dynamics Simulation

Authors: Fathi Soliman

Abstract:

With the ultimate goal of developing the separation of n-paraffin as a phase change material (PCM) by means of molecular dynamics simulations, we attempt to predict the solubility of aromatic n-paraffin in two organic solvents: butyl acetate (BA) and methyl isobutyl ketone (MIBK). A simple model of an aromatic paraffin, 2-hexadecylanthracene, with an amorphous molecular structure and periodic boundary conditions was constructed. The results showed that MIBK is the better solvent for separating ultra-pure phase change materials, and this finding was consistent with experimental work on separating ultra-pure n-paraffin from waste petroleum aromatic paraffin wax; the separated n-paraffin was characterized by XRD, TGA, GC, and DSC. Moreover, the data revealed that the n-paraffin separated using MIBK performs better as a PCM than that separated using BA.

Keywords: molecular dynamics simulation, n-paraffin, organic solvents, phase change materials, solvent extraction

Procedia PDF Downloads 195
22749 Application of Association Rule Using Apriori Algorithm for Analysis of Industrial Accidents in 2013-2014 in Indonesia

Authors: Triano Nurhikmat

Abstract:

Along with the progress of science and technology, the industrialized world in Indonesia has developed very rapidly. This has accelerated the industrialization of Indonesian society, with the establishment of diverse companies and workplaces. Industrial development involves worker activity, and these work activities carry the possibility of an accident befalling either the workers or a construction project. The causes of industrial accidents include electrical damage, faulty work procedures, and technical errors. The association rule method is one of the main techniques in data mining and is the form most commonly used in finding patterns in data collections. This research seeks to discover the association relationships among occurrences of industrial accidents. Using association rule analysis, patterns were obtained from second-iteration itemsets (large 2-itemsets) pairing the factors of industrial accidents: for West Jakarta, industrial accidents caused by electrical damage occurred with support = 0.2 and confidence = 1, and the reverse pattern held with support = 0.2 and confidence = 0.75.
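The support and confidence figures above come from the standard Apriori computation. The sketch below is a minimal two-iteration Apriori on invented accident records, not the 2013-2014 dataset:

```python
from itertools import combinations

def apriori_pairs(transactions, min_support):
    """Frequent 1- and 2-itemsets by minimum support, then the pair rules
    with their confidence (a minimal Apriori, two iterations only)."""
    n = len(transactions)

    def support(items):
        return sum(items <= t for t in transactions) / n

    singletons = {frozenset([i]) for t in transactions for i in t}
    l1 = {s for s in singletons if support(s) >= min_support}
    # Candidate 2-itemsets are built only from frequent 1-itemsets (the Apriori step).
    l2 = {a | b for a, b in combinations(l1, 2) if support(a | b) >= min_support}
    rules = []
    for pair in l2:
        for ante in pair:
            cons = next(iter(pair - {ante}))
            rules.append((ante, cons, support(pair),
                          support(pair) / support(frozenset([ante]))))
    return rules

# Illustrative accident records (location, cause) -- not the study's data.
records = [frozenset(t) for t in [
    ("west_jakarta", "electrical"), ("west_jakarta", "electrical"),
    ("west_jakarta", "procedure"), ("bandung", "technique"),
    ("bandung", "electrical"), ("surabaya", "procedure"),
]]
for ante, cons, sup, conf in apriori_pairs(records, min_support=0.3):
    print(f"{ante} -> {cons}: support={sup:.2f} confidence={conf:.2f}")
```

Support is the share of records containing the itemset; confidence is the support of the pair divided by the support of the antecedent, exactly the quantities reported in the abstract.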

Keywords: association rule, data mining, industrial accidents, rules

Procedia PDF Downloads 299
22748 Automated Method Time Measurement System for Redesigning Dynamic Facility Layout

Authors: Salam Alzubaidi, G. Fantoni, F. Failli, M. Frosolini

Abstract:

The dynamic facility layout problem is a critical issue in the competitive industrial market; solving it requires robust design and effective simulation systems. Sustainable simulation requires reliable and accurate input data. This paper therefore describes an automated system, integrated into the real environment, that measures the duration of material handling operations, collects the data in real time, and determines the variances between the actual and estimated time schedules of the operations in order to update the simulation software and redesign the facility layout periodically. The automated method-time measurement system collects real data using Radio Frequency Identification (RFID) and Internet of Things (IoT) technologies: attaching an RFID antenna reader and RFID tags enables the system to identify the location of objects and gather timing data. The durations gathered are processed by calculating the moving average duration of the material handling operations, choosing the shortest material handling path, and then updating the simulation software to redesign the facility layout in line with the shortest (actual) operation schedule. Periodic simulation in real time is more sustainable and reliable than a simulation system that relies on analysis of historical data. The case study for this methodology was conducted in cooperation with a workshop team producing mechanical parts. Although there are some technical limitations, the methodology is promising and can be significantly useful in redesigning manufacturing layouts.
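The moving-average and variance-check step can be sketched as a small tracker. This is an illustrative sketch, not the paper's system: the class name, window size, tolerance, and the simulated tag-read timestamps are all hypothetical.

```python
from collections import defaultdict, deque

class OperationTimer:
    """Rolling mean of material-handling durations from (start, end) tag reads,
    flagging operations whose actual time drifts from the planned estimate."""

    def __init__(self, window=5, tolerance=0.10):
        self.tolerance = tolerance                       # allowed relative deviation
        self.history = defaultdict(lambda: deque(maxlen=window))

    def record(self, operation, start, end):
        """Store the duration of one completed operation (one RFID read pair)."""
        self.history[operation].append(end - start)

    def moving_average(self, operation):
        h = self.history[operation]
        return sum(h) / len(h)

    def needs_update(self, operation, estimate):
        """True when the rolling mean deviates from the estimate by more than
        the tolerance, i.e. when the simulation model should be refreshed."""
        return abs(self.moving_average(operation) - estimate) / estimate > self.tolerance

timer = OperationTimer()
# Simulated tag reads: the 'pallet_move' operation is slowing down over time.
for start, end in [(0, 40), (60, 102), (120, 165), (200, 248), (300, 352)]:
    timer.record("pallet_move", start, end)
print(timer.moving_average("pallet_move"))        # rolling mean of the reads
print(timer.needs_update("pallet_move", 40.0))    # drifted past the tolerance
```

When `needs_update` fires, the layout simulation would be re-run with the measured durations instead of the stale estimates.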

Keywords: dynamic facility layout problem, internet of things, method time measurement, radio frequency identification, simulation

Procedia PDF Downloads 120
22747 The Influence of the Form of Grain on the Mechanical Behaviour of Sand

Authors: Mohamed Boualem Salah

Abstract:

The size and shape of soil particles reflect the formation history of the grains. In turn, the macro-scale behavior of the soil mass results from particle-level interactions, which are affected by particle shape. Sphericity, roundness, and smoothness characterize different scales associated with particle shape. New experimental data and data from previously published studies are gathered into two databases to explore the effects of particle shape on packing as well as on the small- and large-strain properties of sandy soils. Data analysis shows that increased particle irregularity (angularity and/or eccentricity) leads to: an increase in emax and emin, a decrease in stiffness yet with increased sensitivity to the state of stress, an increase in compressibility under zero-lateral-strain loading, and an increase in the critical state friction angle φcs and intercept Γ with a weak effect on the slope λ. Particle shape therefore emerges as a significant soil index property that needs to be properly characterized and documented, particularly in clean sands and gravels. The systematic assessment of particle shape will lead to a better understanding of sand behavior.

Keywords: angularity, eccentricity, particle shape, soil behavior

Procedia PDF Downloads 413
22746 Teachers' Views on Father Involvement in Preschool Education Programs

Authors: Fatma Tezel Sahin, Zeynep Nur Aydin Kilic, Aysegul Akinci Cosgun

Abstract:

Family involvement activities hold a significant place in increasing success in preschool education and in sustaining that education. Both parents need to take part in family involvement activities. However, while mother involvement is readily obtained, father involvement tends to be neglected. For that reason, the current study aims to determine the views of teachers with regard to father involvement in preschool education programs. The working group of the study consisted of 23 preschool teachers, and the study is a descriptive survey. The data were obtained through individual interviews, using a "Teacher Interview Form" as the data collection instrument, and were analysed through the content analysis method. The data regarding the views of the teachers are given as frequency and percentage values. The great majority of the teachers stated that they were proficient in applying family involvement activities. They also pointed out that they held family meetings primarily to obtain family involvement, and that they then implemented involvement activities for parents both in and out of the class. They reported observing more mother involvement than father involvement in these activities. According to the teachers, fathers took part less than mothers because of fathers' working conditions and because involvement was regarded as a task for mothers. Based on these results, it is recommended that fathers be informed about involvement in family activities and that preschool education institutions provide applications and opportunities that encourage fathers to take part.

Keywords: preschool education, parent involvement, father involvement, teacher views

Procedia PDF Downloads 324
22745 Political Views and Information and Communication Technology (ICT) in Tertiary Institutions in Achieving the Millennium Development Goals (MDGs)

Authors: Perpetual Nwakaego Ibe

Abstract:

The Millennium Development Goals (MDGs) were an integrated project formed to eradicate many of the adverse situations in which citizens of third-world countries may find themselves. For the MDGs to be a sustainable project for the future, they depend entirely on the actions of governments, multilateral institutions, and civil society. This paper first examines political views on the MDGs and relates them to the current electoral situation around the country, underlining the drastic changes of the past few months. The second part of the paper presents ICT in tertiary institutions as one of the means of achieving the MDGs. ICT is vital in all phases of the educational process, and the development of cloud connectivity is an added advantage of Information and Communication Technology (ICT), allowing a common data bank to be shared for research purposes among UNICEF, RED CROSS, NPS, INEC, NMIC, and WHO. Finally, the paper concludes with areas that need tweaking and with recommendations for tertiary institutions committed to delivering this ambitious set of goals. A combination of observation and document materials for data gathering was employed as the methodology for carrying out this research.

Keywords: MDG, ICT, data bank, database

Procedia PDF Downloads 200
22744 Development of a Multi-User Country Specific Food Composition Table for Malawi

Authors: Averalda van Graan, Joelaine Chetty, Malory Links, Agness Mwangwela, Sitilitha Masangwi, Dalitso Chimwala, Shiban Ghosh, Elizabeth Marino-Costello

Abstract:

Food composition data is becoming increasingly important, as food insecurity and malnutrition in its persistent form of under-nutrition is now coupled with increasing over-nutrition and its related ailments in the developing world, of which Malawi is not spared. In the absence of a food composition database (FCDB) reflecting our dietary patterns, efforts were made to develop a country-specific FCDB for nutrition practice, research, and programming. The main objective was to develop a multi-user, country-specific food composition database and table from existing published and unpublished scientific literature. A multi-phased approach guided by the project framework was employed. Phase 1 comprised a scoping mission to assess the nutrition landscape for compilation activities. Phase 2 involved training a compiler and collecting data from various sources, primarily institutional libraries, online databases, and food industry nutrient data. Phase 3 subsumed the evaluation and compilation of data using FAO and INFOODS standards and guidelines. Phase 4 concluded the process with quality assurance. In total, 316 Malawian food items, categorized into eight food groups, were captured for 42 components. The majority were from the baby food group (27%), followed by the staple (22%) and animal (22%) food groups. Fats and oils contained the fewest food items (2%), followed by fruits (6%). Proximate values are well represented; however, the share of missing data is large for some components, including Se (68%), I (75%), vitamin A (42%), and the lipid profile: saturated fat (53%), mono-unsaturated fat (59%), poly-unsaturated fat (59%), and cholesterol (56%). The multi-phased approach following the project framework led to the development of the first Malawian FCDB and table. The table reflects inherent Malawian dietary patterns and nutritional concerns, and the FCDB can be used by various professionals in nutrition and health. Rising over-nutrition, non-communicable diseases, and changing diets challenge us to provide nutrient profiles of processed foods and complete lipid profiles.
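A compiler's missing-data audit like the one reported above can be sketched as below. The four-item food table and the component names are hypothetical stand-ins for the 316-item database; only the calculation itself is of interest.

```python
def percent_missing(foods, component):
    """Share of food items (in %) with no value for the given component."""
    missing = sum(1 for food in foods if food.get(component) is None)
    return 100.0 * missing / len(foods)

# Hypothetical records: None marks a value absent from all consulted sources.
foods = [
    {"name": "maize flour", "protein_g": 8.1,  "selenium_ug": None},
    {"name": "dried fish",  "protein_g": 62.0, "selenium_ug": None},
    {"name": "mango",       "protein_g": 0.8,  "selenium_ug": 0.6},
    {"name": "groundnut",   "protein_g": 25.7, "selenium_ug": None},
]

print(percent_missing(foods, "selenium_ug"))  # 75.0
print(percent_missing(foods, "protein_g"))    # 0.0
```

Running such a check per component is what yields the coverage figures (e.g., Se 68%, I 75%) quoted in the abstract.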

Keywords: analytical data, dietary pattern, food composition data, multi-phased approach

Procedia PDF Downloads 93
22743 Educational Leadership and Artificial Intelligence

Authors: Sultan Ghaleb Aldaihani

Abstract:

1. The Changing Environment of Educational Leadership:
- The environment in which educational leadership takes place is becoming increasingly complex due to factors such as globalization and rapid technological change.
- This creates a "leadership gap," in which the complexity of the environment outpaces the ability of leaders to respond effectively.
- Educational leadership involves guiding teachers and the broader school system towards improved student learning and achievement.
2. Implications of Artificial Intelligence (AI) for Educational Leadership:
- AI has great potential to enhance education, for example through intelligent tutoring systems and by automating routine tasks to free up teachers.
- AI can also have significant implications for educational leadership by providing better information and data-driven decision-making capabilities.
- Computer-adaptive testing can provide detailed, individualized data on student learning that leaders can use for instructional decisions and accountability.
3. Enhancing Decision-Making Processes:
- Statistical models and data mining techniques can help identify at-risk students earlier, allowing for targeted interventions.
- Probability-based models can diagnose students likely to drop out, enabling proactive support.
- These data-driven approaches can make resource allocation and decision-making more effective.
4. Improving Efficiency and Productivity:
- AI systems can automate tasks and change processes to improve the efficiency of educational leadership and administration.
- Integrating AI can free leaders to focus more on the human, interactive elements of their role.
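As a minimal sketch of the probability-based dropout models mentioned above, the logistic scoring below uses made-up feature weights; a real system would fit the coefficients (e.g., by logistic regression) on historical records of students who did or did not drop out.

```python
import math

# Hypothetical coefficients; in practice these would be estimated from data.
WEIGHTS = {"absences": 0.15, "gpa": -1.2, "failed_courses": 0.8}
BIAS = 0.5

def dropout_probability(student):
    """Logistic model: probability in (0, 1) that a student drops out."""
    z = BIAS + sum(w * student[feature] for feature, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

def flag_at_risk(students, threshold=0.5):
    """IDs of students whose predicted dropout risk crosses the threshold."""
    return [s["id"] for s in students if dropout_probability(s) >= threshold]

students = [
    {"id": "s1", "absences": 2,  "gpa": 3.6, "failed_courses": 0},
    {"id": "s2", "absences": 25, "gpa": 1.8, "failed_courses": 3},
]
print(flag_at_risk(students))  # ['s2']
```

The threshold is a policy choice for the school leader: lowering it flags more students for proactive support at the cost of more false alarms.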

Keywords: education, leadership, technology, artificial intelligence

Procedia PDF Downloads 43
22742 Identification of CLV for Online Shoppers Using RFM Matrix: A Case Based on Features of B2C Architecture

Authors: Riktesh Srivastava

Abstract:

Online shopping has undergone an astonishing evolution in the last few years, and it is now apparent that the B2C architecture is becoming a progressively imperative channel even for traditional brick-and-mortar traders. In this competitive setting, knowing customers and predicting their behavior are extremely important. More importantly, when a customer logs onto the B2C architecture, the traces of their buying patterns can be stored and used for future predictions. Such a prediction is called Customer Lifetime Value (CLV). Earlier, Net Present Value was used for this purpose; however, it ignores two important aspects of the B2C architecture: market risks and the large volume of customer data. Here, RFM (Recency, Frequency, and Monetary value) is used to estimate the CLV, and, as the term exemplifies, market risk is well covered. Big data analysis is also accommodated in RFM, which allows a real exploration of the data and leads to a better estimation of future cash flow from customers. In the present paper, six factors (collected from varied sources) are used to determine what attracts customers to the B2C architecture. For these six factors, RFM is computed for three years (2013, 2014, and 2015). CLV and revenue are the two parameters defined using the RFM analysis, which gives a clear picture of the future predictions.
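The RFM computation itself can be sketched as below. The order records and the reference date are hypothetical, and a full analysis would additionally bin each measure into scores (e.g., quintiles) before estimating CLV and revenue.

```python
from datetime import date

# Hypothetical order log: (customer_id, order_date, amount).
orders = [
    ("c1", date(2015, 1, 10), 120.0),
    ("c1", date(2015, 11, 2), 80.0),
    ("c2", date(2014, 6, 30), 300.0),
    ("c2", date(2015, 12, 20), 45.0),
    ("c2", date(2015, 12, 28), 60.0),
]

def rfm(orders, today):
    """Per customer: (recency in days, frequency, total monetary value)."""
    stats = {}
    for cid, day, amount in orders:
        last, freq, money = stats.get(cid, (date.min, 0, 0.0))
        stats[cid] = (max(last, day), freq + 1, money + amount)
    return {cid: ((today - last).days, freq, money)
            for cid, (last, freq, money) in stats.items()}

print(rfm(orders, today=date(2015, 12, 31)))
# {'c1': (59, 2, 200.0), 'c2': (3, 3, 405.0)}
```

Recency uses the most recent purchase date, so it captures the market-risk aspect the abstract mentions: a customer who bought long ago is a less certain source of future cash flow than one who bought last week.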

Keywords: CLV, RFM, revenue, recency, frequency, monetary value

Procedia PDF Downloads 220
22741 Towards a Quantification of the Wind Erosion of the Gharb Shoreline Soils in Morocco by the Application of a Mathematical Model

Authors: Mohammed Kachtali, Imad Fenjiro, Jamal Alkarkouri

Abstract:

Wind erosion is a serious environmental problem in arid and semi-arid regions. It easily removes the finest particles from the soil surface, which contributes to the loss of soil fertility. The silting of infrastructure and cultivated areas and the negative impact on health are additional consequences of wind erosion. In Morocco, wind erosion constitutes the main factor of silting along the coast and in the Sahara. The aim of our study is to apply a wind erosion equation in order to estimate soil losses by wind erosion along the Gharb coast (northern Morocco). The equation used in our model combines geographic data, 30 years of climatic data, and edaphic data collected from the study area, which comprised 11 transects across 4 stations. Our results show that wind erosion values are high and differ significantly between some transects (p < 0.001). This difference is explained by topography, soil texture, and climate. In conclusion, wind erosion is high along the Gharb coast and varies from one station to another; this problem requires several methods of control and mitigation.

Keywords: Gharb coast, modeling, silting, wind erosion

Procedia PDF Downloads 137
22740 Oil Reservoir Asphaltene Precipitation Estimation during CO2 Injection

Authors: I. Alhajri, G. Zahedi, R. Alazmi, A. Akbari

Abstract:

In this paper, an Artificial Neural Network (ANN) was developed to predict Asphaltene Precipitation (AP) during the injection of carbon dioxide into crude oil reservoirs. Experimental data from six different oil fields were collected. Seventy percent of the data was used to develop the ANN model, and different ANN architectures were examined. A network trained with the trainlm (Levenberg-Marquardt) algorithm was found to be the best network for estimating the AP. To check the validity of the proposed model, it was used to predict the AP for the remaining thirty percent of the data, which had not been used in training. The Mean Square Error (MSE) of the prediction was 0.0018, which confirms the excellent predictive capability of the proposed model. In the second part of this study, the ANN model predictions were compared with those of the modified Hirschberg model, and the ANN was found to provide more accurate estimates. Finally, the proposed model was employed to examine the effect of different operating parameters on the AP during gas injection. It was found that the AP is most sensitive to the reservoir temperature, and that increasing the carbon dioxide concentration in the liquid phase increases the AP.
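The 70/30 evaluation protocol described above can be sketched as follows. The split ratio follows the abstract; the random seed and the placeholder samples are illustrative, and the ANN itself (here omitted) would be trained on the first set and scored with MSE on the second.

```python
import random

def train_test_split(data, train_frac=0.7, seed=0):
    """Shuffle the samples and split them into training and held-out sets."""
    rng = random.Random(seed)
    shuffled = list(data)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]

def mse(y_true, y_pred):
    """Mean Square Error between measured and predicted values."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

samples = list(range(10))               # stand-in for the field measurements
train, test = train_test_split(samples)
print(len(train), len(test))            # 7 3
print(mse([0.10, 0.20], [0.13, 0.23]))  # ~0.0009
```

Scoring only on the held-out 30% is what makes the reported MSE of 0.0018 evidence of generalization rather than memorization.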

Keywords: artificial neural network, asphaltene, CO2 injection, Hirschberg model, oil reservoirs

Procedia PDF Downloads 364
22739 Unearthing Air Traffic Control Officers' Decision Instructional Patterns from Simulator Data for Application in Human-Machine Teams

Authors: Zainuddin Zakaria, Sun Woh Lye

Abstract:

Despite continuous advancements in automated conflict resolution tools, the rate of adoption of automation among Air Traffic Control Officers (ATCOs) remains low. Trust in, or acceptance of, these tools and conformance to individual ATCO preferences in strategy execution for conflict resolution are two key factors that affect their use. This paper proposes a methodology to unearth and classify ATCO conflict resolution strategies from simulator data of trained and qualified ATCOs. The methodology involves extracting ATCO executive control actions and establishing a strategy resolution classification system based on ATCO radar commands and the prevailing flight parameters when deconflicting a pair of aircraft. Six main strategies used to handle various categories of conflict were identified and discussed. It was found that ATCOs were about twice as likely to choose only vertical maneuvers in conflict resolution as horizontal maneuvers or a combination of both.
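One simple way to bucket resolutions into the vertical/horizontal/combined categories discussed above is sketched below. The command vocabulary is an assumption for illustration, not the authors' classification system, which also uses prevailing flight parameters.

```python
# Hypothetical radar-command vocabulary for the sketch.
VERTICAL = {"climb", "descend"}
HORIZONTAL = {"heading", "direct_to"}

def classify_resolution(commands):
    """Label a conflict resolution by the kinds of radar commands issued."""
    kinds = {c["type"] for c in commands}
    used_vertical = bool(kinds & VERTICAL)
    used_horizontal = bool(kinds & HORIZONTAL)
    if used_vertical and used_horizontal:
        return "combined"
    if used_vertical:
        return "vertical"
    if used_horizontal:
        return "horizontal"
    return "other"

resolution = [{"type": "climb", "value": 2000},
              {"type": "heading", "value": 30}]
print(classify_resolution(resolution))  # combined
```

Aggregating these labels over many simulator runs is what yields frequency statements like the roughly two-to-one preference for vertical-only maneuvers reported above.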

Keywords: air traffic control strategies, conflict resolution, simulator data, strategy classification system

Procedia PDF Downloads 148
22738 Spectral Re-Evaluation of the Magnetic Basement Depth over Yola Arm of Upper Benue Trough Nigeria Using Aeromagnetic Data

Authors: Emberga Terhemb, Opara Alexander, Selemo Alexander, Onyekwuru Samuel

Abstract:

Aeromagnetic data have been used to re-evaluate parts of the Upper Benue Trough, Nigeria, using the spectral analysis technique in order to appraise the mineral accumulation potential of the area. The regional field was separated with a first-order polynomial using the polyfit program. The residual data were subdivided into 24 spectral blocks using the OASIS MONTAJ software package. Two prominent magnetic depth source layers were identified. The deeper source depths range from 1.56 km to 2.92 km, with an average of 2.37 km, taken as the magnetic basement depth, while the shallower source depths range from -1.17 km to 0.98 km, with an average of 0.55 km. The shallow source is attributed to the volcanic rocks that intruded the sedimentary formation, which could be responsible for the mineralization found in parts of the study area.
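The per-block depth estimate in this kind of spectral analysis follows from the slope of the radially averaged log power spectrum, which for an ensemble of sources at depth h falls off as ln E(k) ≈ const - 4πhk (k in cycles per km). The least-squares sketch below uses a synthetic spectrum, not the survey data, to illustrate recovering the 2.37 km average basement depth.

```python
import math

def depth_from_spectrum(wavenumbers, log_power):
    """Source depth h from the slope of ln E(k) vs k, where slope = -4*pi*h."""
    n = len(wavenumbers)
    k_mean = sum(wavenumbers) / n
    p_mean = sum(log_power) / n
    slope = (sum((k - k_mean) * (p - p_mean)
                 for k, p in zip(wavenumbers, log_power))
             / sum((k - k_mean) ** 2 for k in wavenumbers))
    return -slope / (4 * math.pi)

# Synthetic spectrum for a source at 2.37 km (the mean basement depth above).
ks = [0.05 * i for i in range(1, 11)]             # wavenumbers, cycles/km
lnE = [3.0 - 4 * math.pi * 2.37 * k for k in ks]  # noise-free log power
print(round(depth_from_spectrum(ks, lnE), 2))     # 2.37
```

On real data the spectrum shows two slope segments, one per source layer; fitting each segment separately is what separates the basement depth from the shallow volcanic sources.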

Keywords: spectral analysis, Upper Benue Trough, magnetic basement depth, aeromagnetic

Procedia PDF Downloads 451
22737 Detect Circles in Image: Using Statistical Image Analysis

Authors: Fathi M. O. Hamed, Salma F. Elkofhaifee

Abstract:

The aim of this work is to detect geometrical objects in an image; here, the object is considered to be circular in shape. Identification requires finding three characteristics: the number, size, and location of the objects. To achieve this goal, the paper presents an algorithm that combines statistical approaches with image analysis techniques. The algorithm was implemented, evaluated using simulated data, where it yielded good results, and then applied to real data.
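A circular Hough-style voting scheme is one standard way to locate a circle from edge pixels, and it can stand in here as a minimal sketch of the detection step; the assumption that the radius is known and the edge points are already segmented is a simplification of the paper's statistical pipeline, which also estimates the number and size of objects.

```python
import math
from collections import Counter

def hough_circle_center(edge_points, radius, n_angles=72):
    """Each edge point lies on a circle of the given radius around the true
    center, so it casts votes along that locus; the top-voted cell wins."""
    votes = Counter()
    for x, y in edge_points:
        for i in range(n_angles):
            t = 2 * math.pi * i / n_angles
            cx = round(x - radius * math.cos(t))
            cy = round(y - radius * math.sin(t))
            votes[(cx, cy)] += 1
    return votes.most_common(1)[0][0]

# Simulated edge pixels on a circle of radius 5 centred at (10, 10).
edge = [(round(10 + 5 * math.cos(a)), round(10 + 5 * math.sin(a)))
        for a in [2 * math.pi * i / 36 for i in range(36)]]
print(hough_circle_center(edge, radius=5))  # at or adjacent to (10, 10)
```

Scanning over a range of radii and keeping every accumulator peak above a threshold extends the same idea to finding the number and size of the circles, which is the scale-space/threshold combination named in the keywords.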

Keywords: image processing, median filter, projection, scale-space, segmentation, threshold

Procedia PDF Downloads 432