Search results for: motion data acquisition
22706 Effect of Freight Transport Intensity on Firm Performance: Mediating Role of Operational Capability
Authors: Bonaventure Naab Dery, Abdul Muntaka Samad
Abstract:
During the past two decades, huge population growth has been recorded in developing countries. This has led to an increase in the demand for transport services for both people and merchandise. The study sought to examine the effect of freight transport intensity on firm performance. Among others, it sought to examine the link between freight transport intensity and firm performance, the link between operational capability and firm performance, and the mediating role of operational capability in the relationship between freight transport intensity and firm performance. The study used a descriptive research design and a quantitative research approach. Questionnaires were used for data collection through snowball sampling and purposive sampling. SPSS and Mplus are used to analyze the data. It is anticipated that the analysis will validate the hypotheses proposed by the researchers. Based on the findings, relevant recommendations will be made regarding managerial implications and future studies.
Keywords: freight transport intensity, freight economy transport intensity, freight efficiency transport intensity, operational capability, firm performance
Procedia PDF Downloads 150
22705 A Descriptive Study of the Characteristics of Introductory Accounting Courses Offered by Community Colleges
Authors: Jonathan Nash, Allen Hartt, Catherine Plante
Abstract:
In many nations, community colleges, or similar institutions, play a crucial role in higher education. For example, in the United States more than half of all undergraduate students enroll in a community college at some point during their academic career. Similar statistics have been reported for Australia and Canada. Recognizing the important role these institutions play in educating future accountants, the American Accounting Association has called for research that contributes to a better understanding of these members of the academic community. Although previous literature has shown that community colleges and four-year institutions differ on many levels, the extant literature has provided data on the characteristics of introductory accounting courses for four-year institutions but not for community colleges. We fill a void in the literature by providing data on the characteristics of introductory accounting courses offered by community colleges in the United States. Data are collected on several dimensions, including: course size and staffing, pedagogical orientation, standardization of course elements, textbook selection, and use of technology-based course management tools. Many of these dimensions have been used in previous research examining four-year institutions, thereby facilitating comparisons. The resulting data should be of interest to instructors, regulators and administrators, researchers, and the accounting profession. The data provide information on the introductory accounting courses completed by the average community college student, which can help instructors identify areas where transfer students’ experiences might differ from those of their contemporaries at four-year colleges. Regulators and administrators may be interested in the differences between accounting courses offered by two- and four-year institutions when implementing standardized transfer programs. Researchers might use the data to motivate future research into whether differences between two- and four-year institutions affect outcomes like the probability of students choosing to major in accounting and their performance within the major. Accounting professionals may use our findings as a springboard for facilitating discussions related to the accounting labor supply.
Keywords: accounting curricula, community college, descriptive study, introductory accounting
Procedia PDF Downloads 104
22704 Marketing Mixed Factors Affecting Commercial Transactions Expectations through Social Networks
Authors: Ladaporn Pithuk
Abstract:
This study aims to investigate the marketing mixed factors that affect expectations about commercial transactions through social networks. The research used a quantitative approach; data were collected through questionnaires administered to 400 respondents with experience in trading over the internet, selected by purposive sampling. Data were analyzed using descriptive statistics, including percentage, mean and standard deviation, and quality function deployment was used for hypothesis testing. The findings show that the most significant interrelationships between the marketing mixed factors and expectations about commercial transactions through social networks involve product and place: the relationship between product and place (location) is involved in almost everything that makes the site a model that meets the needs of its visitors. Price, promotion, privacy, personalization and the provision of a technical process make operations more efficient and reduce confusion, duplication and delays in data transmission, including the creation of differentiated elements in products and services.
Keywords: commercial transactions expectations, marketing mixed factors, social networks, consumer behavior
Procedia PDF Downloads 240
22703 Future Housing Energy Efficiency Associated with the Auckland Unitary Plan
Authors: Bin Su
Abstract:
The draft Auckland Unitary Plan outlines the future land use for new housing and businesses as Auckland's population grows over the next thirty years. According to the Auckland Unitary Plan, over the next 30 years the population of Auckland is projected to increase by one million, and up to 70% of total new dwellings will occur within the existing urban area. Intensification will not only increase the number of medium- or higher-density houses, such as terraced houses and apartment buildings, within the existing urban area but also change the mean housing design data that can impact building thermal performance under the local climate. Based on the mean energy consumption and building design data of a number of Auckland sample houses, and the relationships between them, this study estimates the future mean housing energy consumption associated with the change in mean housing design data and evaluates housing energy efficiency under the Auckland Unitary Plan.
Keywords: Auckland Unitary Plan, building thermal design, housing design, housing energy efficiency
Procedia PDF Downloads 389
22702 The Use of Video in Increasing Speaking Ability of the First Year Students of SMAN 12 Pekanbaru in the Academic Year 2011/2012
Authors: Elvira Wahyuni
Abstract:
This study is a classroom action research. The general objective of this study was to find out students’ speaking ability through teaching English by using video and to find out the effectiveness of using video in teaching English to improve students’ speaking ability. The subjects of this study were 34 first-year students of SMAN 12 Pekanbaru who were learning English as a foreign language (EFL). Students were given a pre-test before the treatment and a post-test after the treatment. Quantitative data were collected by using a speaking test requiring the students to respond to recorded questions. Qualitative data were collected through observation sheets and field notes. The research findings reveal that there is a significant improvement in the students’ speaking ability through the use of video in speaking class. The qualitative data gave a description of and additional information about the learning process undertaken by the students. The research findings indicate that the use of video in teaching and learning is effective in increasing learning outcomes.
Keywords: English teaching, fun learning, speaking ability, video
Procedia PDF Downloads 258
22701 HIV and AIDS in Kosovo, Stigma Persists!
Authors: Luljeta Gashi, Naser Ramadani, Zana Deva, Dafina Gexha-Bunjaku
Abstract:
The official HIV/AIDS data in Kosovo are based on HIV case reporting from health-care services, the blood transfusion system and Voluntary Counselling and Testing centres. Between 1986 and 2014, 95 HIV and AIDS cases were reported, of which 49 were AIDS and 46 HIV, with 40 deaths. The majority (69%) of cases were men, the most affected age group was 25 to 34 (37%), and the routes of transmission were: heterosexual (90%), MSM (7%), vertical transmission (2%) and IDU (1%). Based on existing data and the UNAIDS classification system, Kosovo is currently still categorised as having a low-level HIV epidemic. Even with a low HIV prevalence, Kosovo faces a number of threatening factors, including an increasing number of drug users, a stigmatized and discriminated MSM community, and a high percentage of youth in the general population (57% of the population under the age of 25), with changing social norms, especially sexual ones. Methods: Data collection was done using self-administered structured questionnaires amongst 249 high school students. Data were analysed using the Statistical Package for the Social Sciences (SPSS). Results: The findings revealed that 68% of students know that HIV transmission can be reduced by having sex with only one uninfected partner who has no other partners, 94% know that the risk of getting HIV can be reduced by using a condom every time they have sex, 68% know that a person cannot get HIV from mosquito bites, 81% know that they cannot get HIV by sharing food with someone who is infected and 46% know that a healthy-looking person can have HIV. Conclusions: Seventy-one percent of high school students correctly identify ways of preventing the sexual transmission of HIV and reject the major misconceptions about HIV transmission. The findings of the study indicate a need for more health education and promotion.
Keywords: Kosovo, KPAR, HIV, high school
Procedia PDF Downloads 483
22700 Youth Involvement in Cybercrime in Nigeria: A Case Study of Ikeja Local Government Area
Authors: Niyi Adegoke, Saanumi Jimmy Omolou
Abstract:
The prevalence of youth involvement in cybercrime is alarming, which calls for concern among the government, parents, NGOs and religious bodies; hence this paper aims at examining youth involvement in cybercrime in Nigeria. Achievement motivation theory was used to explain the activities of cyber-criminals in Nigerian society. A descriptive survey method was adopted for the study. The sample for the study was one hundred and fifty (150) respondents randomly selected from the population of the study. A questionnaire was used to gather information and data from the respondents. Data collected through the questionnaire were analyzed using percentages for the respondents’ bio-data, while chi-square was employed to test the hypotheses. Findings from the study revealed that parental negligence, unemployment, peer influence, and the quest for materialism were responsible for cyber-crimes in Nigeria. The study concludes with recommendations, among which are that creating employment opportunities for the youth and ensuring good governance and accountability will go a long way towards solving the problem of cybercrime in our society.
Keywords: cybercrime, youth, Nigeria, unemployment, information communication technology
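As an illustration of the kind of chi-square hypothesis test mentioned above, the following sketch runs the test on a purely hypothetical contingency table; the variable names, counts and 0.05 significance level are assumptions, not the study's data.

```python
# Minimal sketch of a chi-square test of association, assuming a
# hypothetical 2x2 table of employment status vs. cybercrime involvement.
from scipy.stats import chi2_contingency

# Rows: employed / unemployed; columns: involved / not involved (made-up counts).
table = [[12, 58],
         [41, 39]]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject the null hypothesis: the variables are associated.")
else:
    print("Fail to reject the null hypothesis.")
```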
Procedia PDF Downloads 232
22699 R Software for Parameter Estimation of Spatio-Temporal Model
Authors: Budi Nurani Ruchjana, Atje Setiawan Abdullah, I. Gede Nyoman Mindra Jaya, Eddy Hermawan
Abstract:
In this paper, we propose an application package to estimate the parameters of a spatiotemporal model based on multivariate time series analysis using the open-source R software. We build packages mainly to estimate the parameters of the Generalized Space Time Autoregressive (GSTAR) model. GSTAR is a combination of time series and spatial models whose parameters vary by location. We use the method of Ordinary Least Squares (OLS) and the Mean Absolute Percentage Error (MAPE) to fit the model to real spatiotemporal phenomena. As case studies, we use oil production data from the volcanic layer at Jatibarang, Indonesia, and climate data such as rainfall in Indonesia. The R software is very user-friendly; it makes calculations easier and data processing more accurate and faster. A limitation is that the R script built for estimating the parameters of the spatiotemporal GSTAR model is still restricted to stationary time series models. Therefore, the R program under Windows can be further developed for both theoretical studies and applications.
Keywords: GSTAR Model, MAPE, OLS method, oil production, R software
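The abstract refers to the authors' R package; as a language-neutral illustration of the same idea, the sketch below estimates a simple GSTAR(1;1)-style model by location-wise OLS and reports MAPE. The spatial weight matrix, synthetic series and variable names are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch: location-wise OLS estimation of a GSTAR(1;1)-type model
#   z_i(t) = phi0_i * z_i(t-1) + phi1_i * [W z(t-1)]_i + e_i(t)
# followed by a MAPE goodness-of-fit check. The data and W are made up.
import numpy as np

rng = np.random.default_rng(0)
T, N = 60, 3                      # 60 time points, 3 locations (assumed)
Z = np.cumsum(rng.normal(1.0, 0.3, size=(T, N)), axis=0)   # synthetic series
W = np.array([[0.0, 0.5, 0.5],    # assumed row-normalised spatial weights
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

Z_lag = Z[:-1]                    # z(t-1)
Z_now = Z[1:]                     # z(t)
WZ_lag = Z_lag @ W.T              # spatially weighted lag

params, fitted = [], np.zeros_like(Z_now)
for i in range(N):
    X = np.column_stack([Z_lag[:, i], WZ_lag[:, i]])
    beta, *_ = np.linalg.lstsq(X, Z_now[:, i], rcond=None)  # OLS per location
    params.append(beta)
    fitted[:, i] = X @ beta

mape = np.mean(np.abs((Z_now - fitted) / Z_now)) * 100
print("phi0, phi1 per location:", np.round(params, 3))
print(f"MAPE = {mape:.2f}%")
```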
Procedia PDF Downloads 246
22698 Using Corpora in Semantic Studies of English Adjectives
Authors: Oxana Lukoshus
Abstract:
The methods of corpus linguistics, a well-established field of research, are being increasingly applied in cognitive linguistics. Corpus data are especially useful for different quantitative studies of grammatical and other aspects of language. The main objective of this paper is to demonstrate how present-day corpora can be applied in semantic studies in general and in semantic studies of adjectives in particular. Polysemantic adjectives have been the subject of numerous studies, but most of them have been carried out on dictionaries. Undoubtedly, dictionaries are viewed as one of the basic data sources, but only at the initial steps of a research project. The author usually starts with the analysis of the lexicographic data, after which s/he comes up with a hypothesis. In the research conducted, three polysemantic synonyms, true, loyal and faithful, have been analyzed in terms of differences and similarities in their semantic structure. A corpus-based approach to the study of the above-mentioned adjectives involves the following. After the analysis of the dictionary data, reference was made to the following corpora to study the distributional patterns of the words under study: the British National Corpus (BNC) and the Corpus of Contemporary American English (COCA). These corpora are continually updated and contain thousands of examples of the words under research, which makes them a useful and convenient data source. For the purpose of this study there were no special requirements regarding genre, mode or time of the texts included in the corpora. Out of the range of possibilities offered by corpus-analysis software (e.g. word lists, statistics of word frequencies, etc.), the most useful tool for the semantic analysis was extracting a list of co-occurrences for the given search words. Searching by lemmas (e.g. true, true to) and grouping the results by lemmas have proved to be the most efficient corpus features for the adjectives under study. Following the search process, the corpora provided a list of co-occurrences, which were then analyzed and classified. Not every co-occurrence was relevant for the analysis. For example, phrases like ‘An enormous sense of responsibility to protect the minds and hearts of the faithful from incursions by the state was perceived to be the basic duty of the church leaders’ or ‘True,’ said Phoebe, ‘but I'd probably get to be a Union Official immediately’ were left out, as in the first example the faithful is a substantivized adjective and in the second example true is used alone with no other parts of speech. The subsequent analysis of the corpus data gave the grounds for the distribution groups of the adjectives under study, which were then investigated with the help of a semantic experiment. To sum up, the corpus-based approach has proved to be a powerful, reliable and convenient tool for obtaining data for further semantic study.
Keywords: corpora, corpus-based approach, polysemantic adjectives, semantic studies
Procedia PDF Downloads 316
22697 An Improved Transmission Scheme in Cooperative Communication System
Authors: Seung-Jun Yu, Young-Min Ko, Hyoung-Kyu Song
Abstract:
Recently developed cooperative diversity schemes enable a terminal to obtain transmit diversity through the support of other terminals. However, most of the introduced cooperative schemes share a common drawback of decreased transmission rate, because the destination should receive decodable compositions of symbols from both the source and the relay. In order to achieve a high data rate, we propose a cooperative scheme that employs hierarchical modulation. This scheme is free from the rate loss and allows seamless cooperative communication.
Keywords: cooperative communication, hierarchical modulation, high data rate, transmission scheme
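For readers unfamiliar with hierarchical modulation, the following sketch maps a high-priority (base) bit pair and a low-priority (enhancement) bit pair onto a non-uniform 16-QAM constellation point; the priority distances d1 and d2 and the function name are illustrative assumptions, not the specific scheme proposed in the paper.

```python
# Minimal sketch of hierarchical (non-uniform) 16-QAM mapping.
# Two high-priority bits select the quadrant; two low-priority bits
# select the offset inside that quadrant. d1 > d2 protects the base layer.
def hierarchical_16qam(b_hp, b_lp, d1=2.0, d2=0.5):
    """b_hp, b_lp: 2-bit tuples of 0/1. Returns a complex constellation point."""
    i = (2 * b_hp[0] - 1) * d1 + (2 * b_lp[0] - 1) * d2
    q = (2 * b_hp[1] - 1) * d1 + (2 * b_lp[1] - 1) * d2
    return complex(i, q)

# Example: base bits (1, 0), enhancement bits (0, 1)
symbol = hierarchical_16qam((1, 0), (0, 1))
print(symbol)   # (1.5-1.5j): quadrant fixed by the base bits, fine position by the enhancement bits
```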
Procedia PDF Downloads 429
22696 Time Series Analysis on the Production of Fruit Juice: A Case Study of National Horticultural Research Institute (Nihort) Ibadan, Oyo State
Authors: Abiodun Ayodele Sanyaolu
Abstract:
The research was carried out to investigate, through time series analysis, the quarterly production of fruit juice at the National Horticultural Research Institute, Ibadan, from 2010 to 2018. A documentary method of data collection was used, and the methods of least squares and moving averages were used in the analysis. From the calculations and the graph, it was clear that there were increasing, decreasing and uniform movements both in the graph of the original data and in the tabulated quarterly values of the original data. Time series analysis was used to detect the trend in fruit juice production, which appears to be good over the period of time studied, and the methods used to forecast are the additive and multiplicative models. Since it was observed that the production of fruit juice is usually highest in January of every year, it is strongly advised that the National Horticultural Research Institute should make more provision for fruit juice storage outside this period of the year.
Keywords: fruit juice, least square, multiplicative models, time series
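To make the least-squares and moving-average steps concrete, here is a small sketch on made-up quarterly figures; the numbers are assumptions, not NIHORT production data.

```python
# Minimal sketch: linear least-squares trend and a centred 4-quarter
# moving average for a quarterly production series (synthetic numbers).
import numpy as np

production = np.array([120, 95, 88, 102, 130, 99, 92, 110, 141, 104, 97, 118], float)
t = np.arange(len(production))

# Least-squares trend line y = a*t + b
a, b = np.polyfit(t, production, 1)
print(f"trend: y = {a:.2f} * t + {b:.2f}")

# Centred 4-quarter moving average (the classic step before computing
# seasonal indices in additive/multiplicative decomposition)
ma4 = np.convolve(production, np.ones(4) / 4, mode="valid")
centred = (ma4[:-1] + ma4[1:]) / 2
print("centred moving average:", np.round(centred, 1))
```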
Procedia PDF Downloads 144
22695 Predicting the Solubility of Aromatic Waste Petroleum Paraffin Wax in Organic Solvents to Separate Ultra-Pure Phase Change Materials (PCMs) by Molecular Dynamics Simulation
Authors: Fathi Soliman
Abstract:
With the ultimate goal of developing the separation of n-paraffin as a phase change material (PCM) by means of molecular dynamics simulations, we attempt to predict the solubility of aromatic n-paraffin in two organic solvents: butyl acetate (BA) and methyl isobutyl ketone (MIBK). A simple model of aromatic paraffin, 2-hexadecylanthracene, with an amorphous molecular structure and periodic boundary conditions was constructed. The results showed that MIBK is the best solvent for separating ultra-pure phase change materials, and these results were consistent with experimental data obtained when separating ultra-pure n-paraffin from waste petroleum aromatic paraffin wax; the separated n-paraffin was characterized by XRD, TGA, GC and DSC. Moreover, the data revealed that the n-paraffin separated using MIBK performs better as a PCM than that separated using BA.
Keywords: molecular dynamics simulation, n-paraffin, organic solvents, phase change materials, solvent extraction
Procedia PDF Downloads 198
22694 Application of Association Rule Using Apriori Algorithm for Analysis of Industrial Accidents in 2013-2014 in Indonesia
Authors: Triano Nurhikmat
Abstract:
Along with the progress of science and technology, the development of the industrialized world in Indonesia has taken place very rapidly. This has accelerated the industrialization of Indonesian society, with the establishment of diverse companies and workplaces. Industrial development is closely related to workers' activities, and these work activities do not exclude the possibility of an accident befalling either the workers or a construction project. The causes of industrial accidents include electrical damage, faulty work procedures, and technical errors. The association rule method is one of the main techniques in data mining and is the form most commonly used to find patterns in data collections. This research seeks to determine the association relationships between the incidences of industrial accidents. Using association rule analysis, patterns were obtained from itemset combinations in two iterations (large 2-itemsets) relating each factor of industrial accidents to a region: for West Jakarta, industrial accidents caused by the occurrence of electrical damage had a support value of 0.2 and a confidence value of 1, and the reverse pattern had a support value of 0.2 and a confidence value of 0.75.
Keywords: association rule, data mining, industrial accidents, rules
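To illustrate how support and confidence values such as those quoted above are computed, the sketch below mines 2-itemsets from a toy transaction list; the transactions and the 0.2 minimum support are assumptions for illustration, not the study's accident records.

```python
# Minimal sketch: support and confidence for 2-itemsets (Apriori-style),
# computed over a toy list of "accident record" transactions.
from itertools import combinations
from collections import Counter

transactions = [
    {"West Jakarta", "electrical damage"},
    {"West Jakarta", "work procedure"},
    {"East Jakarta", "electrical damage"},
    {"West Jakarta", "electrical damage"},
    {"East Jakarta", "technical error"},
]
n = len(transactions)
min_support = 0.2  # assumed threshold

item_counts = Counter(item for t in transactions for item in t)
pair_counts = Counter(frozenset(p) for t in transactions for p in combinations(sorted(t), 2))

for pair, count in pair_counts.items():
    support = count / n
    if support < min_support:
        continue
    a, b = tuple(pair)
    conf_ab = count / item_counts[a]   # confidence of rule a -> b
    conf_ba = count / item_counts[b]   # confidence of rule b -> a
    print(f"{{{a}, {b}}}: support={support:.2f}, "
          f"conf({a}->{b})={conf_ab:.2f}, conf({b}->{a})={conf_ba:.2f}")
```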
Procedia PDF Downloads 303
22693 Automated Method Time Measurement System for Redesigning Dynamic Facility Layout
Authors: Salam Alzubaidi, G. Fantoni, F. Failli, M. Frosolini
Abstract:
The dynamic facility layout problem is a critical issue in the competitive industrial market; thus, solving this problem requires robust design and effective simulation systems. Sustainable simulation requires inputting reliable and accurate data into the system. This paper therefore describes an automated system, integrated into the real environment, that measures the duration of material handling operations, collects the data in real time, and determines the variances between the actual and estimated time schedules of the operations in order to update the simulation software and redesign the facility layout periodically. The automated method-time measurement system collects the real data using Radio Frequency Identification (RFID) and Internet of Things (IoT) technologies. Attaching an RFID antenna reader and RFID tags enables the system to identify the location of objects and gather the time data. The real durations gathered are processed by calculating the moving average duration of the material handling operations, choosing the shortest material handling path, and then updating the simulation software to redesign the facility layout in accordance with the shortest/actual operation schedule. Periodic simulation in real time is more sustainable and reliable than a simulation system relying on the analysis of historical data. The case study of this methodology was carried out in cooperation with a workshop team producing mechanical parts. Although there are some technical limitations, this methodology is promising, and it can be significantly useful in the redesign of manufacturing layouts.
Keywords: dynamic facility layout problem, internet of things, method time measurement, radio frequency identification, simulation
Procedia PDF Downloads 124
22692 The Influence of the Form of Grain on the Mechanical Behaviour of Sand
Authors: Mohamed Boualem Salah
Abstract:
The size and shape of soil particles reflect the formation history of the grains. In turn, the macro-scale behaviour of the soil mass results from particle-level interactions which are affected by particle shape. Sphericity, roundness and smoothness characterize different scales associated with particle shape. New experimental data and data from previously published studies are gathered into two databases to explore the effects of particle shape on packing as well as on the small- and large-strain properties of sandy soils. Data analysis shows that increased particle irregularity (angularity and/or eccentricity) leads to: an increase in emax and emin; a decrease in stiffness, yet with increased sensitivity to the state of stress; an increase in compressibility under zero-lateral-strain loading; and an increase in the critical state friction angle φcs and intercept Γ, with a weak effect on the slope λ. Therefore, particle shape emerges as a significant soil index property that needs to be properly characterized and documented, particularly in clean sands and gravels. The systematic assessment of particle shape will lead to a better understanding of sand behaviour.
Keywords: angularity, eccentricity, shape particle, behavior of soil
Procedia PDF Downloads 418
22691 Simulation of Elastic Bodies through Discrete Element Method, Coupled with a Nested Overlapping Grid Fluid Flow Solver
Authors: Paolo Sassi, Jorge Freiria, Gabriel Usera
Abstract:
In this work, a finite volume fluid flow solver is coupled with a discrete element method module for the simulation of the dynamics of free and elastic bodies in interaction with the fluid and with each other. The open-source fluid flow solver, caffa3d.MBRi, includes the capability to work with nested overlapping grids in order to easily refine the grid in the region where the bodies are moving. To do so, it is necessary to implement a recognition function able to identify the specific mesh block in which the body is moving. The set of overlapping finer grids might be displaced along with the set of bodies being simulated. The interaction between the bodies and the fluid is computed through a two-way coupling. The velocity field of the fluid is first interpolated to determine the drag force on each object. After solving the objects' displacements, subject to the elastic bonding among them, the force is applied back onto the fluid through a Gaussian smoothing considering the cells near the position of each object. The fishnet is represented as lumped masses connected by elastic lines. The internal forces are derived from the elasticity of these lines, and the external forces are due to drag, gravity, buoyancy and the load acting on each element of the system. When solving the system of ordinary differential equations that represents the motion of the elastic and flexible bodies, it was found that the fourth-order Runge-Kutta solver is the best tool in terms of performance, but it requires a finer grid than the fluid solver to make the system converge, which demands greater computing power. The coupled solver is demonstrated by simulating the interaction between the fluid, an elastic fishnet and a set of free bodies being captured by the net as they are dragged by the fluid. The deformation of the net, as well as the wake produced in the fluid stream, is well captured by the method, without requiring the fluid solver mesh to adapt to the evolving geometry. Application of the same strategy to the simulation of elastic structures subject to the action of wind is also possible with the method presented, and one such application is currently under development.
Keywords: computational fluid dynamics, discrete element method, fishnets, nested overlapping grids
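As a toy illustration of the fourth-order Runge-Kutta integration of a lumped-mass elastic line mentioned above, the following sketch advances a small chain of masses joined by elastic links under gravity and linear drag; the masses, stiffness, drag coefficient and fluid velocity are assumed values, and the code is not the caffa3d.MBRi coupling itself.

```python
# Minimal sketch: RK4 time stepping of a lumped-mass elastic line
# (masses joined by linear springs) subject to gravity and linear drag.
import numpy as np

n, m, k, L0, c = 5, 0.1, 50.0, 1.0, 0.2     # nodes, mass, stiffness, rest length, drag
g = np.array([0.0, -9.81])
u_fluid = np.array([0.5, 0.0])              # assumed uniform fluid velocity

def forces(x, v):
    f = m * g + c * (u_fluid - v)           # gravity + linear drag toward fluid velocity
    for i in range(n - 1):                  # elastic links between neighbouring nodes
        d = x[i + 1] - x[i]
        dist = np.linalg.norm(d)
        fs = k * (dist - L0) * d / dist
        f[i] += fs
        f[i + 1] -= fs
    f[0] = 0.0                              # first node anchored (e.g. net attachment)
    return f

def deriv(state):
    x, v = state
    return np.array([v, forces(x, v) / m])

state = np.array([np.column_stack([np.arange(n) * L0, np.zeros(n)]),
                  np.zeros((n, 2))])
dt = 1e-3
for _ in range(1000):                       # integrate 1 s with classic RK4
    k1 = deriv(state)
    k2 = deriv(state + 0.5 * dt * k1)
    k3 = deriv(state + 0.5 * dt * k2)
    k4 = deriv(state + dt * k3)
    state = state + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

print("final node positions:\n", np.round(state[0], 3))
```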
Procedia PDF Downloads 418
22690 The Views of Teachers on Father Involvement in Preschool Education Programs
Authors: Fatma Tezel Sahin, Zeynep Nur Aydin Kilic, Aysegul Akinci Cosgun
Abstract:
Family involvement activities have a significant place in increasing success in preschool education and maintaining that education. It is necessary that both parents take part in family involvement activities. However, while mother involvement is achieved in family involvement activities, father involvement is neglected. For that reason, the current study aims at determining the views of teachers with regard to father involvement in preschool education programs. The study group consisted of 23 preschool teachers. The study is a descriptive survey. The data were obtained through individual interviews. As the data collection instrument, a "Teacher Interview Form" was used. The data were analysed through the content analysis method. The data regarding the views of the teachers were given as frequency and percentage values. At the end of the research, a great majority of the teachers stated that they were proficient in applying family involvement activities. They also pointed out that they held family meetings most often in order to obtain family involvement, and that they then implemented involvement activities for parents both in and out of the class. They expressed that they observed more mother involvement than father involvement in these activities. Parents expressed that the reasons why fathers took part in these activities less than mothers were the working conditions of fathers and the perception of involvement as a task for mothers. Based on the results of the research, it can be recommended that fathers should be informed about involvement in family activities and that preschool education institutions should provide applications and opportunities to encourage fathers to participate.
Keywords: preschool education, parent involvement, father involvement, teacher views
Procedia PDF Downloads 329
22688 Political Views and Information and Communication Technology (ICT) in Tertiary Institutions in Achieving the Millennium Development Goals (MDGs)
Authors: Perpetual Nwakaego Ibe
Abstract:
The Millennium Development Goals (MDGs) were an integrated project formed to eradicate many of the adverse situations in which the citizens of third-world countries may find themselves. For the MDGs to be a sustainable project for the future, they depend entirely on the actions of governments, multilateral institutions and civil society. This paper first looks at political views on the MDGs and relates them to the current electoral situation around the country by underlining the drastic changes over the past few months. The second part of the paper presents ICT in tertiary institutions as one of the solutions for the success of the MDGs. ICT is vital in all phases of the educational process, and the development of cloud connectivity is an added advantage of Information and Communication Technology (ICT) for sharing a common data bank for research purposes among UNICEF, the Red Cross, NPS, INEC, NMIC, and WHO. Finally, the paper concludes with areas that need tweaking and with recommendations for tertiary institutions committed to delivering an ambitious set of goals. A combination of observation and documentary materials for data gathering was employed as the methodology for carrying out this research.
Keywords: MDG, ICT, data bank, database
Procedia PDF Downloads 201
22687 Efficacy of Botulinum Toxin in Alleviating Pain Syndrome in Stroke Patients with Upper Limb Spasticity
Authors: Akulov M. A., Zaharov V. O., Jurishhev P. E., Tomskij A. A.
Abstract:
Introduction: Spasticity is a severe consequence of stroke, leading to profound disability, decreased quality of life and decreased rehabilitation efficacy [4]. Spasticity is often associated with pain syndrome arising from joint damage in paretic limbs (postural arthropathy) or painful spasm of paretic limb muscles. It is generally accepted that injection of botulinum toxin into a cramped muscle decreases muscle tone and improves the range of motion in the paretic limb, which is accompanied by pain alleviation. Study aim: To evaluate the change in pain syndrome intensity after injections of botulinum toxin A (Xeomin) in stroke patients with upper limb spasticity. Patients and methods: 21 patients aged 47-74 years were evaluated. Inclusion criteria were: acute stroke 4-7 months before inclusion in the study, leading to spasticity of the wrist and/or finger flexors, elbow flexors or forearm pronators, associated with severe pain syndrome. Patients received Xeomin as monotherapy, 90-300 U, according to the spasticity pattern. Efficacy evaluation was performed using the Ashworth scale, the Disability Assessment Scale (DAS), the caregiver burden scale and a global treatment benefit assessment at weeks 2, 4, 8 and 12. The efficacy criterion was the decrease of pain syndrome by week 4 on the PQLS and VAS. Results: The study revealed a significant improvement in the measured indices after 4 weeks of treatment, which persisted until week 12. Xeomin is effective in reducing the muscle tone of the wrist, finger and elbow flexors and the forearm pronators. By the 4th week of treatment, we observed a significant improvement on the DAS (p < 0.05), on the Ashworth scale (1-2 points) in all patients (p < 0.05), and on the caregiver burden scale (p < 0.05). A significant decrease of pain syndrome by the 4th week of treatment on the PQLS (p < 0.05) and VAS (p < 0.05) was observed. No adverse effects were registered. Conclusion: Xeomin is an effective treatment of pain syndrome in postural upper limb spasticity after stroke. Xeomin treatment leads to a significant improvement on the PQLS and VAS.
Keywords: botulinum toxin, pain syndrome, spasticity, stroke
Procedia PDF Downloads 312
22686 Development of a Multi-User Country Specific Food Composition Table for Malawi
Authors: Averalda van Graan, Joelaine Chetty, Malory Links, Agness Mwangwela, Sitilitha Masangwi, Dalitso Chimwala, Shiban Ghosh, Elizabeth Marino-Costello
Abstract:
Food composition data are becoming increasingly important as dealing with food insecurity and malnutrition, in its persistent form of under-nutrition, is now coupled with increasing over-nutrition and its related ailments in the developing world, of which Malawi is not spared. In the absence of a food composition database (FCDB) inherent to our dietary patterns, efforts were made to develop a country-specific FCDB for nutrition practice, research, and programming. The main objective was to develop a multi-user, country-specific food composition database and table from existing published and unpublished scientific literature. A multi-phased approach guided by the project framework was employed. Phase 1 comprised a scoping mission to assess the nutrition landscape for compilation activities. Phase 2 involved training of a compiler and data collection from various sources, primarily institutional libraries, online databases, and food industry nutrient data. Phase 3 subsumed evaluation and compilation of data using FAO and INFOODS standards and guidelines. Phase 4 concluded the process with quality assurance. 316 Malawian food items, categorized into eight food groups, were captured for 42 components. The majority were from the baby food group (27%), followed by the staple (22%) and animal (22%) food groups. Fats and oils comprised the fewest food items (2%), followed by fruits (6%). Proximate values are well represented; however, the percentage of missing data is large for some components, including Se 68%, I 75%, vitamin A 42%, and the lipid profile: saturated fat 53%, mono-unsaturated fat 59%, poly-unsaturated fat 59% and cholesterol 56%. A multi-phased approach following the project framework led to the development of the first Malawian FCDB and table. The table reflects inherent Malawian dietary patterns and nutritional concerns. The FCDB can be used by various professionals in nutrition and health. Rising over-nutrition, NCDs, and changing diets challenge us to provide nutrient profiles of processed foods and complete lipid profiles.
Keywords: analytical data, dietary pattern, food composition data, multi-phased approach
Procedia PDF Downloads 95
22685 Educational Leadership and Artificial Intelligence
Authors: Sultan Ghaleb Aldaihani
Abstract:
- The environment in which educational leadership takes place is becoming increasingly complex due to factors like globalization and rapid technological change.
- This is creating a "leadership gap" where the complexity of the environment outpaces the ability of leaders to effectively respond.
- Educational leadership involves guiding teachers and the broader school system towards improved student learning and achievement.
2. Implications of Artificial Intelligence (AI) in Educational Leadership:
- AI has great potential to enhance education, such as through intelligent tutoring systems and automating routine tasks to free up teachers.
- AI can also have significant implications for educational leadership by providing better information and data-driven decision-making capabilities.
- Computer-adaptive testing can provide detailed, individualized data on student learning that leaders can use for instructional decisions and accountability.
3. Enhancing Decision-Making Processes:
- Statistical models and data mining techniques can help identify at-risk students earlier, allowing for targeted interventions.
- Probability-based models can diagnose students likely to drop out, enabling proactive support.
- These data-driven approaches can make resource allocation and decision-making more effective.
4. Improving Efficiency and Productivity:
- AI systems can automate tasks and change processes to improve the efficiency of educational leadership and administration.
- Integrating AI can free up leaders to focus more on their role's human, interactive elements.
Keywords: Education, Leadership, Technology, Artificial Intelligence
Procedia PDF Downloads 45
22684 Identification of CLV for Online Shoppers Using RFM Matrix: A Case Based on Features of B2C Architecture
Authors: Riktesh Srivastava
Abstract:
Online shopping has seen an astonishing evolution in the last few years, and it is now apparent that the B2C architecture is becoming a progressively important channel even for traditional brick-and-mortar traders. In this competition, knowing customers and predicting their behavior are extremely important. More importantly, when any customer logs onto the B2C architecture, the traces of their buying patterns can be stored and used for future predictions. Such a prediction is called Customer Lifetime Value (CLV). Earlier, Net Present Value was used to do so; however, it ignores two important aspects of the B2C architecture: market risks and the large amount of customer data. Now, we use RFM (Recency, Frequency and Monetary Value) to estimate the CLV, and, as the term implies, market risk is well covered. Big data analysis is also embedded in RFM, which gives a real exploration of the big data and leads to a better estimation of future cash flow from customers. In the present paper, 6 factors (collected from varied sources) are used to determine what attracts customers to the B2C architecture. For these 6 factors, RFM is computed for 3 years (2013, 2014 and 2015). CLV and revenue are the two parameters defined using the RFM analysis, which gives a clear picture of the future predictions.
Keywords: CLV, RFM, revenue, recency, frequency, monetary value
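The sketch below shows one common way to derive Recency, Frequency and Monetary Value from a transaction log; the column names, toy data and rank-based 1-5 scoring are assumptions for illustration, not the paper's computation.

```python
# Minimal sketch: computing RFM from a toy transaction log with pandas.
import pandas as pd

tx = pd.DataFrame({
    "customer": ["A", "A", "B", "C", "C", "C"],
    "date": pd.to_datetime(["2015-01-05", "2015-06-20", "2015-03-11",
                            "2015-02-02", "2015-08-14", "2015-11-30"]),
    "amount": [120.0, 80.0, 300.0, 45.0, 60.0, 75.0],
})
snapshot = pd.Timestamp("2015-12-31")   # assumed analysis date

rfm = tx.groupby("customer").agg(
    recency=("date", lambda d: (snapshot - d.max()).days),  # days since last purchase
    frequency=("date", "count"),                            # number of purchases
    monetary=("amount", "sum"),                             # total spend
)

# Simple 1-5 scores via percentile ranks (recency reversed: recent = better).
rfm["R"] = (1 - rfm["recency"].rank(pct=True)).mul(4).add(1).round().astype(int)
rfm["F"] = rfm["frequency"].rank(pct=True).mul(4).add(1).round().astype(int)
rfm["M"] = rfm["monetary"].rank(pct=True).mul(4).add(1).round().astype(int)
print(rfm)
```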
Procedia PDF Downloads 222
22683 Towards a Quantification of the Wind Erosion of the Gharb Shoreline Soils in Morocco by the Application of a Mathematical Model
Authors: Mohammed Kachtali, Imad Fenjiro, Jamal Alkarkouri
Abstract:
Wind erosion is a serious environmental problem in arid and semi-arid regions. Indeed, wind erosion easily removes the finest particles of the soil surface, which contributes to a loss of soil fertility. The silting of infrastructure and cultivated areas and the negative impact on health are additional consequences of wind erosion. In Morocco, wind erosion constitutes the main factor of silting on the coast and in the Sahara. The aim of our study is to use a wind erosion equation in order to estimate soil losses by wind erosion on the Gharb coast (north of Morocco). The equation used in our model includes geographic data, 30 years of climatic data and edaphic data collected from the study area, which comprised 11 crossings over 4 stations. Our results have shown that wind erosion values are high and differ markedly between some crossings (p < 0.001). This difference is explained by topography, soil texture, and climate. In conclusion, wind erosion is high on the Gharb coast and varies from one station to another; this problem requires several methods of control and mitigation.
Keywords: Gharb coast, modeling, silting, wind erosion
Procedia PDF Downloads 139
22682 Oil Reservoir Asphaltene Precipitation Estimation during CO2 Injection
Authors: I. Alhajri, G. Zahedi, R. Alazmi, A. Akbari
Abstract:
In this paper, an Artificial Neural Network (ANN) was developed to predict Asphaltene Precipitation (AP) during the injection of carbon dioxide into crude oil reservoirs. In this study, experimental data from six different oil fields were collected. Seventy percent of the data were used to develop the ANN model, and different ANN architectures were examined. A network with the trainlm training algorithm was found to be the best network to estimate the AP. To check the validity of the proposed model, the model was used to predict the AP for the remaining thirty percent of the data that had not been used in training. The Mean Square Error (MSE) of the prediction was 0.0018, which confirms the excellent prediction capability of the proposed model. In the second part of this study, the ANN model predictions were compared with the predictions of the modified Hirschberg model. The ANN was found to provide more accurate estimates compared to the modified Hirschberg model. Finally, the proposed model was employed to examine the effect of different operating parameters during gas injection on the AP. It was found that the AP is most sensitive to the reservoir temperature. Furthermore, the carbon dioxide concentration in the liquid phase increases the AP.
Keywords: artificial neural network, asphaltene, CO2 injection, Hirschberg model, oil reservoirs
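The workflow described (70/30 split, ANN regression, MSE on the held-out data) can be sketched as follows with scikit-learn; the synthetic features, network size and solver are illustrative assumptions, and the study itself used a trainlm (Levenberg-Marquardt) network rather than this library.

```python
# Minimal sketch: train an ANN regressor on 70% of the data and report
# the mean squared error on the remaining 30% (synthetic data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 4))   # e.g. pressure, temperature, CO2 fraction, oil gravity (assumed)
y = 0.3 * X[:, 0] + 0.5 * X[:, 1] ** 2 - 0.2 * X[:, 2] + 0.05 * rng.normal(size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X_train, y_train)

mse = mean_squared_error(y_test, model.predict(X_test))
print(f"test MSE = {mse:.4f}")
```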
Procedia PDF Downloads 368
22681 Unearthing Air Traffic Control Officers Decision Instructional Patterns From Simulator Data for Application in Human Machine Teams
Authors: Zainuddin Zakaria, Sun Woh Lye
Abstract:
Despite the continuous advancements in automated conflict resolution tools, there is still a low rate of adoption of automation by Air Traffic Control Officers (ATCOs). Trust in, or acceptance of, these tools and conformance to individual ATCO preferences in strategy execution for conflict resolution are two key factors that impact their use. This paper proposes a methodology to unearth and classify ATCO conflict resolution strategies from simulator data of trained and qualified ATCOs. The methodology involves the extraction of ATCO executive control actions and the establishment of a system of strategy resolution classification based on ATCO radar commands and the prevailing flight parameters when deconflicting a pair of aircraft. Six main strategies used to handle various categories of conflict were identified and discussed. It was found that ATCOs were about twice as likely to choose only vertical maneuvers in conflict resolution as horizontal maneuvers or a combination of both vertical and horizontal maneuvers.
Keywords: air traffic control strategies, conflict resolution, simulator data, strategy classification system
Procedia PDF Downloads 151
22680 Spectral Re-Evaluation of the Magnetic Basement Depth over Yola Arm of Upper Benue Trough Nigeria Using Aeromagnetic Data
Authors: Emberga Terhemb Opara Alexander, Selemo Alexader, Onyekwuru Samuel
Abstract:
The aeromagnetic data have been used to re-evaluate parts of the Upper Benue Trough, Nigeria, using the spectral analysis technique in order to appraise the mineral accumulation potential of the area. The regional field was separated with a first-order polynomial using the polyfit program. The residual data were subdivided into 24 spectral blocks using the OASIS MONTAJ software. Two prominent magnetic depth source layers were identified. The deeper source depth values obtained range from 1.56 km to 2.92 km, with an average depth of 2.37 km, which is taken as the magnetic basement depth, while for the shallower sources the depth values range from -1.17 km to 0.98 km, with an average depth of 0.55 km. The shallow depth source is attributed to the volcanic rocks that intruded the sedimentary formation, and this could possibly be responsible for the mineralization found in parts of the study area.
Keywords: spectral analysis, Upper Benue Trough, magnetic basement depth, aeromagnetic
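For readers unfamiliar with the spectral depth estimate used here, the sketch below shows the basic idea on a synthetic 1D profile: the slope of the log power spectrum against spatial frequency gives the source depth as depth ≈ -slope/(4π). The profile, sample spacing and frequency band are assumptions, not the Yola Arm data.

```python
# Minimal sketch: depth to a magnetic source layer from the slope of the
# log power spectrum of a (synthetic) 1D total-field profile.
import numpy as np

dx = 0.1                                    # sample spacing in km (assumed)
n = 1024
true_depth = 2.0                            # km, used only to synthesise the profile
f = np.fft.rfftfreq(n, d=dx)                # spatial frequency in cycles/km

rng = np.random.default_rng(2)
phase = rng.uniform(0, 2 * np.pi, f.size)
phase[0] = 0.0
amp = np.exp(-2 * np.pi * f * true_depth)   # spectral decay of sources at depth h
profile = np.fft.irfft(amp * np.exp(1j * phase), n)   # synthetic anomaly profile

power = np.abs(np.fft.rfft(profile)) ** 2
band = (f > 0.05) & (f < 0.4)               # assumed low-frequency band (deep sources)
slope, _ = np.polyfit(f[band], np.log(power[band]), 1)
depth = -slope / (4 * np.pi)                # ln P ≈ const - 4*pi*h*f  =>  h = -slope/(4*pi)
print(f"estimated source depth ≈ {depth:.2f} km (true value used: {true_depth} km)")
```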
Procedia PDF Downloads 456
22679 Detect Circles in Image: Using Statistical Image Analysis
Authors: Fathi M. O. Hamed, Salma F. Elkofhaifee
Abstract:
The aim of this work is to detect geometrically shaped objects in an image. In this paper, the object is considered to be of a circular shape. The identification requires finding three characteristics, which are the number, size, and location of the objects. To achieve the goal of this work, this paper presents an algorithm that combines statistical approaches and image analysis techniques. This algorithm has been implemented to achieve the major objectives of this paper. The algorithm has been evaluated using simulated data and yields good results; it has then been applied to real data.
Keywords: image processing, median filter, projection, scale-space, segmentation, threshold
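A compact sketch of the kind of pipeline the keywords suggest (median filtering, thresholding, segmentation) is given below; the synthetic image, filter size and threshold are illustrative assumptions, not the authors' exact algorithm.

```python
# Minimal sketch: count circles and estimate their size and location via
# median filtering, thresholding and connected-component labelling.
import numpy as np
from scipy import ndimage

# Synthetic 200x200 image with two bright discs plus noise (assumed test data).
img = np.zeros((200, 200))
yy, xx = np.mgrid[:200, :200]
for cy, cx, r in [(60, 70, 20), (140, 150, 12)]:
    img[(yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2] = 1.0
img += np.random.default_rng(3).normal(0, 0.2, img.shape)

smoothed = ndimage.median_filter(img, size=5)   # suppress noise
binary = smoothed > 0.5                         # assumed threshold
labels, n_objects = ndimage.label(binary)       # segmentation

print(f"number of circles detected: {n_objects}")
for i in range(1, n_objects + 1):
    area = (labels == i).sum()
    radius = np.sqrt(area / np.pi)              # size estimated from disc area
    cy, cx = ndimage.center_of_mass(labels == i)
    print(f"circle {i}: centre=({cy:.1f}, {cx:.1f}), radius≈{radius:.1f}")
```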
Procedia PDF Downloads 436
22678 Effect of Packaging Treatment and Storage Condition on Stability of Low Fat Chicken Burger
Authors: Mohamed Ahmed Kenawi Abdallah
Abstract:
Chemical composition, cooking loss, shrinkage value, texture coefficient indices, Feder value, microbial examination, and sensory evaluation were carried out in order to examine the effect of adding 15% germinated quinoa seed flour as an extender to chicken wing meat to produce a low-fat chicken burger, packaged in two different packaging materials and stored frozen for nine months. The data indicated a reduction in the moisture content and crude ether extract, and an increase in the ash content, pH value, and total acidity for the samples extended with quinoa flour compared with the control. The data showed that the samples extended with quinoa flour had the lowest values of TBA, cooking loss, and shrinkage compared with the control samples. The data also revealed that the samples containing quinoa flour had total bacterial counts and psychrophilic bacterial counts lower than the control samples. In addition, they had higher evaluation values for overall acceptability than the control.
Keywords: chicken wings, low fat chicken burger, quinoa flour, vacuum packaging
Procedia PDF Downloads 104
22677 An Evaluation Method of Accelerated Storage Life Test for Typical Mechanical and Electronic Products
Authors: Jinyong Yao, Hongzhi Li, Chao Du, Jiao Li
Abstract:
The reliability of long-term storage products is related to the availability of the whole system, and the evaluation of storage life is of great necessity. These products are usually highly reliable, and little failure information can be collected. In this paper, an analytical method based on data from an accelerated storage life test is proposed to evaluate the reliability index of long-term storage products. Firstly, singularities are eliminated by data normalization and residual analysis. Secondly, with the pre-processed data, a degradation path model is built to obtain pseudo life values. Then, under a life distribution hypothesis, the parameter estimators at the high stress levels are obtained and the consistency of failure mechanisms is verified. Finally, the life distribution under the normal stress level is extrapolated via the acceleration model, and an evaluation of the true average life becomes available. An application example with a camera stabilization device is provided to illustrate the proposed methodology.
Keywords: accelerated storage life test, failure mechanisms consistency, life distribution, reliability
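The degradation-path, pseudo-life and acceleration-model chain described above can be sketched as follows; the linear degradation form, failure threshold, Arrhenius acceleration model and all numbers are illustrative assumptions, not the paper's data.

```python
# Minimal sketch: fit linear degradation paths, derive pseudo lives at a
# failure threshold, then extrapolate mean log-life to the normal stress
# level with an Arrhenius-type model (all data synthetic).
import numpy as np

threshold = 10.0                             # assumed failure threshold on the degradation scale
t = np.array([0., 200., 400., 600., 800.])   # inspection times (hours)

# Degradation measurements of 3 units at each of two elevated temperatures (K).
tests = {
    343.0: np.array([[0.0, 2.1, 4.0, 6.2, 8.1],
                     [0.0, 1.9, 4.2, 5.9, 8.4],
                     [0.0, 2.3, 4.1, 6.4, 8.0]]),
    358.0: np.array([[0.0, 3.4, 6.9, 10.1, 13.8],
                     [0.0, 3.6, 7.1, 10.6, 14.2],
                     [0.0, 3.2, 6.6, 10.0, 13.5]]),
}

mean_log_life, temps = [], []
for temp, paths in tests.items():
    pseudo_lives = []
    for y in paths:
        slope, intercept = np.polyfit(t, y, 1)                # degradation path model
        pseudo_lives.append((threshold - intercept) / slope)  # time to reach the threshold
    mean_log_life.append(np.mean(np.log(pseudo_lives)))       # lognormal-life assumption
    temps.append(temp)

# Arrhenius model: ln(life) = a + b / T, fitted on the accelerated levels.
b, a = np.polyfit(1.0 / np.array(temps), mean_log_life, 1)
T_use = 298.0                                # assumed normal storage temperature (K)
print(f"predicted mean storage life at {T_use} K ≈ {np.exp(a + b / T_use):.0f} hours")
```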
Procedia PDF Downloads 389
22676 Solving Dimensionality Problem and Finding Statistical Constructs on Latent Regression Models: A Novel Methodology with Real Data Application
Authors: Sergio Paez Moncaleano, Alvaro Mauricio Montenegro
Abstract:
This paper presents a novel statistical methodology for measuring and finding constructs in latent regression analysis. This approach uses the qualities of factor analysis for binary data with interpretations based on Item Response Theory (IRT). In addition, based on the fundamentals of submodel theory and on a convergence of many ideas from IRT, we propose an algorithm not just to solve the dimensionality problem (nowadays an open discussion) but to open a new research field that promises fairer and more realistic qualifications for examinees and a revolution in IRT and educational research. In the end, the methodology is applied to a real data set, presenting impressive results in terms of coherence, speed and precision. Acknowledgments: This research was financed by Colciencias through the project 'Multidimensional Item Response Theory Models for Practical Application in Large Test Designed to Measure Multiple Constructs', and both authors belong to the SICS Research Group at Universidad Nacional de Colombia.
Keywords: item response theory, dimensionality, submodel theory, factorial analysis
Procedia PDF Downloads 374