Search results for: statistical tools
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7702

6622 Chinese Undergraduates’ Trust in And Usage of Machine Translation: A Survey

Authors: Bi Zhao

Abstract:

Neural network technology has greatly improved the output of machine translation in terms of both fluency and accuracy, which greatly increases its appeal for young users. The present exploratory study aims to find out how Chinese undergraduates perceive and use machine translation in their daily life. A survey is conducted to collect data from 100 undergraduate students from multiple Chinese universities with varied academic backgrounds, including arts, business, science, engineering, and medicine. The survey questions inquire about their use of machine translation (including frequency, scenarios, purposes, and preferences) and their attitudes toward it (including trust, quality assessment, justifications, and ethics). Interviews and machine translation output evaluation tasks are also employed, in combination with the survey, on a sample of selected respondents. The results indicate that Chinese undergraduate students use machine translation on a daily basis for a wide range of purposes in academic, communicative, and entertainment scenarios. Most of them have preferred machine translation tools, but the availability of a tool within a given scenario, such as an embedded machine translation tool on a webpage, is also a determining factor in their choice. The results also reveal that despite their reportedly limited trust in the accuracy of machine translation output, most students lack the ability to critically analyze and evaluate such output. Furthermore, the evidence reveals inadequate awareness of ethical responsibility among Chinese undergraduate students as machine translation users.

Keywords: Chinese undergraduates, machine translation, trust, usage

Procedia PDF Downloads 139
6621 Comparison of Quality Indices for Sediment Assessment in Ireland

Authors: Tayyaba Bibi, Jenny Ronan, Robert Hernan, Kathleen O’Rourke, Brendan McHugh, Evin McGovern, Michelle Giltrap, Gordon Chambers, James Wilson

Abstract:

Sediment contamination is a major source of ecosystem stress and has received significant attention from the scientific community. Both the Water Framework Directive (WFD) and the Marine Strategy Framework Directive (MSFD) require a robust set of tools for biological and chemical monitoring. For the MSFD in particular, causal links between contaminants and effects need to be assessed, and appropriate assessment tools are required to make an accurate evaluation. In this study, a range of recommended sediment bioassays and chemical measurements are assessed at a number of potentially impacted and minimally impacted locations around Ireland. Previously, assessment indices have been developed for individual compartments, i.e. contaminant levels or biomarker/bioassay responses. A number of assessment indices are applied to chemical and ecotoxicological data from the Seachange project (Project code) and compared, including the metal pollution index (MPI), pollution load index (PLI) and Chapman index for chemistry, as well as the integrated biomarker response (IBR). The benefits and drawbacks of the use of indices and aggregation techniques are discussed. In addition, modelling of raw data is investigated to analyse links between contaminants and effects.
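As a sketch of one of the chemistry indices being compared, the pollution load index is commonly computed as the geometric mean of contamination factors (measured concentration over background level). The concentrations and background values below are hypothetical illustrations, not Seachange data.

```python
import math

def pollution_load_index(concentrations, backgrounds):
    """Pollution Load Index: geometric mean of contamination factors
    CF_i = C_i / B_i (measured concentration over background level)."""
    cfs = [c / b for c, b in zip(concentrations, backgrounds)]
    return math.prod(cfs) ** (1 / len(cfs))

# Hypothetical metal concentrations (mg/kg) vs. assumed background values
pli = pollution_load_index([40.0, 12.0, 90.0], [20.0, 15.0, 60.0])
# PLI > 1 is conventionally read as overall contamination at the site
```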

Keywords: bioassays, contamination indices, ecotoxicity, marine environment, sediments

Procedia PDF Downloads 228
6620 The Importance of Knowledge Innovation for External Audit on Anti-Corruption

Authors: Adel M. Qatawneh

Abstract:

This paper aimed to determine the importance of knowledge innovation for external audit on anti-corruption across the Jordanian banks listed on the Amman Stock Exchange (ASE). The importance of the study arises from the need to recognize knowledge innovation for external audit and anti-corruption amid developments in the world of business. The variables expected to be affected by external audit innovation are: reliability of financial data, relevance of financial data, consistency of financial data, full disclosure of financial data, and protection of the rights of investors. To achieve the objectives of the study, a questionnaire was designed and distributed to the Jordanian banks listed on the Amman Stock Exchange. The data analysis found that the banks in Jordan attach positive importance to knowledge innovation for external audit on anti-corruption and agree on its benefit. The statistical analysis showed that knowledge innovation for external audit had a positive impact on anti-corruption, and that external audit has a statistically significant relationship with anti-corruption, reliability of financial data, consistency of financial data, full disclosure of financial data, and protection of the rights of investors.

Keywords: knowledge innovation, external audit, anti-corruption, Amman Stock Exchange

Procedia PDF Downloads 465
6619 Comparing Groundwater Fluoride Level with WHO Guidelines and Classifying At-Risk Age Groups; Based on Health Risk Assessment

Authors: Samaneh Abolli, Kamyar Yaghmaeian, Ali Arab Aradani, Mahmood Alimohammadi

Abstract:

The main route of fluoride uptake is drinking water. Fluoride absorption within the acceptable range (0.5–1.5 mg L⁻¹) is suitable for the body, but excessive consumption can have irreversible health effects. To compare fluoride concentrations with the WHO guidelines, 112 water samples were taken from groundwater aquifers in 22 villages of Garmsar County, in the central part of Iran, during 2018 to 2019. Fluoride concentration was measured by the SPADNS method, and its non-carcinogenic impacts were calculated using the estimated daily intake (EDI) and hazard quotient (HQ). The statistical population was divided into four categories: infants, children, teenagers, and adults. Linear regression and Spearman rank correlation coefficient tests were used to investigate the relationship between well depth and fluoride concentration in the water samples. The annual mean concentrations of fluoride in 2018 and 2019 were 0.75 and 0.64 mg L⁻¹, and the mean fluoride concentrations in the samples from the cold and hot seasons of the studied years were 0.709 and 0.689 mg L⁻¹, respectively. The amount of fluoride in 27% of the samples in both years was less than the acceptable minimum (0.5 mg L⁻¹). Also, 11% of the samples in 2018 (6 samples) had fluoride levels higher than 1.5 mg L⁻¹. The HQ showed that children were the most vulnerable; teenagers and adults were in the next ranks, respectively. Statistical tests showed an inverse and significant correlation (R² = 0.02, p < 0.0001) between well depth and fluoride content. The border between the usefulness and harmfulness of fluoride is very narrow and requires extensive studies.
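The non-carcinogenic risk calculation described can be sketched as follows. The intake rates, body weights, and the fluoride reference dose of 0.06 mg/kg/day are illustrative assumptions, not the study's fitted parameters.

```python
def edi(conc_mg_per_l, intake_l_per_day, body_weight_kg):
    """Estimated Daily Intake of fluoride via drinking water (mg/kg/day)."""
    return conc_mg_per_l * intake_l_per_day / body_weight_kg

def hazard_quotient(edi_value, rfd=0.06):
    """HQ = EDI / reference dose; HQ > 1 flags non-carcinogenic risk.
    0.06 mg/kg/day is an assumed fluoride reference dose."""
    return edi_value / rfd

# Hypothetical (intake L/day, body weight kg) per age group -- assumptions
groups = {"infants": (0.6, 10), "children": (1.7, 20),
          "teenagers": (2.0, 55), "adults": (2.5, 70)}
# HQ for each group at the reported 2018 mean concentration of 0.75 mg/L
hq = {g: hazard_quotient(edi(0.75, ir, bw)) for g, (ir, bw) in groups.items()}
# With these assumed parameters, children come out as the most exposed group
```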

Keywords: fluoride, groundwater, health risk assessment, hazard quotient, Garmsar

Procedia PDF Downloads 71
6618 Understanding Complexity at Pre-Construction Stage in Project Planning of Construction Projects

Authors: Mehran Barani Shikhrobat, Roger Flanagan

Abstract:

Construction planning and scheduling based on current tools and techniques produces schedules that are either deterministic in nature (Gantt chart, CPM) or assign only a small probability of completion to each task (PERT). However, every project embodies assumptions and influences, yet is expected to start with a complete set of clearly defined goals and constraints that remain constant throughout its duration. Construction planners continue to apply the traditional methods and tools of “hard” project management that were developed for “ideal projects,” neglecting the potential influence of complexity on the design and construction process. The aim of this research is to investigate the emergence and growth of complexity in project planning and to provide a model that considers the influence of complexity on the total project duration at the post-contract award pre-construction stage of a project. The literature review showed that complexity originates from different sources: environment, technical, and workflow interactions. These can be divided into two categories of complexity factors: first, project tasks, and second, project organisation and management. Project task complexity may originate from performance, lack of resources, or environmental changes for a specific task. Complexity factors that relate to organisation and management refer to workflow and the interdependence of different parts. The literature review highlighted the ineffectiveness of traditional tools and techniques in planning for complexity. In this research, the fundamental causes of complexity in construction projects were investigated through a questionnaire with industry experts. The results were used to develop a model that considers the core complexity factors and their interactions. System dynamics were used to investigate the model and to consider the influence of complexity on project planning. 
Feedback from experts revealed 20 major complexity factors that impact project planning. The factors are divided into five categories known as core complexity factors. To understand the weight of each factor in comparison, the Analytical Hierarchy Process (AHP) analysis method is used. The comparison showed that externalities are ranked as the biggest influence across the complexity factors. The research underlines that there are many internal and external factors that impact project activities and the project overall. This research shows the importance of considering the influence of complexity on the project master plan undertaken at the post-contract award pre-construction phase of a project.
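The AHP weighting step can be sketched as a power-iteration approximation of the principal eigenvector of a pairwise-comparison matrix. The 3x3 matrix below, comparing, say, externalities against two other complexity categories, is entirely hypothetical.

```python
def ahp_weights(matrix, iters=100):
    """Approximate AHP priority weights via power iteration for the
    principal eigenvector of a pairwise-comparison matrix."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]  # normalise so weights sum to 1
    return w

# Hypothetical judgements: externalities rated 3x and 5x as influential
# as the other two complexity categories (reciprocals below the diagonal)
m = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 0.5, 1.0]]
weights = ahp_weights(m)
# The largest weight identifies the dominant complexity category
```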

Keywords: project planning, project complexity measurement, planning uncertainty management, project risk management, strategic project scheduling

Procedia PDF Downloads 141
6617 Statistical Data Analysis of Migration Impact on the Spread of HIV Epidemic Model Using Markov Monte Carlo Method

Authors: Ofosuhene O. Apenteng, Noor Azina Ismail

Abstract:

Over the last several years, concern has developed over how to minimize the spread of the HIV/AIDS epidemic in many countries. The AIDS epidemic has tremendously stimulated the development of mathematical models of infectious diseases. The transmission dynamics of HIV infection that eventually develops into AIDS has played a pivotal role in the building of mathematical models. Since the initial HIV and AIDS models introduced in the 80s, various improvements have been made in how HIV/AIDS frameworks are modelled. In this paper, we present the impact of migration on the spread of HIV/AIDS. The epidemic model is formulated as a system of nonlinear differential equations to supplement the statistical approach. The model is calibrated using HIV incidence data from Malaysia between 1986 and 2011. Bayesian inference based on Markov Chain Monte Carlo is used to validate the model by fitting it to the data and to estimate the unknown model parameters. The results suggest that migrants who stay for a long time contribute to the spread of HIV. The model also indicates that susceptible individuals become infected and move to the HIV compartment at a rate that is greater than the removal rate from the HIV compartment to the AIDS compartment. The disease-free steady state is unstable since the basic reproduction number is 1.627309. This is a big concern and not a good indicator from the public health point of view, since the aim is to stabilize the epidemic at the disease-free equilibrium.
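A minimal sketch of the kind of compartment model described, with a susceptible-HIV-AIDS structure, a migration inflow, and the corresponding basic reproduction number. All rates and initial conditions here are illustrative assumptions, not the fitted Malaysian parameters.

```python
def simulate_hiv(beta, mu, sigma, gamma, days, migration=0.0):
    """Euler integration of a toy S-I(HIV)-A(AIDS) compartment model.
    beta: infection rate, mu: natural birth/death rate, sigma: HIV->AIDS
    progression, gamma: AIDS removal, migration: constant inflow of
    susceptibles. All values are assumptions for illustration."""
    s, i, a = 0.99, 0.01, 0.0  # assumed initial fractions
    dt = 1.0
    for _ in range(days):
        n = s + i + a
        new_inf = beta * s * i / n
        ds = migration + mu * n - new_inf - mu * s
        di = new_inf - (sigma + mu) * i
        da = sigma * i - (gamma + mu) * a
        s, i, a = s + dt * ds, i + dt * di, a + dt * da
    return s, i, a

def basic_reproduction_number(beta, sigma, mu):
    """R0 for this simple model: infection rate over the total exit rate
    from the HIV compartment; R0 > 1 destabilises the disease-free state."""
    return beta / (sigma + mu)

r0 = basic_reproduction_number(0.3, 0.1, 0.02)  # 2.5 with these assumed rates
```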

Keywords: epidemic model, HIV, MCMC, parameter estimation

Procedia PDF Downloads 602
6616 Optimization of the Fabrication Process for Particleboards Made from Oil Palm Fronds Blended with Empty Fruit Bunch Using Response Surface Methodology

Authors: Ghazi Faisal Najmuldeen, Wahida Amat-Fadzil, Zulkafli Hassan, Jinan B. Al-Dabbagh

Abstract:

The objective of this study was to evaluate the optimum fabrication process variables to produce particleboards from oil palm fronds (OPF) particles and empty fruit bunch fiber (EFB). Response surface methodology was employed to analyse the effect of hot press temperature (150–190°C), press time (3–7 minutes) and EFB blending ratio (0–40%) on the particleboards' modulus of rupture, modulus of elasticity, internal bonding, water absorption and thickness swelling. A Box-Behnken experimental design was carried out to develop statistical models used for the optimisation of the fabrication process variables. All factors were found to have a statistically significant effect on particleboard properties. The statistical analysis indicated that all models showed a significant fit with the experimental results. The optimum particleboard properties were obtained at the optimal fabrication process condition: press temperature, 186°C; press time, 5.7 min; and EFB/OPF ratio, 30.4%. Incorporating oil palm fronds and empty fruit bunch fiber to produce particleboards improved the particleboard properties. The OPF–EFB particleboards fabricated at the optimized conditions satisfied the ANSI A208.1–1999 specification for general purpose particleboards.
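The Box-Behnken design used here can be generated in coded units as a sketch; mapping the coded levels back to the actual ranges (150-190°C, 3-7 min, 0-40%) is a separate linear rescaling.

```python
from itertools import combinations

def box_behnken(k):
    """Box-Behnken design points in coded units (-1, 0, +1) for k factors:
    all +/-1 combinations for each factor pair, other factors at the
    centre, plus one centre point (replicated in practice)."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                point = [0] * k
                point[i], point[j] = a, b
                runs.append(point)
    runs.append([0] * k)  # centre point
    return runs

# Three factors: press temperature, press time, EFB blending ratio
design = box_behnken(3)
# 3 factors -> 12 edge-midpoint runs + 1 centre = 13 distinct settings
```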

Keywords: empty fruit bunch fiber, oil palm fronds, particleboards, response surface methodology

Procedia PDF Downloads 230
6615 Developing a Framework for Open Source Software Adoption in a Higher Education Institution in Uganda: A Case of Kyambogo University

Authors: Kafeero Frank

Abstract:

This study aimed at developing a framework for open source software adoption in an institution of higher learning in Uganda, with Kyambogo University as the study area. There were four main research questions based on: individual staff interaction with the open source software forum, perceived FOSS characteristics, organizational characteristics, and external characteristics as factors that affect open source software adoption. The researcher used a causal-correlational research design to study the effects of these variables on open source software adoption. A quantitative approach was used, with a self-administered questionnaire given to a purposively and randomly sampled group of university ICT staff. The resulting data was analyzed using means, correlation coefficients and multivariate multiple regression analysis as statistical tools. The study reveals that individual staff interaction with the open source software forum and perceived FOSS characteristics were the primary factors that significantly affect FOSS adoption, while organizational and external factors were secondary, with no significant effect but a significant correlation with open source software adoption. It was concluded that for effective open source software adoption to occur, there must be more effort on the primary factors, with subsequent reinforcement of the secondary factors. Lastly, recommendations were made in line with the conclusions for developing a Kyambogo University framework for open source software adoption in institutions of higher learning. Recommended areas of further research include: stakeholders' analysis of open source software adoption in Uganda, its challenges and the way forward; evaluation of the Kyambogo University framework for open source software adoption in institutions of higher learning; framework development for cloud computing adoption in Ugandan universities; and a framework for FOSS development in the Ugandan IT industry.

Keywords: open source software, organisational characteristics, external characteristics, cloud computing adoption

Procedia PDF Downloads 72
6614 The Significance of Computer Assisted Language Learning in Teaching English Grammar in Tribal Zone of Chhattisgarh

Authors: Yogesh Kumar Tiwari

Abstract:

Chhattisgarh has realized the fundamental role of information and communication technology in the globalized world, where knowledge is paramount for growth and intellectual development. These technologies are spreading so widely that one feels left behind if not using them. The influence of these radiating and technological tools has encompassed all aspects of the educational, business, and economic sectors of our world. Undeniably, the computer has not only established itself globally in all walks of life but has also acquired a fundamental role of paramount importance in the educational process. This role is becoming all-pervading and more powerful as computers are manufactured to be cheaper, smaller in size, adaptable and easy to handle. Computers are becoming indispensable to teachers because of their enormous capabilities and extensive competence. This study aims at observing the effect of using a computer-based English language software program on the achievement of undergraduate students studying in a tribal area, Sarguja Division, Chhattisgarh, India. To test the effect of innovative teaching in graduate classrooms in the tribal area, 50 students were randomly selected and separated into two groups. The first group of 25 students was taught English grammar, i.e., passive voice/narration, through the traditional method, using chalk and blackboard and asking some formal questions. The second group, the experimental one, was taught English grammar, i.e., passive voice/narration, using a computer and projector with PowerPoint presentations of grammatical items. Statistical analysis was done on the students' learning capacities and achievement. The result was extremely mesmerizing, not only for the teacher but for the taught as well. The process of recapitulation demonstrated that the students of the experimental group answered the questions enthusiastically, with an innovative sense of learning. 
In light of the findings of the study, it was recommended that teachers and professors of English ought to use self-made instructional program in their teaching process particularly in tribal areas.

Keywords: achievement, computer assisted language learning, use of instructional program

Procedia PDF Downloads 150
6613 Distribution of Traffic Volume at Fuel Station during Peak Hour Period on Arterial Road

Authors: Surachai Ampawasuvan, Supornchai Utainarumol

Abstract:

Most fuel station customers who drive on a major arterial road want to use the stations to refuel their vehicles during the journey to their destinations. According to surveys of the traffic volume of vehicles using fuel stations, conducted by video cameras, automatic counting tools, and questionnaires, it was found that most users prefer to use fuel stations on holidays rather than on working days. They also prefer to use fuel stations in the morning rather than in the evening. When comparing the distribution patterns of the traffic volume of vehicles using fuel stations recorded by video cameras and by automatic counting tools, there is no significant difference. When the peak hour rate of 13 to 14 percent obtained from the questionnaires is compared with the results obtained using the methods of the Institute of Transportation Engineers (ITE), the values are similar. However, both are about double the 6 to 7 percent observed by video camera and automatic traffic counting. This study therefore suggests that, in order to forecast the trip generation of vehicles using fuel stations on a major arterial road, which is mostly characterized by through traffic, half of the peak hour rate should be used, which would make the forecast of trip generation more precise, accurate, and compatible with the surrounding environment.
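The suggested rule of thumb, halving the questionnaire/ITE peak hour rate for through-traffic arterials, amounts to simple arithmetic. The daily volume below is hypothetical.

```python
def fuel_station_trips(daily_volume, peak_hour_rate, through_traffic=True):
    """Estimated peak-hour trips at a fuel station on an arterial road.
    Following the study's suggestion, the survey-based peak hour rate
    (~13-14%) is halved for through-traffic arterials, where cameras
    and automatic counters observed only ~6-7%."""
    rate = peak_hour_rate / 2 if through_traffic else peak_hour_rate
    return daily_volume * rate

# Hypothetical: 2,000 vehicles/day and a 13.5% survey-based peak hour rate
trips = fuel_station_trips(2000, 0.135)  # 135.0 trips with the halved rate
```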

Keywords: peak rate, trips generation, fuel station, arterial road

Procedia PDF Downloads 411
6612 Victims of Imprisonment: Incarceration and Post-Release Effects of Confinement with Women with a Mental Illness

Authors: Anat Yaron Antar, Tomer Einat

Abstract:

This study explores the effects of imprisoning women together with females with mental disorders on the well-being of the former, both during imprisonment and after their release from prison. It is based on in-depth interviews with 22 women ex-prisoners who had been imprisoned for at least two years in the single Israeli female correctional facility, Neve Tirza Prison, and released one to three months before the initiation of the study to a community-based agency managed by the Israeli Prisoner Rehabilitation Authority, and on a qualitative, constructive strategy. We found that: (i) mentally ill prisoners' conduct creates severe feelings of stress and discomfort among many of the prisoners without a mental disorder; (ii) the intimate and often long-term encounters with prisoners with a mental illness lead to increased feelings of distress, helplessness, fear, and frustration among many of the women prisoners; (iii) the damaging encounters between women prisoners and mentally ill prisoners harmed the reintegration of the former into society after release; and (iv) the women ex-prisoners lacked the basic mental, cognitive, and social tools necessary for dealing with female inmates with a mental illness and had received no psychological or emotional support from the prison personnel. Consequently, they suffered, and still suffer, from traumatic and upsetting memories. Our findings led us to conclude that women prisoners should be imprisoned separately from female prisoners with mental disorders or be offered a wide range of psychological and emotional coping tools as well as various rehabilitative treatment programs.

Keywords: women, prisoners, mentally ill, health

Procedia PDF Downloads 129
6611 A Development of Creative Instruction Model through Digital Media

Authors: Kathaleeya Chanda, Panupong Chanplin, Suppara Charoenpoom

Abstract:

The purposes of this development of a creative instruction model through digital media are to: 1) enable learners to learn from an instruction media application; 2) help learners implement instruction media correctly and appropriately; and 3) facilitate learners in applying technology for searching information and practicing skills to implement technology creatively. The sample group consists of 130 secondary students studying in Bo Kluea School, Bo Kluea Nuea Sub-district, Bo Kluea District, Nan Province. Probability sampling was carried out through simple random sampling, and the statistics used in this research are percentage, mean, standard deviation, and a one-group pretest-posttest design. The findings are summarized as follows: the congruence index of the instruction media for occupation and technology subjects is appropriate. Comparing learning achievements before and after implementing the instruction media, it is found that the posttest achievements are higher than the pretest achievements, with statistical significance at the .05 level. For the learning achievements from the instruction media implementation, the pretest mean is 16.24 while the posttest mean is 26.28. When the pretest and posttest results are compared and the differences of means are tested, the results show that the posttest achievements are higher than the pretest achievements with statistical significance at the .05 level. This can be interpreted as the learners achieving better learning progress.
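The one-group pretest-posttest comparison reduces to a paired-samples t test. A sketch using hypothetical scores chosen to echo the reported means, not the study's raw data.

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-samples t statistic and degrees of freedom for a
    one-group pretest-posttest design."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n)), n - 1

# Hypothetical scores echoing the reported direction (pre ~16, post ~26)
pre  = [15, 17, 16, 18, 16, 15, 17, 16]
post = [25, 27, 26, 28, 27, 25, 26, 27]
t, df = paired_t(pre, post)
# |t| is then compared against the critical value at alpha = .05, df = n - 1
```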

Keywords: teaching learning model, digital media, creative instruction model, Bo Kluea school

Procedia PDF Downloads 143
6610 Lipid Emulsion versus DigiFab in a Rat Model of Acute Digoxin Toxicity

Authors: Cansu Arslan Turan, Tuba Cimilli Ozturk, Ebru Unal Akoglu, Kemal Aygun, Ecem Deniz Kırkpantur, Ozge Ecmel Onur

Abstract:

Although the mechanism of action is not well known, Intravenous Lipid Emulsion (ILE) has been shown to be effective in the treatment of lipophilic drug intoxications. It is thought that ILE probably separates lipophilic drugs from target tissue by creating a lipid-rich compartment in the plasma. The second theory is that ILE provides energy to the myocardium through high-dose free fatty acids, activating the voltage-gated calcium channels in the myocytes. In this study, the effects of ILE treatment on digoxin overdose, which is frequently observed in emergency departments, were investigated in an animal model in terms of cardiac side effects and survival. The study was carried out at the Yeditepe University Faculty of Medicine Experimental Animals Research Center Labs in December 2015. Forty Sprague-Dawley rats weighing 300-400 g were randomly divided into 5 groups. As pre-treatment, the first group received saline, the second group received lipid, the third group received DigiFab, and the fourth group received DigiFab and lipid. Following that, digoxin was infused into all groups until death, except the control group. Times to first arrhythmia and to cardiac arrest were recorded. As no medication causing arrhythmia was infused, Group 5 was excluded from the statistical analysis performed for the comparisons of first arrhythmia and death times. Although there was no significant difference in the statistical analysis comparing the four groups, when the rats exposed only to digoxin intoxication were compared with the rats pre-treated with ILE in terms of times to first arrhythmia and cardiac arrest, a significant difference was observed between the groups. According to our results, using DigiFab, intralipid, or intralipid and DigiFab treatment for rats exposed to digoxin intoxication makes no significant difference in terms of time to first arrhythmia and death. 
However, at the doses used in this study, it is not possible to say that ILE treatment is at least as successful as a known antidote. Since statistical significance between the two groups was not observed in the comparisons across all groups, the study should be repeated with larger groups.

Keywords: arrhythmia, cardiac arrest, DigiFab, digoxin intoxication

Procedia PDF Downloads 235
6609 Predictive Analytics for Theory Building

Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim

Abstract:

Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) on a single person or unit. It applies empirical methods from statistics, operations research, and machine learning to predict future or otherwise unknown events or outcomes for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed with causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can perform vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big data predictive analytics platform based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where items in a basket are fully connected. A cluster is a collection of fully connected items, where a specific group of items has co-occurred in several rows of a data set. Clusters can be ranked using importance metrics such as node size (number of items), frequency, and surprise (observed frequency vs. expected), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge numbers of transactions can be represented and processed efficiently. 
For a demonstration, a total of 13,254 metabolic syndrome training observations are loaded into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors, for example, associated with sociodemographics, habits, and activities. Some, such as cancer examination, house type, and vaccination, are intentionally included to gain predictive analytics insights on variable selection. The platform automatically generates plausible hypotheses (rules) without statistical modeling. The rules are then validated against an external testing dataset of 4,090 observations. The results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation. In contrast, a set of rules (many estimated equations from a statistical perspective) in this study may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the generated hypotheses are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods that utilize a subset of observations, such as bootstrap resampling with an appropriate sample size.
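A toy version of the co-occurrence counting underlying such clusters might look like this. The predictor "baskets" are invented, and a real platform would also rank clusters by surprise and node-size metrics rather than raw counts alone.

```python
from collections import Counter
from itertools import combinations

def cooccurrence_clusters(transactions, min_count=2):
    """Count item-pair co-occurrences across transactions and keep pairs
    whose observed count reaches min_count -- a toy stand-in for ranking
    clusters by frequency on a co-occurrence graph."""
    pairs = Counter()
    for basket in transactions:
        for pair in combinations(sorted(set(basket)), 2):
            pairs[pair] += 1
    return {p: c for p, c in pairs.items() if c >= min_count}

# Hypothetical predictor 'baskets' (one row per observation)
rows = [["smoking", "low_activity", "ms"],
        ["smoking", "low_activity", "ms"],
        ["vaccination", "house_type"]]
rules = cooccurrence_clusters(rows)
# Only the pairs seen in both of the first two rows survive the cutoff
```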

Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building

Procedia PDF Downloads 277
6608 Enhancing Sell-In and Sell-Out Forecasting Using Ensemble Machine Learning Method

Authors: Vishal Das, Tianyi Mao, Zhicheng Geng, Carmen Flores, Diego Pelloso, Fang Wang

Abstract:

Accurate sell-in and sell-out forecasting is a ubiquitous problem in the retail industry and an important element of any demand planning activity. As a global food and beverage company, Nestlé has hundreds of products in each geographical location in which it operates. Each product has its own sell-in and sell-out time series data, which are forecasted on weekly and monthly scales for demand and financial planning. To address this challenge, Nestlé Chile, in collaboration with the Amazon Machine Learning Solutions Lab, has developed an in-house solution using machine learning models for forecasting. Similar products are combined so that there is one model for each product category. In this way, the models learn from a larger set of data, and there are fewer models to maintain. The solution is scalable to all product categories and is designed to be flexible enough to include any new product or eliminate any existing product in a product category based on requirements. We show how the machine learning development environment on Amazon Web Services (AWS) can be used to explore a set of forecasting models and create business intelligence dashboards that work with the existing demand planning tools in Nestlé. We explored recent deep neural networks (DNNs), which show promising results for a variety of time series forecasting problems. Specifically, we used a DeepAR autoregressive model that can group similar time series together and provide robust predictions. To further enhance the accuracy of the predictions and include domain-specific knowledge, we designed an ensemble approach using DeepAR and an XGBoost regression model. As part of the ensemble approach, we interlinked the sell-out and sell-in information to ensure that a future sell-out influences the current sell-in predictions. Our approach outperforms the benchmark statistical models by more than 50%. 
The machine learning (ML) pipeline implemented in the cloud is currently being extended to other product categories and is being adopted by other geomarkets.
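The ensemble and interlinking ideas can be sketched abstractly. The blending weight and coupling factor below are assumptions, and the real system uses trained DeepAR and XGBoost models rather than precomputed prediction lists.

```python
def ensemble_forecast(deepar_preds, xgb_preds, w=0.5):
    """Weighted blend of two models' forecasts; in practice w would be
    tuned on a holdout set (0.5 is an illustrative default)."""
    return [w * a + (1 - w) * b for a, b in zip(deepar_preds, xgb_preds)]

def interlink_sell_in(sell_in_preds, sell_out_preds, alpha=0.2):
    """Sketch of interlinking: nudge each sell-in prediction toward the
    next period's sell-out forecast (alpha is an assumed coupling weight)."""
    adjusted = []
    for t, s_in in enumerate(sell_in_preds):
        future_out = sell_out_preds[min(t + 1, len(sell_out_preds) - 1)]
        adjusted.append((1 - alpha) * s_in + alpha * future_out)
    return adjusted

blend = ensemble_forecast([2.0, 4.0], [4.0, 6.0])  # [3.0, 5.0]
```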

Keywords: sell-in and sell-out forecasting, demand planning, DeepAR, retail, ensemble machine learning, time-series

Procedia PDF Downloads 276
6607 Attitudinal Change: A Major Therapy for Non-Technical Losses in the Nigerian Power Sector

Authors: Fina O. Faithpraise, Effiong O. Obisung, Azele E. Peter, Chris R. Chatwin

Abstract:

This study identifies consumer attitude as a major influence behind non-technical losses in the Nigerian electricity supply sector. This finding results from a survey combining quantitative and qualitative research. The dataset is a simple random sample of households using electricity (public power supply), with the number of units chosen based on statistical power analysis. The units were subdivided into two categories (households with and without electrical meters). The hypothesis formulated was tested and analyzed using the chi-square statistical method. The results obtained show that the computed statistic for households with a prepaid meter (EPM) exceeded the critical value (427.4 > 9.488), as did that for households without one (EPMn, 436.1 > 9.488), with a p-value of 0.01%. The analysis establishes that a careless attitude towards handling the electricity supplied (not turning off light bulbs and electrical appliances when not in use, indoors and outdoors, for up to 12 hours of the day) characterizes the non-technical losses in the power sector. Therefore, the adoption of efficient lighting attitudes in individual households, as recommended by the researchers, is greatly encouraged. The results from this study should serve as a model for energy efficiency, guiding improvements in electricity consumption as well as supporting a stable economy.
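The hypothesis test described above can be reproduced generically with SciPy. A minimal sketch with invented counts: the only value taken from the abstract is the critical value 9.488, i.e. chi-square at alpha = 0.05 with 4 degrees of freedom.

```python
from scipy.stats import chi2, chisquare

observed = [120, 80, 60, 40, 30]     # hypothetical survey response counts
expected = [66, 66, 66, 66, 66]      # equal-frequency null hypothesis

# goodness-of-fit statistic and p-value
stat, p = chisquare(observed, f_exp=expected)

# critical value for alpha = 0.05 with df = (5 categories - 1) = 4
critical = chi2.ppf(0.95, df=len(observed) - 1)   # approx. 9.488
reject_null = stat > critical
```

The null hypothesis is rejected when the computed statistic exceeds the critical value, which is how the comparisons 427.4 > 9.488 and 436.1 > 9.488 in the abstract are to be read.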

Keywords: attitudinal change, household, non-technical losses, prepaid meter

Procedia PDF Downloads 180
6606 Statistical Analysis of Parameters Effects on Maximum Strain and Torsion Angle of FRP Honeycomb Sandwich Panels Subjected to Torsion

Authors: Mehdi Modabberifar, Milad Roodi, Ehsan Souri

Abstract:

In recent years, honeycomb fiber-reinforced plastic (FRP) sandwich panels have been increasingly used in various industries. Low weight, low price, and high mechanical strength are the benefits of these structures. However, their mechanical properties and behavior have not been fully explored. The objective of this study is to conduct a combined numerical-statistical investigation of honeycomb FRP sandwich beams subjected to torsion loads. The paper investigates the effect of the geometric parameters of the sandwich panel on the maximum shear strain in both the face and the core, and on the angle of torsion of honeycomb FRP sandwich structures under torsion. The effects of core thickness, face skin thickness, cell shape, cell size, and cell thickness on the mechanical behavior of the structure were numerically investigated. Main effects of the factors were considered and regression equations were derived. The Taguchi method was employed as the experimental design, and an optimum parameter combination for maximum structural stiffness was obtained. The results showed that cell size and face skin thickness have the most significant impact on the torsion angle and on the maximum shear strain in the face and core.
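As an illustration of the Taguchi-style main-effects analysis, effects of two-level factors can be computed from an orthogonal array. The L8 fragment and the response values below are invented for illustration, not the paper's finite element results.

```python
import numpy as np

# five columns of the standard L8(2^7) orthogonal array: 8 runs,
# five two-level factors (e.g. core thickness, face skin thickness,
# cell shape, cell size, cell thickness), coded 0/1
L8 = np.array([
    [0, 0, 0, 0, 0], [0, 0, 0, 1, 1], [0, 1, 1, 0, 0], [0, 1, 1, 1, 1],
    [1, 0, 1, 0, 1], [1, 0, 1, 1, 0], [1, 1, 0, 0, 1], [1, 1, 0, 1, 0],
])

def main_effects(design, response):
    """Main effect of each two-level factor: mean response at level 1
    minus mean response at level 0."""
    return np.array([response[design[:, j] == 1].mean()
                     - response[design[:, j] == 0].mean()
                     for j in range(design.shape[1])])

# synthetic response driven by factors 0 and 2 only
y = 2.0 * L8[:, 0] + 5.0 * L8[:, 2]
effects = main_effects(L8, y)
```

Because the array is orthogonal and balanced, each factor's effect is estimated independently of the others; the largest absolute effects identify the dominant parameters, mirroring the paper's conclusion about cell size and face skin thickness.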

Keywords: finite element, honeycomb FRP sandwich panel, torsion, civil engineering

Procedia PDF Downloads 418
6605 Trial Version of a Systematic Material Selection Tool in Building Element Design

Authors: Mine Koyaz, M. Cem Altun

Abstract:

Selecting materials that satisfy the expected performances is critically important for any design. Today, with constantly evolving and developing technologies, the range of material options is so wide that support tools are becoming necessary in the selection process. Therefore, as a sub-process of building element design, a systematic material selection tool is developed that defines four main steps of material selection: definition, research, comparison and decision. The main purpose of the tool is to serve as an educational instrument that shows architecture students a methodic way of selecting materials in architectural detailing. The tool predefines the possible uses of various material databases and other sources of information on material properties. Hence, it is to be used as guidance for designers, especially those with limited material knowledge and experience. The material selection tool embraces not only the technical properties of materials related to building elements' functional requirements, but also their sensory properties related to the identity of the design, and their environmental impacts with respect to the sustainability of the design. The method followed in the development of the tool has two main sections: first, the examination and application of existing methods, and second, the development of trial versions and their applications. Within the scope of the existing methods, design support tools, methodic approaches to building element design and material selection, material properties, material databases, and methodic approaches to decision making are examined. The existing methods are applied by architecture students and newly graduated architects to different design problems. Based on the results of these applications, the strong and weak sides of the existing material selection tools are presented.
A main flow chart of the material selection tool has been developed with the objective of applying the strong aspects of the existing methods and improving on their weak sides. At each stage, a different aspect of the material selection process was investigated, until the tool took its final form. The systematic material selection tool, within the building element design process, guides users with minimal background information to practically and accurately determine the ideal material that satisfies the needs of their design. The tool has a flexible structure that answers the different needs of different designs and designers. The trial version presented in this paper shows one of the paths that could be followed and illustrates its application on a design problem.

Keywords: architectural education, building element design, material selection tool, systematic approach

Procedia PDF Downloads 352
6604 Music Reading Expertise Facilitates Implicit Statistical Learning of Sentence Structures in a Novel Language: Evidence from Eye Movement Behavior

Authors: Sara T. K. Li, Belinda H. J. Chung, Jeffery C. N. Yip, Janet H. Hsiao

Abstract:

Music notation and text reading both involve statistical learning of musical or linguistic structures. However, it remains unclear how music reading expertise influences text reading behavior. The present study examined this issue through an eye-tracking experiment. Chinese-English bilingual musicians and non-musicians read English sentences, Chinese sentences, musical phrases, and sentences in Tibetan, a language novel to the participants, with their eye movements recorded. Each set of stimuli consisted of two conditions in terms of structural regularity: syntactically correct and syntactically incorrect musical phrases/sentences. Participants then completed a sentence comprehension task (for syntactically correct sentences) or a musical segment/word recognition task to test their comprehension/recognition abilities. The results showed that in reading musical phrases, as compared with non-musicians, musicians had higher accuracy in the recognition task, and had shorter reading times, fewer fixations, and shorter fixation durations when reading syntactically correct (i.e., in a diatonic key) than incorrect (i.e., in a non-diatonic key/atonal) musical phrases. This result reflects their expertise in music reading. Interestingly, in reading Tibetan sentences, which were novel to both participant groups, non-musicians did not show any behavioral differences between reading syntactically correct and incorrect Tibetan sentences, whereas musicians showed shorter reading times and marginally fewer fixations when reading syntactically correct sentences than syntactically incorrect ones. However, none of the musicians reported discovering any structural regularities in the Tibetan stimuli when asked explicitly after the experiment, suggesting that they may have implicitly acquired the structural regularities of Tibetan sentences. This group difference was not observed when they read English or Chinese sentences.
This result suggests that music reading expertise facilitates reading texts in a novel language (i.e., Tibetan), but not in languages the readers are already familiar with (i.e., English and Chinese). This phenomenon may be due to the similarities between reading music notation and reading texts in a novel language: in both cases the stimuli follow particular statistical structures but do not involve semantic or lexical processing. Thus, musicians may transfer statistical learning skills stemming from their music notation reading experience to implicitly discover the structures of sentences in a novel language. This speculation is consistent with a recent finding that music reading expertise modulates the processing of English nonwords (i.e., words that do not follow morphological or orthographic rules) but not pseudo- or real words. These results suggest that the modulation of language processing by music reading expertise depends on the similarities in the cognitive processes involved. It also has important implications for the benefits of music education on language and cognitive development.

Keywords: eye movement behavior, eye-tracking, music reading expertise, sentence reading, structural regularity, visual processing

Procedia PDF Downloads 383
6603 Outsourcing the Front End of Innovation

Authors: B. Likar, K. Širok

Abstract:

The paper presents a new method for efficient innovation process management. Even though innovation management methods, tools and knowledge are well established and documented in the literature, most companies still do not manage innovation efficiently. Especially in SMEs, the front end of innovation - problem identification, idea creation and selection - is often not optimally performed. Our eMIPS methodology represents a sort of "umbrella methodology": a well-defined set of procedures which can be dynamically adapted to the concrete case in a company. In daily practice, various methods (e.g. for problem identification and idea creation) can be applied, depending on the company's needs. It is based on the proactive involvement of the company's employees, supported by the appropriate methodology and external experts. The phases presented are performed via a mixture of face-to-face activities (workshops) and online (eLearning) activities taking place in an eLearning Moodle environment and using other e-communication channels. One part of the outcome is an identified set of opportunities and concrete solutions ready for implementation. The other, equally important result concerns the innovation competences of the participating employees, related to concrete tools and methods for idea management. In addition, the employees gain strong experience in dynamic, efficient and solution-oriented management of the invention process. The eMIPS also represents a way of establishing or improving the innovation culture in the organization. The first application in a pilot company showed excellent results regarding both the motivation of participants and the outcomes achieved.

Keywords: creativity, distance learning, front end, innovation, problem

Procedia PDF Downloads 329
6602 The Impact of Public Open Space System on Housing Price in Chicago

Authors: Si Chen, Le Zhang, Xian He

Abstract:

This research explored the influence of the public open space system on housing prices through hedonic models, in order to support better open space plans and economic policies. We had three initial hypotheses: 1) the public open space system has an overall positive influence on surrounding housing prices; 2) different public open space types have different levels of influence on surrounding housing prices; 3) walking and driving accessibility from a property to public open spaces have different statistical relations with housing prices. Cook County, Illinois, was chosen as the study area because of its data availability, sufficient variety of open space types, and long-term open space preservation strategies. We considered housing attributes, driving and walking accessibility scores from houses to nearby public open spaces, and driving accessibility scores to hospitals as explanatory features, and used real housing sale prices in 2010 as the dependent variable in the hedonic model. Through ordinary least squares (OLS) regression analysis, global Moran's I analysis and geographically weighted regression analysis, we observed the statistical relations between public open spaces and housing sale prices in the three hedonic models and confirmed all three hypotheses.
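In its OLS form, a hedonic price model of the kind described reduces to regressing sale price on attributes and accessibility scores. The sketch below uses synthetic data and invented coefficients; it is not the Cook County model.

```python
import numpy as np

def fit_hedonic_ols(X, y):
    """OLS fit of (log) sale price on housing attributes and
    accessibility scores; returns intercept plus coefficients."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return beta

# synthetic example: columns stand in for floor area, walking-accessibility
# score to open space, and driving-accessibility score to a hospital
rng = np.random.default_rng(42)
X = rng.uniform(0, 1, size=(200, 3))
true_beta = np.array([11.0, 0.8, 0.3, -0.1])   # invented values
y = true_beta[0] + X @ true_beta[1:] + rng.normal(scale=0.01, size=200)
beta_hat = fit_hedonic_ols(X, y)
```

In a hedonic reading, each fitted coefficient is the implicit marginal price of its attribute; a positive coefficient on the open-space accessibility score would support the first hypothesis.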

Keywords: hedonic model, public open space, housing sale price, regression analysis, accessibility score

Procedia PDF Downloads 134
6601 Comparative Study to Evaluate Chronological Age and Dental Age in North Indian Population Using Cameriere Method

Authors: Ranjitkumar Patil

Abstract:

Age estimation has its importance in forensic dentistry. Dental age estimation has emerged as an alternative to skeletal age determination, and methods based on the stages of tooth formation, as appreciated on radiographs, seem more appropriate for assessing age than those based on skeletal development. Aims/Objectives: The study was conducted to assess the dental age of north Indian children using Cameriere's method, and to compare chronological age and dental age in order to validate Cameriere's method in the north Indian population. A comparative study of two years' duration was conducted on orthopantomogram (OPG, acquired with PLANMECA Promax 3D) data of 497 individuals aged 5 to 15 years, selected by a simple random technique; ethical approval was obtained from the institutional ethics committee. The data, obtained on the basis of inclusion and exclusion criteria, were analyzed by software for dental age estimation. Statistical analysis: Student's t-test was used to compare the morphological variables of males with those of females and to compare observed age with estimated age. A regression formula was also calculated. Results: The present study comprised 497 subjects, distributed between males and females, whose dental age was assessed from panoramic radiographs following the widely accepted method described by Cameriere. Statistical analysis indicated that gender does not have a significant influence on age estimation (R² = 0.787). Conclusion: This infers that Cameriere's method can be effectively applied in the north Indian population.
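The comparison of observed (chronological) and estimated (dental) age described above is a paired Student's t-test. A sketch on invented ages, not the study's 497 records:

```python
import numpy as np
from scipy.stats import ttest_rel

def compare_paired_ages(chronological, dental, alpha=0.05):
    """Paired Student's t-test of chronological vs estimated dental age;
    returns the statistic, p-value, and whether a systematic bias is flagged."""
    stat, p = ttest_rel(chronological, dental)
    return stat, p, bool(p < alpha)

rng = np.random.default_rng(1)
chron = rng.uniform(5, 15, size=40)               # hypothetical ages in years
dental = chron + rng.normal(scale=0.5, size=40)   # unbiased estimate + noise
stat, p, biased = compare_paired_ages(chron, dental)
```

A significant result would indicate that the dental age method systematically over- or underestimates chronological age in the sampled population.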

Keywords: forensic, chronological age, dental age, skeletal age

Procedia PDF Downloads 90
6600 Statistical Analysis of Extreme Flow (Regions of Chlef)

Authors: Bouthiba Amina

Abstract:

The estimation of statistics related to precipitation represents a vast domain that poses numerous challenges to meteorologists and hydrologists. Sometimes it is necessary to approximate the values of extreme events, and their return periods, for sites where there is little or no data. The search for a model of the frequency of daily rainfall depths is of great importance in operational hydrology: it establishes a basis for predicting the frequency and intensity of floods from the precipitation amounts estimated over past years. The best-known and most common approach is the statistical one: it consists of looking for the probability law that best fits the observed values of the random variable "daily maximal rain", after comparing various probability laws and estimation methods by means of goodness-of-fit tests. A frequency analysis of the annual series of daily maximal rains was therefore carried out on the data of 54 pluviometric stations of the high and middle Chlef basin. Five laws usually applied to the study and analysis of maximal daily rains were considered: the three-parameter generalized extreme value (GEV) law, the two-parameter extreme value laws (Gumbel and log-normal), and the three-parameter Pearson type III and log-Pearson III laws. The chosen period is from 1970 to 2013, and the fitted laws were used to forecast quantiles. In Algeria, Gumbel's law has long been used to estimate the quantiles of maximum flows; here, however, we check and choose the most reliable law.
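To make the fitting step concrete, the sketch below fits a Gumbel (EV1) law to a synthetic series of annual maxima and derives return-period quantiles; the data are simulated, not the Chlef records, and the location/scale values are invented.

```python
import numpy as np
from scipy.stats import gumbel_r

def gumbel_return_levels(annual_maxima, return_periods=(10, 50, 100)):
    """Fit Gumbel (EV1) by maximum likelihood and return the rainfall
    depth expected to be exceeded once every T years, i.e. the
    quantile at non-exceedance probability 1 - 1/T."""
    loc, scale = gumbel_r.fit(annual_maxima)
    return {T: float(gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale))
            for T in return_periods}

# simulated 44-year series (1970-2013) of annual daily-rainfall maxima, in mm
rng = np.random.default_rng(7)
maxima = gumbel_r.rvs(loc=40, scale=12, size=44, random_state=rng)
levels = gumbel_return_levels(maxima)
```

In practice, each candidate law (GEV, Gumbel, log-normal, Pearson III, log-Pearson III) would be fitted the same way and compared with a goodness-of-fit test before the quantiles are trusted.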

Keywords: return period, extreme flow, statistics laws, Gumbel, estimation

Procedia PDF Downloads 79
6599 Change of Education Business in the Age of 5G

Authors: Heikki Ruohomaa, Vesa Salminen

Abstract:

Regions face huge competition to attract companies, businesses, inhabitants, students, etc., and thus strive to improve their living and business environments, which are rapidly changing due to digitalization. From the industry's point of view, on the other hand, the availability of a skilled labor force and an innovative environment are crucial factors. In this context, qualified staff are needed to utilize the opportunities of digitalization and respond to future skills needs. The World Manufacturing Forum stated in its 2019 report that within the next five years, 40% of workers will have to change their core competencies. Through digital transformation, with new technologies like cloud, mobile, big data, 5G infrastructure, platform technology, data analysis, and social networks with increasing intelligence and automation, enterprises can capitalize on new opportunities and optimize existing operations to achieve significant business improvement. Digitalization will be an important part of the everyday life of citizens and present in the working day of the average citizen and employee in the future. For that reason, the education system and education programs on all levels, from diaper age to doctorate, have been directed to fulfill this ecosystem strategy. Goal: The Fourth Industrial Revolution will bring unprecedented change to societies, education organizations and business environments. This article aims to identify how education, its content, the way it is delivered, and the education business as a whole are changing. Most important is how we should respond to this inevitable co-evolution. Methodology: The study aims to verify how the learning process is boosted by new digital content, new learning software and tools, and customer-oriented learning environments. The change of education programs and individual education modules can be supported by applied research projects.
These projects can be used to build a proof of concept of new technology and of new ways to teach and train, and, through the experience gathered, to change education content, ways of educating, and finally the education business as a whole. Major findings: Applied research projects can run proof-of-concept phases in real-environment field labs to test technology opportunities and new tools for training purposes. Customer-oriented applied research projects are also excellent opportunities for students to carry out assignments using new knowledge and content, and for teachers to test new tools and create new ways to educate. New content and problem-based learning are used in future education modules. This article introduces case study experiences from customer-oriented digital transformation projects and shows how the knowledge gathered on new digital content and new ways to educate has influenced education. The case study draws on the experiences of research projects, customer-oriented field labs/learning environments and education programs of Häme University of Applied Sciences.

Keywords: education process, digitalization content, digital tools for education, learning environments, transdisciplinary co-operation

Procedia PDF Downloads 177
6598 Application of the Building Information Modeling Planning Approach to the Factory Planning

Authors: Peggy Näser

Abstract:

Factory planning is a systematic, objective-oriented process for planning a factory, structured into a sequence of phases, each of which is dependent on the preceding phase and makes use of particular methods and tools, and extending from the setting of objectives to the start of production. The digital factory, on the other hand, is the generic term for a comprehensive network of digital models, methods, and tools – including simulation and 3D visualisation – integrated by a continuous data management system. Its aim is the holistic planning, evaluation and ongoing improvement of all the main structures, processes and resources of the real factory in conjunction with the product. Digital factory planning has already become established in factory planning. The application of Building Information Modeling has not yet been established in factory planning but has been used predominantly in the planning of public buildings. Furthermore, this concept is limited to the planning of the buildings and does not include the planning of equipment of the factory (machines, technical equipment) and their interfaces to the building. BIM is a cooperative method of working, in which the information and data relevant to its lifecycle are consistently recorded, managed and exchanged in a transparent communication between the involved parties on the basis of digital models of a building. Both approaches, the planning approach of Building Information Modeling and the methodical approach of the Digital Factory, are based on the use of a comprehensive data model. Therefore it is necessary to examine how the approach of Building Information Modeling can be extended in the context of factory planning in such a way that an integration of the equipment planning, as well as the building planning, can take place in a common digital model. 
For this, a number of different perspectives have to be investigated: the equipment perspective, including the tools used to implement a comprehensive digital planning process; the communication perspective between the planners of different fields; the legal perspective, concerning legal certainty in each country; and the quality perspective, defining the quality criteria against which the planning will be evaluated. The individual perspectives are examined and illustrated in the article. An approach model for the integration of factory planning into the BIM approach is developed, in particular for the integrated planning of equipment and buildings and for continuous digital planning. For this purpose, the individual factory planning phases are detailed in the sense of the integration of the BIM approach, and a comprehensive software concept for the tool is shown. In addition, the prerequisites required for this integrated planning are presented. With the help of the newly developed approach, better coordination between equipment and buildings is to be achieved, the continuity of digital factory planning is improved, data quality is raised, and expensive errors are avoided in the implementation.

Keywords: building information modeling, digital factory, digital planning, factory planning

Procedia PDF Downloads 269
6597 Framework for Decision Support Tool for Quality Control and Management in Botswana Manufacturing Companies

Authors: Mogale Sabone, Thabiso Ntlole

Abstract:

The pressure of globalization has pushed manufacturing organizations towards three major competitive arenas: quality, cost, and responsiveness. Quality is a universal value and has become a global issue. In order to survive and be able to provide customers with good products, the supporting systems, tools, and structures a manufacturing organization uses must grow or evolve. The majority of quality management concepts and strategies practiced today are aimed at detecting and correcting problems which already exist, and serve to limit losses. In an agile manufacturing environment there is no room for defects and errors, so quality management must be proactively directed at problem prevention. Such proactive quality management avoids losses by focusing on failure prevention, virtual elimination of the possibility of premature failure, mistake-proofing, and assuring consistently high quality in the definition and design of creation processes. To achieve this, a decision support tool for quality control and management is suggested. The current decision support tools/methods used by most manufacturing companies in Botswana for quality management and control are not integrated; for example, they are not consistent, since some test result data are recorded only manually whilst others are recorded electronically. They amount to a set of procedures, not a tool, and cannot offer interactive decision support. This brings to light the aim of this research, which is to develop a framework that will help manufacturing companies in Botswana build a decision support tool for quality control and management.

Keywords: decision support tool, manufacturing, quality control, quality management

Procedia PDF Downloads 566
6596 Experimental Design for Formulation Optimization of Nanoparticle of Cilnidipine

Authors: Arti Bagada, Kantilal Vadalia, Mihir Raval

Abstract:

Cilnidipine is practically insoluble in water, which results in insufficient oral bioavailability. The purpose of the present investigation was to formulate cilnidipine nanoparticles by the nanoprecipitation method to increase the aqueous solubility and dissolution rate, and hence the bioavailability, utilizing statistical experimental design. Experimental designs were used to investigate the specific effects of the independent variables during the preparation of cilnidipine nanoparticles and the corresponding responses in optimizing the formulation. A Plackett-Burman design for the independent variables was successfully employed for the optimization of the cilnidipine nanoparticles. The independent variables studied were drug concentration, solvent-to-antisolvent ratio, polymer concentration, stabilizer concentration and stirring speed. The dependent variables were the average particle size, polydispersity index, zeta potential and saturation solubility of the formulated cilnidipine nanoparticles. The experiments were carried out in 13 runs involving the 5 independent variables (at higher and lower levels) according to the Plackett-Burman design. The optimized cilnidipine nanoparticles were characterized by average particle size, polydispersity index, zeta potential and saturation solubility, and the results were 149 nm, 0.314, 43.24 and 0.0379 mg/ml, respectively. The experimental results correlated well with the data predicted by the Plackett-Burman statistical method.
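Plackett-Burman screening designs are built from Hadamard matrices. The study used 13 runs; as an illustration only, the sketch below uses the power-of-2 case (8 runs, which `scipy.linalg.hadamard` supports) to screen 5 two-level factors, with an invented response.

```python
import numpy as np
from scipy.linalg import hadamard

def screening_design(n_factors, n_runs=8):
    """Two-level screening design from a Hadamard matrix, the
    construction underlying Plackett-Burman designs. n_runs must
    be a power of 2 for scipy.linalg.hadamard."""
    H = hadamard(n_runs)
    return H[:, 1:n_factors + 1]   # drop the all-ones column

def main_effects(design, response):
    """Effect of each factor: mean response at +1 minus mean at -1.
    In +/-1 coding this equals twice the regression coefficient."""
    return design.T @ response * (2.0 / len(response))

# invented response driven by factors 1 and 3 (e.g. particle size in nm)
D = screening_design(5)
y = 3.0 * D[:, 1] + 1.0 * D[:, 3]
effects = main_effects(D, y)
```

The largest absolute effects flag the variables worth optimizing further, which is the screening role a Plackett-Burman design plays before characterizing the responses.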

Keywords: dissolution enhancement, nanoparticles, Plackett-Burman design, nanoprecipitation

Procedia PDF Downloads 160
6595 Electrical Cardiac Remodeling in Elite Athletes: A Comparative Study between Triathletes and Cyclists

Authors: Lingxia Li, Frédéric Schnell, Thibault Lachard, Anne-Charlotte Dupont, Shuzhe Ding, Solène Le Douairon Lahaye

Abstract:

Background: Repetitive triathlon training results in significant myocardial changes. However, whether the cardiac remodeling in triathletes is related to the specificities of the sport (which consists of three disciplines) remains an open question. Methods: Elite triathletes and cyclists registered on the French ministerial lists of high-level athletes were involved. Basic information and routine electrocardiogram records were obtained, and the electrocardiograms were evaluated according to clinical criteria. Results: Of the 105 athletes included in the study, 42 were short-distance triathletes (40%) and 63 were road cyclists (60%). The average age was 22.1±4.2 years. The P wave amplitude was significantly lower in triathletes than in cyclists (p=0.005), and no statistically significant difference was found in heart rate, RR interval, PR or PQ interval, QRS complex, QRS axis, QT interval, or QTc (p>0.05). All the measured parameters were within normal ranges. The most common electrical manifestations were early repolarization (60.95%) and incomplete right bundle branch block (43.81%); there was no statistical difference between the groups (p>0.05). Conclusions: Prolonged intensive endurance training induces physiological cardiac remodeling in both triathletes and cyclists. The most common electrocardiogram manifestations were early repolarization and incomplete right bundle branch block.

Keywords: cardiac screening, electrocardiogram, triathlon, cycling, elite athletes

Procedia PDF Downloads 12
6594 Antibacterial Evaluation, in Silico ADME and QSAR Studies of Some Benzimidazole Derivatives

Authors: Strahinja Kovačević, Lidija Jevrić, Miloš Kuzmanović, Sanja Podunavac-Kuzmanović

Abstract:

In this paper, various derivatives of benzimidazole were evaluated against the Gram-negative bacterium Escherichia coli. For all investigated compounds, the minimum inhibitory concentration (MIC) was determined. Quantitative structure-activity relationship (QSAR) modeling attempts to find consistent relationships between variations in the values of molecular properties and the biological activity for a series of compounds, so that these rules can be used to evaluate new chemical entities. The correlation between MIC and several absorption, distribution, metabolism and excretion (ADME) parameters was investigated, and mathematical models for predicting the antibacterial activity of this class of compounds were developed. The quality of the multiple linear regression (MLR) models was validated by the leave-one-out (LOO) technique, as well as by the calculation of statistical parameters for the developed models, and the results are discussed on the basis of the statistical data. The results of this study indicate that ADME parameters have a significant effect on the antibacterial activity of this class of compounds. Principal component analysis (PCA) and agglomerative hierarchical clustering (HCA) confirmed that the investigated molecules can be classified into groups on the basis of the ADME parameters: Madin-Darby canine kidney cell permeability (MDCK), plasma protein binding (PPB%), human intestinal absorption (HIA%) and human colon carcinoma cell permeability (Caco-2).
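The LOO validation of an MLR model reduces to refitting the regression n times, each time predicting the held-out compound, and summarizing the predictions as a cross-validated Q². The sketch below uses synthetic descriptor data, not the benzimidazole set.

```python
import numpy as np

def loo_q2(X, y):
    """Leave-one-out cross-validated Q^2 for a multiple linear
    regression (MLR) model fitted by ordinary least squares."""
    n = len(y)
    preds = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i                      # hold out compound i
        Xd = np.column_stack([np.ones(n - 1), X[mask]])
        beta, *_ = np.linalg.lstsq(Xd, y[mask], rcond=None)
        preds[i] = np.concatenate(([1.0], X[i])) @ beta
    return 1 - np.sum((y - preds) ** 2) / np.sum((y - y.mean()) ** 2)

# synthetic example: columns stand in for ADME descriptors
# (e.g. MDCK, PPB%, HIA%, Caco-2); y stands in for log(1/MIC)
rng = np.random.default_rng(3)
X = rng.normal(size=(30, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + rng.normal(scale=0.05, size=30)
q2 = loo_q2(X, y)
```

A Q² close to the fitted R² indicates that the MLR model generalizes rather than overfitting the training compounds; Q² well below R² is the usual warning sign in QSAR validation.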

Keywords: benzimidazoles, QSAR, ADME, in silico

Procedia PDF Downloads 377
6593 Evaluation of the Notifiable Diseases Surveillance System, South, Haiti, 2022

Authors: Djeamsly Salomon

Abstract:

Background: Epidemiological surveillance is a dynamic national system used to observe all aspects of the evolution of priority health problems, through collection, analysis and systematic interpretation of information, and dissemination of results with the necessary recommendations. The study was conducted to assess the mandatory disease surveillance system in the Sud Department. Methods: A study was conducted from March to May 2021 with key players involved in surveillance at the level of health institutions in the department. The CDC's 2021 updated guideline was used to evaluate the system. We collected information about the operation, attributes, and usefulness of the surveillance system using interviewer-administered questionnaires. Epi Info 7.2 and Excel 2016 were used to generate means, frequencies and proportions. Results: Of 30 participants, 23 (77%) were women. The average age was 39 years [30-56]. Twenty-five (83%) had training in epidemiological surveillance. Half (50%) of the forms checked were signed by the supervisor. Collection tools were available at 80% of sites. Knowledge of at least 7 notifiable diseases was universal (100%). Among the respondents, 29 declared that the collection tools were simple, and 27 had already filled in a notification form. The maximum time taken to fill out a form was 10 minutes. Feedback between the different levels occurred at a rate of 60%. Conclusion: The surveillance system is useful, simple, acceptable, representative, flexible, stable and responsive, and the data generated were of high quality. However, it is threatened by the lack of supervision of sentinel sites, lack of investigation and weak feedback. This evaluation demonstrated the urgent need to improve supervision of the sites, to strengthen feedback of information, and to reinforce epidemiological surveillance.

Keywords: evaluation, notifiable diseases, surveillance, system

Procedia PDF Downloads 79