Search results for: open circuit test
11637 Statistical Analysis of Rainfall Change over the Blue Nile Basin
Authors: Hany Mustafa, Mahmoud Roushdi, Khaled Kheireldin
Abstract:
Rainfall variability is an important feature of semi-arid climates. Climate change is very likely to increase the frequency, magnitude, and variability of extreme weather events such as droughts, floods, and storms. The Blue Nile Basin is facing extreme climate-change-related events such as floods and droughts, and impacts on its ecosystems, livelihoods, agriculture, livestock, and biodiversity are expected. Rainfall variability is a threat to food production in the Blue Nile Basin countries. This study investigates the long-term variations and trends of seasonal and annual precipitation over the Blue Nile Basin for a 102-year period (1901-2002). Six statistical trend analyses of precipitation were performed with the nonparametric Mann-Kendall test and Sen's slope estimator. In addition, four absolute homogeneity tests (the Standard Normal Homogeneity Test, the Buishand range test, the Pettitt test, and the Von Neumann ratio test) were applied to test the homogeneity of the rainfall data using XLSTAT software; results with a p-value less than alpha = 0.05 were considered significant. The percentages of significant trends obtained for each parameter in the different seasons are presented. The study recommends that adaptation strategies be streamlined into relevant policies, enhancing local farmers' adaptive capacity to face future climate change effects.
Keywords: Blue Nile basin, climate change, Mann-Kendall test, trend analysis
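The two trend statistics used in this study are distribution-free and simple to compute: the Mann-Kendall S statistic counts concordant minus discordant pairs, and Sen's slope is the median of all pairwise slopes. A minimal sketch (ties ignored for simplicity; the rainfall values below are illustrative only):

```python
import math

def mann_kendall(x):
    """Nonparametric Mann-Kendall trend test.

    Returns the S statistic and the standard normal test
    statistic Z (tie correction omitted for simplicity).
    """
    n = len(x)
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

def sens_slope(x):
    """Sen's slope estimator: median of all pairwise slopes."""
    slopes = sorted(
        (x[j] - x[i]) / (j - i)
        for i in range(len(x) - 1)
        for j in range(i + 1, len(x))
    )
    m = len(slopes)
    mid = m // 2
    return slopes[mid] if m % 2 else 0.5 * (slopes[mid - 1] + slopes[mid])

# A strictly increasing series gives S = n(n-1)/2 and slope 1
series = [10.0, 11.0, 12.0, 13.0, 14.0, 15.0]
s, z = mann_kendall(series)
print(s, sens_slope(series))
```

A significant trend at alpha = 0.05 corresponds to |Z| > 1.96, matching the significance screening described in the abstract.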
Procedia PDF Downloads 552
11636 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking
Authors: Trevor Toy, Josef Langerman
Abstract:
Around a quarter of the world's data is generated by the financial sector, with global non-cash transactions estimated at 708.5 billion. And with Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders to openly, legitimately, and consensually share the data required to enable it. Integration and sharing of anonymised transactional data are still operated in silos and centralised among the large corporate entities in the ecosystem that have the resources to do so. Smaller fintechs generating data and businesses looking to consume data are largely excluded from the process. Therefore, there is a growing demand for accessible transactional data for analytical purposes and to support the rapid global adoption of Open Banking. This research provides a solution framework that aims to deliver a secure decentralised marketplace for 1.) data providers to list their transactional data, 2.) data consumers to find and access that data, and 3.) data subjects (the individuals making the transactions that generate the data) to manage and sell the data that relates to themselves. The platform also provides an integrated system for downstream transaction-related data from merchants, enriching the data product available to build a comprehensive view of a data subject's spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers.
This core component of the platform is developed on a decentralised blockchain contract, with a market layer that manages the transaction, user, pricing, payment, tagging, contract, control, and lineage features that pertain to user interactions on the platform. One of the platform's key features is enabling the participation and management of personal data by the individuals from whom the data is generated. A proof-of-concept of this framework was developed on the Ethereum blockchain, where an individual can securely manage access to their own personal data and to that individual's identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour in correlation with key demographic information. This platform solution can ultimately support the growth, prosperity, and development of economies, businesses, communities, and individuals by providing accessible and relevant transactional data for big data analytics and open banking.
Keywords: big data markets, open banking, blockchain, personal data management
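The consent-gated access at the heart of the marketplace can be sketched off-chain as a plain data structure: a listing releases data to a consumer only if the data subject has granted consent, and every release is appended to a lineage log. This is a hypothetical illustration of the control and lineage features described above, not the actual smart-contract code:

```python
from dataclasses import dataclass, field

@dataclass
class DataListing:
    """A transactional-data listing with subject-managed consent."""
    provider: str
    price: float
    consented_subjects: set = field(default_factory=set)
    access_log: list = field(default_factory=list)  # lineage record

    def grant_consent(self, subject_id: str):
        self.consented_subjects.add(subject_id)

    def revoke_consent(self, subject_id: str):
        self.consented_subjects.discard(subject_id)

    def purchase(self, consumer: str, subject_id: str) -> bool:
        # Data is released only if the subject has consented.
        if subject_id not in self.consented_subjects:
            return False
        self.access_log.append((consumer, subject_id))
        return True

listing = DataListing(provider="bank_a", price=10.0)
listing.grant_consent("subject_1")
print(listing.purchase("fintech_x", "subject_1"))  # consented: True
print(listing.purchase("fintech_x", "subject_2"))  # no consent: False
```

On-chain, the same logic would live in a contract so that consent state and the access log are tamper-evident.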
Procedia PDF Downloads 74
11635 The Dynamic Cone Penetration Test: A Review of Its Correlations and Applications
Authors: Abdulrahman M. Hamid
Abstract:
The Dynamic Cone Penetration Test (DCPT) is widely used for field quality assessment of soils. Its application in predicting the engineering properties of soil is globally promoted by the fact that it is difficult to obtain undisturbed soil samples, especially when loose or submerged sandy soil is encountered. A detailed discussion is presented on the current development of DCPT correlations with resilient modulus, relative density, California Bearing Ratio (CBR), unconfined compressive strength, and shear strength that have been developed for different materials in both the laboratory and the field, as well as on the use of the DCPT in quality control of compaction of earth fills and performance evaluation of pavement layers. In addition, the relationship of the DCPT with other instruments such as the falling weight deflectometer, nuclear gauge, soil stiffness gauge, and plate load test is reported. Lastly, the application of the DCPT in Saudi Arabia in recent years is addressed in this manuscript.
Keywords: dynamic cone penetration test, falling weight deflectometer, nuclear gauge, soil stiffness gauge, plate load test, automated dynamic cone penetration
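The DCPT-CBR correlation mentioned above is often expressed as a power law in the DCP penetration index (DCPI, mm/blow). A minimal sketch, assuming the widely cited form CBR = 292 / DCPI^1.12 (attributed to Webster et al. and adopted by the US Army Corps of Engineers); the exact coefficients vary by material and study:

```python
def cbr_from_dcpi(dcpi_mm_per_blow: float) -> float:
    """Estimate CBR (%) from the DCP index using the commonly cited
    power-law correlation CBR = 292 / DCPI**1.12 (DCPI in mm/blow).
    """
    if dcpi_mm_per_blow <= 0:
        raise ValueError("DCPI must be positive")
    return 292.0 / dcpi_mm_per_blow ** 1.12

# A stiffer layer (less penetration per blow) yields a higher CBR
print(round(cbr_from_dcpi(10.0), 1))
print(round(cbr_from_dcpi(50.0), 1))
```

Correlations of this kind are what allow the DCPT to substitute for sampling-based tests when undisturbed specimens cannot be recovered.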
Procedia PDF Downloads 438
11634 Investigation of the Effects of Processing Parameters on PLA Based 3D Printed Tensile Samples
Authors: Saifullah Karimullah
Abstract:
Additive manufacturing techniques are becoming more common with the latest technological advancements. They are poised to bring a revolution in the way products are designed, planned, manufactured, and distributed to end users. Fused deposition modeling (FDM) based 3D printing is one of those promising aspects that have revolutionized prototyping processes. The purpose of this design and study project is to design a customized laboratory-scale FDM-based 3D printer from locally available sources. The primary goal is to design and fabricate the FDM-based 3D printer. After fabrication, a tensile test specimen is designed in SolidWorks or Creo computer-aided design (CAD) software. A .stl file of the tensile test specimen is generated through slicing software, and the G-codes are sent via a computer for the test specimen to be printed. Different parameters were studied, such as printing speed, layer thickness, and infill density of the printed object. Some parameters were kept constant, such as temperature, extrusion rate, and raster orientation. Different tensile test specimens were printed for different sets of parameters of the FDM-based 3D printer. The tensile test specimens were subjected to tensile tests using a universal testing machine (UTM). Design-Expert software was used for the analyses, and different results were obtained for the different tensile test specimens. The best, average, and worst specimens were also observed under a compound microscope to investigate the layer bonding within them.
Keywords: additive manufacturing techniques, 3D printing, CAD software, UTM machine
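The quantity compared across parameter sets in such a study is typically the ultimate tensile strength, which follows directly from the UTM force record and the specimen's gauge cross-section. A minimal sketch, assuming a flat dog-bone specimen with a rectangular gauge section (the numbers are hypothetical):

```python
def ultimate_tensile_strength(forces_n, width_mm, thickness_mm):
    """Ultimate tensile strength (MPa) from a UTM force record.

    forces_n: recorded tensile forces in newtons.
    Stress (MPa) = force (N) / cross-sectional area (mm^2).
    """
    area_mm2 = width_mm * thickness_mm
    return max(forces_n) / area_mm2

# Hypothetical peak load of 1200 N over a 10 mm x 4 mm gauge section
print(ultimate_tensile_strength([300, 800, 1200, 1100], 10.0, 4.0))
```

Repeating this for each printed parameter set gives the response values that Design-Expert then fits against printing speed, layer thickness, and infill density.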
Procedia PDF Downloads 104
11633 The Effect of Health Program on the Fitness Ability of Abnormal BMI University Students
Authors: Hui-Fang Lee, Meng-Chu Liu, Wen-Chi Lu, Hsuan-Jung Hsieh
Abstract:
The purpose of the study was to examine the effect of a health program on the fitness ability of students with abnormal BMI at Ching-Yun University of Science and Technology. To achieve this purpose, self-regulation theory and dietary education were applied, and the effect of 10 weeks of sports activities and three-day diet records on pre-test and post-test fitness activities was analyzed. There were 40 original participants; nine people who had a normal BMI, low attendance, or an unfinished fitness test were excluded from the research, leaving 31 valid participants (77.5%). The fitness activities included sit-and-reach, one-minute sit-ups, standing long jump, and a three-minute step test. The averages of the three-day diet records were compared, and the differences between pre-test and post-test for the four fitness activities were analyzed with paired-samples t tests. The results showed significant pre-test to post-test differences in male students' BMI and one-minute sit-ups; for female students, sit-and-reach and one-minute sit-ups showed the same effect. Females had high fat intake in their three-day diet records. The research showed that the use of self-regulation theory and dietary education, together with the implementation of sports activities and three-day diet records, could significantly enhance the physical fitness indicators. During the course of sports, we should guide students to think about the gap between their own behavior and ideal behavior, recognize the main reasons and methods for improvement, and finally move toward the goal and improve their physical fitness.
Keywords: self-regulation theory, dietary education, three-day diet records, physical fitness
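The paired-samples t test used here compares each participant against themselves: the statistic is the mean of the post-minus-pre differences divided by the standard error of those differences. A minimal sketch with hypothetical sit-up counts (not the study's data):

```python
import math

def paired_t(pre, post):
    """Paired-samples t statistic for pre/post measurements."""
    assert len(pre) == len(post)
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

# Hypothetical one-minute sit-up counts before and after the program
pre = [20, 25, 22, 18, 30, 24]
post = [26, 29, 27, 22, 33, 30]
print(round(paired_t(pre, post), 2))
```

The resulting t is compared against the critical value for n - 1 degrees of freedom at the chosen alpha to decide significance.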
Procedia PDF Downloads 324
11632 The Damage Assessment of Industrial Buildings Located on Clayey Soils Using in-Situ Tests
Authors: Ismail Akkaya, Mucip Tapan, Ali Ozvan
Abstract:
Some industrially prefabricated buildings located on clayey soils were damaged due to soil conditions. The reasons for these damages are generally differential settlement, the different plasticity of the soils, and the groundwater level. The aim of this study is to determine the source of these building damages by conducting in-situ tests. Therefore, the pressuremeter test, a borehole loading test conducted to determine the properties of soils under foundations, and the Standard Penetration Test (SPT) were performed. The results of these two field tests were then used to accurately obtain the consistency and firmness of the soils. The pressuremeter deformation modulus (EM) and net limit pressure (PL) of the soils were calculated from the pressuremeter tests. These values were then compared with the SPT (N30) and SPT (N60) results. An empirical equation was developed to obtain EM and PL values of such soils from SPT results. These values were then used to calculate the soil bearing capacity as well as the soil settlement. Finally, the relationship between foundation settlement and the damage to these buildings was checked. It was found that the calculated settlement values were almost the same as the measured settlement values.
Keywords: damaged building, pressuremeter, standard penetration test, low and high plasticity clay
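An empirical EM-from-SPT relation of the kind developed in this study is usually obtained by least-squares regression on paired field measurements. A minimal sketch with hypothetical data, assuming a linear model EM = a*N60 + b (the study's own equation and coefficients are not reproduced here):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Hypothetical paired SPT blow counts and pressuremeter moduli (MPa)
n60 = [5, 10, 15, 20, 25]
em = [4.0, 7.5, 11.0, 14.5, 18.0]
a, b = linear_fit(n60, em)
print(a, b)  # slope and intercept of the fitted relation
```

Once fitted, such an equation lets EM and PL be estimated at locations where only SPT data are available, which is exactly how the study extends pressuremeter information across the site.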
Procedia PDF Downloads 319
11631 Development of Taiwanese Sign Language Receptive Skills Test for Deaf Children
Authors: Hsiu Tan Liu, Chun Jung Liu
Abstract:
Developing a sign language receptive skills test serves multiple purposes. For example, such a test can be an important tool for education and for understanding the sign language ability of deaf children. No test was available for these purposes in Taiwan. Through expert discussion and reference to the standardized Taiwanese Sign Language Receptive Test for adults and adolescents, the framework of the Taiwanese Sign Language Receptive Skills Test (TSL-RST) for deaf children was developed, and the items were then designed. After multiple rounds of pre-trials, discussions, and corrections, the TSL-RST was finalized; it can be conducted and scored online. Thirty-three deaf children from all three deaf schools in Taiwan agreed to be tested. Through item analysis, items with a good discrimination index and a fair difficulty index were selected. Moreover, psychometric indices of reliability and validity were established. A regression formula was then derived that can predict the sign language receptive skills of deaf children. The main results of this study are as follows. (1) The TSL-RST includes three sub-tests: vocabulary comprehension, syntax comprehension, and paragraph comprehension, with 21, 20, and 9 items, respectively. (2) The TSL-RST can be conducted individually online. The sign language ability of deaf students can be calculated quickly and objectively, so that they receive feedback and results immediately; this contributes to both teaching and research. Most subjects can complete the test within 25 minutes, and during the test they can answer the questions without relying on their reading ability or memory capacity. (3) The vocabulary comprehension sub-test is the easiest, syntax comprehension is harder, and paragraph comprehension is the hardest. Each of the three sub-tests, and the test as a whole, shows good item discrimination. (4) The psychometric indices are good, including internal consistency reliability (Cronbach's α coefficient), test-retest reliability, split-half reliability, and content validity. Sign language ability is significantly related to non-verbal IQ, teachers' ratings of the students' sign language ability, and students' self-ratings of their own sign language ability. The results showed that higher-grade students perform better than lower-grade students, and students with deaf parents perform better than those with hearing parents. These results give the TSL-RST good discriminant validity. (5) The predictors of the sign language ability of primary school deaf students are age and years since starting to learn sign language. The results of this study suggest that the TSL-RST can effectively assess deaf students' sign language ability. This study also proposes a model for developing sign language tests.
Keywords: comprehension test, elementary school, sign language, Taiwan sign language
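The item-analysis step reported above can be sketched numerically: the difficulty index is the proportion of examinees answering an item correctly, and a simple discrimination index contrasts the upper- and lower-scoring groups. A minimal illustration with hypothetical response data (not the study's):

```python
def difficulty(responses):
    """Proportion of correct answers (1 = correct, 0 = wrong)."""
    return sum(responses) / len(responses)

def discrimination(item_scores, totals, frac=0.27):
    """Upper-lower discrimination index: p(upper) - p(lower).

    item_scores: per-student 0/1 scores on one item.
    totals: per-student total test scores, in the same order.
    """
    order = sorted(range(len(totals)), key=lambda i: totals[i])
    k = max(1, int(len(totals) * frac))
    lower = [item_scores[i] for i in order[:k]]
    upper = [item_scores[i] for i in order[-k:]]
    return difficulty(upper) - difficulty(lower)

item = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]            # one item's 0/1 scores
tot = [40, 38, 12, 35, 15, 10, 42, 37, 20, 33]   # total scores
print(difficulty(item))
print(discrimination(item, tot))
```

Items with near-zero or negative discrimination, or with extreme difficulty, are the ones dropped during the pre-trial rounds described in the abstract.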
Procedia PDF Downloads 190
11630 Experimental Investigation on the Behavior of Steel Fibers Reinforced Concrete under Impact Loading
Authors: Feng Fu, Ahmad Bazgir
Abstract:
This study aimed to investigate and examine the structural behaviour of steel fibre reinforced concrete (SFRC) slabs subjected to impact loading using the drop-weight method. A number of compressive tests, tensile splitting tests, and impact tests were conducted. The experimental work consists of testing both conventionally reinforced slabs and SFRC slabs. The parameters considered in the tests are the volume fraction of steel fibre, the type of steel fibres, the drop-weight height, and the number of blows. The energy absorption of the slabs under impact loading and their failure modes were examined in depth and compared with those of conventionally reinforced concrete slabs.
Keywords: steel fibre reinforced concrete, compressive test, tensile splitting test, impact test
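In a drop-weight test, the energy delivered up to a given blow count follows directly from the drop mass, height, and gravity. A minimal sketch (the rig dimensions below are hypothetical, not the study's):

```python
G = 9.81  # gravitational acceleration, m/s^2

def impact_energy(mass_kg, height_m, blows):
    """Cumulative potential energy (J) delivered by repeated drops:
    E = n * m * g * h (losses in the rig neglected)."""
    return blows * mass_kg * G * height_m

# Hypothetical rig: 5 kg hammer dropped from 0.5 m, 20 blows
print(impact_energy(5.0, 0.5, 20))  # approximately 490.5 J
```

Comparing the blow count (and hence energy) needed to reach first crack or failure is the usual way the energy absorption of SFRC and conventional slabs is contrasted.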
Procedia PDF Downloads 423
11629 Substantial Fatigue Similarity of a New Small-Scale Test Rig to Actual Wheel-Rail System
Authors: Meysam Naeimi, Zili Li, Roumen Petrov, Rolf Dollevoet, Jilt Sietsma, Jun Wu
Abstract:
The substantial similarity of the fatigue mechanism in a new test rig for rolling contact fatigue (RCF) has been investigated. A new reduced-scale test rig is designed to perform controlled RCF tests on wheel-rail materials. The fatigue mechanism of the rig is evaluated in this study using a combined finite element-fatigue prediction approach. The influences of loading conditions on fatigue crack initiation have been studied. Furthermore, the effects of some artificial defects (squat-shaped) on fatigue lives are examined. To simulate the vehicle-track interaction by means of the test rig, a three-dimensional finite element (FE) model is built up. The nonlinear material behaviour of the rail steel is modelled in the contact interface. The results of the FE simulations are combined with the critical plane concept to determine the material points with the greatest likelihood of fatigue failure. Based on the stress-strain responses, fatigue life analysis is carried out by employing previously postulated criteria for fatigue crack initiation (plastic shakedown and ratchetting). The results are reported for various loading conditions and different defect sizes. Afterward, the cyclic mechanism of the test rig is evaluated from the operational viewpoint. The results of the fatigue life predictions are compared with the expected number of cycles of the test rig given its cyclic nature. Finally, the estimated duration of the experiments until fatigue crack initiation is roughly determined.
Keywords: fatigue, test rig, crack initiation, life, rail, squats
Procedia PDF Downloads 516
11628 2D Simulation of Flare Steel Tubes
Authors: B. Daheche, M. T. Hannachi, H. Djebaili
Abstract:
In this approach, we tried to describe the flare test of tubes welded by high-frequency (HF) induction, and its experimental application. The test is carried out at ENTTPP (National Company of Pipe Mill and Processing of Flat Products). Usually, the final products (tubes) undergo a series of destructive tests in order to assess the efficiency of the welding. This test, performed on sections of pipe with a length defined in the specification, is made under a determined effort (pressure), which depends, among other parameters, on mechanical properties (fracture resistance) and geometry (tube thickness, outside diameter); the variation of this effort is carefully researched and recorded.
Keywords: flare, destructive testing, pressure, drafts tube, tube finished
Procedia PDF Downloads 319
11627 Study of the Protective Effects of Summer Savory against Multiple Organ Damage Induced by Lead Acetate in Rats
Authors: Bassant M. M. Ibrahim, Doha H. Abou Baker, Ahmed Abd Elghafour
Abstract:
Excessive exposure to heavy metals contributes to the occurrence of deleterious health problems that affect vital organs like the brain, liver, kidneys, and heart. The use of natural products that have antioxidant capabilities may contribute to the protection of these organs. In the present study, the essential oil of summer savory (Satureja hortensis) was used to evaluate its protective effects against lead acetate-induced damage to rats' vital organs, owing to its high content of carvacrol, y-terpinene, and p-cymene. Forty female Wistar albino rats were divided into five equal groups: the 1st served as the normal group; the 2nd served as the positive control group and was given lead acetate (60 mg/kg) intraperitoneally (IP); and the third to fifth groups were treated with calcium disodium EDTA as a chelating agent and summer savory essential oil in doses of 50 and 100 mg/kg, respectively. All treatments were given IP concomitantly with lead acetate for ten successive days. At the end of the experiment, electrocardiography (ECG), an open field test for the evaluation of psychological state, and a rotarod test for the evaluation of locomotor coordination were performed, together with assays of anti-inflammatory and oxidative stress biomarkers in serum and histopathology of the vital organs. The investigations in this study show that the protective effect of the high dose of summer savory essential oil is greater than that of the low dose, and that the essential oil of summer savory is a promising agent that can contribute to the protection of vital organs against the hazardous damaging effects of lead acetate.
Keywords: brain, heart, kidneys, lead acetate, liver, protective, summer savory
Procedia PDF Downloads 125
11626 A Comprehensive Method of Fault Detection and Isolation Based on Testability Modeling Data
Authors: Junyou Shi, Weiwei Cui
Abstract:
Testability modeling is a commonly used method in the testability design and analysis of a system. A dependency matrix is obtained from testability modeling, giving a quantitative evaluation of fault detection and isolation. Based on the dependency matrix, we can obtain the diagnosis tree, which provides the procedures for fault detection and isolation. But the dependency matrix usually includes both built-in test (BIT) and manual tests. BIT runs tests automatically and is not limited by the procedures, so the method above cannot give the most efficient diagnosis or exploit the advantages of BIT. A comprehensive method of fault detection and isolation is therefore proposed. This method combines the advantages of BIT and manual tests by splitting the matrix. The result of the case study shows that the method is effective.
Keywords: fault detection, fault isolation, testability modeling, BIT
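The quantitative evaluation drawn from a dependency matrix can be sketched directly: row i is the test signature of fault i, the fault detection rate is the fraction of faults covered by at least one test, and two faults are isolable when their signatures differ. A minimal example with a hypothetical 0/1 D-matrix:

```python
def detection_rate(d_matrix):
    """Fraction of faults detected by at least one test.
    d_matrix[i][j] == 1 means test j detects fault i."""
    detected = sum(1 for row in d_matrix if any(row))
    return detected / len(d_matrix)

def isolation_rate(d_matrix):
    """Fraction of faults with a unique test signature
    (distinguishable from every other fault)."""
    sigs = [tuple(row) for row in d_matrix]
    unique = sum(1 for s in sigs if sigs.count(s) == 1)
    return unique / len(sigs)

# Rows: faults F1..F4; columns: tests T1..T3
D = [
    [1, 0, 0],  # F1: detected, unique signature
    [1, 1, 0],  # F2: detected, unique signature
    [0, 1, 1],  # F3 and F4 share a signature:
    [0, 1, 1],  #   detected but not isolable from each other
]
print(detection_rate(D))  # all faults covered
print(isolation_rate(D))  # only F1 and F2 are uniquely isolable
```

Splitting the matrix into BIT columns and manual-test columns, as the proposed method does, amounts to evaluating these rates separately for the automatic and procedural parts of the test set.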
Procedia PDF Downloads 336
11625 Identifying Applicant Potential Through Admissions Testing
Authors: Belinda Brunner
Abstract:
Objectives: Communicate common test constructs of well-known higher education admissions tests. Discuss influences on admissions test construct definition and design, and discuss research on factors influencing success in academic study. Discuss how admissions tests can be used to identify relevant talent. Examine how admissions tests can be used to facilitate educational mobility and inform selection decisions when the prerequisite curricula are not standardized. Observations: Generally speaking, the constructs of admissions tests can be placed along a continuum from curriculum-related knowledge to more general reasoning abilities. For example, subject-specific achievement tests are closely aligned to a prescribed curriculum, while reasoning tests are typically not associated with a specific curriculum. This session will draw on the test constructs of well-known international higher education admissions tests, such as the UK Clinical Aptitude Test (UKCAT), which is used for medicine and dentistry admissions. Conclusions: The purpose of academic admissions testing is to identify potential students with the prerequisite skill set needed to succeed in the academic environment, but how can the test construct help achieve this goal? Determination of the appropriate construct for tests used in admissions selection decisions should be influenced by a number of factors, including the preceding academic curricula, other criteria influencing the admissions decision, and the principal purpose of testing. Attendees of this session will learn the types of aptitudes and knowledge assessed by higher education admissions tests and will have the opportunity to gain insight into how careful and deliberate consideration of the desired test constructs can aid in identifying potential students with the greatest likelihood of success in medical school.
Keywords: admissions, measuring success, selection, identify skills
Procedia PDF Downloads 489
11624 Effect of Welding Parameters on Mechanical and Microstructural Properties of Aluminum Alloys Produced by Friction Stir Welding
Authors: Khalil Aghapouramin
Abstract:
The aim of the present work is to investigate the mechanical and microstructural properties of dissimilar and similar aluminum alloy joints welded by friction stir welding (FSW). The specimens were investigated under different welding speeds and rotary speeds. The mechanical properties of the joints were evaluated through tensile tests, fatigue tests, and microhardness (HV) measurements at room temperature. The fatigue tests were performed on an electromechanical testing machine under constant-amplitude load control with sine-wave loading; the ratio of minimum to maximum stress ranged between 0.1 and 0.3 in this research. Based on the welding parameters, the microstructural properties were characterized by optical observation and scanning electron microscopy of weld cross-sections; in addition, SEM observations were made of the fracture surfaces.
Keywords: friction stir welding, fatigue and tensile test, Al alloys, microstructural behavior
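A constant-amplitude fatigue cycle of the kind used here is fully described by the stress ratio R = sigma_min / sigma_max. A small sketch computing the derived cycle quantities for the R range quoted above (the peak stress value is hypothetical):

```python
def cycle_stresses(sigma_max, r):
    """For a sine-wave fatigue cycle with stress ratio
    R = sigma_min / sigma_max, return (sigma_min, amplitude, mean)."""
    sigma_min = r * sigma_max
    amplitude = (sigma_max - sigma_min) / 2.0
    mean = (sigma_max + sigma_min) / 2.0
    return sigma_min, amplitude, mean

# Hypothetical peak stress of 200 MPa at R = 0.1
print(cycle_stresses(200.0, 0.1))
```

Holding R fixed while varying the peak stress is the standard way an S-N curve is built up for each welding-parameter combination.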
Procedia PDF Downloads 343
11623 An Approach for Estimating Open Education Resources Textbook Savings: A Case Study
Authors: Anna Ching-Yu Wong
Abstract:
Introduction: Textbooks account for a sizable portion of the overall cost of higher education. There is broad consensus that open educational resources (OER) reduce textbook costs and give students a way to receive high-quality learning materials at little or no cost to them. However, there is less agreement over exactly how much is saved. This study presents an approach for calculating OER savings, using SUNY Canton non-OER courses (N=233) to estimate the potential textbook savings for one semester, Fall 2022. The purpose of collecting these data is to understand how much is potentially saved by using OER materials and to provide a record for further future studies. Literature Review: In past years, researchers identified how the rising cost of textbooks disproportionately harms students in higher education institutions and estimated the average cost of a textbook. For example, Nyamweya (2018) found that on average students save $116.94 per course when OER is adopted in place of traditional commercial textbooks, using a simple formula. Student PIRGs (2015) used reports of per-course savings when transforming a course from a commercial textbook to OER to reach an estimate of $100 average cost savings per course. Allen and Wiley (2016) presented multiple cost-savings studies at the 2016 Open Education Conference and concluded that $100 was a reasonable per-course savings estimate. Ruth (2018) calculated the average cost of a textbook at $79.37 per course. Hilton et al. (2014) conducted a study with seven community colleges across the nation and found the average textbook cost to be $90.61. There is less agreement over exactly how much would be saved by adopting an OER course. This study used SUNY Canton as a case study to create an approach for estimating OER savings. Methodology: Step one: Identify non-OER courses from the UcanWeb class schedule. Step two: View the textbook lists for the classes (campus bookstore prices). Step three: Calculate the average textbook price by averaging the new-book and used-book prices. Step four: Multiply the average textbook price by the number of students in the course. Findings: The result of this calculation was straightforward. The average price of a traditional textbook is $132.45, and students potentially saved $1,091,879.94. Conclusion: (1) The result confirms what we have known: adopting OER in place of traditional textbooks and materials achieves significant savings for students, as well as for the parents and taxpayers who support them through grants and loans. (2) The average textbook savings from adopting an OER course varies depending on the size of the college and the number of enrolled students.
Keywords: textbook savings, open textbooks, textbook costs assessment, open access
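The four methodology steps above reduce to a short computation once the course list is assembled. A minimal sketch with hypothetical course data (prices and enrollments are illustrative, not SUNY Canton's):

```python
def course_savings(new_price, used_price, enrollment):
    """Steps three and four: average the new and used prices,
    then multiply by the course enrollment."""
    avg_price = (new_price + used_price) / 2.0
    return avg_price * enrollment

# Hypothetical non-OER courses: (new $, used $, students enrolled)
courses = [
    (150.00, 110.00, 30),
    (200.00, 140.00, 25),
    (95.00, 65.00, 40),
]
total = sum(course_savings(n, u, e) for n, u, e in courses)
print(total)  # total potential savings across the three courses
```

Summing this per-course figure over all 233 non-OER courses is what yields the semester-wide estimate reported in the findings.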
Procedia PDF Downloads 75
11622 An Approach for Coagulant Dosage Optimization Using Soft Jar Test: A Case Study of Bangkhen Water Treatment Plant
Authors: Ninlawat Phuangchoke, Waraporn Viyanon, Setta Sasananan
Abstract:
The most important process in the water treatment plant is coagulation using alum and poly-aluminum chloride (PACL), and the cost of usage per day is hundreds of thousands of baht. Therefore, determining the dosages of alum and PACL is the most important factor to be prescribed, so that water production is economical and valuable. This research applies an artificial neural network (ANN), using the Levenberg-Marquardt algorithm, to create a mathematical model (Soft Jar Test) for predicting the chemical doses used for coagulation, namely alum and PACL. The input data consist of turbidity, pH, alkalinity, conductivity, and oxygen consumption (OC) of the Bangkhen water treatment plant (BKWTP) of the Metropolitan Waterworks Authority. The data were collected from 1 January 2019 to 31 December 2019, covering the changing seasons of Thailand. The input data of the ANN are divided into three groups: a training set, a test set, and a validation set. The best model performance achieved a coefficient of determination and mean absolute error of 0.73 and 3.18 for alum, and 0.59 and 3.21 for PACL, respectively.
Keywords: soft jar test, jar test, water treatment plant process, artificial neural network
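The two performance metrics reported above can be computed directly from predicted and observed doses on the held-out sets. A minimal sketch with hypothetical dose values (not the plant's data):

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical alum doses (mg/L): observed vs. ANN-predicted
obs = [20.0, 25.0, 30.0, 35.0, 40.0]
pred = [22.0, 24.0, 29.0, 36.0, 39.0]
print(r_squared(obs, pred), mae(obs, pred))
```

Evaluating both metrics on the test and validation sets, rather than the training set, is what guards against the ANN simply memorizing the 2019 operating conditions.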
Procedia PDF Downloads 167
11621 Ontological Modeling Approach for Statistical Databases Publication in Linked Open Data
Authors: Bourama Mane, Ibrahima Fall, Mamadou Samba Camara, Alassane Bah
Abstract:
At the level of national statistical institutes, there is a large volume of data, generally in a format that conditions the method of publication of the information it contains. Each household or business data collection project includes a dissemination platform for its implementation. The dissemination methods previously used do not promote rapid access to information and, especially, do not offer the option of linking data for in-depth processing. In this paper, we present an approach to modeling these data in order to publish them in a format intended for the Semantic Web. Our objective is to publish all these data on a single platform and offer the option to link them with other external data sources. An application of the approach will be made to data from major national surveys, such as those on employment, poverty, and child labor, and the general census of the population of Senegal.
Keywords: Semantic Web, linked open data, database, statistic
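Publishing a statistical record as linked open data amounts to serializing each observation as RDF triples. A minimal sketch emitting Turtle syntax with plain strings (the URIs and predicates below are purely illustrative, not the paper's ontology):

```python
def to_turtle(subject_uri, properties):
    """Serialize one statistical observation as Turtle triples.
    properties: dict mapping predicate URI to a literal value."""
    lines = []
    for pred, value in properties.items():
        lines.append(f'<{subject_uri}> <{pred}> "{value}" .')
    return "\n".join(lines)

# Hypothetical survey observation (URIs are illustrative only)
triples = to_turtle(
    "http://example.org/obs/1",
    {
        "http://example.org/def/region": "Dakar",
        "http://example.org/def/employmentRate": "42.5",
    },
)
print(triples)
```

Once observations share URIs for common concepts (regions, indicators, periods), they can be linked to external sources, which is precisely the in-depth processing the silo-based dissemination platforms prevent.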
Procedia PDF Downloads 176
11620 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test
Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman
Abstract:
At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a Windows interface. The new version of FLIKE incorporates the multiple Grubbs-Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs-Beck test and the multiple Grubbs-Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson Type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs-Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution used with the original Grubbs-Beck test. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L moments) and the LP3 distribution with the multiple Grubbs-Beck test provide quite similar results in most cases; however, a difference of up to 38% has been noted in the flood quantile for an annual exceedance probability (AEP) of 1 in 100 for one catchment. These findings need to be confirmed with a greater number of stations across other Australian states.
Keywords: floods, FLIKE, probability distributions, flood frequency, outlier
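The original Grubbs-Beck screen flags flows below a threshold computed in log space. A minimal sketch, assuming the Bulletin 17B one-sided 10%-significance approximation for the critical value, K_N = -0.9043 + 3.345*sqrt(log10(N)) - 0.4046*log10(N) (the flow series below is hypothetical):

```python
import math

def grubbs_beck_low_outliers(flows):
    """Flag potentially influential low flows (original test).

    Threshold: exp(mean(ln Q) - K_N * std(ln Q)), with K_N taken
    from the Bulletin 17B approximation (10% level, one-sided).
    """
    n = len(flows)
    logs = [math.log(q) for q in flows]
    mean = sum(logs) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in logs) / (n - 1))
    log_n = math.log10(n)
    k_n = -0.9043 + 3.345 * math.sqrt(log_n) - 0.4046 * log_n
    threshold = math.exp(mean - k_n * std)
    return [q for q in flows if q < threshold], threshold

# Hypothetical annual maximum series with one suspiciously low value
flows = [120.0, 95.0, 150.0, 110.0, 130.0, 2.0, 105.0, 140.0,
         125.0, 115.0, 135.0, 100.0]
low, thr = grubbs_beck_low_outliers(flows)
print(low)  # the anomalous low flow is flagged
```

The multiple Grubbs-Beck variant generalizes this by testing successively larger candidate sets of low flows rather than only the single smallest value, which is why it can flag several influential low flows at once.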
Procedia PDF Downloads 452
11619 Income-Consumption Relationships in Pakistan (1980-2011): A Cointegration Approach
Authors: Himayatullah Khan, Alena Fedorova
Abstract:
The present paper analyses the income-consumption relationship in Pakistan using annual time series data from 1980-81 to 2010-11. The paper uses the Augmented Dickey-Fuller test to check for unit roots and stationarity in the two time series. The paper finds that the two series are nonstationary but stationary at their first differences. The Augmented Engle-Granger test and the Cointegrating Regression Durbin-Watson test imply that the consumption and income series are cointegrated and that the long-run marginal propensity to consume is 0.88, as given by the estimated (static) equilibrium relation. The paper also uses an error correction mechanism (ECM) to model the dynamic relationship; the purpose of the ECM is to indicate the speed of adjustment from short-run disequilibrium to the long-run equilibrium state. The results show that the short-run MPC is 0.93 and is highly significant. The coefficient of the Engle-Granger residuals is negative but insignificant; statistically, the equilibrium error term is zero, which suggests that consumption adjusts to changes in GDP within the same period. Short-run changes in GDP have a positive impact on short-run changes in consumption, so we may interpret 0.93 as the short-run MPC. The pair-wise Granger causality test shows that GDP and consumption Granger-cause each other.
Keywords: cointegrating regression, Augmented Dickey-Fuller test, Augmented Engle-Granger test, Granger causality, error correction mechanism
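The Engle-Granger two-step procedure behind these estimates can be sketched: first regress consumption on income to obtain the long-run relation and its residuals, then regress differenced consumption on differenced income (a full ECM would also include the lagged residual as the error-correction term). A minimal sketch with simulated data in which the true MPC is 0.9:

```python
def ols(x, y):
    """Simple OLS slope and intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Simulated income and consumption with a long-run MPC of 0.9
# (a deterministic wobble keeps the differenced series non-constant)
income = [100 + 5 * t + (7 * t % 13) for t in range(30)]
cons = [10 + 0.9 * y for y in income]

# Step 1: static long-run regression C = a*Y + b
mpc_lr, intercept = ols(income, cons)
resid = [c - (mpc_lr * y + intercept) for c, y in zip(cons, income)]

# Step 2: regression on first differences (short-run MPC); an ECM
# would add the lagged residual resid[t-1] as an extra regressor
d_cons = [cons[t] - cons[t - 1] for t in range(1, len(cons))]
d_inc = [income[t] - income[t - 1] for t in range(1, len(income))]
mpc_sr, _ = ols(d_inc, d_cons)
print(round(mpc_lr, 2), round(mpc_sr, 2))
```

With noiseless simulated data both steps recover the same MPC; in the paper's real data the static and differenced regressions diverge (0.88 versus 0.93), which is exactly the long-run versus short-run distinction the ECM is meant to capture.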
Procedia PDF Downloads 416
11618 Valorisation of Polyethylene and Plastic Bottle Wastes as Pavement Blocks
Authors: Babagana Mohammed, Fidelis Patrick Afangide
Abstract:
This research investigated the possibility of using waste low-density polyethylene and waste plastic bottles for the production of interlocking pavement blocks. In many parts of the world, interlocking pavement blocks are widely used as a modern solution for outdoor flooring applications, and the blocks come in different shapes, sizes, and colours suiting the imagination of landscape architects. The interlocking blocks were produced using a suitable conventional mould measuring 220 x 135 x 50 mm. The material constituents of the produced blocks were waste low-density polyethylene and waste plastic bottles mixed in varying respective weight proportions of 100%+0%, 75%+25%, 50%+50%, and 25%+75%. The blocks were then tested for unconfined compressive strength and water absorption. The test results compared well with those of conventional concrete interlocking blocks, and the research demonstrates the possibility of value recovery from waste streams that are currently dumped in open spaces, affecting the environment.
Keywords: pavement blocks, polyethylene, plastic bottle, wastes, valorization
Procedia PDF Downloads 405
11617 A Multi-Layer Based Architecture for the Development of an Open Source CAD/CAM Integration Virtual Platform
Authors: Alvaro Aguinaga, Carlos Avila, Edgar Cando
Abstract:
This article proposes an n-layer architecture, with a web client as a front-end, for the development of a virtual platform for process simulation on CNC machines. This open-source platform includes a CAD-CAM interface for drawing primitives, which is then used to produce a CNC program that drives a touch-screen virtual simulator. The objectives of this project are twofold. The first is educational: fostering new alternatives for the CAD-CAM/CNC learning process in undergraduate and graduate schools and in technical and technological institutes, emphasizing the development of critical skills, discussion, and collaborative work. The second combines a research and technological component that will take the state of the art in CAD-CAM integration to a new level through the development of optimal algorithms and virtual platforms available on-line, paving the way for the long-term goals of this project: a visible and active graduate school in Ecuador and a worldwide open-innovation community in the area of CAD-CAM integration and operation of CNC machinery. The virtual platform developed as part of this study: (1) improves the training process of students, (2) creates a multidisciplinary team and a collaborative work space that will push the new generation of students to face future technological challenges, (3) implements industry standards for CAD/CAM, and (4) provides a platform for the development of industrial applications. A prototype of this system was developed and implemented in a network of universities and technological institutes in Ecuador.
Keywords: CAD-CAM integration, virtual platforms, CNC machines, multi-layer based architecture
Procedia PDF Downloads 429
11616 The Effectiveness of a Training Program Using Neuro-Linguistic Programming (NLP) to Reduce Test Anxiety through the Use of Biological Feedback
Authors: Mohammed Fakehy, Mohammed Haggag
Abstract:
Test anxiety is considered one of the most important and most complex psychological problems faced by students of King Saud University, as university students need reassurance and psychological comfort to relieve the pain and difficulties of study. Recently, programs and sciences have emerged that help people change, including Neuro-Linguistic Programming (NLP). This science does not stop at advice to make an effort or to keep working; it provides the keys by which one's internal environment can be controlled, so that human potential is drawn out in pursuit of success, happiness, and excellence. Through their work as members of the teaching staff at King Saud University and as specialists in the field of psychology, the researchers noticed that some students suffer from test anxiety. In an attempt to mitigate this anxiety as much as possible, students will receive a training program in Neuro-Linguistic Programming. The main question of this study is: what is the effectiveness of a training program using NLP to reduce test anxiety, measured by means of biological feedback? The results of this study might therefore serve as a good indication of the usefulness of NLP programs and influence future research on the effect of NLP on test anxiety.
Keywords: neuro-linguistic programming, test anxiety, biological feedback, King Saud
Procedia PDF Downloads 528
11615 Households’ Willingness to Pay for Watershed Management Practices in Lake Hawassa Watershed, Southern Ethiopia
Authors: Mulugeta Fola, Mengistu Ketema, Kumilachew Alamerie
Abstract:
Watersheds provide vast economic benefits within and beyond the management area of interest, but most watersheds in Ethiopia increasingly face threats of degradation from both natural and man-made causes. To reverse these problems, community participation in sustainable management programs is among the necessary measures. Hence, this study assessed households' willingness to pay for watershed management practices through a contingent valuation approach. A double-bounded dichotomous choice format with an open-ended follow-up was used to elicit willingness to pay. Based on data collected from 275 randomly selected households, descriptive statistics indicated that most households (79.64%) were willing to pay for watershed management practices. A bivariate probit model was employed to identify the determinants of households' willingness to pay and to estimate mean willingness to pay. Its results show that age, gender, income, livestock holding, perception of watershed degradation, social position, and the offered bids were important variables affecting willingness to pay for watershed management practices. The study also revealed that the mean willingness to pay was 58.41 Birr and 47.27 Birr per year from the double-bounded and open-ended formats, respectively, and that the corresponding aggregate welfare gains were 931,581.09 Birr and 753,909.23 Birr per year. Therefore, policymakers should have households pay for the services of watershed management practices in the study area.
Keywords: bivariate probit model, contingent valuation, watershed management practices, willingness to pay
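The double-bounded elicitation logic lends itself to a compact maximum-likelihood sketch. The simulation below is hypothetical: the bid design, sample size, and normal WTP distribution are assumptions for illustration, and a simple interval MLE stands in for the study's bivariate probit, which is not reproduced here:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 400
mu_true, sd_true = 58.0, 15.0            # hypothetical WTP distribution (Birr/year)
wtp = rng.normal(mu_true, sd_true, n)
b1 = rng.choice([40.0, 60.0, 80.0], n)   # first bid offered to each household
yes1 = wtp > b1
b2 = np.where(yes1, 1.5 * b1, 0.5 * b1)  # raised or lowered follow-up bid
yes2 = wtp > b2

def negloglik(theta):
    """Interval likelihood of the four (yes/no, yes/no) response patterns."""
    mu, sd = theta
    if sd <= 0:
        return np.inf
    hi, lo = 1.5 * b1, 0.5 * b1
    p = np.where(yes1 & yes2, 1 - norm.cdf(hi, mu, sd),
        np.where(yes1 & ~yes2, norm.cdf(hi, mu, sd) - norm.cdf(b1, mu, sd),
        np.where(~yes1 & yes2, norm.cdf(b1, mu, sd) - norm.cdf(lo, mu, sd),
                 norm.cdf(lo, mu, sd))))
    return -np.sum(np.log(np.clip(p, 1e-12, None)))

fit = minimize(negloglik, x0=[50.0, 20.0], method="Nelder-Mead")
mu_hat, sd_hat = fit.x                   # mu_hat estimates mean WTP
```

Under the normal assumption, mu_hat plays the role of the mean willingness to pay that the study reports (58.41 Birr per year from the double-bounded format).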
Procedia PDF Downloads 224
11614 Enhancing Self-Assessment and Management Potentials by Modifying Option Selections on Hartman’s Personality Test
Authors: Daniel L. Clinciu, IkromAbdulaev, Brian D. Oscar
Abstract:
Various personality profile tests are used to identify personality strengths and limits in individuals, helping both individuals and managers optimize work and team effort in organizations. One such test, Hartman's personality profile, emphasizes four driving "core motives" influenced or affected by both strengths and limitations. The driving core motives are classified into four colors: Red, motivated by power; Blue, by discipline and loyalty; White, by peace; and Yellow, by fun. Two shortcomings of Hartman's personality test are noted: 1) only one choice is allowed for every item/situation, and 2) a choice must be selected even if none applies. A test taker may be as much nurturing as opinionated, but since "opinionated" seems less attractive, the individual would likely select nurturing, causing a misidentification of personality strengths and limits. Since few individuals have a "strong" personality, allowing only one choice or requiring unwanted choices makes it difficult to assess their true personality strengths and limits, undermining the potential of the test. We modified Hartman's personality profile to allow test takers either to make multiple choices for any item/situation or to leave it blank when none applies. Sixty-eight participants (38 males and 30 females), 17-49 years old, from countries in Asia, Europe, North America, the CIS, Africa, Latin America, and Oceania were included. Fifty-eight participants (85.3%) reported that the modified test, allowing either multiple or no choices, better identified their personality strengths and limits, while 10 participants (14.7%) said the original one-choice version is sufficient.
The overall results show that our modified test enhanced the identification and balance of personality strengths and limits, aiding test takers, managers, and firms in better understanding personality strengths and limits, which is particularly useful in making task-related, teamwork, and management decisions.
Keywords: organizational behavior, personality tests, personality limitations, personality strengths, task management, team work
Procedia PDF Downloads 363
11613 Nonlinear Aerodynamic Parameter Estimation of a Supersonic Air to Air Missile by Using Artificial Neural Networks
Authors: Tugba Bayoglu
Abstract:
Aerodynamic parameter estimation is crucial in the missile design phase, since an accurate, high-fidelity aerodynamic model is required for designing a high-performance and robust control system, developing high-fidelity flight simulations, and verifying computational and wind tunnel test results. However, the literature contains few missile aerodynamic parameter identification studies, for three main reasons: (1) most air-to-air missiles cannot fly at constant speed, (2) missile flight test numbers and flight durations are much smaller than those of fixed-wing aircraft, and (3) the variation of missile aerodynamic parameters with Mach number is larger than that of fixed-wing aircraft. In addition to these challenges, identifying aerodynamic parameters at high wind angles with classical estimation techniques brings a further difficulty: most estimation techniques require polynomials or splines to model the behavior of the aerodynamics, and for missiles whose aerodynamic parameters vary strongly with the flight variables, the order of the proposed model increases, which brings computational burden and complexity. Therefore, this study aims to solve the nonlinear aerodynamic parameter identification problem for a supersonic air-to-air missile by using artificial neural networks. The proposed method will be tested on simulated data generated with a six-degree-of-freedom missile model involving a nonlinear aerodynamic database; the data will be corrupted by adding noise to the measurement model. The parameters will then be estimated from the flight variables and measurements, and finally the prediction accuracy will be investigated.
Keywords: air-to-air missile, artificial neural networks, open loop simulation, parameter identification
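As a toy illustration of the approach, the sketch below trains a small feed-forward network (pure NumPy) to recover a nonlinear aerodynamic coefficient from noisy samples. The coefficient model, input ranges, and network size are invented stand-ins, not the missile's actual aerodynamic database:

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical "aerodynamic database": noisy normal-force coefficient samples
n = 2000
alpha = rng.uniform(-20.0, 20.0, n)                    # angle of attack (deg)
mach = rng.uniform(1.2, 4.0, n)                        # Mach number
cn = 0.05 * alpha + 0.002 * alpha * np.abs(alpha) / mach
cn += rng.normal(0.0, 0.02, n)                         # measurement noise
X = np.column_stack([alpha / 20.0, mach / 4.0])        # scaled flight variables
y = cn[:, None]

# one-hidden-layer network, full-batch gradient descent on mean squared error
h, lr = 16, 0.05
W1 = rng.normal(0.0, 0.5, (2, h)); b1 = np.zeros(h)
W2 = rng.normal(0.0, 0.5, (h, 1)); b2 = np.zeros(1)
for step in range(8000):
    Z = np.tanh(X @ W1 + b1)                           # hidden activations
    err = Z @ W2 + b2 - y
    loss = float(np.mean(err ** 2))
    if step == 0:
        loss0 = loss                                   # initial error, for reference
    g2 = 2.0 * err / n                                 # dLoss/dPrediction
    g1 = (g2 @ W2.T) * (1.0 - Z ** 2)                  # backprop through tanh
    W2 -= lr * (Z.T @ g2); b2 -= lr * g2.sum(0)
    W1 -= lr * (X.T @ g1); b1 -= lr * g1.sum(0)
```

The appeal over polynomial or spline models, as the abstract argues, is that the network absorbs the nonlinearity without the analyst choosing a model order in advance.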
Procedia PDF Downloads 281
11612 Separation of Rare-Earth Metals from E-Wastes
Authors: Gulsara Akanova, Akmaral Ismailova, Duisek Kamysbayev
Abstract:
The separation of rare earth metals (REM) from neodymium magnets has been widely studied in recent years. The waste from a computer hard disk contains 25.41% neodymium, 64.09% iron, and less than 1% boron. For separation of the rare earth metals, the magnet was dissolved in nitric acid in both open and closed systems. In the closed system, the magnet was dissolved in a microwave sample preparation system at different temperatures and pressures, and the dissolution process lasted 1 hour. In the open system, acid dissolution of the magnet was conducted at room temperature and lasted 30-40 minutes. To remove the iron in the magnet, oxalic acid was used, and the iron was precipitated as oxalates under both conditions. A sorption method is used to separate the rare earth metals (Nd, Pr, and Dy) from the magnet waste.
Keywords: dissolution of the magnet, neodymium magnet, rare earth metals, separation, sorption
Procedia PDF Downloads 209
11611 Laparoscopic Proximal Gastrectomy in Gastroesophageal Junction Tumours
Authors: Ihab Saad Ahmed
Abstract:
Background: For Siewert type I and II gastroesophageal junction (GEJ) tumors, laparoscopic proximal gastrectomy (LPG) can be performed. It is associated with several perioperative benefits compared with open proximal gastrectomy and has become an increasingly popular approach for selected tumors. Methods: We describe our technique for LPG, including the preoperative work-up, illustrated images of the main steps of the surgery, and our postoperative course. Results: Thirteen patients (nine male, four female) with type I or II GEJ adenocarcinoma underwent laparoscopic radical proximal gastrectomy with D2 lymphadenectomy. All of our patients received neoadjuvant chemotherapy. Eleven patients had an intrathoracic anastomosis through a mini-thoracotomy (two hand-sewn end-to-end anastomoses and nine end-to-side anastomoses using a circular stapler); two of the patients with intrathoracic anastomoses had a flap-and-wrap technique; and two patients had thoracoscopic esophageal and mediastinal lymph node dissection with a cervical anastomosis. The mean blood loss was 80 ml, and no cases were converted to open surgery. The mean operative time was 250 minutes, and the average number of lymph nodes retrieved was 19-25. No severe complications such as leakage, stenosis, pancreatic fistula, or intra-abdominal abscess were reported. Only one patient presented with empyema 1.5 months after discharge, which was managed conservatively. Conclusion: For carefully selected patients, LPG for GEJ tumors of types I and II is a safe and reasonable alternative to the open technique, associated with similar oncologic outcomes and low morbidity. It showed less blood loss and fewer respiratory infections, with similar 1- and 3-year survival rates.
Keywords: LPG (laparoscopic proximal gastrectomy), GEJ (gastroesophageal junction tumor), D2 lymphadenectomy, neoadjuvant chemotherapy
Procedia PDF Downloads 125
11610 Subclasses of Bi-Univalent Functions Associated with Hohlov Operator
Authors: Rashidah Omar, Suzeini Abdul Halim, Aini Janteng
Abstract:
The coefficient estimate problem for the Taylor-Maclaurin series is still open, especially for functions in subclasses of bi-univalent functions. A function f ∈ A is said to be bi-univalent in the open unit disk D if both f and f⁻¹ are univalent in D. The symbol A denotes the class of all analytic functions f in D normalized by the conditions f(0) = f'(0) - 1 = 0; the class of bi-univalent functions is denoted by Σ. The subordination concept is used in determining the second and third Taylor-Maclaurin coefficients: upper bounds for the second and third coefficients are estimated for functions in subclasses of bi-univalent functions subordinated to a function φ. An analytic function f is subordinate to an analytic function g if there is an analytic function w defined on D with w(0) = 0 and |w(z)| < 1 satisfying f(z) = g(w(z)). In this paper, two subclasses of bi-univalent functions associated with the Hohlov operator are introduced. The bounds for the second and third coefficients of functions in these subclasses are determined using subordination. The findings generalize previous related works of several earlier authors.
Keywords: analytic functions, bi-univalent functions, Hohlov operator, subordination
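In the abstract's notation, the objects involved can be written out compactly; this is just the standard normalization and subordination setup restated, not a result of the paper:

```latex
% Normalized Taylor–Maclaurin expansion of f in the class A:
f(z) = z + \sum_{n=2}^{\infty} a_n z^{n}, \qquad f(0) = f'(0) - 1 = 0, \quad z \in D.
% f is bi-univalent (f \in \Sigma) when both f and f^{-1} are univalent in D.
% Subordination to \varphi via a Schwarz-type function w:
f \prec \varphi \iff f(z) = \varphi\bigl(w(z)\bigr), \qquad w(0) = 0, \; |w(z)| < 1.
% The coefficient problem: bound |a_2| and |a_3| over the given subclass of \Sigma.
```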
Procedia PDF Downloads 294
11609 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering
Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi
Abstract:
In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA (cfDNA). Accordingly, identifying the tissue of origin of these DNA fragments from plasma can enable more accurate and faster disease diagnosis and more precise treatment protocols. Open chromatin regions (OCRs) are important epigenetic features of DNA that reflect the cell type of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. Several studies in cancer liquid biopsy integrate distinct genomic and epigenomic features for early cancer detection together with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, leading to substantial cost and time. To overcome these limitations, predicting OCRs directly from whole genome sequencing (WGS) data is of particular importance. In this regard, we propose a computational approach to predict open chromatin regions, an important epigenetic feature, from cell-free DNA whole genome sequencing data. To this end, local sequencing depth is fed to the proposed algorithm, which predicts the most probable open chromatin regions from the whole genome sequencing data. Our method combines a signal processing approach with sequencing depth data and comprises count normalization, Discrete Fourier Transform conversion, graph construction, graph-cut optimization by linear programming, and clustering.
To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions from human blood samples in the ATAC-DB database. The overlap between predicted open chromatin regions and the experimentally validated ATAC-seq regions in ATAC-DB is greater than 67%, which indicates meaningful prediction. As expected, OCRs are mostly located at the transcription start sites (TSS) of genes; accordingly, we compared the concordance between the predicted OCRs and the human gene TSS regions obtained from refTSS, finding agreement of about 52.04% with all genes and about 78% with housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to several confounding factors, such as technical and biological variation. Although this approach is in its infancy, there has already been an attempt to apply it, leading to a tool named OCRDetector, which has some restrictions, such as the need for high-depth cfDNA WGS data, prior information about the OCR distribution, and reliance on multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised manner, which results in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used along with other tools to investigate the genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid biopsy-related analysis.
Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering
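A heavily simplified, hypothetical version of the depth-signal idea can be sketched as follows: plant "open" regions in a synthetic coverage track, low-pass it with the DFT, and split positions into OCR+/OCR- by two-means clustering. All the numbers are invented, and the real method replaces the last step with graph construction and LP-based graph-cut optimization:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1024
depth = rng.poisson(5, n).astype(float)          # background cfDNA coverage
true_ocr = np.zeros(n, dtype=bool)
for start in (100, 400, 800):                    # planted "open chromatin" regions
    true_ocr[start:start + 60] = True
depth[true_ocr] += rng.poisson(15, true_ocr.sum())

# count normalization and DFT low-pass filtering of the depth signal
sig = (depth - depth.mean()) / depth.std()
spec = np.fft.rfft(sig)
spec[40:] = 0                                    # keep only low-frequency structure
smooth = np.fft.irfft(spec, n)

# two-means clustering of the smoothed signal into OCR+ / OCR- positions
c = np.array([smooth.min(), smooth.max()])       # initial cluster centers
for _ in range(20):
    lab = np.abs(smooth[:, None] - c).argmin(axis=1)
    c = np.array([smooth[lab == k].mean() for k in (0, 1)])
lab = np.abs(smooth[:, None] - c).argmin(axis=1)
ocr_pred = lab == c.argmax()                     # high-coverage cluster = OCR+
```

The overlap of `ocr_pred` with the planted regions plays the same role as the >67% overlap the study reports against ATAC-DB, only on toy data.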
Procedia PDF Downloads 152
11608 Efficient Pre-Processing of Single-Cell Assay for Transposase Accessible Chromatin with High-Throughput Sequencing Data
Authors: Fan Gao, Lior Pachter
Abstract:
The primary tool currently used to pre-process 10X Chromium single-cell ATAC-seq data is Cell Ranger, which can take very long to run on standard datasets. To facilitate rapid pre-processing that enables reproducible workflows, we present a suite of tools called scATAK for pre-processing single-cell ATAC-seq data that is 15 to 18 times faster than Cell Ranger on mouse and human samples. Our tool can also calculate chromatin interaction potential matrices and generate open chromatin signal and interaction traces for cell groups. We use the scATAK tool to explore the chromatin regulatory landscape of a healthy adult human brain, unveiling cell-type-specific features, and show that it provides a convenient and computationally efficient approach for pre-processing single-cell ATAC-seq data.
Keywords: single-cell, ATAC-seq, bioinformatics, open chromatin landscape, chromatin interactome
Procedia PDF Downloads 156