Search results for: open source data
29066 Computational and Experimental Determination of Acoustic Impedance of Internal Combustion Engine Exhaust
Authors: A. O. Glazkov, A. S. Krylova, G. G. Nadareishvili, A. S. Terenchenko, S. I. Yudin
Abstract:
The topic of the presented materials concerns the design of the exhaust system for a certain internal combustion engine. The exhaust system can be divided into two parts. The first is the engine exhaust manifold, turbocharger, and catalytic converters, which together are called the "hot part." The second is the gas exhaust system, which contains elements intended exclusively for reducing exhaust noise (mufflers, resonators) and is conventionally designated the "cold part." Designing the exhaust system from the point of view of acoustics, that is, reducing the exhaust noise to a predetermined level, consists of working on the second part. Modern computer technology and software make it possible to design the "cold part" with high accuracy in a given frequency range, but only on condition that the input parameters are specified accurately, namely the amplitude spectrum of the input noise and the acoustic impedance of the noise source in the form of the engine with its "hot part." Obtaining these data is a difficult problem: high temperatures, high exhaust gas velocities (turbulent flows), and high sound pressure levels (non-linear regime) prevent purely calculated results from being applied with sufficient accuracy. The aim of this work is to obtain the most reliable acoustic output parameters of an engine with its "hot part" on the basis of a complex of computational and experimental studies. The presented methodology includes several parts. The first part is a finite element simulation of the "cold part" of the exhaust system (taking into account the acoustic impedance of radiation of the outlet pipe into open space), with the result in the form of the input impedance of the "cold part." The second part is a finite element simulation of the "hot part" of the exhaust system (taking into account the acoustic characteristics of the catalytic units and the geometry of the turbocharger), with the result in the form of the input impedance of the "hot part." The third part of the technique consists of mathematical processing of the results according to the proposed formula for the convergence of the mathematical series formed by summation of the multiple reflections of the acoustic signal between the "cold part" and the "hot part." This is followed by a set of tests on an engine stand with two high-temperature pressure sensors measuring pulsations in the nozzle between the "hot part" and the "cold part" of the exhaust system, and subsequent processing of the test results according to a well-known technique in order to separate the "incident" and "reflected" waves. The final stage consists of mathematical processing of all calculated and experimental data to obtain a result in the form of the amplitude spectrum of the engine noise and its acoustic impedance.
Keywords: acoustic impedance, engine exhaust system, FEM model, test stand
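The "summation of multiple reflections" step can be illustrated with a short numerical sketch (not the study's actual formula): repeated reflections between two terminations with reflection coefficients R_hot and R_cold form a geometric series that converges to 1/(1 - R_hot*R_cold) whenever |R_hot*R_cold| < 1. The coefficient values below are assumed for illustration only.

```python
import numpy as np

# Assumed complex reflection coefficients of the two terminations at one frequency;
# |R_hot * R_cold| < 1 guarantees convergence of the multiple-reflection series.
R_hot = 0.6 * np.exp(1j * np.pi / 5)     # engine / "hot part" side
R_cold = 0.7 * np.exp(-1j * np.pi / 3)   # muffler / "cold part" side

# Partial sums of the series 1 + (Rh*Rc) + (Rh*Rc)^2 + ...
terms = (R_hot * R_cold) ** np.arange(50)
partial_sums = np.cumsum(terms)

closed_form = 1.0 / (1.0 - R_hot * R_cold)

print("sum after 50 reflections:", partial_sums[-1])
print("closed-form limit       :", closed_form)
print("difference              :", abs(partial_sums[-1] - closed_form))
```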
Procedia PDF Downloads 59
29065 Artificial Intelligence Approach to Water Treatment Processes: Case Study of Daspoort Treatment Plant, South Africa
Authors: Olumuyiwa Ojo, Masengo Ilunga
Abstract:
By mimicking the human brain, the artificial neural network (ANN) has broken the bounds of conventional programming, which is essentially a garbage-in, garbage-out function. Its ability to adopt, adapt, adjust, evaluate, learn and recognize the relationships, behaviour and patterns in the data sets administered to it is modelled on the human reasoning and learning mechanism. The study therefore aimed at modelling the wastewater treatment process in order to accurately diagnose water control problems for effective treatment. A staged ANN model development and evaluation methodology was employed: a source data analysis stage, consisting of a statistical analysis of the data used in modelling; a model development stage, in which candidate ANN architectures were developed; and an evaluation stage, in which the model was assessed against a historical data set. The model was developed using historical data obtained from the Daspoort Wastewater Treatment Plant, South Africa, and the resulting design dimensions and model for the plant provided good results. The parameters considered were temperature, pH value, colour, turbidity, amount of solids, acidity, total hardness, Ca hardness, Mg hardness, and chloride. This enables the ANN to handle and represent more complex problems than conventional programming is capable of.
Keywords: ANN, artificial neural network, wastewater treatment, model, development
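A minimal sketch of the kind of feed-forward ANN described above is given below, using scikit-learn and synthetic placeholder data; the feature list follows the parameters named in the abstract, while the target variable and all values are assumptions for illustration only.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Placeholder stand-in for historical plant records: temperature, pH, colour,
# turbidity, solids, acidity, total/Ca/Mg hardness, chloride (10 features).
X = rng.normal(size=(500, 10))
y = X @ rng.normal(size=10) + rng.normal(scale=0.1, size=500)  # assumed effluent quality index

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_train)
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(scaler.transform(X_train), y_train)

print("R^2 on held-out data:", model.score(scaler.transform(X_test), y_test))
```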
Procedia PDF Downloads 149
29064 Sociocultural Foundations of Psychological Well-Being among Ethiopian Adults
Authors: Kassahun Tilahun
Abstract:
Most of the studies available on adult psychological well-being have centered on Western countries. However, psychological well-being does not have the same meaning across the world; Euro-American and African conceptions and experiences of psychological well-being differ systematically. As a result, questions such as how people living in developing African countries like Ethiopia report their psychological well-being, and what the context-specific determinants of their psychological well-being would be, need a definitive answer. This study was therefore aimed at developing a new theory that would address these sociocultural issues of psychological well-being. Data were obtained through interviews and an open-ended questionnaire. A total of 438 adults, working in governmental and non-governmental organizations situated in Addis Ababa, participated in the study. An appropriate qualitative method of data analysis, thematic content analysis, was employed; the thematic analysis involved a type of abductive analysis driven both by theoretical interest and by the nature of the data. Reliability and credibility issues were addressed appropriately. The findings identified five major categories of themes viewed as essential in determining the conceptions and experiences of psychological well-being of Ethiopian adults: sociocultural harmony, social cohesion, security, competence and accomplishment, and the self. A detailed discussion of the rationale for including these themes is provided, and appropriate positive psychology interventions are proposed. Researchers are also encouraged to expand this qualitative research and, in turn, develop a suitable instrument tapping the psychological well-being of adults with different sociocultural orientations.
Keywords: sociocultural, psychological well-being, Ethiopia, adults
Procedia PDF Downloads 546
29063 Road Traffic Accidents Analysis in Mexico City through Crowdsourcing Data and Data Mining Techniques
Authors: Gabriela V. Angeles Perez, Jose Castillejos Lopez, Araceli L. Reyes Cabello, Emilio Bravo Grajales, Adriana Perez Espinosa, Jose L. Quiroz Fabian
Abstract:
Road traffic accidents are among the principal causes of traffic congestion, causing human losses, damage to health and the environment, economic losses and material damage. Traditional studies of road traffic accidents in urban zones represent a very high investment of time and money and, additionally, their results are not current. Nowadays, however, in many countries the crowdsourced GPS-based traffic and navigation apps have emerged as an important low-cost source of information for studies of road traffic accidents and of the urban congestion they cause. In this article we identified the zones, roads and specific times in Mexico City (CDMX) in which the largest numbers of road traffic accidents were concentrated during 2016. We built a database compiling information obtained from the social network known as Waze. The methodology employed was Knowledge Discovery in Databases (KDD) for the discovery of patterns in the accident reports, using data mining techniques with the help of Weka. The selected algorithms were Expectation Maximization (EM), to obtain the ideal number of clusters for the data, and k-means as the grouping method. Finally, the results were visualized with the Geographic Information System QGIS.
Keywords: data mining, k-means, road traffic accidents, Waze, Weka
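The clustering workflow described above (EM to choose the number of clusters, k-means for the final grouping) was run in Weka; the sketch below is a rough Python analogue using scikit-learn, with assumed numeric features standing in for the Waze report attributes.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Assumed numeric features per accident report, e.g. longitude, latitude, hour of day
X = rng.normal(size=(2000, 3))

# EM (Gaussian mixture) fitted for several candidate cluster counts;
# the Bayesian Information Criterion selects the "ideal" number of clusters.
bic = {k: GaussianMixture(n_components=k, random_state=1).fit(X).bic(X) for k in range(2, 9)}
best_k = min(bic, key=bic.get)

# k-means then performs the final grouping used to map accident hotspots.
labels = KMeans(n_clusters=best_k, n_init=10, random_state=1).fit_predict(X)
print("chosen k:", best_k, "cluster sizes:", np.bincount(labels))
```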
Procedia PDF Downloads 417
29062 Design and Implementation a Platform for Adaptive Online Learning Based on Fuzzy Logic
Authors: Budoor Al Abid
Abstract:
Educational systems are increasingly provided as open online services, offering guidance and support for individual learners. To adapt such learning systems, a proper evaluation must be made. This paper builds the evaluation model Fuzzy C Means Adaptive System (FCMAS), based on data mining techniques, to assess the difficulty of questions. The following steps are implemented. First, a dataset from an international online learning system called slepemapy.cz is used; the dataset contains over 1,300,000 records with 9 features covering students, questions and answers, together with feedback evaluation. Next, normalization is applied as a preprocessing step. The FCM clustering algorithm is then used to adapt the difficulty of the questions, and the result is data labelled into three clusters according to the highest membership weight (easy, intermediate, difficult); the FCM algorithm assigns a label to every question one by one. A Random Forest (RF) classifier model is then constructed on the clustered dataset, using 70% of the dataset for training and 30% for testing; the model achieves a 99.9% accuracy rate. This approach improves the adaptive e-learning system because it depends on student behaviour and gives more accurate results in the evaluation process than an evaluation system that depends on feedback only.
Keywords: machine learning, adaptive, fuzzy logic, data mining
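A compact sketch of the FCMAS pipeline is shown below: a hand-rolled fuzzy c-means assigns each question to one of three difficulty clusters, and a random forest is then trained on a 70/30 split. The data, feature names and sizes are assumptions for illustration, not the slepemapy.cz records.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means; returns hard labels taken from the membership matrix."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per sample
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / d ** (2.0 / (m - 1.0))
        U /= U.sum(axis=1, keepdims=True)
    return U.argmax(axis=1)

rng = np.random.default_rng(0)
X = rng.random((5000, 4))        # assumed per-question features (e.g. error rate, response time)
y = fuzzy_c_means(X, c=3)        # cluster labels: easy / intermediate / difficult

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.30, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, rf.predict(X_te)))
```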
Procedia PDF Downloads 196
29061 Experimental and Numerical Processes of Open Die Forging of Multimetallic Materials with the Usage of Different Lubricants
Authors: Isik Cetintav, Cenk Misirli, Yilmaz Can, Damla Gunel
Abstract:
This work investigates the experimental and numerical analysis of open die forging of multimetallic materials. Multimetallic material production has recently become an interesting research field. The mechanical properties of the materials used to form the multimetallic materials and the mechanical properties of the multimetallic materials produced will be compared, and the material flow obtained with different lubricants will be examined. Furthermore, the mechanical properties of multimetallic materials produced using different materials will be examined with different lubricants, and the advantages and disadvantages of the different lubricants will be assessed for the bi-metallic material to be produced. Cylindrical specimens consisting of two different materials were used in the experiments. The specimens were prepared as an aluminum sleeve with a copper core and upset at different reductions. This metal combination presents a material model whose constituents have different chemical compositions. ABAQUS software was used for the simulations. Simulation and experimental results have shown reasonable agreement.
Keywords: multimetallic, forging, experimental, numerical
Procedia PDF Downloads 278
29060 Open Fields' Dosimetric Verification for a Commercially-Used 3D Treatment Planning System
Authors: Nashaat A. Deiab, Aida Radwan, Mohamed Elnagdy, Mohamed S. Yahiya, Rasha Moustafa
Abstract:
This study evaluates and investigates the dosimetric performance of our institution's 3D treatment planning system (TPS), Elekta PrecisePLAN, for open 6 MV fields, including square, rectangular, variable-SSD, centrally blocked, missing-tissue, square MLC and MLC-shaped fields, guided by the recommended QA tests prescribed in AAPM TG-53, the NCS Report 15 test packages, IAEA TRS-430 and ESTRO Booklet No. 7. The study was performed on an Elekta Precise linear accelerator designed for a clinical range of 4, 6 and 15 MV photon beams, with asymmetric jaws and a fully integrated multileaf collimator that enables high conformance to the target with sharp field edges. Seven different tests were applied to a solid water-equivalent phantom together with a 2D array dose detection system, and the doses calculated by the PrecisePLAN 3D treatment planning system were compared with measured doses to verify that the dose calculations are accurate for the open fields listed above. The QA results showed dosimetric accuracy of the TPS for open fields within the specified tolerance limits. However, for the large square (25 cm x 25 cm) and rectangular (20 cm x 5 cm) fields, some points were out of tolerance in the penumbra region (11.38% and 10.9%, respectively). For the SSD-variation test, the large field resulting from an SSD of 125 cm for a 10 cm x 10 cm field showed an error of 0.2% on the central axis and 1.01% in the penumbra. Overall, the results yielded differences within the accepted tolerance levels as recommended, although large fields showed variations in the penumbra. These differences between the dose values predicted by the TPS and the measured values at the same points may result from limitations of the dose calculation, uncertainties in the measurement procedure, or fluctuations in the output of the accelerator.
Keywords: quality assurance, dose calculation, 3D treatment planning system, photon beam
Procedia PDF Downloads 517
29059 JavaScript Object Notation Data against eXtensible Markup Language Data in Software Applications a Software Testing Approach
Authors: Theertha Chandroth
Abstract:
This paper presents a comparative study on how to check JSON (JavaScript Object Notation) data against XML (eXtensible Markup Language) data from a software testing point of view. JSON and XML are widely used data interchange formats, each with its unique syntax and structure. The objective is to explore various techniques and methodologies for validating the comparison and integration of JSON data with XML data and vice versa. By understanding the process of checking JSON data against XML data, testers, developers and data practitioners can ensure accurate data representation, seamless data interchange, and effective data validation.
Keywords: XML, JSON, data comparison, integration testing, Python, SQL
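One minimal way to implement such a check, using only the Python standard library, is sketched below; the element and key names are illustrative assumptions rather than a specific test suite from the paper.

```python
import json
import xml.etree.ElementTree as ET

def xml_to_dict(element):
    """Flatten a simple XML element (no attributes, no repeated tags) into a dict."""
    if len(element) == 0:
        return element.text
    return {child.tag: xml_to_dict(child) for child in element}

json_payload = '{"user": {"id": "42", "name": "Ada"}}'
xml_payload = "<user><id>42</id><name>Ada</name></user>"

json_data = json.loads(json_payload)["user"]
xml_data = xml_to_dict(ET.fromstring(xml_payload))

# The check passes only if both representations carry the same data.
assert json_data == xml_data, f"Mismatch: {json_data} != {xml_data}"
print("JSON and XML payloads are equivalent")
```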
Procedia PDF Downloads 140
29058 Antistress Effects of Hydrangeae Dulcis Folium on Net Handing Stress-Induced Anxiety-Like Behavior in Zebrafish: Possible Mechanism of Action of Adrenocorticotropin Hormone (ACTH) Receptor
Authors: Lee Seungheon, Kim Ba-Ro
Abstract:
In this study, the anti-stress effects of the ethanolic extract of Hydrangeae Dulcis Folium (EHDF) were investigated. To determine the effects of EHDF on physical stress, changes in the whole-body cortisol level and behaviour were monitored in zebrafish. To induce physical stress, we used the net handling stress (NHS). Fish were treated with EHDF for 6 min before they were exposed to stress, and the fish were either evaluated via behavioural tests, including a novel tank test and an open field test or sacrificed to collect body fluid from the whole body. The results indicate that increased anxiety-like behaviours in the novel tank test and open field test under stress were recovered by treatment with EHDF at 5, 10 and 20 mg/L (P < 0.05). Moreover, compared with the normal group, which was not treated with NHS, the whole-body cortisol level was significantly increased by treatment with NHS in the control group. Compared with the control group, pre-treatment with EHDF at concentrations of 5, 10 and 20 mg/L for 6 min significantly prevented the increase in the whole-body cortisol level induced by NHS (P < 0.05). In addition, adrenocorticotropin hormone (ACTH) challenge studies showed that EHDF completely blocked the effects of ACTH (0.2 IU/g, IP) on cortisol secretion. These results suggest that EHDF may be a good anti-stress candidate and that its mechanism of action may be related to its positive effects on cortisol release.Keywords: net handling stress, zebrafish, hydrangeae dulcis folium, whole-body cortisol, novel tank test, open field test
Procedia PDF Downloads 299
29057 Discrete Element Method Simulation of Crushable Pumice Sand
Authors: Sayed Hessam Bahmani, Rolsndo P. Orense
Abstract:
From an engineering point of view, pumice particles are problematic because of their crushability and compressibility due to their vesicular nature. Currently, information on the geotechnical characteristics of pumice sands is limited. While extensive empirical and laboratory tests can be implemented to characterize their behavior, these are generally time-consuming and expensive. These drawbacks have motivated attempts to study the effects of particle breakage of pumice sand through the Discrete Element Method (DEM). This method provides insights into the behavior of crushable granular material at both the micro and macro-level. In this paper, the results of single-particle crushing tests conducted in the laboratory are simulated using DEM through the open-source code YADE. This is done to better understand the parameters necessary to represent the pumice microstructure that governs its crushing features, and to examine how the resulting microstructure evolution affects a particle’s properties. The DEM particle model is then used to simulate the behavior of pumice sand during consolidated drained triaxial tests. The results indicate the importance of incorporating particle porosity and unique surface textures in the material characterization and show that interlocking between the crushed particles significantly influences the drained behavior of the pumice specimen.Keywords: pumice sand, triaxial compression, simulation, particle breakage
Procedia PDF Downloads 245
29056 The Adoption and Use of Social Media as a Source of Information by Egyptian Government Journalists
Authors: Essam Mansour
Abstract:
This study purposes to explore the adoption and use of social media as a source of information by Egyptian government journalists. It applied a survey with a total of 386 journalists representing the three official newspapers of Egypt. Findings showed that 27.2% of journalists were found to not use social media, mainly males (69.7%), older than 40 years (77.7%) and mostly with a BA degree (80.4%). On the other hand, 72.8% of them were found to use these platforms who were also males (59.1%), younger than 40 years (65.9%) and mostly with a BA degree (93.2%). More than two-thirds (69.9%) were somewhat old users whose experience ranged from seven to ten years, and more than two-thirds (73.5%) have been heavily using these platforms (four to more than six hours a day. Such results confirm that a large number (95.7%) of users were found to be at least advanced users. Social media users’ home and work were the most significant places to access these platforms, which were found to be easy and useful to use. Most types of social media used were social news, media sharing and micro blogging, blogs comments and forums, social networking sites and bookmarking sites to perform tasks, such as finding information, making communication, keeping up to date, checking materials, sharing information and making discussions. A large number of users tend to accept these media platforms to be a source of information since they are accessible, linked references updated sources, accurate, promote current work, convenient, secured, credible, reliable, stabled, easily identified, copyrighted, build confident and contain filtered information. However, lack of know-how to cite sources, followed by lack of credibility of the source of news, lack of quality of information sources and lack of time were at least significant to journalists when using social media platforms.Keywords: social media, social networking sites, newspapers, journalists, Egypt
Procedia PDF Downloads 258
29055 AI Software Algorithms for Drivers Monitoring within Vehicles Traffic - SiaMOTO
Authors: Ioan Corneliu Salisteanu, Valentin Dogaru Ulieru, Mihaita Nicolae Ardeleanu, Alin Pohoata, Bogdan Salisteanu, Stefan Broscareanu
Abstract:
Creating a personalized statistic for an individual within the population using IT systems, based on the searches and intercepted spheres of interest they manifest, is just one 'atom' of the artificial intelligence analysis network. However, having the ability to generate statistics based on individual data intercepted from large demographic areas leads to reasoning like that issued by a human mind with global strategic ambitions. The DiaMOTO device is a technical sensory system that allows the interception of car events caused by a driver, positioning them in time and space. The device's connection to the vehicle allows the creation of a source of data whose analysis can create psychological, behavioural profiles of the drivers involved. The SiaMOTO system collects data from many vehicles equipped with DiaMOTO, driven by many different drivers with a unique fingerprint in their approach to driving. In this paper, we aimed to explain the software infrastructure of the SiaMOTO system, a system designed to monitor and improve driver driving behaviour, as well as the criteria and algorithms underlying the intelligent analysis process.Keywords: artificial intelligence, data processing, driver behaviour, driver monitoring, SiaMOTO
Procedia PDF Downloads 90
29054 Using Machine Learning Techniques to Extract Useful Information from Dark Data
Authors: Nigar Hussain
Abstract:
Dark data is a subset of big data: data that an organization collects but fails to use for future decisions. Existing work raises many issues, and several of them call for powerful tools and sufficient techniques for utilizing dark data, enabling users to exploit its value with adaptability, speed, reduced time consumption, better execution, and accessibility. A further issue is how to utilize dark data to extract helpful information on which to base better decisions. In this paper, we propose strategies to remove the dark side from dark data. Using a supervised model and machine learning techniques, we utilized dark data and achieved an F1 score of 89.48%.
Keywords: big data, dark data, machine learning, heatmap, random forest
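A minimal sketch of the supervised step is given below: a random forest trained on labelled records recovered from an archive and scored with the F1 metric reported above. The data are synthetic placeholders, not the dataset used in the paper.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, f1_score

# Synthetic stand-in for labelled records extracted from a dark-data archive
X, y = make_classification(n_samples=3000, n_features=20, weights=[0.7, 0.3], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)

print("confusion matrix:\n", confusion_matrix(y_te, pred))
print("F1 score:", f1_score(y_te, pred))
```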
Procedia PDF Downloads 28
29053 Application Difference between Cox and Logistic Regression Models
Authors: Idrissa Kayijuka
Abstract:
The logistic regression model and the Cox regression model (proportional hazards model) are currently employed in the analysis of prospective epidemiologic research into risk factors for chronic diseases, and a theoretical relationship between the two models has been studied. By definition, the Cox regression model, also called the Cox proportional hazards model, is a procedure used to model data on the time leading up to an event when censored cases exist, whereas the logistic regression model is mostly applicable when the independent variables consist of numerical as well as nominal values and the outcome variable is binary (dichotomous). Many researchers have given overviews of the Cox and logistic regression models and their applications in different areas. In this work, the analysis is performed on secondary data, namely the SPSS exercise data set on breast cancer with a sample size of 1,121 women, and the main objective is to show the difference in application between the Cox regression model and the logistic regression model on the basis of factors that cause women to die of breast cancer. Some analysis, for example on lymph node status, was done manually, and the SPSS software was used to analyze the data. The study found that there is a difference in application between the Cox and logistic regression models: the Cox regression model is used when one wishes to analyze data that also include follow-up time, whereas the logistic regression model analyzes data without follow-up time. They also have different measures of association: the hazard ratio for the Cox model and the odds ratio for the logistic model. A similarity between the two models is that both are applicable to predicting the outcome of a categorical variable, i.e. a variable that can take only a restricted number of categories. In conclusion, the Cox regression model differs from logistic regression by assessing a rate instead of a proportion. Both models are suitable methods for analyzing data and can be applied in much other research, but the Cox regression model is the more recommended.
Keywords: logistic regression model, Cox regression model, survival analysis, hazard ratio
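The contrast can be made concrete with a short sketch: the study itself used SPSS, so the Python libraries below (lifelines for the Cox model, scikit-learn for the logistic model) and the synthetic covariates are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "positive_nodes": rng.poisson(3, n),   # assumed covariate: lymph node status
    "age": rng.normal(55, 10, n),
    "time": rng.exponential(60, n),        # follow-up time (months), used only by Cox
    "event": rng.integers(0, 2, n),        # 1 = died of breast cancer, 0 = censored
})

# Cox model: uses follow-up time and censoring; association measured by hazard ratios.
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "exp(coef)"]])

# Logistic model: ignores follow-up time; association measured by odds ratios.
X, y = df[["positive_nodes", "age"]], df["event"]
logit = LogisticRegression(max_iter=1000).fit(X, y)
print("odds ratios:", np.exp(logit.coef_))
```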
Procedia PDF Downloads 454
29052 Disconnect between Water, Sanitation and Hygiene Related Behaviours of Children in School and Family
Authors: Rehan Mohammad
Abstract:
Background: Improved Water, Sanitation and Hygiene (WASH) practices in schools ensure children’s health, well-being and cognitive performance. In India under various WASH interventions in schools, teachers, and other staff make every possible effort to educate children about personal hygiene, sanitation practices and harms of open defecation. However, once children get back to their families, they see other practicing inappropriate WASH behaviors, and they consequently start following them. This show disconnect between school behavior and family behavior, which needs to be bridged to achieve desired WASH outcomes. Aims and Objectives: The aim of this study is to assess the factors causing disconnect of WASH-related behaviors between school and the family of children. It also suggests behavior change interventions to bridge the gap. Methodology: The present study has chosen a mixed- method approach. Both quantitative and qualitative methods of data collection have been used in the present study. The purposive sampling for data collection has been chosen. The data have been collected from 20% children in each age group of 04-08 years and 09-12 years spread over three primary schools and 20% of households to which they belong to which is spread over three slum communities in south district of Delhi. Results: The present study shows that despite of several behavior change interventions at school level, children still practice inappropriate WASH behaviors due to disconnect between school and family behaviors. These behaviors show variation from one age group to another. The inappropriate WASH behaviors being practiced by children include open defecation, wrong disposal of garbage, not keeping personal hygiene, not practicing hand washing practices during critical junctures and not washing fruits and vegetables before eating. The present study has highlighted that 80% of children in the age group of 04-08 years still practice inappropriate WASH behaviors when they go back to their families after school whereas, this percentage has reduced to 40% in case of children in the age group 09-12 years. Present study uncovers association between school and family teaching which creates a huge gap between WASH-related behavioral practices. The study has established that children learn and de-learn the WASH behaviors due to the evident disconnect between behavior change interventions at schools and household level. The study has also made it clear that children understand the significance of appropriate WASH practices but owing to the disconnect the behaviors remain unsettled. The study proposes several behavior change interventions to sync the behaviors of children at school and family level to ensure children’s health, well-being and cognitive performance.Keywords: behavioral interventions, child health, family behavior, school behavior, WASH
Procedia PDF Downloads 111
29051 Multimodal Database of Retina Images for Africa: The First Open Access Digital Repository for Retina Images in Sub Saharan Africa
Authors: Simon Arunga, Teddy Kwaga, Rita Kageni, Michael Gichangi, Nyawira Mwangi, Fred Kagwa, Rogers Mwavu, Amos Baryashaba, Luis F. Nakayama, Katharine Morley, Michael Morley, Leo A. Celi, Jessica Haberer, Celestino Obua
Abstract:
Purpose: The main aim for creating the Multimodal Database of Retinal Images for Africa (MoDRIA) was to provide a publicly available repository of retinal images for responsible researchers to conduct algorithm development in a bid to curb the challenges of ophthalmic artificial intelligence (AI) in Africa. Methods: Data and retina images were ethically sourced from sites in Uganda and Kenya. Data on medical history, visual acuity, ocular examination, blood pressure, and blood sugar were collected. Retina images were captured using fundus cameras (Foru3-nethra and Canon CR-Mark-1). Images were stored on a secure online database. Results: The database consists of 7,859 retinal images in portable network graphics format from 1,988 participants. Images from patients with human immunodeficiency virus were 18.9%, 18.2% of images were from hypertensive patients, 12.8% from diabetic patients, and the rest from normal’ participants. Conclusion: Publicly available data repositories are a valuable asset in the development of AI technology. Therefore, is a need for the expansion of MoDRIA so as to provide larger datasets that are more representative of Sub-Saharan data.Keywords: retina images, MoDRIA, image repository, African database
Procedia PDF Downloads 127
29050 Geochemical Characterization for Identification of Hydrocarbon Generation: Implication of Unconventional Gas Resources
Authors: Yousif M. Makeen
Abstract:
This research addresses the processes of geochemical characterization and hydrocarbon generation occurring within hydrocarbon source and/or reservoir rocks. The geochemical characterization includes the organic-inorganic associations that influence the storage capacity of unconventional hydrocarbon resources (e.g., shale gas) and the migration of oil and gas in petroleum source/reservoir rocks. Kerogen, the precursor of petroleum, occurs in various forms and types and may be oil-prone, gas-prone, or both. China has a number of petroleum-bearing sedimentary basins commonly associated with shale gas, oil sands, and oil shale. The Sichuan basin, selected for this study, has recorded notable successful shale gas discoveries, especially in the marine shale reservoirs of the area; however, notable discoveries of lacustrine shale in the north-eastern Fuling area indicate the accumulation of shale gas within non-marine source rocks as well. The objective of this study is to evaluate the hydrocarbon storage capacity, generation, and retention processes in the rock matrix of hydrocarbon source/reservoir rocks within the Sichuan basin using advanced X-ray tomography 3D imaging (micro-CT), SEM (scanning electron microscopy) and optical microscopy, as well as organic geochemical facilities (e.g., vitrinite reflectance and UV light). The preliminary results of this study show that the lacustrine shales under investigation act as both source and reservoir rocks and are characterized by very fine grains and very low permeability and porosity. Three pore types have been characterized in the lacustrine shales, namely organic matter pores, interparticle pores and intraparticle pores, using X-ray computed tomography (CT). The benefits of this study would be more successful oil and gas exploration and a higher recovery factor, with a direct economic impact on China and the surrounding region. Methodologies: the SRA TOC/TPH or Rock-Eval technique is used to determine source rock richness (S1 and S2) and Tmax, and TOC analysis is carried out using a multi N/C 3100 analyzer. The SRA and TOC results are used to calculate other parameters such as the hydrogen index (HI) and production index (PI). This analysis indicates the quantity of organic matter; the minimum TOC limits generally accepted as essential for a source rock are 0.5% for shales and 0.2% for carbonates. Contributions: this research could solve issues related to oil potential, provide targets, and serve as a pathfinder for future exploration activity in the Sichuan basin.
Keywords: shale gas, unconventional resources, organic chemistry, Sichuan basin
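The derived parameters mentioned above follow standard Rock-Eval arithmetic, HI = 100 x S2 / TOC and PI = S1 / (S1 + S2); the short sketch below works through the calculation with assumed measurement values, not data from the study.

```python
def hydrogen_index(s2_mg_g, toc_percent):
    """Hydrogen index in mg HC / g TOC: 100 * S2 / TOC."""
    return 100.0 * s2_mg_g / toc_percent

def production_index(s1_mg_g, s2_mg_g):
    """Production index (dimensionless): S1 / (S1 + S2)."""
    return s1_mg_g / (s1_mg_g + s2_mg_g)

# Assumed pyrolysis results for a single lacustrine shale sample
s1, s2, toc = 0.8, 4.5, 2.1   # mg HC/g rock, mg HC/g rock, wt.%

print("HI =", round(hydrogen_index(s2, toc), 1), "mg HC/g TOC")
print("PI =", round(production_index(s1, s2), 2))
```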
Procedia PDF Downloads 37
29049 In Search of CO₂: Gravity and Magnetic Data for Enhanced Oil Recovery (EOR) Prospect Generation in Central Libya
Authors: Ahmed Saheel
Abstract:
Enhanced oil recovery using carbon dioxide (CO₂-EOR) is a method that can increase oil production beyond what is typically achievable using conventional recovery methods by injecting, and hence storing, carbon dioxide (CO₂) in the oil reservoir. In Libya, plans are under way to source a proportion of this CO₂ from subsurface geology that is known from previous drilling to contain high volumes of CO₂. But first these subsurface volumes need to be more clearly defined and understood. Focusing on the Al-Harouj region of central Libya, ground gravity and airborne magnetic data from the LPI database and the African Magnetic Mapping Project respectively have been prepared and processed by Libyan Petroleum Institute (LPI) and Reid Geophysics Limited (RGL) to produce a range of grids and related products suitable for interpreting geological structure and to make recommendations for subsequent work that will assist CO₂ exploration for purposes of enhanced oil recovery (EOR).Keywords: gravity, magnetic, deduced lineaments, upward continuation
Procedia PDF Downloads 120
29048 The Problem of the Use of Learning Analytics in Distance Higher Education: An Analytical Study of the Open and Distance University System in Mexico
Authors: Ismene Ithai Bras-Ruiz
Abstract:
Learning Analytics (LA) is employed by universities not only as a tool but as a specialized field for supporting students and professors. However, not all academic programs apply LA with the same goals or use the same tools. In fact, LA comprises five main fields of study (academic analytics, action research, educational data mining, recommender systems, and personalized systems). These fields can help not just to inform academic authorities about the situation of a program, but also to detect at-risk students, professors with needs, or general problems. At the highest level, Artificial Intelligence techniques are applied to support learning practices. LA has adopted different techniques: statistics, ethnography, data visualization, machine learning, natural language processing, and data mining. Each academic program is expected to decide which field it wants to utilize on the basis of its academic interests, but also of its capacities in terms of professors, administrators, systems, logistics, data analysts, and academic goals. The Open and Distance University System (SUAYED in Spanish) of the National Autonomous University of Mexico (UNAM) has been working for forty years as an alternative to traditional programs; one of its main supports has been the use of new information and communications technologies (ICT). Today, UNAM has one of the largest networked higher education programs, with twenty-six academic programs in different faculties. This means that every faculty works with heterogeneous populations and academic problems, and every program has developed its own learning analytics techniques to address academic issues. In this context, an investigation was carried out to establish the state of LA application in all the academic programs across the faculties. The premise of the study was that not all the faculties have utilized advanced LA techniques and that they probably do not know which field of study is closest to their program goals. Consequently, not all the programs know about LA, but this does not mean they do not work with LA in a veiled or less explicit sense. It is very important to know the degree of knowledge about LA for two reasons: 1) it allows the administration's work to improve the quality of teaching to be appreciated, and 2) it shows whether it is possible to improve other LA techniques. For this purpose, three instruments were designed to determine experience and knowledge of LA. These were applied to ten faculty coordinators and their personnel; thirty members were consulted (the academic secretary, the systems manager or data analyst, and the coordinator of each program). The final report showed that almost all the programs work with basic statistical tools and techniques; this helps the administration only to know what is happening inside the academic program, but they are not ready to move up to the next level, that is, applying Artificial Intelligence or recommender systems to reach a personalized learning system. This situation is not related to knowledge of LA, but to the clarity of long-term goals.
Keywords: academic improvements, analytical techniques, learning analytics, personnel expertise
Procedia PDF Downloads 128
29047 Detailed Depositional Resolutions in Upper Miocene Sands of HT-3X Well, Nam Con Son Basin, Vietnam
Authors: Vo Thi Hai Quan
Abstract:
Nam Con Son sedimentary basin is one of the very important oil and gas basins in offshore Vietnam. Hai Thach field of block 05-2 contains mostly gas accumulations in fine-grained, sand/mud-rich turbidite system, which was deposited in a turbidite channel and fan environment. Major Upper Miocene reservoir of HT-3X lies above a well-developed unconformity. The main objectives of this study are to reconstruct depositional environment and to assess the reservoir quality using data from 14 meters of core samples and digital wireline data of the well HT-3X. The wireline log and core data showed that the vertical sequences of representative facies of the well mainly range from Tb to Te divisions of Bouma sequences with predominance of Tb and Tc compared to Td and Te divisions. Sediments in this well were deposited in a submarine fan association with very fine to fine-grained, homogeneous sandstones that have high porosity and permeability, high- density turbidity currents with longer transport route from the sediment source to the basin, indicating good quality of reservoir. Sediments are comprised mainly of the following sedimentary structures: massive, laminated sandstones, convoluted bedding, laminated ripples, cross-laminated ripples, deformed sandstones, contorted bedding.Keywords: Hai Thach field, Miocene sand, turbidite, wireline data
Procedia PDF Downloads 292
29046 Assimilating Multi-Mission Satellites Data into a Hydrological Model
Authors: Mehdi Khaki, Ehsan Forootan, Joseph Awange, Michael Kuhn
Abstract:
Terrestrial water storage, as a source of freshwater, plays an important role in human lives. Hydrological models offer important tools for simulating and predicting water storages at global and regional scales. However, their comparisons with 'reality' are imperfect mainly due to a high level of uncertainty in input data and limitations in accounting for all complex water cycle processes, uncertainties of (unknown) empirical model parameters, as well as the absence of high resolution (both spatially and temporally) data. Data assimilation can mitigate this drawback by incorporating new sets of observations into models. In this effort, we use multi-mission satellite-derived remotely sensed observations to improve the performance of World-Wide Water Resources Assessment system (W3RA) hydrological model for estimating terrestrial water storages. For this purpose, we assimilate total water storage (TWS) data from the Gravity Recovery And Climate Experiment (GRACE) and surface soil moisture data from the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E) into W3RA. This is done to (i) improve model estimations of water stored in ground and soil moisture, and (ii) assess the impacts of each satellite of data (from GRACE and AMSR-E) and their combination on the final terrestrial water storage estimations. These data are assimilated into W3RA using the Ensemble Square-Root Filter (EnSRF) filtering technique over Mississippi Basin (the United States) and Murray-Darling Basin (Australia) between 2002 and 2013. In order to evaluate the results, independent ground-based groundwater and soil moisture measurements within each basin are used.Keywords: data assimilation, GRACE, AMSR-E, hydrological model, EnSRF
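A minimal sketch of the EnSRF update for a single scalar observation (for instance one GRACE TWS value for a grid cell) is given below, following the standard serial square-root formulation; the state layout, observation operator and all numbers are assumptions, not the W3RA configuration.

```python
import numpy as np

def ensrf_update_scalar(X, Hx, y_obs, r_var):
    """Serial EnSRF update of ensemble X (n_state, n_ens) with one scalar observation.

    Hx    : ensemble of model-predicted observations, shape (n_ens,)
    y_obs : observed value (e.g. GRACE TWS for one grid cell)
    r_var : observation error variance
    """
    x_mean = X.mean(axis=1, keepdims=True)
    Xp = X - x_mean                          # state perturbations
    hx_mean = Hx.mean()
    hxp = Hx - hx_mean                       # predicted-observation perturbations

    n_ens = X.shape[1]
    hpht = hxp @ hxp / (n_ens - 1)           # scalar H P H^T
    pht = Xp @ hxp / (n_ens - 1)             # vector P H^T
    k = pht / (hpht + r_var)                 # Kalman gain

    # Square-root (deterministic) factor that avoids perturbing the observation
    alpha = 1.0 / (1.0 + np.sqrt(r_var / (hpht + r_var)))
    x_mean_a = x_mean + k[:, None] * (y_obs - hx_mean)   # mean update
    Xp_a = Xp - alpha * np.outer(k, hxp)                  # perturbation update
    return x_mean_a + Xp_a

# Assumed 3-variable state (e.g. soil moisture layers plus groundwater), 30 members
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 30))
Hx = X.sum(axis=0)                           # observation operator: total water storage
X_a = ensrf_update_scalar(X, Hx, y_obs=0.5, r_var=0.2)
print("prior mean:", X.mean(axis=1), "analysis mean:", X_a.mean(axis=1))
```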
Procedia PDF Downloads 289
29045 Human Capital Mobility of a Skilled Workforce: A Need for a Future of Europe
Authors: Tiron-Tudor Adriana, Farcas Teodora Viorica, Ciolomic Ioana Andreea
Abstract:
The issue of human capital mobility inside Europe is still an open one. Even though some tools have been created to make it easier to move from one country to another to work and study, the number of people doing so is very low because of the various factors presented in this paper. The "Rethinking Education" agenda of the European Commission has opened the floor for new projects that can take steps towards a European language for skills, competences and qualifications. One of these projects is the Partnership for Exchange of experience in Student on-the-job Training (PEST). As part of this project, we examine the situation of human capital inside the EU and the instruments created so far to support this mobility. The main objective of the project is to compare the four countries involved in the PEST project (Romania, Hungary, Finland, and Estonia) at the education and internship level. The results are helpful for the follow-up of the project and for identifying where changes can and need to be made.
Keywords: ECVET, human capital mobility, partnership exchange, students on the job mobility, vocational education and training
Procedia PDF Downloads 423
29044 The Use of Orthodontic Pacifiers to Prevent Pacifier Induced Malocclusion - A Literature Review
Authors: Maliha Ahmed Suleman, Sidra Ahmed Suleman
Abstract:
Introduction: The use of pacifiers is common amongst infants and young children as a comforting behavior. These non-nutritive sucking habits can be detrimental to the developing occlusion should they persist while the permanent dentition is established. Orthodontic pacifiers have been recommended as an alternative to conventional pacifiers as they are considered to have less interference with orofacial development. However, there is a lack of consensus on whether this is true. Aim and objectives: To review the prevalence of malocclusion associated with the use of orthodontic pacifiers. Methodology: Literature was identified through a rigorous search of the Embase, Pubmed, CINAHL, and Cochrane Library databases. Articles published from 2000 onwards were included. In total, 5 suitable papers were identified. Results: One study showed that the use of orthodontic pacifiers increased the risk of malocclusion, as seen through a greater prevalence of accentuated overjet, posterior crossbites, and anterior open bites in comparison to individuals who did not use pacifiers. However, this study found that there was a clinically significant reduction in the prevalence of anterior open bites amongst orthodontic pacifier users in comparison to conventional pacifier users. Another study found that both types of pacifiers lead to malocclusion; however, they found no difference in the mean overjet and prevalence of anterior open bites amongst conventional and orthodontic pacifier users. In contrast, one study suggested that orthodontic pacifiers do not seem to be related to the development of malocclusions in the primary dentitions, and using them between the ages of 0-3 months was actually beneficial as it prevents thumb-sucking habits. One of the systemic reviews concluded that orthodontic pacifiers do not seem to reduce the occurrence of posterior crossbites; however, they could reduce the development of open bites by virtue of their thin neck design. Whereas another systematic review concluded that there were no differences as to the effects on the stomatognathic system when comparing conventional and orthodontic pacifiers. Conclusion: There is limited and conflicting evidence to support the notion that orthodontic pacifiers can reduce the prevalence of malocclusion when compared to conventional pacifiers. Well-designed randomized controlled trials are required in the future in order to thoroughly assess the effects of orthodontic pacifiers on the developing occlusion and orofacial structures.Keywords: orthodontics, pacifier, malocclusion, review
Procedia PDF Downloads 85
29043 The DAQ Debugger for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
In general, state-of-the-art data acquisition systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, comprising thousands of lines of code, the debugging process is essential to reveal all software issues; unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting it or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal the remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework
Procedia PDF Downloads 284
29042 Using Non-Negative Matrix Factorization Based on Satellite Imagery for the Collection of Agricultural Statistics
Authors: Benyelles Zakaria, Yousfi Djaafar, Karoui Moussa Sofiane
Abstract:
Agriculture is fundamental and remains an important objective of the Algerian economy; based on traditional techniques and structures, it generally serves consumption purposes. The collection of agricultural statistics in Algeria is done using traditional methods, which consist of investigating land use through surveys and field visits. These statistics suffer from problems such as poor data quality, the long delay between collection and final availability, and a high cost compared to their limited use. The objective of this work is to develop a processing chain for a reliable inventory of agricultural land by developing and implementing a new method of extracting information. This methodology allowed us to combine remote sensing data with field data to collect statistics on different land areas. The contribution of remote sensing to the improvement of agricultural statistics, in terms of area, has been studied in the wilaya of Sidi Bel Abbes. In this context, we applied a method for extracting information from satellite images called non-negative matrix factorization (NMF), which does not treat the pixel as a single entity but instead looks for the components within the pixel itself. The results obtained with NMF were compared with field data and with the results obtained by the maximum likelihood method, and we observed good agreement between the main NMF results and the field data. We believe that this method of extracting information from satellite data leads to interesting results for different types of land use.
Keywords: blind source separation, hyper-spectral image, non-negative matrix factorization, remote sensing
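A minimal sketch of NMF-based unmixing on a pixels-by-bands matrix is shown below using scikit-learn: each pixel is decomposed into non-negative abundances (W) of non-negative end-member spectra (H) rather than being assigned to a single class. The scene is synthetic and the component count is an assumption.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# Synthetic scene: 3 land-cover end-members mixed across 2,500 pixels and 8 bands
true_endmembers = rng.random((3, 8))
abundances = rng.dirichlet(np.ones(3), size=2500)
pixels = abundances @ true_endmembers + 0.01 * rng.random((2500, 8))

model = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
W = model.fit_transform(pixels)   # per-pixel abundances, shape (2500, 3)
H = model.components_             # estimated end-member spectra, shape (3, 8)

dominant = W.argmax(axis=1)       # crude per-pixel dominant land-use component
print("pixels per dominant component:", np.bincount(dominant))
print("reconstruction error:", model.reconstruction_err_)
```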
Procedia PDF Downloads 423
29041 Analysis of Magnetic Anomaly Data for Identification Structure in Subsurface of Geothermal Manifestation at Candi Umbul Area, Magelang, Central Java Province, Indonesia
Authors: N. A. Kharisa, I. Wulandari, R. Narendratama, M. I. Faisal, K. Kirana, R. Zipora, I. Arfiansah, I. Suyanto
Abstract:
A geophysical survey with the magnetic method was carried out at the geothermal manifestation at Candi Umbul, Grabag, Magelang, Central Java Province, on 10-12 May 2013. The objective of this research is to interpret the geological structure that controls the geothermal system in the Candi Umbul area. The survey covered an area of 1.5 km x 2 km with a station spacing of 150 m and a line spacing of 150 m, using a Geometrics G-856 proton precession magnetometer (PPM). Data processing began with the IGRF and diurnal variation corrections to obtain the total magnetic field anomaly; further processing included reduction to the pole, upward continuation, and the residual anomaly, and these results formed the basis of the qualitative interpretation step. The largest body causing a low anomaly is located in the centre of the survey area; it is associated with the hot spring manifestation and a demagnetization zone, indicating heat source activity. Modelling of the anomaly map was then used for the quantitative interpretation step; the result is a model of rock layers and geological structure that informs the understanding of the geothermal system, and the quantitative interpretation also yields lithological susceptibilities. These are: andesite as the heat source (k = 0.00014 emu), basalt as the alteration rock (k = 0.0016 emu), volcanic breccia as the reservoir rock (k = 0.0026 emu), porphyritic andesite as the cap rock (k = 0.004 emu), andesite lava (k = 0.003 emu), and alluvium (k = 0.0007 emu). The hot spring manifestation is controlled by a normal fault, which acts as a weak zone easily traversed by hot water coming from the geothermal reservoir.
Keywords: geological structure, geothermal system, magnetic, susceptibility
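The upward-continuation step mentioned above has a standard wavenumber-domain form, F_up(k) = F(k) * exp(-h|k|), which attenuates short-wavelength (shallow) signal and emphasises deeper sources such as the inferred heat-source body. The sketch below applies it to a synthetic grid; the 150 m spacing mirrors the survey spacing, but the values are not survey data.

```python
import numpy as np

def upward_continue(grid, dx, dy, height):
    """Upward-continue a gridded magnetic anomaly by `height` (same units as dx, dy)."""
    ny, nx = grid.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)   # radial wavenumber
    spectrum = np.fft.fft2(grid)
    return np.real(np.fft.ifft2(spectrum * np.exp(-height * k)))

rng = np.random.default_rng(0)
anomaly = rng.normal(size=(64, 64))                     # synthetic anomaly grid
smoothed = upward_continue(anomaly, dx=150.0, dy=150.0, height=300.0)

print("std before:", anomaly.std(), "std after:", smoothed.std())
```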
Procedia PDF Downloads 384
29040 Radiation Safety Factor of Education and Research Institution in Republic of Korea
Authors: Yeo Ryeong Jeon, Pyong Kon Cho, Eun Ok Han, Hyon Chul Jang, Yong Min Kim
Abstract:
This study surveyed perceptions of radiation safety among radiation safety managers and workers employed in education and research institutions in the Republic of Korea. At present, South Korea has no radiation safety guideline or manual for education and research institutions; we therefore sought an educational basis for the development of such a guideline and manual. To assess the level of knowledge, attitude, and behavior regarding radiation safety, we used a questionnaire consisting of 29 questions on knowledge, attitude and behavior and 4 questions on self-efficacy and expectation, structured around four factors (radiation source, human, organizational and physical environment) of Haddon's matrix. Responses were collected between May 4 and June 30, 2015. The questionnaires were analyzed with IBM SPSS/WIN 15, a well-known statistical package for the social sciences, and the data were compared using means, standard deviations, Pearson's correlation, ANOVA (analysis of variance) and regression analysis. In total, 180 copies of the questionnaire were returned from 60 workplaces. The overall mean behavior level was relatively lower than the knowledge and attitude levels; in particular, the organizational environment factor of radiation safety management showed the lowest behavior level. Most of the factors were correlated in the Pearson's correlation analysis, especially knowledge of human factors with behavior of human factors (Pearson's correlation coefficient 0.809, P<.01). When the analysis was performed according to the main radiation source type, institutions that used only open radioisotope (RI) sources had the lowest behavior level among all subjects. Finally, knowledge of the radiation source factor (β=0.556, P<.001) and of the human factor (β=0.376, P<.001) had the greatest impact on behavioral practice. Radiation safety managers and workers think positively about radiation safety management but are poorly informed about the organizational environment of their institutions. Thus, each institution needs to make efforts to establish a radiation safety culture, and pedagogical interventions for improving knowledge of radiation safety are needed for the prevention of safety accidents.
Keywords: radiation safety management, factor analysis, SPSS, republic of Korea
Procedia PDF Downloads 364
29039 In Search of Bauman’s Moral Impulse in Shadow Factories of China
Authors: Akram Hatami, Naser Firoozi, Vesa Puhakka
Abstract:
Ethics and responsibility are rapidly becoming a distinguishing feature of organizations. In this paper, we analyze ethics and responsibility in shadow factories in China. We engage ourselves with Bauman’s moral impulse perspective because his idea can contextualize ethics and responsibility. Moral impulse is a feeling of a selfless, infinite and unconditional responsibility towards, and care for, Others. We analyze a case study from a secondary data source because, for such a critical phenomenon as business ethics in shadow factories, collecting primary data is difficult, since they are unregistered factories. We argue that there has not been enough attention given to the ethics and responsibility in shadow factories in China. Our main goal is to demonstrate that, considering the Other, more importantly the employees, in ethical decision-making is a simple instruction beyond the narrow version of ethics by ethical codes and rules.Keywords: moral impulse, responsibility, shadow factories, Bauman’s moral impulse
Procedia PDF Downloads 329
29038 A Comprehensive Methodology for Voice Segmentation of Large Sets of Speech Files Recorded in Naturalistic Environments
Authors: Ana Londral, Burcu Demiray, Marcus Cheetham
Abstract:
Speech recording is a methodology used in many different studies related to cognitive and behaviour research. Modern advances in digital equipment brought the possibility of continuously recording hours of speech in naturalistic environments and building rich sets of sound files. Speech analysis can then extract from these files multiple features for different scopes of research in Language and Communication. However, tools for analysing a large set of sound files and automatically extract relevant features from these files are often inaccessible to researchers that are not familiar with programming languages. Manual analysis is a common alternative, with a high time and efficiency cost. In the analysis of long sound files, the first step is the voice segmentation, i.e. to detect and label segments containing speech. We present a comprehensive methodology aiming to support researchers on voice segmentation, as the first step for data analysis of a big set of sound files. Praat, an open source software, is suggested as a tool to run a voice detection algorithm, label segments and files and extract other quantitative features on a structure of folders containing a large number of sound files. We present the validation of our methodology with a set of 5000 sound files that were collected in the daily life of a group of voluntary participants with age over 65. A smartphone device was used to collect sound using the Electronically Activated Recorder (EAR): an app programmed to record 30-second sound samples that were randomly distributed throughout the day. Results demonstrated that automatic segmentation and labelling of files containing speech segments was 74% faster when compared to a manual analysis performed with two independent coders. Furthermore, the methodology presented allows manual adjustments of voiced segments with visualisation of the sound signal and the automatic extraction of quantitative information on speech. In conclusion, we propose a comprehensive methodology for voice segmentation, to be used by researchers that have to work with large sets of sound files and are not familiar with programming tools.Keywords: automatic speech analysis, behavior analysis, naturalistic environments, voice segmentation
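The study itself runs the voice detection in Praat; purely as an illustration of the first segmentation step, the sketch below labels voiced segments in 30-second WAV samples with a simple RMS-energy threshold and walks an assumed folder structure. The threshold, frame length and folder name are assumptions.

```python
import wave
from pathlib import Path
import numpy as np

def voiced_segments(wav_path, frame_ms=30, threshold=0.02):
    """Label frames of a mono 16-bit WAV file as speech/non-speech by RMS energy."""
    with wave.open(str(wav_path), "rb") as w:
        rate = w.getframerate()
        samples = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
    samples = samples / 32768.0
    frame_len = int(rate * frame_ms / 1000)
    n_frames = len(samples) // frame_len
    frames = samples[: n_frames * frame_len].reshape(n_frames, frame_len)
    voiced = np.sqrt((frames ** 2).mean(axis=1)) > threshold

    # Collect (start_s, end_s) for each contiguous run of voiced frames
    segments, start = [], None
    for i, flag in enumerate(voiced):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            segments.append((start * frame_ms / 1000, i * frame_ms / 1000))
            start = None
    if start is not None:
        segments.append((start * frame_ms / 1000, n_frames * frame_ms / 1000))
    return segments

# Walk an assumed folder structure of 30-second EAR samples and report speech segments
for path in sorted(Path("ear_samples").rglob("*.wav")):
    print(path, "voiced segments:", voiced_segments(path))
```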
Procedia PDF Downloads 281
29037 Detection of Selected Heavy Metals in Raw Milk: Lahore, Pakistan
Authors: Huma Naeem, Saif-Ur-Rehman Kashif, Muhammad Nawaz Chaudhry
Abstract:
Milk plays a significant role in the dietary requirements of human beings as it is a single source that provides various essential nutrients. A study was conducted to evaluate the heavy metal concentration in the raw milk marketed in Data Gunj Baksh Town of Lahore. A total of 180 samples of raw milk were collected in pre-monsoon, monsoon and post-monsoon season from five colonies of Data Gunj Baksh Town, Lahore. The milk samples were subjected to heavy metal analysis (Cr, Cu) by atomic absorption spectrophotometer. Results indicated high levels of Cr and Cu in post-monsoon seasons. Heavy metals were detected in milk in all samples under study and exceeded the standards given by FAO.Keywords: atomic absorption spectrophotometer, chromium, copper, heavy metal
Procedia PDF Downloads 433