Search results for: analysis data
40950 Housing Price Dynamics: Comparative Study of 1980-1999 and the New Millennium
Authors: Janne Engblom, Elias Oikarinen
Abstract:
The understanding of housing price dynamics is of importance to a great number of agents: to portfolio investors, banks, real estate brokers and construction companies as well as to policy makers and households. A panel dataset is one that follows a given sample of individuals over time, and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models which form a wide range of linear models. A special case of panel data models is dynamic in nature. A complication regarding a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of the estimates. Several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Common Correlated Effects (CCE) estimator for dynamic panel data, which also accounts for the cross-sectional dependence caused by common structures of the economy. In the presence of cross-sectional dependence, standard OLS gives biased estimates. In this study, U.S. housing price dynamics were examined empirically using the dynamic CCE estimator, with the first difference of housing price as the dependent variable and the first differences of per capita income, interest rate, housing stock and lagged price, together with the deviation of housing prices from their long-run equilibrium level, as the independent variables. These deviations were also estimated from the data. The aim of the analysis was to provide estimates and to compare them between 1980-1999 and 2000-2012. Based on data from 50 U.S. cities over 1980-2012, the differences in the short-run housing price dynamics estimates were mostly significant when the two time periods were compared. Significance tests of the differences were provided by a model containing interaction terms between the independent variables and a time dummy variable. Residual analysis showed very low cross-sectional correlation of the model residuals compared with the standard OLS approach, which indicates a good fit of the CCE model. Estimates of the dynamic panel data model were in line with the theory of housing price dynamics. The results also suggest that housing market dynamics evolve over time.
Keywords: dynamic model, panel data, cross-sectional dependence, interaction model
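To make the estimation strategy concrete, the sketch below illustrates the CCE idea (augmenting the regression with per-period cross-sectional averages of the variables to absorb common factors) together with the interaction-with-a-time-dummy test of differing dynamics. The file name, column names and model form are hypothetical; this is a minimal sketch of the approach, not the paper's estimator.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: one row per (city, year) with first-differenced variables
# and the estimated deviation from long-run equilibrium (ecm).
df = pd.read_csv("housing_panel.csv")  # columns: city, year, d_price, d_income, d_rate, d_stock, ecm
df = df.sort_values(["city", "year"])
df["lag_d_price"] = df.groupby("city")["d_price"].shift(1)
df = df.dropna()

# CCE idea: add per-year cross-sectional averages of the dependent variable and
# the regressors as extra controls to absorb common factors.
for col in ["d_price", "d_income", "d_rate", "d_stock"]:
    df["cm_" + col] = df.groupby("year")[col].transform("mean")

# Time dummy for 2000-2012, interacted with the regressors to test whether the
# short-run dynamics differ from the 1980-1999 period.
df["post2000"] = (df["year"] >= 2000).astype(int)

model = smf.ols(
    "d_price ~ (lag_d_price + d_income + d_rate + d_stock + ecm) * post2000"
    " + cm_d_price + cm_d_income + cm_d_rate + cm_d_stock + C(city)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["city"]})
print(model.summary())
```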
Procedia PDF Downloads 251
40949 Identifying Critical Success Factors for Data Quality Management through a Delphi Study
Authors: Maria Paula Santos, Ana Lucas
Abstract:
Organizations support their operations and decision making with the data they have at their disposal, so the quality of these data is remarkably important. Data Quality (DQ) is currently a relevant issue, and the literature is unanimous in pointing out that poor DQ can result in large costs for organizations. The literature review identified and described 24 Critical Success Factors (CSF) for Data Quality Management (DQM), which were presented to a panel of experts who ordered them according to their degree of importance, using the Delphi method with the Q-sort technique, based on an online questionnaire. The study shows that the five most important CSF for DQM are: definition of appropriate policies and standards, control of inputs, definition of a strategic plan for DQ, an organizational culture focused on data quality, and obtaining top management commitment and support.
Keywords: critical success factors, data quality, data quality management, Delphi, Q-sort
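For readers unfamiliar with rank aggregation in Delphi/Q-sort studies, the sketch below shows one common way to combine expert rankings into mean ranks and to quantify consensus with Kendall's coefficient of concordance (W). The data are made up, and the paper does not state which consensus statistic, if any, was used, so W here is only an illustrative assumption.

```python
import numpy as np

# Hypothetical Delphi round: each row is one expert's ranking of 24 CSFs (1 = most important).
rng = np.random.default_rng(42)
n_experts, n_factors = 15, 24
rankings = np.array([rng.permutation(n_factors) + 1 for _ in range(n_experts)])

# Aggregate: mean rank per factor (a lower mean rank means a more important factor).
mean_ranks = rankings.mean(axis=0)
print("Top five CSFs by mean rank (factor indices):", np.argsort(mean_ranks)[:5])

# Kendall's W as a consensus measure (0 = no agreement, 1 = perfect agreement).
rank_sums = rankings.sum(axis=0)
s = ((rank_sums - rank_sums.mean()) ** 2).sum()
w = 12 * s / (n_experts**2 * (n_factors**3 - n_factors))
print(f"Kendall's W = {w:.3f}")
```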
Procedia PDF Downloads 217
40948 FLEX: A Backdoor Detection and Elimination Method in Federated Scenario
Authors: Shuqi Zhang
Abstract:
Federated learning allows users to participate in collaborative model training without sending data to third-party servers, reducing the risk of user data privacy leakage, and is widely used in smart finance and smart healthcare. However, the distributed architecture of federated learning and the existence of secure aggregation protocols make it inherently vulnerable to backdoor attacks. To solve this problem, FLEX, a federated learning backdoor defense framework based on group aggregation, cluster analysis, and neuron pruning, is proposed, and compatibility with secure aggregation protocols is achieved. The good performance of FLEX is verified by building a horizontal federated learning framework on the CIFAR-10 dataset for experiments; it achieves a 98% backdoor detection success rate and reduces the success rate of backdoor tasks to 0%-10%.
Keywords: federated learning, secure aggregation, backdoor attack, cluster analysis, neuron pruning
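FLEX combines group aggregation, cluster analysis, and neuron pruning; the sketch below illustrates only the cluster-analysis intuition, flagging client updates that fall into a small, distant cluster, on synthetic update vectors. It is not the FLEX algorithm itself, and all sizes and values are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
dim, n_benign, n_malicious = 128, 18, 2

# Synthetic flattened model updates: benign clients cluster together,
# while backdoored clients drift in a different direction.
benign = rng.normal(0.0, 0.05, size=(n_benign, dim))
malicious = rng.normal(0.8, 0.05, size=(n_malicious, dim))
updates = np.vstack([benign, malicious])

# Two-way clustering of the updates; the smaller cluster is treated as suspicious.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(updates)
minority = np.argmin(np.bincount(labels))
print("Clients flagged as potentially backdoored:", np.where(labels == minority)[0])
```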
Procedia PDF Downloads 96
40947 Use of Artificial Intelligence in Teaching Practices: A Meta-Analysis
Authors: Azmat Farooq Ahmad Khurram, Sadaf Aslam
Abstract:
This meta-analysis systematically examines the use of artificial intelligence (AI) in instructional methods across diverse educational settings through a thorough analysis of empirical research encompassing various disciplines, educational levels, and regions. This study aims to assess the effects of AI integration on teaching methodologies, classroom dynamics, teachers' roles, and student engagement. Various research methods were used to gather data, including literature reviews, surveys, interviews, and focus group discussions. Findings indicate paradigm shifts in teaching and education, identify emerging trends, practices, and the application of artificial intelligence in learning, and provide educators, policymakers, and stakeholders with guidelines and recommendations for effectively integrating AI in educational contexts. The study concludes by suggesting future research directions and practical considerations for maximizing AI's positive influence on pedagogical practices.
Keywords: artificial intelligence, teaching practices, meta-analysis, teaching-learning
Procedia PDF Downloads 77
40946 The Effect of Group Counseling Program on 9th Grade Students' Assertiveness Levels
Authors: Ismail Seçer, Kerime Meryem Dereli̇oğlu
Abstract:
This study was conducted to determine the effects of a group counseling program on secondary school 9th grade students' assertiveness skills. The study group consisted of 100 students enrolled at Erzurum Kültür Elementary School in the 2015-2016 academic year. The RAE-Rathus Assertiveness Schedule developed by Voltan Acar was applied to this group to gather data. The 40 students with the lowest scores on the inventory were divided randomly into experimental and control groups, each consisting of 20 students. The group counseling program was carried out with the experimental group for 8 weeks to improve the students' assertiveness skills. One-way and two-way analysis of covariance (ANCOVA) were used in the analysis of the data, which was performed with SPSS 19.0. The results show that the assertiveness skills of the students who participated in the group counseling program increased significantly compared to the control group and to the pre-test. Moreover, the change observed in the experimental group occurred independently of the age and socio-economic level variables, and a follow-up test applied after four months showed that this effect persisted. According to these results, the applied group counseling program can be said to be an effective means of improving the assertiveness skills of secondary school students.
Keywords: high school, assertiveness, assertiveness inventory, assertiveness education
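The study's ANCOVA was run in SPSS; as a point of reference, a minimal Python analogue of a one-way ANCOVA on made-up pre/post assertiveness scores (with the pre-test as covariate) is sketched below. Group sizes match the abstract, but all scores and the assumed treatment effect are synthetic.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 20  # students per group
pre = rng.normal(50, 8, size=2 * n)
group = np.repeat(["experimental", "control"], n)
gain = np.where(group == "experimental", 10, 0)          # assumed treatment effect
post = pre + gain + rng.normal(0, 5, size=2 * n)
df = pd.DataFrame({"group": group, "pre": pre, "post": post})

# One-way ANCOVA: post-test assertiveness by group, adjusting for the pre-test score.
model = smf.ols("post ~ C(group) + pre", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```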
Procedia PDF Downloads 246
40945 A Proposal to Tackle Security Challenges of Distributed Systems in the Healthcare Sector
Authors: Ang Chia Hong, Julian Khoo Xubin, Burra Venkata Durga Kumar
Abstract:
Distributed systems offer many benefits to the healthcare industry. From big data analysis to business intelligence, the increased computational power and efficiency of distributed systems serve as an invaluable resource for the healthcare sector. However, as the usage of these distributed systems increases, many issues arise. The main focus of this paper is on security issues, particularly information security, since people's data are especially sensitive in the healthcare industry. If important information gets leaked (e.g., identity card, credit card number, address), a person's identity, financial status, and safety might be compromised. This results in the responsible organization losing a lot of money compensating these people, with even more resources expended trying to fix the fault. Therefore, a framework for a blockchain-based healthcare data management system is proposed. In this framework, a blockchain network is used to store the encryption key of the patient's data. The actual data are encrypted, and the resulting ciphertext is stored on a cloud storage platform. Furthermore, some issues have to be emphasized and tackled in future improvements, such as proposing a multi-user scheme, addressing authentication issues, or migrating the backend processes into the blockchain network. Due to the nature of blockchain technology, the data will be tamper-proof, and the read-only function can only be accessed by authorized users such as doctors and nurses. This guarantees the confidentiality and immutability of the patient's data.
Keywords: distributed, healthcare, efficiency, security, blockchain, confidentiality and immutability
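The core pattern in the proposal is to keep ciphertext and encryption keys in separate stores. The sketch below illustrates only that pattern with the `cryptography` library, using plain dictionaries to stand in for the cloud store and the key ledger; it is not the proposed blockchain framework, and all names are hypothetical.

```python
from cryptography.fernet import Fernet

# Stand-ins for the two storage layers described in the framework.
cloud_storage = {}   # would be a cloud object store holding only ciphertext
key_ledger = {}      # would be the blockchain network holding encryption keys

def store_record(patient_id: str, record: bytes) -> None:
    key = Fernet.generate_key()
    cloud_storage[patient_id] = Fernet(key).encrypt(record)  # ciphertext to the cloud
    key_ledger[patient_id] = key                             # key kept separately

def read_record(patient_id: str) -> bytes:
    # Only callers able to obtain the key (e.g., authorized clinicians) can decrypt.
    key = key_ledger[patient_id]
    return Fernet(key).decrypt(cloud_storage[patient_id])

store_record("patient-001", b'{"name": "Jane Doe", "diagnosis": "hypertension"}')
print(read_record("patient-001"))
```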
Procedia PDF Downloads 184
40944 Modified InVEST for WhatsApp Messages Forensic Triage and Search through Visualization
Authors: Agria Rhamdhan
Abstract:
WhatsApp, as the most popular mobile messaging app, has been used as evidence in many criminal cases. As the use of mobile messaging generates large amounts of data, forensic investigation faces the challenge of large-data problems. The hardest part of finding this important evidence is that current practice relies on tools and techniques that require manual analysis to check all messages, so analyzing large sets of mobile messaging data takes a lot of time and effort. Our work offers a methodology based on forensic triage that reduces large datasets to manageable sets, making detailed reviews easier, and then shows the results through interactive visualization that highlights important terms, entities, and relationships via intelligent ranking using Term Frequency-Inverse Document Frequency (TF-IDF) and a Latent Dirichlet Allocation (LDA) model. By implementing this methodology, investigators can improve investigation processing time and the accuracy of results.
Keywords: forensics, triage, visualization, WhatsApp
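The ranking and topic-grouping steps can be pictured with scikit-learn, as in the sketch below. The messages are invented stand-ins for extracted chat data, and the modified InVEST visualization itself is not reproduced.

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

# Made-up chat messages standing in for extracted WhatsApp data.
messages = [
    "meet me at the warehouse tonight",
    "transfer the money to the usual account",
    "happy birthday, see you at dinner",
    "the shipment arrives at the warehouse friday",
    "send the account number again",
]

# TF-IDF ranking: which terms are most distinctive across the corpus.
tfidf = TfidfVectorizer(stop_words="english")
tfidf_matrix = tfidf.fit_transform(messages)
terms = tfidf.get_feature_names_out()
top_idx = tfidf_matrix.sum(axis=0).A1.argsort()[::-1][:5]
print("Top TF-IDF terms:", [terms[i] for i in top_idx])

# LDA over raw term counts, to group messages by theme for triage.
counts = CountVectorizer(stop_words="english")
count_matrix = counts.fit_transform(messages)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(count_matrix)
for k, topic in enumerate(lda.components_):
    top_words = [counts.get_feature_names_out()[i] for i in topic.argsort()[::-1][:3]]
    print(f"Topic {k}: {top_words}")
```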
Procedia PDF Downloads 168
40943 Development of a Software System for Management and Genetic Analysis of Biological Samples for Forensic Laboratories
Authors: Mariana Lima, Rodrigo Silva, Victor Stange, Teodiano Bastos
Abstract:
Due to the high reliability reached by DNA tests, this kind of test has allowed the identification of a growing number of criminal cases since the 1980s, including old unsolved cases that now have a chance of being solved with this technology. Currently, the use of genetic profiling databases is a typical method to increase the scope of genetic comparison. Forensic laboratories must process, analyze, and generate genetic profiles from a growing number of samples, which requires time and great storage capacity. Therefore, it is essential to develop methodologies, using software tools, capable of organizing and minimizing the time spent on both biological sample processing and the analysis of genetic profiles. Thus, the present work aims at the development of a software system for forensic genetics laboratories which allows sample, criminal case, and local database management, minimizing the time spent in the workflow and helping to compare genetic profiles. For the development of this software system, all data related to the storage and processing of samples, the workflows, and the requirements of the system have been considered. The system uses HTML, CSS, and JavaScript as web technologies, with Node.js as the server platform, which offers great efficiency in data input and output. In addition, the data are stored in a relational database (MySQL), which is free, allowing better acceptance by users. The software system developed here brings more agility to the workflow and analysis of samples, contributing to the rapid insertion of genetic profiles into the national database and to an increased resolution of crimes. The next step of this research is its validation, in order to operate in accordance with current Brazilian national legislation.
Keywords: database, forensic genetics, genetic analysis, sample management, software solution
Procedia PDF Downloads 370
40942 Stress Concentration Trend for Combined Loading Conditions
Authors: Aderet M. Pantierer, Shmuel Pantierer, Raphael Cordina, Yougashwar Budhoo
Abstract:
Stress concentration occurs when there is an abrupt change in the geometry of a mechanical part under loading. These changes in geometry can include holes, notches, or cracks within the component, and they create larger stresses within the part. This maximum stress is difficult to determine, as it occurs directly at the point of minimum area, and strain gauges have yet to be developed that can analyze stresses at such minute areas. Therefore, a stress concentration factor must be utilized. The stress concentration factor is a dimensionless parameter calculated solely from the geometry of a part. The factor is multiplied by the nominal, or average, stress of the component, which can be found analytically or experimentally. Stress concentration graphs exist for common loading conditions and geometrical configurations to aid in the determination of the maximum stress a part can withstand. These graphs were developed from historical data yielded by experimentation. This project seeks to verify a stress concentration graph for combined loading conditions. The aforementioned graph was developed using CATIA finite element analysis software, and the results of this analysis will be validated through further testing. The 3D-modeled parts will be subjected to further finite element analysis using Patran-Nastran software. The finite element models will then be verified by testing physical specimens on a tensile testing machine. Once the data are validated, the unique stress concentration graph will be submitted for publication so it can aid engineers in future projects.
Keywords: stress concentration, finite element analysis, finite element models, combined loading
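For reference, the relation such graphs encode is the standard definition of the theoretical stress concentration factor; this formula is general knowledge and is not taken from the paper's data:

```latex
\sigma_{\max} = K_t \, \sigma_{\mathrm{nom}}, \qquad K_t = \frac{\sigma_{\max}}{\sigma_{\mathrm{nom}}}
```

For example, a small circular hole in a wide plate under uniaxial tension has a theoretical factor of about K_t = 3, so the peak stress at the hole edge is roughly three times the nominal stress.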
Procedia PDF Downloads 443
40941 Study of Components and Effective Factors on Organizational Commitment of Khoramabad Branch Islamic Azad University's Faculty Members
Authors: Mehry Daraei
Abstract:
The goal of this study was to survey the components and factors affecting the organizational commitment of Islamic Azad University Khoramabad Branch's faculty members. The research method was correlational, using causal modeling, and data were gathered by questionnaire. The statistical population consisted of 147 faculty members of the Islamic Azad University Khoramabad Branch, and the sample size was determined as 106 persons by Morgan's sample table, selected by class sampling. Correlation tests, one-sample t-tests, and path analysis were used for the analysis of the data, which were analyzed with LISREL software. The results showed that organizational corporate was the most effective element on organizational commitment (OC), and that organizational corporate, work experience, and organizational justice were the only variables directly related to OC. Job security had both direct and indirect effects on OC, its indirect effect on OC operating through gender. The gender variable had direct and indirect effects on OC, its indirect effect operating through organizational corporate. Job opportunities outside the university also had direct and indirect effects on OC, meaning that job opportunities had an indirect effect on OC through organizational corporate.
Keywords: organization, commitment, job security, Islamic Azad University
Procedia PDF Downloads 323
40940 Performance Evaluation of Al Jame's Roundabout Using SIDRA
Authors: D. Muley, H. S. Al-Mandhari
Abstract:
This paper evaluates the performance of a multi-lane, four-legged modern roundabout operating in Muscat using the SIDRA model. The performance measures include degree of saturation (DOS), average delay, and queue lengths. Geometric and traffic data were used for model preparation, and gap acceptance parameters, the critical gap, and the follow-up headway were used for calibration of the SIDRA model. The results of the analysis show that the roundabout currently experiences delays of up to 610 seconds, with a DOS of 1.67 during the peak hour. Further, a sensitivity analysis of general and roundabout parameters, among them lane width, cruise speed, inscribed diameter, entry radius, and entry angle, showed that the inscribed diameter is the most crucial factor affecting delay and DOS. Upgrading the roundabout to a fully signalized junction was found to be the suitable solution, which will serve future years with LOS C in the design year, a DOS of 0.9, and an average control delay of 51.9 seconds per vehicle.
Keywords: performance analysis, roundabout, sensitivity analysis, SIDRA
Procedia PDF Downloads 382
40939 Analyzing the Sensation of Jogja Kembali Monument (Monjali): Case Study of Yogyakarta as the Implementation of Attraction Tour
Authors: Hutomo Abdurrohman, Muhammad Latief, Waridatun Nida, Ranta Dwi Irawati
Abstract:
Yogyakarta Kembali Monument (Monjali) is one of the most popular tourist attractions in Yogyakarta. Yogyakarta is known as the 'Student City', and Monjali is the right place to learn and explore more about Yogyakarta, especially for elementary and junior high school students on study tours. Monjali is located on the North Ringroad, Jongkang, Sariharjo village, Ngaglik Subdistrict, Sleman Regency, Yogyakarta. Monjali offers many historical replicas, together with the stories behind them, about the battle between the Indonesian fighters of the TNI (Indonesian National Army) and the Dutch colonizers in Yogyakarta on March 1, 1949, an event that opened the eyes of the whole of Indonesia at a time when the TNI had been displaced by the invaders. This research is an effort to evaluate visitors' interest in Monjali as a special tourist attraction. The respondents were Monjali visitors up to 17 years old, sampled as one respondent out of every 15 visitors, for a total of 200 respondents used to assess the condition and facilities of Monjali. Data were collected from January 2017 until October 2017 through interviews and a questionnaire whose validity and reliability had been tested. The data analysis is a descriptive statistical analysis in which qualitative data are converted into quantitative data using a Likert scale. The results show that the interest of Monjali's visitors is high, at 75.6%. Based on this result, Monjali remains an attraction that continuously undergoes improvement and development, and it succeeds as a place that combines entertainment with education, in line with the vision of Yogyakarta as a Student City.
Keywords: descriptive statistical analysis, Jogja Kembali monument, Likert scale, sensation
Procedia PDF Downloads 188
40938 [Keynote Speech]: Feature Selection and Predictive Modeling of Housing Data Using Random Forest
Authors: Bharatendra Rai
Abstract:
Predictive data analysis and modeling involving machine learning techniques become challenging in the presence of too many explanatory variables or features. Too many features in machine learning are known not only to slow algorithms down, but also to decrease model prediction accuracy. This study involves a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider while buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used for developing predictive random forest models. The study also explores five different data partitioning ratios, and their impact on model accuracy is captured using the coefficient of determination (R-squared) and the root mean square error (RMSE).
Keywords: housing data, feature selection, random forest, Boruta algorithm, root mean square error
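A minimal sketch of this pipeline using the third-party `boruta` package (BorutaPy) and scikit-learn is shown below. The data are synthetic rather than the 79-feature housing set, and the 80/20 split stands in for just one of the five partitioning ratios explored in the paper.

```python
import numpy as np
from boruta import BorutaPy
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the housing data: 200 samples, 30 features, 10 informative.
X, y = make_regression(n_samples=200, n_features=30, n_informative=10, noise=10, random_state=0)

# Boruta wrapper around a random forest marks features as confirmed or rejected.
rf = RandomForestRegressor(n_jobs=-1, max_depth=5, random_state=0)
selector = BorutaPy(rf, n_estimators="auto", random_state=0)
selector.fit(X, y)
X_sel = X[:, selector.support_]
print("Confirmed features:", int(selector.support_.sum()))

# Refit a random forest on the confirmed features and report R-squared and RMSE.
X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
print("R-squared:", r2_score(y_te, pred))
print("RMSE:", np.sqrt(mean_squared_error(y_te, pred)))
```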
Procedia PDF Downloads 323
40937 Mobile Devices and E-Learning Systems as a Cost-Effective Alternative for Digitizing Paper Quizzes and Questionnaires in Social Work
Authors: K. Myška, L. Pilařová
Abstract:
The article deals with the possibilities of using cheap mobile devices in combination with free or open-source software tools as an alternative to professional hardware and software equipment. Especially in social work, it is important to find a cheap yet functional solution that can compete with complex but expensive solutions for digitizing paper materials. Our research focused on the analysis of cheap and affordable solutions for digitizing the paper materials most frequently used by field workers in social work, using comparative analysis as the research method. Social workers quite often need to process data from paper forms, and in many cases it is still more affordable, time-efficient, and cost-effective to use paper forms to get feedback. Collecting data from paper quizzes and questionnaires can be done with the help of professional scanners and software; these technologies are very powerful and have advanced options for digitizing and processing digitized data, but they are also very expensive. According to the results of our study, the combination of open-source software and a mobile phone or a cheap scanner can be considered a cost-effective alternative to professional equipment.
Keywords: digitalization, e-learning, mobile devices, questionnaire
Procedia PDF Downloads 151
40936 Modeling Waiting and Service Time for Patients: A Case Study of Matawale Health Centre, Zomba, Malawi
Authors: Moses Aron, Elias Mwakilama, Jimmy Namangale
Abstract:
Spending a long time in queues for a basic service remains a common challenge in most developing countries, including Malawi. In the health sector in particular, the Out-Patient Department (OPD) experiences long queues, which puts the lives of patients at risk. However, by using queuing analysis to understand the nature of the problems and the efficiency of service systems, such problems can be abated. Depending on the kind of service, the literature proposes different possible queuing models. However, rather than using the generalized assumed models proposed in the literature, the use of real case study data can help in a deeper understanding of the particular problem model and of how such a model can vary from one day to another and from one case to another. As such, this study uses data obtained from one urban health centre (HC) for BP, pediatric, and general OPD cases to investigate the average queuing time for patients within the system. It seeks to identify the proper queuing model by investigating the kind of distribution functions of patients' arrival times, inter-arrival times, waiting times, and service times. Compared with the standard values set by WHO, the study found that patients at this HC spend more time waiting than being served. On model investigation, different days presented different models, ranging from an assumed M/M/1 and M/M/2 to M/Er/2. Through sensitivity analysis, the commonly assumed M/M/1 model in general failed to fit the data, whereas an M/Er/2 model demonstrated a good fit. An M/Er/3 model seemed good in terms of resource utilization, suggesting a need to increase medical personnel at this HC, whereas an M/Er/4 model was shown to cause more idleness of human resources.
Keywords: health care, out-patient department, queuing model, sensitivity analysis
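For orientation, the closed-form steady-state measures of the simplest model mentioned above (M/M/1) can be computed directly, as in the sketch below; the arrival and service rates are invented, and the Erlang-service models the paper found to fit better (M/Er/2, M/Er/3) do not reduce to these simple formulas.

```python
def mm1_metrics(arrival_rate: float, service_rate: float) -> dict:
    """Steady-state measures for an M/M/1 queue (Poisson arrivals, exponential
    service, single server). Requires arrival_rate < service_rate."""
    rho = arrival_rate / service_rate        # server utilization
    lq = rho**2 / (1 - rho)                  # mean number of patients waiting
    wq = lq / arrival_rate                   # mean waiting time in queue
    w = wq + 1 / service_rate                # mean total time in the system
    return {"utilization": rho, "Lq": lq, "Wq": wq, "W": w}

# Hypothetical OPD figures: 10 patients/hour arriving, 12 patients/hour served.
print(mm1_metrics(arrival_rate=10, service_rate=12))
```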
Procedia PDF Downloads 435
40935 Seismic Interpretation and Petrophysical Evaluation of SM Field, Libya
Authors: Abdalla Abdelnabi, Yousf Abushalah
Abstract:
The G Formation is a major gas-producing reservoir in the SM Field, eastern Libya. It is called the G limestone because it consists of shallow marine limestone. Well data and 3D seismic data, in conjunction with the results of a previous study, were used to delineate the hydrocarbon reservoir of the Middle Eocene G Formation in the SM Field area. The data include three-dimensional seismic data acquired in 2009, covering an area of approximately 75 mi², with more than 9 wells penetrating the reservoir. The seismic data are used to identify stratigraphic and structural features, such as channels and faults, which may play a significant role in hydrocarbon trapping. The well data are used for the petrophysical analysis of the SM field. The average porosity of the Middle Eocene G Formation is very good, reaching 24%, especially around well W6. The average water saturation was calculated for each well from the porosity and resistivity logs using Archie's formula, and the overall average water saturation is 25%. Structural mapping of the top and bottom of the Middle Eocene G Formation revealed that the highest area in the SM field is at 4800 ft subsea, around wells W4, W5, W6, and W7, and the deepest point is at 4950 ft subsea. Correlation between wells using the well data and the structural maps created from the seismic data revealed that the net thickness of the G Formation ranges from 0 ft in the northern part of the field to 235 ft in the southwestern and southern parts. The gas-water contact is found at 4860 ft using the resistivity log. The net isopach map, using both the trapezoidal and pyramid rules, is used to calculate the total bulk volume. The original gas in place and the recoverable gas were calculated volumetrically to be 890 Billion Standard Cubic Feet (BSCF) and 630 BSCF, respectively.
Keywords: 3D seismic data, well logging, Petrel, Kingdom Suite
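The two calculations named in the abstract, Archie water saturation and the volumetric gas-in-place estimate, follow standard textbook relations, sketched below with purely illustrative parameter values (the Archie constants, reservoir area, net pay, and gas formation volume factor are assumptions, not the SM field inputs).

```python
def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Archie water saturation: Sw = ((a * Rw) / (phi**m * Rt)) ** (1 / n)."""
    return ((a * rw) / (phi**m * rt)) ** (1.0 / n)

def volumetric_ogip_scf(area_acres, net_pay_ft, phi, sw, bg_rcf_per_scf):
    """Volumetric original gas in place (scf):
    OGIP = 43560 * A * h * phi * (1 - Sw) / Bg, with A in acres and h in feet."""
    return 43560.0 * area_acres * net_pay_ft * phi * (1.0 - sw) / bg_rcf_per_scf

# Illustrative numbers only; porosity (24%) and Sw (25%) echo the abstract, the rest are assumed.
sw = archie_sw(rw=0.03, rt=20.0, phi=0.24)
ogip = volumetric_ogip_scf(area_acres=5000, net_pay_ft=100, phi=0.24, sw=0.25,
                           bg_rcf_per_scf=0.004)
print(f"Archie Sw = {sw:.2f}, OGIP = {ogip / 1e9:.0f} BSCF")
```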
Procedia PDF Downloads 150
40934 Diagnosis of Logistics Processes: Bibliometric Review and Analysis
Authors: S. F. Bayona, J. Nunez, D. Paez
Abstract:
Diagnostic processes have been consolidated as fundamental tools for gaining adequate knowledge of organizations and their processes. Diagnosis concerns the interpretation of data, findings, and relevant information to determine problems, causes, or simply the state and behavior of a process, without including a solution to the problems detected. The objective of this work is to identify the stages necessary to diagnose the logistics processes in a metalworking company, based on a literature review across different disciplines. A total of 62 articles were chosen to identify, through bibliometric analysis, the most cited articles as well as the most frequent authors and journals. The results allowed the identification of the two fundamental stages in the diagnostic process: a primary (general) phase, based on the logical subjectivity of the knowledge of the person who evaluates, and a secondary (specific) phase, related to the interpretation of the results, findings, or data. Also, two further phases were identified: one related to the definition of the scope of the actions to be developed and the other consisting of an initial description of what was observed in the process.
Keywords: business, diagnostic, management, process
Procedia PDF Downloads 156
40933 Factors Affecting Cost Efficiency of Municipal Waste Services in Tuscan Municipalities: An Empirical Investigation by Accounting for Different Management
Authors: María Molinos-Senante, Giulia Romano
Abstract:
This paper aims at investigating the effect of ownership on the efficiency assessment of municipal solid waste management. In doing so, the Data Envelopment Analysis meta-frontier approach, integrating unsorted waste as an undesirable output, was applied. Three different clusters of municipalities were created on the basis of the ownership type of the municipal waste operators. In the second stage of the analysis, the paper investigates factors affecting efficiency, in order to provide an outlook of the levers that policy and decision makers can use to improve efficiency, taking into account the different management models in force. Results show that public waste management firms perform better than mixed and private ones, since their efficiency scores are significantly larger. Moreover, it has been demonstrated that the efficiency of waste management firms is statistically influenced by the age of the population, the population served, municipal size, population density, and the tourism rate. This evidences the importance of economies of scale for the cost efficiency of waste management. This issue is relevant for policymakers in defining and implementing policies aimed at improving the long-term sustainability of waste management in municipalities.
Keywords: data envelopment analysis, efficiency, municipal solid waste, ownership, undesirable output
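As background to the efficiency scores, the sketch below solves only the basic input-oriented CCR DEA model as a linear program with SciPy, on made-up municipal data; the paper's meta-frontier treatment with unsorted waste as an undesirable output is more involved and is not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs: np.ndarray, outputs: np.ndarray, unit: int) -> float:
    """Input-oriented CCR efficiency of one decision-making unit (DMU).
    inputs: (n_units, n_inputs), outputs: (n_units, n_outputs)."""
    n_units = inputs.shape[0]
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n_units)]
    # Input constraints: sum_j lambda_j * x_ij - theta * x_i,unit <= 0
    a_in = np.c_[-inputs[unit], inputs.T]
    # Output constraints: -sum_j lambda_j * y_rj <= -y_r,unit
    a_out = np.c_[np.zeros(outputs.shape[1]), -outputs.T]
    a_ub = np.r_[a_in, a_out]
    b_ub = np.r_[np.zeros(inputs.shape[1]), -outputs[unit]]
    bounds = [(None, None)] + [(0, None)] * n_units
    res = linprog(c, A_ub=a_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return float(res.x[0])

# Hypothetical municipalities: input = [operating cost], output = [tonnes of sorted waste].
x = np.array([[100.0], [120.0], [90.0], [150.0]])
y = np.array([[80.0], [85.0], [80.0], [100.0]])
for j in range(len(x)):
    print(f"Municipality {j}: efficiency = {ccr_efficiency(x, y, j):.3f}")
```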
Procedia PDF Downloads 159
40932 Problems of Learning English Vowels Pronunciation in Nigeria
Authors: Wasila Lawan Gadanya
Abstract:
This paper examines the problems of learning English vowel pronunciation. The objective is to identify some of the factors that affect the learning of English vowel sounds and their proper realization in words. The theoretical framework adopted is based on both error analysis and contrastive analysis. The data collection instruments used in the study are a questionnaire and a word list for the respondents (students), along with observation of some of their lecturers. All the data collected were analyzed using simple percentages. The findings show that it is not a single factor that affects the learning of English vowel pronunciation; rather, many factors do so concurrently. Among the factors examined, the lack of correlation between English orthography and its pronunciation, rather than the mother tongue (which most people consider a factor affecting the learning of second-language pronunciation), was found to have the greatest influence on students' learning and realization of English vowel sounds, since the respondents in this study come from different ethnic groups of Nigeria and thus speak different languages, yet have the same or almost the same problems when pronouncing the English vowel sounds.
Keywords: English vowels, learning, Nigeria, pronunciation
Procedia PDF Downloads 451
40931 Mixture Statistical Modeling for Predicting Mortality in Human Immunodeficiency Virus (HIV) and Tuberculosis (TB) Infection Patients
Authors: Mohd Asrul Affendi Bi Abdullah, Nyi Nyi Naing
Abstract:
The purpose of this study was to compare the negative binomial death rate (NBDR) and zero-inflated negative binomial death rate (ZINBDR) models for patients who died with (HIV+, TB+) and (HIV+, TB-). HIV and TB are a serious worldwide problem in developing countries. The data were analyzed by applying the NBDR and ZINBDR models to determine which model is the more favorable one to use. The ZINBDR model is able to account for the disproportionately large number of zeros within the data and is shown to be a consistently better fit than the NBDR model. Hence, the ZINBDR model provides a superior fit to the data compared with the NBDR model and provides additional information regarding the death mechanisms of HIV+TB patients. The ZINBDR model is shown to be a useful tool for analyzing death rates by age category.
Keywords: zero-inflated negative binomial death rate, HIV and TB, AIC and BIC, death rate
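The NB-versus-ZINB comparison via AIC and BIC can be sketched with statsmodels, as below. The counts and covariates are simulated with deliberately inflated zeros; this is only an illustration of the model comparison, not the study's data or results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP
from statsmodels.discrete.discrete_model import NegativeBinomial

# Hypothetical death counts by age category with excess zeros.
rng = np.random.default_rng(0)
n = 500
age_cat = rng.integers(0, 5, n)
X = sm.add_constant(pd.get_dummies(age_cat, prefix="age", drop_first=True).astype(float))
deaths = rng.poisson(np.exp(0.3 + 0.3 * age_cat)) * rng.binomial(1, 0.6, n)

nb = NegativeBinomial(deaths, X).fit(disp=0)
zinb = ZeroInflatedNegativeBinomialP(deaths, X, exog_infl=np.ones((n, 1))).fit(disp=0, maxiter=500)

# Lower AIC/BIC indicates the preferred model; with excess zeros, ZINB typically wins.
print(f"NB   AIC = {nb.aic:.1f}, BIC = {nb.bic:.1f}")
print(f"ZINB AIC = {zinb.aic:.1f}, BIC = {zinb.bic:.1f}")
```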
Procedia PDF Downloads 432
40930 Analysis of the Process of Methane Hydrate Formation That Includes the Important Role of Deep-Sea Sediments with Analogy in Kerek Formation, Sub-Basin Kendeng, Central Java, Indonesia
Authors: Yan Bachtiar Muslih, Hangga Wijaya, Trio Fani, Putri Agustin
Abstract:
Energy demand in Indonesia increases by 5-6% a year, while the production of conventional energy decreases by 3-5% a year; this means that in 20-40 years conventional energy will not be able to meet all of Indonesia's energy demand. One way to address this is to use unconventional energy, namely gas hydrate. Gas hydrate is gas formed by a biogenic process and is stable under conditions of extreme depth and low temperature; it can form in two settings, polar conditions and deep-sea conditions. This research focuses on gas hydrate associated with methane, which forms methane hydrate under deep-sea conditions, usually at depths between 150-2000 m. The research focuses on the process of methane hydrate formation, namely the biogenic process, and on the important role of deep-sea sediment in producing methane hydrate accumulations. Methane hydrate usually accumulates in fine sediment in the deep-sea environment under conditions of high pressure and low temperature, conditions that also commonly turn methane hydrate into white nodules. The methodology of this research consists of geological fieldwork and laboratory analysis. The fieldwork provides 10-15 samples taken at random from Kerek Formation outcrops to picture the deep-sea environment conditions that influence methane hydrate formation, as well as measured stratigraphic sections of the Kerek Formation outcrops, which help to picture the processes in deep-sea sediments such as energy flow and sediment supply. The laboratory analysis covers all the data obtained from the geological fieldwork. The results of this research can be used for methane hydrate exploration in other prospective deep-sea environments in Indonesia.
Keywords: methane hydrate, deep-sea sediment, Kerek Formation, Sub-Basin of Kendeng, Central Java, Indonesia
Procedia PDF Downloads 461
40929 Decoding the Natural Hazards: The Data Paradox, Juggling Data Flows, Transparency and Secrets, Analysis of Khuzestan and Lorestan Floods of Iran
Authors: Kiyanoush Ghalavand
Abstract:
We face a complex paradox in the agriculture and environment sectors in the age of technology. On the one hand, the achievements of the science and information ages are shaping a future that is more dangerous than in past decades; on the other hand, the progress of the past decades is historic, connecting people, empowering individuals, groups, and states, and lifting thousands of people out of poverty in the process. Floods are the most frequent, damaging, and recurring of all natural hazards in Iran, and in recent years they have been morphing into new and even more devastating forms. Khuzestan and Lorestan Provinces experienced heavy rains that began on March 28, 2019, and led to unprecedented widespread flooding and landslides across the provinces. The study was based on both secondary and primary data. For the present study, a questionnaire-based primary survey was conducted. Data were collected using a specially designed questionnaire and other instruments, such as focus groups, interview schedules, inception workshops, and roundtable discussions with stakeholders at different levels. Farmers in Khuzestan and Lorestan provinces formed the statistical population for this study. Data were analyzed with several software packages, such as ATLAS.ti, NVivo, SPSS, and EViews. According to a factor analysis conducted for the present study, 10 groups of factors were categorized: climatic, economic, cultural, supportive, instructive, planning, military, policymaking, geographical, and human factors. Together, they accounted for 71.6 percent of the explanatory factors of flood management obstacles in the agricultural sector in Lorestan and Khuzestan provinces. Several recommendations were finally made based on the study findings.
Keywords: chaos theory, natural hazards, risks, environmental risks, paradox
Procedia PDF Downloads 145
40928 An Assessment of Different Blade Tip Timing (BTT) Algorithms Using an Experimentally Validated Finite Element Model Simulator
Authors: Mohamed Mohamed, Philip Bonello, Peter Russhard
Abstract:
Blade Tip Timing (BTT) is a technology concerned with the estimation of both frequency and amplitude of rotating blades. A BTT system comprises two main parts: (a) the arrival time measurement system, and (b) the analysis algorithms. Simulators play an important role in the development of the analysis algorithms since they generate blade tip displacement data from the simulated blade vibration under controlled conditions. This enables an assessment of the performance of the different algorithms with respect to their ability to accurately reproduce the original simulated vibration. Such an assessment is usually not possible with real engine data since there is no practical alternative to BTT for blade vibration measurement. Most simulators used in the literature are based on a simple spring-mass-damper model to determine the vibration. In this work, a more realistic experimentally validated simulator based on the Finite Element (FE) model of a bladed disc (blisk) is first presented. It is then used to generate the necessary data for the assessment of different BTT algorithms. The FE modelling is validated using both a hammer test and two firewire cameras for the mode shapes. A number of autoregressive methods, fitting methods and state-of-the-art inverse methods (i.e. Russhard) are compared. All methods are compared with respect to both synchronous and asynchronous excitations with both single and simultaneous frequencies. The study assesses the applicability of each method for different conditions of vibration, amount of sampling data, and testing facilities, according to its performance and efficiency under these conditions.
Keywords: blade tip timing, blisk, finite element, vibration measurement
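One of the simplest families of BTT analysis methods, the fitting methods, can be pictured as a least-squares sinusoid fit of once-per-revolution tip deflections measured at known probe angles for an assumed engine order. The sketch below generates its own synthetic deflections and is not one of the specific algorithms compared in the paper; probe positions, engine order, and vibration parameters are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
engine_order = 4                                      # assumed synchronous engine order
probe_angles = np.deg2rad([0, 25, 70, 110, 200])      # circumferential probe positions
n_revs = 50

# Simulated tip deflections: 1.2 mm amplitude, 0.6 rad phase, 0.1 mm offset, plus noise.
theta = np.concatenate([probe_angles + 2 * np.pi * r for r in range(n_revs)])
tip = 1.2 * np.sin(engine_order * theta + 0.6) + 0.1 + 0.05 * rng.standard_normal(theta.size)

# Least-squares fit of d = a*sin(EO*theta) + b*cos(EO*theta) + c.
design = np.column_stack([np.sin(engine_order * theta),
                          np.cos(engine_order * theta),
                          np.ones_like(theta)])
(a, b, c), *_ = np.linalg.lstsq(design, tip, rcond=None)
print(f"Fitted amplitude = {np.hypot(a, b):.2f} mm, "
      f"phase = {np.arctan2(b, a):.2f} rad, offset = {c:.2f} mm")
```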
Procedia PDF Downloads 310
40927 A Dataset of Program Educational Objectives Mapped to ABET Outcomes: Data Cleansing, Exploratory Data Analysis and Modeling
Authors: Addin Osman, Anwar Ali Yahya, Mohammed Basit Kamal
Abstract:
Datasets or collections are becoming important assets in themselves, and they can now be accepted as a primary intellectual output of research. The quality and usage of a dataset depend mainly on the context under which it has been collected, processed, analyzed, validated, and interpreted. This paper presents a collection of program educational objectives mapped to student outcomes, collected from self-study reports prepared by 32 engineering programs accredited by ABET. The manual mapping (classification) of these data is a notoriously tedious, time-consuming process; in addition, it requires experts in the area, who are mostly not available. The operational settings under which the collection was produced are described. The collection has been cleansed and preprocessed, some features have been selected, and preliminary exploratory data analysis has been performed so as to illustrate the properties and usefulness of the collection. Finally, the collection has been benchmarked using nine of the most widely used supervised multi-label classification techniques (Binary Relevance, Label Powerset, Classifier Chains, Pruned Sets, Random k-label sets, Ensemble of Classifier Chains, Ensemble of Pruned Sets, Multi-Label k-Nearest Neighbors and Back-Propagation Multi-Label Learning). The techniques have been compared to each other using five well-known measurements (Accuracy, Hamming Loss, Micro-F, Macro-F, and Macro-F). The Ensemble of Classifier Chains and Ensemble of Pruned Sets achieved encouraging performance compared to the other multi-label classification methods experimented with, while the Classifier Chains method showed the worst performance. To recap, the benchmark has achieved promising results by utilizing the preliminary exploratory data analysis performed on the collection, proposing new trends for research and providing a baseline for future studies.
Keywords: ABET, accreditation, benchmark collection, machine learning, program educational objectives, student outcomes, supervised multi-class classification, text mining
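As a small-scale illustration of this kind of benchmark, the sketch below runs a Binary Relevance-style classifier (scikit-learn's OneVsRestClassifier) on synthetic multi-label data and reports several of the listed measures. The ABET collection itself is not used, and the other listed methods live in libraries such as scikit-multilearn.

```python
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score, hamming_loss
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier

# Synthetic multi-label data standing in for objectives mapped to several student outcomes.
X, Y = make_multilabel_classification(n_samples=400, n_features=50, n_classes=11,
                                      n_labels=3, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)

# Binary Relevance: one independent binary classifier per outcome label.
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X_tr, Y_tr)
pred = clf.predict(X_te)

print("Subset accuracy:", accuracy_score(Y_te, pred))   # exact-match accuracy
print("Hamming loss:   ", hamming_loss(Y_te, pred))
print("Micro-F1:       ", f1_score(Y_te, pred, average="micro"))
print("Macro-F1:       ", f1_score(Y_te, pred, average="macro"))
```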
Procedia PDF Downloads 172
40926 Analysis of CO₂ Two-Phase Ejector with Taguchi and ANOVA Optimization and Refrigerant Selection with Enviro Economic Concerns by TOPSIS Analysis
Authors: Karima Megdouli, Bourhan tachtouch
Abstract:
Ejector refrigeration cycles offer an alternative to conventional systems for producing cold from low-temperature heat. In this article, a thermodynamic model is presented. This model has the advantage of simplifying the calculation algorithm and describes the complex double-throttling mechanism that occurs in the ejector. The model assumptions and the calculation algorithm are presented first, and the impact of each efficiency is evaluated. Validation is performed on several data sets. The ejector model is then used to simulate a refrigeration ejector system (RES), to validate its robustness and suitability for predicting thermodynamic cycle performance. A Taguchi and ANOVA optimization is carried out on the RES. TOPSIS analysis was applied to select the optimum refrigerants, considering cost, safety, environmental and enviro-economic concerns along with thermophysical properties.
Keywords: ejector, velocity distribution, shock circle, Taguchi and ANOVA optimization, TOPSIS analysis
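TOPSIS itself is a short, generic ranking procedure; the sketch below applies it to a made-up decision matrix of candidate refrigerants. The criteria, weights, and values are illustrative assumptions, not the paper's inputs.

```python
import numpy as np

# Rows: candidate refrigerants; columns: criteria (e.g., COP, cost, GWP, safety score).
matrix = np.array([[4.2, 8.0, 1300.0, 3.0],
                   [3.9, 5.0,    4.0, 2.0],
                   [4.5, 9.5,  675.0, 2.5]])
weights = np.array([0.35, 0.25, 0.25, 0.15])
benefit = np.array([True, False, False, True])   # True = higher is better

# 1) Vector-normalize and weight the decision matrix.
weighted = matrix / np.linalg.norm(matrix, axis=0) * weights

# 2) Ideal best and worst per criterion, respecting the criterion direction.
ideal_best = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
ideal_worst = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))

# 3) Distances to the ideals and relative closeness (higher = better alternative).
d_best = np.linalg.norm(weighted - ideal_best, axis=1)
d_worst = np.linalg.norm(weighted - ideal_worst, axis=1)
closeness = d_worst / (d_best + d_worst)
print("Closeness scores:", np.round(closeness, 3))
print("Ranking (best first):", np.argsort(closeness)[::-1])
```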
Procedia PDF Downloads 89
40925 Socioeconomic Factors Associated with the Knowledge, Attitude, and Practices of Oil Palm Smallholders toward Ganoderma Disease
Authors: K. Assis, B. Bonaventure, A. Abdul Rahim, H. Affendy, A. Mohammad Amizi
Abstract:
Oil palm smallholders are considered very important producers of oil palm in Malaysia. They are categorized into two groups: organized smallholders and independent smallholders. In this study, 1000 oil palm smallholders were interviewed using a structured questionnaire. The main objective of the survey was to identify the relationship between the socioeconomic characteristics of smallholders and their knowledge, attitude, and practices toward Ganoderma disease. The study locations include Peninsular Malaysia and Sabah. Three important aspects were studied, namely knowledge of Ganoderma disease, attitude towards the disease, and practices in managing the disease. Cluster analysis, factor analysis, and binary logistic regression were used to analyze the data collected. The findings of the study should provide baseline data which can be used by the relevant agencies to conduct programs or to formulate a suitable development plan to improve the knowledge, attitude and practices of oil palm smallholders in managing Ganoderma disease.
Keywords: attitude, Ganoderma, knowledge, oil palm, practices, smallholders
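For the binary logistic regression step, a minimal sketch with statsmodels on synthetic survey-style data is shown below. The variables, coefficients, and the binary knowledge indicator are all invented for illustration and do not come from the survey.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the survey: socioeconomic traits and a binary indicator of
# adequate knowledge of Ganoderma disease.
rng = np.random.default_rng(7)
n = 1000
df = pd.DataFrame({
    "age": rng.integers(25, 70, n),
    "years_farming": rng.integers(1, 40, n),
    "organized": rng.integers(0, 2, n),       # 1 = organized smallholder, 0 = independent
    "education_years": rng.integers(0, 16, n),
})
logit_p = -3 + 0.05 * df["education_years"] + 0.04 * df["years_farming"] + 0.8 * df["organized"]
df["adequate_knowledge"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Binary logistic regression of knowledge on socioeconomic characteristics.
model = smf.logit("adequate_knowledge ~ age + years_farming + organized + education_years",
                  data=df).fit(disp=0)
print(model.summary())
print("Odds ratios:\n", np.exp(model.params))
```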
Procedia PDF Downloads 398
40924 Investigation of Surface Electromyograph Signal Acquired from the around Shoulder Muscles of Upper Limb Amputees
Authors: Amanpreet Kaur, Ravinder Agarwal, Amod Kumar
Abstract:
Surface electromyography is a technique for measuring muscle activity from the skin: sensors placed on the skin detect the electrical signal generated by active muscles. Much of the research has focused on detecting signals from upper limb amputees based on the activity of the triceps and biceps muscles. The purpose of this study was to correlate phantom movement with sEMG activity in the residual stump muscles of transhumeral amputees, recorded from the shoulder muscles. Eight non-amputees and seven right-hand amputees were recruited for this study. sEMG data were collected from the trapezius, pectoralis, and teres muscles during elevation, protraction, and retraction of the shoulder. The contrast between the muscle actions of amputees and non-amputees was investigated. Subsequently, to investigate the class separability of the different shoulder motions, analysis of variance was carried out on the experimentally recorded data. The results were analyzed to recognize different shoulder movements and represent a step towards a surface electromyography-controlled system for amputees. Differences in F-ratio values (p < 0.05) indicate distinct means; therefore, this analysis helps to determine which motions are independent. The identified signals can be used by researchers to design more accurate and efficient controllers for upper-limb amputees.
Keywords: around shoulder amputation, surface electromyography, analysis of variance, features
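A minimal version of the ANOVA step, run on synthetic RMS sEMG amplitudes for the three shoulder motions, looks like the sketch below; the numbers are invented and do not reflect the study's recordings.

```python
import numpy as np
from scipy.stats import f_oneway

# Synthetic RMS sEMG amplitudes (arbitrary units) of one muscle for three shoulder motions.
rng = np.random.default_rng(3)
elevation = rng.normal(0.60, 0.08, 30)
protraction = rng.normal(0.45, 0.08, 30)
retraction = rng.normal(0.50, 0.08, 30)

# One-way ANOVA: does the mean activity differ across the three motions?
f_stat, p_value = f_oneway(elevation, protraction, retraction)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("At least one motion produces a significantly different mean activation.")
```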
Procedia PDF Downloads 433
40923 On-Line Data-Driven Multivariate Statistical Prediction Approach to Production Monitoring
Authors: Hyun-Woo Cho
Abstract:
Detection of incipient abnormal events in production processes is important to improve the safety and reliability of manufacturing operations and to reduce losses caused by failures. The construction of calibration models for predicting faulty conditions is quite essential in making decisions on when to perform preventive maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of process measurement data. The calibration model is used to predict faulty conditions from historical reference data. This approach utilizes variable selection techniques, and the predictive performance of several prediction methods is evaluated using real data. The results show that the calibration model based on a supervised probabilistic model yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, prediction performance can be improved by excluding non-informative variables from the model building steps.
Keywords: calibration model, monitoring, quality improvement, feature selection
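The paper does not name its specific variable selection scheme or probabilistic model, so the sketch below only shows the generic pattern it describes: filter-type variable selection feeding a simple probabilistic classifier that predicts faulty conditions, on synthetic process data.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline

# Synthetic process measurements: 40 variables, only 8 informative about faulty conditions.
X, y = make_classification(n_samples=600, n_features=40, n_informative=8,
                           weights=[0.85, 0.15], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Variable selection (ANOVA F-scores) followed by a probabilistic classifier.
model = make_pipeline(SelectKBest(f_classif, k=10), GaussianNB()).fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te), target_names=["normal", "faulty"]))
```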
Procedia PDF Downloads 356
40922 Experimental Modal Analysis of Reinforced Concrete Square Slabs
Authors: M. S. Ahmed, F. A. Mohammad
Abstract:
The aim of this paper is to perform experimental modal analysis (EMA) of reinforced concrete (RC) square slabs. EMA is the process of determining the modal parameters (natural frequencies, damping factors, modal vectors) of a structure from a set of frequency response functions (FRFs) by curve fitting. Although experimental modal analysis (or modal testing) has grown steadily in popularity since the advent of the digital FFT spectrum analyzer in the early 1970s, the study of all members and materials using such methods has not yet been well documented. Therefore, in this work, experimental tests were conducted on RC square specimens (0.6 m x 0.6 m with 40 mm thickness). The experimental analysis is based on a freely supported boundary condition. Moreover, impact testing, a fast and economical means of finding the modes of vibration of a structure, was used during the experiments. In addition, a PicoScope 6 device and MATLAB software were used to acquire the data and to analyze and plot the frequency response functions (FRFs). The experimental natural frequencies extracted from the measurements exhibit good agreement with analytical predictions. It is shown that the EMA method can be usefully employed to study the dynamic behavior of RC slabs.
Keywords: natural frequencies, mode shapes, modal analysis, RC slabs
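The FRF computation at the heart of impact testing can be sketched with SciPy's spectral estimators: the H1 estimate divides the force-response cross-spectrum by the force auto-spectrum, and natural frequencies appear as peaks of its magnitude. The signals below are simulated for a single mode at an assumed 95 Hz; the study's MATLAB processing and slab data are not reproduced.

```python
import numpy as np
from scipy.signal import csd, find_peaks, welch

# Simulated impact test: hammer force (impulse) and the response of a single mode.
fs, duration = 2048, 4.0
t = np.arange(0, duration, 1 / fs)
force = np.zeros_like(t)
force[10] = 1.0                                   # impulsive hit
fn, zeta = 95.0, 0.02                             # assumed natural frequency and damping
wn = 2 * np.pi * fn
h = np.exp(-zeta * wn * t) * np.sin(wn * np.sqrt(1 - zeta**2) * t)   # impulse response
rng = np.random.default_rng(0)
response = np.convolve(force, h)[: t.size] + 1e-4 * rng.standard_normal(t.size)

# H1 FRF estimate: cross-spectrum of force and response over the force auto-spectrum.
f, p_fx = csd(force, response, fs=fs, nperseg=1024)
_, p_ff = welch(force, fs=fs, nperseg=1024)
frf = np.abs(p_fx / p_ff)

# Natural frequencies appear as peaks of the FRF magnitude.
peaks, _ = find_peaks(frf, height=frf.max() * 0.5)
print("Estimated natural frequencies (Hz):", np.round(f[peaks], 1))
```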
Procedia PDF Downloads 408
40921 Environmental Evaluation of Two Kinds of Drug Production (Syrup and Pomade Form) Using Life Cycle Assessment Methodology
Authors: H. Aksas, S. Boughrara, K. Louhab
Abstract:
The goal of this study was to use life cycle assessment (LCA) methodology to assess the environmental impact of pharmaceutical products (four kinds of syrup and three kinds of pomade) produced by a leading manufacturer in Algeria, the SAIDAL Company. The impacts generated were evaluated using SimaPro 7.1 with the CML92 method for the syrup form and EPD 2007 for the pomade form. All evaluated impacts were compared with each other, and the compound contributing to each impact was determined in each case. The data needed to conduct the Life Cycle Inventory (LCI) came from this factory: theoretical data were collected from the responsible technicians and engineers of the company, while the practical data resulted from assays of the pharmaceutical liquids performed in the university laboratories. These data cover the different raw materials, imported from European and Asian countries, that are necessary to formulate the drugs. The energy used as input comes from Algerian resources. The outputs are the result of the analysis of the factory's effluents in their different forms (liquid, solid, and gas). All these data (inputs and outputs) represent the ecobalance.
Keywords: pharmaceutical product, drug residues, LCA methodology, environmental impacts
Procedia PDF Downloads 246