Search results for: open source data
27812 Barriers to Entry: The Pitfall of Charter School Accountability
Authors: Ian Kingsbury
Abstract:
The rapid expansion of charter schools (public schools that receive government funding but do not face the same regulations as traditional public schools) over the preceding two decades has raised concerns over the potential for graft and fraud. These concerns are largely justified: incidents of financial crime and mismanagement are not unheard of, and the charter sector has become a darling of hedge fund managers. In response, several states have strengthened their charter school regulatory regimes. Imposing regulations and attempting to increase accountability seem like sensible measures, and perhaps they are necessary. However, increased regulation may come at the cost of imposing barriers to entry. Specifically, increased regulation often demands evidence of a high likelihood of fiscal solvency, which in practice entails access to capital in the short term and may systematically preclude Black or Hispanic applicants from opening charter schools. Moreover, increased regulation necessarily entails more red tape. The institutional wherewithal and the number of hours required to complete an application to open a charter school might favor those who have partnered with an education service provider, specifically a charter management organization (CMO) or education management organization (EMO). These potential barriers to entry pose a significant policy concern. Just as policymakers hope to increase the share of minority teachers and principals, they should sensibly care whether the individuals who open charter schools look like the students in those schools. Moreover, they might be concerned if successful applications in states with stringent regulations are overwhelmingly affiliated with education service providers. One of the original missions of charter schools was to serve as laboratories of innovation.
Approving only those applications affiliated with education service providers (and in effect establishing a parallel network of schools rather than a diverse marketplace of schools) undermines that mission. Data and methods: The analysis examines more than 2,000 charter school applications from 15 states. It compares the outcomes of applications from states with a strong regulatory environment (those with high scores from the National Association of Charter School Authorizers, NACSA) to applications from states with a weak regulatory environment (those with a low NACSA score). If the hypothesis is correct, applicants not affiliated with an education service provider (ESP) are more likely to be rejected in high-regulation states than those affiliated with an ESP, and minority candidates not affiliated with an ESP are particularly likely to be rejected. Initial returns indicate that the hypothesis holds. More applications in low-NACSA-scoring Arizona come from individuals not associated with an ESP, and those individuals are as likely to be accepted as those affiliated with an ESP. On the other hand, applicants in high-NACSA-scoring Indiana and Ohio are more than 20 percentage points more likely to be accepted if they are affiliated with an ESP, and the effect is particularly pronounced for minority candidates. These findings should spur policymakers to consider the drawbacks of charter school accountability and to consider accountability regimes that do not impose barriers to entry.
Keywords: accountability, barriers to entry, charter schools, choice
Procedia PDF Downloads 159
27811 A Stable Method for Determination of the Number of Independent Components
Authors: Yuyan Yi, Jingyi Zheng, Nedret Billor
Abstract:
Independent component analysis (ICA) is one of the most commonly used blind source separation (BSS) techniques for signal pre-processing, such as noise reduction and feature extraction. The main parameter in the ICA method is the number of independent components (ICs). Although several methods exist for determining the number of ICs, this important parameter has not received sufficient attention. In this study, we review the most used methods for determining the number of ICs and discuss their advantages and disadvantages. Further, we propose an improved version of the column-wise ICAbyBlock method for determining the number of ICs. To assess the performance of the proposed method, we compare column-wise ICAbyBlock with several existing methods across different ICA algorithms, using simulated and real signal data. Results show that the proposed column-wise ICAbyBlock is an effective and stable method for determining the optimal number of components in ICA. The method is simple, and its results can be demonstrated intuitively with good visualizations.
Keywords: independent component analysis, optimal number, column-wise, correlation coefficient, cross-validation, ICAbyBlock
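The ICAbyBlock procedure itself is not reproduced here. As a minimal, hedged sketch of the underlying question (how many genuine components are hidden in multichannel data?), the snippet below uses an eigenvalue-gap heuristic on a simulated mixture: the signals, the mixing matrix, and the noise level are all illustrative assumptions, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 3 independent sources mixed into 6 observed channels plus noise.
t = np.linspace(0, 8, 2000)
sources = np.vstack([np.sin(2 * t),
                     np.sign(np.sin(3 * t)),
                     np.cos(5 * t)])
mixing = rng.normal(size=(6, 3))          # hypothetical mixing matrix
observed = mixing @ sources + 0.05 * rng.normal(size=(6, 2000))

# Sort the eigenvalues of the channel covariance and look for the largest
# multiplicative gap: eigenvalues above the gap correspond to genuine
# sources, those below it to the noise floor.
eigvals = np.sort(np.linalg.eigvalsh(np.cov(observed)))[::-1]
ratios = eigvals[:-1] / eigvals[1:]
n_components = int(np.argmax(ratios)) + 1

print(n_components)
```

With this strong-signal mixture the gap falls after the third eigenvalue, recovering the three true sources; real recordings with weaker SNR are exactly where more stable methods such as ICAbyBlock become necessary.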
Procedia PDF Downloads 99
27810 Implementation of Geo-Crowdsourcing Mobile Applications in e-Government of V4 Countries: A State-of-the-Art Survey
Authors: Barbora Haltofová
Abstract:
In recent years, citizens have become an important source of geographic information and, therefore, geo-crowdsourcing, often known as volunteered geographic information, has provided an interesting alternative to traditional mapping practices, which are becoming expensive, resource-intensive and unable to capture the dynamic nature of urban environments. In order to address a gap in the research literature, this paper deals with a survey conducted to assess the current state of geo-crowdsourcing, a recent phenomenon popular with people who collect geographic information using their smartphones. This article points out that there is an increasing body of knowledge of geo-crowdsourcing mobile applications in the Visegrad countries, marked by ubiquitous Internet connection and the current massive proliferation of smartphones. This article shows how geo-crowdsourcing can be used as a complement, or in some cases a replacement, to traditionally generated sources of spatial data and information in public management. It discusses the new spaces of citizen participation constructed by these geo-crowdsourcing practices.
Keywords: citizen participation, e-Government, geo-crowdsourcing, participatory mapping, mobile applications
Procedia PDF Downloads 334
27809 Social Networking Applications: What Is Their Quality and How Can They Be Adopted in Open Distance Learning Environments?
Authors: Asteria Nsamba
Abstract:
Social networking applications and tools have become compelling platforms for generating and sharing knowledge across the world. Social networking applications and tools refer to a variety of social media platforms, including Facebook, Twitter, WhatsApp, blogs and wikis. The most popular of these platforms is Facebook, with 2.41 billion monthly active users, followed by WhatsApp with 1.6 billion users and Twitter with 330 million users. These communication platforms have not only impacted social lives but have also impacted students’ learning across the different delivery modes in higher education: distance, conventional and blended. With this amount of interest in these platforms, knowledge sharing has gained importance within the context in which it is required. In open distance learning (ODL) contexts, social networking platforms can offer students and teachers a platform on which to create and share knowledge and form learning collaborations. Thus, they can serve as support mechanisms to increase interaction and reduce the isolation and loneliness inherent in ODL. Despite this potential and opportunity, research indicates that many ODL teachers are not inclined to use social media tools in learning. Although it is unclear why these tools are uncommon in these environments, concerns raised in the literature indicate that many teachers have not mastered the art of teaching with technology. Using technological pedagogical content knowledge (TPACK), product quality theory and Bloom’s Taxonomy as lenses, this paper aims, firstly, to assess the quality of three social media applications (Facebook, Twitter and WhatsApp) in order to determine the extent to which they are suitable platforms for teaching and learning, in terms of content generation, information sharing and learning collaboration.
Secondly, the paper demonstrates the application of teaching, learning and assessment using Bloom’s Taxonomy.
Keywords: distance education, quality, social networking tools, TPACK
Procedia PDF Downloads 124
27808 Steps towards the Development of National Health Data Standards in Developing Countries
Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian Murray
Abstract:
The proliferation of health data standards today is somewhat overlapping and conflicting, resulting in market confusion and growing proprietary interests. The government's role and support in standardization for health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with implementing non-standard systems. The normative literature has not explored the different steps a government must undertake towards the development of national health data standards. Based on the lessons learned from a qualitative study investigating the issues affecting the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, together with the opinions and feedback of experts in data exchange, standards, and medical informatics in Saudi Arabia and the UK, a list of steps towards the development of national health data standards was constructed. The main steps are the existence of a national formal reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan, and a national accreditation body; most important is change management at the national and organizational levels. The outcome of this study can be used by academics and practitioners to plan the development of health data standards, in particular in developing countries.
Keywords: interoperability, medical data exchange, health data standards, case study, Saudi Arabia
Procedia PDF Downloads 338
27807 Teaching Behaviours of Effective Secondary Mathematics Teachers: A Study in Dhaka, Bangladesh
Authors: Asadullah Sheikh, Kerry Barnett, Paul Ayres
Abstract:
Despite significant progress in access, equity and public examination success, poor student performance in mathematics in secondary schools has become a major concern in Bangladesh. A substantial body of research has emphasised the important contribution of teaching practices to student achievement. However, this has not been investigated in Bangladesh. Therefore, the study sought to examine the effectiveness of mathematics teaching practices as a means of improving secondary school mathematics in the Dhaka Municipality City (DMC) area, Bangladesh. The purpose of this study was twofold: first, to identify the 20 highest performing secondary schools in mathematics in DMC, and second, to investigate the teaching practices of mathematics teachers in these schools. A two-phase mixed method approach was adopted. In the first phase, secondary source data were obtained from the Board of Intermediate and Secondary Education (BISE), Dhaka, and value-added measures were used to identify the 20 highest performing secondary schools in mathematics. In the second phase, a concurrent mixed method design, in which qualitative methods were embedded within a dominant quantitative approach, was utilised. A purposive sampling strategy was used to select fifteen teachers from the 20 highest performing secondary schools. The main sources of data were classroom teaching observations and teacher interviews. The data from teacher observations were analysed with descriptive and nonparametric statistics. The interview data were analysed qualitatively. The main findings showed that teachers adopt a direct teaching approach incorporating orientation, structuring, modelling, practice, questioning and teacher-student interaction, which creates an individualistic learning environment. The variation in developmental levels of teaching skill indicates that teachers do not necessarily use the qualitative aspects of teaching behaviours (i.e., focus, stage, quality and differentiation) effectively.
This is the first study to investigate the teaching behaviours of effective secondary mathematics teachers in Dhaka, Bangladesh. It contributes an international dimension to the field of educational effectiveness and raises questions about existing constructivist approaches. Further, it contributes important insights about teaching behaviours that can inform the development of evidence-based policy and practice on quality teaching in Bangladesh.
Keywords: effective teaching, mathematics, secondary schools, student achievement, value-added measures
Procedia PDF Downloads 238
27806 The Lived Experiences of Fathers with Children Who Have Cerebral Palsy: An Interpretative Phenomenological Analysis
Authors: Krizette Ladera
Abstract:
Fathers not only provide the financial stability of a family; a father also provides the love and support that people usually see as the mother’s responsibility. The main objective of the study is to describe the lived experiences of fathers with children who have cerebral palsy and how these fathers make sense of those experiences. A qualitative design using thematic analysis was employed. The qualitative approach focused on personal narratives, self-reports and the participants’ expression of memory in how they tell their stories. Interpretative phenomenological analysis (IPA) was used to focus on the participants’ experiences and how they describe them; IPA also attempts to describe and explain the meaning of human experience through interviews, here specifically with fathers of a child who has cerebral palsy. For the sampling technique, snowball sampling was used to gather participants through referrals from other participants. Five non-randomly selected fathers served as the participants of the research. A self-made interview guide with open-ended questions was used as the research instrument; it includes profiling of the respondents as well as their experiences in taking care of their child with cerebral palsy. In analyzing the data, the researcher used thematic analysis, wherein the interviews were transcribed, then organized and divided into themes. After that, the relations among the themes were identified, documented and translated into written text using thematic grouping.
Finally, the researcher analyzed the data according to its themes and tabulated it for presentation in the results section of the study. As for the results, the researcher identified four main themes that most of the participants experienced: the experience of finding out about the condition of the child; disclosing the condition of the child to the family and its emotional effect; the experience of living the day-to-day realities of providing a physically, financially and emotionally well-balanced environment for the child; and the religious perspectives of the fathers. Along with these four themes come subthemes that explain each theme in more detail.
Keywords: cerebral palsy, children, fathers, lived experiences
Procedia PDF Downloads 205
27805 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-
Authors: Nieto Bernal Wilson, Carmona Suarez Edgar
Abstract:
Organizations hold structured and unstructured information in different formats, sources and systems. Part of this information comes from ERP systems under OLTP processing that support the information system; at the OLAP processing level, however, these organizations present some deficiencies, partly because there is little interest in extracting knowledge from their data sources and partly because they lack the operational capability to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools of great interest to business intelligence, since they are repositories that serve as the basis for creating models or patterns (of the behavior of customers, suppliers, products, social networks and genomics) and facilitate corporate decision making and research. This paper presents a simple, structured methodology inspired by agile development models such as Scrum, XP and AUP. It also draws on object-relational and spatial data models and the baseline of data modeling under UML and Big Data; in this way, it seeks to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization and data mining, particularly for generating patterns and models derived from structured fact objects.
Keywords: data warehouse, data model, big data, object fact, object relational fact, data warehouse development process
Procedia PDF Downloads 409
27804 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis
Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee
Abstract:
In South Korea, it is difficult to obtain data for statistical pipe assessment. To address this issue, this paper examines how the various statistical models presented in earlier work behave when data are mixed with noise, and whether they are applicable in South Korea. Three major types of model are studied; where data are presented in the original paper, we add noise to the data and observe how the model response changes. Moreover, we generate data from the models in those papers and analyse the effect of noise. From this, we can assess the robustness of each model and its applicability in Korea.
Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences
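The noise-injection robustness check described above can be sketched generically. The snippet below uses a plain linear deterioration model with invented coefficients (not the paper's hazard or survival models, and not Korean pipe data): it re-fits the model at several noise levels and records how far the estimated parameter drifts.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical deterioration data: failure rate grows linearly with pipe age.
age = np.linspace(1, 50, 100)
true_slope, true_intercept = 0.04, 0.2
failure_rate = true_slope * age + true_intercept

# Robustness check: refit the model under increasing noise levels and record
# how far the estimated slope drifts from the true value.
drifts = []
for noise_std in (0.0, 0.1, 0.5, 1.0):
    noisy = failure_rate + rng.normal(0, noise_std, size=age.size)
    slope, _ = np.polyfit(age, noisy, deg=1)
    drifts.append(abs(slope - true_slope))

print(drifts)
```

A model whose parameter estimates drift little even at the highest noise level would, in the paper's terms, count as robust and thus a better candidate for the noisy data available in practice.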
Procedia PDF Downloads 743
27803 A Digital Twin Approach to Support Real-time Situational Awareness and Intelligent Cyber-physical Control in Energy Smart Buildings
Authors: Haowen Xu, Xiaobing Liu, Jin Dong, Jianming Lian
Abstract:
Emerging smart buildings often employ cyberinfrastructure, cyber-physical systems, and Internet of Things (IoT) technologies to increase the automation and responsiveness of building operations for better energy efficiency and lower carbon emissions. These operations include the control of Heating, Ventilation, and Air Conditioning (HVAC) and lighting systems, which are often considered a major source of energy consumption in both commercial and residential buildings. Developing energy-saving control models for optimizing HVAC operations usually requires the collection of high-quality instrumental data from iterations of in-situ building experiments, which can be time-consuming and labor-intensive. This abstract describes a digital twin approach to automate building energy experiments for optimizing HVAC operations through the design and development of an adaptive web-based platform. The platform is created to enable (a) automated data acquisition from a variety of IoT-connected HVAC instruments, (b) real-time situational awareness through domain-based visualizations, (c) adaptation of HVAC optimization algorithms based on experimental data, (d) sharing of experimental data and model predictive controls through web services, and (e) cyber-physical control of individual instruments in the HVAC system using outputs from different optimization algorithms. Through the digital twin approach, we aim to replicate a real-world building and its HVAC systems in an online computing environment to automate the development of building-specific model predictive controls and collaborative experiments in buildings located in different climate zones in the United States. We present two case studies to demonstrate our platform’s capability for real-time situational awareness and cyber-physical control of the HVAC in the flexible research platforms within the Oak Ridge National Laboratory (ORNL) main campus.
Our platform is developed using adaptive and flexible architecture design, rendering the platform generalizable and extendable to support HVAC optimization experiments in different types of buildings across the nation.
Keywords: energy-saving buildings, digital twins, HVAC, cyber-physical system, BIM
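The model predictive controls the platform develops are building-specific, but the control idea can be sketched generically. The toy below (all coefficients and the comfort/energy weighting are hypothetical, not ORNL models) searches a short horizon of heater settings for a one-zone thermal model, trading comfort against energy use:

```python
import itertools

# One-zone thermal model: T' = T + ALPHA*u - BETA*(T - T_OUT), with
# hypothetical coefficients; u is heater power drawn from {0, 1, 2} units.
ALPHA, BETA, T_OUT, SETPOINT = 0.8, 0.1, 5.0, 21.0

def step(temp, u):
    return temp + ALPHA * u - BETA * (temp - T_OUT)

def best_plan(temp, horizon=3):
    """Exhaustively search control sequences, trading comfort against energy."""
    def cost(seq):
        t, c = temp, 0.0
        for u in seq:
            t = step(t, u)
            c += (t - SETPOINT) ** 2 + 0.05 * u   # comfort + energy penalty
        return c
    return min(itertools.product((0, 1, 2), repeat=horizon), key=cost)

plan = best_plan(18.0)
print(plan)
```

Real MPC replaces the brute-force search with an optimizer and the toy model with one fitted from the instrumental data the platform collects; the receding-horizon structure is the same.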
Procedia PDF Downloads 110
27802 Automated Testing to Detect Instance Data Loss in Android Applications
Authors: Anusha Konduru, Zhiyong Shan, Preethi Santhanam, Vinod Namboodiri, Rajiv Bagai
Abstract:
The number of mobile applications is increasing significantly, each addressing the requirements of many users. However, quick development and enhancement result in many underlying defects. Android apps create and handle a large variety of 'instance' data that has to persist across runs, such as the current navigation route, workout results, antivirus settings, or game state. Due to the nature of Android, an app can be paused, sent into the background, or killed at any time. If the instance data is not saved and restored between runs, then in addition to data loss, partially saved or corrupted data can crash the app upon resume or restart. However, it is difficult for the programmer to manually test this issue for all activities. The result is data loss: data entered by the user are not saved when there is an interruption. This issue can degrade the user experience because the user needs to re-enter the information after each interruption. Automated testing to detect such data loss is important to improve the user experience. This research proposes DroidDL, a data loss detector for Android, which detects instance data loss in a given Android application. We have tested 395 applications and found 12 with data loss issues. The approach proved highly accurate and reliable in finding apps with this defect, and it can be used by Android developers to avoid such errors.
Keywords: Android, automated testing, activity, data loss
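A full example of the check DroidDL automates would require the Android API; the language-agnostic sketch below instead shows the core idea in miniature: simulate a kill/restart cycle through a save/restore pair and flag any field that does not survive. The `Activity` class and field names here are invented stand-ins, not the Android API or DroidDL itself.

```python
class Activity:
    """Toy stand-in for an Android activity with instance state."""
    def __init__(self):
        self.route = None

    def on_save_instance_state(self):
        return {"route": self.route}          # well-behaved: persists its state

    @classmethod
    def on_restore_instance_state(cls, state):
        a = cls()
        a.route = state.get("route")
        return a

class LeakyActivity(Activity):
    def on_save_instance_state(self):
        return {}                              # defect: forgets its state

def detect_data_loss(activity_cls, field, value):
    """Return True if `field` does not survive a simulated kill/restart."""
    a = activity_cls()
    setattr(a, field, value)
    restored = activity_cls.on_restore_instance_state(a.on_save_instance_state())
    return getattr(restored, field) != value

print(detect_data_loss(Activity, "route", "A->B"))       # False: state survives
print(detect_data_loss(LeakyActivity, "route", "A->B"))  # True: data loss found
```

The real tool drives this cycle on a device or emulator for every activity and every instance field, which is exactly the part that is impractical to do by hand.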
Procedia PDF Downloads 237
27801 Solar-Thermal-Electric Stirling Engine-Powered System for Residential Units
Authors: Florian Misoc, Cyril Okhio, Joshua Tolbert, Nick Carlin, Thomas Ramey
Abstract:
This project is focused on designing a Stirling engine system for a solar-thermal-electrical system that can supply electric power to a single residential unit. Since Stirling engines are heat engines that can operate from any available heat source, they are notable for their ability to generate clean and reliable energy without emissions. Due to the need to find alternative energy sources, Stirling engines are making a comeback with recent technologies, which include thermal energy conservation during the heat transfer process. Recent reviews show mounting evidence and positive test results that Stirling engines are able to produce a constant energy supply ranging from 5 kW to 20 kW. Solar power is one of the many heat sources for Stirling engines; using solar energy to operate them is an idea considered by many researchers due to the Stirling engine's ease of adaptability. In this project, the Stirling engine developed was designed and tested to operate from a biomass energy source (a wood-pellet stove) during low solar radiation, with good results. An engine efficiency of 20% was estimated, and 18% was measured, making it suitable and appropriate for residential applications. The effort reported was aimed at exploring the parameters necessary to design, build and test a Solar Powered Stirling Engine (SPSE) using water (H₂O) as the heat transfer medium, with nitrogen as the working gas, that can reach or exceed an efficiency of 20%. The main objectives of this work consisted of converting a V-twin cylinder air compressor into an alpha-type Stirling engine and constructing a solar water heater, using an automotive radiator as the high-temperature reservoir for the Stirling engine and an array of fixed mirrors that concentrate the solar radiation on the automotive radiator/high-temperature reservoir. The low-temperature reservoir is the surrounding air at ambient temperature.
This work has determined that such a low-cost system is sufficiently efficient and reliable. Off-the-shelf components have been used, and estimates of the ability of the final engine design to meet the electricity needs of a small residence have been made.
Keywords: Stirling engine, solar-thermal, power inverter, alternator
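The 18% measured versus 20% estimated efficiency can be put in context against the Carnot limit for plausible reservoir temperatures. The temperatures below are illustrative assumptions (the abstract does not report them), so this is a sanity check rather than the project's analysis:

```python
# Hypothetical reservoir temperatures; the abstract reports efficiencies only.
t_hot = 450.0    # assumed solar-heated radiator temperature, K
t_cold = 300.0   # ambient-air low-temperature reservoir, K

carnot_limit = 1.0 - t_cold / t_hot      # ideal upper bound on efficiency
estimated, measured = 0.20, 0.18         # figures quoted in the abstract

print(round(carnot_limit, 3))            # 0.333
print(measured < estimated < carnot_limit)  # True: both figures are physical
```

Under these assumed temperatures the quoted efficiencies sit at roughly half to two-thirds of the ideal limit, a reasonable range for a low-cost engine built from off-the-shelf components.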
Procedia PDF Downloads 278
27800 Big Data: Appearance and Disappearance
Authors: James Moir
Abstract:
The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data: changing social and political opinions, patterns in crime, and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanation to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and show their appearance through causal mechanisms, while twentieth-century science attempted to save the appearances and relinquish causal explanations. Now twenty-first-century science, in the form of Big Data, is concerned with the prediction of appearances and nothing more. However, this pulls social science back in the direction of a more rule- or law-governed model of reality and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearance, and in doing so it makes any context-specific conceptual sensitivity disappear.
Keywords: big data, appearance, disappearance, surface, epistemology
Procedia PDF Downloads 421
27799 From Data Processing to Experimental Design and Back Again: A Parameter Identification Problem Based on FRAP Images
Authors: Stepan Papacek, Jiri Jablonsky, Radek Kana, Ctirad Matonoha, Stefan Kindermann
Abstract:
FRAP (Fluorescence Recovery After Photobleaching) is a widely used measurement technique to determine the mobility of fluorescent molecules within living cells. While the experimental setup and protocol for FRAP experiments are usually fixed, the data-processing part is still under development. In this paper, we formulate and solve the problem of data selection, which enhances the processing of FRAP images. We introduce the concept of the irrelevant data set, i.e., data that barely reduce the confidence interval of the estimated parameters and thus can be neglected. Based on sensitivity analysis, we both solve the problem of optimal data space selection and find specific conditions for optimizing an important experimental design factor, e.g., the radius of the bleach spot. Finally, a theorem announcing the lower precision of the integrated data approach compared to the full data case is proven; i.e., we claim that the data set represented by the FRAP recovery curve leads to a larger confidence interval than the spatio-temporal (full) data.
Keywords: FRAP, inverse problem, parameter identification, sensitivity analysis, optimal experimental design
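The irrelevant-data idea admits a minimal numerical sketch. The snippet below uses a one-parameter exponential recovery curve rather than the paper's full spatio-temporal model, and the time constant, plateau, and noise level are invented: it shows that the late, low-sensitivity tail of the curve can be dropped with almost no penalty in the confidence interval of the estimated parameter.

```python
import numpy as np

# Simple FRAP recovery model: F(t) = F_inf * (1 - exp(-t / tau)).
tau, f_inf, sigma = 2.0, 1.0, 0.05        # assumed "true" values and noise
t = np.linspace(0.1, 20, 200)

# Sensitivity of the model output with respect to tau.
sens = f_inf * (t / tau**2) * np.exp(-t / tau)

def stderr(mask):
    """Approximate standard error of tau from the selected data subset,
    via the scalar Fisher information sum(sens^2) / sigma^2."""
    info = np.sum(sens[mask] ** 2) / sigma**2
    return 1.0 / np.sqrt(info)

full = stderr(np.ones_like(t, dtype=bool))
early_only = stderr(t < 8.0)              # drop the low-sensitivity tail
print(full, early_only)
```

Because the sensitivity decays like t*exp(-t/tau), the points beyond a few time constants contribute almost nothing to the Fisher information; in the paper's terminology they form an irrelevant data set.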
Procedia PDF Downloads 278
27798 Density Determination of Liquid Niobium by Means of Ohmic Pulse-Heating for Critical Point Estimation
Authors: Matthias Leitner, Gernot Pottlacher
Abstract:
Experimental determination of critical point data like critical temperature, critical pressure, critical volume and critical compressibility of high-melting metals such as niobium is very rare due to the outstanding experimental difficulties in reaching the necessary extreme temperature and pressure regimes. Experimental techniques to achieve such extreme conditions could be diamond anvil devices, two-stage gas guns or metal samples hit by explosively accelerated flyers. Electrical pulse-heating under increased pressures would be another choice. This technique heats thin wire samples of 0.5 mm diameter and 40 mm length from room temperature to melting and then further to the end of the stable phase, the spinodal line, within several microseconds. When crossing the spinodal line, the sample explodes and reaches the gaseous phase. In our laboratory, pulse-heating experiments can be performed under variation of the ambient pressure from 1 to 5000 bar and allow a direct determination of critical point data for low-melting, but not for high-melting metals. However, the critical point can also be estimated by extrapolating the liquid phase density according to theoretical models. A reasonable prerequisite for the extrapolation is the existence of data that cover as much as possible of the liquid phase and at the same time exhibit small uncertainties. Ohmic pulse-heating was therefore applied to determine thermal volume expansion, and from that the density of niobium over the entire liquid phase. As a first step, experiments under ambient pressure were performed. The second step will be to perform experiments under high-pressure conditions. During the heating process, shadow images of the expanding sample wire were captured at a frame rate of 4 × 10⁵ fps to monitor the radial expansion as a function of time. Simultaneously, the sample radiance was measured with a pyrometer operating at a mean effective wavelength of 652 nm.
To increase the accuracy of temperature deduction, the spectral emittance in the liquid phase is also taken into account. Due to the high heating rates of about 2 × 10⁸ K/s, longitudinal expansion of the wire is inhibited, which implies an increased radial expansion. As a consequence, measuring the temperature-dependent radial expansion is sufficient to deduce density as a function of temperature. This is accomplished by evaluating the full widths at half maximum of the cup-shaped intensity profiles that are calculated from each shadow image of the expanding wire. Relating these diameters to the diameter obtained before the start of pulse-heating, the temperature-dependent volume expansion is calculated. With the help of the known room-temperature density, volume expansion is then converted into density data. The so-obtained liquid density behavior is compared to existing literature data and provides another independent source of experimental data. In this work, the newly determined off-critical liquid phase density was, in a second step, utilized as input data for the estimation of niobium’s critical point. The approach used heuristically takes into account the crossover from mean field to Ising behavior, as well as the non-linearity of the phase diagram’s diameter.
Keywords: critical point data, density, liquid metals, niobium, ohmic pulse-heating, volume expansion
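Because longitudinal expansion is inhibited at these heating rates, the volume follows the square of the measured diameter, so density can be computed directly from the shadow-image FWHM diameters. The sketch below applies this relation with niobium's room-temperature density; the example diameters are illustrative, not measured values from the experiment.

```python
# Density from radial expansion when longitudinal expansion is inhibited:
# the volume scales with the square of the diameter, so
# rho(T) = rho_0 * (d_0 / d(T))**2.

rho_0 = 8570.0          # room-temperature density of niobium, kg/m^3
d_0 = 0.5               # initial wire diameter, mm

def density(d_t):
    """Density at the temperature where the measured diameter is d_t (mm)."""
    return rho_0 * (d_0 / d_t) ** 2

# Illustrative (not measured) diameters from shadow-image FWHM evaluation.
for d in (0.50, 0.55, 0.60):
    print(round(density(d), 1))
```

The relation makes clear why the radial measurement alone suffices here, and why the same diameter data would need the cube-root volume relation instead if the wire were free to expand in all directions.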
Procedia PDF Downloads 219
27797 Exploring the Feasibility of Utilizing Blockchain in Cloud Computing and AI-Enabled BIM for Enhancing Data Exchange in Construction Supply Chain Management
Authors: Tran Duong Nguyen, Marwan Shagar, Qinghao Zeng, Aras Maqsoodi, Pardis Pishdad, Eunhwa Yang
Abstract:
Construction supply chain management (CSCM) involves the collaboration of many disciplines and actors, which generates vast amounts of data. However, inefficient, fragmented, and non-standardized data storage often hinders this data exchange. The industry has adopted building information modeling (BIM), a digital representation of a facility's physical and functional characteristics, to improve collaboration, enhance transmission security, and provide a common data exchange platform. Still, the volume and complexity of the data require tailored information categorization, aligned with stakeholders' preferences and demands. To address this, artificial intelligence (AI) can be integrated to handle the magnitude and complexity of this data. This research aims to develop an integrated and efficient approach for data exchange in CSCM by utilizing AI. The paper covers five main objectives: (1) investigate existing frameworks and BIM adoption; (2) identify challenges in data exchange; (3) propose an integrated framework; (4) enhance data transmission security; and (5) develop data exchange in CSCM. The proposed framework demonstrates how integrating BIM with other technologies, such as cloud computing, blockchain, and AI applications, can significantly improve the efficiency and accuracy of data exchange in CSCM.
Keywords: construction supply chain management, BIM, data exchange, artificial intelligence
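The blockchain layer's role in tamper-evident exchange records can be illustrated with a minimal hash chain. This is a generic sketch, not the proposed framework's implementation; record fields such as `model` and `rev` are invented placeholders for BIM revision metadata.

```python
import hashlib
import json

# Minimal hash-chained log: each record stores the hash of its predecessor,
# so any later modification of a record invalidates the chain.

def add_block(chain, payload):
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"prev": prev, "payload": payload}
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)

def verify(chain):
    """Recompute every hash and check the prev-links; False on any tampering."""
    prev = "0" * 64
    for block in chain:
        expected = {"prev": block["prev"], "payload": block["payload"]}
        digest = hashlib.sha256(
            json.dumps(expected, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != digest:
            return False
        prev = block["hash"]
    return True

chain = []
add_block(chain, {"model": "bim-01", "rev": 3})
add_block(chain, {"model": "bim-01", "rev": 4})
print(verify(chain))                   # True: chain intact
chain[0]["payload"]["rev"] = 99        # tamper with an earlier exchange record
print(verify(chain))                   # False: tampering detected
```

In a multi-actor supply chain this property matters because no single party can silently rewrite an earlier exchange record; a production system would add distributed consensus on top of the same chaining idea.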
Procedia PDF Downloads 26
27796 Representation Data without Lost Compression Properties in Time Series: A Review
Authors: Nabilah Filzah Mohd Radzuan, Zalinda Othman, Azuraliza Abu Bakar, Abdul Razak Hamdan
Abstract:
Uncertain data is believed to be an important issue in building up a prediction model. The main objective in time series uncertainty analysis is to formulate uncertain data in order to gain knowledge and fit a low-dimensional model prior to a prediction task. This paper discusses the performance of a number of techniques in dealing with uncertain data, specifically those which handle the uncertain data condition by minimizing the loss of compression properties.
Keywords: compression properties, uncertainty, uncertain time series, mining technique, weather prediction
Procedia PDF Downloads 428
27795 Municipal Solid Waste Management Using Life Cycle Assessment Approach: Case Study of Maku City, Iran
Authors: L. Heidari, M. Jalili Ghazizade
Abstract:
This paper aims to determine the best environmental and economic scenario for municipal solid waste (MSW) management in Maku city by using the Life Cycle Assessment (LCA) approach. The functional elements of this study are collection, transportation, and disposal of MSW in Maku city. Waste composition and density, as two key parameters of MSW, were determined by field sampling, and then other important specifications of the MSW, such as chemical formula, thermal energy, and water content, were calculated. These data, together with other information related to collection and disposal facilities, are used as a reliable source of data to assess the environmental impacts of different waste management options, including landfill, composting, recycling, and energy recovery. The environmental impact of the MSW management options has been investigated in 15 different scenarios using the Integrated Waste Management (IWM) software. The photochemical smog, greenhouse gases, acid gases, toxic emissions, and energy consumption of each scenario are measured. Then, the environmental index of each scenario is specified by weighting these parameters. The economic costs of the scenarios have also been compared with each other based on the literature. As a final result, since organic materials make up more than 80% of the waste, composting can be a suitable method. Although the major part of the remaining 20% of the waste can be recycled, due to the high cost of the necessary equipment, the landfill option has been suggested. Therefore, the scenario with 80% composting and 20% landfilling is selected as the superior environmental and economic scenario. This study shows that, to select a scenario with practical applications, the environmental and economic aspects of the different scenarios must be considered simultaneously.
Keywords: IWM software, life cycle assessment, Maku, municipal solid waste management
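The scenario-ranking step described above (weighting impact categories into a single environmental index and choosing the lowest) can be sketched as follows. The category weights, scenario names, and scores are hypothetical placeholders, not the study's values.

```python
# Hypothetical sketch of weighting impact categories into one environmental
# index per scenario; the scenario with the lowest index is preferred.

WEIGHTS = {"greenhouse_gases": 0.3, "acid_gases": 0.2,
           "photochemical_smog": 0.2, "toxic_emissions": 0.2,
           "energy_use": 0.1}

def environmental_index(scores):
    """Weighted sum of normalised impact scores (lower is better)."""
    return sum(WEIGHTS[cat] * val for cat, val in scores.items())

scenarios = {
    "100% landfill": {"greenhouse_gases": 0.9, "acid_gases": 0.6,
                      "photochemical_smog": 0.7, "toxic_emissions": 0.8,
                      "energy_use": 0.5},
    "80% compost / 20% landfill": {"greenhouse_gases": 0.4, "acid_gases": 0.3,
                                   "photochemical_smog": 0.3,
                                   "toxic_emissions": 0.4, "energy_use": 0.6},
}

best = min(scenarios, key=lambda s: environmental_index(scenarios[s]))
print(best)  # 80% compost / 20% landfill
```

Economic costs could be folded in the same way, as a second weighted term, which is essentially the combined environmental-economic comparison the study performs across its 15 scenarios.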
Procedia PDF Downloads 238
27794 A Method to Evaluate and Compare Web Information Extractors
Authors: Patricia Jiménez, Rafael Corchuelo, Hassan A. Sleiman
Abstract:
Web mining is gaining importance at an increasing pace. Currently, there are many complementary research topics under this umbrella. Their common theme is that they all focus on applying knowledge discovery techniques to data that is gathered from the Web. Sometimes, these data are relatively easy to gather, chiefly when they come from server logs. Unfortunately, there are cases in which the data to be mined are the data displayed on a web document. In such cases, it is necessary to apply a pre-processing step to first extract the information of interest from the web documents. Such pre-processing steps are performed using so-called information extractors, which are software components that are typically configured by means of rules that are tailored to extracting the information of interest from a web page and structuring it according to a pre-defined schema. Paramount to getting good mining results is that the technique used to extract the source information is exact, which requires evaluating and comparing the different proposals in the literature from an empirical point of view. According to Google Scholar, about 4,200 papers on information extraction have been published during the last decade. Unfortunately, they were not evaluated within a homogeneous framework, which makes it difficult to compare them empirically. In this paper, we report on an original information extraction evaluation method. Our contribution is three-fold: a) this is the first attempt to provide an evaluation method for proposals that work on semi-structured documents; the little existing work on this topic focuses on proposals that work on free text, which has little to do with extracting information from semi-structured documents.
b) It provides a method that relies on statistically sound tests to support the conclusions drawn; the previous work does not provide clear guidelines or recommend statistically sound tests, but rather a survey that collects many features to take into account as well as related work. c) We provide a novel method to compute the performance measures for unsupervised proposals; otherwise, these measures would require the intervention of a user to compute them by using the annotations on the evaluation sets and the information extracted. Our contributions will definitely help researchers in this area make sure that they have advanced the state of the art not only conceptually, but from an empirical point of view; they will also help practitioners make informed decisions on which proposal is the most adequate for a particular problem. This conference is a good forum to discuss our ideas so that we can spread them to help improve the evaluation of information extraction proposals and gather valuable feedback from other researchers.
Keywords: web information extractors, information extraction evaluation method, Google Scholar, web
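The performance measures at the heart of such an evaluation are typically precision, recall, and F1 over the records an extractor returns versus an annotated evaluation set. A minimal sketch (the record tuples are hypothetical, and the abstract does not specify its exact measures):

```python
# Minimal sketch of extraction evaluation: compare extracted records
# against an annotated gold set using precision, recall, and F1.

def evaluate(extracted, annotated):
    extracted, annotated = set(extracted), set(annotated)
    tp = len(extracted & annotated)                      # true positives
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(annotated) if annotated else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical records extracted from a semi-structured page:
p, r, f1 = evaluate({("title", "A"), ("price", "10")},
                    {("title", "A"), ("price", "12")})
print(p, r, f1)  # 0.5 0.5 0.5
```

Running such a comparison over many documents and extractors, and then applying statistically sound tests to the resulting scores, is the homogeneous-framework idea the abstract argues for.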
Procedia PDF Downloads 248
27793 Computer Self-Efficacy, Study Behaviour and Use of Electronic Information Resources in Selected Polytechnics in Ogun State, Nigeria
Authors: Fredrick Olatunji Ajegbomogun, Bello Modinat Morenikeji, Okorie Nancy Chituru
Abstract:
Electronic information resources are highly relevant to students' academic and research needs but are grossly underutilized, despite the institutional commitment to making them available. The under-utilisation of these resources could be attributed to a low level of study behaviour coupled with a low level of computer self-efficacy. This study assessed computer self-efficacy, study behaviour, and the use of electronic information resources by students in selected polytechnics in Ogun State. A simple random sampling technique using Krejcie and Morgan's (1970) table was used to select 370 respondents for the study. A structured questionnaire was used to collect data from respondents. Data were analysed using frequency counts, percentages, means, standard deviations, Pearson Product Moment Correlation (PPMC), and multiple regression analysis. Results reveal that the internet (mean = 1.94), YouTube (mean = 1.74), and search engines (mean = 1.72) were the common information resources available to the students, while the internet (mean = 4.22) was the most utilized resource. Major reasons for using electronic information resources were to source materials and information (mean = 3.30), for research (mean = 3.25), and to augment class notes (mean = 2.90). The majority (91.0%) of the respondents have a high level of computer self-efficacy in the use of electronic information resources through selecting from screen menus (mean = 3.12), using data files (mean = 3.10), and efficient use of computers (mean = 3.06). Good preparation for tests (mean = 3.27), examinations (mean = 3.26), and organization of tutorials (mean = 3.11) are the common study behaviours of the respondents. Overall, 93.8% have good study behaviour. Inadequate computer facilities to access information (mean = 3.23) and poor internet access (mean = 2.87) were the major challenges confronting students' use of electronic information resources.
According to the PPMC results, study behaviour (r = 0.280) and computer self-efficacy (r = 0.304) have significant (p < 0.05) relationships with the use of electronic information resources. Regression results reveal that self-efficacy (β = 0.214) and study behaviour (β = 0.122) positively (p < 0.05) influenced students' use of electronic information resources. The study concluded that students' use of electronic information resources depends on the purpose, their computer self-efficacy, and their study behaviour. Therefore, the study recommended that management should encourage the students to improve their study habits and computer skills, as this will enhance their continuous and more effective utilization of electronic information resources.
Keywords: computer self-efficacy, study behaviour, electronic information resources, polytechnics, Nigeria
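The PPMC statistic reported above is the standard Pearson product-moment correlation. As a reference sketch of the computation (the score lists here are made-up illustrations, not the study's data):

```python
# Sketch of the Pearson product-moment correlation (PPMC) computation:
# r = cov(x, y) / (std(x) * std(y)), on paired observations.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical self-efficacy scores vs. e-resource use scores:
print(round(pearson_r([1, 2, 3, 4, 5], [2, 1, 4, 3, 5]), 3))  # 0.8
```

Values such as r = 0.304 indicate a positive but moderate association, which is why the study pairs the correlation with a regression analysis to estimate each predictor's contribution.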
Procedia PDF Downloads 120
27792 Digital Portfolio as Mediation to Enhance Willingness to Communicate in English
Authors: Saeko Toyoshima
Abstract:
This research discusses whether performance tasks with technology enhance students' willingness to communicate. The present study investigated how Japanese learners of English changed their attitude to communication in their target language by experiencing a performance task, called a 'digital portfolio', in the classroom, applying the concepts of action research. The study adopted questionnaires including four-point Likert-scale and open-ended questions as mixed-methods research. There were 28 students in the class. Many Japanese university students with low proficiency (A1 in the Common European Framework of Reference for Languages) have difficulty communicating in English due to their low proficiency and the lack of practice in and outside of the classroom at secondary education. They need to mediate between themselves in the world of L1 and L2 by completing a performance task for communication. This paper introduces the practice of a CALL class where A1-level students made their 'digital portfolio' related to the topics of TED® (Technology, Entertainment, Design) Talk materials. The students had a 'Portfolio Session' twice in one term, once in the middle and once at the end of the course, where they introduced their portfolio to their classmates and international students in English. The present study asked the students to answer a questionnaire about willingness to communicate twice, once at the end of the first term and once at the end of the second term. The Likert-scale questions were statistically analyzed with a t-test, and the answers to the open-ended questions were analyzed to clarify the difference between them. They showed that the students had a more positive attitude to communication in English and enhanced their willingness to communicate through the experience of the task.
The implication of this paper is that making and presenting a portfolio as a performance task leads students to construct themselves in English and enables them to communicate with others enjoyably and autonomously.
Keywords: action research, digital portfolio, computer-assisted language learning, ELT with CALL system, mixed methods research, Japanese English learners, willingness to communicate
Procedia PDF Downloads 118
27791 Data Mining As A Tool For Knowledge Management: A Review
Authors: Maram Saleh
Abstract:
Knowledge has become an essential resource in today's economy and the most important asset for maintaining competitive advantage in organizations. The importance of knowledge has led organizations to manage their knowledge assets and resources through all the knowledge management stages: knowledge creation, knowledge storage, knowledge sharing, and knowledge use. Research on data mining has continued to grow over recent years in both business and educational fields. Data mining is one of the most important steps of the knowledge discovery in databases process, aiming to extract implicit, unknown but useful knowledge, and it is considered a significant subfield of knowledge management. Data mining has great potential to help organizations focus on extracting the most important information from their data warehouses. Data mining tools and techniques can predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. This review paper explores the applications of data mining techniques in supporting the knowledge management process as an effective knowledge discovery technique. In this paper, we identify the relationship between data mining and knowledge management, and then focus on introducing some applications of data mining techniques in knowledge management for some real-life domains.
Keywords: data mining, knowledge management, knowledge discovery, knowledge creation
Procedia PDF Downloads 208
27790 Access and Outcome of Khas Land in Bangladesh: An Insight into Its Distribution Process and Influence on Landless Farmers
Authors: M. Foysul Alam
Abstract:
Land diluvion is a common phenomenon in Bangladesh. According to the Bangladesh land administration manual, land gained by diluvion and alluvion is considered Khas (state) land. Distributing Khas lands to the poorest landless people was suggested as a way of improving their livelihood by mobilizing their economic activity and improving their social status in Bangladesh. The impact of this initiative, which started in 1980 as a state policy, is perhaps one of the least empirically documented ones in the country. The primary aim of this study was to investigate the economic and social status of some landless farmers, focusing on determining whether the distribution of Khas lands provides part of the explanation for the possible improvement of their livelihood during this period of almost four decades. If not, the secondary objective of this study was to identify the issues and challenges of this policy, whether in strategic planning or implementation efforts. Both subjectively observable and objectively measurable data were collected from 150 households in 30 villages of six districts worst affected by landlessness. A well-organized questionnaire, including 10 open-ended questions, was used to collect specific data and experiences of the landless farmers. Special attention was given to those who left peasantry after becoming involved in other economic activities. The study found that Khas lands have very nominal or even zero influence on landless farmers due to: 1. the lack of a Khas land inventory; 2. the lack of availability of information to the farmers; 3. the complexity of the application procedure for claiming Khas land; and 4. the political nexus of local influential farmers, which makes it nearly impossible for those farmers who actually deserve it to obtain land. It also found that most of the lucky farmers who eventually got a piece of Khas land had links with the local elites of their community.
The research concludes that it is too early to decide how beneficial Khas land is for the marginalized landless farmers in Bangladesh, as only a few of them have actually been given possession of one. This document provides crucial recommendations for overcoming the prevailing challenges.
Keywords: Khas land, land distribution, marginalized farmers, rural livelihood
Procedia PDF Downloads 125
27789 LIZTOXD: Inclusive Lizard Toxin Database by Using MySQL Protocol
Authors: Iftikhar A. Tayubi, Tabrej Khan, Mansoor M. Alsubei, Fahad A. Alsaferi
Abstract:
LIZTOXD provides a single source of high-quality information about proteinaceous lizard toxins that will be an invaluable resource for pharmacologists, neuroscientists, toxicologists, medicinal chemists, ion channel scientists, clinicians, and structural biologists. We will provide an intuitive, well-organized, and user-friendly web interface that allows users to explore detailed information on lizards and toxin proteins. It includes the common name, scientific name, entry id, entry name, protein name, and length of the protein sequence. The utility of this database is that it provides a user-friendly interface for users to retrieve information about the lizards, toxins, and toxin proteins of different lizard species. The interfaces created in this database will satisfy the demands of the scientific community by providing in-depth knowledge about lizards and their toxins. In the next phase of our project, the database will be implemented using MySQL and the Hypertext Preprocessor (PHP), with diagrams designed in SmartDraw. A database is an efficient way of storing large quantities of data, and users can navigate from one section to another, depending on their field of interest. This database contains a wealth of information on species, toxins, clinical data, etc. LIZTOXD is a resource that provides comprehensive information about protein toxins from lizards. The combination of specific classification schemes and a rich user interface allows researchers to easily locate and view information on the sequence, structure, and biological activity of these toxins. This manually curated database will be a valuable resource both for basic researchers and for those interested in potential pharmaceutical and agricultural applications of lizard toxins.
Keywords: LIZTOXD, MySQL, PHP, SmartDraw
Procedia PDF Downloads 162
27788 Developing a GIS-Based Tool for the Management of Fats, Oils, and Grease (FOG): A Case Study of Thames Water Wastewater Catchment
Authors: Thomas D. Collin, Rachel Cunningham, Bruce Jefferson, Raffaella Villa
Abstract:
Fats, oils and grease (FOG) are by-products of food preparation and cooking processes. FOG enters wastewater systems through a variety of sources such as households, food service establishments, and industrial food facilities. Over time, if no source control is in place, FOG builds up on pipe walls, leading to blockages and potentially to sewer overflows, which are a major risk to the environment and human health. UK water utilities spend millions of pounds annually trying to control FOG. Although UK legislation specifies that discharge of such material is against the law, it is often complicated for water companies to identify and prosecute offenders. Hence, there are uncertainties regarding the attitude to take in terms of FOG management. Research is needed to seize the full potential of implementing current practices. The aim of this research was to undertake a comprehensive study to document the extent of FOG problems in sewer lines and reinforce existing knowledge. Data were collected to develop a model estimating quantities of FOG available for recovery within Thames Water wastewater catchments. Geographical Information System (GIS) software was used in conjunction with the model to integrate data with a geographical component. FOG was responsible for at least a third of sewer blockages in the Thames Water waste area. A waste-based approach was developed through an extensive review to estimate the potential for FOG collection and recovery. Three main sources were identified: residential, commercial, and industrial. Commercial properties were identified as one of the major FOG producers. The total potential FOG generated was estimated for the 354 wastewater catchments. Additionally, raw and settled sewage were sampled and analysed for FOG (as hexane extractable material) monthly at 20 sewage treatment works (STW) for three years. A good correlation was found between the sampled FOG and population equivalent (PE).
On average, a difference of 43.03% was found between the estimated FOG (waste-based approach) and the sampled FOG (raw sewage sampling). It was suggested that the approach undertaken could overestimate the FOG available, that the sampling could only capture a fraction of the FOG arriving at STW, and/or that the difference could account for FOG accumulating in sewer lines. Furthermore, it was estimated that on average FOG could contribute up to 12.99% of the primary sludge removed. The model was further used to investigate the relationship between estimated FOG and the number of blockages: the higher the FOG potential, the higher the number of FOG-related blockages. The GIS-based tool was used to identify critical areas (i.e. high FOG potential and high number of FOG blockages). As reported in the literature, FOG was one of the main causes of sewer blockages. By identifying these critical areas, the model further explored the potential for source control in terms of 'sewer relief' and waste recovery. Hence, it helped target where the benefits from the implementation of management strategies could be the highest. However, FOG is still likely to persist throughout the networks, and further research is needed to assess downstream impacts (i.e. at STW).Keywords: fat, FOG, GIS, grease, oil, sewer blockages, sewer networks
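The waste-based comparison step above can be sketched as follows: sum per-source FOG yields over a catchment, then report the percentage difference against the FOG actually sampled at the treatment works. The per-source yields and catchment counts here are hypothetical placeholders, not the study's figures.

```python
# Illustrative sketch of the waste-based FOG estimate and its comparison
# against sampled FOG. Yields (kg FOG/source/year) are assumed values.

FOG_YIELD = {"residential": 2.0, "commercial": 30.0, "industrial": 10.0}

def estimated_fog(sources):
    """Total FOG potential (kg/year) from counts of each source type."""
    return sum(FOG_YIELD[k] * n for k, n in sources.items())

def percent_difference(estimated, sampled):
    """How far the sampled FOG falls below the waste-based estimate (%)."""
    return (estimated - sampled) / estimated * 100.0

# A hypothetical catchment:
catchment = {"residential": 10000, "commercial": 200, "industrial": 15}
est = estimated_fog(catchment)   # 20000 + 6000 + 150 = 26150 kg/year
print(est, round(percent_difference(est, 14900.0), 2))
```

A persistent positive difference of this kind is consistent with the abstract's interpretation: overestimation, under-sampling at the STW, or FOG accumulating in the sewer lines in between.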
Procedia PDF Downloads 209
27787 Anomaly Detection Based Fuzzy K-Mode Clustering for Categorical Data
Authors: Murat Yazici
Abstract:
Anomalies are irregularities found in data that do not adhere to a well-defined standard of normal behavior. The identification of outliers or anomalies in data has been a subject of study within the statistics field since the 1800s. Over time, a variety of anomaly detection techniques have been developed in several research communities. Cluster analysis can be used to detect anomalies. It is the process of grouping data into clusters such that objects within a cluster are as similar as possible while objects in different clusters are dissimilar. Many of the traditional cluster algorithms have limitations in dealing with data sets containing categorical attributes. To detect anomalies in categorical data, a fuzzy clustering approach can be used, with its attendant advantages. The fuzzy k-modes (FKM) clustering algorithm, one of the fuzzy clustering approaches and an extension of the k-means algorithm, has been reported for clustering datasets with categorical values. It is a form of soft clustering: each point can be associated with more than one cluster. In this paper, anomaly detection is performed on two simulated data sets by using the FKM cluster algorithm. A significant feature of the study is that the FKM cluster algorithm makes it possible to determine anomalies together with their degree of abnormality, in contrast to numerous anomaly detection algorithms. According to the results, the FKM cluster algorithm showed good performance in the anomaly detection of data containing both a single anomaly and more than one anomaly.
Keywords: fuzzy k-mode clustering, anomaly detection, noise, categorical data
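The membership step behind this idea can be sketched briefly: with simple matching (Hamming) distance to each cluster mode, fuzzy memberships are computed as in fuzzy c-means, and a point whose largest membership is low fits no cluster well, giving a degree of abnormality. The modes and records below are toy values, and a full FKM implementation would also update the modes iteratively.

```python
# Minimal sketch of the fuzzy k-modes membership step and an
# abnormality reading from it (toy data; modes held fixed here).

def hamming(x, mode):
    """Number of mismatched categorical attributes."""
    return sum(a != b for a, b in zip(x, mode))

def memberships(x, modes, m=2.0):
    """Fuzzy membership of x in each cluster, fuzzifier m > 1."""
    d = [hamming(x, c) for c in modes]
    if 0 in d:                          # point coincides with a mode
        return [1.0 if di == 0 else 0.0 for di in d]
    p = 1.0 / (m - 1.0)
    return [1.0 / sum((d[j] / d[k]) ** p for k in range(len(modes)))
            for j in range(len(modes))]

modes = [("red", "small", "round"), ("blue", "large", "square")]
normal = ("red", "small", "square")     # close to the first mode
odd = ("green", "medium", "oval")       # matches neither mode

for x in (normal, odd):
    u = memberships(x, modes)
    print(x, round(max(u), 2))          # lower max membership -> more anomalous
```

Here the odd record's memberships are split evenly between clusters, so its maximum membership is lower than the normal record's, which is the graded abnormality signal the abstract highlights.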
Procedia PDF Downloads 53
27786 Microgravity, Hydrological and Metrological Monitoring of Shallow Ground Water Aquifer in Al-Ain, UAE
Authors: Serin Darwish, Hakim Saibi, Amir Gabr
Abstract:
The United Arab Emirates (UAE) is situated within an arid zone where the recharge of the groundwater is very low. Groundwater is the primary source of water in the UAE. However, rapid expansion, population growth, agriculture, and industrial activities have negatively affected these limited water resources. The shortage of water resources has become a serious concern due to the over-pumping of groundwater to meet demand. In addition to the deficit of groundwater, the UAE has one of the highest per capita water consumption rates in the world. In this study, a combination of time-lapse measurements of microgravity and depth to groundwater level in selected wells in Al-Ain city was used to estimate the variations in groundwater storage. Al-Ain is the second largest city in the Emirate of Abu Dhabi and the third largest city in the UAE. The groundwater in this region has been overexploited. Relative gravity measurements were acquired using the Scintrex CG-6 Autograv. This latest-generation gravimeter from Scintrex Ltd provides fast, precise gravity measurements and automated corrections for temperature, tide, instrument tilt, and rejection of data noise. The CG-6 gravimeter has a resolution of 0.1 μGal. The purpose of this study is to measure the groundwater storage changes in the shallow aquifers based on the application of the microgravity method. The gravity method is a non-destructive technique that allows the collection of data at almost any location over the aquifer. Preliminary results indicate a possible relationship between microgravity and water levels, but more work needs to be done to confirm this. The results will help to develop the relationship between monthly microgravity changes and the hydrological and hydrogeological changes of the shallow phreatic aquifer. The study will be useful for water management considerations and additional future investigations.
Keywords: Al-Ain, arid region, groundwater, microgravity
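The size of the gravity signal to expect from a water-table change can be estimated with the standard infinite Bouguer slab approximation, Δg = 2πG·ρ_w·φ·Δh (about 41.9 μGal per metre of water-level change at porosity 1). The porosity and drawdown below are assumed illustrative values, not measurements from the study.

```python
# Hedged sketch of the expected microgravity signal from a water-table
# change, using the infinite Bouguer slab approximation.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
RHO_WATER = 1000.0   # density of water, kg/m^3

def gravity_change_microgal(delta_h_m, porosity):
    """Gravity change (microGal) for a water-table change delta_h_m (m)."""
    delta_g = 2.0 * math.pi * G * RHO_WATER * porosity * delta_h_m  # m/s^2
    return delta_g / 1e-8    # 1 microGal = 1e-8 m/s^2

# Example: a 2 m drawdown in an aquifer with 20% porosity:
print(round(gravity_change_microgal(2.0, 0.2), 2))  # 16.77
```

A signal of this order is two orders of magnitude above the CG-6's stated 0.1 μGal resolution, which is why time-lapse microgravity is a plausible monitor of shallow groundwater storage.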
Procedia PDF Downloads 152
27785 Reduce, Reuse and Recycle: Grand Challenges in Construction Recovery Process
Authors: Abioye A. Oyenuga, Rao Bhamidiarri
Abstract:
Running a successful Construction and Demolition Waste (C&DW) recycling operation anywhere in the world is a challenge today, predominantly because secondary materials markets are yet to be integrated. Reducing, reusing, and recycling of C&DW have been employed over the years, and various techniques have been investigated. However, the economic and environmental viability of their application seems limited. This paper discusses the costs and benefits of using secondary materials and focuses on investigating the reuse and recycling process for five major types of construction materials: concrete, metal, wood, cardboard/paper, and plasterboard. Data obtained from demolition specialists and contractors are considered and evaluated. With this data source, the research found that the construction material recovery process fully incorporates the 3R's process and shows how energy recovery by means of the 3R's principles can be evaluated. This scrutiny leads to the identification of grand challenges in the construction material recovery process. Recommendations to deepen the material recovery process are also discussed.
Keywords: construction and demolition waste (C&DW), 3R concept, recycling, reuse, waste management, UK
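A first-pass comparison across the five material streams often starts from a diversion (recovery) rate: the share of C&DW kept out of landfill through reuse and recycling. The recovery fractions and tonnages below are assumed values for illustration, not figures from the paper.

```python
# Illustrative sketch of a C&DW diversion-rate calculation across the five
# material streams discussed; all fractions and tonnages are assumptions.

RECOVERY_FRACTION = {"concrete": 0.80, "metal": 0.95, "wood": 0.50,
                     "cardboard": 0.70, "plasterboard": 0.40}

def diversion_rate(tonnes):
    """Share of C&DW diverted from landfill via reuse/recycling."""
    recovered = sum(RECOVERY_FRACTION[m] * t for m, t in tonnes.items())
    return recovered / sum(tonnes.values())

# A hypothetical demolition site's waste arisings (tonnes):
site = {"concrete": 500.0, "metal": 60.0, "wood": 80.0,
        "cardboard": 20.0, "plasterboard": 40.0}
print(round(diversion_rate(site), 3))  # 0.753
```

Weighting each stream by its recovery cost and secondary-material value in the same loop would turn this into the cost-benefit comparison the paper discusses.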
Procedia PDF Downloads 428
27784 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme
Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara
Abstract:
This paper describes the problem of building secure computational services for encrypted information in the Cloud, that is, computing without decrypting the encrypted data. It thereby meets the aspiration of a computational encryption model that could enhance the security of big data with respect to the privacy or confidentiality, availability, and integrity of the data and the user's security. The cryptographic model applied for the computational processing of the encrypted data is the fully homomorphic encryption scheme. We contribute a theoretical presentation of high-level computational processes that are based on number theory derivable from abstract algebra, which can easily be integrated and leveraged in the cloud computing interface, with detailed theoretic mathematical connections to the fully homomorphic encryption models. This contribution supports the full implementation of big data analytics based on a cryptographic security algorithm.
Keywords: big data analytics, security, privacy, bootstrapping, fully homomorphic encryption scheme
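Fully homomorphic schemes support arbitrary computation on ciphertexts; as a much weaker but concrete illustration of "computing without decrypting", textbook RSA is multiplicatively homomorphic: the product of two ciphertexts decrypts to the product of the plaintexts. This sketch is not the paper's scheme, and the toy key sizes and unpadded RSA below must never be used in practice.

```python
# Toy demonstration of a homomorphic property (multiplicative, not fully
# homomorphic): Enc(m1) * Enc(m2) mod n decrypts to m1 * m2.

n, e, d = 3233, 17, 2753   # textbook toy key: p = 61, q = 53

def enc(m):
    return pow(m, e, n)    # RSA encryption: m^e mod n

def dec(c):
    return pow(c, d, n)    # RSA decryption: c^d mod n

m1, m2 = 7, 11
c = (enc(m1) * enc(m2)) % n    # computation done on ciphertexts only
print(dec(c))                  # 77 == m1 * m2, never decrypted en route
```

A fully homomorphic scheme extends this idea to both addition and multiplication (and hence arbitrary circuits), with bootstrapping, as listed in the paper's keywords, used to refresh ciphertext noise between operations.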
Procedia PDF Downloads 480
27783 In vitro Antioxidant Scavenging of Root Fraction of Bryonia dioica
Authors: Yamani Amal, Lazaae Jamila, Elachouri Mostafa
Abstract:
Plants and their active agents, especially polyphenols, may have a principal role in the treatment of diseases that result from defects in physiological antioxidant mechanisms. Bryonia dioica is well known in Moroccan traditional medicine for alleviating pain and treating many diseases. We have focused on plants belonging to the Cucurbitaceae family from around the world to understand their therapeutic uses and their potential antioxidant activities. Although several biological activities and the chemical composition of Bryonia dioica are well characterized, no direct in vitro study of this natural product has examined the antioxidant effect of the extract from the roots of Bryonia dioica. The aim of this study was to determine the in vitro antioxidant activity of the B. dioica root, using antioxidant analysis methods based on the determination of hydroxyl radical scavenging, 1,1-diphenyl-2-picrylhydrazine (DPPH) radical scavenging, hydrogen peroxide scavenging, and nitric oxide scavenging. In this study, it was demonstrated that B. dioica root extract showed excellent antioxidant properties. This investigation showed that the roots of this plant contain potent natural scavengers. The root may represent an interesting source of antioxidant phenolics that may favour the extension of its cultivation as a new source of natural antioxidants, in addition to containing high-quality proteins for human or animal nutrition. Therefore, there is a need for all stakeholders in Morocco to strive towards taking advantage of our enormous biodiversity resources to free our people from disease, abject poverty, and stagnation.
Keywords: Morocco, Bryonia dioica, in vitro, antioxidant
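All four scavenging assays listed above are conventionally reported with the same calculation: percent inhibition from the control and sample absorbances. A minimal sketch, with illustrative absorbance values rather than the study's measurements:

```python
# Standard percent-inhibition calculation used in DPPH and similar
# radical scavenging assays; absorbance values are illustrative.

def percent_scavenging(a_control, a_sample):
    """% scavenging = (A_control - A_sample) / A_control * 100."""
    return (a_control - a_sample) / a_control * 100.0

# Hypothetical absorbances at 517 nm (the usual DPPH wavelength):
print(round(percent_scavenging(0.80, 0.32), 1))  # 60.0
```

Repeating this over a series of extract concentrations yields the dose-response curve from which an IC50 (the concentration scavenging 50% of the radical) is typically derived.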
Procedia PDF Downloads 384