Search results for: geospatial data science
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26074

24844 How Western Donors Allocate Official Development Assistance: New Evidence From a Natural Language Processing Approach

Authors: Daniel Benson, Yundan Gong, Hannah Kirk

Abstract:

Advances in natural language processing have increased data processing speeds and reduced the need for the cumbersome manual work often required when processing data from multilateral organizations for specific purposes. Using named entity recognition (NER) modeling and the Organisation for Economic Co-operation and Development (OECD) Creditor Reporting System database, we present the first geotagged dataset of OECD donor Official Development Assistance (ODA) projects on a global, subnational basis. Our resulting data contain 52,086 ODA projects geocoded to subnational locations across 115 countries, worth a combined $87.9bn. This represents the first global OECD donor ODA project database with geocoded projects. We use this new data to revisit old questions of how ‘well’ donors allocate ODA to the developing world. This understanding is imperative for policymakers seeking to improve ODA effectiveness.
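
A minimal sketch of the kind of NER-based geotagging pipeline the abstract describes, assuming spaCy's small English model and the public Nominatim geocoder as illustrative stand-ins for whatever models and gazetteers the authors actually used:

```python
# Hypothetical pipeline: pull place names out of an ODA project
# description with NER, then resolve them to coordinates.
import spacy
from geopy.geocoders import Nominatim

nlp = spacy.load("en_core_web_sm")                    # stand-in NER model
geocoder = Nominatim(user_agent="oda-geotagger-demo")

def geotag(description):
    """Return (place, lat, lon) tuples found in a project description."""
    results = []
    for ent in nlp(description).ents:
        if ent.label_ in ("GPE", "LOC"):              # geopolitical / location entities
            match = geocoder.geocode(ent.text)
            if match:
                results.append((ent.text, match.latitude, match.longitude))
    return results

print(geotag("Rural water supply rehabilitation in Mwanza, Tanzania"))
```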

Keywords: international aid, geocoding, subnational data, natural language processing, machine learning

Procedia PDF Downloads 52
24843 Teaching Practices for Subverting Significant Retentive Learner Errors in Arithmetic

Authors: Michael Lousis

Abstract:

This study systematically identified the most conspicuous and significant errors made by learners over three years of testing their progress in learning Arithmetic during the development of the Kassel Project in England and Greece. It also shows how retentive these errors were over the three years of officially provided school instruction in Arithmetic in these countries. The learners’ errors in Arithmetic stemmed from a sample of two hundred (200) English students and one hundred and fifty (150) Greek students. The sample was purposefully selected according to the students’ participation in each testing session of the three-year project, in both Arithmetic and Algebra simultaneously. Specific teaching practices are presented in this study for subverting those learners’ errors found to be retentive at the level of the nationally provided mathematical education of each country. The invention and development of these proposed teaching practices were founded on the rationality of the theoretical accounts concerning the explanation, prediction and control of the errors, on the conceptual metaphor, and on an analysis that tried to identify the required cognitive components and skills of the specific tasks, in terms of Psychology and Cognitive Science as applied to information processing. The aim of implementing these instructional practices is not only the subversion of these errors but the achievement of mathematical competence, defined as consisting of three elements: appropriate representations, appropriate meaning, and appropriately developed schemata. However, praxis is of paramount importance, because there is no ‘real truth’ independent of science, and because praxis serves as quality control when it takes the form of a cognitive method.

Keywords: arithmetic, cognitive science, cognitive psychology, information-processing paradigm, Kassel project, level of the nationally provided mathematical education, praxis, remedial mathematical teaching practices, retentiveness of errors

Procedia PDF Downloads 299
24842 Compressed Suffix Arrays to Self-Indexes Based on Partitioned Elias-Fano

Authors: Guo Wenyu, Qu Youli

Abstract:

A practical and simple self-indexing data structure, Partitioned Elias-Fano (PEF) Compressed Suffix Arrays (CSA), is built in linear time for a CSA based on PEF indexes. The PEF-CSA is compared with two classical compressed indexing methods, the Ferragina and Manzini implementation (FMI) and Sad-CSA, on files of different types and sizes from the Pizza & Chili corpus. The PEF-CSA performs better on the existing data in terms of compression ratio and count and locate times, except for evenly distributed data such as protein data. The experiments show that the distribution of φ matters more to the compression ratio than the alphabet size: unevenly distributed φ values yield better compression, and the larger the hit count, the longer the count and locate times.
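
For illustration, a toy sketch of plain (non-partitioned) Elias-Fano coding, the building block behind PEF; the real PEF-CSA additionally partitions the sequence and tunes each chunk's encoding, which this sketch omits:

```python
# Toy Elias-Fano coder for a sorted integer sequence: each value is
# split into explicit low bits and a unary-coded high part.
from math import floor, log2

def ef_encode(seq, universe):
    n = len(seq)
    l = max(0, floor(log2(universe / n)))      # width of explicit low bits
    low = [x & ((1 << l) - 1) for x in seq]
    high, prev = [], 0
    for x in seq:
        h = x >> l
        high.extend([0] * (h - prev) + [1])    # gap in zeroes, then a one
        prev = h
    return l, low, high

def ef_access(l, low, high, i):
    """Recover the i-th element (0-based) from the encoding."""
    ones, h = -1, 0
    for bit in high:
        if bit == 1:
            ones += 1
            if ones == i:
                break
        else:
            h += 1
    return (h << l) | low[i]

l, low, high = ef_encode([3, 4, 7, 13, 14, 15, 21, 43], universe=45)
print([ef_access(l, low, high, i) for i in range(8)])   # round-trips the input
```

Partitioning lets each chunk pick the cheapest local representation for its density, which is where PEF's gains over plain Elias-Fano come from.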

Keywords: compressed suffix array, self-indexing, partitioned Elias-Fano, PEF-CSA

Procedia PDF Downloads 234
24841 Data, Digital Identity and Antitrust Law: An Exploratory Study of Facebook’s Novi Digital Wallet

Authors: Wanjiku Karanja

Abstract:

Facebook has monopoly power in the social networking market. It has grown and entrenched its monopoly power through the capture of its users’ data value chains. However, antitrust law’s consumer welfare roots have prevented it from effectively addressing the role of data capture in Facebook’s market dominance. These regulatory blind spots are augmented in Facebook’s proposed Diem cryptocurrency project and its Novi digital wallet. Novi, which is Diem’s digital identity component, shall enable Facebook to collect an unprecedented volume of consumer data. Consequently, Novi has seismic implications for internet identity, as the network effects of Facebook’s large user base could establish it as the de facto internet identity layer. Moreover, the large tracts of data Facebook shall collect through Novi shall further entrench Facebook's market power. As such, the attendant lock-in effects of this project shall be very difficult to reverse. Urgent regulatory action is therefore required to prevent this expansion of Facebook’s data resources and monopoly power. This research thus highlights the importance of data capture to competition and market health in the social networking industry. It utilizes interviews with key experts to empirically interrogate the impact of Facebook’s data capture and control of its users’ data value chains on its market power. This inquiry is contextualized against Novi’s expansive effect on Facebook’s data value chains. It thus addresses the novel antitrust issues arising at the nexus of Facebook’s monopoly power and the privacy of its users’ data. It also explores the impact of platform design principles, specifically data portability and data interoperability, in mitigating Facebook’s anti-competitive practices. As such, this study finds that Facebook is a powerful monopoly that dominates the social media industry to the detriment of potential competitors. Facebook derives its power from its size, annexation of the consumer data value chain, and control of its users’ social graphs. Additionally, the platform design principles of data interoperability and data portability are not a panacea for restoring competition in the social networking market. Their success depends on the establishment of robust technical standards and regulatory frameworks.

Keywords: antitrust law, data protection law, data portability, data interoperability, digital identity, Facebook

Procedia PDF Downloads 107
24840 Creating an Impact through Environmental Law and Policy with a Focus on Environmental Science Restoration with Social Impacts

Authors: Lauren Beth Birney

Abstract:

BOP-CCERS is a consortium of scientists, K-16 New York City students, faculty, academicians, teachers, stakeholders, STEM industry professionals, CBOs, NPOs, citizen scientists, and local businesses working in partnership to restore New York Harbor’s oyster populations while at the same time helping to provide clean water in New York Harbor. BOP-CCERS gives students an opportunity to learn hands-on about environmental stewardship as well as environmental law and policy by giving them real responsibility. This REU allows the BOP-CCERS Project to broaden its parameters into environmental law and policy, where further change can be effected. Creating opportunities for undergraduates to work collaboratively with graduate students in law and policy, and to envision themselves in STEM careers in the field of law, continues to be of importance to this project. More importantly, creating opportunities for underrepresented students to pursue careers in STEM education has been a goal of the project over the last ten years. By raising the level of student interest in community-based citizen science integrated into environmental law and policy, a more diversified workforce will be fostered through the momentum of this dynamic program. The continuing climate crisis facing our planet calls for 21st-century skill development that includes learning and innovation skills derived from critical thinking, which will help REU students address the issues of climate change facing our planet. The demand for a climate-friendly workforce will continue to be met through this community-based citizen science effort. Environmental laws and policies play a crucial role in protecting humans, animals, resources, and habitats. Without these laws, there would be no regulations concerning pollution or contamination of our waterways. Environmental law serves as a mechanism to protect the land, air, water, and soil of our planet. To protect the environment, it is crucial that future policymakers and legal experts both understand and value the importance of environmental protection. The Environmental Law and Policy REU provides students with the opportunity to learn, through hands-on work, the skills and knowledge needed to help foster a legal workforce centered around environmental protection, while participating alongside the BOP-CCERS researchers in order to gain research experience. Broadening this area to law and policy will further increase these opportunities and permit students to ultimately influence larger-scale change on a global level while further diversifying the STEM workforce. Students’ findings will be shared at the annual STEM Institute at Pace University in August 2022. Basic research methodologies include qualitative and quantitative analysis performed by the research team. Early findings indicate that giving students an opportunity to experience, explore, and participate in environmental science programs such as these enhances their interest in pursuing STEM careers in law and policy, with the focus being on providing opportunities for underserved, marginalized, and underrepresented populations.

Keywords: environmental restoration science, citizen science, environmental law and policy, STEM education

Procedia PDF Downloads 86
24839 Data Quality Enhancement with String Length Distribution

Authors: Qi Xiu, Hiromu Hota, Yohsuke Ishii, Takuya Oda

Abstract:

Recently, the amount of collectable manufacturing data has been increasing rapidly. At the same time, large-scale recalls are becoming a serious social problem. Under such circumstances, there is a growing need to prevent such recalls through defect analysis, such as root cause analysis and anomaly detection, utilizing manufacturing data. However, the time needed to classify the strings in manufacturing data by traditional methods is too long to meet the requirements of quick defect analysis. We therefore present the String Length Distribution Classification method (SLDC) to classify strings correctly in a short time. This method learns character features, especially string length distributions, from Product IDs and Machine IDs in BOMs and asset lists. By applying the proposal to strings in actual manufacturing data, we verified that the classification time can be reduced by 80%. As a result, we estimate that the requirement of quick defect analysis can be fulfilled.
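
As a rough illustration of the idea (not the authors' implementation), one can profile each known field type by its string-length histogram and assign new strings to the closest profile:

```python
# Hypothetical sketch in the spirit of SLDC: learn a string-length
# histogram per field type, then classify a batch of strings by
# distance to the nearest histogram.
from collections import Counter

def length_dist(strings):
    counts = Counter(len(s) for s in strings)
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def distance(p, q, eps=1e-9):
    # symmetric chi-square-style distance between two histograms
    keys = set(p) | set(q)
    return sum((p.get(k, 0) - q.get(k, 0)) ** 2 /
               (p.get(k, 0) + q.get(k, 0) + eps) for k in keys)

profiles = {
    "product_id": length_dist(["A-1021", "B-8830", "C-0042"]),
    "machine_id": length_dist(["M01", "M17", "M09"]),
}

def classify(strings):
    d = length_dist(strings)
    return min(profiles, key=lambda name: distance(d, profiles[name]))

print(classify(["D-5510", "E-7301"]))   # -> "product_id"
```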

Keywords: string classification, data quality, feature selection, probability distribution, string length

Procedia PDF Downloads 305
24838 Temporally Coherent 3D Animation Reconstruction from RGB-D Video Data

Authors: Salam Khalifa, Naveed Ahmed

Abstract:

We present a new method to reconstruct a temporally coherent 3D animation from single- or multi-view RGB-D video data using unbiased feature point sampling. Given RGB-D video data in the form of a 3D point cloud sequence, our method first extracts feature points using both color and depth information. In the subsequent steps, these feature points are used to match two 3D point clouds in consecutive frames independent of their resolution. Our new motion-vector-based dynamic alignment method then fully reconstructs a spatio-temporally coherent 3D animation. We perform extensive quantitative validation using novel error functions to analyze the results. We show that despite the limiting factors of the temporal and spatial noise associated with RGB-D data, it is possible to faithfully reconstruct a temporally coherent 3D animation from RGB-D video data.
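
A simplified sketch of the first stage, using OpenCV's ORB detector as a stand-in for the authors' color-plus-depth feature extractor:

```python
# Illustrative sketch: detect feature points in the color channel,
# lift them to 3D with the depth map, and match them across
# consecutive frames independently of the frames' resolutions.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def features_3d(color, depth):
    kps, desc = orb.detectAndCompute(color, None)
    pts = []
    for kp in kps:
        u, v = int(kp.pt[0]), int(kp.pt[1])
        pts.append((u, v, float(depth[v, u])))   # pixel position + depth value
    return np.array(pts), desc

def match_frames(frame_a, frame_b):
    pts_a, desc_a = features_3d(*frame_a)
    pts_b, desc_b = features_3d(*frame_b)
    matches = matcher.match(desc_a, desc_b)
    # each match pairs a 3D point in frame A with one in frame B
    return [(pts_a[m.queryIdx], pts_b[m.trainIdx]) for m in matches]
```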

Keywords: 3D video, 3D animation, RGB-D video, temporally coherent 3D animation

Procedia PDF Downloads 354
24837 Determining Abnormal Behaviors in UAV Robots for Trajectory Control in Teleoperation

Authors: Kiwon Yeom

Abstract:

Change points are abrupt variations in a data sequence. Detection of change points is useful in modeling, analyzing, and predicting time series in application areas such as robotics and teleoperation. In this paper, a change point is defined as a discontinuity in one of the trajectory's derivatives. This paper presents a reliable method for detecting discontinuities within three-dimensional trajectory data. The problem of determining one or more discontinuities is considered for both regular and irregular trajectory data from teleoperation. We examine the geometric detection algorithm and illustrate the use of the method on real data examples.
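
A minimal sketch of this definition in practice: numerically differentiate the trajectory and flag samples where the chosen derivative jumps. The derivative order and threshold below are illustrative, not the paper's:

```python
# Flag change points as large jumps in a numerical derivative
# of a uniformly sampled 3D trajectory.
import numpy as np

def change_points(traj, order=1, thresh=3.0):
    """traj: (n, 3) array of x, y, z samples at uniform time steps."""
    d = traj.astype(float)
    for _ in range(order):
        d = np.gradient(d, axis=0)            # successive derivatives
    jump = np.linalg.norm(np.diff(d, axis=0), axis=1)
    sigma = np.median(jump) + 1e-12           # robust scale estimate
    return np.where(jump > thresh * sigma)[0] + 1

t = np.linspace(0, 1, 200)
path = np.stack([t, t ** 2, np.zeros_like(t)], axis=1)
path[120:] += [0.0, 0.5, 0.0]                 # inject a discontinuity
print(change_points(path, order=1))           # reports indices near 120
```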

Keywords: change point, discontinuity, teleoperation, abrupt variation

Procedia PDF Downloads 147
24836 Multidimensional Item Response Theory Models for Practical Application in Large Tests Designed to Measure Multiple Constructs

Authors: Maria Fernanda Ordoñez Martinez, Alvaro Mauricio Montenegro

Abstract:

This work presents a statistical methodology for measuring and identifying constructs in latent semantic analysis. The approach combines the qualities of factor analysis on binary data with the interpretations available from Item Response Theory. More precisely, we propose first reducing dimensionality with principal component analysis applied to the linguistic data, and then producing group axes from a clustering analysis of the semantic data. This approach allows the user to give meaning to the resulting clusters and to uncover the real latent structure present in the data. The methodology is applied to a set of real semantic data, with impressive results in coherence, speed, and precision.
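
A hypothetical sketch of the pipeline on synthetic binary responses, using a truncated SVD and k-means in place of the authors' exact procedure:

```python
# Reduce binary response data PCA-style, cluster the items,
# and treat each item cluster as a candidate latent construct.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = (rng.random((500, 40)) > 0.5).astype(float)    # 500 examinees x 40 items

svd = TruncatedSVD(n_components=5, random_state=0)
item_loadings = svd.fit(X).components_.T           # one row per item

clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(item_loadings)
for c in range(5):
    print(f"candidate construct {c}:", np.where(clusters == c)[0])
```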

Keywords: semantic analysis, factorial analysis, dimension reduction, penalized logistic regression

Procedia PDF Downloads 423
24835 Enhanced Automated Teller Machine Using Short Message Service Authentication Verification

Authors: Rasheed Gbenga Jimoh, Akinbowale Nathaniel Babatunde

Abstract:

The use of the Automated Teller Machine (ATM) has become an important tool among commercial banks, and customers have come to depend on and trust ATMs to conveniently meet their banking needs. Although the overwhelming advantages of ATMs cannot be over-emphasized, their alarming fraud rate has become a bottleneck to full adoption in Nigeria. This study examined the menace of ATM fraud in society and the cost of running ATM services by banks in the country. The researchers developed a prototype of an enhanced Automated Teller Machine authentication using Short Message Service (SMS) verification. The developed prototype was tested by ten (10) respondents who are users of ATM cards in the country, and the data collected were analyzed using the Statistical Package for the Social Sciences (SPSS). Based on the results of the analysis, it is envisaged that the developed prototype will go a long way in reducing the alarming rate of ATM fraud in Nigeria.
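
For illustration, a minimal sketch of the SMS verification step (not the authors' prototype): the server issues a short-lived one-time code, sends it to the registered phone, and later compares digests rather than raw codes:

```python
# Toy SMS one-time-code flow: issue a 6-digit code valid for two
# minutes, store only its HMAC digest, verify in constant time.
import hmac
import secrets
import time

SERVER_KEY = b"server-key"          # placeholder; a real system uses a managed secret

def issue_otp():
    code = f"{secrets.randbelow(10**6):06d}"     # code sent to the phone via SMS
    record = {"digest": hmac.new(SERVER_KEY, code.encode(), "sha256").digest(),
              "expires": time.time() + 120}
    return code, record

def verify_otp(entered, record):
    if time.time() > record["expires"]:
        return False
    digest = hmac.new(SERVER_KEY, entered.encode(), "sha256").digest()
    return hmac.compare_digest(digest, record["digest"])

code, record = issue_otp()          # in production the code goes out over SMS
print(verify_otp(code, record))     # True
print(verify_otp("000000", record)) # False (unless astronomically lucky)
```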

Keywords: ATM, ATM fraud, e-banking, prototyping

Procedia PDF Downloads 290
24834 Analysis of Production Forecasting in Unconventional Gas Resources Development Using Machine Learning and Data-Driven Approach

Authors: Dongkwon Han, Sangho Kim, Sunil Kwon

Abstract:

Unconventional gas resources have dramatically changed the future energy landscape. Unlike conventional gas resources, a key challenge in unconventional gas has been the need for advanced production-forecasting approaches, owing to the uncertainty and complexity of fluid flow. In this study, an artificial neural network (ANN) model integrating machine learning and a data-driven approach was developed to predict productivity in shale gas. A database of 129 wells from the Eagle Ford shale basin was used for training and testing the ANN model. Input data related to hydraulic fracturing, well completion and shale gas productivity were selected, and the output is cumulative production. The performance of the ANN using all data sets, clustering, and variable importance (VI) models was compared in terms of mean absolute percentage error (MAPE). The MAPE values obtained were 44.22% (all data sets), 10.08% (cluster 1), 5.26% (cluster 2), 6.35% (cluster 3), 32.23% (ANN VI), and 23.19% (SVM VI). The results showed that the pre-trained ANN model provides more accurate results than the ANN model using all data sets.
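
A toy stand-in for the cluster-then-train workflow on synthetic data (the 129-well Eagle Ford dataset is not public, and the feature names below are assumptions):

```python
# Cluster wells first, then fit one small neural network per cluster,
# and report in-sample MAPE as the abstract's comparison metric does.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.random((129, 4))      # e.g. lateral length, stages, proppant, fluid (assumed)
y = X @ [2.0, 1.5, 3.0, 0.5] + rng.normal(0, 0.1, 129)   # cumulative-production proxy

X_s = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_s)

models = {c: MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
              .fit(X_s[labels == c], y[labels == c])
          for c in range(3)}

pred = np.concatenate([models[c].predict(X_s[labels == c]) for c in range(3)])
truth = np.concatenate([y[labels == c] for c in range(3)])
mape = np.mean(np.abs((truth - pred) / truth)) * 100
print(f"in-sample MAPE: {mape:.1f}%")
```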

Keywords: unconventional gas, artificial neural network, machine learning, clustering, variables importance

Procedia PDF Downloads 178
24833 Procedure Model for Data-Driven Decision Support Regarding the Integration of Renewable Energies into Industrial Energy Management

Authors: M. Graus, K. Westhoff, X. Xu

Abstract:

Climate change causes change in all aspects of society. While the expansion of renewable energies proceeds, general studies about the potential of demand-side management have not convinced industry to reinforce smart-grid considerations in its operational business. In this article, a procedure model for case-specific, data-driven decision support for industrial energy management, based on a holistic data analytics approach, is presented. The model is demonstrated on the strategic decision problem of integrating renewable energies into industrial energy management. This question arises from considerations of changing the electricity contract model from a standard rate to volatile energy prices corresponding to the energy spot market, which is increasingly affected by renewable energies. The procedure model corresponds to a data analytics process consisting of data modeling, analysis, simulation and optimization steps. This procedure helps to quantify the potential of sustainable production concepts based on data from a factory. The model is validated with data from a printer, in analogy to a simple production machine. The overall goal is to establish smart-grid principles in industry via the transformation from knowledge-driven to data-driven decisions within manufacturing companies.
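
The optimization step can be pictured as a small linear program: schedule a flexible load over 24 hours to minimize cost under volatile spot prices. All numbers below are invented for illustration:

```python
# Toy optimization step: place 60 MWh of flexible production into the
# cheapest hours of a volatile spot-price day, within machine capacity.
import numpy as np
from scipy.optimize import linprog

prices = 40 + 25 * np.sin(np.linspace(0, 2 * np.pi, 24))   # EUR/MWh, hourly
demand_total = 60.0                                         # MWh to produce today
max_per_hour = 5.0                                          # machine capacity

# minimize prices . x  subject to  sum(x) = demand_total, 0 <= x <= max_per_hour
res = linprog(c=prices,
              A_eq=np.ones((1, 24)), b_eq=[demand_total],
              bounds=[(0, max_per_hour)] * 24)

print("feasible:", res.success)
print("hours used:", np.nonzero(res.x > 1e-6)[0])
print(f"cost: {res.fun:.0f} EUR vs flat schedule: {prices.mean() * demand_total:.0f} EUR")
```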

Keywords: data analytics, green production, industrial energy management, optimization, renewable energies, simulation

Procedia PDF Downloads 420
24832 Dissimilarity-Based Coloring for Symbolic and Multivariate Data Visualization

Authors: K. Umbleja, M. Ichino, H. Yaguchi

Abstract:

In this paper, we propose a coloring method for multivariate data visualization using parallel coordinates, based on dissimilarity and tree structure information gathered during hierarchical clustering. The proposed method is an extension of proximity-based coloring, which suffers from a few undesired side effects if the hierarchical tree structure is not a balanced tree. We describe the algorithm for assigning colors based on dissimilarity information, show the application of the proposed method on three commonly used datasets, and compare the results with proximity-based coloring. We found our proposed method to be especially beneficial for symbolic data visualization, where many individual objects have already been aggregated into a single symbolic object.
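
An illustrative reduction of the idea: cluster hierarchically, then spread hues along the dendrogram's leaf order so similar objects receive similar colors. The authors' dissimilarity-weighted assignment is more refined than this uniform spacing:

```python
# Assign parallel-coordinates line colors from hierarchical-clustering
# leaf order, so nearby leaves get nearby hues.
import numpy as np
from scipy.cluster.hierarchy import linkage, leaves_list
from matplotlib import cm

rng = np.random.default_rng(2)
data = rng.random((30, 5))                      # 30 objects, 5 variables

Z = linkage(data, method="average")             # hierarchical clustering
order = leaves_list(Z)                          # dendrogram leaf order

colors = np.empty((len(data), 4))
for rank, idx in enumerate(order):
    colors[idx] = cm.viridis(rank / (len(order) - 1))

# colors[i] is now the RGBA line color of object i
# in a parallel-coordinates plot.
print(colors[:3])
```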

Keywords: data visualization, dissimilarity-based coloring, proximity-based coloring, symbolic data

Procedia PDF Downloads 152
24831 Spatial Assessment of Creek Habitats of Marine Fish Stock in Sindh Province

Authors: Syed Jamil H. Kazmi, Faiza Sarwar

Abstract:

The Indus delta of Sindh Province forms the largest creeks zone of Pakistan. The Sindh coast starts from the mouth of the Hab River and terminates at the Sir Creek area. In this paper, we have considered the major creeks from the site of Bin Qasim Port in Karachi to the jetty of Keti Bunder in Thatta District. A general decline in the mangrove forest has been observed over the span of the last 25 years. Unprecedented human interventions have badly damaged the creek habitats; these include haphazard urban development, industrial and sewage disposal, illegal cutting of mangrove forests, and reduced, inconsistent freshwater flow, mainly from the Jhang and Indus rivers. These activities not only harm the creek habitats but have affected fish stocks substantially. Fishing is the main livelihood of coastal people, but alongside the above-mentioned threats it is also under enormous pressure from fish catches, resulting in unchecked overutilization of fish resources. This pressure becomes almost unbearable when combined with deleterious fishing methods, an uncontrolled fleet size, increasing trash and by-catch of juveniles, and illegal mesh sizes. Along with these anthropogenic interventions, the study area lies in the red zone of tropical cyclones and active seismicity, causing floods and sea intrusion and devastating mangrove forests and fish stocks. In order to sustain the natural resources of the Indus creeks, this study was initiated with the support of FAO, WWF and NIO; the main purpose was to develop a geospatial dataset for fish stock assessment. The study was spread over a year (2013-14) on a monthly basis and mainly included a detailed fish stock survey, water analysis and a few other environmental analyses. The environmental analysis also included a habitat classification of the study area, performed through remote sensing techniques over a 22-year time series (1992-2014). Furthermore, out of 252 species collected, fifteen species from the estuarine and marine groups were short-listed to measure the weight, health and growth of fish species at each creek, under GIS data through the SPSS system. Habitat suitability analysis was then conducted by deriving surface topography and aspect through different GIS techniques. The output variables were overlaid in a GIS system to measure creek productivity, yielding the following classes: extremely productive, highly productive, productive, moderately productive and less productive. This study demonstrates the use of geospatial tools for evaluating fisheries resources and mapping creek habitat risk zones. It has also been identified that geospatial technologies are highly beneficial for identifying areas of high environmental risk in the Sindh creeks. The study clearly showed that creeks with high rugosity are more productive than creeks with low rugosity. The study area has immense potential to boost the economy of Pakistan in terms of fish export if geospatial techniques are implemented instead of conventional techniques.
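
A schematic of the weighted-overlay scoring this kind of GIS productivity analysis typically performs; the layers, weights, and class breaks below are invented for illustration, not the study's calibrated values:

```python
# Weighted-overlay sketch: combine normalized raster layers into a
# suitability score, then bin the score into productivity classes.
import numpy as np

rng = np.random.default_rng(3)
shape = (100, 100)                        # one cell per mapped creek unit
rugosity   = rng.random(shape)            # normalized 0-1 stand-in layers
mangrove   = rng.random(shape)
freshwater = rng.random(shape)

score = 0.5 * rugosity + 0.3 * mangrove + 0.2 * freshwater   # assumed weights

labels = ["less", "moderately", "productive", "highly", "extremely"]
classes = np.digitize(score, bins=[0.2, 0.4, 0.6, 0.8])
for i, name in enumerate(labels):
    print(f"{name:12s} productive cells: {(classes == i).sum()}")
```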

Keywords: fish stock, geo-spatial, productivity analysis, risk

Procedia PDF Downloads 226
24830 The Effects of a Circuit Training Program on Muscle Strength, Agility, Anaerobic Performance and Cardiovascular Endurance

Authors: Wirat Sonchan, Pratoom Moungmee, Anek Sootmongkol

Abstract:

This study aimed to examine the effects of a circuit training program on muscle strength, agility, anaerobic performance and cardiovascular endurance. The study involved 24 freshman (age 18.87 ± 0.68 yr) male students of the Faculty of Sport Science, Burapha University. The sample was randomly divided into two groups: a Circuit Training group (CT; n=12) and a Control group (C; n=12). Baseline data on height, weight, muscle strength (hand grip dynamometer and leg strength dynamometer), agility (agility T-Test), anaerobic performance (Running-based Anaerobic Sprint Test) and cardiovascular endurance (20 m Endurance Shuttle Run Test) were collected. The circuit training program consisted of one circuit of eight stations, with a 30/60-second work/rest interval and two cycles in Weeks 1-4, and a 60/90-second work/rest interval and three cycles in Weeks 5-8, performed three times per week. Data were analyzed using paired t-tests and independent-sample t-tests. The statistical significance level was set at 0.05. The results show that after 8 weeks of the training program, muscle strength, agility, anaerobic capacity and cardiovascular endurance increased significantly in the CT group (p < 0.05), while no significant increase was observed in the C group (p > 0.05). The results of this study suggest that the circuit training program improved the muscle strength, agility, anaerobic capacity and cardiovascular endurance of the study subjects. This program may be used as a guideline for selecting a set of exercises to improve physical fitness.

Keywords: circuit training, physical fitness, cardiovascular endurance, anaerobic performance

Procedia PDF Downloads 482
24829 Developing Structured Sizing Systems for Manufacturing Ready-Made Garments of Indian Females Using Decision Tree-Based Data Mining

Authors: Hina Kausher, Sangita Srivastava

Abstract:

In India, there is a lack of a standard, systematic sizing approach for producing ready-made garments. Garment manufacturing companies use their own size tables, created by modifying international sizing charts for ready-made garments. The purpose of this study is to tabulate anthropometric data covering the variety of figure proportions in both height and girth. Data on 3,000 subjects were collected through an anthropometric survey of females between the ages of 16 and 80 years from several states of India, in order to produce a sizing system suitable for clothing manufacture and retailing. These data were used for the statistical analysis of body measurements, the formulation of sizing systems, and body measurement tables. The factor analysis technique was used to filter the control body dimensions from a large number of variables. Decision-tree-based data mining was then used to cluster the data. A standard, structured sizing system can facilitate pattern grading and garment production; moreover, it can improve buying ratios and upgrade size allocations to retail segments.
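
A hypothetical sketch of the final step: once factor analysis has chosen the control dimensions, a decision tree partitions subjects into size groups. The measurements and size labels below are invented:

```python
# Fit a shallow decision tree that splits synthetic height/bust
# measurements into size groups, then print the learned rules.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(4)
height = rng.normal(158, 7, 300)          # cm (synthetic)
bust   = rng.normal(88, 8, 300)           # cm (synthetic)
X = np.column_stack([height, bust])
y = np.select([bust < 84, bust < 92], ["S", "M"], default="L")   # toy labels

tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(tree, feature_names=["height_cm", "bust_cm"]))
```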

Keywords: anthropometric data, data mining, decision tree, garments manufacturing, sizing systems, ready-made garments

Procedia PDF Downloads 119
24828 Introducing Design Principles for Clinical Decision Support Systems

Authors: Luca Martignoni

Abstract:

The increasing usage of clinical decision support systems in healthcare and the demand for software that enables doctors to make informed decisions are changing everyday clinical practice. However, as technology advances, not only are the benefits of technology growing, but so are the potential risks. A growing danger is doctors’ over-reliance on the decision proposed by the clinical decision support system, leading towards deskilling and rash decisions by doctors. In that regard, identifying doctors' requirements for software and developing approaches to prevent technological over-reliance are of utmost importance. In this paper, we report the results of a design science research study focusing on the requirements and design principles of ultrasound software. We conducted a total of 15 interviews with experts about potential ultrasound software functions. Subsequently, we developed meta-requirements and design principles for designing future clinical decision support systems efficiently and as free from the occurrence of technological over-reliance as possible.

Keywords: clinical decision support systems, technological over-reliance, design principles, design science research

Procedia PDF Downloads 84
24827 Facility Data Model as Integration and Interoperability Platform

Authors: Nikola Tomasevic, Marko Batic, Sanja Vranes

Abstract:

Emerging Semantic Web technologies can be seen as the next step in the evolution of intelligent facility management systems. In particular, this entails increased usage of open-source and/or standardized concepts for data classification and semantic interpretation. To deliver such facility management systems, providing a comprehensive integration and interoperability platform in the form of a facility data model is a prerequisite. In this paper, one possible modelling approach for providing such an integrative facility data model, based on the ontology modelling concept, is presented. The complete ontology development process is described, starting from input data acquisition, through ontology concept definition, and finally ontology concept population. At the beginning, a core facility ontology was developed, representing the generic facility infrastructure comprised of the common facility concepts relevant from the facility management perspective. To develop the data model of a specific facility infrastructure, the core facility ontology was first extended and then populated. For the development of the full-blown facility data models, Malpensa and Fiumicino airports in Italy, two major European air-traffic hubs, were chosen as a test-bed platform. Furthermore, the way these ontology models supported the integration and interoperability of the overall airport energy management system was analyzed as well.
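
A minimal rdflib sketch of the three-step approach (core ontology, extension, population); the class and property names are illustrative, not the project's actual ontology:

```python
# Build a tiny facility ontology: generic core concepts, an
# airport-specific extension, and a populated instance.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

FAC = Namespace("http://example.org/facility#")   # assumed namespace
g = Graph()
g.bind("fac", FAC)

# core facility ontology: generic concepts
g.add((FAC.Facility, RDF.type, RDFS.Class))
g.add((FAC.EnergyMeter, RDF.type, RDFS.Class))

# extension step: specialize the core for airports
g.add((FAC.Airport, RDFS.subClassOf, FAC.Facility))

# population step: a concrete airport and one of its meters
g.add((FAC.Malpensa, RDF.type, FAC.Airport))
g.add((FAC.meter42, RDF.type, FAC.EnergyMeter))
g.add((FAC.meter42, FAC.installedIn, FAC.Malpensa))
g.add((FAC.meter42, RDFS.label, Literal("Terminal 1 main feeder")))

print(g.serialize(format="turtle"))
```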

Keywords: airport ontology, energy management, facility data model, ontology modeling

Procedia PDF Downloads 426
24826 'Wandering Uterus': An Analogy of Perception of Women in Hippocratic Corpus and Post-Modern Times

Authors: Ankita Sharma

Abstract:

The study proposes to review the perception of women in the Classical Age (500-336 BC), when Greek philosophy was in bloom. It was observed that women had very few rights and were under the control of men. One possible reason for this exclusion was a woman’s biology, which had a huge influence on her being seen as inferior to men. The text of the Hippocratic Corpus focuses on the biological construct of the female body in classical Greek science, which perpetuated the idea of women as second-class citizens considered inherently weaker than men. The research highlights the significance of the text, which was used to encourage women of that time to get married and produce children, and how the perception remains the same even today. The Greek belief in the need for confinement and control of the 'wandering uterus' contributed to the perception of men as superior. The pivotal emphasis of this research is on women and their bodies, which were depicted in a misogynistic way that paved the way for Hippocratic writers to influence society’s attitude towards women in their writings. It is intended to draw attention to the prevailing cultural assumptions and preconceived notions about female anatomy that had a pervasive influence in the following centuries, with their roots in ancient science.

Keywords: classical Greek theory, women, wandering womb, modern ideology

Procedia PDF Downloads 175
24825 Measuring Technology of Airship Propeller Thrust and Torque in China Academy of Aerospace Aerodynamics

Authors: Ma Hongqiang, Yang Hui, Wen Haoju, Feng Jiabo, Bi Zhixian, Nie Ying

Abstract:

In order to measure the thrust and torque of an airship propeller, a two-component balance and data acquisition system was developed at the China Academy of Aerospace Aerodynamics (CAAA) at an early stage. During the development, some problems were encountered. First, the measuring system and its protective parts increased the weight of the whole system significantly. Second, more parts might induce more failures, so the reliability of the system was decreased. In addition, the rigidity of the system was lowered, and the structure became more prone to vibration. Therefore, CAAA and the Academy of Opto-Electronics, Chinese Academy of Sciences (AOECAS) developed a new technology: the propeller supporting rack is used as a spring element, strain gages are attached onto it, and the whole is summed up as a generalized balance. New mathematical models, new calibration methods and new load determination methods were also developed.
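
The calibration idea behind such a strain-gage balance can be sketched as a least-squares fit of a linear sensitivity matrix; all numbers below are invented for illustration:

```python
# Apply known thrust/torque loads, record gauge outputs, fit the
# sensitivity matrix by least squares, then invert it in use.
import numpy as np

rng = np.random.default_rng(5)

# 20 calibration points: known (thrust N, torque N*m) loads applied
loads = rng.uniform([0, 0], [500, 50], size=(20, 2))

true_C = np.array([[2.0e-3, 1.0e-4],      # gauge response per unit load (invented)
                   [5.0e-5, 3.0e-3]])
outputs = loads @ true_C.T + rng.normal(0, 1e-4, (20, 2))   # noisy readings

# fit outputs = loads @ C.T  =>  recover C from the calibration data
C_T, *_ = np.linalg.lstsq(loads, outputs, rcond=None)
C = C_T.T

reading = np.array([0.9, 0.06])           # a measurement during a test
thrust, torque = np.linalg.solve(C, reading)
print(f"thrust ~ {thrust:.0f} N, torque ~ {torque:.1f} N*m")
```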

Keywords: airship, propeller, thrust and torque, flight test

Procedia PDF Downloads 329
24824 Different Approaches to Teaching a Database Course to Undergraduate and Graduate Students

Authors: Samah Senbel

Abstract:

Database design is a fundamental part of the computer science and information technology curricula in any school, as well as of the study of management, business administration, and data analytics. In this study, we compare the performance of two groups of students studying the same database design and implementation course at Sacred Heart University in the fall of 2018. Both courses used the same textbook and were taught by the same professor, one to seven graduate students and one to 26 undergraduate students (juniors). The undergraduate students were around 20 years old with little work experience, while the graduate students averaged 35 years old and all were employed in computer-related or management-related jobs. The textbook used was 'Database Systems: Design, Implementation, and Management' by Coronel and Morris, and the course was designed to follow the textbook at roughly a chapter per week. The first six weeks covered the design aspect of a database, followed by a paper exam. The next six weeks covered the implementation aspect of the database using SQL, followed by a lab exam. Since the undergraduate students are on a 16-week semester, we spent the last three weeks of their course covering NoSQL; this part of the course was not included in this study. After the course was over, we analyzed the results of the two groups of students. An interesting discrepancy was observed: in the database design part of the course, the average grade of the graduate students was 92%, while that of the undergraduate students was 77% on the same exam. In the implementation part of the course, we observed the opposite: the average grade of the graduate students was 65%, while that of the undergraduate students was 73%. The overall grades were quite similar: the graduate average was 78% and the undergraduate average was 75%. Based on these results, we concluded that having both classes follow the same time schedule was not beneficial, and an adjustment was needed: the graduates could spend less time on design, and the undergraduates would benefit from more design time. In the fall of 2019, 30 students registered for the undergraduate course and 15 students registered for the graduate course. To test our conclusion, the undergraduates spent about 67% of the time (eight classes) on the design part of the course and 33% (four classes) on the implementation part, using the same exams as the previous year. This resulted in an improvement in their average grade on the design part from 77% to 83%, and in their average grade on the implementation part from 73% to 79%. In conclusion, we recommend using two separate schedules for teaching the database design course. For undergraduate students, it is important to spend more time on the design part rather than the implementation part of the course, while for the older graduate students, we recommend spending more time on the implementation part, as that appears to be the part they struggle with, even though they have a higher understanding of the design component of databases.

Keywords: computer science education, database design, graduate and undergraduate students, pedagogy

Procedia PDF Downloads 103
24823 Assumption of Cognitive Goals in Science Learning

Authors: Mihail Calalb

Abstract:

The aim of this research is to identify ways of achieving sustainable conceptual understanding within science lessons. For this purpose, a set of teaching and learning strategies, part of the theory of visible teaching and learning (VTL), is studied. As a result, a new didactic approach named "learning by being" is proposed, and its correlation with the educational paradigms existing nowadays in the science teaching domain is analysed. In the context of VTL, the author describes the main strategies of learning by being, such as guided self-scaffolding, structuring of information, and recurrent use of previous knowledge or help seeking. Due to the synergy of these learning strategies applied simultaneously in class, the impact factor of learning by being on the cognitive achievement of students is up to 93% (the benchmark level is 40%, achieved when an experienced teacher permanently applies the same conventional strategy over two academic years). The key idea in learning by being is the student's assumption of cognitive goals. From this perspective, the article discusses the role of the student's personal learning effort within several teaching strategies employed in VTL. The research results emphasize that three mandatory student-related moments are present in each constructivist teaching approach: a) students' personal learning effort, b) student-teacher mutual feedback, and c) metacognition. Thus, a successful educational strategy will aim to involve students in the class process as much as possible, in order to make them not only know the learning objectives but also assume them. In this way, we arrive at the ownership of cognitive goals, or students' deep intrinsic motivation. A series of approaches are inherent to students' ownership of cognitive goals: independent research (with an impact factor on cognitive achievement equal to 83%, according to the results of VTL); knowledge of success criteria (impact factor 113%); and the ability to reveal similarities and patterns (impact factor 132%). Although it is generally accepted that the school is a public service, it nonetheless does not belong to the entertainment industry, and in most cases education declared as student-centered actually hides the central role of the teacher. Even if there is a proliferation of constructivist concepts, mainly at the level of science education research, we have to underline that conventional or frontal teaching will never disappear. Research results show that no modern method can replace an experienced teacher with strong pedagogical content knowledge. Such a teacher will inspire and motivate his or her students to love and learn physics. The teacher is precisely the condensation point for an efficient didactic strategy, be it constructivist or conventional. In this way, we could speak about "hybridized teaching", where both the student and the teacher have their share of responsibility. In conclusion, the core of the learning-by-being approach is guided learning effort, which corresponds to the notion of the teacher-student harmonic oscillator: both things, guidance from the teacher and the student's effort, are equally important.

Keywords: conceptual understanding, learning by being, ownership of cognitive goals, science learning

Procedia PDF Downloads 156
24822 Beginning Physics Experiments Class Using Multimedia at the National University of Laos

Authors: T. Nagata, S. Xaphakdy, P. Souvannavong, P. Chanthamaly, K. Sithavong, C. H. Lee, S. Phommathat, V. Srithilat, P. Sengdala, B. Phetarnousone, B. Siharath, X. Chemcheng, T. Yamaguchi, A. Suenaga, S. Kashima

Abstract:

The National University of Laos (NUOL) requested Japan International Cooperation Agency (JICA) volunteers to begin a physics experiments class using multimedia. However, there were issues: NUOL had no physics experiment class and no space for physics experiments, the experiment materials had not been used for many years and were scattered in various places, and there was no projector or laptop computer in the unit. This raised the question: how do the authors begin a physics experiments class using multimedia? To solve this problem, the JICA volunteers took several steps: they took stock of what was available and reviewed the syllabus. They then revised the experiment materials to assess what was usable and developed textbooks for experiments based on them; however, the question remained: what about the multimedia component of the course? Next, they reviewed physics teacher Pavy Souvannavong’s YouTube channel, where he and his students upload video reports of their physics classes at NUOL using their smartphones. While these use multimedia, almost all the videos recorded were of class presentations. To improve the multimedia style, the authors edited the videos in the style of another YouTube channel, "Science for Lao", which is run by a science education group made up of Japan Overseas Cooperation Volunteers (JOCV) in Laos. They created the channel to enhance science education in Laos, and hold regular monthly meetings in the capital, Vientiane, and at teacher training colleges around the country. The video clips are edited in three parts: a materials-and-procedures part including pictures, practice footage of the experiment, and a results-and-conclusions part. Students then perform the experiments and prepare presentations by following the videos. The revised experiment presentation reports use PowerPoint slides, pictures of the materials, and experiment video clips. For providing textbooks and submitting reports, the students use the "Moodle" e-learning system of the Information Technology Center on the Dongdok campus of NUOL; the Korea International Cooperation Agency (KOICA) donated those facilities. The authors have gone through this process of revising materials, developing textbooks, having students present PowerPoint slides, and downloading textbooks and uploading reports, in order to begin the physics experiments class using multimedia. This is a practice research report on beginning a physics experiments class using multimedia in the physics unit of the Department of Natural Science, Faculty of Education, at NUOL.

Keywords: NUOL, JICA, KOICA, physics experiment materials, smartphone, Moodle, IT center, Science for Lao

Procedia PDF Downloads 336
24821 Indigenizing the Curriculum: Teaching at the Ifugao State University, Philippines

Authors: Nancy Ann P. Gonzales, Serafin L. Ngohayon

Abstract:

The Nurturing Indigenous Knowledge Experts (NIKE) project among the young generation in Ifugao, Philippines, was spearheaded by the Ifugao State University (IFSU) and sponsored by the UNESCO Association in Japan. Through the project, the Ifugao Indigenous Knowledge Workbook, containing nine chapters, was developed. The workbook was pilot-tested on students who had taken IK classes. The descriptive survey method of research was used. A questionnaire was used to gather data from first-year Bachelor of Elementary Education and Bachelor of Political Science students. Frequency counts, percentages and means were computed. A t-test was used to determine whether there was a significant difference in knowledge gained before and after IK was taught to the students. Results revealed that the respondents had an increased level of IK in all the areas covered in the NIKE workbook after they enrolled in their classes. It is alarming to note that the students are knowledgeable about IK but are not practicing it. However, according to the respondents, they will apply their IK through teaching after graduation.

Keywords: curriculum, elders, indigenous knowledge, students

Procedia PDF Downloads 340
24820 A Machine Learning Model for Dynamic Prediction of Chronic Kidney Disease Risk Using Laboratory Data, Non-Laboratory Data, and Metabolic Indices

Authors: Amadou Wurry Jallow, Adama N. S. Bah, Karamo Bah, Shih-Ye Wang, Kuo-Chung Chu, Chien-Yeh Hsu

Abstract:

Chronic kidney disease (CKD) is a major public health challenge with high prevalence, rising incidence, and serious adverse consequences. Developing effective risk prediction models is a cost-effective approach to predicting and preventing complications of CKD. This study aimed to develop an accurate machine learning model that can dynamically identify individuals at risk of CKD using various kinds of diagnostic data, with or without laboratory data, at different follow-up points. Creatinine is a key component used to predict CKD. These models will enable affordable and effective screening for CKD even with incomplete patient data, such as the absence of creatinine testing. This retrospective cohort study included data on 19,429 adults provided by a private research institute and screening laboratory in Taiwan, gathered between 2001 and 2015. Univariate Cox proportional hazard regression analyses were performed to determine the variables with high prognostic value for predicting CKD. We then identified interacting variables and grouped them according to diagnostic data categories. Our models used three types of data gathered at three points in time: non-laboratory data, laboratory data, and metabolic indices. Next, we used subgroups of variables within each category to train two machine learning models (Random Forest and XGBoost). Our machine learning models can dynamically discriminate individuals at risk of developing CKD. All the models performed well using all three kinds of data, with or without laboratory data. Using only non-laboratory data (such as age, sex, body mass index (BMI), and waist circumference), both models predict chronic kidney disease as accurately as models using laboratory and metabolic indices data. Our machine learning models demonstrate the use of different categories of diagnostic data for CKD prediction, with or without laboratory data. The models are simple to use and flexible because they work even with incomplete data and can be applied in any clinical setting, including settings where laboratory data are difficult to obtain.
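
A minimal sketch of the non-laboratory model variant on synthetic data (the study's Taiwanese cohort is private, and the risk function below is invented):

```python
# Train a random forest on demographics and body measurements only,
# mimicking the "no laboratory data" setting, and report AUC.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
n = 2000
age   = rng.uniform(20, 80, n)
sex   = rng.integers(0, 2, n)
bmi   = rng.normal(24, 4, n)
waist = rng.normal(85, 12, n)
X = np.column_stack([age, sex, bmi, waist])

risk = 0.04 * age + 0.08 * (bmi - 24) + 0.03 * (waist - 85)   # invented risk
y = (risk + rng.normal(0, 1, n) > 2.4).astype(int)            # CKD onset label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```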

Keywords: chronic kidney disease, glomerular filtration rate, creatinine, novel metabolic indices, machine learning, risk prediction

Procedia PDF Downloads 85
24819 Road Accidents Bigdata Mining and Visualization Using Support Vector Machines

Authors: Usha Lokala, Srinivas Nowduri, Prabhakar K. Sharma

Abstract:

Useful information has been extracted from road accident data in the United Kingdom (UK), using data analytics methods, with the aim of avoiding possible accidents in rural and urban areas. The analysis makes use of several methodologies, such as data integration, support vector machines (SVM), correlation machines and multinomial goodness-of-fit. The entire datasets were imported from the traffic department of the UK with due permission. The information extracted from these huge datasets forms a basis for several predictions, which in turn help avoid unnecessary memory lapses. Since the data are expected to grow continuously over time, this work primarily proposes a new framework model which can be trained, adapt itself to new data, and make accurate predictions. This work also throws some light on the use of the SVM methodology for text classifiers on the obtained traffic data. Finally, it emphasizes the uniqueness and adaptability of the SVM methodology, which is appropriate for this kind of research work.
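
An illustrative SVM sketch in the spirit of the proposed framework: train on accident records, then retrain as new data arrives. The feature names are stand-ins for fields in the UK accident dataset:

```python
# Fit an RBF-kernel SVM on synthetic accident records and score a
# new record; the pipeline can simply be refit as the dataset grows.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(7)
n = 1000
speed_limit = rng.choice([30, 40, 60, 70], n)
hour        = rng.integers(0, 24, n)
rural       = rng.integers(0, 2, n)
X = np.column_stack([speed_limit, hour, rural])
severe = ((speed_limit >= 60) & rural.astype(bool)).astype(int)   # toy label

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
model.fit(X, severe)

print(model.predict_proba([[70, 23, 1]])[0, 1])   # P(severe) for a new record
```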

Keywords: support vector machines (SVM), machine learning (ML), Department for Transport (DfT)

Procedia PDF Downloads 254
24818 A Relational Database for Radiation Therapy

Authors: Raffaele Danilo Esposito, Domingo Planes Meseguer, Maria Del Pilar Dorado Rodriguez

Abstract:

As far as we know, no commercial solution is yet available that allows managing, openly and configurably according to user needs, the huge amount of data generated in a modern radiation oncology department. Currently available information management systems are mainly focused on record-and-verify and clinical data, and only to a small extent on physical data; this results in a partial and limited use of the information actually available. In the present work we describe the implementation at our department of a centralized information management system based on a web server. Our system manages both the information generated during patient planning and treatment and information of general interest for the whole department (i.e. treatment protocols, quality assurance protocols, etc.). Our objective is to be able to analyze all the available data in a simple and efficient way and thus to obtain quantitative evaluations of our treatments. This would allow us to improve our workflow and protocols. To this end we have implemented a relational database which allows us to use all the available information in a practical and efficient way. As always, we only use license-free software.
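
A toy sqlite3 sketch of the relational approach, with treatment-plan and quality-assurance records in one queryable schema; table and column names are illustrative, not the department's actual model:

```python
# Minimal relational schema: plans linked to patients, QA checks
# linked to plans, and a cross-treatment evaluation as a SQL query.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE patient  (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE plan     (id INTEGER PRIMARY KEY,
                       patient_id INTEGER REFERENCES patient(id),
                       technique TEXT, dose_gy REAL, fractions INTEGER);
CREATE TABLE qa_check (id INTEGER PRIMARY KEY,
                       plan_id INTEGER REFERENCES plan(id),
                       gamma_pass_rate REAL, checked_on TEXT);
""")
con.execute("INSERT INTO patient VALUES (1, 'anonymized')")
con.execute("INSERT INTO plan VALUES (1, 1, 'VMAT', 60.0, 30)")
con.execute("INSERT INTO qa_check VALUES (1, 1, 98.7, '2015-06-01')")

# quantitative evaluation across treatments becomes a plain SQL query
for row in con.execute("""SELECT technique, AVG(gamma_pass_rate)
                          FROM plan JOIN qa_check ON plan.id = qa_check.plan_id
                          GROUP BY technique"""):
    print(row)
```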

Keywords: information management system, radiation oncology, medical physics, free software

Procedia PDF Downloads 222
24817 A Study of Safety of Data Storage Devices of Graduate Students at Suan Sunandha Rajabhat University

Authors: Komol Phaisarn, Natcha Wattanaprapa

Abstract:

This research is a survey study with the objective of examining the safety of the data storage devices of graduate students of the 2013 academic year at Suan Sunandha Rajabhat University. Data were collected by a questionnaire on the safety of data storage devices according to the CIA principles (confidentiality, integrity, availability). A sample of 81 was drawn from the population by the purposive sampling method. The results show that most of the graduate students of the 2013 academic year at Suan Sunandha Rajabhat University use a handy drive to store their data, and that the safety level of the devices is good.

Keywords: security, safety, storage devices, graduate students

Procedia PDF Downloads 337
24816 Effectiveness of Active Learning in Social Science Courses at Japanese Universities

Authors: Kumiko Inagaki

Abstract:

In recent years, Japanese universities have begun to face a dilemma: more than half of all high school graduates go on to attend an institution of higher learning, overwhelming Japanese universities accustomed to small student bodies. These universities have been forced to embrace qualitative changes to accommodate the increased number and diversity of students who enter their establishments, students who differ in their motivations for learning, their levels of eagerness to learn, and their perspectives on the future. One of these changes is an increase in awareness among Japanese educators of the importance of active learning, which deepens students’ understanding of course material through a range of activities, including writing, speaking, thinking, and presenting, in addition to conventional “passive learning” methods such as listening to a one-way lecture. The purpose of this study is to examine the effectiveness of a teaching method adapted to improve active learning. A teaching method designed to promote active learning was implemented in a social science course at one of the most popular universities in Japan. A questionnaire using a five-point response format was given to students in 2,305 courses throughout the university to evaluate the effectiveness of the method based on the following measures: ① the ratio of students who were motivated to attend the classes, ② the rate at which students learned new information, and ③ the teaching method adopted in the classes. The results of this study show that the percentage of students who attended the active learning course eagerly, and the rate of new knowledge acquired through the course, both exceeded the averages for the university, the department, and the subject area of social science. In addition, there are strong correlations between teaching method and student motivation, and between teaching method and knowledge acquisition rate. These results indicate that the active learning teaching method was effectively implemented and that it may improve students' eagerness to attend class and motivation to learn.

Keywords: active learning, Japanese university, teaching method, university education

Procedia PDF Downloads 181
24815 Simulation of a Cost Model Response Requests for Replication in Data Grid Environment

Authors: Kaddi Mohammed, A. Benatiallah, D. Benatiallah

Abstract:

The data grid is a technology that has brought a full emergence of new challenges, such as the heterogeneity and availability of various geographically distributed resources, fast data access, minimizing latency, and fault tolerance. Researchers interested in this technology address the problems of the various related systems, such as task scheduling, load balancing and replication. The latter is an effective solution for achieving good performance in terms of data access and grid resource use, and better data availability at lower cost. In a system with replication, a coherence protocol is used to impose some degree of synchronization between the various copies and some order on updates. In this project, we present an approach for placing replicas so as to minimize the response cost of read and write requests, and we implement our model in a simulation environment. The placement techniques are based on a cost model which depends on several factors, such as bandwidth, data size and storage nodes.
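
A toy version of such a cost model: enumerate candidate placements and pick the one minimizing the combined response cost of reads and the propagation cost of writes under the coherence protocol. Bandwidths and request frequencies are invented illustration values:

```python
# Exhaustive replica placement under a simple read/write cost model:
# remote reads pay size/bandwidth, every replica pays a write-update cost.
from itertools import combinations

data_size_mb = 500.0
bandwidth = {"site_a": 100.0, "site_b": 40.0, "site_c": 250.0}   # MB/s
reads = {"site_a": 120, "site_b": 30, "site_c": 400}             # reads/day
writes_per_day = 20
write_penalty = 5.0        # cost of propagating one update to one replica

def placement_cost(replicas):
    cost = 0.0
    for site, n in reads.items():
        if site in replicas:
            continue                          # local replica: negligible cost
        best_bw = max(bandwidth[r] for r in replicas)
        cost += n * data_size_mb / best_bw    # read from fastest replica holder
    cost += writes_per_day * write_penalty * len(replicas)   # coherence cost
    return cost

options = [c for k in (1, 2, 3) for c in combinations(bandwidth, k)]
best = min(options, key=placement_cost)
print("best placement:", best, "cost:", round(placement_cost(best), 1))
```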

Keywords: response time, query, consistency, bandwidth, storage capacity, CERN

Procedia PDF Downloads 253