Search results for: accuracy assessment.
7682 Probabilistic Life Cycle Assessment of the Nano Membrane Toilet
Authors: A. Anastasopoulou, A. Kolios, T. Somorin, A. Sowale, Y. Jiang, B. Fidalgo, A. Parker, L. Williams, M. Collins, E. J. McAdam, S. Tyrrel
Abstract:
Developing countries are nowadays confronted with great challenges related to domestic sanitation services in view of the imminent water scarcity. Contemporary sanitation technologies established in these countries are likely to pose health risks unless waste management standards are followed properly. This paper provides a solution to sustainable sanitation with the development of an innovative toilet system, called Nano Membrane Toilet (NMT), which has been developed by Cranfield University and sponsored by the Bill & Melinda Gates Foundation. The particular technology converts human faeces into energy through gasification and provides treated wastewater from urine through membrane filtration. In order to evaluate the environmental profile of the NMT system, a deterministic life cycle assessment (LCA) has been conducted in SimaPro software employing the Ecoinvent v3.3 database. The particular study has determined the most contributory factors to the environmental footprint of the NMT system. However, as sensitivity analysis has identified certain critical operating parameters for the robustness of the LCA results, adopting a stochastic approach to the Life Cycle Inventory (LCI) will comprehensively capture the input data uncertainty and enhance the credibility of the LCA outcome. For that purpose, Monte Carlo simulations, in combination with an artificial neural network (ANN) model, have been conducted for the input parameters of raw material, produced electricity, NOX emissions, amount of ash and transportation of fertilizer. The given analysis has provided the distribution and the confidence intervals of the selected impact categories and, in turn, more credible conclusions are drawn on the respective LCIA (Life Cycle Impact Assessment) profile of NMT system. Last but not least, the specific study will also yield essential insights into the methodological framework that can be adopted in the environmental impact assessment of other complex engineering systems subject to a high level of input data uncertainty.Keywords: sanitation systems, nano-membrane toilet, lca, stochastic uncertainty analysis, Monte Carlo simulations, artificial neural network
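As an illustration of the stochastic step described in this abstract, the sketch below propagates assumed input distributions for the uncertain LCI parameters (raw material omitted for brevity) through a simple impact calculation with Monte Carlo sampling and reports a 95% confidence interval. All distributions, parameter values and characterization factors are placeholders rather than data from the study, and the ANN surrogate used by the authors is not shown.
```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # Monte Carlo runs

# Assumed uncertain LCI inputs per functional unit (illustrative, not values from the study).
samples = {
    "electricity_kwh": rng.normal(loc=0.5, scale=0.05, size=N),        # net electricity produced
    "nox_kg":          rng.lognormal(mean=np.log(0.002), sigma=0.3, size=N),
    "ash_kg":          rng.triangular(left=0.05, mode=0.07, right=0.10, size=N),
    "transport_tkm":   rng.uniform(low=0.01, high=0.03, size=N),       # fertilizer transport
}

# Placeholder characterization factors; real ones would come from the chosen LCIA method.
cf = {"electricity_kwh": -0.60, "nox_kg": 11.0, "ash_kg": 0.10, "transport_tkm": 0.12}

impact = sum(cf[k] * samples[k] for k in samples)       # impact score per Monte Carlo run
lo, hi = np.percentile(impact, [2.5, 97.5])             # 95% confidence interval
print(f"impact score: mean={impact.mean():.3f}, 95% CI=[{lo:.3f}, {hi:.3f}]")
```
In the study, the distribution of each LCIA category would be obtained this way for every selected impact category, with the ANN acting as a fast surrogate for the full LCA model.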
Procedia PDF Downloads 224
7681 Drape Simulation by Commercial Software and Subjective Assessment of Virtual Drape
Authors: Evrim Buyukaslan, Simona Jevsnik, Fatma Kalaoglu
Abstract:
Simulating fabrics is more difficult than most other simulation tasks because of the complex mechanics of fabrics. Most virtual garment simulation software uses a mass-spring model and incorporates fabric mechanics into the simulation. The accuracy and fidelity of such software, however, remain open questions. Drape is a subjective phenomenon, and its evaluation has been studied since the 1950s, whereas fabric and garment simulation is relatively new. Understanding how subjects perceive drape when looking at fabric simulations is critical as virtual try-on becomes increasingly important for growing online apparel sales. The projected future of online apparel retailing is that users will view their avatars and try garments on them in the virtual environment. It is well known that users will not be eager to accept this innovative technology unless it is realistic enough. Therefore, it is essential to understand what users see when fabrics are displayed in a virtual environment: are they able to distinguish the differences between various fabrics? The purpose of this study is to investigate human perception when looking at a virtual fabric and to determine the most visually noticeable drape parameter. To this end, five different fabrics are mechanically tested, and their drape simulations are generated by commercial garment simulation software (Optitex®). The simulation images are processed by image analysis software to calculate the drape parameters, namely drape coefficient, node severity, and peak angles. A questionnaire is developed to evaluate drape properties subjectively in a virtual environment. Drape simulation images are shown to 27 subjects, who are asked to rank the samples according to the drape property in question. The answers are compared to the calculated drape parameters. The results show that subjects are quite sensitive to changes in drape coefficient, while they are not very sensitive to changes in node dimensions and node distributions.
Keywords: drape simulation, drape evaluation, fabric mechanics, virtual fabric
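The drape coefficient mentioned above is conventionally obtained from the projected silhouette of the draped specimen. The sketch below shows one common (Cusick-style) formulation computed from a binary top-view image; the disc and specimen radii, pixel scale and test image are assumptions for illustration, not the study's settings.
```python
import numpy as np

def drape_coefficient(mask: np.ndarray, px_per_mm: float,
                      r_disc_mm: float = 90.0, r_specimen_mm: float = 150.0) -> float:
    """Cusick-style drape coefficient (%) from a binary top-view silhouette.

    mask: 2D boolean array, True where the draped specimen's projection lies.
    The supporting-disc and specimen radii are illustrative defaults.
    """
    a_proj = mask.sum() / (px_per_mm ** 2)      # projected area of the draped specimen (mm^2)
    a_disc = np.pi * r_disc_mm ** 2             # area of the supporting disc
    a_flat = np.pi * r_specimen_mm ** 2         # area of the undraped flat specimen
    return 100.0 * (a_proj - a_disc) / (a_flat - a_disc)

# Example: a perfectly rigid fabric projects its full flat area, so DC is close to 100%.
px_per_mm = 2.0
full = np.zeros((800, 800), dtype=bool)
yy, xx = np.ogrid[:800, :800]
full[(yy - 400) ** 2 + (xx - 400) ** 2 <= (150 * px_per_mm) ** 2] = True
print(round(drape_coefficient(full, px_per_mm), 1))   # ~100.0
```
A very limp fabric whose projection collapses towards the disc would instead give a drape coefficient close to 0%.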
Procedia PDF Downloads 336
7680 Analyzing the Shearing-Layer Concept Applied to Urban Green System
Authors: S. Pushkar, O. Verbitsky
Abstract:
Currently, green rating systems are mainly utilized for correctly sizing mechanical and electrical systems, which have short lifetime expectancies. In these systems, passive solar and bio-climatic architecture, which have long lifetime expectancies, are neglected. Urban rating systems consider buildings and services in addition to neighborhoods and public transportation as integral parts of the built environment. The main goal of this study was to develop a more consistent point allocation system for urban building standards by using six different lifetime shearing layers: Site, Structure, Skin, Services, Space, and Stuff, each reflecting distinct environmental damages. This shearing-layer concept was applied to internationally well-known rating systems: Leadership in Energy and Environmental Design (LEED) for Neighborhood Development, BRE Environmental Assessment Method (BREEAM) for Communities, and Comprehensive Assessment System for Building Environmental Efficiency (CASBEE) for Urban Development. The results showed that LEED for Neighborhood Development and BREEAM for Communities focused on long-lifetime-expectancy building designs, whereas CASBEE for Urban Development gave equal importance to the Building and Service Layers. Moreover, although this rating system was applied using a building-scale assessment, “Urban Area + Buildings” focuses on a short-lifetime-expectancy system design, neglecting to improve the architectural design by considering bio-climatic and passive solar aspects.Keywords: green rating system, urban community, sustainable design, standardization, shearing-layer concept, passive solar architecture
Procedia PDF Downloads 579
7679 Optimizing Pediatric Pneumonia Diagnosis with Lightweight MobileNetV2 and VAE-GAN Techniques in Chest X-Ray Analysis
Authors: Shriya Shukla, Lachin Fernando
Abstract:
Pneumonia, a leading cause of mortality in young children globally, presents significant diagnostic challenges, particularly in resource-limited settings. This study presents an approach to diagnosing pediatric pneumonia using Chest X-Ray (CXR) images, employing a lightweight MobileNetV2 model enhanced with synthetic data augmentation. Addressing the challenge of dataset scarcity and imbalance, the study used a Variational Autoencoder-Generative Adversarial Network (VAE-GAN) to generate synthetic CXR images, improving the representation of normal cases in the pediatric dataset. This approach not only addresses the issues of data imbalance and scarcity prevalent in medical imaging but also provides a more accessible and reliable diagnostic tool for early pneumonia detection. The augmented data improved the model’s accuracy and generalization, achieving an overall accuracy of 95% in pneumonia detection. These findings highlight the efficacy of the MobileNetV2 model, offering a computationally efficient yet robust solution well-suited for resource-constrained environments such as mobile health applications. This study demonstrates the potential of synthetic data augmentation in enhancing medical image analysis for critical conditions like pediatric pneumonia.Keywords: pneumonia, MobileNetV2, image classification, GAN, VAE, deep learning
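The classifier described here can be sketched as a standard MobileNetV2 transfer-learning setup. The snippet below is a minimal Keras example assuming a binary NORMAL/PNEUMONIA split; the directory layout, image size and training settings are illustrative assumptions, and the VAE-GAN used to generate the synthetic normal-class images is not shown.
```python
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (224, 224)  # assumed input size

base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False  # freeze the ImageNet backbone for feature extraction

model = models.Sequential([
    layers.Rescaling(1.0 / 127.5, offset=-1),        # MobileNetV2 expects inputs in [-1, 1]
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(1, activation="sigmoid"),           # NORMAL vs PNEUMONIA
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="binary_crossentropy", metrics=["accuracy"])

# Hypothetical layout: data/train/{NORMAL,PNEUMONIA}/*.jpeg (real + VAE-GAN synthetic images)
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, batch_size=32, label_mode="binary")
val_ds = tf.keras.utils.image_dataset_from_directory(
    "data/val", image_size=IMG_SIZE, batch_size=32, label_mode="binary")
model.fit(train_ds, validation_data=val_ds, epochs=5)
```
Fine-tuning the top layers of the frozen backbone after this warm-up stage is a common follow-up step.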
Procedia PDF Downloads 122
7678 Measuring Principal and Teacher Cultural Competency: A Need Assessment of Three Proximate PreK-5 Schools
Authors: Teresa Caswell
Abstract:
Throughout the United States and within a myriad of demographic contexts, students of color experience the results of systemic inequities as an academic outcome. These disparities continue despite the increased resources provided to students and ongoing instruction-focused professional learning received by teachers. The researcher postulated that lower levels of educator cultural competency are an underlying factor of why resource and instructional interventions are less effective than desired. Before implementing any type of intervention, however, cultural competency needed to be confirmed as a factor in schools demonstrating academic disparities between racial subgroups. A needs assessment was designed to measure levels of individual beliefs, including cultural competency, in both principals and teachers at three neighboring schools verified to have academic disparities. The resulting mixed method study utilized the Optimal Theory Applied to Identity Development (OTAID) model to measure cultural competency quantitatively, through self-identity inventory survey items, with teachers and qualitatively, through one-on-one interviews, with each school’s principal. A joint display was utilized to see combined data within and across school contexts. Each school was confirmed to have misalignments between principal and teacher levels of cultural competency beliefs while also indicating that a number of participants in the self-identity inventory survey may have intentionally skipped items referencing the term oppression. Additional use of the OTAID model and self-identity inventory in future research and across contexts is needed to determine transferability and dependability as cultural competency measures.Keywords: cultural competency, identity development, mixed-method analysis, needs assessment
Procedia PDF Downloads 150
7677 Technical Non-Destructive Evaluation of Burnt Bridge at CH. 57+450 Along Abuja-Abaji-Lokoja Road, Nigeria
Authors: Abraham O. Olaniyi, Oluyemi Oke, Atilade Otunla
Abstract:
The structural performance of bridges decreases progressively throughout their service life due to many contributing factors (fatigue, carbonation, fire incidents etc.). Around the world, numerous bridges have attained their estimated service life and many have approached this limit. The structural integrity assessment of the burnt composite bridge located at CH57+450, Koita village along Abuja-Abaji-Lokoja road, Nigeria, is presented as a case study and shall be forthwith referred to as the 'Koita bridge' in this paper. From the technical evaluation, the residual compressive strength of the concrete piers was found to be below 16.0 N/mm2. This value is very low compared to the expected design value of 30.0 N/mm2. The pier capping beam at pier location 1 has a very low residual compressive strength. The cover to the reinforcement of certain capping beams has an outline of reinforcement which signifies poor concrete cover and the mean compressive strength is also less than 20.0 N/mm2. The steel girder indicated black colouration as a result of the fire incident without any significant structural defect like buckling or warping of the steel section. This paper reviews the structural integrity assessment and repair methodology of the Koita bridge; a composite bridge damaged by fire, highlighting the various challenges of limited obtainable guidance documents about the bridge. The objectives are to increase the understanding of processes and versatile equipment required to test and assess a fire-damaged bridge in order to improve the quality of structural appraisal and rehabilitation; thus, eliminating the prejudice associated with current visual inspection techniques.Keywords: assessment, bridge, rehabilitation, sustainability
Procedia PDF Downloads 364
7676 Policy Guidelines to Enhance the Mathematics Teachers’ Association of the Philippines (MTAP) Saturday Class Program
Authors: Roselyn Alejandro-Ymana
Abstract:
The study was an attempt to assess the MTAP Saturday Class Program along its eight components namely, modules, instructional materials, scheduling, trainer-teachers, supervisory support, administrative support, financial support and educational facilities, the results of which served as bases in developing policy guidelines to enhance the MTAP Saturday Class Program. Using a descriptive development method of research, this study involved the participation of twenty-eight (28) schools with MTAP Saturday Class Program in the Division of Dasmarinas City where twenty-eight school heads, one hundred twenty-five (125) teacher-trainer, one hundred twenty-five (125) pupil program participants, and their corresponding one hundred twenty-five (125) parents were purposively drawn to constitute the study’s respondent. A self-made validated survey questionnaire together with Pre and Post-Test Assessment Test in Mathematics for pupils participating in the program, and an unstructured interview guide was used to gather the data needed in the study. Data obtained from the instruments administered was organized and analyzed through the use of statistical tools that included the Mean, Weighted Mean, Relative Frequency, Standard Deviation, F-Test or One-Way ANOVA and the T-Test. Results of the study revealed that all the eight domains involved in the MTAP Saturday Class Program were practiced with the areas of 'trainer-teachers', 'educational facilities', and 'supervisory support' identified as the program’s strongest components while the areas of 'financial support', 'modules' and 'scheduling' as being the weakest program’s components. Moreover, the study revealed based on F-Test, that there was a significant difference in the assessment made by the respondents in each of the eight (8) domains. It was found out that the parents deviated significantly from the assessment of either the school heads or the teachers on the indicators of the program. There is much to be desired when it comes to the quality of the implementation of the MTAP Saturday Class Program. With most of the indicators of each component of the program, having received overall average ratings that were at least 0.5 point away from the ideal rating 5 for total quality, school heads, teachers, and supervisors need to work harder for total quality of the implementation of the MTAP Saturday Class Program in the division.Keywords: mathematics achievement, MTAP program, policy guidelines, program assessment
Procedia PDF Downloads 211
7675 Project Management at University: Towards an Evaluation Process around Cooperative Learning
Authors: J. L. Andrade-Pineda, J.M. León-Blanco, M. Calle, P. L. González-R
Abstract:
Enrollment in current Master's degree programs usually aims at gaining the expertise required in real-life workplaces. The experience presented here concerns the learning process of "Project Management Methodology (PMM)", built around a cooperative/collaborative mechanism aimed at affording students measurable learning goals and giving the teacher the ability to focus on the weaknesses detected. We have designed a mixed summative/formative evaluation, which ensures curriculum engagement while enriching the comprehension of key PMM concepts. In this experience, we turned the students into active actors in the evaluation process itself, and we gave ourselves as teachers a flexible process in which, along with qualifications (scores), other attitudinal feedback arises. Despite the high level of self-affirmation in their discussions within the interactive assessment sessions, the students ultimately exhibited a great ability to review and correct wrong reasoning when that was the case.
Keywords: cooperative-collaborative learning, educational management, formative-summative assessment, leadership training
Procedia PDF Downloads 167
7674 Developing an Integrated Seismic Risk Model for Existing Buildings in Northern Algeria
Authors: R. Monteiro, A. Abarca
Abstract:
Large scale seismic risk assessment has become increasingly popular to evaluate the physical vulnerability of a given region to seismic events, by putting together hazard, exposure and vulnerability components. This study, developed within the scope of the EU-funded project ITERATE (Improved Tools for Disaster Risk Mitigation in Algeria), explains the steps and expected results for the development of an integrated seismic risk model for assessment of the vulnerability of residential buildings in Northern Algeria. For this purpose, the model foresees the consideration of an updated seismic hazard model, as well as ad-hoc exposure and physical vulnerability models for local residential buildings. The first results of this endeavor, such as the hazard model and a specific taxonomy to be used for the exposure and fragility components of the model are presented, using as starting point the province of Blida, in Algeria. Specific remarks and conclusions regarding the characteristics of the Northern Algerian in-built are then made based on these results.Keywords: Northern Algeria, risk, seismic hazard, vulnerability
Procedia PDF Downloads 199
7673 Life Cycle Assessment-Based Environmental Assessment of the Production and Maintenance of Wooden Windows
Authors: Pamela Del Rosario, Elisabetta Palumbo, Marzia Traverso
Abstract:
The building sector plays an important role in addressing pressing environmental issues such as climate change and resource scarcity. The energy performance of buildings is considerably affected by the external envelope. In fact, a considerable proportion of the building energy demand is due to energy losses through the windows. Nevertheless, according to literature, to pay attention only to the contribution of windows to the building energy performance, i.e., their influence on energy use during building operation, could result in a partial evaluation. Hence, it is important to consider not only the building energy performance but also the environmental performance of windows, and this not only during the operational stage but along its complete life cycle. Life Cycle Assessment (LCA) according to ISO 14040:2006 and ISO 14044:2006+A1:2018 is one of the most adopted and robust methods to evaluate the environmental performance of products throughout their complete life cycle. This life-cycle based approach avoids the shift of environmental impacts of a life cycle stage to another, allowing to allocate them to the stage in which they originated and to adopt measures that optimize the environmental performance of the product. Moreover, the LCA method is widely implemented in the construction sector to assess whole buildings as well as construction products and materials. LCA is regulated by the European Standards EN 15978:2011, at the building level, and EN 15804:2012+A2:2019, at the level of construction products and materials. In this work, the environmental performance of wooden windows was assessed by implementing the LCA method and adopting primary data. More specifically, the emphasis is given to embedded and operational impacts. Furthermore, correlations are made between these environmental impacts and aspects such as type of wood and window transmittance. In the particular case of the operational impacts, special attention is set on the definition of suitable maintenance scenarios that consider the potential climate influence on the environmental impacts. For this purpose, a literature review was conducted, and expert consultation was carried out. The study underlined the variability of the embedded environmental impacts of wooden windows by considering different wood types and transmittance values. The results also highlighted the need to define appropriate maintenance scenarios for precise assessment results. It was found that both the service life and the window maintenance requirements in terms of treatment and its frequency are highly dependent not only on the wood type and its treatment during the manufacturing process but also on the weather conditions of the place where the window is installed. In particular, it became evident that maintenance-related environmental impacts were the highest for climate regions with the lowest temperatures and the greatest amount of precipitation.Keywords: embedded impacts, environmental performance, life cycle assessment, LCA, maintenance stage, operational impacts, wooden windows
Procedia PDF Downloads 231
7672 An Alternative Framework of Multi-Resolution Nested Weighted Essentially Non-Oscillatory Schemes for Solving Euler Equations with Adaptive Order
Authors: Zhenming Wang, Jun Zhu, Yuchen Yang, Ning Zhao
Abstract:
In the present paper, an alternative framework is proposed to construct a class of finite difference multi-resolution nested weighted essentially non-oscillatory (WENO) schemes with an increasingly higher order of accuracy for solving inviscid Euler equations. These WENO schemes firstly obtain a set of reconstruction polynomials by a hierarchy of nested central spatial stencils, and then recursively achieve a higher order approximation through the lower-order precision WENO schemes. The linear weights of such WENO schemes can be set as any positive numbers with a requirement that their sum equals one and they will not pollute the optimal order of accuracy in smooth regions and could simultaneously suppress spurious oscillations near discontinuities. Numerical results obtained indicate that these alternative finite-difference multi-resolution nested WENO schemes with different accuracies are very robust with low dissipation and use as few reconstruction stencils as possible while maintaining the same efficiency, achieving the high-resolution property without any equivalent multi-resolution representation. Besides, its finite volume form is easier to implement in unstructured grids.Keywords: finite-difference, WENO schemes, high order, inviscid Euler equations, multi-resolution
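The multi-resolution nested scheme with adaptive order described above is considerably more involved than classical WENO, but the role of the linear and nonlinear weights can be seen in a sketch of the standard fifth-order WENO-JS reconstruction below. This is a point of reference only, not the authors' scheme, and the test values are arbitrary smooth data.
```python
import numpy as np

def weno5_reconstruct(fm2, fm1, f0, fp1, fp2, eps=1e-6):
    """Classical WENO-JS fifth-order reconstruction of f at the cell face i+1/2
    from cell averages f_{i-2}..f_{i+2} (illustrative, not the nested multi-resolution scheme)."""
    # Third-order candidate reconstructions on the three sub-stencils
    q0 = (2*fm2 - 7*fm1 + 11*f0) / 6.0
    q1 = (-fm1 + 5*f0 + 2*fp1) / 6.0
    q2 = (2*f0 + 5*fp1 - fp2) / 6.0
    # Jiang-Shu smoothness indicators
    b0 = 13/12*(fm2 - 2*fm1 + f0)**2 + 0.25*(fm2 - 4*fm1 + 3*f0)**2
    b1 = 13/12*(fm1 - 2*f0 + fp1)**2 + 0.25*(fm1 - fp1)**2
    b2 = 13/12*(f0 - 2*fp1 + fp2)**2 + 0.25*(3*f0 - 4*fp1 + fp2)**2
    # Optimal linear weights and their nonlinear counterparts
    d = np.array([0.1, 0.6, 0.3])
    alpha = d / (eps + np.array([b0, b1, b2]))**2
    w = alpha / alpha.sum()
    return w[0]*q0 + w[1]*q1 + w[2]*q2

# On smooth data the nonlinear weights approach the linear ones and the scheme is fifth-order.
x = np.arange(-2, 3) * 0.1
print(weno5_reconstruct(*np.sin(x)))   # close to sin(0.05)
```
In the nested framework of the paper, by contrast, the linear weights may be chosen as any positive numbers summing to one, which is what allows the adaptive order.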
Procedia PDF Downloads 143
7671 The Direct Deconvolutional Model in the Large-Eddy Simulation of Turbulence
Authors: Ning Chang, Zelong Yuan, Yunpeng Wang, Jianchun Wang
Abstract:
The utilization of Large Eddy Simulation (LES) has been extensive in turbulence research. LES concentrates on resolving the significant grid-scale motions while representing smaller scales through subfilter-scale (SFS) models. The deconvolution model, among the available SFS models, has proven successful in LES of engineering and geophysical flows. Nevertheless, the thorough investigation of how sub-filter scale dynamics and filter anisotropy affect SFS modeling accuracy remains lacking. The outcomes of LES are significantly influenced by filter selection and grid anisotropy, factors that have not been adequately addressed in earlier studies. This study examines two crucial aspects of LES: Firstly, the accuracy of direct deconvolution models (DDM) is evaluated concerning sub-filter scale (SFS) dynamics across varying filter-to-grid ratios (FGR) in isotropic turbulence. Various invertible filters are employed, including Gaussian, Helmholtz I and II, Butterworth, Chebyshev I and II, Cauchy, Pao, and rapidly decaying filters. The importance of FGR becomes evident as it plays a critical role in controlling errors for precise SFS stress prediction. When FGR is set to 1, the DDM models struggle to faithfully reconstruct SFS stress due to inadequate resolution of SFS dynamics. Notably, prediction accuracy improves when FGR is set to 2, leading to accurate reconstruction of SFS stress, except for cases involving Helmholtz I and II filters. Remarkably high precision, nearly 100%, is achieved at an FGR of 4 for all DDM models. Furthermore, the study extends to filter anisotropy and its impact on SFS dynamics and LES accuracy. By utilizing the dynamic Smagorinsky model (DSM), dynamic mixed model (DMM), and direct deconvolution model (DDM) with anisotropic filters, aspect ratios (AR) ranging from 1 to 16 are examined in LES filters. The results emphasize the DDM’s proficiency in accurately predicting SFS stresses under highly anisotropic filtering conditions. Notably high correlation coefficients exceeding 90% are observed in the a priori study for the DDM’s reconstructed SFS stresses, surpassing those of the DSM and DMM models. However, these correlations tend to decrease as filter anisotropy increases. In the a posteriori analysis, the DDM model consistently outperforms the DSM and DMM models across various turbulence statistics, including velocity spectra, probability density functions related to vorticity, SFS energy flux, velocity increments, strainrate tensors, and SFS stress. It is evident that as filter anisotropy intensifies, the results of DSM and DMM deteriorate, while the DDM consistently delivers satisfactory outcomes across all filter-anisotropy scenarios. These findings underscore the potential of the DDM framework as a valuable tool for advancing the development of sophisticated SFS models for LES in turbulence research.Keywords: deconvolution model, large eddy simulation, subfilter scale modeling, turbulence
Procedia PDF Downloads 74
7670 Shale Gas and Oil Resource Assessment in Middle and Lower Indus Basin of Pakistan
Authors: Amjad Ali Khan, Muhammad Ishaq Saqi, Kashif Ali
Abstract:
The focus of hydrocarbon exploration in Pakistan has been primarily on conventional hydrocarbon resources. Directorate General Petroleum Concessions (DGPC) has taken the lead on the assessment of indigenous unconventional oil and gas resources, which has resulted in a ‘Shale Oil/Gas Resource Assessment Study’ conducted with the help of USAID. This was critically required in the energy-starved Pakistan, where the gap between indigenous oil & gas production and demand continues to widen for a long time. Exploration & exploitation of indigenous unconventional resources of Pakistan have become vital to meet our energy demand and reduction of oil and gas import bill of the country. This study has attempted to bridge a critical gap in geological information about the potential of shale gas & oil in Pakistan in the four formations, i.e., Sembar, Lower Goru, Ranikot and Ghazij in the Middle and Lower Indus Basins, which were selected for the study as for resource assessment for shale gas & oil. The primary objective of the study was to estimate and establish shale oil/gas resource assessment of the study area by carrying out extensive geological analysis of exploration, appraisal and development wells drilled in the Middle and Lower Indus Basins, along with identification of fairway(s) and sweet spots in the study area. The Study covers the Lower parts of the Middle Indus basins located in Sindh, southern Punjab & eastern parts of the Baluchistan provinces, with a total sedimentary area of 271,795 km2. Initially, 1611 wells were reviewed, including 1324 wells drilled through different shale formations. Based on the availability of required technical data, a detailed petrophysical analysis of 124 wells (21 Confidential & 103 in the public domain) has been conducted for the shale gas/oil potential of the above-referred formations. The core & cuttings samples of 32 wells and 33 geochemical reports of prospective Shale Formations were available, which were analyzed to calibrate the results of petrophysical analysis with petrographic/ laboratory analyses to increase the credibility of the Shale Gas Resource assessment. This study has identified the most prospective intervals, mainly in Sembar and Lower Goru Formations, for shale gas/oil exploration in the Middle and Lower Indus Basins of Pakistan. The study recommends seven (07) sweet spots for undertaking pilot projects, which will enable to evaluate of the actual production capability and production sustainability of shale oil/gas reservoirs of Pakistan for formulating future strategies to explore and exploit shale/oil resources of Pakistan including fiscal incentives required for developing shale oil/gas resources of Pakistan. Some E&P Companies are being persuaded to make a consortium for undertaking pilot projects that have shown their willingness to participate in the pilot project at appropriate times. The location for undertaking the pilot project has been finalized as a result of a series of technical sessions by geoscientists of the potential consortium members after the review and evaluation of available studies.Keywords: conventional resources, petrographic analysis, petrophysical analysis, unconventional resources, shale gas & oil, sweet spots
Procedia PDF Downloads 47
7669 Evaluation of Groundwater Suitability for Irrigation Purposes: A Case Study for an Arid Region
Authors: Mustafa M. Bob, Norhan Rahman, Abdalla Elamin, Saud Taher
Abstract:
The objective of this study was to assess the suitability of Madinah city groundwater for irrigation purposes. Of the twenty three wells that were drilled in different locations in the city for the purposes of this study, twenty wells were sampled for water quality analyses. The United States Department of Agriculture (USDA) classification of irrigation water that is based on Sodium hazard (SAR) and salinity hazard was used for suitability assessment. In addition, the residual sodium carbonate (RSC) was calculated for all samples and also used for irrigation suitability assessment. Results showed that all groundwater samples are in the acceptable quality range for irrigation based on RSC values. When SAR and salinity hazard were assessed, results showed that while all groundwater samples (except one) fell in the acceptable range of SAR, they were either in the high or very high salinity zone which indicates that care should be taken regarding the type of soil and crops in the study area.Keywords: irrigation suitability, TDS, salinity, SAR
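The indices used in this assessment are standard and easy to compute. The sketch below shows the usual SAR and RSC formulas (concentrations in meq/L) together with the USDA salinity classes based on electrical conductivity; the sample concentrations are assumed values for illustration, not data from the study's wells.
```python
from math import sqrt

def sar(na, ca, mg):
    """Sodium adsorption ratio; inputs in meq/L."""
    return na / sqrt((ca + mg) / 2.0)

def rsc(co3, hco3, ca, mg):
    """Residual sodium carbonate; inputs in meq/L.
    RSC < 1.25 meq/L is generally considered safe for irrigation."""
    return (co3 + hco3) - (ca + mg)

def salinity_class(ec_uS_cm):
    """USDA salinity hazard class from electrical conductivity (µS/cm)."""
    if ec_uS_cm < 250:
        return "C1 (low)"
    if ec_uS_cm < 750:
        return "C2 (medium)"
    if ec_uS_cm < 2250:
        return "C3 (high)"
    return "C4 (very high)"

# Illustrative well-water sample (assumed values, not data from the study)
na, ca, mg, co3, hco3, ec = 8.0, 4.0, 3.0, 0.0, 2.5, 1800
print(f"SAR = {sar(na, ca, mg):.2f}")                 # sodium hazard
print(f"RSC = {rsc(co3, hco3, ca, mg):.2f} meq/L")    # carbonate hazard
print("Salinity:", salinity_class(ec))                # C3 (high) for this sample
```
Plotting each well's SAR against its conductivity on the USDA diagram then gives the combined sodium-salinity classification reported in the abstract.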
Procedia PDF Downloads 371
7668 Predictive Analysis of the Stock Price Market Trends with Deep Learning
Authors: Suraj Mehrotra
Abstract:
The stock market is a volatile, bustling marketplace that is a cornerstone of economics. It defines whether companies are successful or in spiral. A thorough understanding of it is important - many companies have whole divisions dedicated to analysis of both their stock and of rivaling companies. Linking the world of finance and artificial intelligence (AI), especially the stock market, has been a relatively recent development. Predicting how stocks will do considering all external factors and previous data has always been a human task. With the help of AI, however, machine learning models can help us make more complete predictions in financial trends. Taking a look at the stock market specifically, predicting the open, closing, high, and low prices for the next day is very hard to do. Machine learning makes this task a lot easier. A model that builds upon itself that takes in external factors as weights can predict trends far into the future. When used effectively, new doors can be opened up in the business and finance world, and companies can make better and more complete decisions. This paper explores the various techniques used in the prediction of stock prices, from traditional statistical methods to deep learning and neural networks based approaches, among other methods. It provides a detailed analysis of the techniques and also explores the challenges in predictive analysis. For the accuracy of the testing set, taking a look at four different models - linear regression, neural network, decision tree, and naïve Bayes - on the different stocks, Apple, Google, Tesla, Amazon, United Healthcare, Exxon Mobil, J.P. Morgan & Chase, and Johnson & Johnson, the naïve Bayes model and linear regression models worked best. For the testing set, the naïve Bayes model had the highest accuracy along with the linear regression model, followed by the neural network model and then the decision tree model. The training set had similar results except for the fact that the decision tree model was perfect with complete accuracy in its predictions, which makes sense. This means that the decision tree model likely overfitted the training set when used for the testing set.Keywords: machine learning, testing set, artificial intelligence, stock analysis
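A comparison of the four model families mentioned above can be sketched with scikit-learn as below. The framing (predicting next-day direction from a window of recent returns), the synthetic random-walk prices standing in for real quotes, and the use of logistic regression in place of the paper's linear regression for this classification setup are all assumptions made for illustration.
```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Synthetic daily closing prices (geometric random walk) stand in for real quotes such as AAPL.
close = 100 * np.exp(np.cumsum(rng.normal(0.0005, 0.01, size=2000)))

window = 5
X = np.array([close[i - window:i] / close[i - window] - 1
              for i in range(window, len(close) - 1)])
y = (close[window + 1:] > close[window:-1]).astype(int)   # 1 if the next day closes higher

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, shuffle=False)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "naive Bayes": GaussianNB(),
    "decision tree": DecisionTreeClassifier(),            # tends to overfit the training set
    "neural network": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000),
}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    print(f"{name:>20}: train={accuracy_score(y_tr, m.predict(X_tr)):.2f} "
          f"test={accuracy_score(y_te, m.predict(X_te)):.2f}")
```
The gap between the decision tree's perfect training accuracy and its lower test accuracy is the overfitting behaviour the abstract describes.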
Procedia PDF Downloads 94
7667 The Students' Mathematical Competency and Attitude towards Mathematics Using the Trachtenberg Speed Math System
Authors: Marlone D. Severo
Abstract:
A pre- and post-test quasi-experimental design was used to test the intervention of the Trachtenberg Speed Math system on the mathematical competency of sixty (60) matched-paired students with poor performing grades in Mathematics from one of the biggest public national high schools in the south of Metro Manila. Both the control and experimental groups were administered the Attitude Towards Mathematics Inventory (ATMI) before the pretest was given, and both groups showed a high dislike for Mathematics. The pretest showed 53 percent accuracy for the control group and 51 percent for the experimental group on a 15-item long multiplication test taken without the aid of a computing device. The experimental group was taught how to use the Trachtenberg number-keys and multiplication techniques between October 2014 and March 2015. The post-test showed an improvement in the experimental group, with 96 percent accuracy for the experimental group and a dismal 57 percent for the control group in long multiplication. The post-test ATMI was then administered. The control group still showed a great dislike towards Mathematics, while the experimental group showed a positive attitude towards the subject.
Keywords: attitude towards mathematics, mathematical competency, number-keys, trachtenberg speed math
Procedia PDF Downloads 367
7666 Seismic Vulnerability Assessment of High-Rise Structures in Addis Ababa, Ethiopia: Implications for Urban Resilience Along the East African Rift Margin
Authors: Birhanu Abera Kibret
Abstract:
This abstract highlights findings from a seismicity study conducted in the Ethiopian Rift Valley and adjacent cities, including Semera, Adama, and Hawasa, located in Afar and the Main Ethiopian Rift system. The region experiences high seismicity, characterized by small to moderate earthquakes located in the mid-to-upper crust. Additionally, the capital city of Ethiopia, Addis Ababa, situated on the rift margin, experiences seismic activity, with small to relatively moderate earthquakes observed to the east and southeast of the city, along the rift valley. These findings underscore the seismic vulnerability of the region, emphasizing the need for comprehensive seismic risk assessment and mitigation strategies to enhance resilience and preparedness.
Keywords: seismic hazard, seismicity, crustal structure, magmatic intrusion, partial melting
Procedia PDF Downloads 66
7665 Assessment of Menus in a Selected Social Welfare Home with Regard to Nutritional Recommendations
Authors: E. Grochowska-Niedworok, K. Brukalo, B. Całyniuk, J. Piekorz, M. Kardas
Abstract:
The aim of the study was to assess the diets of residents of nursing homes. Ten-day menus provided by a social welfare home were entered into the computer program Diet 5 and analyzed with respect to protein, fats, carbohydrates, energy, vitamin D and calcium. The resulting mean values of the 10-day menus were compared with the existing Nutrition Standards for the Polish population. The analysis of the menus showed that the average amount of energy supplied by the food is not sufficient. The carbohydrate supply is too high, representing 257% of the standard. The average amounts of fats and proteins supplied with the food are adequate, at 85.2 g/day and 75.2 g/day, respectively. The calcium content of the diet is 513.9 mg/day, and the amount of vitamin D supplied in the 51-65 years age group is 2.3 µg/day. The dietary errors identified are due to the lack of detailed nutritional guidelines for nursing homes, as well as for state-owned care facilities in general.
Keywords: assessment of diet, essential nutrients, social welfare home, nutrition
Procedia PDF Downloads 150
7664 Design of the Ubiquitous Cloud Learning Management System
Authors: Panita Wannapiroon, Noppadon Phumeechanya, Sitthichai Laisema
Abstract:
This study is a research and development project intended to: 1) design the ubiquitous cloud learning management system and 2) assess the suitability of that design. Its methods are divided into two phases: phase 1 is the design of the ubiquitous cloud learning management system, and phase 2 is the assessment of the suitability of the design. The sample used in this study consisted of 25 professionals in the fields of ubiquitous cloud learning management systems and information and communication technology in education, selected using the purposive sampling method. Data were analyzed using the arithmetic mean and standard deviation. The results showed that the ubiquitous cloud learning management system consists of two main components: 1) the ubiquitous cloud learning management system server (u-Cloud LMS Server), including a cloud repository, cloud information resources, a social cloud network, cloud context awareness, cloud communication and cloud collaborative tools; and 2) the mobile client. The professionals rated the suitability of the system design in the highest range.
Keywords: learning management system, cloud computing, ubiquitous learning, ubiquitous learning management system
Procedia PDF Downloads 517
7663 Roof and Road Network Detection through Object Oriented SVM Approach Using Low Density LiDAR and Optical Imagery in Misamis Oriental, Philippines
Authors: Jigg L. Pelayo, Ricardo G. Villar, Einstine M. Opiso
Abstract:
The advances of aerial laser scanning in the Philippines have opened up entire fields of research in remote sensing and machine vision that aspire to provide accurate, timely information for the government and the public. Rapid mapping of polygonal road and roof boundaries is one such utilization, offering applications in disaster risk reduction, mitigation and development. The study uses low density LiDAR data and high resolution aerial imagery in an object-oriented approach, in which the data are subjected to a machine learning algorithm to minimize the constraints of feature extraction. Since the task is to separate one class from another in distinct regions of a multi-dimensional feature space, non-trivial computations for fitting the distributions were implemented to formulate the learned ideal hyperplane, and customized hybrid features were generated and then used to improve the classifier findings. Supplemental algorithms for filtering and reshaping object features were developed in the rule set to enhance the final product. Several advantages in terms of simplicity, applicability and process transferability are noticeable in the methodology. The algorithm was tested in different random locations of Misamis Oriental province in the Philippines, demonstrating robust performance with an overall accuracy greater than 89% and potential for semi-automation. The extracted results will become a vital requirement for decision makers, urban planners and even the commercial sector in various assessment processes.
Keywords: feature extraction, machine learning, OBIA, remote sensing
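The hyperplane-fitting step described above can be sketched as an SVM trained on per-object features derived from the LiDAR and image layers (for example, mean height above ground, image intensity and a vegetation index). The features, class set and synthetic data below are illustrative assumptions, not the study's rule set.
```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(1)

def make_objects(n, h, inten, ndvi, label):
    """Synthetic per-object features: mean nDSM height (m), image intensity, NDVI."""
    return (np.column_stack([rng.normal(h, 0.8, n),
                             rng.normal(inten, 15, n),
                             rng.normal(ndvi, 0.05, n)]),
            np.full(n, label))

X_roof, y_roof = make_objects(300, 5.0, 150, 0.15, "roof")
X_road, y_road = make_objects(300, 0.2, 110, 0.10, "road")
X_veg,  y_veg  = make_objects(300, 3.0,  90, 0.60, "vegetation")
X = np.vstack([X_roof, X_road, X_veg])
y = np.concatenate([y_roof, y_road, y_veg])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

# An RBF-kernel SVM fits a maximum-margin separating surface in the lifted feature space.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```
In the actual workflow, each "sample" is an image object produced by segmentation, and the predicted roof and road objects are then cleaned by the filtering and reshaping rules mentioned in the abstract.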
Procedia PDF Downloads 360
7662 Detecting and Thwarting Interest Flooding Attack in Information Centric Network
Authors: Vimala Rani P, Narasimha Malikarjunan, Mercy Shalinie S
Abstract:
Named Data Networking was brought forth as an instantiation of information-centric networking. Attackers can send a colossal number of spoofed Interests to take hold of the Pending Interest Table (PIT), an attack named an Interest Flooding Attack (IFA), since incoming Interests are recorded in the PITs of the intermediate routers until the corresponding Data packets are received or the time limit is exceeded. These attacks can be detrimental to network performance. Traditional IFA detection techniques are concerned with criteria such as the PIT expiration rate or the Interest satisfaction rate, which cannot reliably differentiate an IFA from legitimate bursts of traffic, and threshold-based traditional methods are easily affected by the choice of threshold values. This article proposes an accurate IFA detection mechanism based on a Multiple Feature-based Extreme Learning Machine (MF-ELM). The accuracy of attack detection can be increased by presenting the entropy of Internet names, the Interest satisfaction rate and PIT usage as features extracted for the MF-ELM classifier. Furthermore, we deploy a queue-based hostile Interest prefix mitigation mechanism. The inference from this real-time test bed is that the mechanism can help the network resist IFA with higher accuracy and efficiency.
Keywords: information-centric network, pending interest table, interest flooding attack, MF-ELM classifier, queue-based mitigation strategy
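The classifier at the core of this mechanism can be sketched as a basic extreme learning machine: a random hidden layer whose output weights are solved in one least-squares step. The feature values and class separation below are synthetic placeholders, not traffic from the paper's test bed.
```python
import numpy as np

class ELM:
    """Minimal extreme learning machine: random hidden layer, least-squares output weights."""
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # output weights in one shot
        return self

    def predict(self, X):
        return (self._hidden(X) @ self.beta > 0.5).astype(int)

# Synthetic feature vectors per monitoring window (illustrative, not the paper's data):
# [entropy of requested name prefixes, Interest satisfaction rate, PIT occupancy ratio]
rng = np.random.default_rng(2)
normal = np.column_stack([rng.normal(3.0, .3, 500), rng.normal(.9, .05, 500), rng.normal(.3, .1, 500)])
attack = np.column_stack([rng.normal(5.5, .3, 500), rng.normal(.4, .10, 500), rng.normal(.8, .1, 500)])
X = np.vstack([normal, attack])
y = np.concatenate([np.zeros(500), np.ones(500)])   # 1 = Interest flooding attack window

idx = rng.permutation(len(y))
tr, te = idx[:700], idx[700:]
model = ELM(n_hidden=40).fit(X[tr], y[tr])
print("detection accuracy:", (model.predict(X[te]) == y[te]).mean())
```
Windows flagged as attacks would then feed the queue-based prefix mitigation stage described in the abstract.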
Procedia PDF Downloads 204
7661 Transportation Mode Classification Using GPS Coordinates and Recurrent Neural Networks
Authors: Taylor Kolody, Farkhund Iqbal, Rabia Batool, Benjamin Fung, Mohammed Hussaeni, Saiqa Aleem
Abstract:
The rising threat of climate change has led to an increase in public awareness of, and care about, our collective and individual environmental impact. A key component of this impact is our use of cars and other polluting forms of transportation, but it is often difficult for an individual to know how severe this impact is. While there are applications that offer this feedback, they require manual entry of the transportation mode used for a given trip, which can be burdensome. In order to alleviate this shortcoming, data from the 2016 TRIPlab datasets have been used to train a variety of machine learning models to automatically recognize the mode of transportation. An accuracy of 89.6% is achieved using a single deep neural network model with a Gated Recurrent Unit (GRU) architecture applied directly to trip data points over 4 primary classes, namely walking, public transit, car, and bike. These results are comparable in accuracy to results achieved by others using ensemble methods and require far less computation when classifying new trips. The lack of trip context data, e.g., bus routes, bike paths, etc., and the need for only a single set of weights make this an appropriate methodology for applications hoping to reach a broad demographic and provide responsive feedback.
Keywords: classification, gated recurrent unit, recurrent neural network, transportation
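A single-GRU classifier of the kind described above can be sketched as follows. The sequence length, the per-point features (e.g., speed and heading change derived from consecutive GPS fixes) and the random stand-in data are assumptions for illustration, not the TRIPlab pipeline.
```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

SEQ_LEN, N_FEATURES, N_CLASSES = 60, 3, 4   # 60 GPS points per trip; 4 modes

model = tf.keras.Sequential([
    layers.Input(shape=(SEQ_LEN, N_FEATURES)),
    layers.Masking(mask_value=0.0),                 # trips shorter than SEQ_LEN are zero-padded
    layers.GRU(64),                                 # single recurrent layer over the trip
    layers.Dropout(0.3),
    layers.Dense(N_CLASSES, activation="softmax"),  # walking / public transit / car / bike
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Random stand-in data with the expected shapes; real inputs would be derived from GPS fixes.
X = np.random.rand(512, SEQ_LEN, N_FEATURES).astype("float32")
y = np.random.randint(0, N_CLASSES, size=512)
model.fit(X, y, validation_split=0.2, epochs=2, batch_size=32)
```
Because the trained model is a single set of weights applied directly to trip points, inference on a new trip is a single forward pass, which is the low-computation property the abstract emphasizes.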
Procedia PDF Downloads 136
7660 Single Pole-To-Earth Fault Detection and Location on the Tehran Railway System Using ICA and PSO Trained Neural Network
Authors: Masoud Safarishaal
Abstract:
Detecting the location of pole-to-earth faults is essential for the safe operation of the electrical system of the railroad. This paper aims to use a combination of evolutionary algorithms and neural networks to increase the accuracy of single pole-to-earth fault detection and location on the Tehran railroad power supply system. To this end, the Imperialist Competitive Algorithm (ICA) and Particle Swarm Optimization (PSO) are used to train the neural network to improve the accuracy and convergence of the learning process. Due to the system's nonlinearity, fault detection is an ideal application for the proposed method, and the 600 Hz harmonic ripple method is used in this paper for fault detection. The substations were simulated by considering various feeding situations of the circuit, the transformer, and typical Tehran metro parameters, including the silicon rectifier. The data required for the network learning process were gathered from the simulation results. The 600 Hz component value changes with the location of a single pole-to-earth fault; therefore, the 600 Hz components are used as inputs of the neural network, while the fault location is the output of the network. The simulation results show that the proposed methods can accurately predict the fault location.
Keywords: single pole-to-earth fault, Tehran railway, ICA, PSO, artificial neural network
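Training a neural network with a swarm-based optimizer, as this paper does, amounts to searching over the flattened weight vector with the training error as the fitness function. The sketch below shows a plain global-best PSO training a tiny 2-6-1 network on synthetic 600 Hz-feature/fault-distance pairs; the network size, swarm settings and data are assumptions, and the ICA variant would follow the same pattern with a different search heuristic.
```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic training data: 600 Hz ripple features -> fault distance (km); purely illustrative.
X = rng.uniform(0, 1, size=(200, 2))
d_true = 10 * (0.7 * X[:, 0] + 0.3 * X[:, 1])
n_in, n_hid = 2, 6
n_w = n_in * n_hid + n_hid + n_hid + 1            # all weights and biases of a 2-6-1 network

def forward(params, X):
    W1 = params[:n_in*n_hid].reshape(n_in, n_hid)
    b1 = params[n_in*n_hid:n_in*n_hid + n_hid]
    W2 = params[n_in*n_hid + n_hid:n_in*n_hid + 2*n_hid]
    b2 = params[-1]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def mse(params):
    return np.mean((forward(params, X) - d_true) ** 2)

# Global-best PSO over the flattened network parameters
n_particles, iters = 40, 300
pos = rng.normal(0, 1, (n_particles, n_w))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([mse(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()
for _ in range(iters):
    r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
    vel = 0.72 * vel + 1.49 * r1 * (pbest - pos) + 1.49 * r2 * (gbest - pos)
    pos += vel
    f = np.array([mse(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()
print("training MSE:", mse(gbest))
```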
Procedia PDF Downloads 122
7659 Forest Fire Risk Mapping Using Analytic Hierarchy Process and GIS-Based Application: A Case Study in Hua Sai District, Thailand
Authors: Narissara Nuthammachot, Dimitris Stratoulias
Abstract:
Fire is one of the main causes of environmental and ecosystem change; therefore, fire risk assessment and fire potential mapping are challenging tasks. The study area is Hua Sai district, Nakorn Sri Thammarat province, which covers part of a peat swamp forest area in which 55 fire points were reported from 2012 to 2016. Analytic Hierarchy Process (AHP) and Geographic Information System (GIS) methods were selected for this study. The fire risk map was built on these factors: elevation, slope, aspect, precipitation, distance from the river, distance from town, and land use. The results showed that the predicted fire risk areas agree with past fire events with appreciable reliability. The fire risk map can be used for the planning and management of fire-prone areas in the future.
Keywords: analytic hierarchy process, fire risk assessment, geographic information system, peat swamp forest
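The AHP step boils down to deriving criterion weights from a pairwise comparison matrix and checking its consistency before the weighted GIS overlay. The sketch below does this for four of the factors; the pairwise judgments are illustrative assumptions, not the study's expert ratings.
```python
import numpy as np

# Illustrative pairwise comparison matrix (Saaty 1-9 scale) for four of the factors:
# land use, precipitation, distance from town, slope (judgments are assumptions, not the study's).
A = np.array([
    [1,   3,   5,   7],
    [1/3, 1,   3,   5],
    [1/5, 1/3, 1,   3],
    [1/7, 1/5, 1/3, 1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = eigvals.real.argmax()
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # priority weights of the criteria

lam_max = eigvals[k].real
n = A.shape[0]
CI = (lam_max - n) / (n - 1)              # consistency index
RI = 0.90                                 # Saaty's random index for n = 4
CR = CI / RI                              # consistency ratio; CR < 0.10 is acceptable
print("weights:", np.round(w, 3), " CR =", round(CR, 3))
# The weights would then be applied to the reclassified GIS factor layers in a weighted overlay.
```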
Procedia PDF Downloads 208
7658 Automatic Identification of Pectoral Muscle
Authors: Ana L. M. Pavan, Guilherme Giacomini, Allan F. F. Alves, Marcela De Oliveira, Fernando A. B. Neto, Maria E. D. Rosa, Andre P. Trindade, Diana R. De Pina
Abstract:
Mammography is an imaging modality used worldwide to diagnose breast cancer, even in asymptomatic women. Due to its wide availability, mammograms can be used to measure breast density and to predict cancer development. Women with increased mammographic density have a four- to sixfold increase in their risk of developing breast cancer. Therefore, studies have sought to quantify mammographic breast density accurately. In clinical routine, radiologists perform image evaluations through the BIRADS (Breast Imaging Reporting and Data System) assessment. However, this method has inter- and intra-individual variability. An automatic, objective method to measure breast density could relieve the radiologist's workload by providing a first opinion. However, the pectoral muscle is a high-density tissue with characteristics similar to those of fibroglandular tissue, which makes it hard to quantify mammographic breast density automatically. Therefore, pre-processing is needed to segment the pectoral muscle, which may otherwise be erroneously quantified as fibroglandular tissue. The aim of this work was to develop an automatic algorithm to segment and extract the pectoral muscle in digital mammograms. The database consisted of thirty medio-lateral oblique digital mammograms from São Paulo Medical School. This study was developed with ethical approval from the authors' institutions and national review panels under protocol number 3720-2010. An algorithm was developed on the Matlab® platform for the pre-processing of images; it uses image processing tools to automatically segment and extract the pectoral muscle from mammograms. First, a thresholding technique was applied to remove non-biological information from the image. Then, the Hough transform was applied to find the boundary of the pectoral muscle, followed by the active contour method, whose seed was placed on the boundary found by the Hough transform. An experienced radiologist also performed the pectoral muscle segmentation manually. The manual and automatic methods were compared using the Jaccard index and Bland-Altman statistics. The comparison presented a Jaccard similarity coefficient greater than 90% for all analyzed images, showing the efficiency and accuracy of the proposed segmentation method. The Bland-Altman statistics compared both methods with respect to the area (mm²) of the segmented pectoral muscle and showed data within the 95% confidence interval, supporting the accuracy of the segmentation relative to the manual method. Thus, the method proved to be accurate and robust, segmenting rapidly and free from intra- and inter-observer variability. It is concluded that the proposed method may be used reliably to segment the pectoral muscle in digital mammography in clinical routine. Segmentation of the pectoral muscle is very important for further quantification of the fibroglandular tissue volume present in the breast.
Keywords: active contour, fibroglandular tissue, hough transform, pectoral muscle
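The segmentation itself was implemented in Matlab, but the agreement metrics used to validate it (Jaccard index between masks, Bland-Altman statistics over segmented areas) are straightforward. The Python sketch below shows both; the masks and area values are placeholders, not the study's mammograms.
```python
import numpy as np

def jaccard(a: np.ndarray, b: np.ndarray) -> float:
    """Jaccard similarity of two binary segmentation masks."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 1.0

def bland_altman(areas_auto, areas_manual):
    """Bland-Altman bias and 95% limits of agreement for segmented areas (mm^2)."""
    diff = np.asarray(areas_auto) - np.asarray(areas_manual)
    bias, sd = diff.mean(), diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Placeholder masks standing in for automatic and manual pectoral-muscle segmentations
rng = np.random.default_rng(0)
auto = np.zeros((200, 200), bool); auto[:120, :80] = True
manual = np.zeros((200, 200), bool); manual[:118, :84] = True
print("Jaccard:", round(jaccard(auto, manual), 3))

areas_auto = rng.normal(3000, 200, 30)          # 30 mammograms, areas in mm^2 (illustrative)
areas_manual = areas_auto + rng.normal(10, 50, 30)
bias, loa = bland_altman(areas_auto, areas_manual)
print(f"bias = {bias:.1f} mm^2, 95% limits of agreement = ({loa[0]:.1f}, {loa[1]:.1f})")
```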
Procedia PDF Downloads 350
7657 Multiple Intelligence Theory with a View to Designing a Classroom for the Future
Authors: Phalaunnaphat Siriwongs
Abstract:
The classroom of the 21st century is an ever-changing forum for new and innovative thoughts and ideas. With increasing technology and opportunity, students have rapid access to information that only decades ago would have taken weeks to obtain. Unfortunately, new techniques and technology are not a cure for the fundamental problems that have plagued the classroom ever since education was established. Class size has been an issue long debated in academia. While it is difficult to pinpoint an exact number, it is clear that in this case, more does not mean better. By looking into the success and pitfalls of classroom size, the true advantages of smaller classes becomes clear. Previously, one class was comprised of 50 students. Since they were seventeen- and eighteen-year-old students, it was sometimes quite difficult for them to stay focused. To help students understand and gain much knowledge, a researcher introduced “The Theory of Multiple Intelligence” and this, in fact, enabled students to learn according to their own learning preferences no matter how they were being taught. In this lesson, the researcher designed a cycle of learning activities involving all intelligences so that everyone had equal opportunities to learn.Keywords: multiple intelligences, role play, performance assessment, formative assessment
Procedia PDF Downloads 281
7656 Fall Prevention: Evidence-Based Intervention in Exercise Program Implementation for Keeping Older Adults Safe and Active
Authors: Jennifer Holbein, Maritza Wiedel
Abstract:
Background: Aging is associated with an increased risk of falls in older adults, and as a result, falls have become a public health crisis. However, the incidence of falls can be reduced through healthy aging and the implementation of a regular exercise and strengthening program. Public health and healthcare professionals endorse the use of evidence-based, exercise-focused fall interventions, but there are major obstacles to translating and disseminating research findings into healthcare practice. The purpose of this study was to assess the feasibility of an intervention, A Matter of Balance, in terms of demand, acceptability, and implementation into current exercise programs. Subjects: Seventy-five participants from rural communities, above the age of sixty, were randomized to the intervention or to an attention control based on the standardized senior fitness test. Methods: Subjects complete the intervention, which combines two components: (1) motivation and (2) fall-reducing physical activities with protocols derived from baseline strength and balance assessments. Participants (n=75) took part in the program after completing baseline functional assessments as well as evaluations of their personal knowledge, health outcomes, demand, and implementation of the intervention. After 8 weeks of the program, participants were invited to complete follow-up assessments, and the results were compared to their baseline functional analyses. Of all the participants who complete the initial assessment, approximately 80% are expected to maintain enrollment in the implemented prescription. Furthermore, those who commit to the program should show mitigation of fall risk upon completion of their final assessment.
Keywords: aging population, exercise, falls, functional assessment, healthy aging
Procedia PDF Downloads 100
7655 Application of Fuzzy TOPSIS in Evaluating Green Transportation Options for Dhaka Megacity
Authors: Md. Moniruzzaman, Thirayoot Limanond
Abstract:
As its most visible indicator, the transport system of a city shows how developed the city is. The Dhaka megacity has a mixed composition of motorized and non-motorized modes of transport, and the number of vehicles is escalating over time. This obviously imposes associated environmental costs such as air pollution and noise, which degrade the quality of life in the city. Consequently, sustainable transport, and more specifically green transport from an environmental point of view, has become a prime choice for transport professionals in order to cope with the crisis. Currently, the city authority is planning to implement sustainable transport systems that could serve the pressing demand of the present and meet future needs effectively. This study focuses on the selection and evaluation of green transportation systems among potential alternatives on a priority basis. In this paper, fuzzy TOPSIS, a multi-criteria decision method, is presented to find the highest-priority alternative. In the first step, twenty-one specific criteria for sustainability assessment are selected. In the following step, experts provide linguistic ratings for the potential alternatives with respect to the selected criteria, and the approach is used to generate aggregate scores for sustainability assessment and selection of the best alternative. In the third step, a sensitivity analysis is performed to understand the influence of the criteria weights on the decision-making process. The key strength of the fuzzy TOPSIS approach is its practical applicability, generating good quality solutions even under uncertainty.
Keywords: green transport, multi-criteria decision approach, urban transportation system, sustainability assessment, fuzzy theory, uncertainty
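The core of fuzzy TOPSIS can be sketched compactly: linguistic ratings are mapped to triangular fuzzy numbers, normalized and weighted, and each alternative is ranked by its closeness to the fuzzy positive ideal solution. The sketch below uses three hypothetical alternatives and three benefit criteria with placeholder ratings and weights; the study itself uses twenty-one criteria and expert judgments.
```python
import numpy as np

def d(m, n):
    """Vertex distance between triangular fuzzy numbers m=(a,b,c) and n=(a,b,c)."""
    return np.sqrt(((np.asarray(m) - np.asarray(n)) ** 2).mean())

# Aggregated fuzzy ratings (triangular, 0-10 scale) of 3 hypothetical alternatives
# on 3 benefit criteria; all numbers are placeholders, e.g. "good" = (5, 7, 9).
ratings = np.array([
    [[5, 7, 9], [3, 5, 7], [7, 9, 10]],   # e.g. bus rapid transit
    [[3, 5, 7], [5, 7, 9], [5, 7, 9]],    # e.g. non-motorized transport corridor
    [[1, 3, 5], [7, 9, 10], [3, 5, 7]],   # e.g. CNG fleet conversion
], dtype=float)
weights = np.array([[5, 7, 9], [3, 5, 7], [7, 9, 10]], dtype=float) / 10.0  # fuzzy criterion weights

# 1) Normalize benefit criteria by the largest upper bound, then 2) apply the fuzzy weights.
c_star = ratings[:, :, 2].max(axis=0)
norm = ratings / c_star[None, :, None]
v = norm * weights[None, :, :]

# 3) Fuzzy positive/negative ideal solutions and 4) closeness coefficients.
fpis, fnis = (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)
for i, name in enumerate(["BRT", "NMT corridor", "CNG fleet"]):
    d_plus = sum(d(v[i, j], fpis) for j in range(v.shape[1]))
    d_minus = sum(d(v[i, j], fnis) for j in range(v.shape[1]))
    print(f"{name}: closeness = {d_minus / (d_plus + d_minus):.3f}")   # higher = better
```
The sensitivity analysis in the third step of the study amounts to re-running this ranking with perturbed criterion weights and checking whether the ordering of the alternatives changes.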
Procedia PDF Downloads 289
7654 An Assessment of Digital Platforms, Student Online Learning, Teaching Pedagogies, Research and Training at Kenya College of Accounting University
Authors: Jasmine Renner, Alice Njuguna
Abstract:
The booming technological revolution is driving a change in the mode of delivery systems especially for e-learning and distance learning in higher education. The report and findings of the study; an assessment of digital platforms, student online learning, teaching pedagogies, research and training at Kenya College of Accounting University (hereinafter 'KCA') was undertaken as a joint collaboration project between the Carnegie African Diaspora Fellowship and input from the staff, students and faculty at KCA University. The participants in this assessment/research met for selected days during a six-week period during which, one-one consultations, surveys, questionnaires, foci groups, training, and seminars were conducted to ascertain 'online learning and teaching, curriculum development, research and training at KCA.' The project was organized into an eight-week project workflow with each week culminating in project activities designed to assess digital online teaching and learning at KCA. The project also included the training of distance learning instructors at KCA and the evaluation of KCA’s distance platforms and programs. Additionally, through a curriculum audit and redesign, the project sought to enhance the curriculum development activities related to of distance learning at KCA. The findings of this assessment/research represent the systematic deliberate process of gathering, analyzing and using data collected from DL students, DL staff and lecturers and a librarian personnel in charge of online learning resources and access at KCA. We engaged in one-on-one interviews and discussions with staff, students, and faculty and collated the findings to inform practices that are effective in the ongoing design and development of eLearning earning at KCA University. Overall findings of the project led to the following recommendations. First, there is a need to address infrastructural challenges that led to poor internet connectivity for online learning, training needs and content development for faculty and staff. Second, there is a need to manage cultural impediments within KCA; for example fears of vital change from one platform to another for effectiveness and Institutional goodwill as a vital promise of effective online learning. Third, at a practical and short-term level, the following recommendations based on systematic findings of the research conducted were as follows: there is a need for the following to be adopted at KCA University to promote the effective adoption of online learning: a) an eLearning compatible faculty lab, b) revision of policy to include an eLearn strategy or strategic management, c) faculty and staff recognitions engaged in the process of training for the adoption and implementation of eLearning and d) adequate website resources on eLearning. The report and findings represent a comprehensive approach to a systematic assessment of online teaching and learning, research and training at KCA.Keywords: e-learning, digital platforms, student online learning, online teaching pedagogies
Procedia PDF Downloads 189
7653 Calculate Product Carbon Footprint through the Internet of Things from Network Science
Authors: Jing Zhang
Abstract:
Reducing the carbon footprint of mankind and becoming more sustainable is one of the major challenges of our era. The Internet of Things (IoT) mainly addresses three kinds of connection: Things to Things (T2T), Human to Things (H2T), and Human to Human (H2H). Borrowing this classification from the IoT, the carbon footprints of industries can also be divided in these three ways. Therefore, monitoring the routes along which products are generated and circulated may help calculate the product carbon footprint. This paper does not consider any technique used by the IoT itself; rather, its ideas are used to look at the connections between products. Carbon footprints are like a gene or mark of a product, from raw materials to the final product, which never leaves the product. The contribution of this paper is to combine the characteristics of the IoT and the methodology of network science to find a way to calculate the product carbon footprint. Life cycle assessment (LCA) remains the traditional and main tool for calculating the carbon footprint of products.
Keywords: product carbon footprint, Internet of Things, network science, life cycle assessment
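In the spirit of tracking generation and circulation routes, a product's supply chain can be modeled as a directed graph whose nodes carry their own direct emissions, with the footprint accumulated along the edges toward the final product. The sketch below uses networkx with an invented tree-shaped chain and placeholder emission values; inputs shared between branches would additionally need allocation factors, which are not shown.
```python
import networkx as nx

# Directed supply-chain graph: edges point from an input to the stage that consumes it.
# The node attribute "direct" holds each process's own emissions (kg CO2-eq per unit of
# final product); all structure and values are illustrative assumptions.
G = nx.DiGraph()
G.add_nodes_from([
    ("raw_material_A", {"direct": 2.0}),
    ("raw_material_B", {"direct": 1.5}),
    ("component_A",    {"direct": 0.5}),
    ("component_B",    {"direct": 0.8}),
    ("assembly",       {"direct": 0.3}),
    ("distribution",   {"direct": 0.4}),
])
G.add_edges_from([
    ("raw_material_A", "component_A"),
    ("raw_material_B", "component_B"),
    ("component_A", "assembly"),
    ("component_B", "assembly"),
    ("assembly", "distribution"),
])

# Accumulate the footprint in topological order: each stage carries its own emissions
# plus those of everything upstream, like a mark that never leaves the product.
footprint = {}
for node in nx.topological_sort(G):
    footprint[node] = G.nodes[node]["direct"] + sum(
        footprint[p] for p in G.predecessors(node))
print(footprint["distribution"], "kg CO2-eq per unit of final product")   # 5.5 here
```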
Procedia PDF Downloads 114