Search results for: integrated models of reading comprehension
9587 Measurement and Monitoring of Graduate Attributes via iCGPA Implementation and ACADEMIA Programming: UNIMAS Case Study
Authors: Shanti Faridah Salleh, Azzahrah Anuar, Hamimah Ujir, Rohana Sapawi, Wan Hashim Wan Ibrahim, Noraziah Abdul Wahab, Majina Sulaiman, Raudhah Ahmadi, Al-Khalid Othman, Johari Abdullah
Abstract:
Integrated Cumulative Grade Point Average (iCGPA) is an evaluation and reporting system that represents the comprehensive development of students' achievement in their academic programmes. Universiti Malaysia Sarawak (UNIMAS) began implementing iCGPA in 2016. iCGPA is driven by the Outcome-Based Education (OBE) system that has long been integrated into higher education in Malaysia. iCGPA is not only a tool to enhance the OBE concept through constructive alignment but also an integrated mechanism to assist various stakeholders in making decisions or planning for programme improvement. The outcome of this integrated system is the reporting of students' academic performance in terms of the cognitive (knowledge), psychomotor (skills), and affective (attitude) domains that students acquire throughout the duration of their study. The iCGPA report illustrates the attainment of student attributes in the eight domains of learning outcomes listed in the Malaysian Qualifications Framework (MQF). This paper discusses the implementation of iCGPA in UNIMAS, covering the policy and strategy used to direct the whole university to implement it. The steps and challenges in integrating the existing Outcome-Based Education system and utilising iCGPA as a tool to quantify students' achievement are also highlighted. Finally, ACADEMIA, a dedicated centralised system developed to ensure the successful implementation of iCGPA, is presented. The paper discusses the structure and analysis of the ACADEMIA system and concludes with the improvements made to the implementation of constructive alignment in all 40 programmes involved in the iCGPA rollout.
Keywords: constructive alignment, holistic graduates, mapping of assessment, programme outcome
Procedia PDF Downloads 208
9586 A Pedagogical Case Study on Consumer Decision Making Models: A Selection of Smart Phone Apps
Authors: Yong Bum Shin
Abstract:
This case focuses on the weighted additive difference, conjunctive, disjunctive, and elimination-by-aspects methodologies in consumer decision-making models, and on the Simple Additive Weighting (SAW) approach from the multi-criteria decision-making (MCDM) area. In most decision-making models, the rank reversal phenomenon is unpreventable. This paper shows that rank reversal occurs in popular managerial methods such as Weighted Additive Difference (WAD), the Conjunctive Method, the Disjunctive Method, and Elimination by Aspects (EBA), as well as in MCDM methods such as Simple Additive Weighting (SAW). Finally, the Unified Commensurate Multiple (UCM) model is presented, which successfully addresses these rank reversal problems in the most popular MCDM methods.
Keywords: multiple criteria decision making, rank inconsistency, unified commensurate multiple, analytic hierarchy process
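The rank reversal the abstract refers to is easy to reproduce. The sketch below is a minimal, hypothetical illustration of SAW with max-normalisation (not the paper's UCM model or its data): adding a third alternative changes the normalisation constants and flips the order of the original two.

```python
# Toy illustration of Simple Additive Weighting (SAW) and rank reversal:
# adding a new alternative can swap the ranks of the existing ones.
# All scores and weights are invented for illustration.

def saw_scores(alternatives, weights):
    """Score each alternative by dividing each (benefit) criterion by the
    column maximum, then taking the weighted sum."""
    n_crit = len(weights)
    col_max = [max(alt[j] for alt in alternatives.values()) for j in range(n_crit)]
    return {
        name: sum(w * (alt[j] / col_max[j]) for j, w in enumerate(weights))
        for name, alt in alternatives.items()
    }

def ranking(scores):
    return sorted(scores, key=scores.get, reverse=True)

weights = [0.5, 0.5]
alts = {"A": (8, 3), "B": (5, 6)}
before = ranking(saw_scores(alts, weights))   # B beats A here

# Add a new alternative that is strong only on the second criterion ...
alts["C"] = (1, 30)
after = ranking(saw_scores(alts, weights))

# ... the column maxima used for normalisation change, and the relative
# order of A and B flips, even though neither of them changed.
print(before, [x for x in after if x != "C"])
```

Running the sketch shows `['B', 'A']` before and `['A', 'B']` after, which is exactly the rank reversal phenomenon the UCM model is designed to avoid.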
Procedia PDF Downloads 81
9585 A Comparative Evaluation of the SIR and SEIZ Epidemiological Models to Describe the Diffusion Characteristics of COVID-19 Polarizing Viewpoints on Online
Authors: Maryam Maleki, Esther Mead, Mohammad Arani, Nitin Agarwal
Abstract:
This study was conducted to examine how opposing viewpoints related to COVID-19 were diffused on Twitter. To accomplish this, six datasets were analyzed using two epidemiological models, SIR (Susceptible, Infected, Recovered) and SEIZ (Susceptible, Exposed, Infected, Skeptics). The six datasets were chosen because they represent opposing viewpoints on the COVID-19 pandemic. Three of the datasets contain anti-subject hashtags, while the other three contain pro-subject hashtags. The time frame for all datasets is three years, from January 2020 to December 2022. The findings revealed that while both models were effective in evaluating the propagation trends of these polarizing viewpoints, the SEIZ model was more accurate, with a relatively lower error rate (6.7%) compared to the SIR model (17.3%). Additionally, the relative error for both models was lower for anti-subject hashtags compared to pro-subject hashtags. By leveraging epidemiological models, insights into the propagation trends of polarizing viewpoints on Twitter were gained. This study paves the way for the development of methods to prevent the spread of ideas that lack scientific evidence while promoting the dissemination of scientifically backed ideas.
Keywords: mathematical modeling, epidemiological model, SEIZ model, SIR model, COVID-19, Twitter, social network analysis, social contagion
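For readers unfamiliar with the SIR machinery underlying the comparison, a minimal discrete-time sketch is given below. The rates `beta` and `gamma` are illustrative values, not the fitted parameters from the study.

```python
# Minimal discrete-time SIR (Susceptible, Infected, Recovered) sketch of
# diffusion, integrated with a forward-Euler step. Parameters are
# illustrative, not fitted values from the paper.

def simulate_sir(s0, i0, r0, beta, gamma, steps, dt=1.0):
    """Forward-Euler integration of the SIR equations:

        dS/dt = -beta * S * I / N
        dI/dt =  beta * S * I / N - gamma * I
        dR/dt =  gamma * I
    """
    n = s0 + i0 + r0
    s, i, r = float(s0), float(i0), float(r0)
    history = [(s, i, r)]
    for _ in range(steps):
        new_inf = beta * s * i / n * dt   # new "infections" (adoptions)
        new_rec = gamma * i * dt          # new "recoveries" (drop-offs)
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        history.append((s, i, r))
    return history

hist = simulate_sir(s0=9990, i0=10, r0=0, beta=0.4, gamma=0.1, steps=200)
peak_i = max(i for _, i, _ in hist)
print(f"peak infected: {peak_i:.0f}, final recovered: {hist[-1][2]:.0f}")
```

The SEIZ model used in the paper adds Exposed and Skeptic compartments to this same template, which is what gives it the extra flexibility behind its lower error rate.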
Procedia PDF Downloads 62
9584 From Customer Innovations to Manufactured Products: A Project Outlook
Authors: M. Holle, M. Roth, M. R. Gürtler, U. Lindemann
Abstract:
This paper gives insights (in the form of an outlook) into the research project "InnoCyFer", which is funded by the German Federal Ministry of Economics and Technology. The main objectives of the project are to enable integrated, customer-individual product design and the flexible manufacturing of the resulting products. To achieve this, a web-based open innovation platform containing an integrated toolkit will be developed. This toolkit enables the active integration of the customer's creativity and innovation potential into the product development process. Furthermore, the project will show the opportunities and possibilities of customer-individualized products by building and examining the continuous process from innovation by the customers through to the flexible manufacturing of individual products.
Keywords: customer individual product design, innovation networks, open innovation, open innovation platform, toolkit
Procedia PDF Downloads 314
9583 Comparative Sustainability Performance Analysis of Australian Companies Using Composite Measures
Authors: Ramona Zharfpeykan, Paul Rouse
Abstract:
Organizational sustainability is important to both organizations themselves and their stakeholders. Despite its increasing popularity and the growing number of organizations reporting on sustainability, research on evaluating and comparing the sustainability performance of companies is limited. The aim of this study was to develop models to measure sustainability performance for both cross-sectional and longitudinal comparisons across companies in the same or different industries. A secondary aim was to examine whether sustainability reports can be used to evaluate sustainability performance. The study used both a content analysis of Australian sustainability reports in mining and metals and financial services for 2011-2014 and a survey of Australian and New Zealand organizations. Two methods, ranging from a composite index using uniform weights to data envelopment analysis (DEA), were employed to analyze the data and develop the models. The results show strong, statistically significant relationships between the developed models, which suggests that each model provides a consistent, systematic and reasonably robust analysis. The results also show that, for both industries, companies with sustainability scores above or below the industry average stayed almost the same during the study period. These indices and models can be used by companies to evaluate their sustainability performance and compare it with previous years, or with other companies in the same or different industries. They can also be used by various stakeholders and sustainability ranking bodies such as the Global Reporting Initiative (GRI).
Keywords: data envelopment analysis, sustainability, sustainability performance measurement system, sustainability performance index, global reporting initiative
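A uniform-weight composite index of the kind described can be sketched in a few lines: min-max normalise each indicator across companies, then average. The indicator values and company names below are hypothetical, not data from the study.

```python
# Sketch of a uniform-weight composite sustainability index: min-max
# normalise each indicator across companies, then take the plain average.
# All values are hypothetical.

def composite_index(companies):
    """companies: {name: [indicator values]} -> {name: score in [0, 1]}."""
    names = list(companies)
    n_ind = len(next(iter(companies.values())))
    lo = [min(companies[c][j] for c in names) for j in range(n_ind)]
    hi = [max(companies[c][j] for c in names) for j in range(n_ind)]

    def norm(v, j):
        # Guard against a constant column (all companies identical).
        return 0.0 if hi[j] == lo[j] else (v - lo[j]) / (hi[j] - lo[j])

    return {
        c: sum(norm(v, j) for j, v in enumerate(companies[c])) / n_ind
        for c in names
    }

scores = composite_index({
    "MinerA": [70, 55, 80],   # e.g. environmental, social, governance
    "MinerB": [60, 75, 65],
    "BankC":  [85, 60, 70],
})
print(max(scores, key=scores.get))
```

DEA, the study's second method, replaces the fixed uniform weights with company-specific weights chosen by a linear program, which is why the paper checks that the two methods rank companies consistently.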
Procedia PDF Downloads 181
9582 A Study of High Viscosity Oil-Gas Slug Flow Using Gamma Densitometer
Authors: Y. Baba, A. Archibong-Eso, H. Yeung
Abstract:
Experimental studies of high-viscosity oil-gas flows in horizontal pipelines published in the literature indicate that hydrodynamic slug flow is the dominant flow pattern observed. Investigations have shown that hydrodynamic slugging brings about high pressure instabilities that can damage production facilities, making it essential to study the high-viscosity slug flow regime in order to improve understanding of its flow dynamics. Most slug flow models used in the petroleum industry for the design of pipelines, together with their closure relationships, were formulated from observations of low-viscosity liquid-gas flows. New experimental investigations and data are therefore required to validate these models. In cases where these models underperform, improving upon them or building new predictive models and correlations will also depend on new experimental datasets and further understanding of the flow dynamics of high-viscosity oil-gas flows. In this study, conducted at the flow laboratory of the Oil and Gas Engineering Centre of Cranfield University, slug flow variables such as pressure gradient, mean liquid holdup, frequency and slug length for oil viscosities ranging from 1.0 to 5.5 Pa.s are experimentally investigated and analysed. The study was carried out in a 0.076 m ID pipe; two fast-sampling gamma densitometers and pressure transducers (differential and point) were used to obtain experimental measurements. Comparison of the measured slug flow parameters with existing slug flow prediction models in the literature showed disagreement with the high-viscosity experimental data, highlighting the importance of building new predictive models and correlations.
Keywords: gamma densitometer, mean liquid holdup, pressure gradient, slug frequency, slug length
Procedia PDF Downloads 329
9581 Achieving Net Zero Energy Building in a Hot Climate Using Integrated Photovoltaic and Parabolic Trough Collectors
Authors: Adel A. Ghoneim
Abstract:
In most existing buildings in hot climates, cooling loads lead to high primary energy consumption and consequently high CO2 emissions. These can be substantially decreased with integrated renewable energy systems. Kuwait is characterized by a dry, hot, long summer and a short, warm winter. Kuwait receives annual total radiation of more than 5280 MJ/m2 with approximately 3347 h of sunshine. Solar energy systems consisting of PV modules and parabolic trough collectors are considered to satisfy the electricity consumption, domestic water heating, and cooling loads of an existing building. This paper presents the results of an extensive program of energy conservation and energy generation using integrated photovoltaic (PV) modules and parabolic trough collectors (PTC). The program was conducted on an existing institutional building with the intention of converting it into a Net-Zero Energy Building (NZEB) or near-net-Zero Energy Building (nNZEB). The program consists of two phases: the first phase is concerned with energy auditing and energy conservation measures at minimum cost, and the second phase considers the installation of photovoltaic modules and parabolic trough collectors. The 2-storey building under consideration is the Applied Sciences Department at the College of Technological Studies, Kuwait. Single-effect lithium bromide water absorption chillers are implemented to provide the air conditioning load of the building. A numerical model is developed to evaluate the performance of parabolic trough collectors in the Kuwaiti climate. The transient simulation program TRNSYS is adopted to simulate the performance of the different solar system components. In addition, a numerical model is developed to assess the environmental impacts of building-integrated renewable energy systems. Results indicate that efficient energy conservation can play an important role in converting existing buildings into NZEBs, as it saves a significant portion of the annual energy consumption of the building.
The first phase results in an energy saving of about 28% of the building's consumption. In the second phase, the integrated PV completely covers the lighting and equipment loads of the building. On the other hand, parabolic trough collectors with an optimum area of 765 m2 can satisfy a significant portion of the cooling load, i.e., about 73% of the total building cooling load. The annual avoided CO2 emission is evaluated at the optimum conditions to assess the environmental impact of the renewable energy systems. The total annual avoided CO2 emission is about 680 metric tons/year, which confirms the environmental benefit of these systems in Kuwait.
Keywords: building integrated renewable systems, net-zero energy building, solar fraction, avoided CO2 emission
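The avoided-emission figure is essentially annual displaced energy multiplied by a grid emission factor. The sketch below shows that arithmetic with assumed inputs; neither the displaced energy nor the emission factor is reported in the abstract, so both numbers are purely illustrative.

```python
# Back-of-the-envelope avoided-CO2 check: displaced grid energy times a
# grid emission factor. Both inputs below are ASSUMED values chosen only
# to illustrate the calculation, not figures from the paper.

def avoided_co2_tons(annual_energy_kwh, grid_factor_kg_per_kwh):
    """Annual avoided CO2 in metric tons (kg -> t conversion by /1000)."""
    return annual_energy_kwh * grid_factor_kg_per_kwh / 1000.0

# Assumed: 1,000,000 kWh/year displaced at 0.68 kg CO2 per kWh.
tons = avoided_co2_tons(1_000_000, 0.68)
print(f"{tons:.0f} t CO2/year")
```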
Procedia PDF Downloads 611
9580 Internationalization and Multilingualism in Brazil: Possibilities of Content and Language Integrated Learning and Intercomprehension Approaches
Authors: Kyria Rebeca Finardi
Abstract:
The study discusses the role of foreign languages in general, and of English in particular, in the process of internationalization of higher education (IHE), defined as the intentional integration of an international, intercultural or global dimension into the purpose, function or delivery of higher education. The study is bibliographical and offers a brief outline of the current political, economic and educational scenarios in Brazil before discussing some possibilities and challenges for the development of multilingualism and IHE there. The theoretical background includes a review of Brazilian language and internationalization policies. The review and discussion conclude that the Content and Language Integrated Learning (CLIL) approach and the Intercomprehension approach to foreign language teaching/learning are relevant alternatives for fostering multilingualism in that context.
Keywords: Brazil, higher education, internationalization, multilingualism
Procedia PDF Downloads 155
9579 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption
Authors: Waziri Victor Onomza, John K. Alhassan, Idris Ismaila, Noel Dogonyaro Moses
Abstract:
This paper describes the problem of building secure computational services for encrypted information in cloud computing without decrypting the encrypted data; it thereby addresses the need for a computational encryption model that can enhance the security of big data with respect to the privacy, confidentiality, and availability of users' data. The cryptographic model applied to the computational processing of encrypted data is the Fully Homomorphic Encryption scheme. We contribute theoretical presentations of high-level computational processes, based on number theory and algebra, that can easily be integrated and leveraged in cloud computing, together with detailed mathematical concepts underlying fully homomorphic encryption models. This contribution supports the full implementation of a big-data-analytics-based cryptographic security algorithm.
Keywords: big data analytics, security, privacy, bootstrapping, homomorphic, homomorphic encryption scheme
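Fully homomorphic encryption is beyond a short snippet, but the homomorphic property the paper relies on (operating on ciphertexts yields the encryption of an operation on the plaintexts) can be shown with toy, unpadded RSA, which is homomorphic for multiplication only. This is an insecure illustration of the concept, not the FHE scheme discussed in the paper.

```python
# Toy demonstration of a homomorphic property: unpadded RSA multiplies
# plaintexts when you multiply ciphertexts. Insecure and multiplicative
# only; a *fully* homomorphic scheme supports addition AND multiplication.

def rsa_toy_keys():
    p, q = 61, 53                 # toy primes; never this small in practice
    n, phi = p * q, (p - 1) * (q - 1)
    e = 17                        # public exponent, coprime to phi
    d = pow(e, -1, phi)           # private exponent (modular inverse)
    return (e, n), (d, n)

def encrypt(m, pub): e, n = pub; return pow(m, e, n)
def decrypt(c, prv): d, n = prv; return pow(c, d, n)

pub, prv = rsa_toy_keys()
c1, c2 = encrypt(6, pub), encrypt(7, pub)

# Multiply the ciphertexts without ever decrypting them:
product = decrypt(c1 * c2 % pub[1], prv)
print(product)  # 42 == 6 * 7, recovered from ciphertext-only arithmetic
```

This is the essence of computing on encrypted data in the cloud: the server manipulates only `c1` and `c2`, yet the client decrypts a meaningful result.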
Procedia PDF Downloads 380
9578 Wind Fragility of Window Glass in 10-Story Apartment with Two Different Window Models
Authors: Viriyavudh Sim, WooYoung Jung
Abstract:
Damage due to high wind is not limited to load-resisting components such as beams and columns. The majority of damage is due to breaches in the building envelope, such as broken roofs, windows, and doors. In this paper, the wind fragility of window glass in a residential apartment was determined in order to compare two window configuration models. The Monte Carlo simulation method was used to derive damage data, and analytical fragilities were constructed. The fragility of the window system showed that windows located in the leeward wall had a higher probability of failure, especially those close to the edge of the structure. Between the two window models, Model 2 had a higher probability of failure; this was due to the number of panels in this configuration.
Keywords: wind fragility, glass window, high rise building, wind disaster
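A Monte Carlo fragility estimate of the kind described can be sketched as follows: at each wind speed, sample random glass capacities and wind pressure demands and count failures. All distribution parameters are invented for illustration and are not the paper's values.

```python
# Monte Carlo sketch of a wind fragility curve for window glass: sample
# capacity vs. demand at each wind speed and estimate P(failure).
# Every distribution parameter below is an ASSUMED, illustrative value.

import random

def failure_probability(wind_speed, n_sim=20_000, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_sim):
        # Capacity: lognormal glass resistance in Pa (assumed parameters).
        capacity = rng.lognormvariate(7.8, 0.25)
        # Demand: quasi-static pressure q = 0.5 * rho * Cp * V^2, with a
        # noisy pressure coefficient Cp (assumed parameters).
        cp = rng.gauss(1.2, 0.15)
        demand = 0.5 * 1.25 * cp * wind_speed ** 2
        failures += demand > capacity
    return failures / n_sim

# Probability of failure rises with wind speed -> a fragility curve.
fragility = {v: failure_probability(v) for v in (20, 40, 60, 80)}
print(fragility)
```

In the paper this per-speed failure probability would be computed separately for each window configuration and wall zone, which is what exposes the leeward-edge vulnerability.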
Procedia PDF Downloads 257
9577 Social Perception of the Benefits of Using a Solar Dryer to Conserve Fruits and Vegetables in Rural Communities in Manica - Mozambique
Authors: Constâncio Augusto Machanguana, Luís Miguel Estevão Cristóvão
Abstract:
In Mozambique, over 80% of the rural population relies on agriculture, livestock, and silviculture for their livelihoods. Unfortunately, these communities face persistent food shortages, which are exacerbated by natural disasters and post-harvest losses due to inadequate storage facilities. Addressing post-harvest loss is critical not only for ensuring food security but also for preventing financial hardships faced by farmers. The study delves into the perceptions of beneficiary communities regarding the construction of three food dryer models made from metal, wood, and clay brick. These solar dryers are part of the project titled 'Solar Dryer Integrated with Natural Rocks as Energy Storage for Drying Fruits and Vegetables in Mozambique'. The overarching goal is to enhance food availability beyond the typical growing season, particularly for fruits and vegetables, while simultaneously combating hunger. Given the context of climate change impacts on agriculture, this project becomes even more relevant. Structured interviews conducted with 45 members of beneficiary associations in Manica Province, primarily female heads of households, revealed that rural communities are aware of various food drying alternatives. However, reliance on traditional methods often comes at a cost: compromised product quality and reduced shelf life. To address these challenges, the project implemented energy storage solutions like rock-based thermal energy storage for food drying. This result underscores the urgent need to foster innovation and extend these sustainable practices, such as solar dryers integrated with thermal energy storage systems made of locally abundant and affordable materials, to more local communities, especially those with significant agricultural potential within the country. By taking these actions, we can improve food security and alleviate hunger.
Keywords: solar dryer, food security, rural community, small technology
Procedia PDF Downloads 30
9576 Non-Linear Causality Inference Using BAMLSS and Bi-CAM in Finance
Authors: Flora Babongo, Valerie Chavez
Abstract:
Inferring causality from observational data is one of the fundamental problems, especially in quantitative finance. So far, most papers analyze additive noise models with linearity, nonlinearity, or Gaussian noise. We fill the gap by providing a nonlinear and non-Gaussian causal multiplicative noise model that aims to distinguish the cause from the effect using a two-step method based on Bayesian additive models for location, scale and shape (BAMLSS) and on causal additive models (CAM). We tested our method on simulated and real data and reached an accuracy of 0.86 on average. As real data, we considered the causality between financial indices such as the S&P 500, Nasdaq, CAC 40 and Nikkei, and companies' log-returns. Our results can be useful for inferring causality when the data is heteroskedastic or non-injective.
Keywords: causal inference, DAGs, BAMLSS, financial index
Procedia PDF Downloads 151
9575 RAPDAC: Role Centric Attribute Based Policy Driven Access Control Model
Authors: Jamil Ahmed
Abstract:
Access control models decide whether a user should be granted or denied access to the user's requested activity. Various access control models have been established and proposed. The most prominent of these include role-based, attribute-based, and policy-based access control models, as well as the role-centric attribute-based access control model. In this paper, a novel access control model called the Role-centric Attribute-based Policy-Driven Access Control (RAPDAC) model is presented. RAPDAC incorporates the concept of "policy" into the role-centric attribute-based access control model. It leverages the concept of policy by precisely combining the evaluation of conditions, attributes, permissions and roles in order to authorize access. This approach allows the access control policy of a real-time application to be captured in a well-defined manner. The RAPDAC model allows access decisions to be made at a much finer granularity, as illustrated by the case study of a real-time library information system.
Keywords: authorization, access control model, role based access control, attribute based access control
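A minimal sketch of the RAPDAC idea, combining a role check, attribute predicates, and explicit policy rules, might look as follows. The roles, attributes, and rules are hypothetical, loosely modelled on a library information system as in the paper's case study; they are not the paper's actual policy.

```python
# Sketch of a role-centric, attribute-based, policy-driven access check:
# a request is granted only if some policy rule matches the role, action
# and resource AND its attribute condition holds. All rules are invented.

POLICY = [
    # (role, action, resource, condition over the requester's attributes)
    ("librarian", "checkout", "book", lambda a: True),
    ("member", "checkout", "book",
     lambda a: a["fines_due"] == 0 and a["books_on_loan"] < 5),
    ("member", "read", "journal", lambda a: a["membership"] == "active"),
]

def is_permitted(role, action, resource, attributes):
    """Grant iff a rule matches the (role, action, resource) triple and
    its attribute condition evaluates to True."""
    return any(
        r == role and act == action and res == resource and cond(attributes)
        for r, act, res, cond in POLICY
    )

ok = is_permitted("member", "checkout", "book",
                  {"fines_due": 0, "books_on_loan": 2})
blocked = is_permitted("member", "checkout", "book",
                       {"fines_due": 3, "books_on_loan": 2})
print(ok, blocked)
```

The finer granularity the paper claims comes from the attribute conditions: two users with the same role can receive different decisions for the same action.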
Procedia PDF Downloads 159
9574 On Stochastic Models for Fine-Scale Rainfall Based on Doubly Stochastic Poisson Processes
Authors: Nadarajah I. Ramesh
Abstract:
Much of the research on stochastic point process models for rainfall has focused on Poisson cluster models constructed from either the Neyman-Scott or Bartlett-Lewis processes. The doubly stochastic Poisson process provides a rich class of point process models, especially for fine-scale rainfall modelling. This paper provides an account of recent development on this topic and presents the results based on some of the fine-scale rainfall models constructed from this class of stochastic point processes. Amongst the literature on stochastic models for rainfall, greater emphasis has been placed on modelling rainfall data recorded at hourly or daily aggregation levels. Stochastic models for sub-hourly rainfall are equally important, as there is a need to reproduce rainfall time series at fine temporal resolutions in some hydrological applications. For example, the study of climate change impacts on hydrology and water management initiatives requires the availability of data at fine temporal resolutions. One approach to generating such rainfall data relies on the combination of an hourly stochastic rainfall simulator, together with a disaggregator making use of downscaling techniques. Recent work on this topic adopted a different approach by developing specialist stochastic point process models for fine-scale rainfall aimed at generating synthetic precipitation time series directly from the proposed stochastic model. One strand of this approach focused on developing a class of doubly stochastic Poisson process (DSPP) models for fine-scale rainfall to analyse data collected in the form of rainfall bucket tip time series. In this context, the arrival pattern of rain gauge bucket tip times N(t) is viewed as a DSPP whose rate of occurrence varies according to an unobserved finite state irreducible Markov process X(t). 
Since the likelihood function of this process can be obtained by conditioning on the underlying Markov process X(t), the models were fitted with maximum likelihood methods. The proposed models were applied directly to the raw data collected by tipping-bucket rain gauges, thus avoiding the need to convert tip-times to rainfall depths prior to fitting the models. One advantage of this approach was that the use of maximum likelihood methods enables a more straightforward estimation of parameter uncertainty and comparison of sub-models of interest. Another strand of this approach employed the DSPP model for the arrivals of rain cells and attached a pulse or a cluster of pulses to each rain cell. Different mechanisms for the pattern of the pulse process were used to construct variants of this model. We present the results of these models when they were fitted to hourly and sub-hourly rainfall data. The results of our analysis suggest that the proposed class of stochastic models is capable of reproducing the fine-scale structure of the rainfall process, and hence provides a useful tool in hydrological modelling.
Keywords: fine-scale rainfall, maximum likelihood, point process, stochastic model
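A DSPP of the kind described, a Poisson process whose rate is modulated by an unobserved two-state Markov chain, can be simulated directly with competing exponential clocks. The rates below are illustrative, not fitted values from the paper.

```python
# Sketch of a doubly stochastic Poisson process (Markov-modulated Poisson
# process) for bucket-tip times: a hidden two-state chain X(t) switches
# between a low "dry-ish" rate and a high "wet" rate. Rates are invented.

import random

def simulate_dspp(t_end, rates=(0.1, 5.0), switch=(0.05, 0.5), seed=7):
    """Simulate tip times on [0, t_end) via competing exponential clocks.

    rates[k]  : tip arrival rate while X(t) = k (per minute, assumed)
    switch[k] : rate of leaving state k
    """
    rng = random.Random(seed)
    t, state, tips = 0.0, 0, []
    while True:
        total = rates[state] + switch[state]
        t += rng.expovariate(total)            # time to the next event
        if t >= t_end:
            return tips
        if rng.random() < rates[state] / total:
            tips.append(t)                     # the event is a bucket tip
        else:
            state = 1 - state                  # the event is a regime switch

tips = simulate_dspp(t_end=10_000)
print(f"{len(tips)} tips, mean inter-tip time {tips[-1] / len(tips):.2f} min")
```

Fitting, as the abstract notes, goes the other way: the hidden state path is marginalised out of the likelihood of the observed tip times, which is what makes maximum likelihood estimation tractable.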
Procedia PDF Downloads 278
9573 Exploring Heidegger’s Fourfold through Architecture-Dwelling for Imaginary Fictional Characters in Drawings
Authors: Hassan Wajid
Abstract:
The architecture design studio, with all its accoutrements, especially pedagogies, has been committed to awakening students to the true meaning of the concept of dwelling. The real task is how to make them unlearn the association of "dwelling" with a rented or owned accommodation by the road, a car parked in front of a garage door, and replace it with the fundamental experiential-phenomenological manifestations of light, space, gravity and time, through assigned readings and small theoretical challenges resulting in drawings and models. The primary challenge for teachers remained introducing the act, or desire, of dwelling philosophically. The academic link was offered by Albert Hofstadter's translation Poetry, Language, Thought, through which Martin Heidegger's fourfold concept in 'Building Dwelling Thinking' primarily served to guide us along this trajectory, helping to build an intellectual framework as justification of the term "dwelling" in its various meanings. Gaston Bachelard's The Poetics of Space and Merleau-Ponty's Phenomenology of Perception were also assigned as readings. Four fictional characters created by two master short story writers, Guy de Maupassant and O. Henry, were introduced as dweller-clients in search of their respective dwellings, as drawn imaginations in the studio fourfold of light, space, gravity, and time, while at the same time aspiring to a thorough understanding of Heidegger's fourfold of earth, sky, divinities and mortals, each dwelling asserting its place in the corresponding story and the unique character of its dweller.
Keywords: dwelling, imagination, architectural manifestation, phenomenological
Procedia PDF Downloads 70
9572 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity
Authors: Osayande Pascal Omondiagbe, Sherlock A. Licorish
Abstract:
Stack Overflow is a popular community question-and-answer portal used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official software programming language documentation. While tools have aided developers by presenting interfaces to explore Stack Overflow, developers often face challenges searching through the many possible answers to their questions, and this extends development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers using various modeling techniques. However, less attention has been dedicated to examining the performance and quality of typically used modeling methods, especially in relation to models' and features' complexity. Such insights could be of practical significance to the many practitioners who use Stack Overflow. This study examines the performance and quality of various modeling methods used for predicting acceptable answers on Stack Overflow, drawn from 2014, 2015 and 2016. Our findings reveal significant differences in models' performance and quality given the type of features and the complexity of the models used. Researchers examining classifiers' performance and quality and features' complexity may leverage these findings in selecting suitable techniques when developing prediction models.
Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow
Procedia PDF Downloads 132
9571 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-
Authors: Nieto Bernal Wilson, Carmona Suarez Edgar
Abstract:
Organizations hold structured and unstructured information in different formats, sources, and systems. Part of this information comes from ERP systems under OLTP processing that support the information system; however, at the OLAP processing level these organizations present some deficiencies. Part of the problem lies in a lack of interest in extracting knowledge from their data sources, as well as the absence of operational capabilities to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools of great interest to business intelligence, since they are the repository basis for creating models or patterns (behavior of customers, suppliers, products, social networks and genomics) and facilitate corporate decision-making and research. This paper presents a structured, simple methodology inspired by agile development models such as Scrum, XP and AUP, together with object-relational models, spatial data models, and a baseline of data modeling under UML and big data. In this way, we sought to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization and data mining, particularly for pattern generation and models derived from structured fact objects.
Keywords: data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse
Procedia PDF Downloads 409
9570 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria
Authors: Isaac Kayode Ogunlade
Abstract:
Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ is designed using a PIC18F4550 microcontroller, communicating with a personal computer (PC) through USB (Universal Serial Bus). The research deployed initial knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device using an LM35 sensor to measure weather parameters, and used artificial intelligence (an Artificial Neural Network, ANN) and a statistical approach (Autoregressive Integrated Moving Average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) to evaluate its performance. Both devices (standard and designed) were subjected to 180 days under the same atmospheric conditions for data collection (temperature, relative humidity, and pressure). The acquired data were trained in the MATLAB R2012b environment using ANN and ARIMA to predict precipitation (rainfall). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), the coefficient of determination (R2), and Mean Percentage Error (MPE) were deployed as standard evaluation metrics to assess the performance of the models in predicting precipitation. The results show that the developed device has an efficiency of 96% and is also compatible with personal computers and laptops. The simulation results for the acquired data show that the ANN model's precipitation (rainfall) prediction for two months (May and June 2017) revealed a disparity error of 1.59%, while ARIMA's is 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
Keywords: data acquisition system, design device, weather development, predict precipitation and (FUTA) standard device
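The four evaluation metrics named above can be written out explicitly. The observed/predicted rainfall values below are made up purely to exercise the functions; they are not the study's data.

```python
# The four metrics used to compare the ANN and ARIMA rainfall predictions,
# written out explicitly. Sample values are hypothetical.

import math

def rmse(obs, pred):
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def mpe(obs, pred):
    """Mean percentage error; observations must be non-zero."""
    return 100.0 * sum((o - p) / o for o, p in zip(obs, pred)) / len(obs)

def r_squared(obs, pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_o = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_o) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

observed  = [12.0, 30.5, 8.2, 45.1, 20.0]   # hypothetical rainfall, mm
predicted = [11.4, 32.0, 9.0, 43.8, 21.1]
print(rmse(observed, predicted), mae(observed, predicted),
      mpe(observed, predicted), r_squared(observed, predicted))
```

Note that MAE is never larger than RMSE, and MPE (unlike the other three) can be negative, since over- and under-predictions cancel.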
Procedia PDF Downloads 92
9569 Using Traffic Micro-Simulation to Assess the Benefits of Accelerated Pavement Construction for Reducing Traffic Emissions
Authors: Sudipta Ghorai, Ossama Salem
Abstract:
Pavement maintenance, repair, and rehabilitation (MRR) processes may have considerable environmental impacts due to the traffic disruptions associated with work zones. The simulation models in use to predict work zone emissions have mostly been static emission factor models (SEFD). SEFD calculate emissions based on average operating conditions, e.g., average speed and vehicle type. Although these models produce accurate results for large-scale planning studies, they are not suitable for analyzing driving conditions at the micro level, such as acceleration, deceleration, idling, cruising, and queuing in a work zone. The purpose of this study is to prepare a comprehensive work zone environmental assessment (WEA) framework for calculating the emissions caused by disrupted traffic, by integrating traffic micro-simulation tools with emission models. This will help highway officials assess the benefits of accelerated construction and opt for the most suitable transportation management plan (TMP), not only economically but also from an environmental point of view.
Keywords: accelerated construction, pavement MRR, traffic microsimulation, congestion, emissions
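The limitation of average-speed emission factors can be illustrated with a toy modal calculation: two trips over the same route, one free-flowing and one through a work zone, with time split across driving modes weighted by mode-specific emission rates. All rates and mode splits below are invented for illustration; real studies use calibrated modal emission models fed by micro-simulation trajectories.

```python
# Toy modal emission calculation: weight time in each driving mode by a
# mode-specific CO2 rate. All rates (g/s) and mode splits are ASSUMED.

MODAL_RATES = {"idle": 0.8, "accel": 4.5, "cruise": 2.0, "decel": 0.5}

def modal_emissions(seconds_by_mode):
    """Grams of CO2 from the time (s) spent in each driving mode."""
    return sum(MODAL_RATES[m] * s for m, s in seconds_by_mode.items())

# Same route; the work-zone trip takes longer and spends far more time
# idling in queue and accelerating in stop-and-go waves.
free_flow = {"idle": 10,  "accel": 40,  "cruise": 520, "decel": 30}   # 600 s
work_zone = {"idle": 400, "accel": 200, "cruise": 200, "decel": 100}  # 900 s

print(modal_emissions(free_flow), modal_emissions(work_zone))
```

An average-speed model sees only the two trip durations; the modal view exposes where the extra emissions come from, which is what the micro-simulation integration is for.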
Procedia PDF Downloads 449
9568 Aggregation Scheduling Algorithms in Wireless Sensor Networks
Authors: Min Kyung An
Abstract:
In wireless sensor networks, which consist of tiny wireless sensor nodes with limited battery power, one of the most fundamental applications is data aggregation, which collects nearby environmental readings and aggregates the data toward a designated destination called the sink node. Important issues concerning data aggregation are time efficiency and energy consumption, given the nodes' limited energy, and therefore the related problem, named Minimum Latency Aggregation Scheduling (MLAS), has been the focus of many researchers. Its objective is to compute a minimum-latency schedule, that is, a schedule with the minimum number of timeslots, such that the sink node can receive the aggregated data from all the other nodes without any collision or interference. For this problem, two interference models, the graph model and the more realistic physical interference model known as Signal-to-Interference-plus-Noise Ratio (SINR), have been adopted with different power models, uniform power and non-uniform power (with or without power control), and different antenna models, omni-directional and directional. In this survey article, as the problem has been proven NP-hard, we present and compare several state-of-the-art approximation algorithms in these various models, using latency as the performance measure.
Keywords: data aggregation, convergecast, gathering, approximation, interference, omni-directional, directional
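A minimal illustration of aggregation scheduling under the graph interference model: the greedy heuristic below builds a BFS tree toward the sink and assigns each child-to-parent transmission the earliest non-conflicting timeslot. This is a toy heuristic for intuition only, not one of the surveyed approximation algorithms:

```python
from collections import deque

def greedy_aggregation_schedule(adj, sink):
    """Assign each node a timeslot for its transmission toward the sink.

    Graph interference model: in one slot, a transmission (u -> p) collides
    with another (tx -> rx) if the nodes coincide, if u is a neighbour of rx,
    or if tx is a neighbour of p. A node may transmit only after all of its
    children have transmitted (aggregation constraint)."""
    parent, order, q = {sink: None}, [sink], deque([sink])
    while q:                                  # BFS tree toward the sink
        u = q.popleft()
        for v in adj[u]:
            if v not in parent:
                parent[v] = u
                order.append(v)
                q.append(v)
    slot_of, slots = {}, []                   # slots[t] = [(tx, rx), ...]
    for u in reversed(order):                 # deepest nodes first
        if parent[u] is None:
            continue
        children = [v for v in parent if parent[v] == u]
        t = max((slot_of[c] + 1 for c in children), default=0)
        while True:
            if t >= len(slots):
                slots.append([])
            if all(u not in (tx, rx) and parent[u] not in (tx, rx)
                   and parent[u] not in adj[tx] and rx not in adj[u]
                   for tx, rx in slots[t]):
                slots[t].append((u, parent[u]))
                slot_of[u] = t
                break
            t += 1
    return slot_of, len(slots)

# Path 0-1-2-3 with sink 0: data flows 3 -> 2 -> 1 -> 0 in three slots.
slot_of, latency = greedy_aggregation_schedule(
    {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}, 0)
```

The returned latency (number of timeslots) is the quantity the MLAS approximation algorithms try to minimise.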
Procedia PDF Downloads 229
9567 Fuzzy Expert Approach for Risk Mitigation on Functional Urban Areas Affected by Anthropogenic Ground Movements
Authors: Agnieszka A. Malinowska, R. Hejmanowski
Abstract:
A number of European cities are strongly affected by ground movements caused by anthropogenic activities or post-anthropogenic metamorphosis. These are mainly water pumping, current mining operations, the collapse of post-mining underground voids, or mining-induced earthquakes. Such activities lead to large- and small-scale ground displacements and ground ruptures. Ground movements occurring in urban areas can considerably affect the stability and safety of structures and infrastructure. The complexity of the ground deformation phenomenon in relation to the vulnerability of structures and infrastructure leads to considerable constraints in assessing the threat to those objects. However, increasing access to free software and satellite data could pave the way for developing new methods and strategies for environmental risk mitigation and management. Open source geographical information systems (OS GIS) may support data integration, management, and risk analysis. Recently developed methods based on fuzzy logic and expert methods for assessing damage risk to buildings and infrastructure could be integrated into OS GIS. Those methods were verified by back analysis, proving their accuracy. Moreover, they can be supported by ground displacement observations: based on freely available data from the European Space Agency and free software, ground deformation can be estimated. The main innovation presented in the paper is the application of open source software (OS GIS) to integrate the developed models and assess the threat to urban areas. These approaches are reinforced by analysis of ground movement based on free satellite data, which supports the verification of ground movement prediction models and enables the mapping of ground deformation in urbanized areas. The developed models and methods have been implemented in an urban area endangered by underground mining activity. Vulnerability maps supported by satellite ground movement observations would mitigate the hazards of land displacement in urban areas close to mines.
Keywords: fuzzy logic, open source geographic information science (OS GIS), risk assessment on urbanized areas, satellite interferometry (InSAR)
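As an illustration of the fuzzy-logic flavour of such damage-risk methods, the sketch below combines two ground-movement indicators with min/max rule evaluation. The indicator names and membership thresholds are invented for the example, not taken from the paper:

```python
def ramp(x, lo, hi):
    """Membership rising linearly from 0 at lo to 1 at hi (clamped to [0, 1])."""
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def damage_risk(tilt_mm_per_m, strain_mm_per_m):
    """Toy Mamdani-style rule evaluation with hypothetical thresholds:
    Rule 1: IF tilt is LARGE OR strain is LARGE  THEN risk is HIGH      (max)
    Rule 2: IF tilt is LARGE AND strain is LARGE THEN risk is VERY HIGH (min)"""
    tilt_large = ramp(tilt_mm_per_m, 5.0, 20.0)      # invented thresholds
    strain_large = ramp(strain_mm_per_m, 1.0, 4.0)   # invented thresholds
    high = max(tilt_large, strain_large)
    very_high = min(tilt_large, strain_large)
    return high, very_high

# One hypothetical building with moderate tilt and strain observations.
high, very_high = damage_risk(12.5, 2.5)
```

In a real system the resulting memberships would be defuzzified and mapped per building onto the OS GIS vulnerability map.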
Procedia PDF Downloads 159
9566 Performance Evaluation of Using Genetic Programming Based Surrogate Models for Approximating Simulation Complex Geochemical Transport Processes
Authors: Hamed K. Esfahani, Bithin Datta
Abstract:
Transport of reactive chemical contaminant species in groundwater aquifers is a complex and highly non-linear physical and geochemical process, especially in real-life scenarios. Simulating this transport process involves solving complex nonlinear equations and generally requires huge computational time for a given aquifer study area. Developing optimal remediation strategies in aquifers may require repeated solution of such complex numerical simulation models. To overcome this computational limitation and improve the feasibility of large numbers of repeated simulations, trained surrogate models based on Genetic Programming are developed to approximately simulate such complex transport processes. The transport of acid mine drainage, a hazardous pollutant, is first simulated using a numerical simulation model, HYDROGEOCHEM 5.0, for a contaminated aquifer at a historic mine site. The simulation model's solution results for an illustrative contaminated aquifer site are then approximated by training and testing a Genetic Programming (GP) based surrogate model. Performance evaluation of the ensemble GP models as surrogates for reactive species transport in groundwater demonstrates the feasibility of their use and the associated computational advantages. The results show the efficiency and feasibility of using ensemble GP surrogate models as approximate simulators of complex hydrogeologic and geochemical processes in a contaminated groundwater aquifer, incorporating the uncertainties of the historic mine site.
Keywords: geochemical transport simulation, acid mine drainage, surrogate models, ensemble genetic programming, contaminated aquifers, mine sites
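The surrogate idea can be sketched with a toy genetic-programming loop: expression trees are evolved to fit sampled input/output pairs of an "expensive" model, here replaced by a simple analytic stand-in. Nothing below reproduces HYDROGEOCHEM or the paper's ensemble setup; it only shows the mechanism:

```python
import random

random.seed(1)

def expensive_simulator(x):
    """Stand-in for the costly numerical model; the GP surrogate only ever
    sees sampled (input, output) pairs from it."""
    return x * x + 2 * x + 1

OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b, "*": lambda a, b: a * b}

def random_tree(depth=3):
    """Random expression tree over {x, constants, +, -, *}."""
    if depth == 0 or random.random() < 0.3:
        return "x" if random.random() < 0.7 else random.uniform(-2.0, 2.0)
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def mse(tree, xs, ys):
    return sum((evaluate(tree, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def mutate(tree):
    """Replace a randomly chosen subtree with a fresh random subtree."""
    if not isinstance(tree, tuple) or random.random() < 0.3:
        return random_tree(2)
    op, left, right = tree
    return (op, mutate(left), right) if random.random() < 0.5 else (op, left, mutate(right))

# "Simulation runs": a small sample of simulator evaluations.
xs = [i / 5.0 for i in range(-10, 11)]
ys = [expensive_simulator(x) for x in xs]

pop = [random_tree() for _ in range(60)]
initial_best = min(mse(t, xs, ys) for t in pop)
for _ in range(40):                       # elitist evolution: keep the best 20
    pop.sort(key=lambda t: mse(t, xs, ys))
    pop = pop[:20] + [mutate(random.choice(pop[:20])) for _ in range(40)]
best = min(pop, key=lambda t: mse(t, xs, ys))
best_mse = mse(best, xs, ys)
```

Once trained, `evaluate(best, x)` is cheap to call many times, which is exactly what makes surrogates attractive for repeated remediation-optimization runs.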
Procedia PDF Downloads 277
9565 Discrete Choice Modeling in Education: Evaluating Early Childhood Educators’ Practices
Authors: Michalis Linardakis, Vasilis Grammatikopoulos, Athanasios Gregoriadis, Kalliopi Trouli
Abstract:
Discrete choice models belong to the family of conjoint analysis methods that are applied to respondents' preferences over a set of scenarios describing alternative choices. The scenarios are pre-designed to cover all the attributes of the alternatives that may affect the choices. In this study, we examine how preschool educators integrate physical activities into their everyday teaching practices through the use of discrete choice models. One advantage of discrete choice models over more traditional data collection methods (e.g., questionnaires and interviews that use ratings) is that the respondent is asked to select among competitive and realistic alternatives, rather than to rate each attribute of the alternatives in isolation. We present the effort to construct and choose representative attributes that cover all possible choices of the respondents, and the scenarios that resulted. For the purposes of the study, we used a sample of 50 preschool educators in Greece who each responded to 4 scenarios (out of the 16 scenarios that the orthogonal design produced), with each scenario offering three alternative teaching practices. Seven attributes of the alternatives were used in the scenarios. For the analysis of the data, we used a multinomial logit model with random effects, a multinomial probit model, and a generalized mixed logit model. The conclusions drawn from the estimated parameters of the models are discussed.
Keywords: conjoint analysis, discrete choice models, educational data, multivariate statistical analysis
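The multinomial logit model underlying such choice analyses assigns each alternative a probability proportional to the exponential of its utility. A minimal sketch, with made-up utilities for three alternative teaching practices in one scenario:

```python
import math

def choice_probabilities(utilities):
    """Multinomial logit: P(i) = exp(V_i) / sum_j exp(V_j)."""
    m = max(utilities)                        # stabilise the exponentials
    exps = [math.exp(v - m) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical utilities of three alternatives, e.g. each a linear
# combination of the seven attribute levels with estimated coefficients.
V = [0.8, 0.2, -0.4]
probs = choice_probabilities(V)
```

In estimation, the coefficients inside the utilities are chosen to maximise the likelihood of the educators' observed selections across scenarios.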
Procedia PDF Downloads 465
9564 Forecasting Model for Rainfall in Thailand: Case Study Nakhon Ratchasima Province
Authors: N. Sopipan
Abstract:
In this paper, we study the rainfall time series of weather stations in Nakhon Ratchasima province, Thailand, using various statistical methods that enable analysis of rainfall behaviour in the study areas. Time-series analysis is an important tool in modelling and forecasting rainfall. ARIMA and Holt-Winters models based on exponential smoothing were built. All the models proved to be adequate and can therefore provide information to help decision makers establish strategies for the proper planning of agriculture, drainage systems, and other water resource applications in Nakhon Ratchasima province. We found the best-performing forecasting model to be ARIMA(1,0,1)(1,0,1)12.
Keywords: ARIMA models, exponential smoothing, Holt-Winter model
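The Holt-Winters model mentioned above can be written out in a few lines. The sketch below implements the additive level/trend/season recursions on a synthetic periodic series; the smoothing parameters and the data are illustrative, not the Nakhon Ratchasima series:

```python
import math

def holt_winters_additive(series, period, alpha=0.3, beta=0.05, gamma=0.2, horizon=12):
    """Additive Holt-Winters exponential smoothing with a seasonal period."""
    level = sum(series[:period]) / period
    trend = (sum(series[period:2 * period]) - sum(series[:period])) / period ** 2
    season = [series[i] - level for i in range(period)]
    for t in range(period, len(series)):
        prev_level = level
        level = alpha * (series[t] - season[t % period]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % period] = gamma * (series[t] - level) + (1 - gamma) * season[t % period]
    n = len(series)
    # Forecast h+1 steps ahead: extrapolated trend plus the matching seasonal term.
    return [level + (h + 1) * trend + season[(n + h) % period] for h in range(horizon)]

# Synthetic monthly "rainfall" with a clean 12-month cycle (illustration only).
rain = [100.0 + 50.0 * math.sin(2 * math.pi * m / 12) for m in range(48)]
forecast = holt_winters_additive(rain, period=12)
```

On a perfectly periodic series the forecast reproduces the next cycle; on real rainfall, alpha, beta, and gamma would be tuned against held-out data, just as the seasonal ARIMA orders were selected in the study.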
Procedia PDF Downloads 300
9563 Imaginal and in Vivo Exposure Blended with EMDR: Becoming Unstuck, an Integrated Inpatient Treatment for Post-Traumatic Stress Disorder
Authors: Merrylord Harb-Azar
Abstract:
Traditionally, PTSD treatment has involved trauma-focused cognitive behaviour therapy (TF-CBT) to consolidate traumatic memories. A piloted integrated treatment of TF-CBT and the eight-phase eye movement desensitisation and reprocessing therapy (EMDR) accelerates the rate at which memory is consolidated and enhances cognitive functioning in patients with PTSD. Patients spend a considerable amount of time in treatment managing traumas experienced firsthand or through aversive details, ranging from war, assaults, accidents, abuse, and hostage situations to riots and natural disasters. The time spent in treatment or as an inpatient affects overall quality of life, relationships, cognitive functioning, and overall sense of identity. EMDR is offered twice a week, in conjunction with standard prolonged exposure, to inpatients in a private hospital. Prolonged exposure for up to 5 hours per day elicits the affect response required for the afternoon EMDR sessions to unlock unprocessed memories and facilitate consolidation in the amygdala and hippocampus. Results indicate faster consolidation of memories, a reduction in symptoms over a shorter period, and a reduction in admission time, which enhances quality of life, relationships, and cognition. Results on the Impact of Event Scale (IES), the Trauma Symptom Inventory (TSI), and the Posttraumatic Stress Disorder Checklist (PCL) demonstrate a significant reduction in symptoms, with large effect sizes to date. An integrated treatment approach for PTSD achieves faster resolution of memories, improves cognition, and reduces the amount of time spent in therapy.
Keywords: EMDR enhances cognitive functioning, faster consolidation of trauma memory, integrated treatment of TF CBT and EMDR, reduction in inpatient admission time
Procedia PDF Downloads 145
9562 An Energy Efficient Spectrum Shaping Scheme for Substrate Integrated Waveguides Based on Spread Reshaping Code
Authors: Yu Zhao, Rainer Gruenheid, Gerhard Bauch
Abstract:
In the microwave and millimeter-wave transmission region, the substrate-integrated waveguide (SIW) is a very promising candidate for the development of circuits and components. It facilitates transmission at data rates in excess of 200 Gbit/s. An SIW mimics a rectangular waveguide by approximating the closed sidewalls with a via fence. This structure suppresses low-frequency components and makes the SIW channel a bandpass or high-pass filter. This channel characteristic impedes conventional baseband transmission using the non-return-to-zero (NRZ) pulse shaping scheme. Therefore, mixers are commonly proposed as carrier modulator and demodulator in order to facilitate passband transmission. However, carrier modulation is not an energy-efficient solution, because modulation and demodulation at high frequencies consume a lot of energy. For the first time to our knowledge, this paper proposes a low-complexity spectrum shaping scheme for the SIW channel, namely the spread reshaping code. It aims at matching the spectrum of the transmit signal to the channel frequency response, facilitating transmission through the SIW channel while avoiding carrier modulation. In some cases, it does not even need equalization. Simulations reveal a good performance of this scheme: eye opening is achieved without any equalization or modulation for the respective transmission channels.
Keywords: bandpass channel, eye-opening, switching frequency, substrate-integrated waveguide, spectrum shaping scheme, spread reshaping code
Procedia PDF Downloads 160
9561 The Biomechanical Consequences of Pes Planus
Authors: Mariette Swanepoel, Terry Ellapen, Henriette Hammil, Juandre Williams, Timothy Qumbu
Abstract:
The biomechanical consequence of pes planus is a topic seldom reviewed with regard to energy expenditure and predisposition to injury. However, its comprehension in the field of foot rehabilitation, pre- and post-surgery, is fundamental to successful patient management. This short communication draws together the present literature to provide the reader with better insight into the consequences of pes planus, foot mechanics, and the predisposition to injury at the foot and tibiofemoral joint. Further, it considers the synergistic dominance of the foot invertors in compensating for the ineffective torque production of the fibularis longus due to pes planus.
Keywords: pes planus, fibularis longus, synergistic dominance, injury
Procedia PDF Downloads 287
9560 Understanding Tactical Urbanisms in Derelict Areas
Authors: Berna Yaylalı, Isin Can Traunmüller
Abstract:
This paper explores emergent bottom-up practices in the fields of architecture and urban design from the comparative perspective of two cities. As a temporary, affordable intervention that offers the possibility of transforming neglected spaces into vibrant public spaces, tactical urbanism, together with creative place-making strategies, presents alternative ways of creating sustainable developments in derelict and underused areas. This study examines the potential for social and physical development through a reading of two creative spatial practices: a pop-up garden transformed from an unused derelict space in Favoriten, Vienna, and an urban community garden in Kuzguncuk, Istanbul. The two cities were chosen for their multicultural populations and diversity: Istanbul was selected as a design city by the UNESCO Creative Cities Network in 2017, and Vienna was declared an open and livable city by its local government. This research uses media archives and reports, interviews with locals and local governments, site observations, and visual recordings to provide a critical reading of creative public spaces from the viewpoint of local users in these neighborhoods. Reflecting on these emergent ways, this study aims to discuss the production process of tactical urbanism through the practices of locals, and the associated decision-making process, with cases from İstanbul and Vienna. The comparison between the two cities' place-making strategies in tactical urbanism will give important insights for future developments.
Keywords: creative city, tactical urbanism, neglected area, public space
Procedia PDF Downloads 103
9559 Interoperability Maturity Models for Consideration When Using School Management Systems in South Africa: A Scoping Review
Authors: Keneilwe Maremi, Marlien Herselman, Adele Botha
Abstract:
The main purpose and focus of this paper is to determine the interoperability maturity models to consider when using School Management Systems (SMS). The importance of this is to inform schools and help them know which interoperability maturity model is best suited for their SMS. To address this purpose, the paper applies a scoping review to ensure that all aspects are covered. The scoping review includes papers written from 2012 to 2019, and the different types of interoperability maturity models are compared in detail, covering the background information, the levels of interoperability, and the areas of consideration in each maturity model. The literature was obtained from the following databases: IEEE Xplore and Scopus; the following search engines were used: Harzings and Google Scholar. The topic of the paper was used as a search term for the literature, and the term 'Interoperability Maturity Models' was used as a keyword. The data were analyzed in terms of the definition of interoperability, interoperability maturity models, and levels of interoperability. The results provide a table showing the focus area of concern for each maturity model, based on the scoping review, in which only 24 papers out of the 740 publications initially identified in the field were found to be best suited for the paper. This resulted in the two most discussed interoperability maturity models for consideration: the Information Systems Interoperability Maturity Model (ISIMM) and the Organizational Interoperability Maturity Model for C2 (OIM).
Keywords: interoperability, interoperability maturity model, school management system, scoping review
Procedia PDF Downloads 209
9558 The Material-Process Perspective: Design and Engineering
Authors: Lars Andersen
Abstract:
The development of design and engineering in large construction projects is characterized by an increasing flattening of formal structures, extended use of parallel and integrated processes ('Integrated Concurrent Engineering'), and an increasing number of expert disciplines. The integration process is based on ongoing collaboration, dialogue, intercommunication, and comments on each other's work (iterations). This process, based on reciprocal communication between actors and disciplines, triggers value creation. However, communication between equals is not in itself sufficient to create effective decision making. The complexity of the process and time pressure contribute to an increased risk of a decision deficit and loss of process control. The paper refers to a study that aims at developing a resilient decision-making system that does not conflict with communication processes based on equality between the disciplines in the process. The study covers the construction of a hospital through the phases of design, engineering, and physical building. The research method is a combination of formative process research, process tracking, and phenomenological analysis. The study traced challenges and problems in the building process back to the projection substrates (drawings and models) and further to the organization of the engineering and design phases. A comparative analysis of traditional and new ways of organizing the design work made it possible to uncover an implicit material order or structure in the process. This uncovering implied the development of a material-process perspective. According to this perspective, the complexity of the process is rooted in material-functional differentiation. This differentiation presupposes a structuring material (the skeleton of the building) that coordinates the other types of material. Each expert discipline's competence is related to one or a set of materials. The architect, the consulting construction engineer, and so on have their competencies related to the structuring material and, inherent in this, coordination competence. When dialogues between the disciplines concerning the coordination between them do not result in agreement, the disciplines with responsibility for the structuring material decide the interface issues. Based on these premises, this paper develops a self-organized, expert-driven, interdisciplinary decision-making system.
Keywords: collaboration, complexity, design, engineering, materiality
Procedia PDF Downloads 221