Search results for: the creative learning process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21237

14757 Learning from Dendrites: Improving the Point Neuron Model

Authors: Alexander Vandesompele, Joni Dambre

Abstract:

The diversity in dendritic arborization, as first illustrated by Santiago Ramon y Cajal, has always suggested a role for dendrites in the functionality of neurons. In the past decades, thanks to new recording techniques and optical stimulation methods, it has become clear that dendrites are not merely passive electrical components. They are observed to integrate inputs in a non-linear fashion and actively participate in computations. Nevertheless, in simulations of neural networks, dendritic structure and functionality are often overlooked. Especially in a machine learning context, when designing artificial neural networks, point neuron models such as the leaky-integrate-and-fire (LIF) model are dominant. These models mimic the integration of inputs at the neuron soma and ignore the existence of dendrites. In this work, the LIF point neuron model is extended with a simple form of dendritic computation. This gives the LIF neuron an increased capacity to discriminate spatiotemporal input sequences, a dendritic functionality observed in another study. Simulations of the spiking neurons are performed using the Bindsnet framework. In the common LIF model, incoming synapses are independent. Here, we introduce a dependency between incoming synapses such that the post-synaptic impact of a spike is determined not only by the weight of the synapse but also by the activity of other synapses. This is a form of short-term plasticity where synapses are potentiated or depressed by the preceding activity of neighbouring synapses. This is a straightforward way to prevent inputs from simply summing linearly at the soma. To implement this, each pair of synapses on a neuron is assigned a variable representing the synaptic relation. This variable determines the magnitude of the short-term plasticity. These variables can be chosen randomly or, more interestingly, can be learned using a form of Hebbian learning.
We use Spike-Timing-Dependent Plasticity (STDP), commonly used to learn synaptic strength magnitudes. If all neurons in a layer receive the same input, they tend to learn the same pattern through STDP. Adding inhibitory connections between the neurons creates a winner-take-all (WTA) network. This causes the different neurons to learn different input sequences. To illustrate the impact of the proposed dendritic mechanism, even without learning, we attach five input neurons to two output neurons. One output neuron is a regular LIF neuron; the other is a LIF neuron with dendritic relationships. Then, the five input neurons are allowed to fire in a particular order. The membrane potentials are reset and subsequently the five input neurons fire in the reversed order. As the regular LIF neuron linearly integrates its inputs at the soma, the membrane potential response to both sequences is similar in magnitude. In the other output neuron, due to the dendritic mechanism, the membrane potential response is different for the two sequences. Hence, the dendritic mechanism improves the neuron's capacity for discriminating spatiotemporal sequences. Dendritic computations improve LIF neurons even if the relationships between synapses are established randomly. Ideally, however, a learning rule is used to improve the dendritic relationships based on input data. It is possible to learn synaptic strength with STDP, to make a neuron more sensitive to its input. Similarly, it is possible to learn dendritic relationships with STDP, to make the neuron more sensitive to spatiotemporal input sequences. Feeding structured data to a WTA network with dendritic computation leads to a significantly higher number of discriminated input patterns. Without the dendritic computation, output neurons are less specific and may, for instance, be activated by a sequence in reverse order.
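The pairwise synaptic-relation mechanism can be sketched in a few lines. This is a minimal illustration in plain NumPy rather than the Bindsnet framework used in the paper; the membrane time constant, unit weights, and the range of the random relation matrix are assumptions made only for the sketch.

```python
import numpy as np

def run_lif(order, relations=None, w=1.0, tau=10.0, dt=1.0):
    """Peak membrane potential of a LIF neuron driven by five input
    neurons firing one per time step in the given order. When a
    `relations` matrix is supplied, relations[i, j] scales the impact
    of a spike on synapse j according to whether synapse i fired in the
    previous step -- a crude stand-in for the short-term dendritic
    plasticity described in the abstract."""
    v, peak = 0.0, 0.0
    last_spike = np.full(5, -np.inf)          # last firing time per synapse
    for t, syn in enumerate(order):
        impact = w
        if relations is not None:
            recent = (t - last_spike) == 1    # neighbours active at t-1
            impact *= 1.0 + relations[recent, syn].sum()
        v = v * np.exp(-dt / tau) + impact    # leaky integration + input
        peak = max(peak, v)
        last_spike[syn] = t
    return peak

rng = np.random.default_rng(0)
rel = rng.uniform(-0.3, 0.3, size=(5, 5))     # random synaptic relations

fwd, rev = [0, 1, 2, 3, 4], [4, 3, 2, 1, 0]
plain_fwd, plain_rev = run_lif(fwd), run_lif(rev)
dend_fwd, dend_rev = run_lif(fwd, rel), run_lif(rev, rel)
```

Running the sketch on a sequence and its reverse reproduces the qualitative effect reported: the plain point neuron's peak response is order-invariant, while the dendritic variant's is not, even with randomly chosen relations.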

Keywords: dendritic computation, spiking neural networks, point neuron model

Procedia PDF Downloads 138
14756 Integration Network ASI in Lab Automation and Networks Industrial in IFCE

Authors: Jorge Fernandes Teixeira Filho, André Oliveira Alcantara Fontenele, Érick Aragão Ribeiro

Abstract:

The constant emergence of new technologies used in automated processes makes it necessary for teachers and trainers to apply new technologies in their classes. This paper presents an application of a new technology employed in a didactic plant, which represents an effluent treatment process located in a laboratory of a federal educational institution. First, all components to be placed in the automation laboratory were studied in order to determine how to program, parameterize, and organize the plant. The new technologies implemented in the process are essentially an AS-i network, a Profinet network, and a SCADA system, which represented a major innovation in the laboratory. The project makes it possible to carry out in the laboratory various practices of industrial networks and SCADA systems.

Keywords: automation, industrial networks, SCADA systems, lab automation

Procedia PDF Downloads 554
14755 Optimization of Wire EDM Parameters for Fabrication of Micro Channels

Authors: Gurinder Singh Brar, Sarbjeet Singh, Harry Garg

Abstract:

Wire Electric Discharge Machining (WEDM) is a thermal machining process capable of machining electrically conductive materials irrespective of their hardness. WEDM is widely used to machine micro-scale parts with high dimensional accuracy and surface finish. The objective of this paper is to optimize the process parameters of wire EDM to fabricate microchannels and to calculate the surface finish and material removal rate of microchannels fabricated using wire EDM. The material used is aluminum 6061 alloy. The experiments were performed using a CNC wire cut electric discharge machine. The effect of various WEDM parameters, namely pulse on time (TON) with the levels (100, 150, 200), pulse off time (TOFF) with the levels (25, 35, 45), and peak current (IP) with the levels (105, 110, 115), was investigated to study the effect on the output parameters, i.e., surface roughness and material removal rate (MRR). Each experiment was conducted under different conditions of pulse on time, pulse off time, and peak current. For material removal rate, TON and IP were the most significant process parameters. MRR increases with the increase in TON and IP and decreases with the increase in TOFF. For surface roughness, TON and IP have the maximum effect, and TOFF was found to be less effective.
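The three-factor, three-level parameter study can be laid out as a simple experiment grid. The abstract does not state whether a full factorial or a Taguchi orthogonal array was used, so the 27-run full factorial below is an assumption for illustration.

```python
from itertools import product

# Factor levels as reported in the abstract (machine units as given)
levels = {
    "TON": (100, 150, 200),   # pulse on time
    "TOFF": (25, 35, 45),     # pulse off time
    "IP": (105, 110, 115),    # peak current
}

# A full factorial over three factors at three levels gives 3^3 = 27 runs;
# a Taguchi L9 array would reduce this to 9 runs.
runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
```

Each entry of `runs` is one machining condition under which surface roughness and MRR would be measured.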

Keywords: microchannels, Wire Electric Discharge Machining (WEDM), Material Removal Rate (MRR), surface finish

Procedia PDF Downloads 501
14754 The Construction of Knowledge and Social Wisdom on Local Community in the Process of Disaster Management

Authors: Oman Sukmana

Abstract:

Geographically, Indonesia is a disaster-prone area, subject to natural, non-natural (man-made), and social disasters. This study aimed to construct the knowledge and social wisdom of the local community in the process of disaster management after the eruption of Mt. Kelud. The study encompassed two major concerns: (1) the construction of knowledge and social wisdom of the local community in the process of disaster management after the eruption of Mt. Kelud; (2) the conceptual framework of disaster management on the basis of the knowledge and social wisdom of the local community. The study was conducted by means of a qualitative approach. The data were analyzed using the qualitative-descriptive technique. The data collection techniques used in this study were in-depth interviews, focus group discussions, observation, and documentation. The study was conducted at Pandansari Village, Sub-district Ngantang, District Malang, as the area most at risk from Mt. Kelud's eruption. Purposive sampling was applied to select the respondents, including the apparatus of Pandansari Village, the local figures of Pandansari Village, the Chief and Boards of the Forum of Disaster Risk Reduction (FPRB), the Head of the Malang Regional Disaster Management Agency, and other agencies. The findings of this study showed that the local community already possesses adequate knowledge and social wisdom to overcome the disaster. Through this social wisdom, the local community could predict a potential eruption.

Keywords: knowledge, social and local wisdom, disaster management

Procedia PDF Downloads 372
14753 Adoption of Performance Management System in a Saudi Telecom Company: An Institutional Perspective

Authors: Mohammed Buhaya

Abstract:

Purpose: The purpose of this study is to analyze the decision, the implementation process, and the outcomes of the introduction of the balanced scorecard in a developing country, having particular regard to the impacts of agency and of institutions, both endogenous and exogenous. Design/methodology/approach: This study builds on a longitudinal explanatory case study and an institutional framework, especially the Ter Bogt and Scapens (2014) framework. Findings: Empirical findings drawn from a telecom company indicate that the dynamics of change in the company are influenced by the interconnection of external institutions with the company's situation and internal institutions, encompassing issues of power, politics, and culture. Organizational practices are not always introduced merely to secure external legitimacy. The adoption of the balanced scorecard was instrumental in nature and brought revolutionary change. Originality/value: In contrast to much previous research on management accounting practice, the paper analyses the process of change in a developing country. The study also sheds new light on the power of religion as one of the institutional logics, and on how this logic has the potential to influence management accounting change among actors and the achievement of the company's targets. This paper highlights how culture and values can play a vital role in making the process of change smoother.

Keywords: balanced scorecard, institutional, management accounting practice, rules, and routines

Procedia PDF Downloads 168
14752 A Machine Learning Approach for Earthquake Prediction in Various Zones Based on Solar Activity

Authors: Viacheslav Shkuratskyy, Aminu Bello Usman, Michael O’Dea, Saifur Rahman Sabuj

Abstract:

This paper examines the relationships between solar activity and earthquakes by applying machine learning techniques: k-nearest neighbour, support vector regression, random forest regression, and a long short-term memory network. Data from the SILSO World Data Center, the NOAA National Center, the GOES satellite, NASA OMNIWeb, and the United States Geological Survey were used for the experiment. The 23rd and 24th solar cycles, daily sunspot number, solar wind velocity, proton density, and proton temperature were all included in the dataset. The study also examined sunspots, solar wind, and solar flares, which all reflect solar activity, as well as the earthquake frequency distribution by magnitude and depth. The findings showed that the long short-term memory network model predicts earthquakes more accurately than the other models applied in the study, and that solar activity is more likely to affect earthquakes of lower magnitude and shallow depth than earthquakes of magnitude 5.5 or larger at intermediate and deep depths.
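Of the four techniques compared, k-nearest neighbour regression is the simplest to sketch (the LSTM was the best performer, per the abstract). The sketch below uses synthetic data; the feature layout (four daily solar-activity features) follows the abstract, but the target and all values are invented for illustration.

```python
import numpy as np

def knn_predict(X_train, y_train, X_query, k=3):
    """k-nearest-neighbour regression: predict each query point as the
    mean target of its k closest training points. Features could be
    daily sunspot number, solar wind velocity, proton density, and
    proton temperature; the target a measure of earthquake activity."""
    preds = []
    for q in X_query:
        d = np.linalg.norm(X_train - q, axis=1)   # Euclidean distances
        nearest = np.argsort(d)[:k]               # indices of k closest days
        preds.append(y_train[nearest].mean())     # average their targets
    return np.array(preds)

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4))     # 100 days x 4 solar-activity features
y = X[:, 0] * 0.5 + rng.normal(scale=0.1, size=100)   # synthetic target
pred = knn_predict(X[:80], y[:80], X[80:])            # hold out 20 days
```

In a real study, `X` would be windows of the observational data listed above, and the holdout split would respect time order.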

Keywords: k-nearest neighbour, support vector regression, random forest regression, long short-term memory network, earthquakes, solar activity, sunspot number, solar wind, solar flares

Procedia PDF Downloads 77
14751 Developing an IT Management Policy: A Proposal

Authors: Robert Gilliland

Abstract:

In any organization, a potential issue can arise and become a problem when management deviates from the standard norms set in the system development process of an IT system and the policies that pertain to it. In these instances, cybersecurity is a big challenge that organizations face in safeguarding the data that they generate and use. When a new idea, task, or process begins, specific standards must be followed, along with the policies and procedures that ensure the safeguarding of data in the information system within the company. A good IT strategy and policy should have individuals who are in charge of overseeing the design, development, implementation, and auditing of these policies. Auditors are people who check to make sure that the work conforms with the plan that is in place. Through the role of the manager, management has the ability to abuse the power it is given and to direct specific ideas, events, projects, and outcomes that are contrary to the vision or goals of the company.

Keywords: strategic policy, policy management, new policy, strategic planning

Procedia PDF Downloads 140
14750 Determining Variables in Mathematics Performance According to Gender in Mexican Elementary School

Authors: Nora Gavira Duron, Cinthya Moreda Gonzalez-Ortega, Reyna Susana Garcia Ruiz

Abstract:

This paper's objective is to analyze mathematics performance in the Learning Evaluation National Plan (PLANEA, for its Spanish initials: Plan Nacional para la Evaluación de los Aprendizajes), applied to Mexican students who were enrolled in the last elementary-school year during the 2017-2018 academic year. The test was conducted nationwide in 3,573 schools, using a sample of 108,083 students, whose average in mathematics, on a scale of 0 to 100, was 45.6 points. 75% of the sample analyzed did not reach the sufficiency level (60 points). It should be noted that only 2% got a score of 90 or higher. Performance is analyzed considering whether there are differences by gender, marginalization level, public or private school enrollment, parents' academic background, and whether the student lives with their parents. Likewise, the impact of these variables on school performance by gender is evaluated using multivariate logistic (Logit) regression analysis. The results show there are no significant differences in mathematics performance by gender in elementary school; nevertheless, the impact exerted by mothers who studied at least through high school is of great relevance for students, particularly for girls. Other determining variables are students' resilience, their parents' economic status, and attendance at a private school, an effect strengthened by the mother's education.
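A multivariate Logit model of this kind can be sketched with plain gradient descent. The covariates, effect sizes, and data below are synthetic stand-ins (e.g., binary indicators for gender, mother finished high school, private school), not values from the PLANEA study.

```python
import numpy as np

def fit_logit(X, y, lr=0.1, steps=2000):
    """Gradient-descent logistic regression: model the probability of
    reaching the sufficiency level (y = 1) from binary covariates."""
    Xb = np.hstack([np.ones((len(X), 1)), X])    # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))        # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)        # average-gradient step
    return w

rng = np.random.default_rng(2)
X = rng.integers(0, 2, size=(500, 3)).astype(float)  # 3 binary covariates
true_w = np.array([-1.0, 0.2, 1.5, 0.4])             # intercept + effects
p = 1.0 / (1.0 + np.exp(-(np.hstack([np.ones((500, 1)), X]) @ true_w)))
y = (rng.random(500) < p).astype(float)              # simulated outcomes
w_hat = fit_logit(X, y)                              # recovered coefficients
```

The fitted coefficients (log odds ratios) are what a study like this would report per covariate and per gender subsample.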

Keywords: multivariate regression analysis, academic performance, learning evaluation, mathematics result per gender

Procedia PDF Downloads 151
14749 Developing a Performance Measurement System for Arts-Based Initiatives: Action Research on Italian Corporate Museums

Authors: Eleonora Carloni, Michela Arnaboldi

Abstract:

In academia, the investigation of the relationship between cultural heritage and corporations is ubiquitous across several fields of study. In practice, corporations increasingly integrate arts and cultural heritage into their strategies for disparate benefits, such as: to foster customers' purchase intention with authentic and aesthetic experiences, to improve their reputation towards local communities, and to motivate employees with creative thinking. These artistic interventions take diverse forms, from sponsorships to arts-based training centers for employees, but scholars agree that the fullest expression of this cultural trend is the corporate museum, growing in number and relevance. Corporate museums are museum-like settings hosting artworks related to corporations' history and interests. In academia they have been described as strategic assets and associated with diverse uses for corporations' benefit, from places for the preservation of cultural heritage to tools for public relations and cultural flagship stores. Previous studies have thus extensively but fragmentarily examined the diverse benefits that opening a corporate museum brings to corporations, lacking both a comprehensive approach and a discussion of how to evaluate and report corporate museums' performances. Stepping forward, the present study aims to investigate: 1) what key performance measures corporate museums need to report to the associated corporations; 2) how the key performance measures are reported to the concerned corporations. This direction of study is not only suggested as a future direction in academia but also has a solid basis in practice, aiming to answer corporate museum directors' need to account for the museum's activities to the concerned corporation. Coherently, at an empirical level the study relies on the action research method, whose distinctive feature is to develop practical knowledge through a participatory process.
This paper relies on the experience of a collaborative project between the researchers and a set of corporate museums in Italy, aimed at co-developing a performance measurement system. The project involved two steps: a first step, in which the researchers derived the potential performance measures from the literature along with exploratory interviews; and a second step, in which the researchers supported the pool of corporate museum directors in co-developing a set of key performance indicators for reporting. Preliminary empirical findings show that while scholars insist on corporate museums' capability to develop networking relations, directors insist on the role of museums as internal suppliers of knowledge for innovation goals. Moreover, directors stress the museum's cultural mission and outcomes as potential benefits for the corporation, insisting that both cultural and business measures be included in the final tool. In addition, they give considerable attention to wording in humanistic terms while struggling to express all measures in economic terms. The paper aims to contribute to the literature on corporate museums and, more broadly, on arts-based initiatives in two directions. Firstly, it elaborates key performance measures, with related indicators, for reporting on cultural initiatives for corporations. Secondly, it provides evidence of the challenges and practices of reporting on these initiatives, given the tensions arising from the co-existence of diverse perspectives, namely the arts and business worlds.

Keywords: arts-based initiative, corporate museum, hybrid organization, performance measurement

Procedia PDF Downloads 181
14748 Production of Hydrogen and Carbon Monoxide Fuel Gas From Pine Needles

Authors: Despina Vamvuka, Despina Pentari

Abstract:

Forestry wastes are readily available in large quantities around the world. Based on the European Green Deal for the deployment of renewable and decarbonized energy by 2050, as well as the global energy crisis, energy recovery from such wastes, which reduces greenhouse gas emissions, is very attractive. Gasification has superior environmental performance to combustion, producing a clean fuel gas utilized in internal combustion engines, gas turbines, and solid oxide fuel cells, or for the synthesis of liquid bio-fuels and value-added chemicals. In this work, pine needles, which are abundant in Mediterranean countries, were gasified by either steam or carbon dioxide via a two-step process to improve reactivity and eliminate tar, employing a fixed bed unit and a thermal analysis system. The solid, liquid, and gaseous products from the whole process were characterized and their energy potential was determined. Thermal behaviour, reactivity, conversion, and energy recovery were examined. The gasification process took place above 650°C. At 950°C, conversion and energy recovery were 77% (dry) and 2 under a flow of steam, and 85% (dry) and 2.9 under a flow of carbon dioxide, respectively. Organic matter was almost completely converted to syngas, the yield of which varied between 89% and 99%. The higher heating values of the biochar, bio-oil, and pyrolysis gas were 27.8 MJ/kg, 33.5 MJ/kg, and 13.6 MJ/m³. Upon steam or carbon dioxide gasification, the higher heating value of the syngas produced was 11.5 MJ/m³ and 12.7 MJ/m³, respectively.

Keywords: gasification, biomass, steam, carbon dioxide

Procedia PDF Downloads 102
14747 Reconstructability Analysis for Landslide Prediction

Authors: David Percy

Abstract:

Landslides are a geologic phenomenon that affects a large number of inhabited places and are constantly being monitored and studied for the prediction of future occurrences. Reconstructability analysis (RA) is a methodology, working exclusively with discrete data, for extracting informative models from large volumes of data. While RA has been used extensively in medical applications and social science, we are introducing it to the spatial sciences through applications like landslide prediction. Since RA works exclusively with discrete data, such as soil classification or bedrock type, working with continuous data, such as porosity, requires that these data be binned for inclusion in the model. RA constructs models of the data which pick out the most informative elements, independent variables (IVs), from each layer that predict the dependent variable (DV), landslide occurrence. Each layer included in the model retains its classification data as a primary encoding of the data. Unlike other machine learning algorithms that force the data into one-hot-encoding schemes, RA works directly with the data as it is encoded, with the exception of continuous data, which must be binned. The usual physical and derived layers are included in the model, and testing our results against other published methodologies, such as neural networks, yields similar accuracy but with the advantage of a completely transparent model. The result of an RA session with a data set is a report on every combination of variables and the associated probability of landslide events occurring. In this way, every informative combination of variable states can be examined.
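The binning of continuous layers and the transparent state-by-state probability report can be illustrated as follows. The layers, bin edges, and data are invented for the sketch, and this tabulation is only the reporting step, not the full RA model search.

```python
import numpy as np
from collections import defaultdict

def state_probabilities(ivs, dv):
    """Tabulate P(landslide | IV state) over every observed combination
    of discrete independent variables -- the kind of per-state report an
    RA session produces, readable without any decoding."""
    counts = defaultdict(lambda: [0, 0])          # state -> [events, total]
    for state, event in zip(map(tuple, ivs), dv):
        counts[state][0] += event
        counts[state][1] += 1
    return {s: e / n for s, (e, n) in counts.items()}

rng = np.random.default_rng(3)
soil = rng.integers(0, 3, 200)                     # already-discrete layer
porosity = rng.random(200)                         # continuous layer
porosity_bin = np.digitize(porosity, [0.33, 0.66]) # binned into 3 classes
slide = (rng.random(200) < 0.1 * (1 + soil)).astype(int)  # synthetic DV

probs = state_probabilities(np.column_stack([soil, porosity_bin]), slide)
```

Each key of `probs` is one (soil class, porosity bin) state; its value is the empirical landslide probability for that state.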

Keywords: reconstructability analysis, machine learning, landslides, raster analysis

Procedia PDF Downloads 73
14746 Development of Advanced Virtual Radiation Detection and Measurement Laboratory (AVR-DML) for Nuclear Science and Engineering Students

Authors: Lily Ranjbar, Haori Yang

Abstract:

Online education has been around for several decades, but its importance became evident after the COVID-19 pandemic. Even though the online delivery approach works well for knowledge building through delivering content and oversight processes, it has limitations in developing hands-on laboratory skills, especially in the STEM fields. During the pandemic, many educational institutions faced numerous challenges in delivering lab-based courses, and many students worldwide were unable to practice working with lab equipment due to social distancing or the significant cost of highly specialized equipment. The laboratory plays a crucial role in nuclear science and engineering education: it can engage students and improve their learning outcomes. Since online education and virtual labs have gained substantial popularity in engineering and science education, developing virtual labs is vital for institutions to deliver high-class education to their students, including their online students. The School of Nuclear Science and Engineering (NSE) at Oregon State University, in partnership with the company SpectralLabs, has developed an Advanced Virtual Radiation Detection and Measurement Lab (AVR-DML) to offer a fully online Master of Health Physics program. It was essential for us to use a system that could simulate nuclear modules that accurately replicate the underlying physics, the nature of radiation and radiation transport, and the mechanics of the instrumentation used in a real radiation detection lab. This was accomplished using the Realistic, Adaptive, Interactive Learning System (RAILS). RAILS is a comprehensive software-simulation-based learning system for use in training. It comprises a web-based learning management system located on a central server, as well as a 3D simulation package that is downloaded locally to user machines.
Users will find that the graphics, animations, and sounds in RAILS create a realistic, immersive environment in which to practice detecting different radiation sources. These features allow students to interact and engage with a real STEM lab in all its dimensions: they feel like they are in a real lab environment and see the same system they would in a physical lab. Unique interactive interfaces were designed and developed by integrating all the tools and equipment needed to run each lab. These interfaces give students full functionality for changing the experimental setup and for live data collection with real-time updates for each experiment. Students can manually perform all experimental setups and parameter changes in this lab. Experimental results can then be tracked and analyzed in an oscilloscope, a multi-channel analyzer, or a single-channel analyzer (SCA). The advanced virtual radiation detection and measurement laboratory developed in this study enabled the NSE school to offer a fully online MHP program. This flexibility of course modality helped us to attract more non-traditional students, including international students. It is a valuable educational tool, as students can walk around the virtual lab, make mistakes, and learn from them, with an unlimited amount of time to repeat and engage in experiments. This lab will also help us speed up training in nuclear science and engineering.

Keywords: advanced radiation detection and measurement, virtual laboratory, realistic adaptive interactive learning system (rails), online education in stem fields, student engagement, stem online education, stem laboratory, online engineering education

Procedia PDF Downloads 94
14745 Effects of Fe Addition and Process Parameters on the Wear and Corrosion Characteristics of Icosahedral Al-Cu-Fe Coatings on Ti-6Al-4V Alloy

Authors: Olawale S. Fatoba, Stephen A. Akinlabi, Esther T. Akinlabi, Rezvan Gharehbaghi

Abstract:

The performance of material surfaces under wear and corrosion environments cannot be ensured by conventional surface modifications and coatings. Therefore, different industrial sectors need an alternative technique for enhanced surface properties. Titanium and its alloys possess poor tribological properties, which limits their use in certain industries. This paper focuses on the effect of hybrid Al-Cu-Fe coatings on a grade five titanium alloy using the laser metal deposition (LMD) process. Icosahedral Al-Cu-Fe quasicrystals are a relatively new class of materials which exhibit an unusual atomic structure and useful physical and chemical properties. A 3 kW continuous wave ytterbium laser system (YLS) attached to a KUKA robot, which controls the movement of the cladding process, was utilized for the fabrication of the coatings. The titanium cladded surfaces were investigated for hardness, corrosion, and tribological behaviour at different laser processing conditions. The samples were cut into corrosion coupons and immersed in 3.65% NaCl solution at 28°C, using Electrochemical Impedance Spectroscopy (EIS) and Linear Polarization (LP) techniques. The cross-sectional view of the samples was analysed. It was found that the geometrical properties of the deposits, such as the width, height, and Heat Affected Zone (HAZ) of each sample, remarkably increased with increasing laser power due to the laser-material interaction. It was observed that higher amounts of aluminum and titanium were present in the formation of the composite. The indentation testing reveals that for both scanning speeds of 0.8 m/min and 1 m/min, the mean hardness value decreases with increasing laser power. The low coefficient of friction, excellent wear resistance, and high microhardness were attributed to the formation of hard intermetallic compounds (TiCu, Ti2Cu, Ti3Al, Al3Ti) produced through in situ metallurgical reactions during the LMD process.
The load-bearing capability of the substrate was improved due to the excellent wear resistance of the coatings. The cladded layer showed a uniform, crack-free surface due to optimized laser process parameters, which led to the refinement of the coatings.

Keywords: Al-Cu-Fe coating, corrosion, intermetallics, laser metal deposition, Ti-6Al-4V alloy, wear resistance

Procedia PDF Downloads 180
14744 Glaucoma Detection in Retinal Tomography Using the Vision Transformer

Authors: Sushish Baral, Pratibha Joshi, Yaman Maharjan

Abstract:

Glaucoma is a chronic eye condition that causes irreversible vision loss. Early detection and treatment are critical to prevent vision loss because the condition can be asymptomatic. Multiple deep learning algorithms are used for the identification of glaucoma. Transformer-based architectures, which use the self-attention mechanism to encode long-range dependencies and acquire extremely expressive representations, have recently become popular. Convolutional architectures, on the other hand, lack knowledge of long-range dependencies in the image due to their intrinsic inductive biases. These observations inspire this thesis to investigate the viability of adopting transformer-based network designs for glaucoma detection. Using retinal fundus images of the optic nerve head to develop a viable algorithm to assess the severity of glaucoma necessitates a large number of well-curated images. Initially, data is generated by augmenting ocular pictures. The ocular images are then pre-processed to make them ready for further processing. The system is trained using the pre-processed images, and it classifies input images as normal or glaucoma based on the features retrieved during training. The Vision Transformer (ViT) architecture is well suited to this task, as its self-attention mechanism enables structural modeling across the whole image. Extensive experiments are run on the common dataset, and the results are thoroughly validated and visualized.
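The self-attention operation credited above with capturing long-range dependencies can be sketched in a few lines. This is the single-head, projection-free core of ViT attention, with random stand-ins for the patch embeddings of a fundus image; real ViT adds learned query/key/value projections, multiple heads, and stacked blocks.

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over patch embeddings: every
    image patch attends to every other patch, which is the long-range
    interaction convolutional layers lack. X serves as queries, keys,
    and values (learned projections omitted for brevity)."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                 # pairwise patch affinity
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax rows
    return weights @ X, weights                     # mixed patches, attention

rng = np.random.default_rng(4)
patches = rng.normal(size=(16, 32))   # e.g. a 4x4 grid of patch embeddings
out, attn = self_attention(patches)
```

Each row of `attn` sums to one: it is the distribution over all patches that the corresponding output patch is mixed from, regardless of spatial distance.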

Keywords: glaucoma, vision transformer, convolutional architectures, retinal fundus images, self-attention, deep learning

Procedia PDF Downloads 194
14743 Numerical Simulation for a Shallow Braced Excavation of Campus Building

Authors: Sao-Jeng Chao, Wen-Cheng Chen, Wei-Humg Lu

Abstract:

In order to avoid encountering unpredictable factors, geotechnical engineers always conduct numerical analysis for braced excavation design. Simulation work in advance can predict the response of the subsequent excavation and can thus be used to increase the safety factor of the construction. The parameters that are considered include geological conditions, soil properties, soil distributions, loading types, and the analysis and design methods. National Ilan University is located on the LanYang plain, mainly composed of deposited clayey soil and loose sand, and is thus vulnerable to displacement from external influences. National Ilan University carried out a braced excavation project with a complete excavation monitoring program. This study takes advantage of the one-dimensional finite element program RIDO to simulate the excavation process. The predicted results from the numerical simulation analysis are compared with the monitored results of the construction to explore the differences between them. Numerical simulation analysis of the excavation process can be used to analyze retaining structures for the purpose of understanding the relationship between the displacement and the supporting system. The resulting deformation and stress distribution from the braced excavation can then be understood in advance. Problems can be prevented prior to the construction process, and all the important influencing factors can be identified during design and construction.

Keywords: excavation, numerical simulation, RIDO, retaining structure

Procedia PDF Downloads 265
14742 3D Carbon Structures (Globugraphite) with Hierarchical Pore Morphology for the Application in Energy Storage Systems

Authors: Hubert Beisch, Janik Marx, Svenja Garlof, Roman Shvets, Ivan Grygorchak, Andriy Kityk, Bodo Fiedler

Abstract:

Three-dimensional carbon materials can be used as electrode materials for energy storage systems such as batteries and supercapacitors. Fast charging and discharging times are realizable without the performance being reduced by aging processes. Furthermore, the high specific surface area (SSA) of three-dimensional carbon structures leads to high specific capacitances. One newly developed carbon foam is Globugraphite. This interconnected globular carbon morphology with statistically distributed hierarchical pores is manufactured by a chemical vapor deposition (CVD) process from ceramic templates resulting from a sintering process. The morphology is characterized via scanning electron microscopy (SEM) and transmission electron microscopy (TEM). Moreover, the SSA was measured using the Brunauer–Emmett–Teller (BET) method. Measurements of Globugraphite in organic and inorganic electrolytes show high energy and power densities resulting from ion adsorption in an electrochemical double layer. A comparison of the specific values is summarized in a Ragone diagram. Energy densities up to 48 Wh/kg and power densities up to 833 W/kg could be achieved for SSAs from 376 m²/g to 859 m²/g. For the organic electrolyte, a specific capacitance of 100 F/g at a density of 20 mg/cm³ was achieved.
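A single point on a Ragone diagram can be derived from the standard double-layer capacitor relation E = ½CV². The sketch below uses the reported 100 F/g specific capacitance; the 2.0 V cell voltage is an assumption for illustration, not a value reported in the abstract.

```python
def specific_energy_wh_per_kg(c_f_per_g, v_volts):
    """Specific energy of an electrochemical double-layer capacitor.

    E = 1/2 * C * V^2 gives joules per gram of electrode material;
    1 Wh = 3600 J and 1 kg = 1000 g convert that to Wh/kg, the unit
    used on a Ragone diagram's energy axis.
    """
    joules_per_gram = 0.5 * c_f_per_g * v_volts ** 2
    return joules_per_gram * 1000.0 / 3600.0

# 100 F/g (reported) at an assumed 2.0 V cell voltage.
energy = specific_energy_wh_per_kg(100.0, 2.0)
```

This idealized figure is an upper bound; practical values such as the 48 Wh/kg reported above are lower because of equivalent series resistance, electrolyte mass, and incomplete voltage windows.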

Keywords: BET, carbon foam, CVD process, electrochemical cell, Ragone diagram, SEM, TEM

Procedia PDF Downloads 237
14741 Bio-Mimetic Foam Fractionation Technology for the Treatment of Per- and PolyFluoroAlkyl Substances (PFAS) in Contaminated Water

Authors: Hugo Carronnier, Wassim Almouallem, Eric Branquet

Abstract:

Per- and polyfluoroalkyl substances (PFAS) are a group of man-made refractory compounds that have been widely used in a variety of industrial and commercial products since the 1940s, leading to contamination of groundwater and surface water systems. They are persistent, bioaccumulative and toxic chemicals. Foam fractionation is a potential remedial technique for treating PFAS-contaminated water: it takes advantage of their high surface activity to remove them from solution by adsorption onto the surface of air bubbles. Nevertheless, traditional foam fractionation technology developed for PFAS is challenging and has been found to be ineffective in treating the less surface-active compounds. Different chemicals have been investigated as amendments to achieve better removal; however, most amendments are toxic, expensive and complicated to use. The patent-pending PFAS technology presented here overcomes these challenges by using biological amendments instead. The first laboratory trial produced remarkable results using a simple and cheap BioFoam Fractionation (BioFF) process based on biomimetics. The study showed that the BioFF process is effective in removing greater than 99% of PFOA (C8), PFOS (C8), PFHpS (C7) and PFHxS (C6) in PFAS-contaminated water. For other PFAS such as PFDA (C10) and 6:2 FTAB, a slightly less stable removal of between 94% and 96% was achieved, while between 34% and 73% removal efficiency was observed for PFBA (C4), PFBS (C4), PFHxA (C6), and Gen-X. In sum, the advantages of BioFF, namely low waste production, cost- and energy-efficient operation, and the use of a biodegradable amendment requiring no separation step after treatment, coupled with these first findings, suggest that the BioFF process is a highly applicable treatment technology for PFAS-contaminated water. Additional investigations are currently being carried out in order to optimize the process and establish a promising strategy for on-site PFAS remediation.
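The removal percentages quoted above follow from comparing influent and effluent concentrations. A minimal sketch of that calculation (the concentration values are illustrative, not measurements from the study):

```python
def removal_efficiency(c_in, c_out):
    """Percentage of the influent concentration removed by the process.

    c_in, c_out: influent and effluent concentrations in the same unit
    (e.g. ng/L for PFAS analyses).
    """
    if c_in <= 0:
        raise ValueError("influent concentration must be positive")
    return 100.0 * (c_in - c_out) / c_in

# Illustrative concentrations (ng/L); not data from the study.
efficiency = removal_efficiency(1000.0, 8.0)
```

An efficiency above 99%, as computed here, corresponds to the performance class reported for the longer-chain sulfonates and carboxylates.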

Keywords: PFAS, treatment, foam fractionation, contaminated water, amendments

Procedia PDF Downloads 81
14740 Optimal Pricing Based on Real Estate Demand Data

Authors: Vanessa Kummer, Maik Meusel

Abstract:

Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data do, however, suffer from missing information; for example, many users do not specify how many rooms they would like or what price they would be willing to pay. Economic analyses often use only complete records. Usually, however, the proportion of complete records is rather small, which means most of the information is neglected; the complete records may also form a strongly distorted, non-random subsample. In addition, the reason that data is missing might itself contain information, which is ignored by that approach. An interesting question is therefore whether, for economic analyses such as the one at hand, there is added value in using the whole data set with imputed missing values compared to using the usually small percentage of complete records (the baseline). It is also interesting to see how different algorithms affect that result. The imputation of the missing data is done using unsupervised learning. Of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, and neural network techniques, are applied. By training the model iteratively on the imputed data and thereby including the information of all records in the model, the distortion of the first training set (the complete records) vanishes. In a next step, the performance of the algorithms is measured. This is done by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms under several parameter combinations, and comparing the estimates to the actual data.
After finding the optimal parameter set for each algorithm, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates based on the imputed data sets do not differ significantly from each other; however, the demand estimate derived from the baseline data does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Demand estimates derived from the whole data set are also much more accurate than the baseline estimation. Thus, in order to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.
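The evaluation loop described above, randomly hiding known values, imputing them, and scoring the estimates, can be sketched with a simple column-mean imputer standing in for the unsupervised learners. All names and data values are illustrative assumptions, not the study's data or algorithms.

```python
import random

def mask_values(rows, frac, seed=0):
    """Randomly hide a fraction of entries (set to None) for evaluation,
    remembering the true values so the imputation error can be scored."""
    rng = random.Random(seed)
    masked = [row[:] for row in rows]
    hidden = []
    for i, row in enumerate(masked):
        for j in range(len(row)):
            if rng.random() < frac:
                hidden.append((i, j, row[j]))
                masked[i][j] = None
    return masked, hidden

def impute_column_mean(rows):
    """Replace each None with the mean of the observed values in its column.

    A deliberately simple baseline imputer; the study uses clustering,
    PCA, and neural-network techniques in this role.
    """
    cols = len(rows[0])
    out = [row[:] for row in rows]
    for j in range(cols):
        observed = [row[j] for row in rows if row[j] is not None]
        mean = sum(observed) / len(observed)
        for row in out:
            if row[j] is None:
                row[j] = mean
    return out

def mae(imputed, hidden):
    # Mean absolute error between imputed and true hidden values.
    return sum(abs(imputed[i][j] - true) for i, j, true in hidden) / len(hidden)

# Illustrative (rooms, desired price) search-subscription records.
data = [[3.0, 1500.0], [4.0, 2000.0], [2.0, 1200.0], [5.0, 2600.0]]
masked, hidden = mask_values(data, 0.3, seed=1)
error = mae(impute_column_mean(masked), hidden)
```

Repeating this with several imputers and parameter combinations, and keeping the one with the lowest error, is the parameter-selection step the abstract describes.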

Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning

Procedia PDF Downloads 292
14739 Degradation of Poly -β- Hydroxybutyrate by Trichoderma asperellum

Authors: Nuha Mansour Alhazmi

Abstract:

The replacement of petroleum-based plastics with biodegradable plastics is a rapidly growing process. Poly-β-hydroxybutyrate (PHB) is a biodegradable biopolymer synthesized by some bacterial genera. The objective of the current study is to explore the ability of some fungi to biodegrade PHB. The degradation of PHB was detected in Petri dishes by the formation of a clear zone around the fungal colonies, due to the production of a depolymerase enzyme that plays a central role in the PHB degradation process. Among 10 tested fungi, the most active PHB-degrading fungus was identified as Trichoderma asperellum using morphological and molecular characters. The highest PHB degradation occurred at 25°C and pH 7.5 after 7 days of incubation. Finally, the depolymerase enzyme was isolated, purified using column chromatography, and characterized. In conclusion, PHB can be biodegraded in solid and liquid media using the depolymerase enzyme from T. asperellum.

Keywords: degradation, depolymerase enzyme, PHB, Trichoderma asperellum

Procedia PDF Downloads 186
14738 Planning for Sustainability in the Built Environment

Authors: Adedayo Jeremiah Adeyekun, Samuel Oluwagbemiga Ishola

Abstract:

This paper aims to identify the significance of sustainability in the built environment and its economic and environmental importance to building and construction projects. Sustainability in the built environment has been a key objective of research over the past several decades. It requires reconciliation between the economic, environmental and social impacts of design and planning decisions made during the life cycle of a project, from inception to termination. Planning for sustainability in the built environment requires us to go beyond our individual disciplines to consider the variety of economic, social and environmental impacts of our decisions in the long term. A decision to build a green residential development in an isolated location may pass some tests of sustainability through its reduction in stormwater runoff, energy efficiency, and ecological sustainability in the building, but it may fail to be sustainable from a transportation perspective. Sustainability is important to the planning, design, construction, and preservation of the built environment because it helps these activities reflect multiple values and considerations. In fact, the arts and sciences of the built environment have traditionally integrated values and fostered creative expression, capabilities that can and should lead the sustainability movement as society seeks ways to live in dynamic balance with its own diverse needs and the natural world. This research aims to capture the state of the art in the development of innovative sustainable design and planning strategies for building and construction projects. There is therefore a need for a holistic selection and implementation approach that identifies potential sustainable strategies applicable to a particular project and evaluates the overall life cycle impact of each alternative, accounting for the different applicable impacts before making the final selection among viable alternatives.

Keywords: sustainability, built environment, planning, design, construction

Procedia PDF Downloads 182
14737 Impact of Integrated Signals for Doing Human Activity Recognition Using Deep Learning Models

Authors: Milagros Jaén-Vargas, Javier García Martínez, Karla Miriam Reyes Leiva, María Fernanda Trujillo-Guerrero, Francisco Fernandes, Sérgio Barroso Gonçalves, Miguel Tavares Silva, Daniel Simões Lopes, José Javier Serrano Olmedo

Abstract:

Human Activity Recognition (HAR) has a growing impact on the creation of new applications and is driving emerging technologies. The use of wearable sensors is key to exploring the body's behavior when performing activities, and these devices are minimally invasive and comfortable for the wearer. In this study, a database that includes three activities is used. The activities were acquired from inertial measurement unit (IMU) sensors and motion capture (MOCAP) systems. The main objective is to compare the performance of four Deep Learning (DL) models: Deep Neural Network (DNN), Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), and the hybrid Convolutional Neural Network-Long Short-Term Memory (CNN-LSTM) model, when considering acceleration, velocity, and position, and to evaluate whether integrating the IMU acceleration to obtain velocity and position improves performance when these signals are used as input to the DL models. The integrated signals are also compared with the same types of data provided by the MOCAP system. Although the acceleration data are smoothed by the integration, results show only a minimal increase in accuracy for the integrated signals.
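Obtaining velocity and position from IMU acceleration, as described above, amounts to numerical integration of the sampled signal. A minimal trapezoidal sketch (stdlib only; the constant-acceleration input is an illustrative assumption, not data from the study):

```python
def integrate_trapezoid(samples, dt, initial=0.0):
    """Cumulative trapezoidal integration of a uniformly sampled signal.

    Integrating IMU acceleration once yields velocity; integrating the
    resulting velocity again yields position. In practice sensor bias
    makes the result drift, which is why integrated IMU signals are
    usually filtered before use.
    """
    out = [initial]
    for a, b in zip(samples, samples[1:]):
        out.append(out[-1] + 0.5 * (a + b) * dt)
    return out

# Constant 1 m/s^2 acceleration sampled at 100 Hz for 1 s.
dt = 0.01
accel = [1.0] * 101
velocity = integrate_trapezoid(accel, dt)     # should end near 1 m/s
position = integrate_trapezoid(velocity, dt)  # should end near 0.5 m
```

The integration also acts as a low-pass filter, which is why the abstract notes that the acceleration data are smoothed when integrated.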

Keywords: HAR, IMU, MOCAP, acceleration, velocity, position, feature maps

Procedia PDF Downloads 106
14736 Education Delivery in Youth Justice Centres: Inside-Out Prison Exchange Program Pedagogy in an Australian Context

Authors: Tarmi A'Vard

Abstract:

This paper discusses the transformative learning experience for students participating in the Inside-Out Prison Exchange Program (Inside-Out) and explores the value this pedagogical approach may have in youth justice centres. Inside-Out is a semester-long university course which is unique in that it takes 15 university students, with their textbook and theory-based knowledge, behind the walls to study alongside 15 incarcerated students, who have the lived experience of the criminal justice system. Inside-Out is currently offered in three Victorian prisons, expanding to five in 2020. The Inside-Out pedagogy, which is based on transformative dialogic learning, relies upon the participants sharing knowledge and experiences to develop an understanding and appreciation of the diversity and uniqueness of one another. Inside-Out offers the class an opportunity to create its own guidelines for dialogue, which can foster the students' sense of equality, fundamental to the success of this program. Dialogue allows active participation by all parties in reconciling differences, collaborating on ideas, critiquing and developing hypotheses and public policies, and encouraging self-reflection and exploration. The structure of the program incorporates circular seating (where the students alternate between inside and outside), activities, individual reflective tasks, group work, and theory analysis. In this circle everyone is equal, including the educator, who serves as a facilitator more than a traditional teacher. A significant function of the circle is to develop a group consciousness, allowing the whole class to see itself as a collective in which no one person holds a superior role. This also encourages participants to be responsible and accountable for their behavior and contributions. Research indicates that completing academic courses like Inside-Out contributes positively to reducing recidivism.
Inside-Out's benefits and success in many adult correctional institutions have been outlined in evaluation reports and scholarly articles. The key findings cover the students' learning experiences in terms of both academic capability and professional practice and development. Furthermore, stereotypes and pre-determined ideas are challenged, and there is a promotion of critical thinking and evidence of self-discovery and growth. There is empirical data supporting positive outcomes of education in youth justice centres in reducing recidivism and increasing the likelihood of returning to education upon release. Hence, this research could provide the opportunity to increase young people's engagement in education, which is a known protective factor for assisting young people to move away from criminal behavior. In 2016, Tarmi completed the Inside-Out educator training in Philadelphia, Pennsylvania, and has developed an interest in exploring the pedagogy of Inside-Out, specifically targeting young offenders in a youth justice centre.

Keywords: dialogic transformative learning, inside-out prison exchange program, prison education, youth justice

Procedia PDF Downloads 129
14735 The Effect of Type of Nanoparticles on the Quenching Process

Authors: Dogan Ciloglu, Abdurrahim Bolukbasi, Harun Cifci

Abstract:

In this study, experiments were carried out to determine the best coolant for the quenching process among water-based silica, alumina, titania and copper oxide nanofluids (0.1 vol%). A sphere made of brass was used as the test specimen. After the spherical test specimen was heated to a high temperature, it was suddenly plunged into the nanofluid suspensions. All experiments were performed at saturated conditions and under atmospheric pressure. Using the temperature-time data of the specimen, the cooling curves were obtained. The experimental results showed that the cooling performance of the test specimen depended on the type of nanofluid. The silica nanoparticles enhanced boiling heat transfer the most, making the silica nanofluid the best quenching coolant among those tested.
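A first-order view of such cooling curves can be sketched with the lumped-capacitance model. This is a textbook approximation rather than the study's analysis: boiling heat transfer during quenching is strongly regime-dependent, so the constant heat transfer coefficient and all property values below are assumptions for illustration.

```python
import math

def lumped_cooling_temp(t, t0, t_inf, h, d, rho, cp):
    """Lumped-capacitance temperature of a sphere at time t.

    Valid only for small Biot numbers; used here as a rough sketch of
    how a cooling curve decays toward the bath temperature.
    """
    # For a sphere, area/volume = 6/d, so the time constant is rho*cp*d/(6*h).
    tau = rho * cp * d / (6.0 * h)
    return t_inf + (t0 - t_inf) * math.exp(-t / tau)

# Illustrative brass sphere quenched into a saturated liquid.
temp_5s = lumped_cooling_temp(
    t=5.0,        # time after immersion, s
    t0=500.0,     # assumed initial temperature, deg C
    t_inf=100.0,  # saturation temperature at 1 atm, deg C
    h=1500.0,     # assumed effective heat transfer coefficient, W/m^2K
    d=0.02,       # assumed sphere diameter, m
    rho=8500.0,   # brass density, kg/m^3
    cp=380.0,     # brass specific heat, J/kgK
)
```

A nanofluid that enhances boiling heat transfer effectively raises h, shortening the time constant and steepening the measured cooling curve, which is how the coolants are ranked.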

Keywords: quenching, nanofluid, pool boiling, heat transfer

Procedia PDF Downloads 297
14734 CAD Tool for Parametric Design Modification of Yacht Hull Surface Models

Authors: Shahroz Khan, Erkan Gunpinar, Kemal Mart

Abstract:

Recently, parametric design techniques have become a vital concept in the field of Computer-Aided Design (CAD), providing designers with a sophisticated platform to automate the design process efficiently. In these techniques, the design process starts by parameterizing the important features of design models (typically the key dimensions) and implementing design constraints. The design constraints help to retain the overall shape of the model while its parameters are modified. However, choosing an appropriate number of design parameters and constraints is the crucial part of parametric design techniques, especially for complex surface models such as yacht hulls. This paper introduces a method to create complex surface models suited to parametric design techniques, a method to define the right number of parameters and their respective design constraints, and a system to implement design parameters in accordance with the design constraint schema. In our proposed approach, the design process starts by dividing the yacht hull into three sections. Each section consists of different shape lines, which together form the overall shape of the yacht hull. The shape lines are created using cubic Bezier curves, which allow greater design flexibility. Design parameters and constraints are defined on the shape lines in 3D design space to give designers better, individual handling of the parameters. Afterwards, shape modifiers are developed, which allow the modification of each parameter while satisfying the respective set of criteria and design constraints. For example, geometric continuity should be maintained between the shape lines of the three sections, the fairness of the hull surfaces should be preserved after modification, and during modification, the effect of a single parameter on the other parameters should be negligible.
The constraints are defined individually on the shape lines of each section and mutually between the shape lines of two connecting sections. To validate and visualize the results of our shape modifiers, a real-time graphical interface is created.
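The cubic Bezier shape lines described above can be evaluated with the standard Bernstein form. A minimal sketch follows; the 2D control points are illustrative (the paper's shape lines live in 3D design space), and moving the interior control points is the kind of operation a shape modifier performs.

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1].

    p0 and p3 are the endpoints; p1 and p2 are the control points a
    shape modifier would move to deform a hull shape line. The curve
    interpolates the endpoints and is pulled toward the control points.
    """
    u = 1.0 - t
    return tuple(
        u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

# An illustrative 2D shape line: endpoints fixed, control points set the fullness.
start, end = (0.0, 0.0), (10.0, 0.0)
ctrl1, ctrl2 = (3.0, 4.0), (7.0, 4.0)
mid = cubic_bezier(start, ctrl1, ctrl2, end, 0.5)
```

Geometric continuity between two connecting sections, one of the constraints named above, corresponds to adjacent segments sharing an endpoint (G0) and keeping the surrounding control points collinear with it (G1).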

Keywords: design parameters, design constraints, shape modifiers, yacht hull

Procedia PDF Downloads 303
14733 Internet of Things Based Process Model for Smart Parking System

Authors: Amjaad Alsalamah, Liyakathunsia Syed

Abstract:

Transportation is an essential need for many people travelling to work, school, and home. In many cities, the most common method is driving a car. Driving makes it easy to reach a destination and carry belongings in a reasonable time. However, finding a parking space can take a long time under the traditional system, which issues a paper ticket for each customer. The old system cannot guarantee a parking space for every customer. Also, payment methods are not always available, and many customers struggle to find their car among a large number of vehicles. As a result, this research focuses on providing an online smart parking system in order to save time and money. The system provides flexible management for both the parking owner and the customers by receiving all requests online and returning accurate information on available parking spaces and their locations.
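The request, locate, and checkout flow of such a process model can be sketched as minimal bookkeeping. The class name, slot labels, plate format, and hourly rate below are illustrative assumptions, not details from the paper.

```python
class SmartParking:
    """Minimal sketch of the state an online parking service maintains:
    which slots are free, which car holds which slot, and the fee due.
    """

    def __init__(self, slots, rate_per_hour=2.0):
        self.free = list(slots)      # slots currently available
        self.assigned = {}           # plate -> slot
        self.rate = rate_per_hour

    def request_slot(self, plate):
        # Assign the first free slot, or report that the lot is full;
        # the traditional paper-ticket system cannot give this guarantee.
        if not self.free:
            return None
        slot = self.free.pop(0)
        self.assigned[plate] = slot
        return slot

    def locate(self, plate):
        # Customers look their car up by plate instead of searching the lot.
        return self.assigned.get(plate)

    def checkout(self, plate, hours):
        # Release the slot and compute the fee for the parked duration.
        slot = self.assigned.pop(plate)
        self.free.append(slot)
        return round(self.rate * hours, 2)

lot = SmartParking(["A1", "A2"])
slot = lot.request_slot("KSA-1234")
fee = lot.checkout("KSA-1234", 1.5)
```

In a full IoT deployment, occupancy sensors would update `free` automatically and the tracking component would back the `locate` query.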

Keywords: smart parking system, IoT, tracking system, process model, cost, time

Procedia PDF Downloads 340
14732 Reduction of Toxic Matter from Marginal Water Treatment Using Sludge Recycling from Combination of Stepped Cascade Weir with Limestone Trickling Filter

Authors: Dheyaa Wajid Abbood, Ali Mohammed Tawfeeq Baqer, Eitizaz Awad Jasim

Abstract:

The aim of this investigation is to confirm the effectiveness of a sludge recycling process in a trickling filter filled with limestone as an alternative biological process to conventional high-cost treatment with regard to toxic matter reduction from marginal water. The combination of a stepped cascade weir with a limestone trickling filter was designed and constructed in the Environmental Hydraulic Laboratory, Al-Mustansiriya University, College of Engineering. A set of experiments was conducted during the period from August 2013 to July 2014. After ten days of acclimatization experiments, seven days of continuous operation were carried out at different flow rates (0.4, 0.5, 0.6, 0.7, 0.8, 0.9, and 1 m³/hr). Results indicate that the concentrations of toxic matter decreased with increasing operation time, sludge recirculation ratio, and flow rate. The toxic matter measured includes mineral oils, petroleum products, phenols, biocides, polychlorinated biphenyls (PCBs), and surfactants; the measured values in these experiments ranged between 0.074 nm and 0.156 nm. The overall reduction efficiency after 4, 28, 52, 76, 100, 124, and 148 hours of operation was 55%, 48%, 42%, 50%, 59%, 61%, and 64%, respectively, when the combination of the stepped cascade weir with the limestone trickling filter was used.

Keywords: marginal water, toxic matter, stepped cascade weir, limestone trickling filter

Procedia PDF Downloads 400
14731 Numerical Investigation of Thermal Energy Storage System with Phase Change Materials

Authors: Mrityunjay Kumar Sinha, Mayank Srivastava

Abstract:

The interface position and temperature variation of a phase change thermal energy storage system under constant and radiative heat injection are analysed during the charging/discharging process by the heat balance integral method. The charging/discharging process is governed solely by conduction. The phase change material is kept inside a rectangular cavity. A time-dependent fixed-temperature or radiative boundary condition is applied on one wall, while all other walls are thermally insulated. The interface location and temperature variation are analysed using MATLAB.
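For the fixed-temperature wall, the melt-front growth can be sketched with the classical quasi-steady (small Stefan number) approximation rather than the paper's heat balance integral solution; both predict an interface advancing proportionally to the square root of time. The paraffin-like PCM property values below are illustrative assumptions.

```python
import math

def stefan_number(cp, delta_t, latent):
    # Ste = cp*dT/L: ratio of sensible to latent heat; the quasi-steady
    # approximation below is reasonable only when Ste is small.
    return cp * delta_t / latent

def interface_position(t, k, rho, latent, delta_t):
    """Melt-front position s(t) under a fixed wall superheat delta_t.

    Quasi-steady one-phase result: s(t) = sqrt(2*k*dT*t / (rho*L)).
    A classical sketch, not the paper's HBIM solution.
    """
    return math.sqrt(2.0 * k * delta_t * t / (rho * latent))

# Illustrative paraffin-like PCM properties.
k, rho, cp, latent = 0.2, 800.0, 2000.0, 200000.0  # W/mK, kg/m3, J/kgK, J/kg
dT = 20.0                                          # wall superheat, K
ste = stefan_number(cp, dT, latent)                # 0.2: approximation usable
s_1h = interface_position(3600.0, k, rho, latent, dT)  # front after one hour, m
```

The square-root growth means melting slows as the solid-liquid interface moves away from the heated wall, which is the qualitative behaviour the interface-position analysis tracks.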

Keywords: conduction, melting/solidification, phase change materials, Stefan’s number

Procedia PDF Downloads 397
14730 Effects of Polymer Adsorption and Desorption on Polymer Flooding in Waterflooded Reservoir

Authors: Sukruthai Sapniwat, Falan Srisuriyachai

Abstract:

Polymer flooding is one of the most well-known methods in Enhanced Oil Recovery (EOR) technology. It can be implemented after either primary or secondary recovery, creating favorable conditions for the displacement mechanism in order to lower the residual oil in the reservoir. Polymer substances lower the mobility ratio of the whole process by increasing the viscosity of the injected water. Therefore, polymer flooding can increase volumetric sweep efficiency, which leads to a better recovery factor. Moreover, polymer adsorption onto the rock surface can help decrease the permeability contrast in highly heterogeneous reservoirs. Due to the reduction of the absolute permeability, the effective permeability to water, which represents the flow ability of the injected fluid, is also reduced. Once polymer is adsorbed onto the rock surface, the polymer molecules can be desorbed when different fluids are injected. This study evaluates the effects of the adsorption and desorption of polymer solutions on the oil recovery mechanism. A reservoir model is constructed with the reservoir simulation program STARS®, commercialized by the Computer Modelling Group (CMG). Various polymer concentrations, starting times of the polymer flooding process and polymer injection rates were evaluated with selected degrees of polymer desorption: 0, 25, 50, 75 and 100%. The higher the value, the more adsorbed polymer molecules return to the flowing fluid. According to the results, polymer desorption lowers polymer consumption, especially at low concentrations. Furthermore, the starting time of polymer flooding and the injection rate affect oil production. The results show that waterflooding followed by earlier polymer flooding can increase the oil recovery factor, and a higher injection rate also enhances recovery. Polymer concentration is related to polymer consumption through the two main benefits of polymer flooding described above.
Therefore, the polymer slug size should be optimized based on polymer concentration. Polymer desorption re-mobilizes polymer that was previously adsorbed onto the rock surface, increasing sweep efficiency in the later period of the polymer flooding process. Even though waterflooding supports polymer injectivity, water cut at the producer can prematurely terminate oil production. A higher injection rate decreases polymer adsorption due to the decreased retention time of the polymer solution.
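The mobility-ratio reduction that motivates polymer flooding follows directly from its definition. The endpoint relative permeabilities and viscosities below are illustrative assumptions, not simulation inputs from the study.

```python
def mobility_ratio(krw, mu_w, kro, mu_o):
    """M = (krw/mu_w) / (kro/mu_o): mobility of the displacing water
    relative to the displaced oil. M <= 1 indicates a stable,
    piston-like displacement with good volumetric sweep."""
    return (krw / mu_w) / (kro / mu_o)

# Illustrative endpoint data (viscosities in cp).
plain_water = mobility_ratio(krw=0.3, mu_w=0.5, kro=0.8, mu_o=10.0)
# Polymer thickens the injected water from 0.5 cp to an assumed 15 cp.
polymer_slug = mobility_ratio(krw=0.3, mu_w=15.0, kro=0.8, mu_o=10.0)
```

Raising the water viscosity by a factor of 30 drops the mobility ratio by the same factor, pushing the displacement from strongly unstable (M well above 1) to stable, which is the sweep-efficiency benefit described above.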

Keywords: enhanced oil recovery technology, polymer adsorption and desorption, polymer flooding, reservoir simulation

Procedia PDF Downloads 338
14729 Progressing Institutional Quality Assurance and Accreditation of Higher Education Programmes

Authors: Dominique Parrish

Abstract:

Globally, higher education institutions are responsible for the quality assurance and accreditation of their educational programmes (Courses). The primary purpose of these activities is to ensure that the educational standards of the governing higher education authority are met and the quality of the education provided to students is assured. Despite policies and frameworks being established in many countries, to improve the veracity and accountability of quality assurance and accreditation processes, there are reportedly still mistakes, gaps and deficiencies in these processes. An analysis of Australian universities’ quality assurance and accreditation processes noted that significant improvements were needed in managing these processes and ensuring that review recommendations were implemented. It has also been suggested that the following principles are critical for higher education quality assurance and accreditation to be effective and sustainable: academic standards and performance outcomes must be defined, attainable and monitored; those involved in providing the higher education must assume responsibility for the associated quality assurance and accreditation; potential academic risks must be identified and management solutions developed; and the expectations of the public, governments and students should be considered and incorporated into Course enhancements. This phenomenological study, which was conducted in a Faculty of Science, Medicine and Health in an Australian university, sought to systematically and iteratively develop an effective quality assurance and accreditation process that integrated the evidence-based principles of success and promoted meaningful and sustainable change. Qualitative evaluative feedback was gathered, over a period of eleven months (January - November 2014), from faculty staff engaged in the quality assurance and accreditation of forty-eight undergraduate and postgraduate Courses. 
Reflexive analysis was used to analyse the data and inform ongoing modifications and developments to the assurance and accreditation process as well as the associated supporting resources. The study resulted in the development of a formal quality assurance and accreditation process together with a suite of targeted resources that were identified as critical for success. The research findings also provided some insights into the institutional enablers that were antecedents to successful quality assurance and accreditation processes as well as meaningful change in the educational practices of academics. While longitudinal data will be collected to further assess the value of the assurance and accreditation process on educational quality, early indicators are that there has been a change in the pedagogical perspectives and activities of academic staff and growing momentum to explore opportunities to further enhance and develop Courses. This presentation will explain the formal quality assurance and accreditation process as well as the component parts, which resulted from this study. The targeted resources that were developed will be described, the pertinent factors that contributed to the success of the process will be discussed and early indicators of sustainable academic change as well as suggestions for future research will be outlined.

Keywords: academic standards, quality assurance and accreditation, phenomenological study, process, resources

Procedia PDF Downloads 382
14728 Supply Chain Optimization for Silica Sand in a Glass Manufacturing Company

Authors: Ramon Erasmo Verdin Rodriguez

Abstract:

Managers and practitioners have historically sought many ways to approach the perfect supply chain, but because the topic is vast and grows more complex with company size, the task has never been easy. This research examines the inner workings of the logistics at a glass manufacturing company for the number one raw material of the process, silica sand. After a brief overview of the supply chain, the document focuses on the way raw materials flow through the system, so that an analysis can take place to improve the logistics. Using Operations Research techniques, the current scheme of distribution and inventories of raw materials at the glass company's plants is analyzed; after a mathematical modeling process, the supply chain can be optimized with the purpose of reducing supply uncertainty and obtaining an economic benefit at the conclusion of this research.
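As one elementary building block of the kind of Operations Research analysis described above, the economic order quantity (EOQ) shows how order size trades off ordering and holding costs. This is a textbook sketch with illustrative silica sand figures, not the paper's actual model or data.

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Economic order quantity: Q* = sqrt(2*D*S/H).

    D: annual demand, S: fixed cost per order, H: holding cost per
    unit per year. Minimizes the sum of annual ordering and holding
    costs under the classical constant-demand assumptions.
    """
    return math.sqrt(2.0 * annual_demand * order_cost / holding_cost)

# Illustrative figures: tonnes/year of silica sand, cost per order,
# holding cost per tonne-year. Not data from the company studied.
q_star = eoq(annual_demand=120000.0, order_cost=500.0, holding_cost=3.0)
orders_per_year = 120000.0 / q_star
```

A fuller treatment of the distribution scheme would layer a transportation or network-flow model on top of such inventory policies, which is the direction the abstract's mathematical conceptualization points to.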

Keywords: inventory management, operations research, optimization, supply chain

Procedia PDF Downloads 330