Search results for: new process model
27221 Investigating Elements That Influence Higher Education Institutions’ Digital Maturity
Authors: Zarah M. Bello, Nathan Baddoo, Mariana Lilley, Paul Wernick
Abstract:
In this paper, we present findings from a multi-part study to evaluate candidate elements reflecting the level of digital capability maturity (DCM) in higher education and the relationships between these elements. We will use these findings to propose a model of DCM for educational institutions. We suggest that the success of learning in higher education depends in part on the maturity of institutions' digital capabilities as well as on the abilities of learners and those who support the learning process. It is therefore important to have a good understanding of the elements that underpin this maturity, as well as their impact and interactions, in order to better exploit the benefits that technology presents to the modern learning environment and to support its continued improvement. Having identified ten candidate elements of digital capability that we believe support the level of a university's maturity in this area, as well as a number of relevant stakeholder roles, we conducted two studies utilizing both quantitative and qualitative research methods. In the first of these studies, 85 electronic questionnaires were completed by various stakeholders in a UK university, with a 100% response rate. We also undertook five in-depth interviews with management stakeholders in the same university. We then used statistical analysis to process the survey data and conducted a textual analysis of the interview transcripts. Our findings support our initial identification of candidate elements and our contention that these elements interact in a multidimensional manner. This multidimensional dynamic suggests that any proposal for improvement in digital capability must reflect the interdependency and cross-sectional relationships of the elements that contribute to DCM. Our results also indicate that the notion of DCM is strongly data-centric and that any proposed maturity model must reflect the role of data in driving maturity and improvement.
We present these findings as a key step towards the design of an operationalisable DCM model for universities.
Keywords: digital capability, elements, maturity, maturity framework, university
Procedia PDF Downloads 141
27220 Efficient Chiller Plant Control Using Modern Reinforcement Learning
Authors: Jingwei Du
Abstract:
The need to optimize air conditioning systems in existing buildings calls for control methods designed with energy efficiency as a primary goal. Most current control methods fall into two categories: empirical and model-based. To be effective, the former relies heavily on engineering expertise, and the latter requires extensive historical data. Reinforcement Learning (RL), on the other hand, is a model-free approach that explores the environment to obtain an optimal control strategy, often referred to as a “policy”. This research adopts Proximal Policy Optimization (PPO) to improve chiller plant control and to enable the RL agent to collaborate with experienced engineers. It exploits the fact that while the industry lacks historical data, abundant operational data is available, allowing the agent to learn and evolve safely under human supervision. Thanks to the development of language models, renewed interest in RL has led to modern, online, policy-based RL algorithms such as PPO. This research took inspiration from “alignment”, a process that uses human feedback to fine-tune a pretrained model that produces unsafe content. The methodology can be summarized in three steps. First, an initial policy model is generated from minimal prior knowledge. Next, the prepared PPO agent is deployed so that feedback from both the critic model and human experts can be collected for future fine-tuning. Finally, the agent learns and adapts itself to the specific chiller plant, updates the policy model, and is ready for the next iteration. Besides the proposed approach, this study also used traditional RL methods to optimize the same simulated chiller plants for comparison; the proposed method proved both safe and effective while requiring little to no historical data to start up.
Keywords: chiller plant, control methods, energy efficiency, proximal policy optimization, reinforcement learning
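The clipped surrogate objective at the core of PPO can be sketched in a few lines; this is a minimal illustration with toy numbers, not the chiller-plant controller from the study.

```python
import numpy as np

def ppo_clip_objective(ratio, advantage, eps=0.2):
    """PPO clipped surrogate objective: take the pessimistic minimum of
    the unclipped and clipped probability-ratio terms, then average."""
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps) * advantage
    return np.minimum(unclipped, clipped).mean()

# Toy example: new/old action-probability ratios and advantages.
# The third ratio (1.5) falls outside the clip range and is cut to 1.2,
# which limits how far a single update can move the policy.
ratios = np.array([0.9, 1.1, 1.5])
advantages = np.ones(3)
print(ppo_clip_objective(ratios, advantages))
```

In an actual chiller-plant agent this objective would be maximized over actions such as setpoints and staging decisions; those specifics are not given in the abstract.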
Procedia PDF Downloads 26
27219 Marosok Tradition in the Process of Buying and Selling Cattle in Payakumbuh: A Comparative Study between Adat Law and Positive Law of Indonesia
Authors: Mhd. Zakiul Fikri, M. Agus Maulidi
Abstract:
Indonesia is a constitutional state. As a constitutional state, Indonesia does not use a single legal system but adopts three: the European continental legal system (the positive law of Indonesia), the adat (customary) law system, and religious law. This study discusses the Marosok tradition in the process of buying and selling cattle in Payakumbuh as a comparative study between adat law and the positive law of Indonesia. The objectives of this research are: first, to find the philosophical meaning of the Marosok tradition in Payakumbuh; second, to find the legal implications of the Marosok tradition from the perspectives of adat law and the positive law of Indonesia; third, to find the legal procedure for arbitrating disputes which may arise after the process of buying and selling cattle, based on the positive law and adat law adopted in Indonesia. This is empirical legal research using two approaches: the statute approach and the conceptual approach. Data were obtained through interviews, observations, and documents or books, and were then analyzed inductively. Finally, this study found that: first, the Marosok tradition carries a meaning of harmonization of social life that keeps people from negative debate, envy, and arrogance. Second, the Marosok tradition is part of adat law in Indonesia; it is a form of contract law governing the process of buying and selling. When the practice of the Marosok tradition as adat law is compared with the provisions of Article 1320 of the Civil Code on the terms of validity of a contract, the elements required by those provisions are met in the practice of Marosok. Thus, the practice of Marosok in the cattle-trading process in Payakumbuh is justified from the standpoint of the positive law of Indonesia.
Finally, all disputes arising from contracts made under the Marosok tradition can be resolved by the positive law and adat law of Indonesia.
Keywords: adat law, contract, Indonesia, Marosok
Procedia PDF Downloads 321
27218 Verification and Proposal of Information Processing Model Using EEG-Based Brain Activity Monitoring
Authors: Toshitaka Higashino, Naoki Wakamiya
Abstract:
Human beings perform a task by perceiving information from the outside world, recognizing it, and responding to it. There have been various attempts to analyze and understand the internal processes behind the reaction to a given stimulus by conducting psychological experiments and analyzing them from multiple perspectives. Among these, we focused on the Model Human Processor (MHP). However, the MHP was built on psychological experiments, so its relation to brain activity has remained unclear. To verify the validity of the MHP and to propose our own model from a neuroscience viewpoint, EEG (electroencephalography) measurements were performed during the experiments in this study. More specifically, first, experiments were conducted in which Latin alphabet characters were used as visual stimuli. In addition to response time, ERPs (event-related potentials) such as the N100 and P300 were measured using EEG. By comparing the cycle time predicted by the MHP with the latency of the ERPs, it was found that the N100, related to the perception of stimuli, appeared at the end of the perceptual processor. Furthermore, an additional experiment revealed that the P300, related to decision making, appeared during the response decision process, not at its end. Second, these findings were confirmed in experiments using Japanese Hiragana characters, i.e. Japan's own phonetic symbols. Finally, Japanese Kanji characters were used as more complicated visual stimuli. A Kanji character usually has several readings and several meanings. Despite this difference, a reading-related task and a meaning-related task exhibited similar results, meaning that they involved similar information processing in the brain. Based on these results, our model was proposed, reflecting both response time and ERP latency. It consists of three processors: the perception processor, from the input of a stimulus to the appearance of the N100; the cognitive processor, from the N100 to the P300; and the decision-action processor, from the P300 to the response.
Using our model, an application system that reflects brain activity can be developed.
Keywords: brain activity, EEG, information processing model, model human processor
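The serial-processor assumption of the MHP can be illustrated with a trivial sketch; the cycle times below are the nominal values commonly quoted for the MHP, not measurements from this study.

```python
def predicted_response_time(perceptual_ms, cognitive_ms, motor_ms):
    """Model Human Processor prediction: total reaction time is the sum
    of the perceptual, cognitive, and motor processor cycle times."""
    return perceptual_ms + cognitive_ms + motor_ms

# Nominal MHP cycle times in milliseconds (typical textbook values).
# ERP latencies (N100, P300) are compared against the boundaries
# between these serial stages.
total = predicted_response_time(100, 70, 70)
print(total)  # 240
```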
Procedia PDF Downloads 97
27217 Exergetic Optimization on Solid Oxide Fuel Cell Systems
Authors: George N. Prodromidis, Frank A. Coutelieris
Abstract:
Biogas can currently be considered an alternative option for electricity production, mainly due to its high energy content (a hydrocarbon-rich source), its renewable status, and its relatively low utilization cost. Solid Oxide Fuel Cell (SOFC) stacks convert a fuel's chemical energy to electricity with high efficiency and offer significant fuel flexibility combined with lower emission rates, especially when utilizing biogas. Electricity production from biogas constitutes a composite problem which calls for an extensive parametric analysis of numerous dynamic variables. The main scope of the presented study is to propose a detailed thermodynamic model for optimizing the operation of SOFC-based power plants, based on fundamental thermodynamics and on energy and exergy balances. This model, named THERMAS (THERmodynamic MAthematical Simulation model), mathematically simulates each individual process during electricity production for different case studies representing real-life operational conditions. THERMAS also offers the opportunity to choose among a great variety of values for each operational parameter individually, thus allowing studies within unexplored and experimentally inaccessible operational ranges. Finally, THERMAS innovatively incorporates a specific criterion, derived from the extensive energy analysis, to identify the optimal scenario per simulated system in exergy terms. Several dynamic parameters as well as several biogas mixture compositions have been taken into account to cover the possible operating conditions. Through the optimization process in terms of an innovative optimization factor (OPF), presented here, this study reveals that systems supplied by low-methane fuels can be comparable to those supplied by pure methane.
To conclude, such a simulation model indicates a perspective on the optimal design of a SOFC-stack-based system, in the direction of commercializing systems that utilize biogas.
Keywords: biogas, exergy, efficiency, optimization
Procedia PDF Downloads 368
27216 Heterogeneous Artifacts Construction for Software Evolution Control
Authors: Mounir Zekkaoui, Abdelhadi Fennan
Abstract:
Software evolution control requires a deep understanding of changes and their impact on the different heterogeneous artifacts of a system, and an understanding of the descriptive knowledge of the developed software artifacts is a prerequisite for the success of the evolutionary process. Implementing an evolutionary process involves making changes, more or less significant, to many heterogeneous software artifacts such as source code, analysis and design models, unit tests, XML deployment descriptors, user guides, and others. These changes can be a source of degradation in the functional, qualitative, or behavioral terms of the modified software. Hence the need for a unified approach for the extraction and representation of the different heterogeneous artifacts, in order to ensure a unified and detailed description of them, exploitable by several software tools and allowing those responsible for the evolution to reason about the changes concerned.
Keywords: heterogeneous software artifacts, software evolution control, unified approach, meta model, software architecture
Procedia PDF Downloads 442
27215 Language Activation Theory: Unlocking Bilingual Language Processing
Authors: Leorisyl D. Siarot
Abstract:
It is common to see and hear Filipinos, in general, speak two or more languages. This phenomenon invites a closer look at how our minds process input and produce output in a specific chosen language. This study aimed to generate a theoretical model explaining the interaction of the first and second languages in the human mind. After a careful analysis of the gathered data, a theoretical prototype called the Language Activation Model was generated. For every language string, there are three specialized banks: lexico-semantics, morphono-syntax, and pragmatics. These banks are interrelated with the banks of other language strings. As the bilingual learns more languages, a new string is replicated and filled with the information of the newly learned language. Principles of first- and second-language interaction are drawn and expressed in laws, namely: the law of dominance, the law of availability, the law of usuality, and the law of preference. Furthermore, difficulties encountered in learning second languages were also determined.
Keywords: bilingualism, psycholinguistics, second language learning, languages
Procedia PDF Downloads 510
27214 Molecular Dynamic Simulation of Cold Spray Process
Authors: Aneesh Joshi, Sagil James
Abstract:
The Cold Spray (CS) process is the deposition of solid particles onto a substrate above a certain critical impact velocity. Unlike thermal spray processes, the CS process does not melt the particles, which thus retain their original physical and chemical properties. These characteristics make the CS process ideal for various engineering applications involving metals, polymers, ceramics, and composites. The bonding mechanism involved in the CS process is extremely complex given the dynamic nature of the process. Though the CS process offers great promise for several engineering applications, the realization of its full potential is limited by the lack of understanding of the complex mechanisms involved and of the effect of critical process parameters on deposition efficiency. The goal of this research is to understand the complex nanoscale mechanisms involved in the CS process. The study uses the Molecular Dynamics (MD) simulation technique to understand the material deposition phenomenon during the CS process. The impact of a single crystalline copper nanoparticle on a copper substrate is modeled under varying process conditions. The quantitative results of impacts at different velocities, impact angles, and particle sizes are evaluated using the flattening ratio, the von Mises stress distribution, and the local shear strain. The study finds that the flattening ratio, and hence the quality of deposition, was highest for an impact velocity of 700 m/s, a particle size of 20 Å, and an impact angle of 90°. The stress and strain analysis revealed regions of shear instability in the periphery of the impact and plastic deformation of the particles after the impact. The results of this study can be used to augment our existing knowledge in the field of CS processes.
Keywords: cold spray process, molecular dynamics simulation, nanoparticles, particle impact
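The flattening ratio used to score deposition quality can be computed directly from particle geometry; the definition below (post-impact spread over initial diameter) is one common convention, and the sample values are illustrative assumptions, not data from the simulations.

```python
def flattening_ratio(post_impact_width, initial_diameter):
    """Flattening ratio: lateral spread of the deformed particle relative
    to its initial diameter; a larger ratio indicates more deformation."""
    return post_impact_width / initial_diameter

# Hypothetical post-impact widths (in Angstrom) for a 20 A particle
# impacting at different velocities (values are made up for illustration).
for velocity, width in [(500, 24.0), (700, 31.0), (900, 29.5)]:
    print(velocity, "m/s ->", flattening_ratio(width, 20.0))
```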
Procedia PDF Downloads 365
27213 Statistical Analysis and Optimization of a Process for CO2 Capture
Authors: Muftah H. El-Naas, Ameera F. Mohammad, Mabruk I. Suleiman, Mohamed Al Musharfy, Ali H. Al-Marzouqi
Abstract:
CO2 capture and storage technologies play a significant role in controlling climate change by reducing carbon dioxide emissions into the atmosphere. The present study evaluates and optimizes CO2 capture through a process where carbon dioxide is passed into pH-adjusted high-salinity water and reacted with sodium chloride to form a precipitate of sodium bicarbonate. This process is based on a modified Solvay process with higher CO2 capture efficiency, higher sodium removal, and a higher pH level, without the use of ammonia. The process was tested in a bubble column semi-batch reactor and was optimized using response surface methodology (RSM). CO2 capture efficiency and sodium removal were optimized in terms of the major operating parameters, based on four levels and variables in a Central Composite Design (CCD). The operating parameters were gas flow rate (0.5–1.5 L/min), reactor temperature (10–50 °C), buffer concentration (0.2–2.6%), and water salinity (25–197 g NaCl/L). The experimental data were fitted to a second-order polynomial using multiple regression and analyzed using analysis of variance (ANOVA). The optimum values of the selected variables were obtained using a response optimizer. The optimum conditions were tested experimentally using desalination reject brine with salinity ranging from 65,000 to 75,000 mg/L. The CO2 capture efficiency in 180 min was 99%, and the maximum sodium removal was 35%. The experimental and predicted values were within the 95% confidence interval, which demonstrates that the developed model can successfully predict the capture efficiency and sodium removal using the modified Solvay method.
Keywords: CO2 capture, water desalination, response surface methodology, bubble column reactor
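The second-order polynomial fit at the heart of RSM can be reproduced with ordinary least squares; the sketch below uses two synthetic factors rather than the study's four actual operating parameters, and the coefficient values are made up for the demonstration.

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Least-squares fit of a second-order (quadratic) response surface
    y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2,
    the model form used in RSM/CCD studies."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Synthetic data generated from a known surface: the fit should
# recover the original coefficients.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 2))
true_coef = np.array([1.0, 0.5, -0.3, 0.2, 0.1, -0.4])
A = np.column_stack([np.ones(30), X[:, 0], X[:, 1],
                     X[:, 0]**2, X[:, 1]**2, X[:, 0] * X[:, 1]])
y = A @ true_coef
print(np.round(fit_quadratic_surface(X, y), 3))
```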
Procedia PDF Downloads 285
27212 Multi-Objective Evolutionary Computation Based Feature Selection Applied to Behaviour Assessment of Children
Authors: F. Jiménez, R. Jódar, M. Martín, G. Sánchez, G. Sciavicco
Abstract:
Attribute or feature selection is one of the basic strategies for improving the performance of data classification tasks and, at the same time, for reducing classifier complexity; it is particularly fundamental when the number of attributes is relatively high. Its application to unsupervised classification is restricted to a limited number of experiments in the literature. Evolutionary computation has already proven itself a very effective choice for consistently reducing the number of attributes towards a better classification rate and a simpler semantic interpretation of the inferred classifiers. We present a feature selection wrapper model composed of a multi-objective evolutionary algorithm, the clustering method Expectation-Maximization (EM), and the classifier C4.5, for the unsupervised classification of data extracted from a psychological test named BASC-II (Behavior Assessment System for Children, 2nd ed.), with two objectives: maximizing the likelihood of the clustering model and maximizing the accuracy of the obtained classifier. We present a methodology that integrates feature selection for unsupervised classification, model evaluation, decision making (choosing the most satisfactory model according to an a posteriori process in a multi-objective context), and testing. We compare the performance of the classifiers obtained by the multi-objective evolutionary algorithms ENORA and NSGA-II, and the best solution is then validated by the psychologists who collected the data.
Keywords: evolutionary computation, feature selection, classification, clustering
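The multi-objective selection behind NSGA-II/ENORA-style algorithms rests on Pareto dominance; a minimal sketch follows, with hypothetical (likelihood, accuracy) pairs for candidate feature subsets standing in for the study's actual objective values.

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (maximization assumed for both objectives)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of candidate solutions, the core
    selection step of Pareto-based multi-objective evolutionary search."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (clustering likelihood, classifier accuracy) pairs:
# the last candidate is dominated and drops out of the front.
candidates = [(0.9, 0.6), (0.7, 0.8), (0.8, 0.7), (0.6, 0.5)]
print(pareto_front(candidates))
```

The a posteriori decision-making step described in the abstract would then pick one solution from this front.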
Procedia PDF Downloads 369
27211 A Cohort and Empirical Based Multivariate Mortality Model
Authors: Jeffrey Tzu-Hao Tsai, Yi-Shan Wong
Abstract:
This article proposes a cohort-age-period (CAP) model to characterize multi-population mortality processes using cohort, age, and period variables. Distinct from factor-based Lee-Carter-type decomposition mortality models, this approach is empirically based and includes the age, period, and cohort variables in the equation system. The model not only provides a fruitful intuition for explaining multivariate mortality change rates but also performs better in forecasting future patterns. Using US and UK mortality data and performing ten-year out-of-sample tests, our approach shows smaller mean squared errors in both countries compared to the models in the literature.
Keywords: longevity risk, stochastic mortality model, multivariate mortality rate, risk management
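Two of the building blocks here are simple to state: the linear dependence linking cohort, age, and period, and the out-of-sample error criterion used to compare forecasts. The numeric values below are illustrative, not from the paper.

```python
def cohort_index(period, age):
    """In an age-period-cohort model the three variables are linearly
    dependent: cohort (birth year) = period - age, which is why the three
    effects cannot all be identified without extra model structure."""
    return period - age

def mse(actual, predicted):
    """Mean squared error, the out-of-sample criterion used to compare
    mortality forecasts across models."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

print(cohort_index(2020, 65))  # a 65-year-old observed in 2020 is in the
                               # 1955 birth cohort
print(mse([0.010, 0.012], [0.011, 0.012]))
```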
Procedia PDF Downloads 52
27210 Human Resources and Business Result: An Empirical Approach Based on RBV Theory
Authors: Xhevrie Mamaqi
Abstract:
Organizational learning capacity is a process referring to the sum total of individual and collective learning through training programs, experience, and experimentation, among others. Today, ongoing in-company training is one of the most important strategies for human capital development, and it is crucial for sustaining and improving workers' knowledge and skills. Many organizations, firms, and businesses are adopting a strategy of continuous learning, encouraging employees to learn new skills continually, to be innovative, and to try new processes and ways of working in order to achieve a competitive advantage and superior business results. This paper uses the Resource-Based View (RBV) approach to construct a hypothetical model of the relationships between training and business results. The model is tested on cross-sectional data from a sample of 266 businesses in the Spanish service sector. A Structural Equation Model (SEM) is used to estimate the relationship between ongoing training, represented by two latent dimensions denominated human and social capital resources, and economic business results. The estimated coefficients show the effect of certain training aspects in explaining the variation in business results.
Keywords: business results, human and social capital resources, training, RBV theory, SEM
Procedia PDF Downloads 297
27209 Effect of Model Dimension in Numerical Simulation on Assessment of Water Inflow to Tunnel in Discontinuous Rock
Authors: Hadi Farhadian, Homayoon Katibeh
Abstract:
Groundwater inflow to tunnels is one of the most important problems in tunneling operations. The objective of this study is to investigate the effect of model dimensions on the assessment of tunnel inflow in discontinuous rock masses using numerical modeling. In numerical simulation, the model dimension plays an important role in the prediction of the water inflow rate. When the model dimension is very small, the short distance between the model boundary and the tunnel means that the boundary conditions affect the estimated groundwater flow into the tunnel, and the results show a very high inflow. Hence, in this study, the two-dimensional universal distinct element code (UDEC) was used, and the impact of different model parameters, such as tunnel radius, joint spacing, and horizontal and vertical model domain extent, was evaluated. Results show that the required model domain extent is a function of the most significant parameters, namely the tunnel radius and joint spacing.
Keywords: water inflow, tunnel, discontinuous rock, numerical simulation
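Numerical inflow estimates of this kind are often sanity-checked against a classical analytical approximation for a tunnel in a homogeneous medium, commonly attributed to Goodman: Q = 2*pi*k*H / ln(2H/r) per unit tunnel length. The formula and the parameter values below are a standard textbook reference, not taken from this study, and the closed form ignores the joints that the UDEC model resolves.

```python
import math

def goodman_inflow(k, H, r):
    """Analytical estimate of steady groundwater inflow per unit tunnel
    length in a homogeneous medium: Q = 2*pi*k*H / ln(2H/r), where k is
    hydraulic conductivity (m/s), H the water head above the tunnel (m),
    and r the tunnel radius (m)."""
    return 2 * math.pi * k * H / math.log(2 * H / r)

# Illustrative values: k = 1e-7 m/s, head H = 50 m, radius r = 3 m
print(goodman_inflow(1e-7, 50.0, 3.0))  # m^3/s per metre of tunnel
```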
Procedia PDF Downloads 523
27208 FEM Investigation of Inhomogeneous Wall Thickness Backward Extrusion for Aerosol Can Manufacturing
Authors: Jemal Ebrahim Dessie, Zsolt Lukacs
Abstract:
The wall of an aerosol can is formed by the backward extrusion process. Necking is a further forming stage that develops the can shoulder after backward extrusion. Due to the thin wall, buckling is the critical challenge for current pure-aluminum aerosol can industries. Designing and investigating extrusion with inhomogeneous wall thickness could be the best solution for reducing and optimizing the number of necking retractions. An FEM simulation of inhomogeneous wall thickness was carried out in this investigation. Through an axisymmetric Deform-2D backward extrusion, an aerosol can with a wall thickness of 0.4 mm at the top and 0.33 mm at the bottom has been developed. As a result, the number of retractions in the necking process can be optimized and a defect-free aerosol can shoulder can be manufactured.
Keywords: aerosol can, backward extrusion, Deform-2D, necking
Procedia PDF Downloads 187
27207 Traditional Drawing, BIM and Erudite Design Process
Authors: Maryam Kalkatechi
Abstract:
Nowadays, parametric design, scientific analysis, and digital fabrication are dominant. Many architectural practices are increasingly seeking to incorporate advanced digital software and fabrication in their projects. The erudite design process proposed here, which combines digital and practical aspects in a strong frame within the method, resulted from the dissertation research. The digital aspects are the progressive advancements in algorithm design and simulation software. These aspects have helped firms develop more holistic concepts at the early stage and maintain collaboration among disciplines during the design process. The erudite design process enhances current design processes by encouraging the designer to embed construction and architecture knowledge within the algorithm. It also involves the ongoing improvements in applying 3D printing to construction. This is achieved through 'data-sketches'. The term 'data-sketch' was developed by the author in the recently completed dissertation; a data-sketch accommodates the architect's decisions within the algorithm. This paper introduces the erudite design process and its components and summarizes the application of this process in the development of the '3D printed construction unit'. This paper contributes to bridging academia and practice with advanced technology by presenting a design process that transfers the dominance of the tool to the learned architect and encourages innovation in design processes.
Keywords: erudite, data-sketch, algorithm design in architecture, design process
Procedia PDF Downloads 273
27206 Proposing an Improved Managerial-Based Business Process Framework
Authors: Alireza Nikravanshallmani, Jamshid Dehmeshki, Mojtaba Ahmadi
Abstract:
Modeling business processes with BPMN (Business Process Modeling Notation) helps analysts and managers understand business processes and identify their shortcomings. These models provide a context for making rational decisions about organizing business process activities in an understandable manner. The purpose of this paper is to provide a framework for a better understanding of business processes and their problems by reducing the cognitive load of the displayed information for audiences at different managerial levels, while keeping the essential information they need. To this end, we integrate business process diagrams across the different managerial levels to develop a framework that improves the performance of business process management (BPM) projects. The proposed framework is entitled 'Business Process Improvement framework based on Managerial Levels (BPIML)'. It assigns a certain type of BPMN-based business process diagram (BPD) to the objectives and tasks of the various managerial levels of organizations and their roles in BPM projects, and thereby provides the necessary support for making decisions about business processes. The framework was evaluated with a case study in a real business process improvement project to demonstrate its superiority over the conventional method. A questionnaire consisting of 10 Likert-scale questions was designed and given to the participants (managers at the three managerial levels of Bank Refah Kargaran). The questionnaire results indicate that the proposed framework supports correct and timely decisions by increasing the clarity and transparency of the business processes, leading to success in BPM projects.
Keywords: business process management (BPM), business process modeling, business process reengineering (BPR), business process optimizing, BPMN
Procedia PDF Downloads 450
27205 Finite Dynamic Programming to Decision Making in the Use of Industrial Residual Water Treatment Plants
Authors: Oscar Vega Camacho, Andrea Vargas, Ellery Ariza
Abstract:
This paper presents the application of finite dynamic programming, specifically the Markov chain model, as part of the decision-making process of a company in the cosmetics sector located in the vicinity of Bogotá D.C. The objective of this process was to decide whether the company should completely rebuild its wastewater treatment plant or instead optimize the plant through the addition of equipment. The goal of both options was to make the improvements required to comply with the parameters established by national legislation regarding the treatment of waste before it is released into the environment. This technique allows the company to select the best option and implement a waste-processing solution that minimizes environmental damage as well as acquisition and implementation costs.
Keywords: decision making, Markov chain, optimization, waste water
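A Markov-chain analysis of this kind rests on a transition matrix and its stationary distribution; the two-state matrix below is a hypothetical illustration (state 0 = plant compliant, state 1 = non-compliant), not the company's actual model.

```python
import numpy as np

def steady_state(P):
    """Stationary distribution pi of a Markov chain with transition
    matrix P, found as the normalized left eigenvector of P for
    eigenvalue 1 (i.e. the eigenvector of P.T)."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

# Hypothetical transition probabilities between compliance states.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
print(np.round(steady_state(P), 3))  # long-run fraction of time per state
```

Long-run cost comparisons (rebuild vs. add equipment) would weight each option's costs by such stationary probabilities.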
Procedia PDF Downloads 410
27204 Parametric Optimization of Electric Discharge Machining Process Using Taguchi's Method and Grey Relation Analysis
Authors: Pushpendra S. Bharti
Abstract:
The process yield of electric discharge machining (EDM) is directly related to the optimal combination(s) of process parameters. Optimization of EDM process parameters is a multi-objective optimization problem owing to the contradictory behavior of the performance measures. This paper employs Grey Relational Analysis (GRA) as a multi-objective optimization technique for selecting the optimal combination of process parameters. In GRA, multi-response optimization is converted into the optimization of a single response, the grey relational grade, which ultimately gives the optimal combination of process parameters. Experiments were carried out on a die-sinking EDM machine with D2 steel as the workpiece and copper as the electrode material. Taguchi's L36 orthogonal array was used for the design of experiments. GRA was then applied to the experimental values for parametric optimization. A significant improvement in process yield has been observed and reported for the parametric combination(s) obtained through GRA.
Keywords: electric discharge machining, grey relational analysis, material removal rate, optimization
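The grey relational grade computation can be sketched compactly: normalize each response, measure each run's deviation from the ideal, convert to grey relational coefficients, and average into one grade per run. The response values below are hypothetical, and larger-the-better normalization is assumed for both responses.

```python
import numpy as np

def grey_relational_grade(responses, rho=0.5):
    """Grey relational analysis: larger-the-better normalization per
    response column, grey relational coefficients against the ideal
    (normalized value 1.0), then the row-wise mean as the grade."""
    seq = np.asarray(responses, dtype=float)
    norm = (seq - seq.min(axis=0)) / (seq.max(axis=0) - seq.min(axis=0))
    delta = np.abs(1.0 - norm)                      # deviation from ideal
    coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return coeff.mean(axis=1)                       # one grade per run

# Hypothetical responses for 3 runs: columns = (MRR, surface-finish score)
runs = [[10.0, 0.8], [14.0, 0.6], [12.0, 0.9]]
print(np.round(grey_relational_grade(runs), 3))
```

The run with the highest grade identifies the best parameter combination among those tested.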
Procedia PDF Downloads 407
27203 A Second Spark Ignition Timing for the High Power Aircraft Radial Engine Using a CFD Transient Modeling
Authors: Tytus Tulwin, Adam Majczak
Abstract:
In aviation, the most important systems for flight safety are duplicated. The ASz-62IR aircraft radial engine has two spark plugs powered by two separate magnetos. The relative difference in spark timing influences the combustion process, so the retardation of the second spark relative to the first was analyzed. The CFD simulation was developed as a multi-cycle transient model. Two independent spark sources imitate two flame fronts after the ignition period, which shortens the combustion process, but only within a certain range of second-spark retardation. The model was validated by comparison with in-cylinder pressure measurements. Combustion parameters were analyzed for different second-spark retardation values. It was found that the most advantageous ignition timing in terms of performance is simultaneous ignition. Nevertheless, for this engine the ignition time of the second spark plug is greatly retarded, eliminating the advantageous performance influence. The reason for this is maintaining high ignition certainty for all engine running conditions and the whole operating rpm range: in aviation, engine reliability is more important than performance. An electronic ignition system could exploit simultaneous ignition timing, increasing engine performance while providing good reliability for all flight conditions. This work has been financed by the Polish National Centre for Research and Development, INNOLOT, under Grant Agreement No. INNOLOT/I/1/NCBR/2013.
Keywords: CFD, combustion, ignition, simulation, timing
Procedia PDF Downloads 381
27202 Assessment of Pre-Processing Influence on Near-Infrared Spectra for Predicting the Mechanical Properties of Wood
Authors: Aasheesh Raturi, Vimal Kothiyal, P. D. Semalty
Abstract:
We studied the mechanical properties of Eucalyptus tereticornis using FT-NIR spectroscopy. First, the spectra were pre-processed to eliminate useless information. Then, a prediction model was constructed by partial least squares (PLS) regression. To study the influence of pre-processing on the prediction of mechanical properties in NIR analysis of wood samples, we applied various pre-treatment methods, such as straight-line subtraction, constant offset elimination, vector normalization, min-max normalization, multiplicative scattering correction, first derivative, and second derivative, as well as their combinations, such as first derivative + straight-line subtraction, first derivative + vector normalization, and first derivative + multiplicative scattering correction. For each combination of pre-processing method and NIR region, the RMSECV, RMSEP, and optimum number of factors (rank) were obtained during model development. More than 350 combinations were evaluated in the optimization process. More than one pre-processing method gave good calibration/cross-validation and prediction/test models, but only the best of these models are reported here. The results show that one can safely use the NIR region between 4000 and 7500 cm-1 with the straight-line subtraction, constant offset elimination, first derivative, or second derivative pre-processing methods, which were found to be the most appropriate for model development.
Keywords: FT-NIR, mechanical properties, pre-processing, PLS
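Two of the pre-treatments found most appropriate here, straight-line subtraction and the first derivative, can be sketched as follows (a minimal illustration using a least-squares baseline and a simple finite difference; the authors' exact implementations, e.g. Savitzky-Golay derivatives, may differ):

```python
import numpy as np

def straight_line_subtraction(spectra):
    """Remove a least-squares linear baseline from each spectrum
    (rows = samples, columns = wavenumber points)."""
    X = np.asarray(spectra, dtype=float)
    x = np.arange(X.shape[1])
    out = np.empty_like(X)
    for i, s in enumerate(X):
        slope, intercept = np.polyfit(x, s, 1)  # fit the baseline
        out[i] = s - (slope * x + intercept)    # subtract it
    return out

def first_derivative(spectra):
    """Simple finite-difference first derivative along the spectral axis."""
    return np.diff(np.asarray(spectra, dtype=float), axis=1)
```

A purely linear "spectrum" is reduced to zero by the baseline subtraction, which is exactly the offset-and-slope information these pre-treatments are meant to discard before PLS modelling.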
Procedia PDF Downloads 357
27201 Simulating the Effect of Chlorine on Dynamic of Main Aquatic Species in Urban Lake with a Mini System Dynamic Model
Authors: Zhiqiang Yan, Chen Fan, Beicheng Xia
Abstract:
Urban lakes play an invaluable role in urban water systems, serving flood control, landscape, entertainment, and energy utilization functions, and have suffered from severe eutrophication over the past few years. To investigate the ecological response of the main aquatic species and of system stability to chlorine interference in shallow urban lakes, a mini system dynamics model, based on the competition and predation among the main aquatic species and on TP circulation, was developed. The main species of submerged macrophyte, phytoplankton, zooplankton, and benthos, together with TP in water and sediment, were simulated as model variables, with the interference of chlorine represented by an attenuation equation. The model was validated against data collected in the Lotus Lake in Guangzhou from October 1, 2015 to January 31, 2016. Furthermore, eco-exergy was used to analyze the change in complexity of the shallow urban lake. The results showed that the correlation coefficients between observed and simulated values were significant for all components. Chlorine showed a significant inhibitory effect on Microcystis aeruginosa, Brachionus plicatilis, Diaphanosoma brachyurum Liévin, and Mesocyclops leuckarti (Claus). The outbreak of Spirogyra spp. inhibited the growth of Vallisneria natans (Lour.) Hara and caused a gradual decrease in eco-exergy, reflecting the breakdown of the ecosystem's internal equilibria. It was concluded that the study gives important insight into using chlorine to control eutrophication and into understanding the underlying mechanisms.
Keywords: system dynamics model, urban lake, chlorine, eco-exergy
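A system dynamics model of this kind can be sketched as coupled growth equations with an exponentially attenuating chlorine term. All coefficients below are hypothetical, chosen only to illustrate the structure (species competition plus a chlorine attenuation equation), not the calibrated Lotus Lake model:

```python
import math

def simulate(days, dt=0.1, chlorine0=0.0, k_decay=0.3):
    """Minimal sketch: phytoplankton P grazed by zooplankton Z, both
    inhibited by a chlorine dose C that decays exponentially over time
    (the attenuation equation). Euler integration; rates are assumed."""
    P, Z = 1.0, 0.5
    for i in range(int(days / dt)):
        t = i * dt
        C = chlorine0 * math.exp(-k_decay * t)  # chlorine attenuation
        inhibit = 1.0 / (1.0 + C)               # assumed inhibition response
        dP = (0.4 * inhibit - 0.2 * Z) * P      # growth minus grazing
        dZ = (0.1 * P * inhibit - 0.05) * Z     # grazing gain minus mortality
        P = max(P + dP * dt, 0.0)
        Z = max(Z + dZ * dt, 0.0)
    return P, Z
```

Running the sketch with and without an initial chlorine dose shows the inhibitory effect on the phytoplankton state variable, which is the qualitative behaviour the abstract reports for Microcystis aeruginosa.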
Procedia PDF Downloads 209
27200 The Impact of Election Observation on Electoral Reforms in Nigeria
Authors: Abubakar Sulaiman
Abstract:
The paper examines how election observation influences electoral reforms in Nigeria. Over the years, election observation has continued to play a critical role in the electoral process, in Nigeria specifically and in Africa at large. Election observation keeps an eye on the electoral process and on all stakeholders during elections, to ensure that the process is fair to all contestants. While literature abounds on this role of election observation in the Nigerian electoral process, scant scholarly effort has been made to appraise how election observation influences electoral reforms in Nigeria. Also, while election observation may play a role in ensuring that the electoral process is credible, its specific role in provoking and eliciting the country's various electoral reforms has not been explored. The paper adopts an explanatory research design using secondary data and document analysis. Preliminary findings show that election observation has influenced electoral reforms in Nigeria in no small measure. The paper concludes that election observation is critical for result-oriented electoral reforms in Nigeria, albeit such reforms have to be implemented to the letter.
Keywords: electoral reforms, election observation, electoral process, developing country
Procedia PDF Downloads 164
27199 An Interdisciplinary Maturity Model for Accompanying Sustainable Digital Transformation Processes in a Smart Residential Quarter
Authors: Wesley Preßler, Lucie Schmidt
Abstract:
Digital transformation is playing an increasingly important role in the development of smart residential quarters. In order to accompany and steer this process, and ultimately to make the success of the transformation efforts measurable, it is helpful to use an appropriate maturity model. However, conventional maturity models for digital transformation focus primarily on the evaluation of processes and neglect the information and power imbalances between the stakeholders, which affects the validity of the results. The Multi-Generation Smart Community (mGeSCo) research project is developing an interdisciplinary maturity model that integrates the dimensions of digital literacy, interpretive patterns, and technology acceptance to address this gap. As part of the mGeSCo project, the technological development of selected dimensions in the Smart Quarter Jena-Lobeda (Germany) is being investigated. A specific maturity model, based on Cohen's Smart Cities Wheel, evaluates the central dimensions of Working, Living, Housing, and Caring. To improve the reliability and relevance of the maturity assessment, the factors digital literacy, interpretive patterns, and technology acceptance are integrated into the developed model. The digital literacy dimension examines stakeholders' skills in using digital technologies, which influence their perception and assessment of technological maturity. Digital literacy is measured by means of surveys, interviews, and participant observation, using the European Commission's Digital Competence Framework (DigComp) as a basis. Interpretations of digital technologies provide information about how individuals perceive technologies and ascribe meaning to them. These are not mere assessments, prejudices, or stereotyped perceptions, but the collective patterns, rules, attributions of meaning, and cultural repertoire that lead to such opinions and attitudes.
Understanding these interpretations helps in assessing the overarching readiness of stakeholders to digitally transform their neighborhood. This involves examining people's attitudes, beliefs, and values regarding technology adoption, as well as their perceptions of the benefits and risks associated with digital tools. These insights provide important data for a holistic view and inform the steps needed to prepare individuals in the neighborhood for a digital transformation. Technology acceptance, the willingness of individuals to adopt and use new technologies, is another crucial factor for successful digital transformation. Surveys or questionnaires based on Davis' Technology Acceptance Model can be used to complement the interpretive patterns in measuring neighborhood acceptance of digital technologies. Integrating the dimensions of digital literacy, interpretive patterns, and technology acceptance enables the development of a roadmap with clear prerequisites for initiating a digital transformation process in the neighborhood. During the process, maturity is measured at different points in time and compared with changes in the aforementioned dimensions to ensure a sustainable transformation. Participation, co-creation, and co-production are essential concepts for a successful and inclusive digital transformation in the neighborhood context. This interdisciplinary maturity model helps to improve the assessment and monitoring of sustainable digital transformation processes in smart residential quarters. It enables a more comprehensive recording of the factors that influence the success of such processes and supports the development of targeted measures to promote digital transformation in the neighborhood context.
Keywords: digital transformation, interdisciplinary, maturity model, neighborhood
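Aggregating the per-dimension assessments (e.g. digital literacy, interpretive patterns, technology acceptance) into one maturity value could look like the following sketch. The equal-weight default and the 1-5 rating scale are assumptions for illustration, not part of the mGeSCo model:

```python
def maturity_score(dim_scores, weights=None):
    """Weighted average of per-dimension maturity levels.
    dim_scores: mapping of dimension name -> level (assumed 1-5 scale).
    weights: optional mapping of dimension name -> weight; equal by default."""
    dims = list(dim_scores)
    if weights is None:
        weights = {d: 1.0 for d in dims}  # hypothetical equal weighting
    total = sum(weights[d] for d in dims)
    return sum(dim_scores[d] * weights[d] for d in dims) / total
```

Repeating the measurement at different points in time and comparing the resulting scores is what makes the transformation progress trackable, as described above.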
Procedia PDF Downloads 77
27198 A Multi-agent System Framework for Stakeholder Analysis of Local Energy Systems
Authors: Mengqiu Deng, Xiao Peng, Yang Zhao
Abstract:
The development of local energy systems requires the collective involvement of different actors from various levels of society. However, stakeholder analysis of local energy systems remains under-developed. This paper proposes a multi-agent system (MAS) framework to facilitate the stakeholder analysis of local energy systems. The framework takes into account the most influential stakeholders, including prosumers/consumers, system operators, energy companies, and government bodies. The different stakeholders are modeled using agent architectures, for example belief-desire-intention (BDI), to better reflect their motivations and interests in participating in local energy systems. The agent models of the different stakeholders are then integrated into one model of the whole energy system. An illustrative case study elaborates how to develop a quantitative agent model for the different stakeholders and demonstrates the practicability of the proposed framework. The findings from the case study indicate that the suggested framework and agent model can serve as analytical instruments for enhancing the government's policy-making process by offering a systematic view of stakeholder interconnections in local energy systems.
Keywords: multi-agent system, BDI agent, local energy systems, stakeholders
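A BDI-style agent for one stakeholder type, such as a prosumer, could be sketched as follows. The price threshold and the perceive/deliberate split are illustrative assumptions, not the paper's calibrated model:

```python
from dataclasses import dataclass, field

@dataclass
class ProsumerAgent:
    """Minimal BDI sketch of a prosumer.
    Beliefs: observed grid price and own surplus generation.
    Desire: minimise energy cost.
    Intention: sell surplus when the price is high, store it otherwise."""
    beliefs: dict = field(default_factory=lambda: {"price": 0.0, "surplus_kwh": 0.0})
    sell_threshold: float = 0.25  # EUR/kWh, hypothetical

    def perceive(self, price, surplus_kwh):
        # Update beliefs from the environment
        self.beliefs.update(price=price, surplus_kwh=surplus_kwh)

    def deliberate(self):
        # Derive an intention from the current beliefs and the cost desire
        if self.beliefs["surplus_kwh"] <= 0:
            return "idle"
        return "sell" if self.beliefs["price"] >= self.sell_threshold else "store"
```

In a full MAS framework, agents of this shape for operators, energy companies, and government bodies would be composed into one system model and stepped together.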
Procedia PDF Downloads 87
27197 Reliability Enhancement by Parameter Design in Ferrite Magnet Process
Abstract:
Ferrite magnets are widely used in many automotive components, such as motors and alternators. Magnets used inside these components must be of good quality to ensure a high level of performance. The purpose of this study is to design the input parameters that optimize the ferrite magnet production process so as to ensure the quality and reliability of the manufactured products. Design of Experiments (DOE) and Statistical Process Control (SPC) are used as mutual supplements to optimize the process. DOE and SPC are quality tools used in industry to monitor and improve the manufacturing process condition; in practice, they are applied to keep the process on target and within the limits of its natural variation. A mixed Taguchi method is utilized for optimization as part of the DOE analysis. SPC with proportion data is applied to assess the output parameters and determine the optimal operating conditions. An example case involving the monitoring and optimization of a ferrite magnet process is presented to demonstrate the effectiveness of this approach. Through the use of these tools, reliable magnets can be produced by following the step-by-step procedure of the proposed framework. One of the main contributions of this study was the production of crack-free magnets through the proposed parameter design.
Keywords: ferrite magnet, crack, reliability, process optimization, Taguchi method
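A Taguchi analysis of this kind rests on signal-to-noise ratios computed per parameter combination; the two standard formulations relevant here can be sketched as follows (a generic illustration of the textbook formulas, not the authors' mixed-method procedure):

```python
import math

def sn_larger_is_better(values):
    """Taguchi S/N ratio for a larger-is-better response (e.g. magnet strength):
    -10 log10( mean(1/y^2) )."""
    return -10.0 * math.log10(sum(1.0 / v ** 2 for v in values) / len(values))

def sn_smaller_is_better(values):
    """Taguchi S/N ratio for a smaller-is-better response (e.g. crack rate):
    -10 log10( mean(y^2) )."""
    return -10.0 * math.log10(sum(v ** 2 for v in values) / len(values))
```

For each factor level, the mean S/N ratio over the orthogonal-array runs is compared, and the level with the highest ratio is selected as the optimal parameter setting.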
Procedia PDF Downloads 517
27196 Modeling and Analysis of Laser Sintering Process Scanning Time for Optimal Planning and Control
Authors: Agarana Michael C., Akinlabi Esther T., Pule Kholopane
Abstract:
In order to sustain the advantages of an advanced manufacturing technique such as laser sintering, minimizing the total processing cost of the parts being produced is very important. Efficient time management is usually essential to attaining optimal cost, which ultimately results in efficient planning and control of the advanced manufacturing process. During selective laser sintering (SLS) procedures, it is possible to adjust various manufacturing parameters that influence the mechanical and other properties of the products. In this study, modeling and mathematical analysis, including sensitivity analysis, of the laser sintering process scanning time were carried out. The results of the analyses are represented with graphs, from which conclusions were drawn. It was specifically observed that achieving the optimal total scanning time is key to the economic efficiency required for the sustainability of the process.
Keywords: modeling and analysis, optimal planning and control, laser sintering process, scanning time
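A first-order scanning-time model and the kind of sensitivity analysis mentioned above can be sketched as follows. The time formula ignores jump and delay times and is an assumption for illustration, not the paper's model:

```python
def layer_scan_time(area_mm2, scan_speed_mm_s, hatch_spacing_mm):
    """Hypothetical first-order model: total vector length = area / hatch
    spacing, and time = length / scan speed."""
    return area_mm2 / (hatch_spacing_mm * scan_speed_mm_s)

def sensitivity(f, kwargs, name, rel=1e-3):
    """Normalised sensitivity d ln(f) / d ln(x) via a central finite
    difference on parameter `name`."""
    base = dict(kwargs)
    x = base[name]
    hi = dict(base, **{name: x * (1 + rel)})
    lo = dict(base, **{name: x * (1 - rel)})
    return (f(**hi) - f(**lo)) / (f(**base) * 2 * rel)
```

In this simple model, the normalised sensitivity of scan time to scan speed is -1 (doubling the speed halves the time), while the sensitivity to scanned area is +1, which is the kind of relationship a graphical sensitivity analysis would display.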
Procedia PDF Downloads 97
27195 Open Innovation for Crowdsourced Product Development: The Case Study of Quirky.com
Authors: Ana Bilandzic, Marcus Foth, Greg Hearn
Abstract:
In a narrow sense, innovation is the invention and commercialisation of a new product or service in the marketplace. The literature suggests that places which support knowledge exchange and social interaction, e.g. coffee shops, nurture innovative ideas. With the widespread success of the Internet, interpersonal communication and interaction have changed: online platforms now complement physical places for idea exchange and innovation, giving rise to hybrid 'net localities.' Further, since its introduction by Chesbrough in 2003, the concept of open innovation has received increased attention, both as a topic in academic research and as an innovation strategy applied by companies. Open innovation allows companies to seek and release intellectual property and new ideas from outside their own boundaries. As a consequence, the innovation process is no longer managed only within the company but is pursued in a co-creation process with customers, suppliers, and other stakeholders. Quirky.com (Quirky), a company founded by Ben Kaufman in 2009, recognised the opportunity the Internet offers for knowledge exchange and open innovation. Quirky developed an online platform that makes innovation available to everyone. This paper reports on a study that analysed Quirky's business process in an extended event-driven process chain (eEPC). The aim was to determine how the platform enabled crowdsourced innovation for physical products on the Internet. The analysis reveals that key elements of the business model are based on open innovation. Quirky is an example of how open innovation can support crowdsourced and crowdfunded product ideation, development, and selling. The company opened up various stages of the innovation process, e.g. product ideation, design, and market research, for its members to contribute to product development. Throughout the process, members earn influence through their participation, and based on the influence they earn, they receive shares of the product's turnover.
The outcomes of the study's analysis highlight certain benefits of open innovation for product development. The paper concludes with recommendations for future research to look into opportunities for open innovation approaches to be adopted by tertiary institutions as a novel way to commercialise research intellectual property.
Keywords: business process, crowdsourced innovation, open innovation, Quirky
Procedia PDF Downloads 227
27194 Contemporary Materialities
Authors: Fabian Saptouw
Abstract:
In the past decade there has been a resurgence of interest in the value of 'process' and 'craft' within the social and artistic community. Theorists such as Barbara Bolt and Paul Carter have eloquently argued for the importance of 'theorizing out of practice' and 'material thinking' in response to this trend. Time- and labour-intensive artistic production processes are, however, not generally included in this bracket and are often labelled as either obsessive or absurd. Neither of these terms adequately conveys the conceptual importance of labour in relation to 'process' as manifested through this production method. This issue will be addressed by critically assessing the work of eight South African artists through the lens of contemporary process-based production. This will result in a more integrated view of the art object, its art-historical trajectory, its materialisation, and its production process. The paper concludes by tying the characteristics of these artworks to international trends and providing a platform for an overall reconsideration of unalienated artistic labour.
Keywords: materiality, process art, practice-led research, unalienated labour
Procedia PDF Downloads 340
27193 A Study of the Adaptive Reuse for School Land Use Strategy: An Application of the Analytic Network Process and Big Data
Authors: Wann-Ming Wey
Abstract:
Given today's popularity and progress of information technology, big data sets and their analysis are no longer a major conundrum. We can now use the relevant big data not only to analyze and simulate the possible status of urban development in the near future, but also to provide government units and decision-makers with a more comprehensive and reasonable basis for policy implementation via the results of such analysis and simulation. In this research, we set Taipei City as the research scope and use relevant big data variables (e.g., population, facility utilization, and related social policy ratings) together with the Analytic Network Process (ANP) approach to carry out an in-depth study of the possible reduction of land use by primary and secondary schools in Taipei City. In addition to enhancing prosperous urban activities through better utilization of urban public facilities, the final results of this research could help improve the efficiency of urban land use in the future. Furthermore, the assessment model and research framework established in this research also provide a good reference for future land use and adaptive reuse strategies for schools and other public facilities.
Keywords: adaptive reuse, analytic network process, big data, land use strategy
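The ANP core, raising a weighted supermatrix to successive powers until its columns converge to the limiting priorities, can be sketched as follows. The 2x2 supermatrix below is a hypothetical toy example, not the Taipei City model:

```python
import numpy as np

def limit_supermatrix(W, power=64):
    """ANP sketch: raise a column-stochastic weighted supermatrix to a high
    power; every column of the limit matrix converges to the same vector of
    limiting priorities (assumes the matrix is primitive)."""
    W = np.asarray(W, dtype=float)
    M = np.linalg.matrix_power(W, power)
    return M[:, 0]  # any column of the (near-)limit matrix

# Hypothetical interdependence weights between two criteria; columns sum to 1
W_toy = [[0.5, 0.3],
         [0.5, 0.7]]
```

The resulting priority vector ranks the interdependent criteria or alternatives; in the study above, this ranking would feed the school land-use reduction assessment.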
Procedia PDF Downloads 203
27192 An As-Is Analysis and Approach for Updating Building Information Models and Laser Scans
Authors: Rene Hellmuth
Abstract:
Factory planning has the task of designing products, plants, processes, organization, and areas, as well as the construction of a factory. The requirements for factory planning and for the factory building itself have changed in recent years. Regular restructuring of the factory building is becoming more important in order to maintain the competitiveness of a factory. Restrictions on new areas, shorter life cycles of products and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity and Ambiguity), lead to more frequent restructuring measures within a factory. A building information model (BIM) is the planning basis for rebuilding measures and becomes an indispensable data repository for reacting quickly to changes. Its use as a planning basis for restructuring measures in factories succeeds only if the BIM model has adequate data quality. Under this aspect, and given the industrial requirements, three data quality factors of the BIM model are particularly important for this paper: up-to-dateness, completeness, and correctness. The research question is: how can a BIM model be kept up to date with the required data quality, and which visualization techniques can be applied on the construction site within a short period of time during conversion measures? An as-is analysis is made of how BIM models and digital factory models (including laser scans) are currently kept up to date. Industrial companies are interviewed, and expert interviews are conducted. Subsequently, the results are evaluated, and a procedure is conceived for carrying out cost-effective and time-saving updating processes. The availability of low-cost hardware and the simplicity of the process are important in enabling service personnel from facility management to keep digital factory models (BIM models and laser scans) up to date. The approach includes the detection of changes to the building, the recording of the changed area, and its insertion into the overall digital twin.
Finally, an overview of the visualization options suitable for construction sites is compiled. An augmented reality application based on an updated BIM model of a factory is created and installed on a tablet. Conversion scenarios with their costs and time expenditure are displayed. A user interface is designed in such a way that all relevant conversion information is available at a glance for the respective conversion scenario. A total of three essential research results are achieved: an as-is analysis of current update processes for BIM models and laser scans, the development of a time-saving and cost-effective update process, and the conception and implementation of an augmented reality solution for BIM models suitable for construction sites.
Keywords: building information modeling, digital factory model, factory planning, restructuring
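The change-detection step, comparing a new laser scan against the reference scan to find the rebuilt area, could be sketched as a nearest-neighbour distance test. This is an assumed minimal workflow: the tolerance value and the brute-force search are illustrative only (a k-d tree would be used at realistic point counts):

```python
import numpy as np

def changed_points(scan_new, scan_ref, tol=0.05):
    """Flag points in a new laser scan whose nearest neighbour in the
    reference scan is farther than `tol` (same length unit as the points),
    marking regions of the building that were changed by a conversion.
    scan_new, scan_ref: (n, 3) arrays of x/y/z coordinates."""
    new = np.asarray(scan_new, dtype=float)
    ref = np.asarray(scan_ref, dtype=float)
    # Brute-force pairwise distances, then the nearest-neighbour minimum
    d = np.linalg.norm(new[:, None, :] - ref[None, :, :], axis=2).min(axis=1)
    return d > tol
```

The flagged region would then be re-recorded and inserted into the overall digital twin, as the approach above describes.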
Procedia PDF Downloads 112