Search results for: component based development
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 38157

37857 Detection of Abnormal Process Behavior in Copper Solvent Extraction by Principal Component Analysis

Authors: Kirill Filianin, Satu-Pia Reinikainen, Tuomo Sainio

Abstract:

Frequent measurements of product stream quality create a data overload that becomes more and more difficult to handle. In the current study, plant history data with multiple variables was successfully treated by principal component analysis to detect abnormal process behavior, particularly in copper solvent extraction. The multivariate model is based on the concentration levels of the main process metals recorded by the industrial on-stream X-ray fluorescence analyzer. After mean-centering and normalization of the concentration data set, a two-dimensional multivariate model was constructed using the principal component analysis algorithm. Normal operating conditions were defined through control limits assigned to squared score values on the x-axis and to residual values on the y-axis. 80 percent of the data set was taken as the training set, and the multivariate model was tested with the remaining 20 percent. Model testing showed that the control limits successfully detect abnormal behavior of the copper solvent extraction process and provide early warnings. Compared to the conventional technique of analyzing one variable at a time, the proposed model makes it possible to detect a process failure on-line using information from all process variables simultaneously. Complex industrial equipment combined with advanced mathematical tools may be used for on-line monitoring of both process stream composition and final product quality. Defining the normal operating conditions of the process supports reliable decision making in the process control room. Thus, industrial X-ray fluorescence analyzers equipped with an integrated data processing toolbox allow more flexibility in copper plant operation. It is recommended that the additional multivariate process control and monitoring procedures be applied separately to the major components and to the impurities.
Principal component analysis may be utilized not only to control the content of major elements in process streams, but also for continuous monitoring of plant feed. The proposed approach has potential for on-line instrumentation, providing a fast, robust, and inexpensive application with automation capabilities.
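
The monitoring scheme described above (a two-component PCA model with control limits on the squared scores, i.e. Hotelling's T², and on the residuals, i.e. the Q/SPE statistic, fitted on an 80/20 split) can be sketched as follows. This is a minimal illustration on synthetic data, since the plant data are not public; the 99th-percentile control limits and all variable names are assumptions made for the sketch.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic stand-in for the on-stream XRF concentration data: four
# correlated "metal concentrations", 500 samples.
base = rng.normal(size=(500, 2))
X = np.hstack([base, base @ rng.normal(size=(2, 2))])
X += rng.normal(scale=0.1, size=X.shape)

# 80/20 split, then mean-centering and normalisation as described.
n_train = int(0.8 * len(X))
X_train, X_test = X[:n_train], X[n_train:]
mu, sd = X_train.mean(axis=0), X_train.std(axis=0)
Z_train, Z_test = (X_train - mu) / sd, (X_test - mu) / sd

pca = PCA(n_components=2).fit(Z_train)   # two-dimensional model

def t2_and_q(Z):
    """Hotelling's T^2 (variance-scaled squared scores) and the
    Q/SPE residual statistic for each sample."""
    scores = pca.transform(Z)
    t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)
    q = np.sum((Z - pca.inverse_transform(scores))**2, axis=1)
    return t2, q

# Control limits: 99th percentile of the training statistics (a
# simple empirical choice; F / chi-square approximations are common).
t2_tr, q_tr = t2_and_q(Z_train)
t2_lim, q_lim = np.percentile(t2_tr, 99), np.percentile(q_tr, 99)

t2_te, q_te = t2_and_q(Z_test)
alarms = (t2_te > t2_lim) | (q_te > q_lim)
print(f"{alarms.sum()} of {len(alarms)} test samples flagged as abnormal")
```

A sample exceeding either limit would be raised as an early warning in the control room.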

Keywords: abnormal process behavior, failure detection, principal component analysis, solvent extraction

Procedia PDF Downloads 281
37856 Exploratory Factor Analysis of Natural Disaster Preparedness Awareness of Thai Citizens

Authors: Chaiyaset Promsri

Abstract:

Based on a synthesis of the related literature, this research identified thirteen dimensions involved in the development of natural disaster preparedness awareness, including hazard knowledge, hazard attitude, training for disaster preparedness, rehearsal and practice for disaster preparedness, cultural development for preparedness, public relations and communication, storytelling, disaster awareness games, simulation, past experience of natural disaster, information sharing with family members, and commitment to the community (time of living). A 40-item natural disaster preparedness awareness questionnaire was developed based on these thirteen dimensions. Data were collected from 595 participants in the Bangkok metropolitan area and its vicinity. Cronbach's alpha was used to examine the internal consistency of the instrument; the reliability coefficient was .97, which is highly acceptable. Exploratory factor analysis (EFA) with principal axis factoring was employed. The Kaiser-Meyer-Olkin index of sampling adequacy was .973, indicating that the data represented a homogeneous collection of variables suitable for factor analysis. Bartlett's test of sphericity was significant for the sample (Chi-square = 23168.657, df = 780, p < .0001), which indicated that the set of correlations in the correlation matrix was significantly different from zero and acceptable for EFA. Factor extraction was performed using principal component analysis with varimax rotation to determine the number of factors. The results revealed four factors with eigenvalues greater than 1 and more than 60% cumulative variance. Factor #1 had an eigenvalue of 22.270, with factor loadings ranging from 0.626 to 0.760; it was named "Knowledge and Attitude of Natural Disaster Preparedness". Factor #2 had an eigenvalue of 2.491, with factor loadings ranging from 0.596 to 0.696; it was named "Training and Development". Factor #3 had an eigenvalue of 1.821, with factor loadings ranging from 0.643 to 0.777; it was named "Building Experiences about Disaster Preparedness". Factor #4 had an eigenvalue of 1.365, with factor loadings ranging from 0.657 to 0.760; it was named "Family and Community". The results of this study provide support for the reliability and construct validity of the natural disaster preparedness awareness instrument for use with populations similar to the sample employed.
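
The retention rule used in the abstract (eigenvalues of the item correlation matrix greater than 1, with cumulative variance above 60%) can be illustrated with a small sketch. The data here are simulated with four latent factors so as to mirror the reported structure; the item counts, loading pattern, and noise level are assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated 40-item questionnaire with n = 595 respondents and four
# latent factors of ten items each, mirroring the reported solution.
n, per_factor = 595, 10
factors = rng.normal(size=(n, 4))
loadings = np.repeat(np.eye(4), per_factor, axis=0)      # 40 x 4 pattern
X = factors @ loadings.T + rng.normal(scale=0.6, size=(n, 40))

# Eigenvalues of the item correlation matrix drive factor retention.
R = np.corrcoef(X, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]

retained = int(np.sum(eigvals > 1.0))                    # Kaiser criterion
cum_var = eigvals.cumsum() / eigvals.sum() * 100         # cumulative % variance
print(f"factors with eigenvalue > 1: {retained}")
print(f"cumulative variance of the first {retained}: {cum_var[retained - 1]:.1f}%")
```

A full EFA would follow this with rotation (e.g. varimax) and inspection of the loading matrix, as done in the study.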

Keywords: natural disaster, disaster preparedness, disaster awareness, Thai citizens

Procedia PDF Downloads 348
37855 Comparison of Web Development Using Framework over Library

Authors: Syamsul Syafiq, Maslina Daud, Hafizah Hasan, Ahmad Zairi, Shazil Imri, Ezaini Akmar, Norbazilah Rahim

Abstract:

Over recent years, web development has changed significantly, driven largely by the rise of trends like mobile devices. The rise of the Internet has made web applications crucial nowadays. The web application has become an interface for a company and one of the ways it presents its portfolio to clients. On the other hand, the web has become part of the file management system, taking over the role of paper. Due to the high demand for web applications, developers are required to develop applications that are cost-effective, secure, and well coded. Frameworks have been proposed for developing applications as an alternative to library-style development; a framework helps the developer by creating the structure of a web application automatically. This paper compares the advantages and disadvantages of web development using a framework against library-style development. The comparison is based on a previous research paper and focuses on two main indicators: the impact on management and the impact on the developer.

Keywords: framework, library style development, web application development, traditional web, static web, dynamic web

Procedia PDF Downloads 198
37854 A Combination of Anisotropic Diffusion and Sobel Operator to Enhance the Performance of the Morphological Component Analysis for Automatic Crack Detection

Authors: Ankur Dixit, Hiroaki Wagatsuma

Abstract:

The crack detection on a concrete bridge is an important and constant task in civil engineering. Chronically, humans are checking the bridge for inspection of cracks to maintain the quality and reliability of bridge. But this process is very long and costly. To overcome such limitations, we have used a drone with a digital camera, which took some images of bridge deck and these images are processed by morphological component analysis (MCA). MCA technique is a very strong application of sparse coding and it explores the possibility of separation of images. In this paper, MCA has been used to decompose the image into coarse and fine components with the effectiveness of two dictionaries namely anisotropic diffusion and wavelet transform. An anisotropic diffusion is an adaptive smoothing process used to adjust diffusion coefficient by finding gray level and gradient as features. These cracks in image are enhanced by subtracting the diffused coarse image into the original image and the results are treated by Sobel edge detector and binary filtering to exhibit the cracks in a fine way. Our results demonstrated that proposed MCA framework using anisotropic diffusion followed by Sobel operator and binary filtering may contribute to an automation of crack detection even in open field sever conditions such as bridge decks.
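
The key idea, edge-preserving diffusion to suppress surface texture, followed by Sobel edge detection and binary thresholding, can be sketched in a simplified single-dictionary form. This is not the paper's full MCA (the wavelet dictionary and the subtraction step are omitted); here Sobel is applied directly to the diffused image, and the toy "bridge deck" image, diffusion parameters, and threshold are all illustrative assumptions.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=25, kappa=15.0, dt=0.2):
    """Perona-Malik diffusion: smooths weak texture, preserves strong edges."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # differences toward the four neighbours (periodic borders for brevity)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        g = lambda d: np.exp(-(d / kappa) ** 2)  # conduction coefficient
        u = u + dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

def sobel_magnitude(img):
    """Plain 3x3 Sobel gradient magnitude (edge-padded)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    gx = sum(kx[i, j] * p[i:i + h, j:j + w] for i in range(3) for j in range(3))
    gy = sum(ky[i, j] * p[i:i + h, j:j + w] for i in range(3) for j in range(3))
    return np.hypot(gx, gy)

# Toy "bridge deck": noisy texture with one dark vertical crack.
rng = np.random.default_rng(2)
img = 100.0 + rng.normal(scale=5.0, size=(64, 64))
img[:, 30] -= 60.0

coarse = anisotropic_diffusion(img)             # texture smoothed, crack kept
edges = sobel_magnitude(coarse)
mask = edges > edges.mean() + 3 * edges.std()   # simple binary filtering
print("flagged pixels in crack-adjacent columns:",
      mask[:, 29].sum(), mask[:, 31].sum())
```

Because the crack gradient greatly exceeds kappa, diffusion leaves it intact while flattening the texture, so the thresholded Sobel map isolates the crack.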

Keywords: anisotropic diffusion, coarse component, fine component, MCA, Sobel edge detector, wavelet transform

Procedia PDF Downloads 150
37853 Efficiency and Reliability Analysis of SiC-Based and Si-Based DC-DC Buck Converters in Thin-Film PV Systems

Authors: Elaid Bouchetob, Bouchra Nadji

Abstract:

This research paper compares the efficiency and reliability (R(t)) of SiC-based and Si-based DC-DC buck converters in thin-film PV systems with an AI-based MPPT controller. Using Simplorer/Simulink simulations, the study assesses their performance under varying conditions. Results show that the SiC-based converter outperforms the Si-based one in efficiency and cost-effectiveness, especially under high-temperature and low-irradiance conditions. It also exhibits superior reliability, particularly at high temperature and voltage. The reliability function R(t) is analyzed to assess system performance over time; considering factors such as component failure rates and system lifetime, the SiC-based converter demonstrates better reliability. The research focuses on the buck converter's role in charging a lithium battery within the PV system. By combining the SiC-based converter and the AI-based MPPT controller, higher charging efficiency, improved reliability, and cost-effectiveness are achieved. The SiC-based converter proves superior under challenging conditions, emphasizing its potential for optimizing PV system charging. These findings contribute insights into the efficiency and reliability of SiC-based and Si-based converters in PV systems. SiC technology's advantages, coupled with advanced control strategies, promote efficient and sustainable energy storage using lithium batteries. The research supports PV system design and optimization for reliable renewable energy utilization.

Keywords: efficiency, reliability, artificial intelligence, SiC device, thin film, buck converter

Procedia PDF Downloads 36
37852 LaPEA: Language for Preprocessing of Edge Applications in Smart Factory

Authors: Masaki Sakai, Tsuyoshi Nakajima, Kazuya Takahashi

Abstract:

In order to improve the productivity of a factory, it is common practice to create an inference model by collecting and analyzing operational data off-line, and then to develop an edge application (EAP) that evaluates the quality of the products or diagnoses machine faults in real time. To accelerate this development cycle, an edge application framework for the smart factory is proposed, which makes it possible to create and modify EAPs based on prepared inference models. In this framework, the preprocessing component is the key part that makes it work. This paper proposes a language for the preprocessing of edge applications, called LaPEA, which can flexibly process data from several machine sensors into explanatory variables for an inference model, and shows that it meets the requirements for preprocessing.

Keywords: edge application framework, edgecross, preprocessing language, smart factory

Procedia PDF Downloads 118
37851 A Comparative Study of Innovative Regions in the World Based on the Theory of Innovation Ecosystem: Cases of the Silicon Valley, Cambridge, Tsukuba and Zhongguancun

Authors: Xinlan Zhang, Dandong Ge, Bingying Liu, Haoyang Liang

Abstract:

With the rapid development of technology and urbanization, innovation has become an important driving force for urban development. Since the late 20th century, a number of cities and regions have emerged around the world with innovation as their main driving force, and many of them remain the most important innovation centers in the world. From the perspective of innovation ecosystem theory, this paper compares Silicon Valley in the United States, Cambridge in the United Kingdom, Tsukuba in Japan, and Zhongguancun in China to explore the reasons for the success of innovative regions and their respective characteristics, hoping to provide a reference for the development of other innovative cities. The main conclusions of this study are as follows. First, different countries have different social backgrounds, and the development model of an innovative region is closely related to its regional background. Second, both market forces and government power are significant for the development of innovative regions: the influence of government power is great in the early stage of development, while the later stage is dominated by market forces. In addition, the self-organizing ability of a region has a great impact on its innovation capacity; strong self-organizing ability is conducive to the development of an innovation economy.

Keywords: contrastive study, development model, innovation ecosystem, innovative regions

Procedia PDF Downloads 127
37850 Entrepreneurial Support Ecosystem: Role of Research Institutes

Authors: Ayna Yusubova, Bart Clarysse

Abstract:

This paper explores the role of research institutes in the creation of a support ecosystem for new technology-based ventures. While previous literature has introduced research institutes as part of business and knowledge ecosystems, very few studies consider a research institute as an ecosystem that supports high-tech startups at every stage of development. Based on a resource-based view and a stage-based model of high-tech startup growth, this study analyzes how a research institute builds a startup support ecosystem by attracting different stakeholders in order to help startups overcome resource gaps. The paper is based on an in-depth case study of a public research institute that focuses on the development of an entrepreneurial ecosystem in a developed region. The analysis shows, first, that the idea generation stage of high-tech startups, which relates to the invention and development of a product or technology for commercialization, is associated with a lack of critical knowledge resources. Second, at the growth phase, which relates to market entrance, high-tech startups face challenges associated with the development of their business network. Accordingly, the study shows that the support ecosystem a research institute creates helps high-tech startups overcome resource gaps in order to achieve a successful transition from one phase of growth to the next.

Keywords: new technology-based firms, ecosystems, resources, business incubators, research institutes

Procedia PDF Downloads 233
37849 Metabolomics Fingerprinting Analysis of Melastoma malabathricum L. Leaf of Geographical Variation Using HPLC-DAD Combined with Chemometric Tools

Authors: Dian Mayasari, Yosi Bayu Murti, Sylvia Utami Tunjung Pratiwi, Sudarsono

Abstract:

Melastoma malabathricum L. is an Indo-Pacific herb that has traditionally been used to treat several ailments such as wounds, dysentery, diarrhea, toothache, and diabetes. The plant is common across tropical Indo-Pacific archipelagos and is tolerant of a range of soils, from low-lying areas subject to saltwater inundation to the salt-free conditions of mountain slopes. How soil and environmental variation influence secondary metabolite production in the herb, and what this means for the plant's utility as traditional medicine, remain largely unknown and unexplored. The objective of this study is to evaluate the variability of the metabolic profiles of M. malabathricum L. across its geographic distribution. A well-established, simple, sensitive, and reliable method based on high-performance liquid chromatography with diode array detection (HPLC-DAD) was employed to establish the chemical fingerprints of 72 samples of M. malabathricum L. leaves from various geographical locations in Indonesia. Specimens collected from six terrestrial and archipelago regions of Indonesia were analyzed by HPLC to generate chromatogram peak profiles that could be compared across regions. Data corresponding to the common peak areas of the HPLC chromatographic fingerprints were analyzed by hierarchical cluster analysis (HCA) and principal component analysis (PCA) to extract the most significant variables contributing to the characterization and classification of the analyzed samples. The first two principal components, PC1 and PC2, explained 41.14% and 19.32% of the variance, respectively. The validated HPLC fingerprints, grouped by variety and origin, were then used to screen the in vitro antioxidant activity of M. malabathricum L. The results show that the developed method has potential value for assessing the quality of similar M. malabathricum L. samples. 
These findings provide a pathway for the development and utilization of references for the identification of M. malabathricum L. Our results indicate the importance of considering geographic distribution during field-collection efforts, as they demonstrate regional variation in the secondary metabolites of M. malabathricum L., as illustrated by the HPLC chromatogram peaks and their antioxidant activities. The results also confirm the utility of this simple approach for a rapid evaluation of metabolic variation between plants and their potential ethnobotanical properties, potentially due to the environments from which they were collected. This information will facilitate the optimization of growth conditions to suit particular medicinal qualities.
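
The chemometric step described above (PCA and hierarchical clustering on the matrix of common peak areas) can be sketched as follows. The peak-area matrix is simulated (72 samples from 6 regions) since the chromatographic data are not public; the peak count, noise levels, and region offsets are assumptions made for the sketch.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)

# Simulated common-peak areas: 72 leaf samples, 12 from each of 6
# regions, 15 shared chromatogram peaks; each region gets its own
# offset to imitate geographic metabolic variation.
n_regions, per_region, n_peaks = 6, 12, 15
centres = rng.normal(scale=2.0, size=(n_regions, n_peaks))
X = np.vstack([c + rng.normal(scale=0.5, size=(per_region, n_peaks))
               for c in centres])
region = np.repeat(np.arange(n_regions), per_region)

# PCA on autoscaled peak areas
Z = (X - X.mean(axis=0)) / X.std(axis=0)
pc1, pc2 = PCA(n_components=2).fit(Z).explained_variance_ratio_ * 100

# HCA: Ward linkage, cut into six clusters
clusters = fcluster(linkage(Z, method="ward"), t=n_regions,
                    criterion="maxclust")

# fraction of samples whose cluster matches the majority cluster of
# their region (perfect geographic separation gives 1.0)
agree = sum(np.bincount(clusters[region == r]).max()
            for r in range(n_regions)) / len(region)
print(f"PC1 {pc1:.1f}%, PC2 {pc2:.1f}%, cluster/region agreement {agree:.0%}")
```

In the study the analogous score plot and dendrogram are what reveal the regional grouping of the fingerprints.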

Keywords: fingerprint, high performance liquid chromatography, Melastoma malabathricum L., metabolic profiles, principal component analysis

Procedia PDF Downloads 128
37848 Detection of Cardiac Arrhythmia Using Principal Component Analysis and Xgboost Model

Authors: Sujay Kotwale, Ramasubba Reddy M.

Abstract:

The electrocardiogram (ECG) is a non-invasive technique used to study and analyze various heart diseases. Cardiac arrhythmia is a serious heart condition that can lead to the death of the patient when left untreated, so early detection of cardiac arrhythmia would help doctors provide proper treatment. In the past, various algorithms and machine learning (ML) models were used for early detection of cardiac arrhythmia, but few of them achieved good results. In order to improve performance, this paper applies principal component analysis (PCA) together with an XGBoost model. PCA was applied to the raw ECG signals to suppress redundant information and extract significant features. The significant ECG features obtained were fed into the XGBoost model, and the performance of the model was evaluated. In order to validate the proposed technique, raw ECG signals obtained from the standard MIT-BIH database were employed for the analysis. The results show that the performance of the proposed method is superior to several state-of-the-art techniques.
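
The PCA-plus-boosting pipeline can be sketched as below. The sketch uses synthetic "beats" rather than MIT-BIH records (the real data requires downloading from PhysioNet), and scikit-learn's GradientBoostingClassifier stands in for XGBoost; `xgboost.XGBClassifier` drops into the same pipeline. Sample counts and component numbers are assumptions for the sketch.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for windowed ECG beats: 1000 "beats" of 180
# samples each, two classes (normal vs. arrhythmic).
X, y = make_classification(n_samples=1000, n_features=180,
                           n_informative=20, class_sep=2.0,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=0)

# PCA compresses each beat to 20 components, which feed the boosted
# trees; the PCA step is fitted on the training data only.
model = make_pipeline(PCA(n_components=20),
                      GradientBoostingClassifier(random_state=0))
model.fit(X_tr, y_tr)
acc = model.score(X_te, y_te)
print(f"test accuracy: {acc:.3f}")
```

Packing both steps into one pipeline keeps the PCA projection consistent between training and evaluation.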

Keywords: cardiac arrhythmia, electrocardiogram, principal component analysis, XGBoost

Procedia PDF Downloads 89
37847 Design of a Professional Development Framework in Teaching and Learning for Engineering Educators

Authors: Orla McConnell, Cormac MacMahon, Jen Harvey

Abstract:

Ireland’s national professional development framework for those who teach in higher education aims to provide guidance and leadership in planning, developing, and engaging in professional development practices. A series of pilot projects has been initiated to help explore the framework’s likely utility and acceptance by educators and their institutions. These projects require engagement with staff in the interpretation and adaptation of the framework within their working contexts. The purpose of this paper is to outline the development of one such project with engineering educators at three Institutes of Technology seeking designation as a technological university. The initiative aims to gain traction in the acceptance of the framework within the engineering education community by linking core and discipline-specific teaching and learning competencies with the professional development activities most valued by engineering educators. Informed by three strands of literature (professional development in higher education, engineering education, and teaching and learning training provision), the project begins with a survey of all those involved in teaching and learning in engineering across the three institutes. Based on engagement with key stakeholders, subsequent qualitative research informs the contextualization of the national framework for discipline-specific and institutional piloting. The paper concludes by exploring engineering educators’ perceptions of the national framework’s utility based on their engagement with the pilot process. Feedback from the pilot indicates a significant gap between the professional development needs of engineering educators and the current professional development provision in teaching and learning.

Keywords: engineering education, pilot, professional development, teaching and learning

Procedia PDF Downloads 306
37846 Development of MEMS Based 3-Axis Accelerometer for Hand Movement Monitoring

Authors: Zohra Aziz Ali Manjiyani, Renju Thomas Jacob, Keerthan Kumar

Abstract:

This project develops a hand movement monitoring system that feeds data into a computer and rotates a 3D image according to the direction of tilt, thereby monitoring the movement of the hand through its tilt. Advances in MEMS technology have made available very small, low-cost accelerometer ICs based on the capacitive principle. The accelerometer-based tilt sensor ADXL335, built on MEMS technology, is used in this paper. The project emphasizes the development of a MEMS-accelerometer-based measurement of tilt, interfacing the hardware with LabVIEW, and showing the 3D rotation to the user in an understandable form; the tilt data can also be saved on the computer. It provides experience of working with emerging technologies like MEMS and design software like LabVIEW.
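
Once the ADXL335 output voltages have been converted to acceleration in g, tilt follows from the geometry of the gravity vector. A minimal sketch (the axis convention and sign choices are assumptions; the actual LabVIEW implementation is not reproduced here):

```python
import math

def tilt_angles(ax, ay, az):
    """Pitch and roll in degrees from a static 3-axis reading in g,
    using the standard gravity-vector formulas."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Level sensor: gravity acts entirely on the z-axis.
print(tilt_angles(0.0, 0.0, 1.0))
# Tilted 30 degrees about the y-axis: ax picks up -sin(30 deg) of gravity.
print(tilt_angles(-0.5, 0.0, math.sqrt(3) / 2))
```

These two angles are what would drive the 3D image rotation on the display.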

Keywords: MEMS accelerometer, tilt sensor ADXL335, LabVIEW simulation, 3D animation

Procedia PDF Downloads 488
37845 The Simultaneous Effect of Horizontal and Vertical Earthquake Components on the Seismic Response of Buckling-Restrained Braced Frame

Authors: Mahdi Shokrollahi

Abstract:

Over the past years, much research has been conducted on the vulnerability of structures to earthquakes in which only the horizontal components of the earthquake were considered in the seismic analysis, while the vertical earthquake acceleration, especially in near-fault areas, received less attention. Examination of ground motion records shows that the vertical earthquake acceleration can be close to the maximum horizontal earthquake acceleration, and even exceeds it in some cases. This study compares the behavior of different members of three steel moment frames with buckling-restrained braces (BRB), once considering only the horizontal component and again considering the horizontal and vertical components simultaneously, under three near-fault ground motion records, and investigates the effect of the vertical acceleration on structural responses. According to the results, the vertical component of the earthquake has a greater effect on the axial force of the columns and the vertical displacement at the middle of the beams of the different stories, and a smaller effect on the lateral displacement of the stories.

Keywords: vertical earthquake acceleration, near-fault area, steel frame, horizontal and vertical component of earthquake, buckling-restrained brace

Procedia PDF Downloads 153
37844 Vibration Propagation in Structures Through Structural Intensity Analysis

Authors: Takhchi Jamal, Ouisse Morvan, Sadoulet-Reboul Emeline, Bouhaddi Noureddine, Gagliardini Laurent, Bornet Frederic, Lakrad Faouzi

Abstract:

Structural intensity is a technique that indicates both the magnitude and direction of power flow through a structure from the excitation source to the dissipation sink. However, current analysis is limited to the low-frequency range. At medium and high frequencies, a rotational component appears in the field, masking the energy flow and making it difficult or impossible to understand. The objective of this work is to implement a methodology to filter out the rotational components of the structural intensity field in order to fully understand the energy flow in complex structures. The approach is based on the Helmholtz decomposition, which splits the structural intensity field into rotational, irrotational, and harmonic components. Only the irrotational component is needed to describe the net power flow from a source to a dissipative zone in the structure. The methodology has been applied to academic structures, and it allows a good analysis of the energy transfer paths.
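
The core operation, projecting a vector field onto its curl-free part, can be illustrated with a generic FFT-based Helmholtz split on a periodic 2D field. This is a simplification: the paper works with structural intensity fields on real structures, where boundary conditions also produce the harmonic component; on a periodic grid that component vanishes, and the field, grid size, and test potentials below are assumptions for the sketch.

```python
import numpy as np

def helmholtz_2d(vx, vy):
    """Split a periodic 2D vector field into irrotational (curl-free)
    and rotational (divergence-free) parts by projecting onto k in
    Fourier space."""
    ny, nx = vx.shape
    kx = np.fft.fftfreq(nx).reshape(1, nx)
    ky = np.fft.fftfreq(ny).reshape(ny, 1)
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                       # avoid division by zero at k = 0
    Vx, Vy = np.fft.fft2(vx), np.fft.fft2(vy)
    proj = (kx * Vx + ky * Vy) / k2      # component of V along k
    Ix, Iy = kx * proj, ky * proj
    Ix[0, 0] = Iy[0, 0] = 0.0            # drop the mean flow
    ix, iy = np.real(np.fft.ifft2(Ix)), np.real(np.fft.ifft2(Iy))
    return (ix, iy), (vx - ix, vy - iy)

# Known test field on a periodic grid: grad(phi) is irrotational,
# curl(psi z) is rotational; the split should recover grad(phi).
n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x)
gx, gy = np.cos(X) * np.cos(Y), -np.sin(X) * np.sin(Y)   # grad of sinX.cosY
rx = np.cos(2 * X) * np.cos(Y)                           # d(psi)/dy
ry = 2 * np.sin(2 * X) * np.sin(Y)                       # -d(psi)/dx
(ix, iy), _ = helmholtz_2d(gx + rx, gy + ry)
err = max(np.max(np.abs(ix - gx)), np.max(np.abs(iy - gy)))
print(f"max error in recovered irrotational component: {err:.2e}")
```

The recovered irrotational part is the piece that carries the net source-to-sink power flow.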

Keywords: structural intensity, power flow, Helmholtz decomposition, irrotational intensity

Procedia PDF Downloads 143
37843 An Enhanced Approach in Validating Analytical Methods Using Tolerance-Based Design of Experiments (DoE)

Authors: Gule Teri

Abstract:

The effective validation of analytical methods forms a crucial component of pharmaceutical manufacturing. However, traditional validation techniques can occasionally fail to fully account for inherent variations within datasets, which may result in inconsistent outcomes. This deficiency in validation accuracy is particularly noticeable when quantifying low concentrations of active pharmaceutical ingredients (APIs), excipients, or impurities, introducing a risk to the reliability of the results and, subsequently, to the safety and effectiveness of the pharmaceutical products. In response to this challenge, we introduce an enhanced, tolerance-based Design of Experiments (DoE) approach for the validation of analytical methods. This approach measures variability explicitly with reference to tolerance or design margins, enhancing the precision and trustworthiness of the results. It provides a systematic, statistically grounded validation technique that improves the reliability of results and offers an essential tool for industry professionals aiming to guarantee the accuracy of their measurements, particularly for low-concentration components. By incorporating this method, pharmaceutical manufacturers can substantially advance their validation processes and thereby improve the overall quality and safety of their products. This paper details the development, application, and advantages of the tolerance-based DoE approach and demonstrates its effectiveness using High-Performance Liquid Chromatography (HPLC) data for verification. It also discusses the potential implications and future applications of the method in enhancing pharmaceutical manufacturing practices and outcomes.

Keywords: tolerance-based design, design of experiments, analytical method validation, quality control, biopharmaceutical manufacturing

Procedia PDF Downloads 40
37842 GeoWeb at the Service of Household Waste Collection in Urban Areas

Authors: Abdessalam Hijab, Eric Henry, Hafida Boulekbache

Abstract:

The complexity of the city makes sustainable management of the urban environment more difficult. Managers are required to make significant human and technical investments, particularly in household waste collection, the focus of our research. The aim of this communication is to propose a collaborative geographic multi-actor device (MGCD) based on the link between information and communication technologies (ICT) and geo-web tools, in order to involve urban residents in household waste collection processes. Our method is based on a collaborative and motivational relationship between the city and its residents: a geographic collaboration dedicated to the general public (citizens, residents, and any other participant), based on the real-time allocation and geographic location of topological, geographic, and multimedia data in the form of local geo-alerts (location-specific problems) related to household waste in the urban environment. This contribution allows us to understand the extent to which residents can assist and contribute to the development of household waste collection processes for a better-protected urban environment. It also gives a good idea of how residents can contribute to the data bank for future uses. Moreover, it will contribute to transforming residents into smart inhabitants, an essential component of a smart city. The proposed model will be tested in the Lamkansa sampling district of Casablanca, Morocco.

Keywords: information and communication technologies, ICTs, GeoWeb, geo-collaboration, city, inhabitant, waste, collection, environment

Procedia PDF Downloads 92
37841 Developing Community-Based Ecotourism Framework for Sustainability in Kota Kinabalu, Sabah, Malaysia

Authors: Fauziahtion A. G. Samad, Imelda Albert Gisip

Abstract:

Community-based ecotourism (CBET) is one of the most significant components of sustainability in tourism. To achieve the goal of sustainability, the Framework for Sustainable Community Based Ecotourism (FSCBE) was developed from the experience of setting up and implementing community-based ecotourism under the IMPAK program (Community-Based Tourism Development Initiative, Kota Kinabalu City Hall). Desa Cinta Kobuni, located in Inanam, a sub-district of Kota Kinabalu city, was the first project under this program; the goal was to transform the village into a sustainable tourism destination. After five years of the program, three tourism destinations had been established, including Homestay Id Kalangadan and Homestay Darau Wetland. They are still in the growth stage and are now becoming models for other aspiring villages to emulate. There have been three major impacts on the villages: 1) increased secondary income; 2) the advancement of women’s empowerment; and 3) enhanced sustainability initiatives among the villagers. The experience of developing CBET led Kota Kinabalu City Hall to produce the FSCBE, which integrates the Sustainable Development Goals and the Global Sustainable Tourism Criteria (GSTC) for future CBET development in other villages of the city.

Keywords: community-based ecotourism, sustainability, Sabah, Malaysia

Procedia PDF Downloads 12
37840 Technology Futures in Global Militaries: A Forecasting Method Using Abstraction Hierarchies

Authors: Mark Andrew

Abstract:

Geopolitical tensions are at a thirty-year high, and the pace of technological innovation is driving asymmetry in force capabilities between nation states and between non-state actors. Technology futures are a vital component of defence capability growth, and investments in technology futures need to be informed by accurate and reliable forecasts of the options for ‘systems of systems’ innovation, development, and deployment. This paper describes a method for forecasting technology futures developed through an analysis of four key systems’ development stages, namely: technology domain categorisation, scanning results examining novel systems’ signals and signs, potential system-of systems’ implications in warfare theatres, and political ramifications in terms of funding and development priorities. The method has been applied to several technology domains, including physical systems (e.g., nano weapons, loitering munitions, inflight charging, and hypersonic missiles), biological systems (e.g., molecular virus weaponry, genetic engineering, brain-computer interfaces, and trans-human augmentation), and information systems (e.g., sensor technologies supporting situation awareness, cyber-driven social attacks, and goal-specification challenges to proliferation and alliance testing). Although the current application of the method has been team-centred using paper-based rapid prototyping and iteration, the application of autonomous language models (such as GPT-3) is anticipated as a next-stage operating platform. The importance of forecasting accuracy and reliability is considered a vital element in guiding technology development to afford stronger contingencies as ideological changes are forecast to expand threats to ecology and earth systems, possibly eclipsing the traditional vulnerabilities of nation states. The early results from the method will be subjected to ground truthing using longitudinal investigation.

Keywords: forecasting, technology futures, uncertainty, complexity

Procedia PDF Downloads 83
37839 Wood Ashes from Electrostatic Filter as a Replacement for the Fly Ashes in Concrete

Authors: Piotr-Robert Lazik, Harald Garrecht

Abstract:

Many concrete technologists are looking for a replacement for fly ash, a major component of many types of concrete that may become unavailable within a few years. The importance of such a component is clear: it saves cement and reduces the CO2 released into the atmosphere during cement production. Wood ash from an electrostatic filter can be used as a valuable substitute in concrete. The laboratory investigations showed that the wood ash concrete had a compressive strength comparable to coal fly ash concrete. These results indicate that wood ash can be used to manufacture normal concrete.

Keywords: wood ashes, fly ashes, electric filter, replacement, concrete technology

Procedia PDF Downloads 103
37838 Conceptual Model for Logistics Information System

Authors: Ana María Rojas Chaparro, Cristian Camilo Sarmiento Chaves

Abstract:

Given the growing importance of logistics as a discipline for the efficient management of material and information flows, the adoption of tools that facilitate decision making based on a global perspective of the system under study has become essential. The article shows how a concept-based model makes it possible to organize and represent reality appropriately, presenting accurate and timely information, features that make this kind of model an ideal component to support an information system, recognizing that such information is essential to establish the particularities that allow better performance in the sector evaluated.

Keywords: system, information, conceptual model, logistics

Procedia PDF Downloads 464
37837 Mathematical Modeling and Optimization of Burnishing Parameters for 15NiCr6 Steel

Authors: Tarek Litim, Ouahiba Taamallah

Abstract:

The present paper investigates the effect of burnishing on the surface integrity of a component made of 15NiCr6 steel. This work presents a statistical study based on regression, in which Taguchi's design has allowed the development of mathematical models to predict the output responses as a function of the technological parameters studied. The response surface methodology (RSM) showed the simultaneous influence of the burnishing parameters and allowed the optimal processing parameters to be identified. ANOVA of the results validated the prediction models with determination coefficients R = 90.60% and 92.41% for roughness and hardness, respectively. Furthermore, a multi-objective optimization identified a regime characterized by P = 10 kgf, i = 3 passes, and f = 0.074 mm/rev, which favours minimum roughness and maximum hardness. The result was validated by desirability values of D = 0.99 and 0.95 for roughness and hardness, respectively.
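The multi-objective step above rests on a desirability approach. The following is a minimal sketch of Derringer-style desirability functions, assuming illustrative response values and bounds rather than the paper's measured data.

```python
def desirability_minimize(y, y_min, y_max):
    """Desirability for a smaller-is-better response (e.g., roughness)."""
    if y <= y_min:
        return 1.0
    if y >= y_max:
        return 0.0
    return (y_max - y) / (y_max - y_min)

def desirability_maximize(y, y_min, y_max):
    """Desirability for a larger-is-better response (e.g., hardness)."""
    if y >= y_max:
        return 1.0
    if y <= y_min:
        return 0.0
    return (y - y_min) / (y_max - y_min)

def overall_desirability(ds):
    """Composite desirability: geometric mean of the individual values."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Illustrative (hypothetical) responses at one candidate burnishing regime:
d_rough = desirability_minimize(0.45, y_min=0.40, y_max=1.20)  # Ra in um
d_hard = desirability_maximize(410, y_min=300, y_max=420)      # hardness HV
D = overall_desirability([d_rough, d_hard])
print(round(D, 3))  # -> 0.927
```

The regime with the highest composite D is the one retained, which is how a single optimum such as P = 10 kgf, i = 3 passes, f = 0.074 mm/rev can be selected across conflicting objectives.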

Keywords: 15NiCr6 steel, burnishing, surface integrity, Taguchi, RSM, ANOVA

Procedia PDF Downloads 165
37836 Assessment of Social Vulnerability of Urban Population to Floods – a Case Study of Mumbai

Authors: Sherly M. A., Varsha Vijaykumar, Subhankar Karmakar, Terence Chan, Christian Rau

Abstract:

This study aims at proposing an indicator-based framework for assessing the social vulnerability of any coastal megacity to floods. The final set of social vulnerability indicators is chosen from a set of feasible and available indicators prepared in a Geographic Information System (GIS) framework at a 1-km grid cell scale, providing insight into the spatial variability of vulnerability. The optimal weight for each individual indicator is assigned using data envelopment analysis (DEA), as it avoids subjective weighting and improves confidence in the results obtained. In order to de-correlate and reduce the dimension of the multivariate data, principal component analysis (PCA) has been applied. The proposed methodology is demonstrated on twenty-four wards of Mumbai under the jurisdiction of the Municipal Corporation of Greater Mumbai (MCGM). This vulnerability assessment framework is not limited to the present study area and may be applied to other urban damage centers.
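The DEA weighting step can be sketched with the classic input-oriented CCR multiplier model, in which each unit is allowed the weights most favourable to itself; the formulation and the four illustrative units below are generic assumptions, not the authors' exact indicator set or ward data.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """Input-oriented CCR multiplier model. X: (n, m) inputs, Y: (n, s) outputs.
    For each unit j0: maximise u.y0 subject to v.x0 = 1 and
    u.yj - v.xj <= 0 for every unit j. Returns each unit's efficiency."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for j0 in range(n):
        c = np.concatenate([-Y[j0], np.zeros(m)])             # maximise u.y0
        A_ub = np.hstack([Y, -X])                             # u.yj - v.xj <= 0
        b_ub = np.zeros(n)
        A_eq = np.concatenate([np.zeros(s), X[j0]])[None, :]  # v.x0 = 1
        b_eq = np.array([1.0])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=[(0, None)] * (s + m), method="highs")
        scores.append(-res.fun)
    return np.array(scores)

# Four hypothetical units with one input and two output indicators:
X = np.array([[1.0], [1.0], [1.0], [1.0]])
Y = np.array([[4.0, 2.0], [2.0, 4.0], [3.0, 3.0], [1.0, 1.0]])
eff = ccr_efficiency(X, Y)
print(np.round(eff, 3))
```

Because every unit gets its own best weights, no subjective weighting scheme is imposed, which is the property the abstract cites as the reason for choosing DEA.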

Keywords: urban floods, vulnerability, data envelopment analysis, principal component analysis

Procedia PDF Downloads 333
37835 Strategic Partnerships for Sustainable Tourism Development in Papua New Guinea

Authors: Zainab Olabisi Tairu

Abstract:

Strategic partnerships are a core requirement for delivering sustainable tourism development in developing nations like Papua New Guinea. This paper unveils the strategic partnerships for sustainable tourism development in Papua New Guinea. Tourism stakeholders place much emphasis on the importance of strategic partnership and positioning in developing sustainable tourism. Through interviews and observations with tourism stakeholders in Papua New Guinea, the paper examines stakeholders’ ecotourism differentiation and power relations. Collaborative approaches to sustaining the tourism industry, with a milestone of achieved plans, are needed for tourism growth and development. This paper adds new insight to the body of knowledge on stakeholder identification, formation, power relations and an integrated approach to successful tourism development. In order to achieve responsible tourism planning and management outcomes, partnerships must be holistic in perspective and based on sustainable development principles.

Keywords: stakeholders, sustainable tourism, Papua New Guinea, partnerships

Procedia PDF Downloads 624
37834 Effects of Different Meteorological Variables on Reference Evapotranspiration Modeling: Application of Principal Component Analysis

Authors: Akinola Ikudayisi, Josiah Adeyemo

Abstract:

The correct estimation of reference evapotranspiration (ETₒ) is required for effective irrigation water resources planning and management. However, there are several variables that must be considered while estimating and modeling ETₒ. This study therefore performs a multivariate analysis of the correlated variables involved in the estimation and modeling of ETₒ at the Vaalharts irrigation scheme (VIS) in South Africa using the Principal Component Analysis (PCA) technique. Weather and meteorological data between 1994 and 2014 were obtained from both the South African Weather Service (SAWS) and the Agricultural Research Council (ARC) in South Africa for this study. Average monthly data of minimum and maximum temperature (°C), rainfall (mm), relative humidity (%), and wind speed (m/s) were the inputs to the PCA-based model, while ETₒ is the output. The PCA technique was adopted to extract the most important information from the dataset and to analyze the relationship between the five variables and ETₒ, in order to determine the most significant variables affecting ETₒ estimation at VIS. From the model performances, two principal components with a cumulative variance of 82.7% were retained after the eigenvector extraction. The results of the two principal components were compared, and the model output shows that minimum temperature, maximum temperature and wind speed are the most important variables in ETₒ estimation and modeling at VIS. In other words, ETₒ increases with temperature and wind speed. Other variables, such as rainfall and relative humidity, are less important and cannot provide enough information about ETₒ estimation at VIS. The outcome of this study has helped to reduce the input variable dimensionality from five to the three most significant variables in ETₒ modeling at VIS, South Africa.
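The component-retention step described above (keeping the fewest components that explain a target share of variance) can be sketched as follows; the five monthly input series are synthetic stand-ins with an assumed correlation structure, not the VIS records.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 240  # e.g., number of monthly records

# Synthetic stand-ins for the five meteorological inputs (illustrative only):
tmin = rng.normal(12, 4, n)
tmax = tmin + rng.normal(12, 2, n)   # deliberately correlated with tmin
wind = rng.normal(2.5, 0.6, n)
rain = rng.gamma(2.0, 20.0, n)
rh = rng.normal(60, 10, n)
X = np.column_stack([tmin, tmax, rain, rh, wind])

# Mean-centre and scale each variable, then PCA via SVD
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)      # variance ratio per component

# Retain the fewest components covering at least 80% of the variance
k = int(np.searchsorted(np.cumsum(explained), 0.80) + 1)
scores = Z @ Vt[:k].T                # reduced-dimension representation
print(k, scores.shape)
```

The loadings in the retained rows of `Vt` are what indicate which original variables (here, the temperature pair and wind speed) dominate each component.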

Keywords: irrigation, principal component analysis, reference evapotranspiration, Vaalharts

Procedia PDF Downloads 225
37833 Integrating Sustainable Construction Principles into Curriculum Design for Built Environment Professional Programs in Nigeria

Authors: M. Yakubu, M. B. Isah, S. Bako

Abstract:

This paper presents the findings of research that sought to investigate the readiness to integrate sustainable construction principles into curriculum design for built environment professional programs in Nigerian universities. Developing the knowledge and understanding that construction professionals acquire of sustainable construction practice leads to considerable improvement in the environmental performance of the construction sector. Integrating sustainable environmental issues within built environment education curricula provides the basis of this research. An integration of sustainable development principles into the universities' built environment professional programmes is carried out with a view to finding solutions to the key issues identified. The perspectives of academia have been assessed, and the findings tested for validity through the analysis of the primary quantitative data collected. The secondary data generated show that there are significant differences in the approach to curriculum design within built environment professional programmes, revealing that no ‘best practice’ is clearly identifiable. Consequently, this research reveals that engaging all stakeholders would be a useful component of built environment curriculum development, and that the curriculum should be negotiated with interested parties. These parties have been identified as academia, government, the construction industry and built environment professionals.

Keywords: built environment, curriculum development, sustainable construction, sustainable development

Procedia PDF Downloads 389
37832 Kernel-Based Double Nearest Proportion Feature Extraction for Hyperspectral Image Classification

Authors: Hung-Sheng Lin, Cheng-Hsuan Li

Abstract:

Over the past few years, kernel-based algorithms have been widely used to extend some linear feature extraction methods such as principal component analysis (PCA), linear discriminant analysis (LDA), and nonparametric weighted feature extraction (NWFE) to their nonlinear versions, kernel principal component analysis (KPCA), generalized discriminant analysis (GDA), and kernel nonparametric weighted feature extraction (KNWFE), respectively. These nonlinear feature extraction methods can detect nonlinear directions with the largest nonlinear variance or the largest class separability based on the given kernel function. Moreover, they have been applied to improve target detection or image classification of hyperspectral images. The double nearest proportion feature extraction (DNP) can effectively reduce the overlap effect and has good performance in hyperspectral image classification. The DNP structure is an extension of the k-nearest neighbor technique. For each sample, there are two corresponding nearest proportions of samples, the self-class nearest proportion and the other-class nearest proportion. The term “nearest proportion” used here considers both the local information and more global information. With these settings, the effect of the overlap between the sample distributions can be reduced. Usually, the maximum likelihood estimator and the related unbiased estimator are not ideal estimators in high-dimensional inference problems, particularly in small data-size situations. Hence, an improved estimator based on shrinkage estimation (regularization) is proposed. Based on the DNP structure, LDA is included as a special case. In this paper, the kernel method is applied to extend DNP to kernel-based DNP (KDNP). In addition to the advantages of DNP, KDNP surpasses DNP in the experimental results.
According to the experiments on the real hyperspectral image data sets, the classification performance of KDNP is better than that of PCA, LDA, NWFE, and their kernel versions, KPCA, GDA, and KNWFE.
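The kernel extension follows the usual recipe of eigendecomposing a double-centred kernel matrix, the same mechanism behind KPCA, GDA, and KNWFE. A minimal KPCA sketch (with an RBF kernel and a synthetic two-ring data set, both assumptions for illustration, not the paper's hyperspectral data) is:

```python
import numpy as np

def rbf_kernel(X, gamma):
    """Pairwise RBF kernel matrix K[i, j] = exp(-gamma * ||xi - xj||^2)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kernel_pca(X, n_components=2, gamma=1.0):
    """Project X onto the leading nonlinear directions of a centred kernel."""
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one     # double-centre the kernel
    vals, vecs = np.linalg.eigh(Kc)                # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]    # keep the largest ones
    vals, vecs = vals[idx], vecs[:, idx]
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))  # normalise eigenvectors
    return Kc @ alphas                             # training-sample scores

# Two concentric rings: linearly inseparable, but separable after KPCA
rng = np.random.default_rng(1)
t = rng.uniform(0, 2 * np.pi, 200)
r = np.where(np.arange(200) < 100, 1.0, 3.0)
X = np.column_stack([r * np.cos(t), r * np.sin(t)]) + rng.normal(0, 0.05, (200, 2))
scores = kernel_pca(X, n_components=2, gamma=0.5)
print(scores.shape)
```

KDNP replaces the variance objective here with the DNP between-/within-proportion scatter, but the kernel-matrix centring and eigendecomposition machinery is the same.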

Keywords: feature extraction, kernel method, double nearest proportion feature extraction, kernel double nearest proportion feature extraction

Procedia PDF Downloads 299
37831 Buffer Allocation and Traffic Shaping Policies Implemented in Routers Based on a New Adaptive Intelligent Multi Agent Approach

Authors: M. Taheri Tehrani, H. Ajorloo

Abstract:

In this paper, an intelligent multi-agent framework is developed for each router in which agents have two vital functionalities, traffic shaping and buffer allocation, and are positioned in the ports of the routers. With the traffic shaping functionality, agents shape forwarded traffic by dynamic, real-time allocation of the token generation rate in a Token Bucket algorithm; with the buffer allocation functionality, agents share their buffer capacity with each other based on their needs and the conditions of the network. This dynamic and intelligent framework allows some ports to work better under bursty and busier conditions. These agents work intelligently based on a Reinforcement Learning (RL) algorithm and consider effective parameters in their decision process. As RL is limited in how many parameters it can consider in its decision process due to the volume of calculations, we utilize our novel method, which applies Principal Component Analysis (PCA) to the RL inputs and gives the algorithm the high-dimensional ability to consider as many parameters as needed in its decision process. When this implementation is compared to our previous work, where traffic shaping was done without any sharing and dynamic allocation of buffer size for each port, lower packet drop in the whole network, specifically in the source routers, can be seen. These methods are implemented in our previously proposed intelligent simulation environment to allow better comparison of the performance metrics. The results obtained from this simulation environment show an efficient and dynamic utilization of resources in terms of the bandwidth and buffer capacities pre-allocated to each port.
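The Token Bucket shaper that the agents retune can be sketched as follows; the `rate` and `capacity` values are illustrative, and the rate is fixed here, whereas in the proposed framework an agent would adjust the token generation rate on-line.

```python
class TokenBucket:
    """Token-bucket traffic shaper: tokens accrue at `rate` per second up to
    `capacity`; a packet costing `size` tokens is forwarded only if enough
    tokens are available, otherwise it is dropped or queued."""

    def __init__(self, rate, capacity):
        self.rate = float(rate)          # token generation rate (tokens/s)
        self.capacity = float(capacity)  # bucket depth (burst allowance)
        self.tokens = float(capacity)    # start with a full bucket
        self.last = 0.0                  # timestamp of last update (s)

    def allow(self, size, now):
        """Refill for the elapsed time, then try to spend `size` tokens."""
        elapsed = now - self.last
        self.last = now
        self.tokens = min(self.capacity, self.tokens + elapsed * self.rate)
        if self.tokens >= size:
            self.tokens -= size
            return True
        return False

bucket = TokenBucket(rate=100, capacity=200)  # illustrative parameters
print(bucket.allow(150, now=0.0))   # burst fits the initial bucket -> True
print(bucket.allow(100, now=0.1))   # only ~60 tokens remain -> False
print(bucket.allow(100, now=1.0))   # refilled after ~0.9 s -> True
```

Raising `rate` lets a busy port forward more aggressively at the cost of the shared bandwidth budget, which is exactly the trade-off the RL agents arbitrate.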

Keywords: principal component analysis, reinforcement learning, buffer allocation, multi-agent systems

Procedia PDF Downloads 478
37830 A Methodological Approach to Development of Mental Script for Mental Practice of Micro Suturing

Authors: Vaikunthan Rajaratnam

Abstract:

Intro: Motor imagery (MI) and mental practice (MP) can be an alternative way to acquire mastery of surgical skills. One component of this technique is the use of a mental script. The aim of this study was to design and develop a mental script for basic micro suturing training for skill acquisition using a low-fidelity rubber glove model, and to describe the detailed methodology for this process. Methods: This study was based on a design and development research framework. The mental script was developed with 5 expert surgeons performing a cognitive walkthrough of the repair of a vertical opening in a rubber glove model using 8/0 nylon. This was followed by a hierarchical task analysis. A draft script was created, and face and content validity assessed with a checking-back process. The final script was validated with the recruitment of 28 participants, assessed using the Mental Imagery Questionnaire (MIQ). Results: The creation of the mental script is detailed in the full text. After assessment by the expert panel, the mental script had good face and content validity. The average overall MIQ score was 5.2 ± 1.1, demonstrating the validity of generating mental imagery from the mental script developed in this study for micro suturing in the rubber glove model. Conclusion: The methodological approach described in this study is based on an instructional design framework to teach surgical skills. This MP model is inexpensive and easily accessible, addressing the challenge of reduced opportunities to practice surgical skills. However, while motor skills are important, other non-technical expertise required by the surgeon is not addressed with this model. Thus, this model should augment surgical training, not replace it.

Keywords: mental script, motor imagery, cognitive walkthrough, verbal protocol analysis, hierarchical task analysis

Procedia PDF Downloads 76
37829 Prediction of Incompatibility Between Excipients and API in Gliclazide Tablets Using Infrared Spectroscopy and Principal Component Analysis

Authors: Farzad Khajavi

Abstract:

Recognition of the interaction between active pharmaceutical ingredients (API) and excipients is a pivotal factor in the development of all pharmaceutical dosage forms. By predicting the interaction between an API and excipients, we will be able to prevent the advent of impurities or at least lessen their amount. In this study, we used principal component analysis (PCA) to predict the interaction between Gliclazide, a secondary amine, and Lactose in pharmaceutical solid dosage forms. The infrared spectra of binary mixtures of Gliclazide with Lactose at different mole ratios were recorded, and the obtained matrix was analyzed with PCA. By plotting the score columns of the analyzed matrix, the incompatibility between Gliclazide and Lactose was observed. This incompatibility was confirmed experimentally. We observed the appearance of the impurity originating from the Maillard reaction between Gliclazide and Lactose in the chromatogram of the manufactured tablets at room temperature and under accelerated stability conditions. This impurity increased over the stability months. By changing Lactose to Mannitol and using Dibasic Calcium Phosphate in the tablet formulation, the amount of the impurity decreased and was within the acceptance range defined by the British Pharmacopoeia for Gliclazide Tablets. This method is a fast and simple way to predict the existence of incompatibility between excipients and active pharmaceutical ingredients.
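The reasoning behind this kind of spectral screening can be illustrated with a simple additivity check, a deliberately simplified stand-in for the PCA score analysis: if a binary mixture's spectrum is a pure linear combination of the two components' spectra, a least-squares fit leaves only noise, while a new band (such as one from a Maillard reaction product) inflates the residual. The spectra below are synthetic, not measured Gliclazide/Lactose data.

```python
import numpy as np

rng = np.random.default_rng(2)
wav = np.linspace(0, 1, 300)        # arbitrary normalised wavenumber axis

def band(centre, width):
    """Gaussian absorption band on the `wav` axis."""
    return np.exp(-((wav - centre) / width) ** 2)

s_api = band(0.30, 0.03) + 0.6 * band(0.70, 0.05)  # stand-in API spectrum
s_exc = band(0.50, 0.04) + 0.4 * band(0.85, 0.05)  # stand-in excipient spectrum

def mixture(a, b, interaction=0.0):
    """Binary mixture spectrum; `interaction` adds a new band absent from
    both pure components, mimicking a reaction product."""
    noise = rng.normal(0, 0.005, wav.size)
    return a * s_api + b * s_exc + interaction * band(0.60, 0.03) + noise

def fit_residual(spectrum):
    """Least-squares fit to a*s_api + b*s_exc; return the RMS residual."""
    A = np.column_stack([s_api, s_exc])
    coef, *_ = np.linalg.lstsq(A, spectrum, rcond=None)
    return np.sqrt(np.mean((spectrum - A @ coef) ** 2))

r_inert = fit_residual(mixture(0.5, 0.5, interaction=0.0))
r_react = fit_residual(mixture(0.5, 0.5, interaction=0.2))
print(r_react > 3 * r_inert)        # reacting mixture stands out
```

PCA on a matrix of such mixture spectra at several mole ratios exposes the same non-additivity as an extra component in the score plot, which is what the study reads as a sign of incompatibility.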

Keywords: PCA, gliclazide, impurity, infrared spectroscopy, interaction

Procedia PDF Downloads 172
37828 Urban Sustainable Development Based on Habitat Quality Evolution: A Case Study in Chongqing, China

Authors: Jing Ren, Kun Wu

Abstract:

Over the last decade or so, China's urbanization has developed rapidly. At the same time, it has had a great negative impact on habitat quality. Therefore, it is of great significance for sustainable urban development to study the impact of land use change on habitat quality in mountain cities. This paper analyzed the spatial and temporal land use changes in Chongqing from 2010 to 2020 using ArcGIS 10.6, as well as the evolutionary trend of habitat quality during this period based on InVEST 3.13.0, to determine the impact of land use changes on habitat quality. The results showed that habitat quality in the western part of Chongqing decreased significantly between 2010 and 2020, while the northeastern and southeastern parts remained stable. The main reason is the continuous expansion of urban construction land in the western area, which leads to serious habitat fragmentation and a continuous decline in habitat quality. In contrast, in the northeast and southeast areas, where greater emphasis was placed on ecological priority and urban-rural coordination during development, land use change was characterized by a benign transfer, maintaining the urbanization process alongside the coordinated development of habitat quality. This study can provide theoretical support for the sustainable development of mountain cities.
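The per-cell scoring that the InVEST habitat quality model applies can be sketched with its standard half-saturation form, Q = H(1 - D^z / (D^z + k^z)), where H is habitat suitability and D the degradation score; the k and z defaults and the three cell values below are illustrative assumptions, not the Chongqing rasters.

```python
import numpy as np

def habitat_quality(H, D, k=0.5, z=2.5):
    """InVEST-style habitat quality: suitability H in [0, 1] is discounted
    by degradation score D through a half-saturation function
    (k = half-saturation constant, z = scaling exponent)."""
    H = np.asarray(H, dtype=float)
    D = np.asarray(D, dtype=float)
    return H * (1.0 - D**z / (D**z + k**z))

# Illustrative cells: intact forest, forest near expanding construction
# land, and built-up land (H = 0). Not the study's actual raster values.
H = np.array([1.0, 1.0, 0.0])
D = np.array([0.0, 0.5, 0.9])
Q = habitat_quality(H, D)
print(np.round(Q, 3))  # -> [1.  0.5 0. ]
```

Expanding construction land raises D for nearby cells while converting others to H = 0 outright, which is how the westward urban expansion registers as both fragmentation and a decline in mean habitat quality.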

Keywords: mountain cities, ecological environment, habitat quality, sustainable development

Procedia PDF Downloads 42