Search results for: models synthesis
6914 Fast Bayesian Inference of Multivariate Block-Nearest Neighbor Gaussian Process (NNGP) Models for Large Data
Authors: Carlos Gonzales, Zaida Quiroz, Marcos Prates
Abstract:
Several spatial variables collected at the same location that share a common spatial distribution can be modeled simultaneously through a multivariate geostatistical model that takes into account the correlation between these variables and the spatial autocorrelation. The main goal of this model is to perform spatial prediction of these variables in the region of study. Here we focus on a geostatistical multivariate formulation that relies on sharing common spatial random effect terms. In particular, the first response variable can be modeled by a mean that incorporates a shared random spatial effect, while the other response variables depend on this shared spatial term, in addition to specific random spatial effects. Each spatial random effect is defined through a Gaussian process with a valid covariance function, but in order to improve the computational efficiency when the data are large, each Gaussian process is approximated by a Gaussian Markov random field (GMRF), specifically by the block nearest neighbor Gaussian process (Block-NNGP). This approach involves dividing the spatial domain into several dependent blocks under certain constraints, where the cross-blocks capture the spatial dependence on a large scale, while each individual block captures the spatial dependence on a smaller scale. The multivariate geostatistical model belongs to the class of latent Gaussian models; thus, to achieve fast Bayesian inference, the integrated nested Laplace approximation (INLA) method is used. The good performance of the proposed model is shown through simulations and applications to massive data.
Keywords: Block-NNGP, geostatistics, Gaussian process, GMRF, INLA, multivariate models
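A minimal sketch of the blocking step behind a block-NNGP approximation, assuming hypothetical locations, an assumed block count, neighbor count and ordering rule; it is illustrative only and not the authors' implementation, and the INLA fitting itself is not shown.

```python
# Sketch: partition spatial locations into blocks and keep a small set of
# neighboring blocks per block, which induces the sparse GMRF structure.
import numpy as np
from sklearn.cluster import KMeans
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
coords = rng.uniform(0, 1, size=(5000, 2))       # hypothetical observation locations

n_blocks, n_neighbors = 40, 4                    # assumed tuning choices
km = KMeans(n_clusters=n_blocks, n_init=10, random_state=0).fit(coords)
labels, centroids = km.labels_, km.cluster_centers_

# For each block, keep its nearest already-ordered ("past") blocks by centroid
# distance; conditioning only on these neighbor sets is what makes the
# approximation scale to large data.
dist = cdist(centroids, centroids)
neighbor_sets = {}
for b in range(n_blocks):
    order = [j for j in np.argsort(dist[b]) if j < b]
    neighbor_sets[b] = order[:n_neighbors]

print("blocks conditioning block 10:", neighbor_sets[10])
```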
Procedia PDF Downloads 97
6913 Reducing the Imbalance Penalty Through Artificial Intelligence Methods Geothermal Production Forecasting: A Case Study for Turkey
Authors: Hayriye Anıl, Görkem Kar
Abstract:
In addition to being rich in renewable energy resources, Turkey is one of the countries with promising potential in geothermal energy production, given its high installed capacity, low cost, and sustainability. Increasing imbalance penalties become an economic burden for organizations, since geothermal generation plants cannot maintain the balance of supply and demand when the production forecasts submitted to the day-ahead market are inadequate. A better production forecast reduces the imbalance penalties of market participants and leads to a smaller imbalance in the day-ahead market. In this study, using machine learning, deep learning, and time series methods, the total generation of the power plants belonging to Zorlu Natural Electricity Generation, which has a high installed geothermal capacity, was estimated for the first one and two weeks of March; the imbalance penalties were then calculated from these estimates and compared with the real values. These modeling operations were carried out on two datasets: the basic dataset and a dataset created by extracting new features from it through feature engineering. According to the results, Support Vector Regression outperformed the other traditional machine learning models and exhibited the best performance. In addition, the estimates obtained on the feature-engineered dataset showed lower error rates than those on the basic dataset. It is concluded that the imbalance penalty calculated from these estimates for the selected organization is lower than the actual imbalance penalty, yielding optimal and profitable settlement accounts.
Keywords: machine learning, deep learning, time series models, feature engineering, geothermal energy production forecasting
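A hedged sketch of the forecasting-plus-penalty idea: an SVR model with simple lag features and a toy imbalance cost. The synthetic hourly series, feature choices and the penalty rule are assumptions, not Zorlu's data or the Turkish market's actual settlement formula.

```python
import numpy as np
import pandas as pd
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
hours = pd.date_range("2022-01-01", periods=24 * 90, freq="h")
gen = 150 + 10 * np.sin(2 * np.pi * hours.hour / 24) + rng.normal(0, 3, len(hours))
df = pd.DataFrame({"mwh": gen}, index=hours)
df["hour"] = df.index.hour
for lag in (24, 48, 168):                       # simple feature engineering
    df[f"lag_{lag}"] = df["mwh"].shift(lag)
df = df.dropna()

train, test = df.iloc[:-336], df.iloc[-336:]    # hold out the last two weeks
features = ["hour", "lag_24", "lag_48", "lag_168"]

model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1))
model.fit(train[features], train["mwh"])
pred = model.predict(test[features])

# Toy imbalance proxy: a unit penalty price applied to the absolute forecast error.
penalty = 1.03 * np.abs(test["mwh"].to_numpy() - pred).sum()
print(f"two-week imbalance proxy: {penalty:.1f}")
```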
Procedia PDF Downloads 110
6912 Progression of Trauma: Myth Mess Mastery, Addressing and Grooming
Authors: Stuart Bassman
Abstract:
Services that focus on the synthesis of research and clinical practice are vital in providing efficacious change for the men and women who have been victims of childhood sexual abuse. This study will address what processes have been helpful in being a catalyst in changing one’s inner life as well as providing meaningful applications and fulfilling experiences. Initially, we would focus on the Myths regarding childhood sexual abuse. This would include Grooming behaviors and Delayed Disclosures. Subsequently, we would address the Mess that follows from not recognizing the adverse impairments that result from Childhood Sexual Abuse. Finally, we would conclude by looking at the Mastery that could arise from moving from being a Victim to a Survivor and a Thriver.Keywords: trauma, childhood, somatic, treatment
Procedia PDF Downloads 56
6911 The Influence of Infiltration and Exfiltration Processes on Maximum Wave Run-Up: A Field Study on Trinidad Beaches
Authors: Shani Brathwaite, Deborah Villarroel-Lamb
Abstract:
Wave run-up may be defined as the time-varying position of the landward extent of the water’s edge, measured vertically from the mean water level position. The hydrodynamics of the swash zone and the accurate prediction of maximum wave run-up, play a critical role in the study of coastal engineering. The understanding of these processes is necessary for the modeling of sediment transport, beach recovery and the design and maintenance of coastal engineering structures. However, due to the complex nature of the swash zone, there remains a lack of detailed knowledge in this area. Particularly, there has been found to be insufficient consideration of bed porosity and ultimately infiltration/exfiltration processes, in the development of wave run-up models. Theoretically, there should be an inverse relationship between maximum wave run-up and beach porosity. The greater the rate of infiltration during an event, associated with a larger bed porosity, the lower the magnitude of the maximum wave run-up. Additionally, most models have been developed using data collected on North American or Australian beaches and may have limitations when used for operational forecasting in Trinidad. This paper aims to assess the influence and significance of infiltration and exfiltration processes on wave run-up magnitudes within the swash zone. It also seeks to pay particular attention to how well various empirical formulae can predict maximum run-up on contrasting beaches in Trinidad. Traditional surveying techniques will be used to collect wave run-up and cross-sectional data on various beaches. Wave data from wave gauges and wave models will be used as well as porosity measurements collected using a double ring infiltrometer. The relationship between maximum wave run-up and differing physical parameters will be investigated using correlation analyses. These physical parameters comprise wave and beach characteristics such as wave height, wave direction, period, beach slope, the magnitude of wave setup, and beach porosity. Most parameterizations to determine the maximum wave run-up are described using differing parameters and do not always have a good predictive capability. This study seeks to improve the formulation of wave run-up by using the aforementioned parameters to generate a formulation with a special focus on the influence of infiltration/exfiltration processes. This will further contribute to the improvement of the prediction of sediment transport, beach recovery and design of coastal engineering structures in Trinidad.Keywords: beach porosity, empirical models, infiltration, swash, wave run-up
Procedia PDF Downloads 357
6910 Performance Comparison of Deep Convolutional Neural Networks for Binary Classification of Fine-Grained Leaf Images
Authors: Kamal KC, Zhendong Yin, Dasen Li, Zhilu Wu
Abstract:
Intra-plant disease classification based on leaf images is a challenging computer vision task due to similarities in texture, color, and shape of leaves with a slight variation of leaf spot; and external environmental changes such as lighting and background noises. Deep convolutional neural network (DCNN) has proven to be an effective tool for binary classification. In this paper, two methods for binary classification of diseased plant leaves using DCNN are presented; model created from scratch and transfer learning. Our main contribution is a thorough evaluation of 4 networks created from scratch and transfer learning of 5 pre-trained models. Training and testing of these models were performed on a plant leaf images dataset belonging to 16 distinct classes, containing a total of 22,265 images from 8 different plants, consisting of a pair of healthy and diseased leaves. We introduce a deep CNN model, Optimized MobileNet. This model with depthwise separable CNN as a building block attained an average test accuracy of 99.77%. We also present a fine-tuning method by introducing the concept of a convolutional block, which is a collection of different deep neural layers. Fine-tuned models proved to be efficient in terms of accuracy and computational cost. Fine-tuned MobileNet achieved an average test accuracy of 99.89% on 8 pairs of [healthy, diseased] leaf ImageSet.Keywords: deep convolution neural network, depthwise separable convolution, fine-grained classification, MobileNet, plant disease, transfer learning
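A minimal transfer-learning sketch in Keras illustrating the idea of freezing a pre-trained MobileNet backbone and training a binary head. Random tensors stand in for the leaf images, and the layer sizes and epochs are illustrative assumptions rather than the paper's Optimized or fine-tuned MobileNet configuration.

```python
import numpy as np
import tensorflow as tf

x = np.random.rand(64, 224, 224, 3).astype("float32")   # dummy leaf images
y = np.random.randint(0, 2, size=(64, 1))                # healthy vs. diseased labels
train_ds = tf.data.Dataset.from_tensor_slices((x, y)).batch(16)

base = tf.keras.applications.MobileNet(weights="imagenet", include_top=False,
                                       input_shape=(224, 224, 3))
base.trainable = False                                   # freeze the pre-trained backbone

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=1)
```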
Procedia PDF Downloads 186
6909 Transport Related Air Pollution Modeling Using Artificial Neural Network
Authors: K. D. Sharma, M. Parida, S. S. Jain, Anju Saini, V. K. Katiyar
Abstract:
Air quality models form one of the most important components of an urban air quality management plan. Various statistical modeling techniques (regression, multiple regression and time series analysis) have been used to predict air pollution concentrations in the urban environment. These models calculate pollution concentrations due to observed traffic, meteorological and pollution data after an appropriate relationship has been obtained empirically between these parameters. Artificial neural network (ANN) is increasingly used as an alternative tool for modeling the pollutants from vehicular traffic particularly in urban areas. In the present paper, an attempt has been made to model traffic air pollution, specifically CO concentration using neural networks. In case of CO concentration, two scenarios were considered. First, with only classified traffic volume input and the second with both classified traffic volume and meteorological variables. The results showed that CO concentration can be predicted with good accuracy using artificial neural network (ANN).Keywords: air quality management, artificial neural network, meteorological variables, statistical modeling
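An illustrative sketch of the two input scenarios described above using a small feed-forward network; the synthetic traffic and meteorological variables are placeholders for the study's observed data, and the network size is an assumption.

```python
import numpy as np
import pandas as pd
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
n = 1500
data = pd.DataFrame({
    "cars": rng.poisson(800, n), "buses": rng.poisson(60, n),
    "trucks": rng.poisson(120, n), "two_wheelers": rng.poisson(400, n),
    "temperature": rng.normal(25, 5, n), "wind_speed": rng.gamma(2, 1.5, n),
})
data["co_ppm"] = (0.002 * data["cars"] + 0.01 * data["trucks"]
                  - 0.3 * data["wind_speed"] + rng.normal(0, 0.5, n))

traffic = ["cars", "buses", "trucks", "two_wheelers"]
meteo = ["temperature", "wind_speed"]
for cols, label in [(traffic, "traffic only"), (traffic + meteo, "traffic + meteorology")]:
    X_tr, X_te, y_tr, y_te = train_test_split(data[cols], data["co_ppm"],
                                              test_size=0.2, random_state=1)
    ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=3000, random_state=0)
    ann.fit(X_tr, y_tr)
    print(label, "R2:", round(r2_score(y_te, ann.predict(X_te)), 3))
```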
Procedia PDF Downloads 524
6908 Understanding Cyber Kill Chains: Optimal Allocation of Monitoring Resources Using Cooperative Game Theory
Authors: Roy. H. A. Lindelauf
Abstract:
Cyberattacks are complex processes consisting of multiple interwoven tasks conducted by a set of agents. Interdictions and defenses against such attacks often rely on cyber kill chain (CKC) models. A CKC is a framework that tries to capture the actions taken by a cyber attacker. There exists a growing body of literature on CKCs. Most of this work either a) describes the CKC with respect to one or more specific cyberattacks or b) discusses the tools and technologies used by the attacker at each stage of the CKC. Defenders, facing scarce resources, have to decide where to allocate their resources given the CKC and partial knowledge on the tools and techniques attackers use. In this presentation CKCs are analyzed through the lens of covert projects, i.e., interrelated tasks that have to be conducted by agents (human and/or computer) with the aim of going undetected. Various aspects of covert project models have been studied abundantly in the operations research and game theory domain, think of resource-limited interdiction actions that maximally delay completion times of a weapons project for instance. This presentation has investigated both cooperative and non-cooperative game theoretic covert project models and elucidated their relation to CKC modelling. To view a CKC as a covert project each step in the CKC is broken down into tasks and there are players of which each one is capable of executing a subset of the tasks. Additionally, task inter-dependencies are represented by a schedule. Using multi-glove cooperative games it is shown how a defender can optimize the allocation of his scarce resources (what, where and how to monitor) against an attacker scheduling a CKC. This study presents and compares several cooperative game theoretic solution concepts as metrics for assigning resources to the monitoring of agents.Keywords: cyber defense, cyber kill chain, game theory, information warfare techniques
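A toy illustration of the cooperative-game reasoning above: Shapley values for a small glove game, of the kind underlying multi-glove games. The mapping of players to monitoring assets is hypothetical and not the presentation's actual model.

```python
from itertools import permutations

players = ["sensor_A", "sensor_B", "analyst"]     # assumed agents
left, right = {"sensor_A"}, {"sensor_B", "analyst"}

def value(coalition):
    # A coalition earns 1 per matched left/right glove pair it can complete.
    s = set(coalition)
    return min(len(s & left), len(s & right))

shapley = {p: 0.0 for p in players}
orders = list(permutations(players))
for order in orders:
    seen = []
    for p in order:
        shapley[p] += value(seen + [p]) - value(seen)   # marginal contribution
        seen.append(p)
shapley = {p: v / len(orders) for p, v in shapley.items()}
print(shapley)   # higher values suggest where monitoring resources matter most
```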
Procedia PDF Downloads 140
6907 Regression Analysis in Estimating Stream-Flow and the Effect of Hierarchical Clustering Analysis: A Case Study in Euphrates-Tigris Basin
Authors: Goksel Ezgi Guzey, Bihrat Onoz
Abstract:
The scarcity of streamflow gauging stations and the increasing effects of global warming cause designing water management systems to be very difficult. This study is a significant contribution to assessing regional regression models for estimating streamflow. In this study, simulated meteorological data was related to the observed streamflow data from 1971 to 2020 for 33 stream gauging stations of the Euphrates-Tigris Basin. Ordinary least squares regression was used to predict flow for 2020-2100 with the simulated meteorological data. CORDEX- EURO and CORDEX-MENA domains were used with 0.11 and 0.22 grids, respectively, to estimate climate conditions under certain climate scenarios. Twelve meteorological variables simulated by two regional climate models, RCA4 and RegCM4, were used as independent variables in the ordinary least squares regression, where the observed streamflow was the dependent variable. The variability of streamflow was then calculated with 5-6 meteorological variables and watershed characteristics such as area and height prior to the application. Of the regression analysis of 31 stream gauging stations' data, the stations were subjected to a clustering analysis, which grouped the stations in two clusters in terms of their hydrometeorological properties. Two streamflow equations were found for the two clusters of stream gauging stations for every domain and every regional climate model, which increased the efficiency of streamflow estimation by a range of 10-15% for all the models. This study underlines the importance of homogeneity of a region in estimating streamflow not only in terms of the geographical location but also in terms of the meteorological characteristics of that region.Keywords: hydrology, streamflow estimation, climate change, hydrologic modeling, HBV, hydropower
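An illustrative sketch of the cluster-then-regress workflow: group stations by hydrometeorological attributes and fit one regression per cluster. The synthetic station table and variable names stand in for the CORDEX-driven inputs of the study.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import AgglomerativeClustering
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n_stations, n_months = 31, 120
stations = pd.DataFrame({
    "station_id": np.arange(n_stations),
    "area_km2": rng.uniform(200, 5000, n_stations),
    "elevation_m": rng.uniform(300, 2500, n_stations),
    "mean_precip": rng.uniform(30, 120, n_stations),
})
stations["cluster"] = AgglomerativeClustering(n_clusters=2).fit_predict(
    stations[["area_km2", "elevation_m", "mean_precip"]])

records = []
for _, s in stations.iterrows():
    precip = rng.gamma(2, s["mean_precip"] / 2, n_months)
    temp = rng.normal(12, 8, n_months)
    flow = 0.4 * precip * s["area_km2"] / 1000 - 0.5 * temp + rng.normal(0, 5, n_months)
    records.append(pd.DataFrame({"station_id": int(s["station_id"]), "precip": precip,
                                 "temp": temp, "flow": flow}))
obs = pd.concat(records).merge(stations, on="station_id")

for c in (0, 1):                                 # one streamflow equation per cluster
    sub = obs[obs["cluster"] == c]
    X = sub[["precip", "temp", "area_km2", "elevation_m"]]
    ols = LinearRegression().fit(X, sub["flow"])
    print(f"cluster {c}: R2 = {ols.score(X, sub['flow']):.2f}")
```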
Procedia PDF Downloads 129
6906 Text-to-Speech in Azerbaijani Language via Transfer Learning in a Low Resource Environment
Authors: Dzhavidan Zeinalov, Bugra Sen, Firangiz Aslanova
Abstract:
Most text-to-speech models cannot operate well in low-resource languages and require a great amount of high-quality training data to be considered good enough. Yet, with the improvements made in ASR systems, it is now much easier than ever to collect data for the design of custom text-to-speech models. In this work, our work on using the ASR model to collect data to build a viable text-to-speech system for one of the leading financial institutions of Azerbaijan will be outlined. NVIDIA’s implementation of the Tacotron 2 model was utilized along with the HiFiGAN vocoder. As for the training, the model was first trained with high-quality audio data collected from the Internet, then fine-tuned on the bank’s single speaker call center data. The results were then evaluated by 50 different listeners and got a mean opinion score of 4.17, displaying that our method is indeed viable. With this, we have successfully designed the first text-to-speech model in Azerbaijani and publicly shared 12 hours of audiobook data for everyone to use.Keywords: Azerbaijani language, HiFiGAN, Tacotron 2, text-to-speech, transfer learning, whisper
Procedia PDF Downloads 44
6905 Seismic Perimeter Surveillance System (Virtual Fence) for Threat Detection and Characterization Using Multiple ML Based Trained Models in Weighted Ensemble Voting
Authors: Vivek Mahadev, Manoj Kumar, Neelu Mathur, Brahm Dutt Pandey
Abstract:
Perimeter guarding and protection of critical installations require prompt intrusion detection and assessment to take effective countermeasures. Currently, visual and electronic surveillance are the primary methods used for perimeter guarding. These methods can be costly and complicated, requiring careful planning according to the location and terrain. Moreover, these methods often struggle to detect stealthy and camouflaged insurgents. The object of the present work is to devise a surveillance technique using seismic sensors that overcomes the limitations of existing systems. The aim is to improve intrusion detection, assessment, and characterization by utilizing seismic sensors. Most of the similar systems have only two types of intrusion detection capability viz., human or vehicle. In our work we could even categorize further to identify types of intrusion activity such as walking, running, group walking, fence jumping, tunnel digging and vehicular movements. A virtual fence of 60 meters at GCNEP, Bahadurgarh, Haryana, India, was created by installing four underground geophones at a distance of 15 meters each. The signals received from these geophones are then processed to find unique seismic signatures called features. Various feature optimization and selection methodologies, such as LightGBM, Boruta, Random Forest, Logistics, Recursive Feature Elimination, Chi-2 and Pearson Ratio were used to identify the best features for training the machine learning models. The trained models were developed using algorithms such as supervised support vector machine (SVM) classifier, kNN, Decision Tree, Logistic Regression, Naïve Bayes, and Artificial Neural Networks. These models were then used to predict the category of events, employing weighted ensemble voting to analyze and combine their results. The models were trained with 1940 training events and results were evaluated with 831 test events. It was observed that using the weighted ensemble voting increased the efficiency of predictions. In this study we successfully developed and deployed the virtual fence using geophones. Since these sensors are passive, do not radiate any energy and are installed underground, it is impossible for intruders to locate and nullify them. Their flexibility, quick and easy installation, low costs, hidden deployment and unattended surveillance make such systems especially suitable for critical installations and remote facilities with difficult terrain. This work demonstrates the potential of utilizing seismic sensors for creating better perimeter guarding and protection systems using multiple machine learning models in weighted ensemble voting. In this study the virtual fence achieved an intruder detection efficiency of over 97%.Keywords: geophone, seismic perimeter surveillance, machine learning, weighted ensemble method
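An illustrative sketch of the weighted soft-voting ensemble stage; the synthetic feature matrix stands in for features extracted from the geophone signals, and the per-model weights are assumed rather than the deployed configuration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for 2771 labeled seismic events across 6 activity classes.
X, y = make_classification(n_samples=2771, n_features=20, n_informative=12,
                           n_classes=6, n_clusters_per_class=1, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=1)

ensemble = VotingClassifier(
    estimators=[
        ("svm", SVC(probability=True, random_state=0)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("tree", DecisionTreeClassifier(max_depth=8, random_state=0)),
        ("logreg", LogisticRegression(max_iter=2000)),
    ],
    voting="soft",
    weights=[3, 2, 1, 2],          # assumed weights; tuned on validation data in practice
)
ensemble.fit(X_tr, y_tr)
print("held-out accuracy:", round(ensemble.score(X_te, y_te), 3))
```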
Procedia PDF Downloads 78
6904 Lipase-Mediated Formation of Peroxyoctanoic Acid Used in Catalytic Epoxidation of α-Pinene
Authors: N. Wijayati, Kusoro Siadi, Hanny Wijaya, Maggy Thenawijjaja Suhartono
Abstract:
This work describes the lipase-mediated synthesis of α-pinene oxide at ambient temperature. An immobilized lipase from Pseudomonas aeruginosa is used to generate peroxyoctanoic acid directly from octanoic acid and hydrogen peroxide. The peroxy acid formed is then applied for the in situ oxidation of α-pinene. A high conversion of α-pinene to α-pinene oxide (approximately 78%) was achieved when using 0.1 g of immobilized lipase, 6 mmol of H2O2, and 5 mmol of octanoic acid. Various parameters affecting the conversion of α-pinene to α-pinene oxide were studied.
Keywords: α-pinene, P. aeruginosa, octanoic acid
Procedia PDF Downloads 278
6903 Strategic Tools for Entrepreneurship: Model Proposal for Manufacturing Companies
Authors: Chiara Mansanta, Daniela Sani
Abstract:
The present paper presents the further development of the application of a standard methodology to boost innovation inside real case studies of manufacturing companies. The proposed methodology provides a viable solution for manufacturing companies that have to evaluate new business ideas. The study underlined the concept of entrepreneurship and how a manager can use it to promote innovation inside their companies. Starting from a literature study on entrepreneurship, this paper examines the role of the manager in supporting a company’s development. The empirical part of the study is based on two manufacturing companies that used the proposed methodology to favour entrepreneurship through an alternative approach. The research demonstrated the need for companies to have a structured and well-defined methodology to achieve their goals. The purpose of this article is to understand the significance of business models inside companies and explore how they affect business strategy and innovation management. The idea is to use business models to support entrepreneurs in their decision-making processes, reducing risks and avoiding errors.Keywords: entrepreneurship, manufacturing companies, solution validation, strategic management
Procedia PDF Downloads 95
6902 Large-Scale Electroencephalogram Biometrics through Contrastive Learning
Authors: Mostafa ‘Neo’ Mohsenvand, Mohammad Rasool Izadi, Pattie Maes
Abstract:
EEG-based biometrics (user identification) has been explored on small datasets of no more than 157 subjects. Here we show that the accuracy of modern supervised methods falls rapidly as the number of users increases to a few thousand. Moreover, supervised methods require a large amount of labeled data for training which limits their applications in real-world scenarios where acquiring data for training should not take more than a few minutes. We show that using contrastive learning for pre-training, it is possible to maintain high accuracy on a dataset of 2130 subjects while only using a fraction of labels. We compare 5 different self-supervised tasks for pre-training of the encoder where our proposed method achieves the accuracy of 96.4%, improving the baseline supervised models by 22.75% and the competing self-supervised model by 3.93%. We also study the effects of the length of the signal and the number of channels on the accuracy of the user-identification models. Our results reveal that signals from temporal and frontal channels contain more identifying features compared to other channels.Keywords: brainprint, contrastive learning, electroencephalo-gram, self-supervised learning, user identification
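A minimal InfoNCE-style contrastive loss in PyTorch, sketching the kind of pre-training objective discussed above; the encoder, augmentations and channel handling below are placeholders and do not reproduce the study's actual model.

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """z1, z2: (batch, dim) embeddings of two augmented views of the same EEG windows."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature          # cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)     # positive pairs sit on the diagonal

# Toy usage with a placeholder encoder and random "EEG" windows.
encoder = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(64 * 256, 128))
x = torch.randn(32, 64, 256)                    # batch of 64-channel windows
view1 = encoder(x + 0.01 * torch.randn_like(x)) # noise augmentation (assumed)
view2 = encoder(x.flip(-1))                     # time-reversal augmentation (assumed)
loss = info_nce(view1, view2)
loss.backward()
print(float(loss))
```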
Procedia PDF Downloads 157
6901 Simulation of Red Blood Cells in Complex Micro-Tubes
Authors: Ting Ye, Nhan Phan-Thien, Chwee Teck Lim, Lina Peng, Huixin Shi
Abstract:
In biofluid flow systems, often the flow problems of fluids of complex structures, such as the flow of red blood cells (RBCs) through complex capillary vessels, need to be considered. In this paper, we aim to apply a particle-based method, Smoothed Dissipative Particle Dynamics (SDPD), to simulate the motion and deformation of RBCs in complex micro-tubes. We first present the theoretical models, including SDPD model, RBC-fluid interaction model, RBC deformation model, RBC aggregation model, and boundary treatment model. After that, we show the verification and validation of these models, by comparing our numerical results with the theoretical, experimental and previously-published numerical results. Finally, we provide some simulation cases, such as the motion and deformation of RBCs in rectangular, cylinder, curved, bifurcated, and constricted micro-tubes, respectively.Keywords: aggregation, deformation, red blood cell, smoothed dissipative particle dynamics
Procedia PDF Downloads 174
6900 Polypropylene Matrix Enriched With Silver Nanoparticles From Banana Peel Extract For Antimicrobial Control Of E. coli and S. epidermidis To Maintain Fresh Food
Authors: Michail Milas, Aikaterini Dafni Tegiou, Nickolas Rigopoulos, Eustathios Giaouris, Zaharias Loannou
Abstract:
Nanotechnology, a relatively new scientific field, addresses the manipulation of nanoscale materials and devices, which are governed by unique properties, and is applied in a wide range of industries, including food packaging. The incorporation of nanoparticles into polymer matrices used for food packaging is a field that is highly researched today. One such combination is silver nanoparticles with polypropylene. In the present study, the synthesis of the silver nanoparticles was carried out by a natural method. In particular, a ripe banana peel extract was used. This method is superior to others as it stands out for its environmental friendliness, high efficiency and low-cost requirement. In particular, a 1.75 mM AgNO₃ silver nitrate solution was used, as well as a BPE concentration of 1.7% v/v, an incubation period of 48 hours at 70°C and a pH of 4.3 and after its preparation, the polypropylene films were soaked in it. For the PP films, random PP spheres were melted at 170-190°C into molds with 0.8cm diameter. This polymer was chosen as it is suitable for plastic parts and reusable plastic containers of various types that are intended to come into contact with food without compromising its quality and safety. The antimicrobial test against Escherichia coli DFSNB1 and Staphylococcus epidermidis DFSNB4 was performed on the films. It appeared that the films with silver nanoparticles had a reduction, at least 100 times, compared to those without silver nanoparticles, in both strains. The limit of detection is the lower limit of the vertical error lines in the presence of nanoparticles, which is 3.11. The main reasons that led to the adsorption of nanoparticles are the porous nature of polypropylene and the adsorption capacity of nanoparticles on the surface of the films due to hydrophobic-hydrophilic forces. The most significant parameters that contributed to the results of the experiment include the following: the stage of ripening of the banana during the preparation of the plant extract, the temperature and residence time of the nanoparticle solution in the oven, the residence time of the polypropylene films in the nanoparticle solution, the number of nanoparticles inoculated on the films and, finally, the time these stayed in the refrigerator so that they could dry and be ready for antimicrobial treatment.Keywords: antimicrobial control, banana peel extract, E. coli, natural synthesis, microbe, plant extract, polypropylene films, S.epidermidis, silver nano, random pp
Procedia PDF Downloads 176
6899 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models
Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah
Abstract:
In the present work, we develop a technique for estimating the volatility of financial time series by using stochastic differential equation. Taking the daily closing prices from developed and emergent stock markets as the basis, we argue that the incorporation of stochastic volatility into the time-varying parameter estimation significantly improves the forecasting performance via Maximum Likelihood Estimation. While using the technique, we see the long-memory behavior of data sets and one-step-ahead-predicted log-volatility with ±2 standard errors despite the variation of the observed noise from a Normal mixture distribution, because the financial data studied is not fully Gaussian. Also, the Ornstein-Uhlenbeck process followed in this work simulates well the financial time series, which aligns our estimation algorithm with large data sets due to the fact that this algorithm has good convergence properties.Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model
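An illustrative sketch of Ornstein-Uhlenbeck maximum-likelihood estimation using the exact Gaussian transition density; the simulated path below stands in for an observed log-volatility series, and the Normal-mixture noise treatment of the study is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

def ou_negloglik(params, x, dt):
    theta, mu, sigma = params                      # mean reversion, level, vol-of-vol
    m = mu + (x[:-1] - mu) * np.exp(-theta * dt)   # conditional mean
    v = sigma**2 * (1 - np.exp(-2 * theta * dt)) / (2 * theta)  # conditional variance
    return 0.5 * np.sum(np.log(2 * np.pi * v) + (x[1:] - m) ** 2 / v)

rng = np.random.default_rng(1)
dt, n = 1.0 / 252, 1500                            # daily steps, about six years
theta0, mu0, sigma0 = 5.0, -2.0, 0.8               # "true" parameters of the toy path
x = np.empty(n)
x[0] = mu0
for t in range(1, n):                              # Euler simulation of the OU path
    x[t] = x[t-1] + theta0 * (mu0 - x[t-1]) * dt + sigma0 * np.sqrt(dt) * rng.normal()

fit = minimize(ou_negloglik, x0=[1.0, x.mean(), 0.5], args=(x, dt),
               bounds=[(1e-4, None), (None, None), (1e-4, None)])
print("theta, mu, sigma estimates:", np.round(fit.x, 3))
```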
Procedia PDF Downloads 242
6898 Using Structural Equation Modeling to Analyze the Impact of Remote Work on Job Satisfaction
Authors: Florian Pfeffel, Valentin Nickolai, Christian Louis Kühner
Abstract:
Digitalization has disrupted the traditional workplace environment by allowing many employees to work from anywhere at any time. This trend of working from home was further accelerated due to the COVID-19 crisis, which forced companies to rethink their workplace models. While in many companies, this shift happened out of pure necessity; many employees were left more satisfied with their job due to the opportunity to work from home. This study focuses on employees’ job satisfaction in the service sector in dependence on the different work models, which are defined as a “work from home” model, the traditional “work in office” model, and a hybrid model. Using structural equation modeling (SEM), these three work models have been analyzed based on 13 influencing factors on job satisfaction that have been further summarized in the three groups “classic influencing factors”, “influencing factors changed by remote working”, and “new remote working influencing factors”. Based on the influencing factors on job satisfaction, a survey has been conducted with n = 684 employees in the service sector. Cronbach’s alpha of the individual constructs was shown to be suitable. Furthermore, the construct validity of the constructs was confirmed by face validity, content validity, convergent validity (AVE > 0.5: CR > 0.7), and discriminant validity. Additionally, confirmatory factor analysis (CFA) confirmed the model fit for the investigated sample (CMIN/DF: 2.567; CFI: 0.927; RMSEA: 0.048). The SEM-analysis has shown that the most significant influencing factor on job satisfaction is “identification with the work” with β = 0.540, followed by “Appreciation” (β = 0.151), “Compensation” (β = 0.124), “Work-Life-Balance” (β = 0.116), and “Communication and Exchange of Information” (β = 0.105). While the significance of each factor can vary depending on the work model, the SEM-analysis shows that the identification with the work is the most significant factor in all three work models and, in the case of the traditional office work model, it is the only significant influencing factor. The study shows that employees who work entirely remotely or have a hybrid work model are significantly more satisfied with their job, with a job satisfaction score of 5.0 respectively on a scale from 1 (very dissatisfied) to 7 (very satisfied), than employees do not have the option to work from home with a score of 4.6. This comes as a result of the lower identification with the work in the model without any remote working. Furthermore, the responses indicate that it is important to consider the individual preferences of each employee when it comes to the work model to achieve overall higher job satisfaction. Thus, it can be argued that companies can profit off of more motivation and higher productivity by considering the individual work model preferences, therefore, increasing the identification with the respective work.Keywords: home-office, identification with work, job satisfaction, new work, remote work, structural equation modeling
Procedia PDF Downloads 82
6897 Role of NaOH in the Synthesis of Waste-derived Solid Hydroxy Sodalite Catalyst for the Transesterification of Waste Animal Fat to Biodiesel
Authors: Thomas Chinedu Aniokete, Gordian Onyebuchukwu Mbah, Michael Daramola
Abstract:
A sustainable NaOH integrated hydrothermal protocol was developed for the synthesis of waste-derived hydroxy sodalite catalysts for transesterification of waste animal fat (WAF) with a high per cent free fatty acid (FFA) to biodiesel. In this work, hydroxy sodalite catalyst was synthesized from two complex waste materials namely coal fly ash (CFA) and waste industrial brine (WIB). Measured amounts of South African CFA and WIB obtained from a coal mine field were mixed with NaOH solution at different concentrations contained in secured glass vessels equipped with magnetic stirrers and formed consistent slurries after aging condition at 47 oC for 48 h. The slurries were then subjected to hydrothermal treatments at 140 oC for 48 h, washed thoroughly and separated by the action of a centrifuge on the mixture. The resulting catalysts were calcined in a muffle furnace for 2 h at 200 oC and subsequently characterized for different effects using X-ray diffraction (XRD), scanning electron microscopy (SEM), Fourier transform infrared (FT-IR), and Bennett Emmet Teller (BET) adsorption-desorption techniques. The produced animal fat methyl ester (AFME) was analyzed using the gas chromatography-mass spectrometry (GC-MS) method. Results of the investigation indicate profoundly an enhanced catalyst purity, textural property and desired morphology due to the action of NaOH. Similarly, the performance evaluation with respect to catalyst activity reveals a high catalytic conversion efficiency of 98 % of the high FFA WAF to biodiesel under the following reaction conditions; a methanol-to-WAF ratio of 15:1, amount of SOD catalyst of 3 wt % with a stirring speed of 300-500 rpm, a reaction temperature of 60 oC and a reaction time of 8 h. There was a recovered 96 % stable catalyst after reactions and potentially recyclable, thus contributing to the economic savings to the process that had been a major bottleneck to the production of biodiesel. This NaOH route for synthesizing waste-derived hydroxy sodalite (SOD) catalyst is a sustainable and eco-friendly technology that speaks directly to the global quest for renewable-fossil fuel controversy enforcing sustainable development goal 7.Keywords: coal fly ash, waste industrial brine, waste-derived hydroxy sodalite catalyst, sodium hydroxide, biodiesel, transesterification, biomass conversion
Procedia PDF Downloads 34
6896 Estimation of the Drought Index Based on the Climatic Projections of Precipitation of the Uruguay River Basin
Authors: José Leandro Melgar Néris, Claudinéia Brazil, Luciane Teresa Salvi, Isabel Cristina Damin
Abstract:
The impact of climate change is not a recent issue; among its effects on the hydrological cycle, the occurrence and persistence of droughts have a significant impact on the socioeconomic, agricultural and environmental spheres. This study aims to characterize and quantify, based on climatic projections of precipitation, the rainy and dry events in the region of the Uruguay River Basin through the Standardized Precipitation Index (SPI). The database comes from the Coupled Model Intercomparison Project, Phase 5 (CMIP5), which provides climate projection models organized according to the Representative Concentration Pathways (RCPs). Compared with the climatological normals of the Uruguay River Basin, the precipitation projections indicate an increase in seasonal precipitation for all proposed scenarios, although with a weak trend. The data from this research can support further studies, and responsible bodies can use them as a basis for mitigation measures in other hydrographic basins.
Keywords: climate change, climatic model, dry events, precipitation projections
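An illustrative sketch of a Standardized Precipitation Index computation: fit a gamma distribution to accumulated precipitation and map it to standard normal quantiles. The synthetic monthly series and the 3-month accumulation window are assumptions, not the basin's data.

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(5)
months = pd.date_range("1971-01", periods=600, freq="MS")
precip = pd.Series(rng.gamma(shape=2.0, scale=60.0, size=len(months)), index=months)

window = 3                                        # 3-month SPI (assumed choice)
acc = precip.rolling(window).sum().dropna()

zero_frac = (acc == 0).mean()                     # probability mass at zero totals, if any
positive = acc[acc > 0]
shape, loc, scale = stats.gamma.fit(positive, floc=0)

cdf = zero_frac + (1 - zero_frac) * stats.gamma.cdf(acc, shape, loc=loc, scale=scale)
spi = pd.Series(stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6)), index=acc.index)
print(spi.describe())   # SPI below about -1 flags dry events, above +1 wet events
```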
Procedia PDF Downloads 144
6895 The Role of Dialogue in Shared Leadership and Team Innovative Behavior Relationship
Authors: Ander Pomposo
Abstract:
Purpose: The aim of this study was to investigate the impact that dialogue has on the relationship between shared leadership and innovative behavior and the importance of dialogue in innovation. This study wants to contribute to the literature by providing theorists and researchers a better understanding of how to move forward in the studies of moderator variables in the relationship between shared leadership and team outcomes such as innovation. Methodology: A systematic review of the literature, originally adopted from the medical sciences but also used in management and leadership studies, was conducted to synthesize research in a systematic, transparent and reproducible manner. A final sample of 48 empirical studies was scientifically synthesized. Findings: Shared leadership gives a better solution to team management challenges and goes beyond the classical, hierarchical, or vertical leadership models based on the individual leader approach. One of the outcomes that emerge from shared leadership is team innovative behavior. To intensify the relationship between shared leadership and team innovative behavior, and understand when is more effective, the moderating effects of other variables in this relationship should be examined. This synthesis of the empirical studies revealed that dialogue is a moderator variable that has an impact on the relationship between shared leadership and team innovative behavior when leadership is understood as a relational process. Dialogue is an activity between at least two speech partners trying to fulfill a collective goal and is a way of living open to people and ideas through interaction. Dialogue is productive when team members engage relationally with one another. When this happens, participants are more likely to take responsibility for the tasks they are involved and for the relationships they have with others. In this relational engagement, participants are likely to establish high-quality connections with a high degree of generativity. This study suggests that organizations should facilitate the dialogue of team members in shared leadership which has a positive impact on innovation and offers a more adaptive framework for the leadership that is needed in teams working in complex work tasks. These results uncover the necessity of more research on the role that dialogue plays in contributing to important organizational outcomes such as innovation. Case studies describing both best practices and obstacles of dialogue in team innovative behavior are necessary to gain a more detailed insight into the field. It will be interesting to see how all these fields of research evolve and are implemented in dialogue practices in the organizations that use team-based structures to deal with uncertainty, fast-changing environments, globalization and increasingly complex work.Keywords: dialogue, innovation, leadership, shared leadership, team innovative behavior
Procedia PDF Downloads 181
6894 Metabolic Changes during Reprogramming of Wheat and Triticale Microspores
Authors: Natalia Hordynska, Magdalena Szechynska-Hebda, Miroslaw Sobczak, Elzbieta Rozanska, Joanna Troczynska, Zofia Banaszak, Maria Wedzony
Abstract:
Albinism is a common problem encountered in wheat and triticale breeding programs, which require in vitro culture steps e.g. generation of doubled haploids via androgenesis process. Genetic factor is a major determinant of albinism, however, environmental conditions such as temperature and media composition influence the frequency of albino plant formation. Cold incubation of wheat and triticale spikes induced a switch from gametophytic to sporophytic development. Further, androgenic structures formed from anthers of the genotypes susceptible to androgenesis or treated with cold stress, had a pool of structurally primitive plastids, with small starch granules or swollen thylakoids. High temperature was a factor inducing andro-genesis of wheat and triticale, but at the same time, it was a factor favoring the formation of albino plants. In genotypes susceptible to albinism or after heat stress conditions, cells formed from anthers were vacuolated, and plastids were eliminated. Partial or complete loss of chlorophyll pigments and incomplete differentiation of chloroplast membranes result in formation of tissues or whole plant unable to perform photosynthesis. Indeed, susceptibility to the andro-genesis process was associated with an increase of total concentration of photosynthetic pigments in anthers, spikes and regenerated plants. The proper balance of the synthesis of various pigments, was the starting point for their proper incorporation into photosynthetic membranes. In contrast, genotypes resistant to the androgenesis process and those treated with heat, contained 100 times lower content of photosynthetic pigments. In particular, the synthesis of violaxanthin, zeaxanthin, lutein and chlorophyll b was limited. Furthermore, deregulation of starch and lipids synthesis, which led to the formation of very complex starch granules and an increased number of oleosomes, respectively, correlated with the reduction of the efficiency of androgenesis. The content of other sugars varied depending on the genotype and the type of stress. The highest content of various sugars was found for genotypes susceptible to andro-genesis, and highly reduced for genotypes resistant to androgenesis. The most important sugars seem to be glucose and fructose. They are involved in sugar sensing and signaling pathways, which affect the expression of various genes and regulate plant development. Sucrose, on the other hand, seems to have minor effect at each stage of the androgenesis. The sugar metabolism was related to metabolic activity of microspores. The genotypes susceptible to androgenesis process had much faster mitochondrium- and chloroplast-dependent energy conversion and higher heat production by tissues. Thus, the effectiveness of metabolic processes, their balance and the flexibility under the stress was a factor determining the direction of microspore development, and in the later stages of the androgenesis process, a factor supporting the induction of androgenic structures, chloroplast formation and the regeneration of green plants. The work was financed by Ministry of Agriculture and Rural Development within Program: ‘Biological Progress in Plant Production’, project no HOR.hn.802.15.2018.Keywords: androgenesis, chloroplast, metabolism, temperature stress
Procedia PDF Downloads 260
6893 Investigation and Comprehensive Benefit Analysis of 11 Typical Polar-Based Agroforestry Models Based on Analytic Hierarchy Process in Anhui Province, Eastern China
Authors: Zhihua Cao, Hongfei Zhao, Zhongneng Wu
Abstract:
The development of polar-based agroforestry was necessary due to the influence of the timber market environment in China, which can promote the coordinated development of forestry and agriculture, and gain remarkable ecological, economic and social benefits. The main agroforestry models of the main poplar planting area in Huaibei plain and along the Yangtze River plain were carried out. 11 typical management models of poplar were selected to sum up: pure poplar forest, poplar-rape-soybean, poplar-wheat-soybean, poplar-rape-cotton, poplar-wheat, poplar-chicken, poplar-duck, poplar-sheep, poplar-Agaricus blazei, poplar-oil peony, poplar-fish, represented by M0-M10, respectively. 12 indexes related with economic, ecological and social benefits (annual average cost, net income, ratio of output to investment, payback period of investment, land utilization ratio, utilization ratio of light energy, improvement and system stability of ecological and production environment, product richness, labor capacity, cultural quality of labor force, sustainability) were screened out to carry on the comprehensive evaluation and analysis to 11 kinds of typical agroforestry models based on analytic hierarchy process (AHP). The results showed that the economic benefit of each agroforestry model was in the order of: M8 > M6 > M9 > M7 > M5 > M10 > M4 > M1 > M2 > M3 > M0. The economic benefit of poplar-A. blazei model was the highest (332, 800 RMB / hm²), followed by poplar-duck and poplar-oil peony model (109, 820RMB /hm², 5, 7226 RMB /hm²). The order of comprehensive benefit was: M8 > M4 > M9 > M6 > M1 > M2 > M3 > M7 > M5 > M10 > M0. The economic benefit and comprehensive benefit of each agroforestry model were higher than that of pure poplar forest. The comprehensive benefit of poplar-A. blazei model was the highest, and that of poplar-wheat model ranked second, while its economic benefit was not high. Next were poplar-oil peony and poplar-duck models. It was suggested that the model of poplar-wheat should be adopted in the plain along the Yangtze River, and the whole cycle mode of poplar-grain, popalr-A. blazei, or poplar-oil peony should be adopted in Huaibei plain, northern Anhui. Furthermore, wheat, rape, and soybean are the main crops before the stand was closed; the agroforestry model of edible fungus or Chinese herbal medicine can be carried out when the stand was closed in order to maximize the comprehensive benefit. The purpose of this paper is to provide a reference for forest farmers in the selection of poplar agroforestry model in the future and to provide the basic data for the sustainable and efficient study of poplar agroforestry in Anhui province, eastern China.Keywords: agroforestry, analytic hierarchy process (AHP), comprehensive benefit, model, poplar
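An illustrative sketch of the AHP step used to weight benefit criteria: derive priority weights from a pairwise comparison matrix and check its consistency ratio. The 3x3 matrix below (economic vs. ecological vs. social) is purely hypothetical and not the study's judgement matrix.

```python
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])                 # assumed pairwise judgements

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                        # priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)            # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]             # Saaty's random index
print("weights:", weights.round(3), "CR:", round(ci / ri, 3))   # CR < 0.1 is acceptable
```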
Procedia PDF Downloads 165
6892 Decision Support System for the Management of the Shandong Peninsula, China
Authors: Natacha Fery, Guilherme L. Dalledonne, Xiangyang Zheng, Cheng Tang, Roberto Mayerle
Abstract:
A Decision Support System (DSS) for supporting decision makers in the management of the Shandong Peninsula has been developed. Emphasis has been given to coastal protection, coastal cage aquaculture and harbors. The investigations were done in the framework of a joint research project funded by the German Ministry of Education and Research (BMBF) and the Chinese Academy of Sciences (CAS). In this paper, a description of the DSS, the development of its components, and results of its application are presented. The system integrates in-situ measurements, process-based models, and a database management system. Numerical models for the simulation of flow, waves, sediment transport and morphodynamics covering the entire Bohai Sea are set up based on the Delft3D modelling suite (Deltares). Calibration and validation of the models were realized based on the measurements of moored Acoustic Doppler Current Profilers (ADCP) and High Frequency (HF) radars. In order to enable cost-effective and scalable applications, a database management system was developed. It enhances information processing, data evaluation, and supports the generation of data products. Results of the application of the DSS to the management of coastal protection, coastal cage aquaculture and harbors are presented here. Model simulations covering the most severe storms observed during the last decades were carried out leading to an improved understanding of hydrodynamics and morphodynamics. Results helped in the identification of coastal stretches subjected to higher levels of energy and improved support for coastal protection measures.Keywords: coastal protection, decision support system, in-situ measurements, numerical modelling
Procedia PDF Downloads 195
6891 Determination Power and Sample Size Zero-Inflated Negative Binomial Dependent Death Rate of Age Model (ZINBD): Regression Analysis Mortality Acquired Immune Deficiency Syndrome (AIDS)
Authors: Mohd Asrul Affendi Bin Abdullah
Abstract:
Sample size calculation is especially important for zero-inflated models because a large sample size is required to detect a significant effect with this class of model. This paper verifies how to obtain power approximations for categorical covariates and then extends the approach to zero-inflated models. The Wald test was chosen to determine the power and sample size for the AIDS death rate because it is frequently used, is straightforward to apply, and underlies several major recent contributions to sample size calculation for this test. Power calculation can be conducted when covariates are used in modeling the 'excess zero' data, including categorical covariates. An analysis of an AIDS death rate study is used in this paper. The aim of this study is to determine the power for a sample size of N = 945 for the categorical death rate, based on the parameter estimates in the simulation study.
Keywords: power sample size, Wald test, standardized rate, ZINBDR
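A hedged, simulation-based sketch of a power calculation for a Wald test on a covariate effect in a zero-inflated negative binomial model; the effect size, zero-inflation rate and statsmodels settings are assumptions rather than the paper's actual design, and the API details may differ across statsmodels versions.

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(7)

def simulated_power(n=945, beta=0.3, pi_zero=0.4, n_sims=100, z_crit=1.96):
    rejections, fitted = 0, 0
    for _ in range(n_sims):
        group = rng.integers(0, 2, n)                     # binary covariate
        mu = np.exp(0.5 + beta * group)
        counts = rng.negative_binomial(2, 2 / (2 + mu))   # NB counts with mean mu
        counts[rng.random(n) < pi_zero] = 0               # structural zeros
        X = pd.DataFrame({"const": 1.0, "group": group.astype(float)})
        try:
            res = ZeroInflatedNegativeBinomialP(counts, X, exog_infl=X[["const"]]).fit(
                method="bfgs", maxiter=200, disp=0)
            wald_z = res.params["group"] / res.bse["group"]
            rejections += abs(wald_z) > z_crit
            fitted += 1
        except Exception:
            continue                                      # skip fits that fail outright
    return rejections / max(fitted, 1)

print("approximate power:", simulated_power())
```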
Procedia PDF Downloads 435
6890 Design and Synthesis of Copper Doped Zeolite Composite for Antimicrobial Activity and Heavy Metal Removal from Waste Water
Authors: Feleke Terefe Fanta
Abstract:
The existence of heavy metals and microbial contaminants in aquatic system of Akaki river basin, a sub city of Addis Ababa, has become a public concern as human population increases and land development continues. This is because effluents from chemical and pharmaceutical industries are directly discharged onto surrounding land, irrigation fields and surface water bodies. In the present study, we synthesised zeolites and copper- zeolite composite based adsorbent through cost effective and simple approach to mitigate the problem. The study presents determination of heavy metal content and microbial contamination level of waste water sample collected from Akaki river using zeolites and copper- doped zeolites as adsorbents. The synthesis of copper- zeolite X composite was carried out by ion exchange method of copper ions into zeolites frameworks. The optimum amount of copper ions loaded into the zeolites frameworks were studied using the pore size determination concept via iodine test. The copper- loaded zeolites were characterized by X-ray diffraction (XRD). The XRD analysis showed clear difference in phase purity of zeolite before and after copper ion exchange. The concentration of Cd, Cr, and Pb were determined in waste water sample using atomic absorption spectrophotometry. The mean concentrations of Cd, Cr, and Pb in untreated sample were 0.795, 0.654 and 0.7025 mg/L respectively. The concentration of Cd, Cr, and Pb decreased to 0.005, 0.052 and BDL mg/L for sample treated with bare zeolite X while a further decrease in concentration of Cd, Cr, and Pb (0.005, BDL and BDL) mg/L respectively was observed for the sample treated with copper- zeolite composite. The antimicrobial activity was investigated by exposing the total coliform to the Zeolite X and Copper-modified Zeolite X. Zeolite X and Copper-modified Zeolite X showed complete elimination of microbilas after 90 and 50 minutes contact time respectively. This demonstrates effectiveness of copper- zeolite composite as efficient disinfectant. To understand the mode of heavy metals removal and antimicrobial activity of the copper-loaded zeolites; the adsorbent dose, contact time, temperature was studied. Overall, the results obtained in this study showed high antimicrobial disinfection and heavy metal removal efficiencies of the synthesized adsorbent.Keywords: waste water, copper doped zeolite x, adsorption heavy metal, disinfection
Procedia PDF Downloads 82
6889 Technology Adoption Models: A Study on Brick Kiln Firms in Punjab
Authors: Ajay Kumar, Shamily Jaggi
Abstract:
In developing countries like India development of modern technologies has been a key determinant in accelerating industrialization and urbanization. But in the pursuit of rapid economic growth, development is considered a top priority, while environmental protection is not given the same importance. Thus, a number of industries sited haphazardly have been established, leading to a deterioration of natural resources like water, soil and air. As a result, environmental pollution is tremendously increasing due to industrialization and mechanization that are serving to fulfill the demands of the population. With the increasing population, demand for bricks for construction work is also increasing, establishing the brick industry as a growing industry. Brick production requires two main resources; water as a source of life, and soil, as a living environment. Water and soil conservation is a critical issue in areas facing scarcity of water and soil resources. The purpose of this review paper is to provide a brief overview of the theoretical frameworks used in the analysis of the adoption and/or acceptance of soil and water conservation practices in the brick industry. Different frameworks and models have been used in the analysis of the adoption and/or acceptance of new technologies and practices; these include the technology acceptance model, motivational model, theory of reasoned action, innovation diffusion theory, theory of planned behavior, and the unified theory of acceptance and use of technology. However, every model has some limitations, such as not considering environmental/contextual and economic factors that may affect the individual’s intention to perform a behavior. The paper concludes that in comparing other models, the UTAUT seems a better model for understanding the dynamics of acceptance and adoption of water and soil conservation practices.Keywords: brick kiln, water conservation, soil conservation, unified theory of acceptance and use of technology, technology adoption
Procedia PDF Downloads 104
6888 Efficient Layout-Aware Pretraining for Multimodal Form Understanding
Authors: Armineh Nourbakhsh, Sameena Shah, Carolyn Rose
Abstract:
Layout-aware language models have been used to create multimodal representations for documents that are in image form, achieving relatively high accuracy in document understanding tasks. However, the large number of parameters in the resulting models makes building and using them prohibitive without access to high-performing processing units with large memory capacity. We propose an alternative approach that can create efficient representations without the need for a neural visual backbone. This leads to an 80% reduction in the number of parameters compared to the smallest SOTA model, greatly expanding applicability. In addition, our layout embeddings are pre-trained on spatial and visual cues alone and only fused with text embeddings in downstream tasks, which can facilitate applicability to low-resource or multi-lingual domains. Despite using 2.5% of the training data, we show competitive performance on two form understanding tasks: semantic labeling and link prediction.
Keywords: layout understanding, form understanding, multimodal document understanding, bias-augmented attention
Procedia PDF Downloads 148
6887 Competitors’ Influence Analysis of a Retailer by Using Customer Value and Huff’s Gravity Model
Authors: Yepeng Cheng, Yasuhiko Morimoto
Abstract:
Customer relationship analysis is vital for retail stores, especially for supermarkets. The point of sale (POS) systems make it possible to record the daily purchasing behaviors of customers as an identification point of sale (ID-POS) database, which can be used to analyze customer behaviors of a supermarket. The customer value is an indicator based on ID-POS database for detecting the customer loyalty of a store. In general, there are many supermarkets in a city, and other nearby competitor supermarkets significantly affect the customer value of customers of a supermarket. However, it is impossible to get detailed ID-POS databases of competitor supermarkets. This study firstly focused on the customer value and distance between a customer's home and supermarkets in a city, and then constructed the models based on logistic regression analysis to analyze correlations between distance and purchasing behaviors only from a POS database of a supermarket chain. During the modeling process, there are three primary problems existed, including the incomparable problem of customer values, the multicollinearity problem among customer value and distance data, and the number of valid partial regression coefficients. The improved customer value, Huff’s gravity model, and inverse attractiveness frequency are considered to solve these problems. This paper presents three types of models based on these three methods for loyal customer classification and competitors’ influence analysis. In numerical experiments, all types of models are useful for loyal customer classification. The type of model, including all three methods, is the most superior one for evaluating the influence of the other nearby supermarkets on customers' purchasing of a supermarket chain from the viewpoint of valid partial regression coefficients and accuracy.Keywords: customer value, Huff's Gravity Model, POS, Retailer
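An illustrative sketch of the Huff-type attractiveness/distance trade-off referenced above: the probability that a customer patronises each store, given store attractiveness and home-to-store distance. The attractiveness values, distances and exponents are assumed, not the study's estimates.

```python
import numpy as np

attractiveness = np.array([1200.0, 800.0, 1500.0])   # e.g. floor space of 3 stores
distances_km = np.array([[0.8, 2.5, 3.1],            # one row per customer home
                         [2.2, 0.6, 4.0],
                         [1.5, 1.8, 0.9]])

alpha, beta = 1.0, 2.0                                # assumed Huff exponents
utility = attractiveness**alpha / distances_km**beta
huff_prob = utility / utility.sum(axis=1, keepdims=True)
print(huff_prob.round(3))   # each row sums to 1: store-choice probabilities per customer
```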
Procedia PDF Downloads 123
6886 Advancing Urban Sustainability through Data-Driven Machine Learning Solutions
Authors: Nasim Eslamirad, Mahdi Rasoulinezhad, Francesco De Luca, Sadok Ben Yahia, Kimmo Sakari Lylykangas, Francesco Pilla
Abstract:
With the ongoing urbanization, cities face increasing environmental challenges impacting human well-being. To tackle these issues, data-driven approaches in urban analysis have gained prominence, leveraging urban data to promote sustainability. Integrating Machine Learning techniques enables researchers to analyze and predict complex environmental phenomena like Urban Heat Island occurrences in urban areas. This paper demonstrates the implementation of data-driven approach and interpretable Machine Learning algorithms with interpretability techniques to conduct comprehensive data analyses for sustainable urban design. The developed framework and algorithms are demonstrated for Tallinn, Estonia to develop sustainable urban strategies to mitigate urban heat waves. Geospatial data, preprocessed and labeled with UHI levels, are used to train various ML models, with Logistic Regression emerging as the best-performing model based on evaluation metrics to derive a mathematical equation representing the area with UHI or without UHI effects, providing insights into UHI occurrences based on buildings and urban features. The derived formula highlights the importance of building volume, height, area, and shape length to create an urban environment with UHI impact. The data-driven approach and derived equation inform mitigation strategies and sustainable urban development in Tallinn and offer valuable guidance for other locations with varying climates.Keywords: data-driven approach, machine learning transparent models, interpretable machine learning models, urban heat island effect
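An illustrative sketch of the final modelling step: fit a logistic regression on building features labelled with UHI occurrence and read off the fitted equation. The synthetic grid cells and coefficients below stand in for the Tallinn dataset and are not the study's results.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
n = 4000
df = pd.DataFrame({
    "building_volume": rng.gamma(2, 4000, n),
    "building_height": rng.gamma(2, 6, n),
    "building_area": rng.gamma(2, 900, n),
    "shape_length": rng.gamma(2, 120, n),
})
# Synthetic labels: UHI more likely for larger, taller building stock.
logit = 0.0002 * df["building_volume"] + 0.05 * df["building_height"] - 4
df["uhi_label"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

features = ["building_volume", "building_height", "building_area", "shape_length"]
clf = LogisticRegression(max_iter=5000).fit(df[features], df["uhi_label"])

terms = " + ".join(f"{c:.5f}*{f}" for c, f in zip(clf.coef_[0], features))
print(f"log-odds(UHI) = {clf.intercept_[0]:.3f} + {terms}")
print("training accuracy:", round(clf.score(df[features], df["uhi_label"]), 3))
```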
Procedia PDF Downloads 37
6885 Variable Mapping: From Bibliometrics to Implications
Authors: Przemysław Tomczyk, Dagmara Plata-Alf, Piotr Kwiatek
Abstract:
Literature review is indispensable in research. One of the key techniques used in it is bibliometric analysis, where one of the methods is science mapping. The classic approach that dominates today in this area consists of mapping areas, keywords, terms, authors, or citations. This approach is also used in relation to the review of literature in the field of marketing. The development of technology has resulted in the fact that researchers and practitioners use the capabilities of software available on the market for this purpose. The use of science mapping software tools (e.g., VOSviewer, SciMAT, Pajek) in recent publications involves the implementation of a literature review, and it is useful in areas with a relatively high number of publications. Despite this well-grounded science mapping approach having been applied in the literature reviews, performing them is a painstaking task, especially if authors would like to draw precise conclusions about the studied literature and uncover potential research gaps. The aim of this article is to identify to what extent a new approach to science mapping, variable mapping, takes advantage of the classic science mapping approach in terms of research problem formulation and content/thematic analysis for literature reviews. To perform the analysis, a set of 5 articles on customer ideation was chosen. Next, the analysis of key words mapping results in VOSviewer science mapping software was performed and compared with the variable map prepared manually on the same articles. Seven independent expert judges (management scientists on different levels of expertise) assessed the usability of both the stage of formulating, the research problem, and content/thematic analysis. The results show the advantage of variable mapping in the formulation of the research problem and thematic/content analysis. First, the ability to identify a research gap is clearly visible due to the transparent and comprehensive analysis of the relationships between the variables, not only keywords. Second, the analysis of relationships between variables enables the creation of a story with an indication of the directions of relationships between variables. Demonstrating the advantage of the new approach over the classic one may be a significant step towards developing a new approach to the synthesis of literature and its reviews. Variable mapping seems to allow scientists to build clear and effective models presenting the scientific achievements of a chosen research area in one simple map. Additionally, the development of the software enabling the automation of the variable mapping process on large data sets may be a breakthrough change in the field of conducting literature research.Keywords: bibliometrics, literature review, science mapping, variable mapping
Procedia PDF Downloads 120