Search results for: activity-based benefit assessment approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19253

Search results for: activity-based benefit assessment approach

15023 Economic Benefit of Wild Animals: A Possible Threat to Conservation in Ovia Southwest, Edo State, Nigeria

Authors: B. G. Oguntuase, M. O. Olofinsae

Abstract:

This study was carried out to assess the contribution of bush meat to Edo people’s livelihood and the consequence of utilization on conservation. Five markets were selected in Ovia Southwest local government area of Edo State, twenty bush meat sellers were selected from each market. Direct observations were made to document the composition of wild animals under sale in the study area. A total of one hundred questionnaires were administered to the respondents. The questionnaires were all retrieved and analyzed using descriptive analysis. The results show that thirteen animal species are being traded in the area. The price for the animal species (whole animal) ranged from N200 to N9,520. Respondents reported that there is a decline in the animal population over time. Between 64% and 95% of the respondents acknowledged population decline in seven of the thirteen animal species available for sale compared to what it used to be some ten years ago. Sales of wild animal species could be regarded as a profitable business in the rural community, supporting livelihood of the community, but could have devastating effect on conservation as already observed in this study if harvesting of wild animals is not regulated on controlled or sustainable basis.

Keywords: conservation, economic benefits, hunting, population, wild animals

Procedia PDF Downloads 457
15022 Unmanned Systems in Urban Areas

Authors: Abdullah Beyazkurk, Onur Ozdemir

Abstract:

The evolution of warfare has been affected from technological developments to a large extent. Another important factor that affected the evolution of warfare is the space. Technological developments became cornerstones for the organization of the forces on the field, while space of the battlefield gained importance with the introduction of urban areas as 'battlefields'. The use of urban areas as battlefields increased the casualty, while technological developments began to play a remedial role. Thus, the unmanned systems drew attention as the remedy. Today's widely used unmanned aerial vehicles have great effects on the operations. On the other hand, with the increasing urbanization, and the wide use of urban areas as battlefields make it a necessity to benefit from unmanned systems on the ground as well. This study focuses on the use of unmanned aerial systems as well as unmanned ground systems in urban warfare, with regards to their performance and cost affectivity. The study defends that the use of unmanned vehicles will be remedial for increasing casualty rates, while their precision and superhuman capacity will manifest the performance advantage. The findings of this study will help modern armies focus on unmanned systems, especially for the urban, anti-terror, or counter insurgency operations.

Keywords: technology, warfare, urban warfare, unmanned systems, unmanned ground vehicles, unmanned aerial vehicles

Procedia PDF Downloads 340
15021 New Approach for Load Modeling

Authors: Slim Chokri

Abstract:

Load forecasting is one of the central functions in power systems operations. Electricity cannot be stored, which means that for electric utility, the estimate of the future demand is necessary in managing the production and purchasing in an economically reasonable way. A majority of the recently reported approaches are based on neural network. The attraction of the methods lies in the assumption that neural networks are able to learn properties of the load. However, the development of the methods is not finished, and the lack of comparative results on different model variations is a problem. This paper presents a new approach in order to predict the Tunisia daily peak load. The proposed method employs a computational intelligence scheme based on the Fuzzy neural network (FNN) and support vector regression (SVR). Experimental results obtained indicate that our proposed FNN-SVR technique gives significantly good prediction accuracy compared to some classical techniques.

Keywords: neural network, load forecasting, fuzzy inference, machine learning, fuzzy modeling and rule extraction, support vector regression

Procedia PDF Downloads 426
15020 Development of Industry Sector Specific Factory Standards

Authors: Peter Burggräf, Moritz Krunke, Hanno Voet

Abstract:

Due to shortening product and technology lifecycles, many companies use standardization approaches in product development and factory planning to reduce costs and time to market. Unlike large companies, where modular systems are already widely used, small and medium-sized companies often show a much lower degree of standardization due to lower scale effects and missing capacities for the development of these standards. To overcome these challenges, the development of industry sector specific standards in cooperations or by third parties is an interesting approach. This paper analyzes which branches that are mainly dominated by small or medium-sized companies might be especially interesting for the development of factory standards using the example of the German industry. For this, a key performance indicator based approach was developed that will be presented in detail with its specific results for the German industry structure.

Keywords: factory planning, factory standards, industry sector specific standardization, production planning

Procedia PDF Downloads 385
15019 The Spatial Pattern of Economic Rents of an Airport Development Area: Lessons Learned from the Suvarnabhumi International Airport, Thailand

Authors: C. Bejrananda, Y. Lee, T. Khamkaew

Abstract:

With the rise of the importance of air transportation in the 21st century, the role of economics in airport planning and decision-making has become more important to the urban structure and land value around it. Therefore, this research aims to examine the relationship between an airport and its impacts on the distribution of urban land uses and land values by applying the Alonso’s bid rent model. The New Bangkok International Airport (Suvarnabhumi International Airport) was taken as a case study. The analysis was made over three different time periods of airport development (after the airport site was proposed, during airport construction, and after the opening of the airport). The statistical results confirm that Alonso’s model can be used to explain the impacts of the new airport only for the northeast quadrant of the airport, while proximity to the airport showed the inverse relationship with the land value of all six types of land use activities through three periods of time. It indicates that the land value for commercial land use is the most sensitive to the location of the airport or has the strongest requirement for accessibility to the airport compared to the residential and manufacturing land use. Also, the bid-rent gradients of the six types of land use activities have declined dramatically through the three time periods because of the Asian Financial Crisis in 1997. Therefore, the lesson learned from this research concerns about the reliability of the data used. The major concern involves the use of different areal units for assessing land value for different time periods between zone block (1995) and grid block (2002, 2009). As a result, this affect the investigation of the overall trends of land value assessment, which are not readily apparent. In addition, the next concern is the availability of the historical data. With the lack of collecting historical data for land value assessment by the government, some of data of land values and aerial photos are not available to cover the entire study area. Finally, the different formats of using aerial photos between hard-copy (1995) and digital photo (2002, 2009) made difficult for measuring distances. Therefore, these problems also affect the accuracy of the results of the statistical analyses.

Keywords: airport development area, economic rents, spatial pattern, suvarnabhumi international airport

Procedia PDF Downloads 269
15018 Design Optimization of Miniature Mechanical Drive Systems Using Tolerance Analysis Approach

Authors: Eric Mxolisi Mkhondo

Abstract:

Geometrical deviations and interaction of mechanical parts influences the performance of miniature systems.These deviations tend to cause costly problems during assembly due to imperfections of components, which are invisible to a naked eye.They also tend to cause unsatisfactory performance during operation due to deformation cause by environmental conditions.One of the effective tools to manage the deviations and interaction of parts in the system is tolerance analysis.This is a quantitative tool for predicting the tolerance variations which are defined during the design process.Traditional tolerance analysis assumes that the assembly is static and the deviations come from the manufacturing discrepancies, overlooking the functionality of the whole system and deformation of parts due to effect of environmental conditions. This paper presents an integrated tolerance analysis approach for miniature system in operation.In this approach, a computer-aided design (CAD) model is developed from system’s specification.The CAD model is then used to specify the geometrical and dimensional tolerance limits (upper and lower limits) that vary component’s geometries and sizes while conforming to functional requirements.Worst-case tolerances are analyzed to determine the influenced of dimensional changes due to effects of operating temperatures.The method is used to evaluate the nominal conditions, and worse case conditions in maximum and minimum dimensions of assembled components.These three conditions will be evaluated under specific operating temperatures (-40°C,-18°C, 4°C, 26°C, 48°C, and 70°C). A case study on the mechanism of a zoom lens system is used to illustrate the effectiveness of the methodology.

Keywords: geometric dimensioning, tolerance analysis, worst-case analysis, zoom lens mechanism

Procedia PDF Downloads 158
15017 Clustered Regularly Interspaced Short Palindromic Repeats Interference (CRISPRi): An Approach to Inhibit Microbial Biofilm

Authors: Azna Zuberi

Abstract:

Biofilm is a sessile bacterial accretion in which bacteria adapts different physiological and morphological behavior from planktonic form. It is the root cause of about 80% microbial infections in human. Among them, E. coli biofilms are most prevalent in medical devices associated nosocomial infections. The objective of this study was to inhibit biofilm formation by targeting LuxS gene, involved in quorum sensing using CRISPRi. luxS is a synthase, involved in the synthesis of Autoinducer-2(AI-2), which in turn guides the initial stage of biofilm formation. To implement CRISPRi system, we have synthesized complementary sgRNA to target gene sequence and co-expressed with dCas9. Suppression of luxS was confirmed through qRT-PCR. The effect of luxS gene on biofilm inhibition was studied through crystal violet assay, XTT reduction assay and scanning electron microscopy. We conclude that CRISPRi system could be a potential strategy to inhibit bacterial biofilm through mechanism base approach.

Keywords: biofilm, CRISPRi, luxS, microbial

Procedia PDF Downloads 173
15016 An Algorithm of Set-Based Particle Swarm Optimization with Status Memory for Traveling Salesman Problem

Authors: Takahiro Hino, Michiharu Maeda

Abstract:

Particle swarm optimization (PSO) is an optimization approach that achieves the social model of bird flocking and fish schooling. PSO works in continuous space and can solve continuous optimization problem with high quality. Set-based particle swarm optimization (SPSO) functions in discrete space by using a set. SPSO can solve combinatorial optimization problem with high quality and is successful to apply to the large-scale problem. In this paper, we present an algorithm of SPSO with status memory to decide the position based on the previous position for solving traveling salesman problem (TSP). In order to show the effectiveness of our approach. We examine SPSOSM for TSP compared to the existing algorithms.

Keywords: combinatorial optimization problems, particle swarm optimization, set-based particle swarm optimization, traveling salesman problem

Procedia PDF Downloads 538
15015 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotations, expression studies, personalized treatment and precision medicine. However, this rapid growth in sequence data poses a great challenge which calls for novel data processing and analytic methods, as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on $k$-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isoloates. Cluster analysis showed that k-mers maybe used to discriminate phenotypes and the discrimination becomes more concise as the size of k-mers increase. The best performing classification model had a k-mer size of 10 (longest k-mer) an accuracy, recall, precision, specificity, and Matthews Correlation coeffient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4 respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biological explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources and explainability of classification results. However, the analysis provides a new way to elucidate genetic information from genomic data, and identify phenotype relationships which are important especially in explaining complex biological mechanisms.

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 157
15014 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotations, expression studies, personalized treatment and precision medicine. However, this rapid growth in sequence data poses a great challenge which calls for novel data processing and analytic methods, as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isoloates. Cluster analysis showed that k-mers maybe used to discriminate phenotypes and the discrimination becomes more concise as the size of k-mers increase. The best performing classification model had a k-mer size of 10 (longest k-mer) an accuracy, recall, precision, specificity, and Matthews Correlation coeffient of 72.0 %, 80.5 %, 80.5 %, 63.6 %, and 0.4 respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biological explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources and explainability of classification results. However, the analysis provides a new way to elucidate genetic information from genomic data, and identify phenotype relationships which are important especially in explaining complex biological mechanisms

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 144
15013 Looking for a Connection between Oceanic Regions with Trends in Evaporation with Continental Ones with Trends in Precipitation through a Lagrangian Approach

Authors: Raquel Nieto, Marta Vázquez, Anita Drumond, Luis Gimeno

Abstract:

One of the hot spots of climate change is the increment of ocean evaporation. The best estimation of evaporation, OAFlux data, shows strong increasing trends in evaporation from the oceans since 1978, with peaks during the hemispheric winter and strongest along the paths of the global western boundary currents and any inner Seas. The transport of moisture from oceanic sources to the continents is the connection between evaporation from the ocean and precipitation over the continents. A key question is to try to relate evaporative source regions over the oceans where trends have occurred in the last decades with their sinks over the continents to check if there have been also any trends in the precipitation amount or its characteristics. A Lagrangian approach based on FLEXPART and ERA-interim data is used to establish this connection. The analyzed period was 1980 to 2012. Results show that there is not a general pattern, but a significant agreement was found in important areas of climate interest.

Keywords: ocean evaporation, Lagrangian approaches, contiental precipitation, Europe

Procedia PDF Downloads 247
15012 Constructivism and Situational Analysis as Background for Researching Complex Phenomena: Example of Inclusion

Authors: Radim Sip, Denisa Denglerova

Abstract:

It’s impossible to capture complex phenomena, such as inclusion, with reductionism. The most common form of reductionism is the objectivist approach, where processes and relationships are reduced to entities and clearly outlined phases, with a consequent search for relationships between them. Constructivism as a paradigm and situational analysis as a methodological research portfolio represent a way to avoid the dominant objectivist approach. They work with a situation, i.e. with the essential blending of actors and their environment. Primary transactions are taking place between actors and their surroundings. Researchers create constructs based on their need to solve a problem. Concepts therefore do not describe reality, but rather a complex of real needs in relation to the available options how such needs can be met. For examination of a complex problem, corresponding methodological tools and overall design of the research are necessary. Using an original research on inclusion in the Czech Republic as an example, this contribution demonstrates that inclusion is not a substance easily described, but rather a relationship field changing its forms in response to its actors’ behaviour and current circumstances. Inclusion consists of dynamic relationship between an ideal, real circumstances and ways to achieve such ideal under the given circumstances. Such achievement has many shapes and thus cannot be captured by description of objects. It can be expressed in relationships in the situation defined by time and space. Situational analysis offers tools to examine such phenomena. It understands a situation as a complex of dynamically changing aspects and prefers relationships and positions in the given situation over a clear and final definition of actors, entities, etc. Situational analysis assumes creation of constructs as a tool for solving a problem at hand. It emphasizes the meanings that arise in the process of coordinating human actions, and the discourses through which these meanings are negotiated. Finally, it offers “cartographic tools” (situational maps, socials worlds / arenas maps, positional maps) that are able to capture the complexity in other than linear-analytical ways. This approach allows for inclusion to be described as a complex of phenomena taking place with a certain historical preference, a complex that can be overlooked if analyzed with a more traditional approach.

Keywords: constructivism, situational analysis, objective realism, reductionism, inclusion

Procedia PDF Downloads 140
15011 Quality Improvement Template for Undergraduate Nursing Education Curriculum Review and Analysis

Authors: Jennifer Stephens, Nichole Parker, Kristin Petrovic

Abstract:

To gain a better understanding of how students enrolled in a Bachelor of Nursing (BN) program are educated, faculty members in the BN program at Athabasca University (AU) in Alberta, Canada, developed a 3-phase comprehensive curriculum review project. Phase one of this review centered around hiring an external curriculum expert to examine and analyze the current curriculum and to propose recommendations focused on identifying gaps as well as building on strengths towards meeting changing health care trends. Phase two incorporated extensive institutional document analysis as well as qualitative and quantitative data collection in reciprocated critical reflection and has yielded insights into valuable processes, challenges, and solutions inherent to the complexities of undertaking curriculum review and analysis. Results of our phase one and two analysis generated a quality improvement (QI) template that could benefit other nursing education programs engaged in curriculum review and analysis. The key processes, lessons, and insights, as well as future project phase three plans, will be presented for iterative discussion and role modelling for other institutions undergoing, or planning, content-based curriculum review and evaluation.

Keywords: curriculum, education, nursing, nursing faculty practice, quality improvement

Procedia PDF Downloads 134
15010 Radiation Protection and Licensing for an Experimental Fusion Facility: The Italian and European Approaches

Authors: S. Sandri, G. M. Contessa, C. Poggi

Abstract:

An experimental nuclear fusion device could be seen as a step toward the development of the future nuclear fusion power plant. If compared with other possible solutions to the energy problem, nuclear fusion has advantages that ensure sustainability and security. In particular considering the radioactivity and the radioactive waste produced, in a nuclear fusion plant the component materials could be selected in order to limit the decay period, making it possible the recycling in a new reactor after about 100 years from the beginning of the decommissioning. To achieve this and other pertinent goals many experimental machines have been developed and operated worldwide in the last decades, underlining that radiation protection and workers exposure are critical aspects of these facilities due to the high flux, high energy neutrons produced in the fusion reactions. Direct radiation, material activation, tritium diffusion and other related issues pose a real challenge to the demonstration that these devices are safer than the nuclear fission facilities. In Italy, a limited number of fusion facilities have been constructed and operated since 30 years ago, mainly at the ENEA Frascati Center, and the radiation protection approach, addressed by the national licensing requirements, shows that it is not always easy to respect the constraints for the workers' exposure to ionizing radiation. In the current analysis, the main radiation protection issues encountered in the Italian Fusion facilities are considered and discussed, and the technical and legal requirements are described. The licensing process for these kinds of devices is outlined and compared with that of other European countries. The following aspects are considered throughout the current study: i) description of the installation, plant and systems, ii) suitability of the area, buildings, and structures, iii) radioprotection structures and organization, iv) exposure of personnel, v) accident analysis and relevant radiological consequences, vi) radioactive wastes assessment and management. In conclusion, the analysis points out the needing of a special attention to the radiological exposure of the workers in order to demonstrate at least the same level of safety as that reached at the nuclear fission facilities.

Keywords: fusion facilities, high energy neutrons, licensing process, radiation protection

Procedia PDF Downloads 345
15009 Recommendations for Environmental Impact Assessment of Geothermal Projects on Mature Oil Fields

Authors: Daria Karasalihovic Sedlar, Lucija Jukic, Ivan Smajla, Marija Macenic

Abstract:

This paper analyses possible geothermal energy production from a mature oil reservoir based on exploitation of underlying aquifer thermal energy for the purpose of heating public buildings. Research was conducted based on the case study of the City of Ivanic-Grad public buildings energy demand and Ivanic oil filed that is situated in the same area. Since the City of Ivanic is one of the few cities in the EU where hydrocarbon exploitation has been taking place for decades almost entirely in urban area, decommissioning of oil wells is inevitable; therefore, the research goal was to investigate how to extend the life-time of the reservoir by exploiting geothermal brine beneath the oil reservoir in an environmental friendly manner. This kind of a project is extremely complex in all segments, from documentation preparation, implementation of technological solutions, and providing ecological measures for environmentally acceptable geothermal energy production and utilization. New mining activities that will be needed for the development of geothermal project at the observed Hydrocarbon Exploitation Field Ivanic will be carried out in order to prepare wells for increasing geothermal brine production. These operations involve the conversion of existing wells (well completion for conversion of the observation wells to production ones) along with workover activities, installation of new heat exchangers, and pipelines. Since the wells are in the urban area of the City of Ivanic-Grad in high density populated area, the inhabitants will be exposed to the different environmental impacts during preparation phase of the project. For the purpose of performing workovers, it will be necessary to secure access to wellheads of existing wells. This paper gives guidelines for describing potential impacts on environment components that could occur during geothermal production preparation on existing mature oil filed, recommends possible protection measures to mitigate these impacts, and gives recommendations for environmental monitoring.

Keywords: geothermal energy production, mature oil filed, environmental impact assessment, underlying aquifer thermal energy

Procedia PDF Downloads 139
15008 Virtualization of Biomass Colonization: Potential of Application in Precision Medicine

Authors: Maria Valeria De Bonis, Gianpaolo Ruocco

Abstract:

Nowadays, computational modeling is paving new design and verification ways in a number of industrial sectors. The technology is ripe to challenge some case in the Bioengineering and Medicine frameworks: for example, looking at the strategical and ethical importance of oncology research, efforts should be made to yield new and powerful resources to tumor knowledge and understanding. With these driving motivations, we approach this gigantic problem by using some standard engineering tools such as the mathematics behind the biomass transfer. We present here some bacterial colonization studies in complex structures. As strong analogies hold with some tumor proliferation, we extend our study to a benchmark case of solid tumor. By means of a commercial software, we model biomass and energy evolution in arbitrary media. The approach will be useful to cast virtualization cases of cancer growth in human organs, while augmented reality tools will be used to yield for a realistic aid to informed decision in treatment and surgery.

Keywords: bacteria, simulation, tumor, precision medicine

Procedia PDF Downloads 328
15007 Learning Dynamic Representations of Nodes in Temporally Variant Graphs

Authors: Sandra Mitrovic, Gaurav Singh

Abstract:

In many industries, including telecommunications, churn prediction has been a topic of active research. A lot of attention has been drawn on devising the most informative features, and this area of research has gained even more focus with spread of (social) network analytics. The call detail records (CDRs) have been used to construct customer networks and extract potentially useful features. However, to the best of our knowledge, no studies including network features have yet proposed a generic way of representing network information. Instead, ad-hoc and dataset dependent solutions have been suggested. In this work, we build upon a recently presented method (node2vec) to obtain representations for nodes in observed network. The proposed approach is generic and applicable to any network and domain. Unlike node2vec, which assumes a static network, we consider a dynamic and time-evolving network. To account for this, we propose an approach that constructs the feature representation of each node by generating its node2vec representations at different timestamps, concatenating them and finally compressing using an auto-encoder-like method in order to retain reasonably long and informative feature vectors. We test the proposed method on churn prediction task in telco domain. To predict churners at timestamp ts+1, we construct training and testing datasets consisting of feature vectors from time intervals [t1, ts-1] and [t2, ts] respectively, and use traditional supervised classification models like SVM and Logistic Regression. Observed results show the effectiveness of proposed approach as compared to ad-hoc feature selection based approaches and static node2vec.

Keywords: churn prediction, dynamic networks, node2vec, auto-encoders

Procedia PDF Downloads 304
15006 Water Footprint for the Palm Oil Industry in Malaysia

Authors: Vijaya Subramaniam, Loh Soh Kheang, Astimar Abdul Aziz

Abstract:

Water footprint (WFP) has gained importance due to the increase in water scarcity in the world. This study analyses the WFP for an agriculture sector, i.e., the oil palm supply chain, which produces oil palm fresh fruit bunch (FFB), crude palm oil, palm kernel, and crude palm kernel oil. The water accounting and vulnerability evaluation (WAVE) method was used. This method analyses the water depletion index (WDI) based on the local blue water scarcity. The main contribution towards the WFP at the plantation was the production of FFB from the crop itself at 0.23m³/tonne FFB. At the mill, the burden shifts to the water added during the process, which consists of the boiler and process water, which accounted for 6.91m³/tonne crude palm oil. There was a 33% reduction in the WFP when there was no dilution or water addition after the screw press at the mill. When allocation was performed, the WFP reduced by 42% as the burden was shared with the palm kernel and palm kernel shell. At the kernel crushing plant (KCP), the main contributor towards the WFP 4.96 m³/tonne crude palm kernel oil which came from the palm kernel which carried the burden from upstream followed by electricity, 0.33 m³/tonne crude palm kernel oil used for the process and 0.08 m³/tonne crude palm kernel oil for transportation of the palm kernel. A comparison was carried out for mills with biogas capture versus no biogas capture, and the WFP had no difference for both scenarios. The comparison when the KCPs operate in the proximity of mills as compared to those operating in the proximity of ports only gave a reduction of 6% for the WFP. Both these scenarios showed no difference and insignificant difference, which differed from previous life cycle assessment studies on the carbon footprint, which showed significant differences. This shows that findings change when only certain impact categories are focused on. It can be concluded that the impact from the water used by the oil palm tree is low due to the practice of no irrigation at the plantations and the high availability of water from rainfall in Malaysia. This reiterates the importance of planting oil palm trees in regions with high rainfall all year long, like the tropics. The milling stage had the most significant impact on the WFP. Mills should avoid dilution to reduce this impact.

Keywords: life cycle assessment, water footprint, crude palm oil, crude palm kernel oil, WAVE method

Procedia PDF Downloads 157
15005 Bayesian Networks Scoping the Climate Change Impact on Winter Wheat Freezing Injury Disasters in Hebei Province, China

Authors: Xiping Wang,Shuran Yao, Liqin Dai

Abstract:

Many studies report the winter is getting warmer and the minimum air temperature is obviously rising as the important climate warming evidences. The exacerbated air temperature fluctuation tending to bring more severe weather variation is another important consequence of recent climate change which induced more disasters to crop growth in quite a certain regions. Hebei Province is an important winter wheat growing province in North of China that recently endures more winter freezing injury influencing the local winter wheat crop management. A winter wheat freezing injury assessment Bayesian Network framework was established for the objectives of estimating, assessing and predicting winter wheat freezing disasters in Hebei Province. In this framework, the freezing disasters was classified as three severity degrees (SI) among all the three types of freezing, i.e., freezing caused by severe cold in anytime in the winter, long extremely cold duration in the winter and freeze-after-thaw in early season after winter. The factors influencing winter wheat freezing SI include time of freezing occurrence, growth status of seedlings, soil moisture, winter wheat variety, the longitude of target region and, the most variable climate factors. The climate factors included in this framework are daily mean and range of air temperature, extreme minimum temperature and number of days during a severe cold weather process, the number of days with the temperature lower than the critical temperature values, accumulated negative temperature in a potential freezing event. The Bayesian Network model was evaluated using actual weather data and crop records at selected sites in Hebei Province using real data. With the multi-stage influences from the various factors, the forecast and assessment of the event-based target variables, freezing injury occurrence and its damage to winter wheat production, were shown better scoped by Bayesian Network model.

Keywords: bayesian networks, climatic change, freezing Injury, winter wheat

Procedia PDF Downloads 397
15004 Influential Parameters in Estimating Soil Properties from Cone Penetrating Test: An Artificial Neural Network Study

Authors: Ahmed G. Mahgoub, Dahlia H. Hafez, Mostafa A. Abu Kiefa

Abstract:

The Cone Penetration Test (CPT) is a common in-situ test which generally investigates a much greater volume of soil more quickly than possible from sampling and laboratory tests. Therefore, it has the potential to realize both cost savings and assessment of soil properties rapidly and continuously. The principle objective of this paper is to demonstrate the feasibility and efficiency of using artificial neural networks (ANNs) to predict the soil angle of internal friction (Φ) and the soil modulus of elasticity (E) from CPT results considering the uncertainties and non-linearities of the soil. In addition, ANNs are used to study the influence of different parameters and recommend which parameters should be included as input parameters to improve the prediction. Neural networks discover relationships in the input data sets through the iterative presentation of the data and intrinsic mapping characteristics of neural topologies. General Regression Neural Network (GRNN) is one of the powerful neural network architectures which is utilized in this study. A large amount of field and experimental data including CPT results, plate load tests, direct shear box, grain size distribution and calculated data of overburden pressure was obtained from a large project in the United Arab Emirates. This data was used for the training and the validation of the neural network. A comparison was made between the obtained results from the ANN's approach, and some common traditional correlations that predict Φ and E from CPT results with respect to the actual results of the collected data. The results show that the ANN is a very powerful tool. Very good agreement was obtained between estimated results from ANN and actual measured results with comparison to other correlations available in the literature. The study recommends some easily available parameters that should be included in the estimation of the soil properties to improve the prediction models. It is shown that the use of friction ration in the estimation of Φ and the use of fines content in the estimation of E considerable improve the prediction models.

Keywords: angle of internal friction, cone penetrating test, general regression neural network, soil modulus of elasticity

Procedia PDF Downloads 409
15003 Production of New Hadron States in Effective Field Theory

Authors: Qi Wu, Dian-Yong Chen, Feng-Kun Guo, Gang Li

Abstract:

In the past decade, a growing number of new hadron states have been observed, which are dubbed as XYZ states in the heavy quarkonium mass regions. In this work, we present our study on the production of some new hadron states. In particular, we investigate the processes Υ(5S,6S)→ Zb (10610)/Zb (10650)π, Bc→ Zc (3900)/Zc (4020)π and Λb→ Pc (4312)/Pc (4440)/Pc (4457)K. (1) For the production of Zb (10610)/Zb (10650) from Υ(5S,6S) decay, two types of bottom-meson loops were discussed within a nonrelativistic effective field theory. We found that the loop contributions with all intermediate states being the S-wave ground state bottom mesons are negligible, while the loops with one bottom meson being the broad B₀* or B₁' resonance could provide the dominant contributions to the Υ(5S)→ Zb⁽'⁾ π. (2) For the production of Zc (3900)/Zc (4020) from Bc decay, the branching ratios of Bc⁺→ Z (3900)⁺ π⁰ and Bc⁺→ Zc (4020)⁺ π⁰ are estimated to be of order of 10⁽⁻⁴⁾ and 10⁽⁻⁷⁾ in an effective Lagrangian approach. The large production rate of Zc (3900) could provide an important source of the production of Zc (3900) from the semi-exclusive decay of b-flavored hadrons reported by D0 Collaboration, which can be tested by the exclusive measurements in LHCb. (3) For the production of Pc (4312), Pc (4440) and Pc (4457) from Λb decay, the ratio of the branching fraction of Λb→ Pc K was predicted in a molecular scenario by using an effective Lagrangian approach, which is weakly dependent on our model parameter. We also find the ratios of the productions of the branching fractions of Λb→ Pc K and Pc→ J/ψ p can be well interpreted in the molecular scenario. Moreover, the estimated branching fractions of Λb→ Pc K are of order 10⁽⁻⁶⁾, which could be tested by further measurements in LHCb Collaboration.

Keywords: effective Lagrangian approach, hadron loops, molecular states, new hadron states

Procedia PDF Downloads 121
15002 Relay Node Placement for Connectivity Restoration in Wireless Sensor Networks Using Genetic Algorithms

Authors: Hanieh Tarbiat Khosrowshahi, Mojtaba Shakeri

Abstract:

Wireless Sensor Networks (WSNs) consist of a set of sensor nodes with limited capability. WSNs may suffer from multiple node failures when they are exposed to harsh environments such as military zones or disaster locations and lose connectivity by getting partitioned into disjoint segments. Relay nodes (RNs) are alternatively introduced to restore connectivity. They cost more than sensors as they benefit from mobility, more power and more transmission range, enforcing a minimum number of them to be used. This paper addresses the problem of RN placement in a multiple disjoint network by developing a genetic algorithm (GA). The problem is reintroduced as the Steiner tree problem (which is known to be an NP-hard problem) by the aim of finding the minimum number of Steiner points where RNs are to be placed for restoring connectivity. An upper bound to the number of RNs is first computed to set up the length of initial chromosomes. The GA algorithm then iteratively reduces the number of RNs and determines their location at the same time. Experimental results indicate that the proposed GA is capable of establishing network connectivity using a reasonable number of RNs compared to the best existing work.

Keywords: connectivity restoration, genetic algorithms, multiple-node failure, relay nodes, wireless sensor networks

Procedia PDF Downloads 226
15001 The Case for Strategic Participation: How Facilitated Engagement Can Be Shown to Reduce Resistance and Improve Outcomes Through the Use of Strategic Models

Authors: Tony Mann

Abstract:

This paper sets out the case for involving and engaging employees/workers/stakeholders/staff in any significant change that is being considered by the senior executives of the organization. It establishes the rationale, the approach, the methodology of engagement and the benefits of a participative approach. It challenges the new norm of imposing change for fear of resistance and instead suggests that involving people has better outcomes and a longer-lasting impact. Various strategic models are introduced and illustrated to explain how the process can be most effective. The paper highlights one model in particular (the Process Iceberg® Organizational Change model) that has proven to be instrumental in developing effective change. Its use is demonstrated in its various forms and explains why so much change fails to address the key elements and how we can be more productive in managing change. ‘Participation’ in change is too often seen as negative, expensive and unwieldy. The paper aims to show that another model: UIA=O+E, can offset the difficulties and, in fact, produce much more positive and effective change.

Keywords: facilitation, stakeholders, buy-in, digital workshops

Procedia PDF Downloads 92
15000 In-service High School Teachers’ Experiences On Blended Teaching Approach Of Mathematics

Authors: Lukholo Raxangana

Abstract:

Fourth Industrial Revolution (4IR)-era teaching offers in-service mathematics teachers opportunities to use blended approaches to engage learners while teaching mathematics. This study explores in-service high school teachers' experiences with a blended teaching approach to mathematics. This qualitative case study involved eight pre-service teachers from four selected schools in the Sedibeng West District of the Gauteng Province. The study used the community of inquiry model as its analytical framework for data analysis. Data collection was through semi-structured interviews and focus-group discussions to explore in-service teachers' experiences with the influence of blended teaching (BT) on learning mathematics. The study results are the impact of load-shedding, benefits of BT, and perceptions of in-service and hindrances of BT. Based on these findings, the study recommends that further research should focus on developing data-free BT tools to assist during load-shedding, regardless of location.

Keywords: bended teaching, teachers, in-service, and mathematics

Procedia PDF Downloads 50
14999 Usage of Military Spending, Debt Servicing and Growth for Dealing with Emergency Plan of Indian External Debt

Authors: Sahbi Farhani

Abstract:

This study investigates the relationship between external debt and military spending in case of India over the period of 1970–2012. In doing so, we have applied the structural break unit root tests to examine stationarity properties of the variables. The Auto-Regressive Distributed Lag (ARDL) bounds testing approach is used to test whether cointegration exists in presence of structural breaks stemming in the series. Our results indicate the cointegration among external debt, military spending, debt servicing, and economic growth. Moreover, military spending and debt servicing add in external debt. Economic growth helps in lowering external debt. The Vector Error Correction Model (VECM) analysis and Granger causality test reveal that military spending and economic growth cause external debt. The feedback effect also exists between external debt and debt servicing in case of India.

Keywords: external debt, military spending, ARDL approach, India

Procedia PDF Downloads 284
14998 An Investigation into the Current Implementation of Design-Build Contracts in the Kingdom of Saudi Arabia

Authors: Ibrahim A. Alhammad, Suleiman A. Al-Otaibi, Khalid S. Al-Gahtani, Naïf Al-Otaibi, Abdulaziz A. Bubshait

Abstract:

In the last decade, the use of project delivery system of design build engineering contracts is increasing in North America due to the reasons of reducing the project duration and minimizing costs. The shift from traditional approach of Design-Bid-Build to Design-Build contracts have been attributed to many factors such as evolution of the regulatory and legal frameworks governing the engineering contracts and improvement in integrating design and construction. The aforementioned practice of contracting is more appropriate in North America; yet, it may not be the case in Saudi Arabia where the traditional approach of construction contracting remains dominant. The authors believe there are number of factors related to the gaps in the level of sophistication of the engineering and management of the construction projects in both countries. A step towards improving the Saudi construction practice by adopting the new trend of construction contracting, this paper identifies the reasons why Design/Build form of contracting are not frequently utilized. A field survey, which includes the questionnaire addressing the research problem, is distributed to three main parties of the construction contracts: clients, consultants, and contractors. The analyzed collected data were statistically sufficient to finding the reasons of not adopting the new trend of good practice of deign build approach in Saudi Arabia. In addition, the reasons are: (1) lack of regulation and legal framework; (2) absence of clear criteria of the owner for the trade-off between competing contractors, (3) and lack of experience, knowledge and skill.

Keywords: design built projects, Saudi Arabia, GCC, mega projects

Procedia PDF Downloads 212
14997 An Agent-Based Approach to Examine Interactions of Firms for Investment Revival

Authors: Ichiro Takahashi

Abstract:

One conundrum that macroeconomic theory faces is to explain how an economy can revive from depression, in which the aggregate demand has fallen substantially below its productive capacity. This paper examines an autonomous stabilizing mechanism using an agent-based Wicksell-Keynes macroeconomic model. This paper focuses on the effects of the number of firms and the length of the gestation period for investment that are often assumed to be one in a mainstream macroeconomic model. The simulations found the virtual economy was highly unstable, or more precisely, collapsing when these parameters are fixed at one. This finding may even suggest us to question the legitimacy of these common assumptions. A perpetual decline in capital stock will eventually encourage investment if the capital stock is short-lived because an inactive investment will result in insufficient productive capacity. However, for an economy characterized by a roundabout production method, a gradual decline in productive capacity may not be able to fall below the aggregate demand that is also shrinking. Naturally, one would then ask if our economy cannot rely on an external stimulus such as population growth and technological progress to revive investment, what factors would provide such a buoyancy for stimulating investments? The current paper attempts to answer this question by employing the artificial macroeconomic model mentioned above. The baseline model has the following three features: (1) the multi-period gestation for investment, (2) a large number of heterogeneous firms, (3) demand-constrained firms. The instability is a consequence of the following dynamic interactions. (a) A multiple-period gestation period means that once a firm starts a new investment, it continues to invest over some subsequent periods. During these gestation periods, the excess demand created by the investing firm will spill over to ignite new investment of other firms that are supplying investment goods: the presence of multi-period gestation for investment provides a field for investment interactions. Conversely, the excess demand for investment goods tends to fade away before it develops into a full-fledged boom if the gestation period of investment is short. (b) A strong demand in the goods market tends to raise the price level, thereby lowering real wages. This reduction of real wages creates two opposing effects on the aggregate demand through the following two channels: (1) a reduction in the real labor income, and (2) an increase in the labor demand due to the principle of equality between the marginal labor productivity and real wage (referred as the Walrasian labor demand). If there is only a single firm, a lower real wage will increase its Walrasian labor demand, thereby an actual labor demand tends to be determined by the derived labor demand. Thus, the second positive effect would not work effectively. In contrast, for an economy with a large number of firms, Walrasian firms will increase employment. This interaction among heterogeneous firms is a key for stability. A single firm cannot expect the benefit of such an increased aggregate demand from other firms.

Keywords: agent-based macroeconomic model, business cycle, demand constraint, gestation period, representative agent model, stability

Procedia PDF Downloads 154
14996 Implementation of an Open Source ERP for SMEs in the Automotive Sector in Peru: A Case Study

Authors: Gerson E. Cornejo, Luis A. Gamarra, David S. Mauricio

Abstract:

The Enterprise Resource Planning Systems (ERP) allows the integration of all the business processes of the functional areas of the companies, in order to automate and standardize the processes, obtain accurate information and improve decision making in time real. In Peru, 79% of medium and small companies (SMEs) do not use any management software, this is because it is believed that ERPs are expensive, complex and difficult to implement. However, for more than 20 years there have been Open Source ERPs, which are more accessible and have the same benefit as proprietary ERPs, but there is little information on the implementation process. In this work is made a case of study, in order to show the implementation process of an Open Source ERP, Odoo, based on the ASAP methodology (Accelerated SAP) and applied to a company of corrective and preventive maintenance services of vehicles. The ERP allowed the SME to standardize its business processes, increase its productivity, reducing up to 40% certain processes. The study of this case shows that it is feasible and profitable to implement an Open Source ERP in SMEs in the Automotive Sector of Peru. In addition, it is shown that the ASAP methodology is adequate to carry out Open Source ERPs implementation projects.

Keywords: ASAP, automotive sector, ERP implementation, open source

Procedia PDF Downloads 324
14995 Message Passing Neural Network (MPNN) Approach to Multiphase Diffusion in Reservoirs for Well Interconnection Assessments

Authors: Margarita Mayoral-Villa, J. Klapp, L. Di G. Sigalotti, J. E. V. Guzmán

Abstract:

Automated learning techniques are widely applied in the energy sector to address challenging problems from a practical point of view. To this end, we discuss the implementation of a Message Passing algorithm (MPNN)within a Graph Neural Network(GNN)to leverage the neighborhood of a set of nodes during the aggregation process. This approach enables the characterization of multiphase diffusion processes in the reservoir, such that the flow paths underlying the interconnections between multiple wells may be inferred from previously available data on flow rates and bottomhole pressures. The results thus obtained compare favorably with the predictions produced by the Reduced Order Capacitance-Resistance Models (CRM) and suggest the potential of MPNNs to enhance the robustness of the forecasts while improving the computational efficiency.

Keywords: multiphase diffusion, message passing neural network, well interconnection, interwell connectivity, graph neural network, capacitance-resistance models

Procedia PDF Downloads 134
14994 A New Multi-Target, Multi-Agent Search and Rescue Path Planning Approach

Authors: Jean Berger, Nassirou Lo, Martin Noel

Abstract:

Perfectly suited for natural or man-made emergency and disaster management situations such as flood, earthquakes, tornadoes, or tsunami, multi-target search path planning for a team of rescue agents is known to be computationally hard, and most techniques developed so far come short to successfully estimate optimality gap. A novel mixed-integer linear programming (MIP) formulation is proposed to optimally solve the multi-target multi-agent discrete search and rescue (SAR) path planning problem. Aimed at maximizing cumulative probability of successful target detection, it captures anticipated feedback information associated with possible observation outcomes resulting from projected path execution, while modeling agent discrete actions over all possible moving directions. Problem modeling further takes advantage of network representation to encompass decision variables, expedite compact constraint specification, and lead to substantial problem-solving speed-up. The proposed MIP approach uses CPLEX optimization machinery, efficiently computing near-optimal solutions for practical size problems, while giving a robust upper bound obtained from Lagrangean integrality constraint relaxation. Should eventually a target be positively detected during plan execution, a new problem instance would simply be reformulated from the current state, and then solved over the next decision cycle. A computational experiment shows the feasibility and the value of the proposed approach.

Keywords: search path planning, search and rescue, multi-agent, mixed-integer linear programming, optimization

Procedia PDF Downloads 361