Search results for: queue size distribution at a random epoch
10664 Influence of the Non-Uniform Distribution of Filler Porosity on the Thermal Performance of Sensible Heat Thermocline Storage Tanks
Authors: Yuchao Hua, Lingai Luo
Abstract:
Thermal energy storage is of critical importance for the highly efficient utilization of renewable energy sources. Over the past decades, single-tank thermocline technology has attracted much attention owing to its high cost-effectiveness. In the present work, we investigate the influence of the filler porosity's non-uniform distribution on the thermal performance of packed-bed sensible heat thermocline storage tanks, on the basis of an analytical model obtained by the Laplace transform. It is found that when the total amount of filler material (i.e., the integral of porosity) is fixed, different porosity distributions can result in significantly different outlet-temperature behaviors and thus varied charging and discharging efficiencies. Our results indicate that a properly designed non-uniform distribution of the fillers can improve heat storage performance without changing the total amount of filling material.
Keywords: energy storage, heat thermocline storage tank, packed bed, transient thermal analysis
Procedia PDF Downloads 95
10663 Improved Performance of Mn Substituted Ceria Nanospheres for Water Gas Shift Reaction: Influence of Preparation Conditions
Authors: Bhairi Lakshminarayana, Surajit Sarker, Ch. Subrahmanyam
Abstract:
The present study reports the development of noble-metal-free nanocatalysts for low-temperature CO oxidation and the water gas shift reaction. Mn-substituted CeO2 solid solution catalysts were synthesized by co-precipitation, combustion and hydrothermal methods. The formation of the solid solution was confirmed by XRD with Rietveld refinement, and the percentage of carbon and nitrogen doping was determined by CHNS analysis. Raman spectroscopy confirmed the oxygen vacancies. The surface area, pore volume and pore size distribution were determined by N2 physisorption analysis, whereas UV-visible diffuse reflectance spectroscopy and XPS confirmed the oxidation state of the Mn ion. The particle size and morphology (spherical shape) of the material were confirmed by FESEM and HRTEM analysis. Ce0.8Mn0.2O2-δ was calcined at 400 °C, 600 °C and 800 °C; Raman spectroscopy confirmed that the catalyst calcined at 400 °C has the best redox properties. The activity of the designed catalysts for CO oxidation (0.2 vol%) was evaluated at a GHSV of 21,000 h-1, and co-precipitation yielded the most active catalyst for CO oxidation and the water gas shift reaction, owing to its high surface area, improved reducibility, oxygen mobility and highest quantity of surface oxygen species. The activation energy of low-temperature CO oxidation on Ce0.8Mn0.2O2-δ (combustion) was 5.5 kcal.mol-1. The designed catalysts were tested for the water gas shift reaction. The present study demonstrates that Mn-substituted ceria prepared by the co-precipitation method and calcined at 400 °C is a promising route toward green, sustainable energy production.
Keywords: Ce0.8Mn0.2O2-δ, CO oxidation, physicochemical characterization, water gas shift reaction (WGS)
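As a rough illustration of how an activation energy of this kind can be extracted from rate data, the sketch below applies the two-temperature Arrhenius relation; the rate constants and temperatures are hypothetical placeholders, not values from the study.

```python
import math

R = 1.987e-3  # gas constant in kcal/(mol*K)

def activation_energy(k1, t1, k2, t2):
    """Estimate the Arrhenius activation energy (kcal/mol) from rate
    constants k1, k2 measured at absolute temperatures t1, t2 (K)."""
    return R * math.log(k2 / k1) / (1.0 / t1 - 1.0 / t2)

# Hypothetical rate constants at 300 K and 320 K (not from the paper):
ea = activation_energy(1.0, 300.0, 2.0, 320.0)
print(round(ea, 2))  # a few kcal/mol, the same order as the reported 5.5
```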
Procedia PDF Downloads 237
10662 Peeling Behavior of Thin Elastic Films Bonded to Rigid Substrate of Random Surface Topology
Authors: Ravinu Garg, Naresh V. Datla
Abstract:
We study the fracture mechanics of peeling of thin films perfectly bonded to a rigid substrate of arbitrary random surface topology using an analytical formulation. A generalized theoretical model has been developed to determine the peel strength of thin elastic films. It is demonstrated that an improvement in peel strength can be achieved by modifying the surface characteristics of the rigid substrate. A characterization study has been performed to analyze the effect of different parameters on the effective peel force from the rigid surface. Different surface profiles, such as circular and sinusoidal, have been considered to demonstrate the bonding characteristics of the film-substrate interface. The condition for instability in debonding of the film is analyzed, where localized self-debonding arises depending upon the film and surface characteristics. This study is directed towards improving the adhesion strength of thin films to rigid substrates using textured surfaces.
Keywords: debonding, fracture mechanics, peel test, thin film adhesion
Procedia PDF Downloads 449
10661 Numerical Investigation of Fluid Flow and Temperature Distribution on Power Transformer Windings Using OpenFOAM
Authors: Saeed Khandan Siar, Stefan Tenbohlen, Christian Breuer, Raphael Lebreton
Abstract:
The goal of this article is to investigate the detailed temperature distribution and fluid flow in an oil-cooled winding of a power transformer by means of computational fluid dynamics (CFD). The experimental setup consists of three passes of a zig-zag cooled disc-type winding, in which losses are modeled by heating cartridges in each winding segment. A precise temperature sensor measures the temperature of each turn. The laboratory setup allows exact control of the boundary conditions, e.g. the oil flow rate and the inlet temperature. Furthermore, a simulation model is solved using the open-source computational fluid dynamics solver OpenFOAM and validated against the experimental results. The model treats the flow as laminar or turbulent depending on the oil mass flow rate. The good agreement of the simulation results with the experimental measurements validates the model.
Keywords: CFD, conjugated heat transfer, power transformers, temperature distribution
Procedia PDF Downloads 424
10660 Molecular Dynamic Simulation of Cold Spray Process
Authors: Aneesh Joshi, Sagil James
Abstract:
The cold spray (CS) process is the deposition of solid particles onto a substrate above a certain critical impact velocity. Unlike thermal spray processes, the CS process does not melt the particles, which thus retain their original physical and chemical properties. These characteristics make the CS process ideal for various engineering applications involving metals, polymers, ceramics and composites. The bonding mechanism involved in the CS process is extremely complex considering the dynamic nature of the process. Though the CS process offers great promise for several engineering applications, the realization of its full potential is limited by the lack of understanding of the complex mechanisms involved and of the effect of critical process parameters on deposition efficiency. The goal of this research is to understand the complex nanoscale mechanisms involved in the CS process. The study uses the molecular dynamics (MD) simulation technique to understand the material deposition phenomenon during the CS process. The impact of a single-crystalline copper nanoparticle on a copper substrate is modeled under varying process conditions. The quantitative results of impacts at different velocities, impact angles and particle sizes are evaluated using the flattening ratio, the von Mises stress distribution and the local shear strain. The study finds that the flattening ratio, and hence the quality of deposition, was highest for an impact velocity of 700 m/s, a particle size of 20 Å and an impact angle of 90°. The stress and strain analysis revealed regions of shear instability in the periphery of the impact, as well as plastic deformation of the particles after the impact. The results of this study can be used to augment our existing knowledge in the field of CS processes.
Keywords: cold spray process, molecular dynamics simulation, nanoparticles, particle impact
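The flattening ratio can be defined in several ways; one common convention is the ratio of the deformed particle's contact width to its initial diameter. A minimal sketch under that assumption, with hypothetical measurements rather than values from the simulations:

```python
def flattening_ratio(contact_width, initial_diameter):
    """Ratio of the impacted particle's contact width to its initial
    diameter; values above 1 indicate lateral spreading on impact."""
    if initial_diameter <= 0:
        raise ValueError("initial diameter must be positive")
    return contact_width / initial_diameter

# Hypothetical post-impact measurement for a 20 angstrom particle:
print(flattening_ratio(34.0, 20.0))  # -> 1.7
```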
Procedia PDF Downloads 369
10659 Characteristics of Sorghum (Sorghum bicolor L. Moench) Flour on the Soaking Time of Peeled Grains and Particle Size Treatment
Authors: Sri Satya Antarlina, Elok Zubaidah, Teti Istiana, Harijono
Abstract:
Sorghum (Sorghum bicolor L. Moench) has potential as a flour for gluten-free food products. Sorghum flour production requires a grain-soaking treatment. Soaking can reduce the tannin content, an anti-nutrient, and thereby increase protein digestibility. A fine particle size decreases the yield of flour, so various particle sizes need to be studied to increase the yield. This study aims to determine the characteristics of sorghum flour under different treatments of peeled-grain soaking and particle size. The material was the white sorghum variety KD-4 from farmers in East Java, Indonesia. A factorial randomized design (two factors) was used, repeated three times: factor I was the grain soaking time (five levels: 0, 12, 24, 36, and 48 hours); factor II was the particle size of the flour, sieved at fineness levels of 40, 60, 80, and 100 mesh. The method of making sorghum flour comprised grain peeling, soaking of the peeled grain, drying in an oven at 60 °C, milling, and sieving. Physicochemical analyses of the sorghum flour were carried out. The results show an interaction between grain soaking time and the particle size of the sorghum flour, affecting the yield of flour, color L* (brightness level), whiteness index, paste properties, amylose content, protein content, bulk density, and protein digestibility. The method of making sorghum flour, through the soaking of peeled grain and the choice of particle size, plays an important role in the physicochemical properties of the resulting flour. Based on the characteristics of the flour produced, the recommended method is sorghum grain soaking for 24 hours with a flour particle size of 80 mesh.
The resulting sorghum flour had the following characteristics: 24.88% yield of flour, 88.60 color L* (brightness level), 69.95 whiteness index, 3615 cP viscosity, 584.10 g/l bulk density, 24.27% db protein digestibility, 90.02% db starch content, 23.4% db amylose content, 67.45% db amylopectin content, 0.22% db crude fiber content, 0.037% db tannin content, 5.30% db protein content, 0.18% db ash content, 92.88% db carbohydrate content, and 1.94% db fat content. This sorghum flour is recommended for cookie products.
Keywords: characteristic, sorghum (Sorghum bicolor L. Moench) flour, grain soaking, particle size, physicochemical properties
Procedia PDF Downloads 163
10658 Spatio-Temporal Risk Analysis of Cancer in Relation to Assessed Environmental Exposures in Coimbatore, India
Authors: Janani Selvaraj, M. Prashanthi Devi, P. B. Harathi
Abstract:
Epidemiologic studies conducted over several decades have provided evidence that long-term exposure to elevated ambient levels of particulate air pollution is associated with increased mortality. Air quality risk management is significant in developing countries, and this highlights the need to understand the role of ecologic covariates in the association between air pollution and mortality. Several new methods show promise in exploring the geographical distribution of disease and identifying high-risk areas using epidemiological maps. The addition of the temporal attribute, moreover, gives a deeper view of the disease burden with respect to forecasting. In recent years, new methods developed in reanalysis studies have proved useful for exploring the spatial structure of the data and the impact of spatial autocorrelation on estimates of risk associated with exposure to air pollution. On this basis, our present study aims to explore the spatial and temporal distribution of lung cancer cases in the Coimbatore district of Tamil Nadu in relation to air pollution risk areas. A spatio-temporal moving average was computed using the CrimeStat software and visualized in ArcGIS 10.1 to document the spatio-temporal movement of the disease in the study region. The random walk analysis performed showed the progression of peak cancer incidences in the intersection regions of the Coimbatore North and South taluks, which include major commercial and residential areas such as Gandhipuram, Peelamedu, and Ganapathy. Our study shows evidence that daily exposure to zones of high air pollutant concentration may increase the risk of lung cancer. The observations from the present study will be useful in delineating high-risk zones of environmental exposure that contribute to the increase of cancer among daily commuters.
Through our study, we suggest that spatially resolved exposure models over relevant time frames will identify high-risk zones more reliably than approaches that rely solely on statistical theory about the impact of measurement error and empirical findings.
Keywords: air pollution, cancer, spatio-temporal analysis, India
Procedia PDF Downloads 515
10657 Polymerase Chain Reaction Analysis and Random Amplified Polymorphic DNA of Agrobacterium tumefaciens
Authors: Abeer M. Algeblawi
Abstract:
Fifteen isolates of Agrobacterium tumefaciens were obtained from crown gall samples collected from six locations: five in Libya (Tripoli, Alzahra, Ain-Zara, Alzawia, and Alazezia), from grape (Vitis vinifera L.), pear (Pyrus communis L.), and peach (Prunus persica L.) trees, and Alexandria in Egypt, from guava (Psidium guajava L.) trees, artichoke (Cynara cardunculus L.), and sugar beet (Beta vulgaris L.). Total DNA was extracted from eight isolates, and polymerase chain reaction (PCR) analysis together with the random amplified polymorphic DNA (RAPD) technique was used to identify six of them. A similarity of 55.5% was observed among the eight A. tumefaciens isolates (Agro1, Agro2, Agro3, Agro4, Agro5, Agro6, Agro7, and Agro8). The PCR amplification products resulted from the use of two specific primers (virD2A and virD2C) applied to six A. tumefaciens isolates obtained from different hosts. Visible bands specific to A. tumefaciens, of 220 bp, 224 bp, and 338 bp, were produced with total DNA extracted from the bacterial cells.
Keywords: Agrobacterium tumefaciens, crown gall, identification, molecular characterization, PCR, RAPD
Procedia PDF Downloads 147
10656 The Effect of Different Parameters on a Single Invariant Lateral Displacement Distribution to Consider the Higher Modes Effect in a Displacement-Based Pushover Procedure
Authors: Mohamad Amin Amini, Mehdi Poursha
Abstract:
Nonlinear response history analysis (NL-RHA) is a robust analytical tool for estimating the seismic demands of structures responding in the inelastic range. However, because of its conceptual and numerical complications, the nonlinear static procedure (NSP) is increasingly used as a practical tool for seismic performance evaluation of structures. The conventional pushover analysis methods presented in various codes (FEMA 356; Eurocode 8; ATC-40) are limited to first-mode-dominated structures and cannot take the effect of higher modes into consideration. Therefore, for more than a decade, researchers have developed enhanced pushover analysis procedures to take the higher modes effect into account. The main objective of this study is to propose an enhanced invariant lateral displacement distribution that accounts for the higher modes effect in a displacement-based pushover analysis, whereby a set of laterally applied displacements, rather than forces, is monotonically applied to the structure. For this purpose, the effect of different parameters, such as the spectral displacement of the ground motion, the modal participation factor, and the effective modal participating mass ratio, on the lateral displacement distribution is investigated to find the best distribution. The major simplification of this procedure is that the effect of higher modes is concentrated into a single invariant lateral load distribution. Therefore, only one pushover analysis is sufficient, without any need for a modal combination rule for combining the responses. The invariant lateral displacement distribution for pushover analysis is then calculated by combining the modal story displacements using modal combination rules. The seismic demands resulting from the different procedures are compared to those from the more accurate nonlinear response history analysis (NL-RHA) as a benchmark solution.
Two structures of different heights, 10- and 20-story special steel moment-resisting frames (MRFs), were selected and evaluated. Twenty ground motion records were used to conduct the NL-RHA. The results show that more accurate responses are obtained with the enhanced modal lateral displacement distributions than with the conventional lateral loads.
Keywords: displacement-based pushover, enhanced lateral load distribution, higher modes effect, nonlinear response history analysis (NL-RHA)
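The combination of modal story displacements into a single invariant profile can be sketched with the SRSS rule. In the sketch below the per-mode profiles are hypothetical and assumed to be already scaled by their spectral displacements and modal participation factors, as the abstract describes:

```python
import math

def srss_profile(modal_profiles):
    """Combine per-mode story displacement profiles (each already scaled
    by its spectral displacement and modal participation factor) into one
    invariant profile using the SRSS modal combination rule."""
    n_stories = len(modal_profiles[0])
    return [math.sqrt(sum(mode[i] ** 2 for mode in modal_profiles))
            for i in range(n_stories)]

# Two hypothetical scaled modal profiles for a 3-story frame:
profile = srss_profile([[3.0, 2.0, 1.0], [4.0, 0.0, -1.0]])
print(profile)  # SRSS keeps the combined profile positive at every story
```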
Procedia PDF Downloads 281
10655 Survey of the Elimination of Red Acid Dye by Wood Dust
Authors: N. Ouslimani, T. Abadlia, M. Fadel
Abstract:
This work focused on the elimination of an acid textile dye (red bermacide acid dye BN-CL-200), widely used for dyeing wool and polyamide fibers, by adsorption on a natural material, wood sawdust. Experiments were run in static mode, keeping a specific mass of the adsorbent under continuous stirring in a dye solution of known concentration. The influence of various parameters, namely particle size, adsorbent mass, pH and contact time, was studied. The best results were obtained with a 0.4 mm grain size, a mass of 3 g, a temperature of 20 °C, pH 2 and a contact time of 120 min.
Keywords: acid dye, environment, wood sawdust, wastewater
Procedia PDF Downloads 444
10654 The Effect of Magnetite Particle Size on Methane Production by Fresh and Degassed Anaerobic Sludge
Authors: E. Al-Essa, R. Bello-Mendoza, D. G. Wareham
Abstract:
Anaerobic batch experiments were conducted to investigate the effect of magnetite supplementation (7 mM) on methane production from digested sludge undergoing two different microbial growth phases, namely fresh sludge (exponential growth phase) and degassed sludge (endogenous decay phase). Three particle sizes were assessed: small (50–150 nm), medium (168–490 nm) and large (800 nm–4.5 µm). Results show that, in the case of the fresh sludge, magnetite significantly enhanced the methane production rate (up to 32%) and reduced the lag phase (by 15%–41%) as compared to the control, regardless of the particle size used. However, the cumulative methane produced at the end of the incubation was comparable in all treatment and control bottles. In the case of the degassed sludge, only the medium-sized magnetite particles significantly increased the methane production rate (12% higher) as compared to the control. Small and large particles had little effect on the methane production rate but did result in an extended lag phase, which led to significantly lower cumulative methane production at the end of the incubation period. These results suggest that magnetite produces a clear and positive effect on methane production only when an active and balanced microbial community is present in the anaerobic digester. It is concluded that (i) the effect of magnetite particle size on increasing the methane production rate and reducing the lag phase duration is strongly influenced by the initial metabolic state of the microbial consortium, and (ii) particle size positively affects methane production when it is within the nanometer range.
Keywords: anaerobic digestion, iron oxide, methanogenesis, nanoparticle
Procedia PDF Downloads 141
10653 Numerical Approach of RC Structural Members Exposed to Fire and After-Cooling Analysis
Authors: Ju-young Hwang, Hyo-Gyoung Kwak, Hong Jae Yim
Abstract:
This paper introduces a numerical analysis method for reinforced concrete (RC) structures exposed to fire and compares the results with experimental results. The proposed analysis method for RC structures under high temperature consists of two procedures. The first step is to determine the temperature distribution across the section through a heat transfer analysis using the time-temperature curve. After determination of the temperature distribution, a nonlinear analysis follows. By considering material and geometrical non-linearity together with the temperature distribution, the nonlinear analysis predicts the behavior of the RC structure under fire as a function of exposure time. The proposed method is validated by comparison with experimental results. Finally, a prediction model describing the state of after-cooling concrete is also introduced, based on the results of an additional experiment. The product of this study is expected to be embedded in smart structure monitoring systems against fire in the u-City.
Keywords: RC structures, heat transfer analysis, nonlinear analysis, after-cooling concrete model
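The first step of the procedure, a heat-transfer analysis driven by a time-temperature curve, can be sketched as an explicit 1-D finite-difference march across the section. The thermal diffusivity and the fire-side curve below are hypothetical placeholders, not the paper's actual fire curve or material data:

```python
def heat_march(n_nodes=11, dx=0.02, dt=1.0, steps=600, alpha=6e-7):
    """Explicit 1-D finite-difference temperature march across a section.

    The fire-exposed face (node 0) follows a hypothetical time-temperature
    curve; the far face stays at ambient. alpha is a placeholder thermal
    diffusivity (m^2/s) for concrete.
    """
    T = [20.0] * n_nodes                 # initial uniform temperature (deg C)
    r = alpha * dt / dx ** 2             # explicit scheme stable for r < 0.5
    assert r < 0.5
    for step in range(steps):
        t = (step + 1) * dt
        T[0] = 20.0 + 5.0 * t ** 0.5     # hypothetical fire-side curve
        T[1:-1] = [T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
                   for i in range(1, n_nodes - 1)]
    return T

temps = heat_march()
print(temps[0], temps[-1])  # hottest at the exposed face, ambient at the rear
```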
Procedia PDF Downloads 369
10652 Exploring the Psychosocial Brain: A Retrospective Analysis of Personality, Social Networks, and Dementia Outcomes
Authors: Felicia N. Obialo, Aliza Wingo, Thomas Wingo
Abstract:
Psychosocial factors such as personality traits and social networks influence cognitive aging and dementia outcomes both positively and negatively. The inherent complexity of these factors makes defining the underlying mechanisms of their influence difficult; however, exploring their interactions affords promise in the field of cognitive aging. The objective of this study was to elucidate some of these interactions by determining the relationship between social network size and dementia outcomes, and by determining whether personality traits mediate this relationship. The longitudinal Alzheimer's Disease (AD) database provided by Rush University's Religious Orders Study/Memory and Aging Project was utilized to perform retrospective regression and mediation analyses on 3,591 participants. Participants who were cognitively impaired at baseline were excluded, and analyses were adjusted for age, sex, common chronic diseases, and vascular risk factors. Dementia outcome measures included cognitive trajectory, clinical dementia diagnosis, and postmortem beta-amyloid plaque (AB) and neurofibrillary tangle (NT) accumulation. Personality traits included agreeableness (A), conscientiousness (C), extraversion (E), neuroticism (N), and openness (O). The results show a positive correlation between social network size and cognitive trajectory (p = 0.004) and a negative relationship between social network size and the odds of dementia diagnosis (p = 0.024, odds ratio (OR) = 0.974). Only neuroticism mediates the positive relationship between social network size and cognitive trajectory (p < 2e-16). Agreeableness, extraversion, and neuroticism all mediate the negative relationship between social network size and dementia diagnosis (p = 0.098, p = 0.054, and p < 2e-16, respectively).
All personality traits are independently associated with dementia diagnosis (A: p = 0.016, OR = 0.959; C: p = 0.000007, OR = 0.945; E: p = 0.028, OR = 0.961; N: p = 0.000019, OR = 1.036; O: p = 0.027, OR = 0.972). Only conscientiousness and neuroticism are associated with postmortem AD pathologies; specifically, conscientiousness is negatively associated (AB: p = 0.001, NT: p = 0.025) and neuroticism is positively associated with the pathologies (AB: p = 0.002, NT: p = 0.002). These results support the study's objectives, demonstrating that social network size and personality traits are strongly associated with dementia outcomes, particularly the odds of receiving a clinical diagnosis of dementia. Personality traits interact significantly and beneficially with social network size to influence the cognitive trajectory and future dementia diagnosis. These results reinforce previous literature linking social network size to dementia risk and provide novel insight into the differential roles of individual personality traits in cognitive protection.
Keywords: Alzheimer's disease, cognitive trajectory, personality traits, social network size
Procedia PDF Downloads 130
10651 Mecano-Reliability Approach Applied to a Water Storage Tank Placed on Ground
Authors: Amar Aliche, Hocine Hammoum, Karima Bouzelha, Arezki Ben Abderrahmane
Abstract:
Traditionally, the dimensioning of storage tanks is conducted with a deterministic approach based on partial safety coefficients. These coefficients are applied to take into account the uncertainties related to hazards in the properties of the materials used and the applied loads. However, the use of these safety factors in the design process does not ensure an optimal and reliable solution and can sometimes lead to a lack of robustness of the structure. Reliability theory, based on a probabilistic formulation of construction safety, can respond in an adapted manner. It allows the construction of a model in which uncertain data are represented by random variables, and therefore allows a better appreciation of safety margins with confidence indicators. The work presented in this paper consists of a mecano-reliability analysis of a concrete storage tank placed on the ground. The classical Monte Carlo simulation method is used to evaluate the failure probability of the concrete tank by considering the seismic acceleration as a random variable.
Keywords: reliability approach, storage tanks, Monte Carlo simulation, seismic acceleration
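A minimal sketch of the Monte Carlo step described above: sample the random seismic acceleration, evaluate a limit state, and count failures. The lognormal acceleration model, the stress model, and the normalised capacity are hypothetical placeholders, not values from the study:

```python
import random

def failure_probability(n_samples=100_000, seed=42):
    """Crude Monte Carlo estimate of P(demand > capacity) for a tank wall.

    Only the method mirrors the abstract; every number here is a
    hypothetical placeholder.
    """
    rng = random.Random(seed)
    resistance = 1.0                            # normalised capacity
    failures = 0
    for _ in range(n_samples):
        accel = rng.lognormvariate(-1.0, 0.5)   # hypothetical acceleration
        demand = 0.6 * accel                    # hypothetical induced stress
        if demand > resistance:
            failures += 1
    return failures / n_samples

p_f = failure_probability()
print(p_f)  # a small failure probability for these placeholder numbers
```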
Procedia PDF Downloads 310
10650 Loss Allocation in Radial Distribution Networks for Loads of Composite Types
Authors: Sumit Banerjee, Chandan Kumar Chanda
Abstract:
The paper presents the allocation of active power losses and energy losses to consumers connected to radial distribution networks in a deregulated environment for loads of composite types. A detailed comparison among four algorithms, namely the quadratic, proportional, pro rata and exact loss allocation methods, is presented. Quadratic and proportional loss allocation are based on identifying the active and reactive components of current in each branch, with the losses then allocated to each consumer; the pro rata method is based on the load demand of each consumer; and the exact method is based on the actual contribution of each consumer to the active power loss. The effectiveness of the proposed comparison among the four algorithms for composite loads is demonstrated through an example.
Keywords: composite type, deregulation, loss allocation, radial distribution networks
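Of the four methods, the pro rata rule is the simplest to state: each consumer receives a share of the total loss proportional to its load demand. A sketch with hypothetical figures (the paper's test network and demands are not reproduced here):

```python
def pro_rata_allocation(total_loss_kw, demands_kw):
    """Allocate a feeder's total active-power loss to consumers in
    proportion to their load demands (the pro rata rule)."""
    total_demand = sum(demands_kw)
    return [total_loss_kw * d / total_demand for d in demands_kw]

# Three hypothetical consumers sharing a 12 kW feeder loss:
print(pro_rata_allocation(12.0, [100.0, 200.0, 300.0]))  # -> [2.0, 4.0, 6.0]
```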
Procedia PDF Downloads 287
10649 Passenger Flow Characteristics of Seoul Metropolitan Subway Network
Authors: Kang Won Lee, Jung Won Lee
Abstract:
Characterizing network flow is of fundamental importance to understanding the complex dynamics of networks, and the passenger flow characteristics of a subway network are highly relevant to effective transportation management in urban cities. In this study, passenger flow in the Seoul metropolitan subway network is investigated and characterized through statistical analysis. The traditional betweenness centrality measure considers only the topological structure of the network and ignores transportation factors. This paper proposes a weighted betweenness centrality measure that incorporates monthly passenger flow volume. We apply the proposed measure to the Seoul metropolitan subway network, comprising 493 stations and 16 lines. Several interesting insights about the network are derived from the new measure. Using the Kolmogorov-Smirnov test, we also find that monthly passenger flow between any two stations follows a power-law distribution, while other traffic characteristics, such as congestion level and through-flow traffic, follow an exponential distribution.
Keywords: betweenness centrality, correlation coefficient, power-law distribution, Korea traffic DB
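The power-law claim can be checked, in its simplest form, as a straight-line fit in log-log space (the abstract's Kolmogorov-Smirnov test is the more rigorous route). The flow histogram below is synthetic, standing in for the Seoul data:

```python
import math

# Synthetic (hypothetical) tail of a flow histogram: number of station pairs
# observed at each monthly passenger-flow level, decaying roughly as flow**-2.
flows = [10, 20, 40, 80, 160]
pairs = [1000, 250, 62, 16, 4]

# Fit the log-log slope by ordinary least squares; a straight line with
# slope -a in log-log space is the signature of a power law x**-a.
xs = [math.log(f) for f in flows]
ys = [math.log(p) for p in pairs]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(round(slope, 2))  # close to -2 for this synthetic data
```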
Procedia PDF Downloads 291
10648 Multi-Subpopulation Genetic Algorithm with Estimation of Distribution Algorithm for Textile Batch Dyeing Scheduling Problem
Authors: Nhat-To Huynh, Chen-Fu Chien
Abstract:
The textile batch dyeing scheduling problem is complicated: it includes batch formation, batch assignment to machines, and batch sequencing with sequence-dependent setup times. Most manufacturers schedule their orders manually, which is time consuming and inefficient, so more powerful methods are needed to improve the solution. Motivated by these real needs, this study proposes approaches in which a genetic algorithm is developed with multiple subpopulations and hybridised with an estimation of distribution algorithm to solve the constructed problem, minimising the makespan. A heuristic algorithm is designed and embedded into the proposed algorithms to improve their ability to escape local optima. In addition, an empirical study is conducted in a textile company in Taiwan to validate the proposed approaches. The results show that the proposed approaches are more efficient than a simulated annealing algorithm.
Keywords: estimation of distribution algorithm, genetic algorithm, multi-subpopulation, scheduling, textile dyeing
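As a toy illustration of the estimation-of-distribution component (not the authors' hybrid algorithm, whose scheduling encoding and operators are problem-specific), the sketch below implements UMDA, the simplest EDA, on a bitstring "onemax" objective: estimate per-bit probabilities from the best individuals, then resample the population from those probabilities.

```python
import random

def umda_bitstring(fitness, n_bits=20, pop_size=60, keep=20, gens=40, seed=1):
    """Univariate Marginal Distribution Algorithm on bitstrings.

    Each generation: rank by fitness, estimate per-bit marginal
    probabilities from the elite, and sample a fresh population.
    """
    rng = random.Random(seed)
    popu = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(gens):
        popu.sort(key=fitness, reverse=True)
        elite = popu[:keep]
        probs = [sum(ind[i] for ind in elite) / keep for i in range(n_bits)]
        popu = [[1 if rng.random() < p else 0 for p in probs]
                for _ in range(pop_size)]
    return max(popu, key=fitness)

best = umda_bitstring(sum)   # onemax: maximise the number of 1-bits
print(sum(best))
```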
Procedia PDF Downloads 300
10647 Effect of Various Capping Agents on Photocatalytic, Antibacterial and Antibiofilm Activities of ZnO Nanoparticles
Authors: K. Akhil, J. Jayakumar, S. Sudheer Khan
Abstract:
Zinc oxide nanoparticles (ZnO NPs) are extensively used in a wide variety of commercial products, including sunscreens, textiles and paints. The present study evaluated the effect of surface capping agents, including polyethylene glycol (PEG), gelatin, polyvinyl alcohol (PVA) and polyvinyl pyrrolidone (PVP), on the photocatalytic activity of ZnO NPs. The particles were also tested for antibacterial and antibiofilm activity against Staphylococcus aureus (MTCC 3160) and Pseudomonas aeruginosa (MTCC 1688). Preliminary characterization was done by UV-visible spectroscopy. Electron microscopy showed that the particles were hexagonal in shape. The hydrodynamic size distribution was analyzed by dynamic light scattering, and the crystalline nature was determined by X-ray diffraction.
Keywords: antibacterial, antibiofilm, capping agents, photodegradation, surface coating, zinc oxide nanoparticles
Procedia PDF Downloads 272
10646 Knowledge Diffusion via Automated Organizational Cartography: Autocart
Authors: Mounir Kehal, Adel Al Araifi
Abstract:
The post-globalisation epoch has placed businesses everywhere in new and different competitive situations, where knowledgeable, effective and efficient behaviour provides the competitive and comparative edge. Enterprises have turned to explicit, and even to conceptualising tacit, knowledge management to elaborate a systematic approach to developing and sustaining the intellectual capital needed to succeed. To do that, one must be able to visualize the organization as consisting of nothing but knowledge and knowledge flows, presented in a graphical and visual framework referred to as automated organizational cartography. This creates the ability to actively classify existing organizational content evolving from and within data feeds, in an algorithmic manner, potentially yielding insightful schemes and dynamics by which organizational know-how is visualised. The paper discusses and elaborates on the most recent and applicable definitions and classifications of knowledge management, representing a wide range of views, from mechanistic (systematic, data-driven) to more socially (psychologically, cognitively/metadata-driven) orientated. More elaborate continuum models are used for knowledge acquisition and reasoning purposes, effectively representing the domain of information that an end user may draw on in their decision-making process when utilizing available organizational intellectual resources (i.e., Autocart). This paper likewise presents an empirical research study conducted previously to explore knowledge diffusion in a specialist knowledge domain.
Keywords: knowledge management, knowledge maps, knowledge diffusion, organizational cartography
Procedia PDF Downloads 417
10645 Comparison of Heuristic Methods for Solving Traveling Salesman Problem
Authors: Regita P. Permata, Ulfa S. Nuraini
Abstract:
The traveling salesman problem (TSP) is the most studied problem in combinatorial optimization. In simple language, the TSP can be described as the problem of finding a minimum-distance tour that starts and ends in the same city and visits every other city exactly once. In product distribution, companies often face the problem of determining the minimum distance, which affects time allocation. In this research, we apply TSP heuristic methods, simulating nodes as city coordinates in product distribution. The heuristics used are sub-tour reversal, nearest neighbor, farthest insertion, cheapest insertion, nearest insertion, and arbitrary insertion. We simulated nodes using Euclidean distances and compared the number of cities and processing times to identify the optimum heuristic method. The results show that the optimum heuristic methods are farthest insertion and nearest insertion. These two methods can be recommended for solving product distribution problems in certain companies.
Keywords: Euclidean, heuristics, simulation, TSP
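Among the listed heuristics, nearest neighbor is the easiest to sketch: from the current city, repeatedly move to the closest unvisited city, and finally return to the start. A minimal sketch on hypothetical unit-square coordinates, with ties broken by lowest city index:

```python
import math

def nearest_neighbor_tour(coords, start=0):
    """Build a TSP tour with the nearest-neighbor heuristic."""
    unvisited = set(range(len(coords))) - {start}
    tour = [start]
    while unvisited:
        here = coords[tour[-1]]
        # sorted() makes tie-breaking deterministic (lowest index wins)
        nxt = min(sorted(unvisited), key=lambda j: math.dist(here, coords[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    tour.append(start)   # close the tour back at the start city
    return tour

def tour_length(coords, tour):
    return sum(math.dist(coords[a], coords[b]) for a, b in zip(tour, tour[1:]))

cities = [(0, 0), (0, 1), (1, 1), (1, 0)]   # hypothetical nodes
tour = nearest_neighbor_tour(cities)
print(tour, tour_length(cities, tour))      # -> [0, 1, 2, 3, 0] 4.0
```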
Procedia PDF Downloads 12810644 Iron Supplementation for Patients Undergoing Cardiac Surgery: A Systematic Review and Meta-Analysis of Randomized-Controlled Trials
Authors: Matthew Cameron, Stephen Yang, Latifa Al Kharusi, Adam Gosselin, Anissa Chirico, Pouya Gholipour Baradari
Abstract:
Background: Iron supplementation has been evaluated in several randomized controlled trials (RCTs) for its potential to increase baseline hemoglobin and decrease the incidence of red blood cell (RBC) transfusion during cardiac surgery. The main objective of this study was to evaluate the evidence on iron administration in cardiac surgery patients and its effect on the incidence of perioperative RBC transfusion. Methods: This systematic review protocol was registered with PROSPERO (CRD42020161927) on Dec. 19th, 2019, and was prepared as per the PRISMA guidelines. MEDLINE, EMBASE, CENTRAL, the Web of Science databases, and Google Scholar were searched for RCTs evaluating perioperative iron administration in adult patients undergoing cardiac surgery. Each abstract was independently reviewed by two reviewers using predefined eligibility criteria. The primary outcome was perioperative RBC transfusion, with secondary outcomes of the number of RBC units transfused, change in ferritin level, reticulocyte count, hemoglobin, and adverse events after iron administration. The risk of bias was assessed with the Cochrane Collaboration Risk of Bias Tool, and the primary and secondary outcomes were analyzed with a random-effects model. Results: Of the 1556 citations reviewed, five studies (n = 554 patients) met the inclusion criteria. The use of iron demonstrated no difference in transfusion incidence (RR 0.86; 95% CI 0.65 to 1.13). There was low heterogeneity between studies (I²=0%). The trial sequential analysis suggested an optimal information size of 1132 participants, which the accrued information size did not reach. Conclusion: The current literature does not support the routine use of iron supplementation before cardiac surgery; however, insufficient data are available to draw a definite conclusion.
A critical knowledge gap has been identified, and more robust RCTs are required on this topic.Keywords: cardiac surgery, iron, iron supplementation, perioperative medicine, meta-analysis, systematic review, randomized controlled trial
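The random-effects pooling named in the methods can be sketched with a hand-rolled DerSimonian-Laird estimator on the log risk ratio. The trial counts below are invented for illustration only; they are not the five RCTs included in the review.

```python
import math

def dersimonian_laird_rr(events_t, n_t, events_c, n_c):
    """Pool risk ratios with a DerSimonian-Laird random-effects model:
    inverse-variance weights on the log risk ratio, with between-study
    variance tau^2 estimated from Cochran's Q."""
    log_rr = [math.log((et / nt) / (ec / nc))
              for et, nt, ec, nc in zip(events_t, n_t, events_c, n_c)]
    var = [1/et - 1/nt + 1/ec - 1/nc
           for et, nt, ec, nc in zip(events_t, n_t, events_c, n_c)]
    w = [1/v for v in var]
    fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rr))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)
    w_re = [1/(v + tau2) for v in var]         # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_rr)) / sum(w_re)
    se = (1 / sum(w_re)) ** 0.5
    return (math.exp(pooled),
            (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)))

# Invented counts (transfused / total, iron arm vs. control arm).
rr, (lo, hi) = dersimonian_laird_rr([20, 15, 30], [100, 80, 150],
                                    [25, 18, 33], [100, 80, 150])
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

With I² = 0%, as reported above, tau² is estimated as zero and the random-effects result coincides with the fixed-effect one.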
Procedia PDF Downloads 13210643 ACO-TS: an ACO-based Algorithm for Optimizing Cloud Task Scheduling
Authors: Fahad Y. Al-dawish
Abstract:
A large number of organizations and individuals are currently moving to cloud computing, which many consider a significant shift in the field of computing. Cloud computing systems are distributed and parallel systems consisting of a collection of interconnected physical and virtual machines. With the increasing demand for, and benefit of, cloud computing infrastructure, diverse computing processes can be executed in cloud environments. Many organizations and individuals around the world depend on cloud computing infrastructure to run their applications, platforms, and services. One of the major and essential issues in this environment is allocating incoming tasks to suitable virtual machines (cloud task scheduling). Cloud task scheduling is classified as an optimization problem, and several meta-heuristic algorithms have been proposed to solve it. A good task scheduler should adapt its scheduling technique to a changing environment and to the types of incoming task sets. In this research project, a cloud task scheduling methodology based on the ant colony optimization (ACO) algorithm, called ACO-TS (Ant Colony Optimization for Task Scheduling), is proposed and compared with other scheduling algorithms (Random, First Come First Serve (FCFS), and Fastest Processor to the Largest Task First (FPLTF)). ACO is a randomized optimization search method, used here to assign incoming tasks to available virtual machines (VMs). The main goal of the proposed algorithm is to minimize the makespan of a given task set and maximize resource utilization by balancing the load among the virtual machines. The proposed scheduling algorithm was evaluated using the CloudSim toolkit framework.
Finally, after analyzing and evaluating the experimental results, we find that the proposed ACO-TS algorithm performs better than the Random, FCFS, and FPLTF algorithms in both makespan and resource utilization.Keywords: cloud task scheduling, ant colony optimization (ACO), CloudSim, cloud computing
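A minimal sketch of how an ACO task scheduler of this kind can work, assuming the usual pheromone-plus-heuristic ant construction: each ant builds a full task-to-VM assignment, and the best-so-far assignment is reinforced after evaporation. The parameter values, the speed-based heuristic, and the reinforcement rule are illustrative choices, not the paper's exact ACO-TS design.

```python
import random

def aco_schedule(task_len, vm_speed, ants=20, iters=40, alpha=1.0, beta=2.0,
                 rho=0.1, seed=0):
    """Each ant assigns every task to a VM, sampling VMs with probability
    proportional to pheromone^alpha * speed^beta; the best-so-far assignment
    (lowest makespan) is reinforced after evaporation each iteration."""
    rng = random.Random(seed)
    n_tasks, n_vms = len(task_len), len(vm_speed)
    tau = [[1.0] * n_vms for _ in range(n_tasks)]
    best_assign, best_mk = None, float("inf")

    def makespan(assign):
        load = [0.0] * n_vms
        for t, v in enumerate(assign):
            load[v] += task_len[t] / vm_speed[v]
        return max(load)

    for _ in range(iters):
        for _ in range(ants):
            assign = [rng.choices(range(n_vms),
                                  weights=[tau[t][v] ** alpha * vm_speed[v] ** beta
                                           for v in range(n_vms)])[0]
                      for t in range(n_tasks)]
            mk = makespan(assign)
            if mk < best_mk:
                best_assign, best_mk = assign, mk
        for row in tau:                        # evaporation
            for v in range(n_vms):
                row[v] *= 1.0 - rho
        for t, v in enumerate(best_assign):    # reinforce best-so-far
            tau[t][v] += 1.0 / best_mk
    return best_assign, best_mk

tasks = [12, 7, 25, 18, 9, 30]   # illustrative task lengths
speeds = [1.0, 2.0, 1.5]         # illustrative VM speeds
assign, mk = aco_schedule(tasks, speeds)
print(assign, round(mk, 2))
```

Minimizing the maximum per-VM load directly targets makespan, and the evaporate-then-reinforce cycle is what steers later ants toward load-balanced assignments.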
Procedia PDF Downloads 42210642 Stabilization of Transition Metal Chromite Nanoparticles in Silica Matrix
Authors: J. Plocek, P. Holec, S. Kubickova, B. Pacakova, I. Matulkova, A. Mantlikova, I. Němec, D. Niznansky, J. Vejpravova
Abstract:
This article presents a summary of the preparation and characterization of zinc, copper, cadmium, and cobalt chromite nanocrystals embedded in an amorphous silica matrix. The ZnCr2O4/SiO2, CuCr2O4/SiO2, CdCr2O4/SiO2 and CoCr2O4/SiO2 nanocomposites were prepared by a conventional sol-gel method under acid catalysis. Final heat treatment of the samples was carried out at temperatures in the range of 900–1200 °C to adjust the phase composition and the crystallite size, respectively. The resulting samples were characterized by powder X-ray diffraction (PXRD), high-resolution transmission electron microscopy (HRTEM), Raman/FTIR spectroscopy, and magnetic measurements. Formation of the spinel phase was confirmed in all samples. The average size of the nanocrystals was determined from the PXRD data and by direct particle size observation on HRTEM, and the two results were correlated. The mean particle size (reviewed by HRTEM) was in the range of ~4 to 46 nm. The results showed that the sol-gel method can be effectively used for the preparation of spinel chromite nanoparticles embedded in a silica matrix, with the particle size driven by the type of the A2+ cation in the spinel structure and the temperature of the final heat treatment. Magnetic properties of the nanocrystals were found to be only moderately modified in comparison to the bulk phases.Keywords: sol-gel method, nanocomposites, Rietveld refinement, Raman spectroscopy, Fourier transform infrared spectroscopy, magnetic properties, spinel, chromite
Procedia PDF Downloads 21610641 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution
Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph
Abstract:
In this paper, temperature extremes are forecast by employing the block maxima method of the generalized extreme value (GEV) distribution to analyse temperature data from the Cameroon Development Corporation (CDC). Considering two data sets (raw data and simulated data) and two models (stationary and non-stationary) of the GEV distribution, return level analysis was carried out. In the stationary model, the return values are constant over time for the raw data, while for the simulated data the return values show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw data and the simulated data show an increasing trend with an upper bound. This clearly shows that although temperatures in the tropics show signs of increasing in the future, there is a maximum temperature that is not exceeded. The results of this paper are very important for agricultural and environmental research.Keywords: forecasting, generalized extreme value (GEV), meteorology, return level
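The return level computation behind the analysis can be written down directly from the GEV quantile function. The parameter values below are illustrative, not the paper's CDC estimates; a negative shape parameter is chosen because it produces exactly the bounded upper tail the abstract describes.

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level of a GEV(mu, sigma, xi) fitted to block maxima:
    z_T = mu - (sigma / xi) * (1 - y^(-xi)), with y = -log(1 - 1/T),
    reducing to the Gumbel form mu - sigma * log(y) as xi -> 0."""
    y = -math.log(1.0 - 1.0 / T)
    if abs(xi) < 1e-9:
        return mu - sigma * math.log(y)
    return mu - (sigma / xi) * (1.0 - y ** (-xi))

# Illustrative stationary-model parameters (location, scale, shape).
mu, sigma, xi = 33.0, 1.2, -0.15
for T in (10, 50, 100):
    print(T, round(gev_return_level(mu, sigma, xi, T), 2))

# With xi < 0 the distribution has a finite upper endpoint mu - sigma/xi,
# i.e. a maximum temperature with no exceedance.
print("upper bound:", round(mu - sigma / xi, 2))
```

Return levels grow with the return period T but never cross the upper endpoint, which is the "increasing trend with an upper bound" reported above.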
Procedia PDF Downloads 48210640 Estimation of Carbon Dioxide Absorption in DKI Jakarta Green Space
Authors: Mario Belseran
Abstract:
The issue of climate change has become a matter of worldwide attention, one aspect of which is the increase in air temperature due to greenhouse gas emissions. This climate change is caused by gases in the atmosphere, one of which is CO2. DKI Jakarta, as the capital, has a dense population with a variety of existing land uses. Land use dominated by settlements results in less green space, which functions to absorb atmospheric CO2. Interpretation of SPOT-7 imagery is used to determine the greenness level of vegetation in green space using the vegetation indices NDVI, EVI, GNDVI, and OSAVI. Tree diameters and heights were also measured to obtain biomass values, which were used as CO2 absorption values. The CO2 absorption values across Jakarta are classified into three classes: high, medium, and low. The distribution of CO2 absorption values in Jakarta's green space is dominated by the medium class, located in South Jakarta, East Jakarta, North Jakarta, and West Jakarta. Green space in Jakarta is scattered randomly and is more dominant in East Jakarta and South Jakarta.Keywords: carbon dioxide, DKI Jakarta, green space, SPOT-7, vegetation index
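Of the indices listed, NDVI is the simplest to show. The sketch below computes it per pixel from near-infrared and red reflectances; the reflectance pairs are invented for illustration and are not actual SPOT-7 scene values.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for a single pixel:
    NDVI = (NIR - Red) / (NIR + Red), bounded in [-1, 1]."""
    return (nir - red) / (nir + red) if (nir + red) != 0 else 0.0

# Illustrative (NIR, Red) reflectance pairs: dense vegetation,
# sparse vegetation, and a built-up/bare surface.
pixels = [(0.45, 0.08), (0.30, 0.20), (0.12, 0.11)]
for nir, red in pixels:
    print(round(ndvi(nir, red), 3))
```

Healthy vegetation reflects strongly in NIR and absorbs red light, so greener pixels score higher; thresholding such scores is one way to split a scene into high/medium/low greenness classes like the three absorption classes used here.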
Procedia PDF Downloads 28110639 A Multi-Objective Decision Making Model for Biodiversity Conservation and Planning: Exploring the Concept of Interdependency
Authors: M. Mohan, J. P. Roise, G. P. Catts
Abstract:
Despite living in an era where conservation zones are de facto the central element in any sustainable wildlife management strategy, we still find ourselves grappling with several Pareto-optimal situations regarding resource allocation and area distribution for them. In this paper, a multi-objective decision making (MODM) model is presented to answer the question of whether we can establish mutual relationships between these conflicting objectives. For our study, we considered a Red-cockaded woodpecker (Picoides borealis) habitat conservation scenario in the coastal plain of North Carolina, USA. The Red-cockaded woodpecker (RCW) is a non-migratory territorial bird that excavates cavities in living pine trees for roosting and nesting. RCW groups nest in an aggregation of cavity trees called a 'cluster', and in our model we use the number of clusters to be established as a measure of the size of the conservation zone required. The case study is formulated as a linear programming problem, and the objective function optimises the Red-cockaded woodpecker clusters, carbon retention rate, biofuel, public safety, and net present value (NPV) of the forest. We studied the variation of individual objectives with respect to the amount of area available and plotted a two-dimensional dynamic graph after establishing interrelations between the objectives. We further explore the concept of interdependency by integrating the MODM model with GIS, deriving a raster file representing carbon distribution from the existing forest dataset. Model results demonstrate the applicability of interdependency from both linear and spatial perspectives, and suggest that this approach holds immense potential for enhancing environmental investment decision making in the future.Keywords: conservation, interdependency, multi-objective decision making, red-cockaded woodpecker
Procedia PDF Downloads 33810638 Superordinated Control for Increasing Feed-in Capacity and Improving Power Quality in Low Voltage Distribution Grids
Authors: Markus Meyer, Bastian Maucher, Rolf Witzmann
Abstract:
The ever-increasing amount of distributed generation in low voltage distribution grids (mainly PV and micro-CHP) can lead to reverse load flows from low to medium/high voltage levels at times of high feed-in. Reverse load flow leads to rising voltages that may even exceed the limits specified in the grid codes. Furthermore, the share of electrical loads connected to low voltage distribution grids via switched-mode power supplies continuously increases. In combination with inverter-based feed-in, this results in high harmonic levels, reducing overall power quality. Especially high levels of third-order harmonic currents can lead to neutral conductor overload, which is even more critical if lines with reduced neutral conductor cross-sections are used. This paper illustrates a possible concept for smart grids to increase the feed-in capacity, improve power quality, and ensure safe operation of low voltage distribution grids at all times. The key feature of the concept is a hierarchically structured control strategy that runs on a superordinated controller, which is connected to several distributed grid analyzers and inverters via broadband powerline (BPL). The strategy is devised to ensure both quick response times and the technically and economically reasonable use of the available inverters in the grid (PV inverters, batteries, stepless line voltage regulators). These inverters are provided with standard features for voltage control, e.g., voltage-dependent reactive power control. In addition, they can receive reactive power set points transmitted by the superordinated controller. To further improve power quality, the inverters are capable of active harmonic filtering, as well as voltage balancing, whereas the latter is primarily done by the stepless line voltage regulators.
By additionally connecting the superordinated controller to the control center of the grid operator, supervisory control and data acquisition capabilities for the low voltage distribution grid are enabled, which allows easy monitoring and manual input. Such a low voltage distribution grid can also be used as a virtual power plant.Keywords: distributed generation, distribution grid, power quality, smart grid, virtual power plant, voltage control
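The voltage-dependent reactive power control mentioned in this abstract is commonly realized as a piecewise-linear Q(V) droop curve with a deadband. The sketch below assumes such a curve; the per-unit breakpoints are illustrative values, not taken from any particular grid code or from this paper.

```python
def q_of_v(v_pu, q_max, v1=0.94, v2=0.97, v3=1.03, v4=1.06):
    """Piecewise-linear Q(V) droop: inject reactive power (capacitive)
    at low voltage, absorb (inductive) at high voltage, with a deadband
    [v2, v3] where Q = 0. Breakpoints are illustrative per-unit values."""
    if v_pu <= v1:
        return q_max                               # full injection
    if v_pu < v2:
        return q_max * (v2 - v_pu) / (v2 - v1)     # ramp down to deadband
    if v_pu <= v3:
        return 0.0                                 # deadband
    if v_pu < v4:
        return -q_max * (v_pu - v3) / (v4 - v3)    # ramp up absorption
    return -q_max                                  # full absorption

for v in (0.92, 1.00, 1.05):
    print(v, round(q_of_v(v, q_max=10.0), 2))
```

A superordinated controller can then override the locally computed set point when grid-wide measurements justify it, which is the hierarchy the concept above describes.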
Procedia PDF Downloads 26710637 Exploring the Correlation between Population Distribution and Urban Heat Island under Urban Data: Taking Shenzhen Urban Heat Island as an Example
Authors: Wang Yang
Abstract:
Shenzhen is a modern city shaped by China's reform and opening-up policy, and the development of its urban morphology has been directed by the administration of the Chinese government. The city's planning paradigm is primarily affected by spatial structure and human behavior. The urban agglomeration is subjectively divided into several groups and centers, and under this influence the intrinsic laws of city development tend to be neglected. With the continuous development of the internet, big data technology has been introduced in China, and data mining and data analysis have become important tools in municipal research. Data mining has been utilized to improve data cleaning, for example of business data, traffic data, and population data. Before data mining, government data were collected by traditional means and then analyzed through city-relationship research, delaying the timeliness of urban analysis, especially for the contemporary city, where data update speeds are very fast and internet-based. The city's points of interest (POIs) extracted in this way serve as a data source affecting city design, while satellite remote sensing is used as a reference object; with city analysis conducted in both directions, the administrative paradigm of government is broken and urban research is restored. Therefore, the use of data mining in urban analysis is very important. The July 2018 satellite remote sensing data of Shenzhen, measured by the MODIS sensor, can be utilized to perform land surface temperature inversion and analyze the city's heat island distribution. This article acquired and classified data on Shenzhen using web crawler technology. Shenzhen heat island data and points of interest were simulated and analyzed on a GIS platform to discover the main features of the distribution of functionally equivalent areas. Shenzhen's built-up area extends in an east-west direction.
The city's main streets likewise follow the direction of the city's development, so the city's functional areas are also distributed along the east-west axis. The urban heat island can be expressed as a heat map over the functional urban areas, with which regional POIs correspond. The research results clearly show that the distribution of the urban heat island and the distribution of urban POIs are in one-to-one correspondence. The urban heat island is primarily influenced by the properties of the underlying surface, setting aside the impact of the urban climate. Using urban POIs as the object of analysis, the distribution of municipal POIs and population aggregation are closely connected, so that the distribution of the population corresponds with the distribution of the urban heat island.Keywords: POI, satellite remote sensing, population distribution, urban heat island thermal map
Procedia PDF Downloads 10510636 Knowledge and Attitude Towards Strabismus Among Adult Residents in Woreta Town, Northwest Ethiopia: A Community-Based Study
Authors: Henok Biruk Alemayehu, Kalkidan Berhane Tsegaye, Fozia Seid Ali, Nebiyat Feleke Adimassu, Getasew Alemu Mersha
Abstract:
Background: Strabismus is a visual disorder in which the eyes are misaligned and point in different directions. Untreated strabismus can lead to amblyopia, loss of binocular vision, and social stigma due to its appearance. Since knowledge is assumed to be pertinent for early screening and prevention of strabismus, the main objective of this study was to assess knowledge of and attitudes toward strabismus in Woreta town, Northwest Ethiopia. Providing data in this area is important for planning health policies. Methods: A community-based cross-sectional study was conducted in Woreta town from April to May 2020. The sample size was determined using the single population proportion formula, taking a 50% proportion of good knowledge, a 95% confidence level, a 5% margin of error, and a 10% non-response rate. Accordingly, the final computed sample size was 424. All four kebeles were included in the study. There were 42,595 people in total, with 39,684 adults and 9229 households. A sampling interval 'k' was obtained by dividing the number of households by the calculated sample size of 424. Systematic random sampling with proportional allocation was used to select the participating households with a sampling interval (k) of 21, i.e., every 21st household was included in the study. One individual was selected randomly, using the lottery method, from each household with more than one adult to obtain the final sample. The data were collected through face-to-face interviews with a pretested, semi-structured questionnaire, which was translated from English to Amharic and back to English to maintain consistency. Data were entered using EpiData version 3.1, then processed and analyzed with SPSS version 20. Descriptive and analytical statistics were employed to summarize the data. A p-value of less than 0.05 was used to declare statistical significance. Result: A total of 401 individuals aged over 18 years participated, with a response rate of 94.5%.
Of those who responded, 56.6% were male, and 36.9% of all participants were illiterate. The proportion of people with poor knowledge of strabismus was 45.1%, while 53.9% of the respondents had a favorable attitude. Older age, a higher educational level, a history of eye examination, and a family history of strabismus were significantly associated with good knowledge of strabismus. A higher educational level, older age, and having heard about strabismus were significantly associated with a favorable attitude toward strabismus. Conclusion and recommendation: The proportions of good knowledge and favorable attitudes towards strabismus were lower than previously reported in Gondar City, Northwest Ethiopia. There is a need to provide health education and promotion campaigns on strabismus to the community: what strabismus is, its possible treatments, and the need to bring children to the eye care center for early diagnosis and treatment. We advocate that prospective research employ qualitative study designs, and we suggest exploring studies that investigate causal-effect relationships.Keywords: strabismus, knowledge, attitude, Woreta
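The sample size calculation described in the methods can be reproduced as follows; the snippet simply applies the single population proportion formula with the stated inputs (50% proportion, 95% confidence, 5% margin, 10% non-response) and derives the sampling interval from the 9229 households.

```python
import math

def sample_size(p=0.5, z=1.96, margin=0.05, nonresponse=0.10):
    """Single population proportion formula n0 = z^2 * p * (1 - p) / d^2,
    inflated by the expected non-response rate."""
    n0 = math.ceil(z ** 2 * p * (1 - p) / margin ** 2)
    return math.ceil(n0 * (1 + nonresponse))

n = sample_size()
k = 9229 // n          # sampling interval over the 9229 households
print(n, k)            # → 424 21
```

This recovers both figures reported in the methods: a final sample of 424 and a sampling interval of 21.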
Procedia PDF Downloads 6310635 Heart Attack Prediction Using Several Machine Learning Methods
Authors: Suzan Anwar, Utkarsh Goyal
Abstract:
Heart rate (HR) is a predictor of cardiovascular, cerebrovascular, and all-cause mortality in the general population, as well as in patients with cardiovascular and cerebrovascular diseases. Machine learning (ML) significantly improves the accuracy of cardiovascular risk prediction, increasing the number of patients identified who could benefit from preventive treatment while avoiding unnecessary treatment of others. This research examines the relationship between an individual's various heart health inputs, such as age, sex, cp, trestbps, thalach, and oldpeak, and the likelihood of developing heart disease. Machine learning techniques such as logistic regression and decision trees are used, implemented in Python. The results of testing and evaluating the model on the Heart Failure Prediction Dataset estimate the chance of a person having heart disease with varying accuracy. Logistic regression yielded an accuracy of 80.48% without data handling. With data handling (normalization, StandardScaler), logistic regression achieved an improved accuracy of 87.80%, decision tree 100%, random forest 100%, and SVM 100%.Keywords: heart rate, machine learning, SVM, decision tree, logistic regression, random forest
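The effect of the "data handling" step can be sketched with a self-contained logistic regression trained by gradient descent on toy data. The two features and their class means are invented stand-ins for the dataset's columns (e.g., age and trestbps); this is not the study's actual pipeline.

```python
import math
import random

def standardize(X):
    """Z-score each feature column (the 'data handling' step the
    abstract reports as lifting accuracy)."""
    stats = []
    for col in zip(*X):
        m = sum(col) / len(col)
        s = (sum((x - m) ** 2 for x in col) / len(col)) ** 0.5 or 1.0
        stats.append((m, s))
    return [[(x - m) / s for x, (m, s) in zip(row, stats)] for row in X]

def train_logreg(X, y, lr=0.1, epochs=300):
    """Logistic regression fit by stochastic gradient descent."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi                            # gradient of log-loss wrt z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b > 0 else 0

# Toy stand-ins for two heart features; class 1 is given higher means.
random.seed(1)
X = [[random.gauss(50, 5), random.gauss(120, 8)] for _ in range(50)] + \
    [[random.gauss(62, 5), random.gauss(145, 8)] for _ in range(50)]
y = [0] * 50 + [1] * 50
Xs = standardize(X)
w, b = train_logreg(Xs, y)
acc = sum(predict(w, b, x) == t for x, t in zip(Xs, y)) / len(y)
print("train accuracy:", round(acc, 2))
```

Standardizing features puts them on comparable scales, which helps gradient-based fitting converge; this is one plausible reason the reported accuracy improved after scaling.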
Procedia PDF Downloads 138