Search results for: stream computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1599

939 Frequent Itemset Mining Using Rough-Sets

Authors: Usman Qamar, Younus Javed

Abstract:

Frequent pattern mining is the process of finding a pattern (a set of items, subsequences, substructures, etc.) that occurs frequently in a data set. It was proposed in the context of frequent itemsets and association rule mining. Frequent pattern mining is used to find inherent regularities in data, e.g. which products were often purchased together. Its applications include basket data analysis, cross-marketing, catalog design, sale campaign analysis, Web log (click stream) analysis, and DNA sequence analysis. However, one of the bottlenecks of frequent itemset mining is that as the data grow, the time and resources required to mine them increase at an exponential rate. In this investigation, a new algorithm is proposed which can be used as a pre-processor for frequent itemset mining. FASTER (FeAture SelecTion using Entropy and Rough sets) is a hybrid pre-processor algorithm which utilizes entropy and rough sets to carry out record reduction and feature (attribute) selection, respectively. FASTER can produce a speed-up of 3.1 times compared to the original algorithm while maintaining an accuracy of 71%.
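
To make the two pre-processing steps concrete, the sketch below is a minimal, hypothetical reading of the idea in Python: records with unusually rare attribute values (high mean surprisal) are treated as outlier-like and dropped, and a greedy rough-set reduct keeps only the attributes needed to preserve the positive region of the decision attribute. The 90th-percentile cut-off, the greedy strategy, and the toy data are assumptions, not the authors' exact algorithm.

```python
from collections import Counter, defaultdict
from math import log2

def surprisal_scores(rows, attrs):
    """Mean per-attribute surprisal (-log2 p) of each record; high = outlier-like."""
    freq = {a: Counter(r[a] for r in rows) for a in attrs}
    n = len(rows)
    return [sum(-log2(freq[a][r[a]] / n) for a in attrs) / len(attrs) for r in rows]

def dependency(rows, attrs, decision):
    """Rough-set dependency: fraction of records in the positive region of `attrs`."""
    blocks = defaultdict(list)
    for r in rows:
        blocks[tuple(r[a] for a in attrs)].append(r[decision])
    pos = sum(len(v) for v in blocks.values() if len(set(v)) == 1)
    return pos / len(rows)

def greedy_reduct(rows, attrs, decision):
    """Greedily add attributes until dependency matches that of the full set."""
    target, chosen = dependency(rows, attrs, decision), []
    while dependency(rows, chosen, decision) < target:
        chosen.append(max((a for a in attrs if a not in chosen),
                          key=lambda a: dependency(rows, chosen + [a], decision)))
    return chosen

# Toy market-basket-style records (hypothetical data).
rows = [{"bread": 1, "milk": 1, "eggs": 0, "buy": "y"},
        {"bread": 1, "milk": 0, "eggs": 0, "buy": "n"},
        {"bread": 0, "milk": 1, "eggs": 1, "buy": "y"},
        {"bread": 1, "milk": 1, "buy": "y", "eggs": 0}]
attrs = ["bread", "milk", "eggs"]
scores = surprisal_scores(rows, attrs)
cutoff = sorted(scores)[int(0.9 * len(scores))]
kept = [r for r, s in zip(rows, scores) if s <= cutoff]  # record reduction
print(greedy_reduct(kept, attrs, "buy"))                 # feature selection
```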

Keywords: rough-sets, classification, feature selection, entropy, outliers, frequent itemset mining

Procedia PDF Downloads 425
938 High Level Synthesis of Canny Edge Detection Algorithm on Zynq Platform

Authors: Hanaa M. Abdelgawad, Mona Safar, Ayman M. Wahba

Abstract:

Real-time image and video processing is a demand in many computer vision applications, e.g. video surveillance, traffic management and medical imaging. The processing of those video applications requires high computational power. Therefore, the optimal solution is the collaboration of CPU and hardware accelerators. In this paper, a Canny edge detection hardware accelerator is proposed. Canny edge detection is one of the common blocks in the pre-processing phase of the image and video processing pipeline. Our approach offloads the Canny edge detection algorithm from the processing system (PS) to programmable logic (PL), taking advantage of the High Level Synthesis (HLS) tool flow to accelerate the implementation on the Zynq platform. The resulting implementation enables up to a 100x performance improvement through hardware acceleration. CPU utilization drops, and the frame rate rises to 60 fps on a 1080p full HD input video stream.
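
For reference, a software-only baseline of the same pre-processing block is easy to prototype with OpenCV before committing to an HLS design. A hedged sketch follows; the file name and Canny thresholds are placeholders, and the actual PS/PL offload requires Xilinx HLS tooling rather than Python.

```python
import time
import cv2

cap = cv2.VideoCapture("input_1080p.mp4")  # placeholder path to a full-HD stream
frames, t0 = 0, time.time()
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 1.4)   # smoothing stage of the Canny pipeline
    edges = cv2.Canny(blurred, 50, 150)             # hysteresis thresholds are tunable
    frames += 1
cap.release()
elapsed = time.time() - t0
print(f"CPU-only Canny: {frames / elapsed:.1f} fps")  # compare against the 60 fps PL target
```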

Keywords: high level synthesis, canny edge detection, hardware accelerators, computer vision

Procedia PDF Downloads 472
937 Effects of Smoking on the Indoor Air Quality and COVID-19

Authors: Sonam Sandal, Susan Verghese P.

Abstract:

The phrase "environmental tobacco smoke" (ETS) refers to exposure to tobacco smoke that isn't from your own smoking but instead is caused by being in close proximity to someone else's cigar, cigarette, or pipe smoke. Environmental cigarette smoke is one of the main contributors to indoor air pollution (IAP), which is exceedingly harmful to human health and results in millions of deaths each year, according to the World Health Organization. Sidestream smoke (SS), which is discharged from a cigarette's burning end in between puffs, is the primary cause of ETS. The bulk of the ETS residue is composed of gases that are produced while smoking through the cigarette paper, mainstream smoke (MS) ingested, and side stream smoke emitted while inhaling a puff from the burning end. Each of these mixtures—SS, ETS, and MS—is an aerosol composed of an IAP-causing vapor phase and a particle phase. Therefore, indoor air-cleaning equipment designed to remove particles will not significantly alter nicotine exposure but will alter the concentrations of other dangerous substances, including particulate matter (PM), PM 2.5, and PM 10. In conclusion, indoor airborne contaminants pose serious risks to human health. ETS degrades the air quality, and when someone breathes this bad air, it weakens their lungs and makes them more susceptible to COVID-19.

Keywords: pm 10, covid-19, indoor air pollution, cigarette smoke, pm 2.5

Procedia PDF Downloads 63
936 A High Compression Ratio for a Lossless Image Compression Based on the Arithmetic Coding with the Sorted Run Length Coding: Meteosat Second Generation Image Compression

Authors: Cherifi Mehdi, Lahdir Mourad, Ameur Soltane

Abstract:

Image compression is at the heart of several multimedia techniques. It is used to reduce the number of bits required to represent an image. The Meteosat Second Generation (MSG) satellite acquires 12 image files every 15 minutes, which results in large database sizes. In this paper, a novel image compression method based on arithmetic coding with Sorted Run Length Coding (SRLC) for MSG images is proposed. The SRLC finds the occurrences of consecutive pixels of the original image to create a sorted run. Arithmetic coding then encodes the sorted data of the previous stage into a unique code word that represents a binary code stream in the sorted order, boosting the compression ratio. We show that our method outperforms a method based on plain Run Length Coding (RLC) with arithmetic coding in terms of compression ratio and bit rate. Evaluation criteria, namely the compression ratio and the bit rate, confirm the efficiency of our image compression method.
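
The sketch below illustrates one plausible reading of the run-sorting idea: run-length encode the pixel stream, then reorder the runs by pixel value so the entropy coder sees homogeneous stretches. A real lossless codec must also encode enough information to undo the sort (the paper's code stream handles this), and the arithmetic-coding stage is replaced here by a crude size estimate; none of this is the authors' exact implementation.

```python
from itertools import groupby

def rlc(pixels):
    """Classical run-length coding: (value, run length) pairs in image order."""
    return [(v, len(list(g))) for v, g in groupby(pixels)]

def sorted_rlc(pixels):
    """One reading of SRLC: runs reordered by pixel value, so the entropy coder
    sees long, homogeneous stretches of run data per value."""
    return sorted(rlc(pixels), key=lambda vr: vr[0])

pixels = [12, 12, 12, 80, 80, 12, 12, 200, 200, 200, 200, 80]
runs = sorted_rlc(pixels)
print(runs)  # [(12, 3), (12, 2), (80, 2), (80, 1), (200, 4)]

# Crude compression-ratio proxy: raw bytes vs. two bytes per (value, length) pair.
raw_bits, coded_bits = 8 * len(pixels), 16 * len(runs)
print(f"compression ratio (before arithmetic coding) ~ {raw_bits / coded_bits:.2f}")
```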

Keywords: image compression, arithmetic coding, Run Length Coding, RLC, Sorted Run Length Coding, SRLC, Meteosat Second Generation, MSG

Procedia PDF Downloads 342
935 Analyzing Irbid’s Food Waste as Feedstock for Anaerobic Digestion

Authors: Assal E. Haddad

Abstract:

Food waste samples from Irbid were collected from 5 different sources for 12 weeks to characterize their composition in terms of four food categories: rice, meat, fruits and vegetables, and bread. The average composition was 39% rice, 6% meat, 34% fruits and vegetables, and 23% bread. Methane yield was also measured for all food types and was found to be 362, 499, 352, and 375 mL/g VS for rice, meat, fruits and vegetables, and bread, respectively. A representative food waste sample was created to test the actual methane yield and compare it to the calculated one. The actual methane yield (414 mL/g VS) was greater than the value (377 mL/g VS) calculated from the food type proportions and their specific methane yields. This study emphasizes the effect of the types of food and their proportions in food waste on the final biogas production. The findings provide representative methane emission factors for Irbid's food waste, which accounts for as much as 68% of total Municipal Solid Waste (MSW) in Irbid, and also indicate the energy and economic value within the solid waste stream in Irbid.
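
The gap between the calculated and measured yields can be reproduced directly from the numbers reported above; a short check:

```python
# Reported composition fractions and specific methane yields (mL CH4 per g VS).
fractions = {"rice": 0.39, "meat": 0.06, "fruits_vegetables": 0.34, "bread": 0.23}
yields    = {"rice": 362,  "meat": 499,  "fruits_vegetables": 352,  "bread": 375}

calculated = sum(fractions[k] * yields[k] for k in fractions)
measured = 414.0  # yield of the representative mixed sample
print(f"calculated: {calculated:.0f} mL/g VS")          # ~377 mL/g VS, as reported
print(f"synergy:    {measured / calculated - 1:+.1%}")  # mixture outperforms the weighted sum
```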

Keywords: food waste, solid waste management, anaerobic digestion, methane yield

Procedia PDF Downloads 194
934 An Improved Tracking Approach Using Particle Filter and Background Subtraction

Authors: Amir Mukhtar, Dr. Likun Xia

Abstract:

An improved, robust and efficient visual target tracking algorithm using particle filtering is proposed. Particle filtering has proven very successful for estimating non-Gaussian and non-linear problems. In this paper, the particle filter is used with a color feature to estimate the target state over time. Color distributions are applied because this feature is scale- and rotation-invariant, robust to partial occlusion, and computationally efficient. The performance is made more robust by choosing the YIQ color scheme, since color-based particle filter tracking often leads to inaccurate results when light intensity changes during a video stream. Tracking is performed by comparing the chrominance histograms of the target and of the candidate positions (particles). Furthermore, a background subtraction technique is used for size estimation of the target. A qualitative evaluation of the proposed algorithm is performed on several real-world videos. The experimental results demonstrate that the improved algorithm can track moving objects well under illumination changes, occlusion and moving backgrounds.
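
The predict/weight/resample core of such a tracker can be sketched on synthetic data as below. This is a schematic, not the authors' implementation: the YIQ conversion is reduced to a single hypothetical chrominance channel, and the background-subtraction size estimate is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def chrominance_hist(patch, bins=8):
    """Normalized histogram of a (hypothetical) chrominance channel patch."""
    h, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    return h / max(h.sum(), 1)

def bhattacharyya(p, q):
    return np.sum(np.sqrt(p * q))  # similarity in [0, 1]

# Synthetic frame: background noise with a bright 20x20 target at (60, 90).
frame = rng.uniform(0.0, 0.3, (120, 160))
frame[60:80, 90:110] = rng.uniform(0.7, 1.0, (20, 20))
target_hist = chrominance_hist(frame[60:80, 90:110])

# Particles = candidate top-left corners; predict with a random-walk motion model.
particles = rng.uniform(0, [100, 140], (300, 2))
particles += rng.normal(0, 2, particles.shape)
particles = np.clip(particles, 0, [100, 140])

# Weight each particle by histogram similarity to the target model.
weights = np.array([
    bhattacharyya(target_hist,
                  chrominance_hist(frame[int(y):int(y) + 20, int(x):int(x) + 20]))
    for y, x in particles])
weights /= weights.sum()

# Resample according to the weights and take the mean as the state estimate.
idx = rng.choice(len(particles), len(particles), p=weights)
particles = particles[idx]
print("estimated target corner:", particles.mean(axis=0))  # biased toward (60, 90)
```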

Keywords: tracking, particle filter, histogram, corner points, occlusion, illumination

Procedia PDF Downloads 366
933 On Adaptive and Auto-Configurable Apps

Authors: Prisa Damrongsiri, Kittinan Pongpianskul, Mario Kubek, Herwig Unger

Abstract:

Apps are today the most important means of adapting mobile phones and computers to the special needs of their users. Location- and context-sensitive programs are the key to supporting the interaction of users with their environment and also to avoiding an overload of dispensable information. This contribution shows how a trusted, secure and truly bi-directional communication and interaction among users and their environment can be established and used, e.g. in the field of home automation.

Keywords: apps, context-sensitive, location-sensitive, self-configuration, mobile computing, smart home

Procedia PDF Downloads 391
932 The Dependence of the Liquid Application on the Coverage of the Sprayed Objects in Terms of the Characteristics of the Sprayed Object during Spraying

Authors: Beata Cieniawska, Deta Łuczycka, Katarzyna Dereń

Abstract:

When assessing the quality of the spraying procedure, three indicators are used: the unevenness of the distribution of the sprayed liquid, the degree of coverage of the sprayed surfaces, and the deposition of the sprayed liquid. However, there is a lack of information on the relationships between these quality parameters. Therefore, research was carried out at the Institute of Agricultural Engineering of Wrocław University of Environmental and Life Sciences. The aim of the study was to determine the relationship between the degree of coverage of sprayed surfaces and the deposition of liquid with respect to the parametric characteristics of the protected plant, using selected single- and double-stream nozzles. Experiments were conducted under laboratory conditions. A self-propelled nozzle carrier acted as the sprayer, whereas the parametric characteristics of the plants were determined using artificial plants, as the ratio of the vertical projection surface to the horizontal projection surface. The results and their analysis showed strong and very strong correlations between the analyzed parameters in terms of the characteristics of the sprayed object.

Keywords: degree of coverage, deposition of liquid, nozzle, spraying

Procedia PDF Downloads 326
931 The Role of Mass Sport Guidance in the Health Service Industry of China

Authors: Qiu Jian-Rong, Li Qing-Hui, Zhan Dong, Zhang Lei

Abstract:

Facing the demands of economic restructuring and the risk of economic stagnation due to an ageing population, the health service industry will play a very important role in the future structure of industry. This process will involve the orientation of Chinese sports medicine, its joint development with preventive medicine, and its integration with data banks and cloud computing.

Keywords: China, the health service industry, mass sport, data bank

Procedia PDF Downloads 619
930 A Project in the Framework “NextGenerationEU”: Sustainable Photoelectrochemical Hydrogen Evolution - SERGIO

Authors: Patrizia Frontera, Anastasia Macario, Simona Crispi, Angela Malara, Pierantonio De Luca, Stefano Trocino

Abstract:

The exploration of solar energy for the photoelectrochemical splitting of water into hydrogen and oxygen has been extensively researched as a means of generating sustainable H₂ fuel. However, despite these efforts, commercialization of this technology has not yet materialized. Presently, the primary impediments to commercialization include low solar-to-hydrogen efficiency (2-3% in PEC with an active area of up to 10-15 cm²), the utilization of costly and critical raw materials (e.g., BiVO₄), and energy losses during the separation of H₂ from O₂ and H₂O vapours in the output stream. The SERGIO partners have identified an advanced approach to fabricate photoelectrode materials, coupled with an appropriate scientific direction to achieve cost-effective solar-driven H₂ production in a tandem photoelectrochemical cell. This project is designed to reach Technology Readiness Level (TRL) 4 by validating the technology in the laboratory using a cell with an active area of up to 10 cm², boasting a solar-to-hydrogen efficiency of 5%, and ensuring acceptable hydrogen purity (99.99%). Our objectives include breakthroughs in cost efficiency, conversion efficiency, and H₂ purity.
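
The 5% solar-to-hydrogen target can be related to measurable quantities through the standard STH definition (photocurrent density times the 1.23 V water-splitting potential times Faradaic efficiency, divided by the input irradiance, assuming unbiased operation under AM 1.5G illumination). A small sketch with illustrative current densities:

```python
def sth_efficiency(j_op_mA_cm2, faradaic_eff=1.0, p_in_mW_cm2=100.0):
    """Standard solar-to-hydrogen efficiency: j * 1.23 V * eta_F / P_in (AM 1.5G)."""
    return j_op_mA_cm2 * 1.23 * faradaic_eff / p_in_mW_cm2

# Operating photocurrent density needed for the project's 5% STH target:
for j in (2.0, 4.07, 6.0):  # mA/cm^2, illustrative values
    print(f"j = {j:4.2f} mA/cm2  ->  STH = {sth_efficiency(j):.1%}")
```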

Keywords: photoelectrolysis, green hydrogen, photoelectrochemical cell, semiconductors

Procedia PDF Downloads 54
929 Operating System Support for Mobile Device Thermal Management and Performance Optimization in Augmented Reality Applications

Authors: Yasith Mindula Saipath Wickramasinghe

Abstract:

Augmented reality applications require high processing power to load, render and live-stream high-definition AR models and virtual scenes; they also require device sensors to work intensively to coordinate with the internal hardware and the OS and to deliver advanced features like object detection, real-time tracking, and voice and text recognition. Excessive heat generation due to these advanced functionalities has become a major research problem, as smaller mobile devices cannot manage such heat increments and battery drainage, which cause physical harm to the devices in the long term. Therefore, effective thermal management is one of the major requirements in augmented reality application development. This paper discusses the major causes of this issue and offers possible solutions in the form of operating system adaptations, as well as further research on best coding practices to optimize application performance and reduce excessive thermal generation.

Keywords: augmented reality, device thermal management, GPU, operating systems, device I/O, overheating

Procedia PDF Downloads 108
928 A Systematic Review of Street-Level Policy Entrepreneurship Strategies in Different Political Contexts

Authors: Hui Wang, Huan Zhang

Abstract:

This study uses systematic review and qualitative comparative analysis methods to comprehensively examine recent street-level policy entrepreneurship research, to identify the characteristics of and lessons from 20 years of street-level policy entrepreneurship literature, and to analyze the relations between political contexts and street-level policy entrepreneurs' strategies. Using data from a systematic review of the street-level policy entrepreneurship literature, we identify the sub-components of different political contexts and the core strategies of street-level policy entrepreneurs, and estimate the configurational relations between different political settings and street-level policy entrepreneurs' strategies. Our results show that street-level policy entrepreneurs display social acuity, define the problem, and build team strategies when either the policy or the political stream dominates. Street-level policy entrepreneurs will use lead-by-example strategies when both the policy and political streams dominate. Furthermore, street-level policy entrepreneurs will use bureaucratic strategies even if no stream dominates in the political context.

Keywords: policy entrepreneurs, qualitative comparative analysis, street-level bureaucracy, systematic review

Procedia PDF Downloads 97
927 Application of Lean Manufacturing Tools in Hot Asphalt Production

Authors: S. Bayona, J. Nunez, D. Paez, C. Diaz

Abstract:

The application of Lean manufacturing tools continues to be an effective solution for increasing productivity, reducing costs and eliminating waste in the manufacture of goods and services. This article analyzes the production process of a hot asphalt manufacturing company from an administrative and technical perspective. Three main phases were analyzed: in the first phase, the risk priority numbers of the main operations in the asphalt mix production process were determined by an FMEA (Failure Mode and Effects Analysis); in the second phase, Value Stream Mapping (VSM) of the production line was performed; and in the third phase, a SWOT (Strengths, Weaknesses, Opportunities, Threats) matrix was constructed. Among the highest-rated failure modes were the lack of training of workers in occupational safety and health issues, the lack of signaling and classification of granulated material, and the overloading of vehicles. The analysis of the results in the three phases agrees on the importance of training operational workers, improving communication with external actors in order to minimize delays in material orders, and strengthening the control of suppliers.
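
The FMEA phase ranks failure modes by the risk priority number, RPN = severity x occurrence x detection, each scored on the usual 1-10 scales. A minimal sketch with illustrative scores (not those elicited in the study):

```python
# (failure mode, severity, occurrence, detection); scores are illustrative only.
failure_modes = [
    ("lack of OSH training of workers",             8, 6, 7),
    ("unsignaled/unclassified granulated material", 6, 7, 6),
    ("overloaded vehicles",                         7, 5, 8),
    ("delays in material orders",                   5, 6, 4),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN {s * o * d:3d}  {name}")
```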

Keywords: asphalt, lean manufacturing, productivity, process

Procedia PDF Downloads 106
926 Data Confidentiality in Public Cloud: A Method for Inclusion of ID-PKC Schemes in OpenStack Cloud

Authors: N. Nalini, Bhanu Prakash Gopularam

Abstract:

The term data security refers to the degree of resistance or protection given to information from unintended or unauthorized access. The core principles of information security are confidentiality, integrity and availability, also referred to as the CIA triad. Cloud computing services are classified as SaaS, IaaS and PaaS services. With cloud adoption, confidential enterprise data are moved from organization premises to an untrusted public network, and as a result the attack surface has increased manifold. Several cloud computing platforms, like OpenStack, Eucalyptus and Amazon EC2, allow users to build and configure public, hybrid and private clouds. While traditional encryption based on PKI infrastructure still works in the cloud scenario, the management of public-private keys and trust certificates is difficult. Identity-based Public Key Cryptography (ID-PKC) overcomes this problem by using publicly identifiable information for generating the keys, and works well with decentralized systems. Users can exchange information securely without having to manage any trust information. Another advantage is that access control (role-based access control policy) information can be embedded into the data, unlike in PKI, where it is handled by a separate component or system. In the OpenStack cloud platform, the Keystone service acts as the identity service for authentication and authorization and supports a public key infrastructure for other services. In this paper, we explain the OpenStack security architecture and evaluate its PKI infrastructure with respect to data confidentiality. We provide a method to integrate ID-PKC schemes for securing data in transit and at rest, and explain the key measures for safeguarding data against security attacks. The proposed approach uses the JPBC crypto library for key-pair generation based on the IEEE P1363.3 standard and for secure communication with other cloud services.

Keywords: data confidentiality, identity based cryptography, secure communication, OpenStack Keystone, token scoping

Procedia PDF Downloads 372
925 Ceramic Membrane Filtration Technologies for Oilfield Produced Water Treatment

Authors: Mehrdad Ebrahimi, Oliver Schmitz, Axel Schmidt, Peter Czermak

Abstract:

“Produced water” (PW) is any fossil water that is brought to the surface along with crude oil or natural gas. By far, PW is the largest waste stream by volume associated with oil and gas production operations. Due to the increasing volume of this waste all over the world in the current decade, the effect of discharging PW on the environment has lately become a significant environmental concern. Therefore, there is a need for new PW treatment technologies, driven by an increased focus on water conservation and environmental regulation. The use of membrane processes for the treatment of PW has several advantages over many traditional separation techniques. In oilfield produced water treatment with ceramic membranes, process efficiency is characterized by the specific permeate flux and by the oil separation performance. Apart from the membrane properties, the permeate flux during filtration of oily wastewaters is known to be strongly dependent on the constituents of the feed solution, as well as on process conditions, e.g. trans-membrane pressure (TMP) and cross-flow velocity (CFV). The research project presented in this report describes the application of different ceramic membrane filtration technologies for the efficient treatment of oilfield produced water and different model oily solutions.
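
The stated dependence of permeate flux on TMP and fouling is commonly captured by the resistance-in-series model, J = TMP / (mu (R_m + R_c)), which is standard in membrane filtration though not specific to this project. A hedged sketch with illustrative parameter values:

```python
# Resistance-in-series permeate flux model (standard in membrane filtration).
MU = 1.0e-3        # dynamic viscosity of the aqueous phase, Pa*s (illustrative)
R_MEMBRANE = 2e12  # clean ceramic membrane resistance, 1/m (illustrative)

def permeate_flux(tmp_bar, r_cake):
    """Flux in L/(m^2*h) for a trans-membrane pressure and a fouling-cake resistance."""
    j = tmp_bar * 1e5 / (MU * (R_MEMBRANE + r_cake))  # m^3/(m^2*s)
    return j * 1000 * 3600

for tmp in (0.5, 1.0, 2.0):                  # TMP in bar
    for r_cake in (0.0, 1e12, 5e12):         # growing oil/solids fouling layer
        print(f"TMP={tmp:3.1f} bar, Rc={r_cake:.0e}: "
              f"J={permeate_flux(tmp, r_cake):6.1f} LMH")
```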

Keywords: ceramic membrane, membrane fouling, oil rejection, produced water treatment

Procedia PDF Downloads 173
924 The Effect of Velocity Increment by Blockage Factor on Savonius Hydrokinetic Turbine Performance

Authors: Thochi Seb Rengma, Mahendra Kumar Gupta, P. M. V. Subbarao

Abstract:

Hydrokinetic turbines can be used to produce power in inaccessible villages located near rivers. A hydrokinetic turbine uses the kinetic energy of the water and may be placed directly in the natural flow without dams. For off-grid power production, the Savonius-type vertical axis turbine is the easiest to design and manufacture. This work uses three-dimensional computational fluid dynamics (CFD) simulations to capture the considerable interaction and complexity of the turbine blades. Savonius hydrokinetic turbine (SHKT) performance is affected by blockage in rivers, canals, and waterways: putting a large object in a water channel obstructs the flow and raises the local free stream velocity. The blockage correction factor, or velocity increment, measures the impact of this velocity on the performance. SHKT performance is evaluated by comparing the power coefficient (Cp) with the tip-speed ratio (TSR) at various blockage ratios. The maximum Cp was obtained at a TSR of 1.1 with a blockage ratio of 45%, whereas a TSR of 0.8 yielded the highest Cp without blockage. The greatest Cp of 0.29 was obtained with a 45% blockage ratio, compared to a maximum Cp of 0.18 without blockage.
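
The reported quantities follow from standard definitions, Cp = P / (0.5 rho A V^3) and TSR = omega R / V, with the blockage correction scaling the effective free-stream velocity. The sketch below uses entirely illustrative numbers (rotor size, speed, shaft power, and a 1.2 velocity-increment factor), not the paper's data:

```python
RHO = 998.0  # water density, kg/m^3

def tip_speed_ratio(omega, radius, v):
    return omega * radius / v

def power_coefficient(power, area, v):
    return power / (0.5 * RHO * area * v**3)

# Illustrative rotor: 0.2 m radius, 0.4 m height, free-stream velocity 0.9 m/s.
radius, height, v_free = 0.2, 0.4, 0.9
area = 2 * radius * height       # projected frontal area of the Savonius rotor
omega = 4.5                      # rotor speed, rad/s (illustrative)
shaft_power = 12.0               # measured shaft power, W (illustrative)

# Blockage raises the local velocity seen by the rotor; the 1.2 increment factor
# below is purely illustrative of the velocity-increment idea.
for label, v in (("open channel", v_free), ("45% blockage", 1.2 * v_free)):
    print(f"{label}: TSR = {tip_speed_ratio(omega, radius, v):.2f}, "
          f"Cp = {power_coefficient(shaft_power, area, v):.3f}")
```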

Keywords: savonius hydrokinetic turbine, blockage ratio, vertical axis turbine, power coefficient

Procedia PDF Downloads 117
923 Developing a Framework for Open Source Software Adoption in a Higher Education Institution in Uganda: A Case of Kyambogo University

Authors: Kafeero Frank

Abstract:

This study aimed at developing a framework for open source software adoption in an institution of higher learning in Uganda, with Kyambogo University as the study area. There were four main research questions, based on individual staff interaction with the open source software forum, perceived FOSS characteristics, organizational characteristics and external characteristics as factors that affect open source software adoption. The researcher used a causal-correlational research design to study the effects of these variables on open source software adoption. A quantitative approach was used, with a self-administered questionnaire given to a purposively and randomly sampled selection of university ICT staff. The resulting data were analyzed using means, correlation coefficients and multivariate multiple regression analysis as statistical tools. The study reveals that individual staff interaction with the open source software forum and perceived FOSS characteristics were the primary factors that significantly affect FOSS adoption, while organizational and external factors were secondary, with no significant effect but a significant correlation with open source software adoption. It was concluded that for effective open source software adoption to occur, more effort must be placed on the primary factors, with subsequent reinforcement of the secondary factors. Lastly, recommendations were made, in line with the conclusions, for a Kyambogo University framework for open source software adoption in institutions of higher learning. Recommended areas of further research include: a stakeholders' analysis of open source software adoption in Uganda (challenges and the way forward); an evaluation of the Kyambogo University framework for open source software adoption in institutions of higher learning; framework development for cloud computing adoption in Ugandan universities; and a framework for FOSS development in the Ugandan IT industry.

Keywords: open source software, organisational characteristics, external characteristics, cloud computing adoption

Procedia PDF Downloads 61
922 Highway Capacity and Level of Service

Authors: Kidist Mesfin Nguse

Abstract:

Ethiopia is the second most populous nation in Africa, with about 121 million people according to the 2022 population report. In recent years, the Ethiopian government (GOE) has been gradually growing its road network. With 138,127 kilometers (85,825 miles) of all-weather roads as of the end of 2018–19, Ethiopia possessed just 39% of the nation's necessary road network and lacked a well-organized system. The Ethiopian urban population report records that about 21% of the population lives in urban areas, and the high population, coupled with growth in various infrastructures, has led to the migration of the workforce from rural areas to cities across the country. On main roads, heterogeneous traffic flow with varied operational features is particularly unfavorable, causing frequent congestion along stretches of road. The Level of Service (LOS), a qualitative measure of traffic, is categorized based on the operating conditions in the traffic stream. Determining the capacity and LOS for this city is crucial, as it affects the planning, design and operation of traffic systems, and the route selection for infrastructure building projects, so as to provide a considerably good level of service.
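
For basic freeway segments, the Highway Capacity Manual assigns LOS grades A-F from traffic density (flow divided by speed). A hedged sketch using the commonly cited HCM density breakpoints in passenger cars per mile per lane; the flows and speed are illustrative, not measurements from Irbid or Ethiopia:

```python
# Commonly cited HCM density breakpoints for basic freeway segments, pc/mi/ln.
LOS_BREAKPOINTS = [(11, "A"), (18, "B"), (26, "C"), (35, "D"), (45, "E")]

def level_of_service(flow_pc_h_ln, speed_mi_h):
    """Classify LOS from density = flow / speed; 'F' beyond the LOS E breakpoint."""
    density = flow_pc_h_ln / speed_mi_h
    for limit, grade in LOS_BREAKPOINTS:
        if density <= limit:
            return grade, density
    return "F", density

for flow in (600, 1200, 1800, 2300):  # pc/h/ln, illustrative demand levels
    grade, density = level_of_service(flow, speed_mi_h=60.0)
    print(f"flow={flow:4d} pc/h/ln -> density={density:4.1f} pc/mi/ln, LOS {grade}")
```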

Keywords: capacity, level of service, traffic volume, free flow speed

Procedia PDF Downloads 39
921 A Survey on Constraint Solving Approaches Using Parallel Architectures

Authors: Nebras Gharbi, Itebeddine Ghorbel

Abstract:

In recent years, with the advancement of the multicore computing world, the constraint programming community has tried to benefit from the capacity of new machines and make the best use of them through several parallel schemes for constraint solving. In this paper, we propose a survey of the different approaches to solving Constraint Satisfaction Problems using parallel architectures. These approaches exploit a parallel architecture in different ways: the same problem may be solved concurrently by several competing solvers, or the problem may be split across solvers.
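
The second scheme, search-space splitting, can be illustrated with a toy backtracking solver whose top-level variable domain is partitioned across worker processes. This is a sketch of the general idea, not of any surveyed system; n-queens stands in for an arbitrary CSP:

```python
from multiprocessing import Pool

N = 8  # toy CSP: the n-queens problem as a constraint satisfaction instance

def consistent(partial, col):
    """Check column and diagonal constraints for placing a queen in the next row."""
    row = len(partial)
    return all(c != col and abs(c - col) != row - r for r, c in enumerate(partial))

def backtrack(partial):
    if len(partial) == N:
        return partial
    for col in range(N):
        if consistent(partial, col):
            sol = backtrack(partial + [col])
            if sol:
                return sol
    return None

def solve_subtree(first_col):
    """Search-space splitting: each worker owns one value of the first variable."""
    return backtrack([first_col])

if __name__ == "__main__":
    with Pool(4) as pool:
        # Results arrive as subtrees finish; keep the first solution found.
        for sol in pool.imap_unordered(solve_subtree, range(N)):
            if sol:
                print("solution:", sol)
                pool.terminate()
                break
```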

Keywords: constraint programming, parallel programming, constraint satisfaction problem, speed-up

Procedia PDF Downloads 305
920 Comparison of Salt-Water Intrusion into Eastern and Western Coastal Aquifers of Urmia Lake through Over-Exploitation of Groundwater Resources

Authors: Saman Javadi, Mohammad Hassan Mahmoudi, Fatemeh Jafari, Aminreza Neshat

Abstract:

Urmia Lake's water level has dropped during the past decade. Although climate change is the most commonly cited reason, observations of adjacent lakes (such as Lake Van in Turkey) are not consistent with this explanation alone. Most studies have focused on climate and land use change, while groundwater resources, one of the most important elements, have been neglected. Due to population growth and agricultural activity, the exploitation of groundwater resources has increased. Inasmuch as a continued decline in water levels can lead to saltwater intrusion, reduce stream discharge near outcrop regions and threaten groundwater quality, the aquifers of this region have been affected by saltwater intrusion from Urmia Lake. In this research, saltwater intrusion into the eastern and western coastal aquifers was compared. In conclusion, the eastern aquifers are in a critical situation, whereas the western ones are in a better state. Thus, groundwater management measures are necessary for the eastern aquifers.
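
A first-order way to see why over-pumping drives intrusion is the Ghyben-Herzberg relation, under which the fresh/salt interface sits roughly 40 m below sea (or lake) level for every metre of freshwater head. The relation is a standard hydrogeological approximation, not the paper's model, and the heads below are illustrative:

```python
RHO_FRESH, RHO_SALT = 1000.0, 1025.0
GH_FACTOR = RHO_FRESH / (RHO_SALT - RHO_FRESH)  # = 40 for these densities

def interface_depth(head_m):
    """Ghyben-Herzberg depth of the fresh/salt interface below the datum."""
    return GH_FACTOR * head_m

for head in (2.0, 1.5, 1.0, 0.5):  # declining water-table head from over-pumping, m
    print(f"head={head:3.1f} m -> interface at {interface_depth(head):5.1f} m depth")
print(f"every 0.5 m of drawdown lifts the interface by {interface_depth(0.5):.0f} m")
```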

Keywords: coastal aquifer, groundwater over-exploration, saltwater intrusion, Urmia Lake

Procedia PDF Downloads 527
919 Discerning Divergent Nodes in Social Networks

Authors: Mehran Asadi, Afrand Agah

Abstract:

In data mining, partitioning is used as a fundamental tool for classification, and recursive partitioning is used to probe the structure of a data set, which allows us to envision decision rules that can be applied to classification trees. In this research, we used an online social network dataset and all of its attributes (e.g., node features, labels, etc.) to determine what constitutes an above-average chance of a node being divergent. The data were found in the UC Irvine Machine Learning Repository, and we used the R statistical computing language to conduct the analyses in this report. This research introduces the basic concepts of classification in online social networks: the main objective is to categorize different items and assign them to groups based on their properties and similarities. We discuss overfitting and describe different approaches for evaluating and comparing the performance of classification methods. Estimating densities is hard, especially in high dimensions with limited data; although the true densities are unknown, they can be estimated using classical techniques. First, we calculated the correlation matrix of the dataset to see whether any predictors are highly correlated with one another; the correlation coefficients for the predictor variables show that density is strongly correlated with transitivity. We then initialized a data frame to compare the quality of the classification methods and applied decision trees, with k-fold cross-validation to prune the tree. A decision tree is a non-parametric classification method, which uses a set of rules to predict that each observation belongs to the most commonly occurring class label of the training data. Our method aggregates many decision trees to create an optimized model that is not susceptible to overfitting. When using a decision tree, however, it is important to use cross-validation to prune the tree in order to narrow it down to the most important variables.
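
The modeling step can be rendered as follows (the study used R; this Python sketch on synthetic data is an equivalent stand-in, since the node-feature table is not reproduced here): a decision tree pruned by k-fold cross-validation over the cost-complexity path.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the node-feature table (the UCI dataset is not bundled here).
X, y = make_classification(n_samples=600, n_features=10, n_informative=4,
                           random_state=0)

# k-fold CV over the cost-complexity pruning path, as a proxy for tree pruning.
base = DecisionTreeClassifier(random_state=0).fit(X, y)
path = base.cost_complexity_pruning_path(X, y)
alphas = np.clip(path.ccp_alphas[:-1], 0, None)  # drop the root-only tree
scores = [cross_val_score(DecisionTreeClassifier(ccp_alpha=a, random_state=0),
                          X, y, cv=5).mean() for a in alphas]
best = alphas[int(np.argmax(scores))]
print(f"best ccp_alpha={best:.4f}, CV accuracy={max(scores):.3f}")

pruned = DecisionTreeClassifier(ccp_alpha=best, random_state=0).fit(X, y)
print("pruned tree depth:", pruned.get_depth())
```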

Keywords: online social networks, data mining, social cloud computing, interaction and collaboration

Procedia PDF Downloads 144
918 Simulation of the FDA Centrifugal Blood Pump Using High Performance Computing

Authors: Mehdi Behbahani, Sebastian Rible, Charles Moulinec, Yvan Fournier, Mike Nicolai, Paolo Crosetto

Abstract:

Computational Fluid Dynamics (CFD) blood-flow simulations are increasingly used to develop and validate blood-contacting medical devices. This study shows that numerical simulations can provide additional and accurate estimates of relevant hemodynamic indicators (e.g., recirculation zones or wall shear stresses), which may be difficult and expensive to obtain from in-vivo or in-vitro experiments. The most recent FDA (Food and Drug Administration) benchmark consists of a simplified centrifugal blood pump model that contains the fluid flow features commonly found in these devices, with a clear focus on highly turbulent phenomena. The FDA centrifugal blood pump study is composed of six test cases with different volumetric flow rates ranging from 2.5 to 7.0 liters per minute, different pump speeds, and Reynolds numbers ranging from 210,000 to 293,000. Within the frame of this study, different turbulence models were tested, including RANS models, e.g. k-omega, k-epsilon and a Reynolds Stress Model (RSM), as well as LES. The partitioners Hilbert, METIS, ParMETIS and SCOTCH were used to create an unstructured mesh of 76 million elements and were compared in terms of their efficiency. Computations were performed on the JUQUEEN BG/Q architecture with the highly parallel flow solver Code_Saturne, typically using 32768 or more processors in parallel. Visualisations were performed by means of ParaView. All six flow situations could be successfully analysed with the different turbulence models and validated against analytical considerations and by comparison with other databases. The results showed that an RSM represents an appropriate choice for modeling high-Reynolds-number flow cases; in particular, the Rij-SSG (Speziale, Sarkar, Gatski) variant turned out to be a good approach. Visualisation of complex flow features could be obtained, and the flow situation inside the pump could be characterized.

Keywords: blood flow, centrifugal blood pump, high performance computing, scalability, turbulence

Procedia PDF Downloads 375
917 Rainfall–Runoff Simulation Using WetSpa Model in Golestan Dam Basin, Iran

Authors: M. R. Dahmardeh Ghaleno, M. Nohtani, S. Khaledi

Abstract:

Flood simulation and prediction is one of the most active research areas in surface water management. WetSpa is a distributed, continuous, physically based model with a daily or hourly time step that describes the precipitation, runoff, and evapotranspiration processes for both simple and complex contexts. The model uses a modified rational method for runoff calculation, and runoff is routed along the flow path using the diffusion-wave equation, which depends on the slope, velocity, and flow route characteristics. The Golestan Dam Basin is located in Golestan province in Iran, between coordinates 55°16′50″ to 56°4′25″ E and 37°19′39″ to 37°49′28″ N. The area of the catchment is about 224 km2, elevations in the catchment range from 414 to 2856 m at the outlet, and the average slope is 29.78%. Results of the simulations show a good agreement between calculated and measured hydrographs at the outlet of the basin. Based on the Nash-Sutcliffe model efficiency coefficient for the calibration period, the model estimated daily hydrographs and maximum flow rate with accuracies of up to 59% and 80.18%, respectively.
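
The calibration metric used here, the Nash-Sutcliffe efficiency NSE = 1 - sum((Qobs - Qsim)^2) / sum((Qobs - mean(Qobs))^2), is straightforward to compute; a sketch with illustrative discharge series (not the Golestan records):

```python
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((Qo - Qs)^2) / sum((Qo - mean(Qo))^2); 1.0 is a perfect fit."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# Illustrative daily discharge values (m^3/s), not the Golestan records.
q_obs = [3.1, 4.8, 12.6, 9.4, 6.2, 4.0, 3.5]
q_sim = [2.8, 5.5, 10.9, 10.1, 5.7, 4.4, 3.2]
print(f"NSE = {nash_sutcliffe(q_obs, q_sim):.2f}")
```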

Keywords: watershed simulation, WetSpa, stream flow, flood prediction

Procedia PDF Downloads 236
916 Economic Analysis of a Carbon Abatement Technology

Authors: Hameed Rukayat Opeyemi, Pericles Pilidis, Pagone Emmanuele, Agbadede Roupa, Allison Isaiah

Abstract:

Climate change represents one of the single most challenging problems facing the world today. According to the National Oceanic and Atmospheric Administration, atmospheric temperature has risen almost 25% since 1958, Arctic sea ice has shrunk 40% since 1959, and global sea levels have risen more than 5.5 cm since 1990. Power plants are the major culprits of GHG emissions to the atmosphere. Several technologies have been proposed to reduce the amount of GHG emitted to the atmosphere from power plants, one of which is the less-researched advanced zero-emission power plant (AZEP). Advanced zero-emission power plants make use of a mixed conductive membrane (MCM) reactor, also known as an oxygen transfer membrane (OTM), for oxygen transfer. The MCM employs a membrane separation process, first introduced in 1899 when Walter Hermann Nernst investigated electric current between metals and solutions: he found that when a dense ceramic is heated, a current of oxygen moves through it. In a bid to curb the amount of GHG emitted to the atmosphere, the membrane separation process was applied to the field of power engineering in the low-carbon cycle known as the AZEP cycle. The AZEP cycle was originally invented by Norsk Hydro, Norway, and ABB Alstom Power (now known as Demag Delaval Industrial Turbomachinery AB), Sweden. The AZEP drew a lot of attention because of its ability to capture ~100% of CO2; it also boasts a 30-50% cost reduction compared to other carbon abatement technologies, its efficiency penalty is not as large as that of its counterparts, and it achieves almost zero NOx emissions due to the very low nitrogen concentrations in the working fluid. An advanced zero-emission power plant differs from a conventional gas turbine in that its combustor is substituted with the MCM reactor. The MCM reactor is made up of the combustor, the low-temperature heat exchanger (LTHX, referred to by some authors as the air preheater), the mixed conductive membrane responsible for oxygen transfer, the high-temperature heat exchanger and, in some layouts, the bleed gas heat exchanger. Air is taken in by the compressor and compressed to a temperature of about 723 K and a pressure of 2 MPa. The membrane area needed for oxygen transfer is reduced by increasing the temperature of 90% of the air using the LTHX; the higher temperature also facilitates oxygen transfer through the membrane, since the amount of oxygen transported through the membrane is directly proportional to the temperature of the air passing through it. The air stream enters the LTHX through the transition duct leading to its inlet, and its temperature is then increased to about 1150 K, depending on the design point specification of the plant and the efficiency of the heat exchanging system. The AZEP cycle was modelled using Fortran, and the economic analysis was conducted using Excel and MATLAB, followed by an optimization case study. Four possible layouts were considered: the simple bleed gas heat exchange layout (100% CO2 capture); the bleed gas heat exchanger layout with flue gas turbine (100% CO2 capture); the pre-expansion reheating (sequential burning) layout, AZEP 85 (85% CO2 capture); and the pre-expansion reheating (sequential burning) layout with flue gas turbine, AZEP 85 (85% CO2 capture). This paper discusses a Monte Carlo risk analysis of these four layouts of the AZEP cycle.
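
The Monte Carlo risk analysis mentioned at the end can be sketched as repeated NPV evaluation under sampled cost and price uncertainties. All distributions and figures below are invented placeholders for illustration, not the paper's inputs:

```python
import random

random.seed(1)

def npv(cash_flows, rate):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def sample_npv():
    """One Monte Carlo draw of a 20-year AZEP-style plant economy (placeholder numbers)."""
    capex = random.gauss(-450e6, 40e6)           # overnight cost, USD
    fuel_price = random.gauss(6.0, 1.2)          # USD/GJ of natural gas
    carbon_credit = random.uniform(20.0, 60.0)   # USD per tonne CO2 captured
    annual = (120e6                              # electricity revenue, USD/yr
              - fuel_price * 14e6                # fuel bill for ~14e6 GJ/yr of gas
              + carbon_credit * 1.1e6)           # credit for ~1.1 Mt CO2/yr captured
    return npv([capex] + [annual] * 20, rate=0.08)

draws = sorted(sample_npv() for _ in range(20_000))
print(f"P(NPV<0) = {sum(d < 0 for d in draws) / len(draws):.1%}")
print(f"P5 / P50 / P95 NPV (MUSD): {draws[1000]/1e6:,.0f} / "
      f"{draws[10_000]/1e6:,.0f} / {draws[19_000]/1e6:,.0f}")
```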

Keywords: gas turbine, global warming, greenhouse gas, fossil fuel power plants

Procedia PDF Downloads 388
915 Landslide Susceptibility Mapping Using Soft Computing in Amhara Saint

Authors: Semachew M. Kassa, Africa M Geremew, Tezera F. Azmatch, Nandyala Darga Kumar

Abstract:

Because landslides can seriously harm both the environment and society, landslide susceptibility maps are commonly developed from past landslide failure points, for example with frequency ratio (FR) and analytical hierarchy process (AHP) methods. However, it is still difficult to select the most efficient method and correctly identify the main driving factors for particular regions. In this study, we used fourteen landslide conditioning factors (LCFs) and five soft computing algorithms, including Random Forest (RF), Support Vector Machine (SVM), Logistic Regression (LR), Artificial Neural Network (ANN), and Naïve Bayes (NB), to predict the landslide susceptibility at 12.5 m spatial scale. The performance of the RF (F1-score: 0.88, AUC: 0.94), ANN (F1-score: 0.85, AUC: 0.92), and SVM (F1-score: 0.82, AUC: 0.86) methods was significantly better than that of the LR (F1-score: 0.75, AUC: 0.76) and NB (F1-score: 0.73, AUC: 0.75) methods, according to the classification results based on the inventory landslide points. The findings also showed that around 35% of the study region consists of places with high and very high landslide risk (susceptibility greater than 0.5). The very high-risk locations were primarily found in the western and southeastern regions, and all five models showed good agreement and similar geographic distribution patterns in landslide susceptibility. The areas with the highest landslide risk include the western part of Amhara Saint Town, the northern part, and the St. Gebreal Church villages, with mean susceptibility values greater than 0.5. Rainfall, distance to road, and slope were typically among the top leading factors for most villages, although the primary contributing factors to landslide vulnerability varied slightly across the five models. Decision-makers and policy planners can use the information from our study to make informed decisions and establish policies, and different places should take different safeguards to reduce or prevent serious damage from landslide events.
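
The scoring pipeline behind the reported F1/AUC values can be sketched with scikit-learn on synthetic conditioning factors; the 14 real LCF rasters and the inventory points are not reproduced here, so the numbers printed below are illustrative only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in: 14 conditioning factors, binary landslide/non-landslide label.
X, y = make_classification(n_samples=2000, n_features=14, n_informative=6,
                           weights=[0.65, 0.35], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y,
                                          random_state=0)

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
prob = rf.predict_proba(X_te)[:, 1]          # continuous susceptibility in [0, 1]
print(f"F1  = {f1_score(y_te, prob > 0.5):.2f}")
print(f"AUC = {roc_auc_score(y_te, prob):.2f}")

# 'High'/'very high' share, mirroring the paper's susceptibility > 0.5 cut-off.
print(f"share with susceptibility > 0.5: {np.mean(prob > 0.5):.0%}")
```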

Keywords: artificial neural network, logistic regression, landslide susceptibility, naïve Bayes, random forest, support vector machine

Procedia PDF Downloads 64
914 A Two-Step, Temperature-Staged, Direct Coal Liquefaction Process

Authors: Reyna Singh, David Lokhat, Milan Carsky

Abstract:

World crude oil demand is projected to rise to 108.5 million bbl/d by the year 2035. With reserves estimated at 869 billion tonnes worldwide, coal is an abundant resource. This work was aimed at producing a high-value hydrocarbon liquid product from the Direct Coal Liquefaction (DCL) process at comparatively mild operating conditions. A temperature-staged hydrogenation approach was investigated. In a two-reactor lab-scale pilot plant facility, the objectives included maximising thermal dissolution of the coal in the presence of a hydrogen donor solvent in the first stage, and subsequently promoting hydrogen saturation and hydrodesulphurization (HDS) performance in the second. The feed slurry consisted of high-grade, pulverized bituminous coal on a moisture-free basis with a size fraction of < 100 μm, and Tetralin mixed in 2:1 and 3:1 solvent/coal ratios. Magnetite (Fe3O4) at 0.25 wt% of the dry coal feed was added for the catalysed runs. For both stages, hydrogen gas was used to maintain a system pressure of 100 barg. In the first stage, temperatures of 250℃ and 300℃ and reaction times of 30 and 60 minutes were investigated in an agitated batch reactor. The first-stage liquid product was pumped into the second-stage vertical reactor, which was designed to counter-currently contact the hydrogen-rich gas stream and the incoming liquid flow in the fixed catalyst bed. Two commercial hydrotreating catalysts, Cobalt-Molybdenum (CoMo) and Nickel-Molybdenum (NiMo), were compared in terms of their conversion, selectivity and HDS performance at temperatures 50℃ higher than in the respective first-stage tests. The catalysts were activated at 300°C with a hydrogen flowrate of approximately 10 ml/min prior to testing. A gas-liquid separator at the outlet of the reactor ensured that the gas was exhausted to the online VARIOplus gas analyser. The liquid was collected and sampled for analysis using Gas Chromatography-Mass Spectrometry (GC-MS). Internal-standard quantification of the sulphur content, the BTX (benzene, toluene, and xylene) and alkene quality, and the alkane and polycyclic aromatic hydrocarbon (PAH) compounds in the liquid products followed ASTM standards of practice for hydrocarbon analysis. In the first stage, using a 2:1 solvent/coal ratio, increased coal-to-liquid conversion was favoured by the lower operating temperature of 250℃, a 60-minute reaction time, and a system catalysed by magnetite; Tetralin functioned effectively as the hydrogen donor solvent. A 3:1 ratio favoured increased concentrations of the long-chain alkanes undecane and dodecane, the unsaturated alkenes octene and nonene, and PAH compounds such as indene. The second-stage product distribution showed an increase in the BTX quality of the liquid product and in branched-chain alkanes, and a reduction in the sulphur concentration. In terms of HDS performance and selectivity to long and branched-chain alkanes, NiMo performed better than CoMo, while CoMo was selective to a higher concentration of cyclohexane. Over 16 days on stream each, NiMo had a higher activity than CoMo. The potential of the process to cover the demand for low-sulphur crude diesel and solvents through the production of high-value hydrocarbon liquid is thus demonstrated.

Keywords: catalyst, coal, liquefaction, temperature-staged

Procedia PDF Downloads 639
913 Feasibility Study to Enhance the Heat Transfer in a Typical Pressurized Water Reactor by Ribbed Spacer Grids

Authors: A. Ghadbane, M. N. Bouaziz, S. Hanini, B. Baggoura, M. Abbaci

Abstract:

Spacer grids are used to fix the rod bundle in a nuclear reactor core and also act as turbulence-enhancing devices that improve the heat transfer from the hot surfaces of the rods to the surrounding coolant stream. Therefore, the investigation of the thermal-hydraulic characteristics inside rod bundles is important for the optimal design and safe operation of a nuclear power plant. This contribution presents a feasibility study of using ribbed spacer grids as mixing devices. The present study evaluates the effects of different ribbed spacer grid configurations on the flow pattern and heat transfer downstream of the mixing devices in a 2 x 2 rod bundle array. This is done by obtaining the velocity and pressure fields, turbulence intensity and the heat transfer coefficient using a three-dimensional CFD analysis. Numerical calculations are performed employing the k-ε turbulence model. The computational results obtained are promising, and the comparison with standard spacer grids shows a clear difference, which requires experimental validation.

Keywords: PWR fuel assembly, spacer grid, mixing vane, swirl flow, turbulent heat transfer, CFD

Procedia PDF Downloads 482
912 Axisymmetric Rotating Flow over a Permeable Surface with Heat and Mass Transfer Effects

Authors: Muhammad Faraz, Talat Rafique, Jang Min Park

Abstract:

In this article, rotational flow above a permeable surface with a variable free stream angular velocity is considered. The main interest is to solve the associated heat/mass transport equations under different situations. Firstly, heat transport phenomena occurring in generalized vortex flow are analyzed under two different heating processes, namely (i) prescribed surface temperature and (ii) prescribed heat flux. The vortex motion imposed at infinity is assumed to follow a power-law form (r/r_0)^(2n-1), where r denotes the radial coordinate, r_0 the disk radius, and n a power-law parameter. Assuming a self-similar solution, the governing Navier-Stokes equations transform into a set of coupled ODEs, which are treated numerically for the aforementioned thermal conditions. Secondly, mass transport phenomena accompanied by activation energy are incorporated into the generalized vortex flow situation. After deriving the self-similar equations, a numerical solution is furnished by using MATLAB's built-in function bvp4c.
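
In Python, the analogue of bvp4c is scipy's solve_bvp. The sketch below solves only the classical Bödewadt limit of such problems (rigid rotation at infinity, i.e. n = 1, with an optional wall-transpiration parameter standing in for permeability); the generalized power-law forcing and the thermal/mass equations of the paper are left out, and convergence can be sensitive to the initial guess and domain length.

```python
import numpy as np
from scipy.integrate import solve_bvp

WS = 0.0  # wall suction/injection parameter; 0 corresponds to an impermeable surface

def rhs(eta, y):
    """Classical Bödewadt similarity system: y = [F, F', G, G', H]."""
    F, Fp, G, Gp, H = y
    return np.vstack([Fp,
                      F**2 - G**2 + H * Fp + 1.0,   # radial momentum
                      Gp,
                      2.0 * F * G + H * Gp,          # azimuthal momentum
                      -2.0 * F])                     # continuity

def bc(ya, yb):
    # No-slip and wall transpiration at eta = 0; rigid rotation far from the wall.
    return np.array([ya[0], ya[2], ya[4] - WS, yb[0], yb[2] - 1.0])

eta = np.linspace(0.0, 20.0, 400)
guess = np.zeros((5, eta.size))
guess[2] = 1.0 - np.exp(-eta)            # G rises to the far-field rotation
guess[0] = -0.5 * eta * np.exp(-eta)     # inward radial flow near the wall

sol = solve_bvp(rhs, bc, eta, guess, max_nodes=20000, tol=1e-6)
print("converged:", sol.status == 0)
print("wall azimuthal shear G'(0) =", sol.sol(0.0)[3])
```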

Keywords: bödewadt flow, vortex flow, rotating flows, prescribed heat flux, permeable surface, activation energy

Procedia PDF Downloads 105
911 A Unified Deep Framework for Joint 3D Pose Estimation and Action Recognition from a Single Color Camera

Authors: Huy Hieu Pham, Houssam Salmane, Louahdi Khoudour, Alain Crouzil, Pablo Zegers, Sergio Velastin

Abstract:

We present a deep learning-based multitask framework for joint 3D human pose estimation and action recognition from color video sequences. Our approach proceeds in two stages. In the first, we run a real-time 2D pose detector to determine the precise pixel locations of important keypoints of the body; a two-stream neural network is then designed and trained to map the detected 2D keypoints to 3D poses. In the second, we deploy the Efficient Neural Architecture Search (ENAS) algorithm to find an optimal network architecture that models the spatio-temporal evolution of the estimated 3D poses via an image-based intermediate representation and performs action recognition. Experiments on the Human3.6M, Microsoft Research Redmond (MSR) Action3D, and Stony Brook University (SBU) Kinect Interaction datasets verify the effectiveness of the proposed method on the targeted tasks. Moreover, we show that our method requires a low computational budget for training and inference.
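
The 2D-to-3D lifting stage can be sketched as a small fully connected network. This is a simplified single-stream stand-in for illustration; the paper's actual design is a two-stream network, followed by ENAS-discovered architectures for the recognition stage, and the joint count and layer sizes below are assumptions.

```python
import torch
import torch.nn as nn

class LiftingNet(nn.Module):
    """Maps detected 2D keypoints (B, J, 2) to 3D joint positions (B, J, 3)."""
    def __init__(self, n_joints=17, hidden=1024, p_drop=0.25):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_joints * 2, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, n_joints * 3))

    def forward(self, kp2d):
        b, j, _ = kp2d.shape
        return self.net(kp2d.reshape(b, -1)).reshape(b, j, 3)

model = LiftingNet()
kp2d = torch.rand(8, 17, 2)          # a batch of detected 2D keypoints
pose3d = model(kp2d)
print(pose3d.shape)                  # torch.Size([8, 17, 3])

# Training would minimize e.g. mean per-joint position error (MPJPE)
# against motion-capture ground truth:
loss = torch.linalg.norm(pose3d - torch.rand_like(pose3d), dim=-1).mean()
loss.backward()
```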

Keywords: human action recognition, pose estimation, D-CNN, deep learning

Procedia PDF Downloads 133
910 The Role of E-Learning in Science, Technology, Engineering, and Math Education

Authors: Annette McArthur

Abstract:

The traditional model of teaching and learning, where ICT sits as a separate entity, is not a model for a 21st century school. It is imperative that teaching and learning embrace technological advancements. The challenge in schools lies in shifting the mindset of teachers so they see ICT as integral to their teaching, learning and curriculum rather than as a separate e-learning curriculum stream. This research project investigates how the effective, planned, intentional integration of ICT into a STEM curriculum can enable this shift in teacher mindset. The project incorporated: • Developing a professional coaching relationship with key STEM teachers. • Facilitating staff professional development involving student-centered, project-based learning pedagogy in the context of a STEM curriculum. • Facilitating staff professional development involving digital literacy. • Establishing a professional community where collaboration, sharing and reflection were part of the culture of the STEM community. • Facilitating classroom support for the effective delivery of an innovative STEM curriculum. • Developing STEM learning spaces where technologies were used to empower and engage learners to participate in student-centered, project-based learning.

Keywords: e-learning, ICT, project based learning, STEM

Procedia PDF Downloads 292