Search results for: grid-interactive efficient buildings (GEB)
2023 Embedded System of Signal Processing on FPGA: Underwater Application Architecture
Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad
Abstract:
The purpose of this paper is to study the phenomenon of acoustic scattering by using a new method. Signal processing (the fast Fourier transform (FFT), the inverse fast Fourier transform (iFFT), and Bessel functions) is widely applied to obtain information with high precision. Signal processing is most commonly implemented on general-purpose processors, which are, however, not efficient for this task. Our interest was therefore focused on the use of FPGAs (Field-Programmable Gate Arrays) in order to minimize the computational complexity of the single-processor architecture, accelerate the processing on the FPGA, and meet real-time and energy-efficiency requirements. We implemented the acoustic backscattered signal processing model on the Altera DE-SoC board and compared it to the Odroid XU4. By comparison, the computing latency of the Odroid XU4 and the FPGA is 60 seconds and 3 seconds, respectively. The detailed SoC FPGA-based system has shown that acoustic spectra are computed up to 20 times faster than with the Odroid XU4 implementation. The FPGA-based implementation of the processing algorithms achieves an absolute error of about 10⁻³. This study underlines the increasing importance of embedded systems in underwater acoustics, especially in non-destructive testing, where information related to the detection and characterization of submerged shells can be obtained. We have thus achieved good experimental results in terms of real-time operation and energy efficiency.
Keywords: DE1 FPGA, acoustic scattering, form function, signal processing, non-destructive testing
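A minimal sketch of the kind of FFT-based pipeline the abstract describes — the signal shapes, sampling rate, and regularized deconvolution step are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

# Assumed example inputs: a recorded backscattered echo and the incident
# pulse, both sampled at fs (hypothetical values for illustration).
fs = 1.0e6                                  # sampling rate [Hz] (assumed)
t = np.arange(2048) / fs
incident = np.sin(2 * np.pi * 150e3 * t) * np.exp(-((t - 200e-6) / 50e-6) ** 2)
echo = 0.3 * np.roll(incident, 300) + 0.05 * np.random.randn(t.size)

# Spectra via FFT; a form-function-like magnitude is estimated by
# deconvolving the incident spectrum (regularized to avoid division by ~0).
E = np.fft.rfft(echo)
I = np.fft.rfft(incident)
eps = 1e-3 * np.max(np.abs(I))
form = np.abs(E) / (np.abs(I) + eps)

freqs = np.fft.rfftfreq(t.size, d=1 / fs)
peak = freqs[np.argmax(form[1:]) + 1]
print(f"spectral peak near {peak / 1e3:.1f} kHz")
```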
Procedia PDF Downloads 82
2022 The Influence of Human Factors Education on the Irish Registered Pre-Hospital Practitioner within the National Ambulance Service
Authors: Desmond Wade, Alfredo Ormazabal
Abstract:
Background: Ever since it commenced its registration process for pre-hospital practitioners in the year 2000 through the Irish Government Statutory Instrument (SI 109 of 2000) process, the approach to the education of its professionals has changed drastically. The progression from the traditional behaviourist to the current constructivist approach has been based on experiences from other sectors and industries, nationally and internationally. Today, the delivery of a safe and efficient ambulance service depends heavily on its practitioners' range of technical skills, academic knowledge, and overall competences. As these increase, so does the level of complexity of paramedics' everyday practice. This has made it inevitable to consider the 'Human Factor' as a source of potential risk and has led formative institutions like the National Ambulance Service College to include it in their curriculum. Methods: This paper used a mixed-method approach, in which both an online questionnaire and a set of semi-structured interviews were the sources of primary data. The data were analysed using qualitative and quantitative techniques. Conclusions: The evidence presented leads to the conclusion that in the National Ambulance Service there is a considerable lack of Human Factors education and that levels of understanding of how to manage Human Factors in practice vary across its spectrum. Paramedic practitioners in Ireland seem to understand that the responsibility for patient care lies with the team, rather than with the most hierarchically senior practitioner present at the scene.
Keywords: human factors, ergonomics, stress, decision making, pre-hospital care, paramedic, education
Procedia PDF Downloads 154
2021 An Optimal Path for Virtual Reality Education Using Association Rules
Authors: Adam Patterson
Abstract:
This study analyzes the self-reported experiences of virtual reality users to develop insight into an optimal learning path for education within virtual reality. The research uses a sample of 1000 observations to statistically define factors influencing (i) immersion level and (ii) motion sickness rating for college-age respondents with virtual reality experience. The paper recommends an efficient duration for each virtual reality session, to minimize sickness and maximize engagement, utilizing modern machine learning methods such as association rules. The goal of this research, in augmentation with previous literature, is to inform logistical decisions relating to the implementation of pilot instruction for virtual reality at the collegiate level. Future research will include a Randomized Control Trial (RCT) to quantify the effect of virtual reality education on student learning outcomes and engagement measures; the current research aims to maximize the treatment effect within the RCT by optimizing the learning benefits of virtual reality. Results suggest significant gender heterogeneity in the likelihood of reporting motion sickness: females are 1.7 times more likely than males to report high levels of motion sickness resulting from a virtual reality experience. Regarding duration, respondents were 1.29 times more likely to select the lowest level of motion sickness after an engagement lasting between 24.3 and 42 minutes. Conversely, respondents engaged between 42 and 60 minutes were 1.2 times more likely to select the higher levels of motion sickness.
Keywords: applications and integration of e-education, practices and cases in e-education, systems and technologies in e-education, technology adoption and diffusion of e-learning
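As an illustration of how association rules can surface duration/sickness patterns like those reported above, here is a minimal sketch using the mlxtend library; the column names, data, and thresholds are invented for the example:

```python
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Hypothetical one-hot encoded session data: each row is a respondent,
# each column a binary attribute (duration bucket, sickness level, gender).
df = pd.DataFrame({
    "dur_24_42min": [1, 1, 0, 1, 0, 1, 0, 1],
    "dur_42_60min": [0, 0, 1, 0, 1, 0, 1, 0],
    "low_sickness": [1, 1, 0, 1, 0, 1, 0, 1],
    "high_sickness": [0, 0, 1, 0, 1, 0, 1, 0],
    "female":        [1, 0, 1, 0, 1, 0, 1, 0],
}).astype(bool)

frequent = apriori(df, min_support=0.2, use_colnames=True)
rules = association_rules(frequent, metric="lift", min_threshold=1.2)
# Inspect rules such as {dur_24_42min} -> {low_sickness}; the lift column
# plays the role of the "X times more likely" ratios in the abstract.
print(rules[["antecedents", "consequents", "support", "confidence", "lift"]])
```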
Procedia PDF Downloads 72
2020 Binderless Naturally-Extracted Metal-Free Electrocatalyst for Efficient NOₓ Reduction
Authors: Hafiz Muhammad Adeel Sharif, Tian Li, Changping Li
Abstract:
Recently, the emission of nitrogen and sulphur oxides (NOₓ, SO₂) has become a global issue, posing serious threats to health and the environment. Catalytic reduction of NOₓ and SOₓ gases into benign gases is considered one of the best approaches. However, regeneration of the catalyst, the high bond-dissociation energy of NOₓ (150.7 kcal/mol), the escape of an intermediate gas (N₂O, a greenhouse gas) with the treated flue gas, and the limited activity of the catalyst remain great challenges. Here, a cheap, binderless, naturally extracted basswood thin carbon electrode (TCE) is presented, which shows excellent catalytic activity towards NOₓ reduction. The basswood was carbonized at 900 °C and then thermally activated in the presence of CO₂ gas at 750 °C. The thermal activation resulted in an increase in epoxy groups on the surface of the TCE and an enhancement of the surface area as well as the degree of graphitization. The TCE has a unique, strongly interconnected 3D network of hierarchical micro-/meso-/macropores that allows a large electrode/electrolyte interface. Owing to these characteristics, the TCE exhibited excellent catalytic efficiency towards NOₓ (~83.3%) under ambient conditions, an enhanced catalytic response under pH variation and sulphite exposure, and excellent stability up to 168 hours. Moreover, a temperature-dependent activity trend was found, with the highest catalytic activity achieved at 80 °C, beyond which the electrolyte became evaporative and performance decreased. The designed electrocatalyst shows great potential for effective NOₓ reduction and is highly cost-effective, green, and sustainable.
Keywords: electrocatalyst, NOₓ reduction, basswood electrode, integrated wet-scrubbing, sustainable
Procedia PDF Downloads 80
2019 Statistical Modeling of Local Area Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes
Authors: Jihad Daba, Jean-Pierre Dubois
Abstract:
Multipath fading noise degrades the performance of cellular communication, most notably in femto- and pico-cells in 3G and 4G systems. When the wireless channel consists of a small number of scattering paths, the statistics of the fading noise are not analytically tractable, which poses a serious challenge to developing closed canonical forms that can be analysed and used in the design of efficient and optimal receivers. In this context, the noise is multiplicative and is referred to as stochastically local fading. Many analytical investigations of multiplicative noise invoke exponential or Gamma statistics. More recent advances by the author of this paper have utilized Poisson-modulated and weighted generalized Laguerre polynomials with controlling parameters under uncorrelated-noise assumptions. In this paper, we investigate the statistics of a multi-diversity, stochastically local area fading channel in which the channel consists of randomly distributed Rayleigh and Rician scattering centers with a coherent specular Nakagami-distributed line-of-sight component and an underlying doubly stochastic Poisson process driven by a lognormal intensity. These combined statistics form a unifying triply stochastic filtered marked Poisson point process model.
Keywords: cellular communication, femto and pico-cells, stochastically local area fading channel, triply stochastic filtered marked Poisson point process
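To make the layered randomness concrete, here is a small Monte Carlo sketch of such a channel: a Poisson number of scatterers whose mean intensity is itself lognormal (the doubly stochastic layer), complex Gaussian marks giving Rayleigh-type diffuse magnitudes, and a Nakagami-amplitude line-of-sight term. All numerical parameters are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_envelope(mu_log=2.0, sigma_log=0.5, m_los=2.0, omega_los=1.0):
    # Layer 1: lognormal random intensity (Cox / doubly stochastic layer).
    lam = rng.lognormal(mean=mu_log, sigma=sigma_log)
    # Layer 2: Poisson number of scattering centers given that intensity.
    n = rng.poisson(lam)
    # Layer 3: marks -- complex gains of the diffuse scatterers
    # (circularly symmetric Gaussian terms give Rayleigh-type magnitudes).
    diffuse = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2 * max(lam, 1.0))
    # Coherent specular LOS component with Nakagami-m amplitude
    # (a Nakagami amplitude is the square root of a Gamma(m, omega/m) variate).
    los_amp = np.sqrt(rng.gamma(shape=m_los, scale=omega_los / m_los))
    los = los_amp * np.exp(1j * rng.uniform(0, 2 * np.pi))
    return np.abs(los + diffuse.sum())

samples = np.array([sample_envelope() for _ in range(100_000)])
print(f"mean envelope {samples.mean():.3f}, std {samples.std():.3f}")
```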
Procedia PDF Downloads 452
2018 Simulation and Controller Tuning in a Photo-Bioreactor Applying the Taguchi Method
Authors: Hosein Ghahremani, MohammadReza Khoshchehre, Pejman Hakemi
Abstract:
This study involves numerical simulations of a vertical plate-type photo-bioreactor to investigate the performance of the microalga Spirulina, together with control and optimization of the digital controller parameters by the Taguchi method, carried out with MATLAB and Qualitek-4 software. Because, in addition to parameters such as temperature, dissolved carbon dioxide, and biomass, new physical parameters such as light intensity and physiological conditions such as photosynthetic efficiency and light inhibition are involved in the biological process, control faces many challenges. Photo-bioreactors not only facilitate the efficient commercial production of microalgae as feed for aquaculture and as food supplements, but are also used as a possible platform for the production of active molecules such as antibiotics or innovative anti-tumor agents, for carbon dioxide removal, and for the removal of heavy metals from wastewater. A digital controller is designed to control the light of the bioreactor, and the microalgae growth rate and the carbon dioxide concentration inside the bioreactor are investigated. The optimal values of the controller parameters, obtained from S/N and ANOVA analyses in the Qualitek-4 software, were compared with those from the reaction curve, Cohen-Coon, and Ziegler-Nichols methods. Based on the sum of squared errors obtained for each of the control methods mentioned, the Taguchi method was selected as the best method for controlling the light intensity of the photo-bioreactor. Compared to the other control methods listed, it shows higher stability and a shorter response time.
Keywords: photo-bioreactor, control and optimization, light intensity, Taguchi method
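For reference, the classical tuning rules the study benchmarks against reduce to simple formulas; the sketch below computes PID gains from assumed process-identification results (the numbers are placeholders, not values from the paper):

```python
def ziegler_nichols_pid(ku: float, tu: float) -> dict:
    """Classical closed-loop Ziegler-Nichols PID rules from the
    ultimate gain ku and ultimate oscillation period tu."""
    return {"Kp": 0.6 * ku, "Ti": tu / 2.0, "Td": tu / 8.0}

def cohen_coon_pid(k: float, tau: float, theta: float) -> dict:
    """Cohen-Coon PID rules from a first-order-plus-dead-time fit:
    process gain k, time constant tau, dead time theta."""
    r = theta / tau
    kp = (1.0 / (k * r)) * (4.0 / 3.0 + r / 4.0)
    ti = theta * (32 + 6 * r) / (13 + 8 * r)
    td = theta * 4 / (11 + 2 * r)
    return {"Kp": kp, "Ti": ti, "Td": td}

# Placeholder identification results, purely for illustration.
print(ziegler_nichols_pid(ku=4.0, tu=30.0))
print(cohen_coon_pid(k=1.5, tau=120.0, theta=15.0))
```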
Procedia PDF Downloads 398
2017 Considerations upon Structural Health Monitoring of Small to Medium Wind Turbines
Authors: Nicolae Constantin, Ştefan Sorohan
Abstract:
Small and medium wind turbines run under quite different conditions compared to big ones. Consequently, they also need a different approach concerning structural health monitoring (SHM) issues. There are four main differences between the above-mentioned categories: (i) significantly smaller dimensions, (ii) considerably higher rotation speed, (iii) generally small distance between the turbine and the energy consumer, and (iv) monitoring assumed in many situations by the owner. In such conditions, non-destructive inspections (NDI) have to be made as much as possible with affordable, yet effective techniques, requiring portable and accessible equipment. Additionally, the turbines and accessories should be easy to mount, dismount, and repair. As the materials used for such units can be metals, composites, or a combination of the two, the technologies should be adapted accordingly. An example in which the two materials co-exist is the situation in which the damaged metallic skin of a blade is repaired with a composite patch. The paper presents the inspection of the bonding state of the patch using portable ultrasonic equipment able to implement the Lamb wave method, which proves efficient in global and local inspections alike. The equipment is relatively easy to handle and can be borrowed from specialized laboratories or shared by a community of small wind turbine users, as the case may be. This evaluation is the first in a series aimed at assessing the efficiency of NDI performed with rather accessible, less sophisticated equipment and related inspection techniques having field inspection capabilities. The main goal is to extend such inspection procedures to other components of the wind power unit, such as the support tower, water storage tanks, etc.
Keywords: structural health monitoring, small wind turbines, non-destructive inspection, field inspection capabilities
Procedia PDF Downloads 342
2016 The Analysis of Emergency Shutdown Valves Torque Data in Terms of Its Use as a Health Indicator for System Prognostics
Authors: Ewa M. Laskowska, Jorn Vatn
Abstract:
Industry 4.0 focuses on the digital optimization of industrial processes. The idea is to use extracted data to build a decision support model enabling real-time decision making. In terms of predictive maintenance, the desired decision support tool would be a model enabling prognostics of a system's health based on the current condition of the equipment considered. Within the area of system prognostics and health management, a commonly used health indicator is the Remaining Useful Lifetime (RUL) of a system. Because the RUL is a random variable, it has to be estimated based on available health indicators. Health indicators can be of different types and come from different sources: they can be process variables, equipment performance variables, data related to the number of experienced failures, etc. The aim of this study is the analysis of performance variables of emergency shutdown valves (ESVs) used in the oil and gas industry. An ESV is inspected periodically, and at each inspection the torque and the time of valve operation are registered. The data will be analyzed by means of machine learning or statistical analysis. The purpose is to investigate whether the available data could be used as a health indicator for prognostic purposes. The second objective is to examine the most efficient way to incorporate the data into a predictive model. The idea is to check whether the data can be applied in the form of explanatory variables in a Markov process or whether other stochastic processes would be more convenient for building an RUL model based on the information coming from the registered data.
Keywords: emergency shutdown valves, health indicator, prognostics, remaining useful lifetime, RUL
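One simple way to turn periodic torque inspections into an RUL estimate is to fit a degradation trend and extrapolate it to a failure threshold; the sketch below does exactly that under an assumed linear-drift model, with made-up inspection data (a Markov or gamma-process model, as discussed above, would replace the trend fit in a fuller analysis):

```python
import numpy as np

# Hypothetical inspection history: months since installation vs. measured
# closing torque (Nm). The failure threshold is likewise an assumption.
t = np.array([0, 6, 12, 18, 24, 30], dtype=float)
torque = np.array([100.0, 104.0, 109.0, 115.0, 122.0, 131.0])
threshold = 160.0  # torque level treated as functional failure

# Fit a linear degradation trend to the registered torque data.
slope, intercept = np.polyfit(t, torque, deg=1)

# RUL = time remaining until the fitted trend crosses the threshold.
t_fail = (threshold - intercept) / slope
rul = t_fail - t[-1]
print(f"estimated RUL: {rul:.1f} months (trend {slope:.2f} Nm/month)")
```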
Procedia PDF Downloads 95
2015 Analysis of Shallow Foundation Using Conventional and Finite Element Approach
Authors: Sultan Al Shafian, Mozaher Ul Kabir, Khondoker Istiak Ahmad, Masnun Abrar, Mahfuza Khanum, Hossain M. Shahin
Abstract:
For the structural evaluation of shallow foundations, the modulus of subgrade reaction is one of the most widely used and accepted parameters because of its ease of calculation. To determine this parameter, one of the most common field methods is the plate load test. In this field test method, the subgrade modulus is considered for a specific location, and in its application it is assumed that the displacement occurring in one place does not affect other adjacent locations. Because of such assumptions, the modulus of subgrade reaction sometimes forces engineers to overdesign the underground structure, which eventually increases the cost of construction and sometimes leads to failure of the structure. In the present study, the settlement of a shallow foundation has been analyzed using both conventional and numerical analyses. Around 25 plate load tests were conducted on a sand-fill site in Bangladesh to determine the modulus of subgrade reaction of the ground, which was later used to design a shallow foundation considering different depths. After the collection of the field data, the field condition was appropriately simulated in finite element software. Finally, the results obtained from the conventional and numerical approaches were compared, and a significant difference was observed in the settlements. A correlation between the two methods is also proposed at the end of this research work in order to provide the most efficient way to calculate the subgrade modulus of the ground for designing shallow foundations.
Keywords: modulus of subgrade reaction, shallow foundation, finite element analysis, settlement, plate load test
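As background, the plate-load-test workflow the abstract relies on boils down to k = q/δ plus a size correction when moving from the test plate to the real footing; the sketch below uses a Terzaghi-type extrapolation for sands, with illustrative numbers (the 0.3 m plate size and the test values are assumptions):

```python
def subgrade_modulus_from_plate(q_kpa: float, settlement_m: float) -> float:
    """Modulus of subgrade reaction k = bearing pressure / settlement."""
    return q_kpa / settlement_m  # kN/m^3

def extrapolate_to_footing_sand(k_plate: float, b_footing_m: float) -> float:
    """Terzaghi-type size correction for granular soil, assuming a
    standard 0.3 m square test plate: k_B = k_0.3 * ((B + 0.3) / (2B))^2."""
    return k_plate * ((b_footing_m + 0.3) / (2.0 * b_footing_m)) ** 2

# Illustrative plate load test result: 200 kPa producing 5 mm settlement.
k_plate = subgrade_modulus_from_plate(q_kpa=200.0, settlement_m=0.005)
k_footing = extrapolate_to_footing_sand(k_plate, b_footing_m=2.0)
print(f"k_plate = {k_plate:.0f} kN/m^3, k_footing(2 m) = {k_footing:.0f} kN/m^3")
```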
Procedia PDF Downloads 185
2014 Method of Estimating Absolute Entropy of Municipal Solid Waste
Authors: Francis Chinweuba Eboh, Peter Ahlström, Tobias Richards
Abstract:
Entropy, as an outcome of the second law of thermodynamics, measures the level of irreversibility associated with any process. The identification and reduction of irreversibility in the energy conversion process helps to improve the efficiency of the system. The entropy of a pure substance, known as absolute entropy, is determined relative to an absolute reference point and is useful in the thermodynamic analysis of chemical reactions; however, municipal solid waste (MSW) is a structurally complicated material with unknown absolute entropy. In this work, an empirical model to calculate the absolute entropy of MSW based on the content of carbon, hydrogen, oxygen, nitrogen, sulphur, and chlorine on a dry ash-free basis (daf) is presented. The proposed model was derived by statistical analysis from 117 relevant organic substances with known standard entropies, which represent the main constituents of MSW. The substances were divided into different waste fractions, namely food, wood/paper, textiles/rubber, and plastics waste, and the standard entropies of each waste fraction and of the complete mixture were calculated. The correlation derived for the standard entropy of the complete waste mixture was found to be s°ₘₛᵥᵥ = 0.0101C + 0.0630H + 0.0106O + 0.0108N + 0.0155S + 0.0084Cl (kJ·K⁻¹·kg⁻¹), and the present correlation can be used for estimating the absolute entropy of MSW from the elemental composition of the fuel within the ranges 10.3% ≤ C ≤ 95.1%, 0.0% ≤ H ≤ 14.3%, 0.0% ≤ O ≤ 71.1%, 0.0% ≤ N ≤ 66.7%, 0.0% ≤ S ≤ 42.1%, 0.0% ≤ Cl ≤ 89.7%. The model is also applicable to the efficient modelling of a combustion system in a waste-to-energy plant.
Keywords: absolute entropy, irreversibility, municipal solid waste, waste-to-energy
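Since the abstract gives the correlation explicitly, it translates directly into code; the range check mirrors the validity bounds quoted above, while the example composition is invented:

```python
def msw_absolute_entropy(c, h, o, n, s, cl):
    """Absolute entropy of MSW in kJ/(K.kg) from elemental mass
    percentages on a dry ash-free basis, per the correlation above."""
    bounds = {"C": (c, 10.3, 95.1), "H": (h, 0.0, 14.3), "O": (o, 0.0, 71.1),
              "N": (n, 0.0, 66.7), "S": (s, 0.0, 42.1), "Cl": (cl, 0.0, 89.7)}
    for name, (val, lo, hi) in bounds.items():
        if not lo <= val <= hi:
            raise ValueError(f"{name}={val}% outside correlation range [{lo}, {hi}]")
    return (0.0101 * c + 0.0630 * h + 0.0106 * o
            + 0.0108 * n + 0.0155 * s + 0.0084 * cl)

# Illustrative composition (not from the paper): a mixed waste sample.
print(f"s = {msw_absolute_entropy(c=48.0, h=6.5, o=38.0, n=1.2, s=0.3, cl=0.8):.3f} kJ/(K.kg)")
```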
Procedia PDF Downloads 312
2013 Near-Infrared Hyperspectral Imaging Spectroscopy to Detect Microplastics and Pieces of Plastic in Almond Flour
Authors: H. Apaza, L. Chévez, H. Loro
Abstract:
Plastic and microplastic pollution in the human food chain is a serious problem for human health that requires more elaborate techniques able to identify their presence in different kinds of food. Hyperspectral imaging is an optical technique that can detect the presence of different elements in an image and can be used to detect plastics and microplastics in a scene. Doing so requires statistical techniques that need to be evaluated and compared in order to find the most efficient ones. In this work, two problems related to the presence of plastics are addressed: the first is to detect and identify pieces of plastic immersed in almond seeds, and the second is to detect and quantify microplastic in almond flour. To do this, we analyze hyperspectral images taken in the range of 900 to 1700 nm using four hyperspectral unmixing techniques: least squares unmixing (LSU), non-negatively constrained least squares unmixing (NCLSU), fully constrained least squares unmixing (FCLSU), and scaled constrained least squares unmixing (SCLSU). The NCLSU, FCLSU, and SCLSU techniques manage to find the region where the plastic is located and also to quantify the amount of microplastic contained in the almond flour. The SCLSU technique estimated an abundance of 13.03% microplastics and 86.97% almond flour, compared to the 16.66% microplastics and 83.33% almond flour prepared for the experiment. The results show the feasibility of applying near-infrared hyperspectral image analysis to the detection of plastic contaminants in food.
Keywords: food, plastic, microplastic, NIR hyperspectral imaging, unmixing
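A compact sketch of the constrained-unmixing idea: each pixel spectrum is modeled as a non-negative mixture of endmember spectra, solved here with SciPy's NNLS (essentially the NCLSU variant), followed by an SCLSU-style rescaling. The spectra below are synthetic stand-ins, not measured NIR data:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)

# Synthetic endmember spectra over 50 NIR bands: almond flour and plastic.
bands = 50
E = np.column_stack([
    np.linspace(0.8, 0.4, bands),                   # "almond flour" endmember
    0.5 + 0.3 * np.sin(np.linspace(0, 6, bands)),   # "plastic" endmember
])

# A pixel that is truly 85% flour / 15% plastic, plus sensor noise.
true_abund = np.array([0.85, 0.15])
pixel = E @ true_abund + 0.01 * rng.normal(size=bands)

# NCLSU: non-negative least squares per pixel.
abund, _ = nnls(E, pixel)
# SCLSU-style rescaling so abundances sum to one.
scaled = abund / abund.sum()
print(f"estimated abundances: flour {scaled[0]:.2%}, plastic {scaled[1]:.2%}")
```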
Procedia PDF Downloads 134
2012 A Fine-Grained Scheduling Algorithm for Heterogeneous Supercomputing Clusters Based on Graph Convolutional Networks and Proximal Policy Optimization
Authors: Jiahao Zhou, Lei Wang
Abstract:
In heterogeneous supercomputing clusters, designing an efficient scheduling strategy is crucial for enhancing both energy efficiency and workflow execution performance. The dynamic allocation and reclamation of computing resources are essential for improving resource utilization. However, existing studies often allocate fixed resources to jobs prior to execution and maintain these resources until job completion, overlooking the importance of dynamic scheduling. This paper proposes a heterogeneous hierarchical fine-grained scheduling algorithm (HeHiFiS) based on graph convolutional networks (GCN) and proximal policy optimization (PPO) to address issues such as prolonged workflow completion times and low resource utilization in heterogeneous supercomputing clusters. Specifically, a GCN is employed to extract task-dependency features as part of the state information, and the PPO reinforcement learning algorithm is then used to train the scheduling policy. The trained scheduling policy dynamically adjusts scheduling actions during operation based on the continuously changing states of tasks and computing resources. Additionally, we developed a heterogeneous scheduling simulation platform to validate the effectiveness of the proposed algorithm. Experimental results indicate that HeHiFiS, by incorporating resource inheritance and intra-task parallel mechanisms, significantly improves resource utilization. Compared to existing scheduling algorithms, HeHiFiS achieves over a 50% improvement in both job completion and response performance metrics.
Keywords: heterogeneous, dynamic scheduling, GCN, PPO
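The GCN feature-extraction step can be illustrated without any deep-learning framework: one graph-convolution layer is just normalized neighborhood aggregation over the task-dependency DAG. Everything below (the toy DAG, the feature choice, the random weights) is an assumption for illustration, not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task-dependency DAG: task 0 feeds tasks 1 and 2, which both feed 3.
A = np.array([[0, 1, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)
A = A + A.T            # treat dependencies as undirected for aggregation
A_hat = A + np.eye(4)  # add self-loops

# Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, as in Kipf & Welling.
d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt

# Per-task input features (e.g., estimated runtime, memory demand).
H = rng.normal(size=(4, 2))
W = rng.normal(size=(2, 8))  # learnable layer weights

# One GCN layer: aggregate neighbors, project, apply ReLU.
H_out = np.maximum(A_norm @ H @ W, 0.0)
print("task embeddings fed to the PPO policy:", H_out.shape)
```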
Procedia PDF Downloads 8
2011 Bioremediation of Paper Mill Effluent by a Microbial Consortium Comprising Bacterial and Fungal Strains and Optimizing the Effect of Carbon Source
Authors: Priya Tomar, Pallavi Mittal
Abstract:
Bioremediation has been recognized as an environmentally friendly and less expensive method that relies on natural processes for the efficient conversion of hazardous compounds into innocuous products. Pulp and paper mill effluent is one of the most polluting effluents among those produced by polluting industries. The colouring bodies present in the wastewater from pulp and paper mills are organic in nature and comprise wood extractives, tannins, resins, synthetic dyes, lignin, and the lignin degradation products formed by the action of chlorine, which impart an offensive colour to the water. These mills use different chemical processes for paper manufacturing, due to which lignified chemicals are released into the environment; consequently, the chemical oxygen demand (COD) of the emanating stream is quite high. To address this problem, we present new techniques developed to treat paper mill effluents more efficiently. In the present study, we utilized consortia of fungal and bacterial strains, with the treatments named C1, C2, and C3, for the decolourization of paper mill effluent. During the study, the role of a carbon source, i.e., glucose, in decolourization was examined. The results show that a maximum colour reduction of 66.9%, a COD reduction of 51.8%, a TSS reduction of 0.34%, a TDS reduction of 0.29%, and a pH change of 4.2 are achieved by the consortium of Aspergillus niger with Pseudomonas aeruginosa. The data indicate that the consortium of Aspergillus niger with Pseudomonas aeruginosa gives better results with glucose.
Keywords: bioremediation, decolourization, black liquor, mycoremediation
Procedia PDF Downloads 415
2010 Isolation, Purification and Characterisation of Non-Digestible Oligosaccharides Derived from Extracellular Polysaccharide of Antarctic Fungus Thelebolus Sp. IITKGP-BT12
Authors: Abinaya Balasubramanian, Satyabrata Ghosh, Satyahari Dey
Abstract:
Non-digestible oligosaccharides (NDOs) are low-molecular-weight carbohydrates with a degree of polymerization (DP) of 3-20 that are delivered intact to the large intestine. NDOs are gaining attention as effective prebiotic molecules that facilitate the prevention and treatment of several chronic diseases. Recently, NDOs have been obtained by cleaving complex polysaccharides, as this results in high yield and the resulting oligosaccharides tend to display greater bioactivity. Thelebolus sp. IITKGP BT-12, a recently identified psychrophilic ascomycete fungus, has been reported to produce a bioactive extracellular polysaccharide (EPS). The EPS has been shown to possess strong prebiotic activity and anti-proliferative effects. The current study is an attempt to identify and optimise the most suitable method for hydrolysing the above-mentioned novel EPS into NDOs, and to further purify and characterise them. Among physical, chemical, and enzymatic methods, enzymatic hydrolysis was identified as the best, and the optimum hydrolysis conditions obtained using response surface methodology were a reaction time of 24 h, a β-(1,3)-endo-glucanase concentration of 0.53 U, and a substrate concentration of 10 mg/ml. The NDOs were purified using gel filtration chromatography, and their molecular weights were determined using MALDI-TOF; the major fraction was found to have a DP of 7-8. The monomeric units of the NDOs were confirmed to be glucose using TLC and GC-MS/MS analysis. The obtained oligosaccharides proved to be non-digestible when subjected to gastric acidity and to salivary and pancreatic amylases, and hence could serve as efficient prebiotics.
Keywords: characterisation, enzymatic hydrolysis, non-digestible oligosaccharides, response surface methodology
Procedia PDF Downloads 133
2009 A Comparative Study on Deep Learning Models for Pneumonia Detection
Authors: Hichem Sassi
Abstract:
Pneumonia, a respiratory infection, has garnered global attention due to its rapid transmission and relatively high mortality rates. Timely detection and treatment play a crucial role in significantly reducing the mortality associated with pneumonia. Presently, X-ray diagnosis stands out as a reasonably effective method. However, the manual scrutiny of a patient's chest X-ray radiograph by a proficient practitioner usually requires 5 to 15 minutes. In situations where cases are concentrated, this places immense pressure on clinicians to diagnose in time, and relying solely on the visual acumen of imaging doctors proves inefficient, particularly given the low speed of manual analysis. Therefore, the integration of artificial intelligence into the clinical image diagnosis of pneumonia becomes imperative. Additionally, AI recognition is notably rapid, with convolutional neural networks (CNNs) demonstrating superior performance compared to human counterparts in image identification tasks. To conduct our study, we utilized a dataset of chest X-ray images obtained from Kaggle, encompassing a total of 5216 training images and 624 test images, categorized into two classes: normal and pneumonia. Employing five mainstream network algorithms, we undertook a comprehensive analysis to classify these diseases within the dataset and subsequently compared the results. The integration of artificial intelligence, particularly through improved network architectures, stands as a transformative step towards more efficient and accurate clinical diagnoses across various medical domains.
Keywords: deep learning, computer vision, pneumonia, models, comparative study
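The abstract does not name the five networks, so the sketch below simply shows the common transfer-learning pattern such comparisons rely on: a pretrained backbone with a two-class head, repeated per architecture. The ResNet-18 choice, hyperparameters, and dummy batch are illustrative assumptions, not details from the paper:

```python
import torch
import torch.nn as nn
from torchvision import models

# Pretrained backbone with the final layer swapped for 2 classes
# (normal vs. pneumonia); swap in other backbones to compare models.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One training step on a dummy batch standing in for chest X-rays.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))

optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(f"batch loss: {loss.item():.4f}")
```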
Procedia PDF Downloads 67
2008 Simulation of Utility Accrual Scheduling and Recovery Algorithm in Multiprocessor Environment
Authors: A. Idawaty, O. Mohamed, A. Z. Zuriati
Abstract:
This paper presents the development of an event-based Discrete Event Simulation (DES) for a recovery algorithm known as Backward Recovery Global Preemptive Utility Accrual Scheduling (BR_GPUAS). This algorithm implements the Backward Recovery (BR) mechanism as a fault recovery solution within the existing Time/Utility Function/Utility Accrual (TUF/UA) scheduling domain for the multiprocessor environment. The BR mechanism attempts to take faulty tasks back to their initial safe state and then re-executes the affected section of the faulty tasks to enable recovery. Considering that faults may occur in the components of any system, a fault tolerance mechanism that can nullify the erroneous effect is necessary. The current TUF/UA scheduling algorithms use the abortion recovery mechanism and simply abort the erroneous task as their fault recovery solution. None of the existing algorithms in the TUF/UA scheduling domain for the multiprocessor environment have considered transient faults and implemented the BR mechanism as a fault recovery mechanism to nullify the erroneous effect and solve the recovery problem in this domain. The developed BR_GPUAS simulator derives its set of parameters, events, and performance metrics from a detailed analysis of the base model. Simulation results revealed that the BR_GPUAS algorithm can save almost 20-30% of the accumulated utility, making it reliable and efficient for real-time applications in the multiprocessor scheduling environment.
Keywords: real-time system (RTS), time utility function/utility accrual (TUF/UA) scheduling, backward recovery mechanism, multiprocessor, discrete event simulation (DES)
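The event-based DES core such a simulator rests on is compact enough to sketch: a time-ordered event queue driving task completions and fault/recovery events. The task set, fault time, and recovery cost below are invented for illustration, and the rollback is reduced to rescheduling a single finish event:

```python
import heapq

# Event queue entries: (time, sequence_no, kind, task_id)
events = [(2.0, 0, "finish", "T1"), (5.0, 1, "fault", "T2"),
          (6.0, 2, "finish", "T2")]
heapq.heapify(events)
seq = len(events)

utilities = {"T1": 10.0, "T2": 8.0}
accrued = 0.0
REDO_COST = 1.5  # assumed re-execution time of the affected section

while events:
    now, _, kind, task = heapq.heappop(events)
    if kind == "fault":
        # Backward recovery: roll the task back to its safe state and
        # re-execute the affected section rather than aborting it
        # (abortion would forfeit the task's utility entirely).
        events = [e for e in events
                  if not (e[2] == "finish" and e[3] == task)]
        heapq.heapify(events)
        heapq.heappush(events, (now + REDO_COST, seq, "finish", task))
        seq += 1
    else:
        accrued += utilities[task]
        print(f"t={now:.1f}: {task} completed, utility +{utilities[task]}")

print(f"total accrued utility: {accrued}")
```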
Procedia PDF Downloads 308
2007 A Study on Characteristics of Runoff Analysis Methods at the Time of Rainfall in Rural Area, Okinawa Prefecture, Part 2: A Case of Kohatu River in South Central Part of Okinawa Pref.
Authors: Kazuki Kohama, Hiroko Ono
Abstract:
Rainfall in Japan is gradually increasing every year, according to the Japan Meteorological Agency and the Intergovernmental Panel on Climate Change Fifth Assessment Report. This means that the difference in rainfall between the rainy season and other periods is increasing. In addition, an increasing trend of strong rain over short durations clearly appears. In recent years, natural disasters have caused enormous human injuries in various parts of Japan. Regarding water disasters, local heavy rain and floods of large rivers occur frequently, and a policy was adopted to promote both hard and soft emergency disaster prevention measures within a water disaster prevention awareness social reconstruction vision. Okinawa Prefecture, in a subtropical region, suffers torrential rain and water disasters such as river floods several times a year, and these floods occur in specific rivers among all 97 rivers. Shortage of capacity and narrow width are also characteristic of rivers in Okinawa and easily cause river floods in heavy rain. This study focuses on the Kohatu River, which is one of those specific rivers. In fact, the water level greatly rises over the river levee almost once a year, mostly without damage to the surrounding buildings; on the other hand, in some cases the water level reaches the ground floor height of houses, which has happened nine times to date. The purpose of this research is to figure out the relationship between precipitation, surface outflow, and total treated water quantity of the Kohatu River. To this end, we perform hydrological analysis, which is complicated and needs specific details and data, mainly using Geographic Information System (GIS) software and an outflow analysis system. First, we extract the watershed and then divide it into 23 catchment areas to understand how much surface outflow reaches the runoff point in each 10-minute interval. Second, we create a unit hydrograph indicating the surface outflow as a function of flow area and time; this index shows the maximum amount of surface outflow at 2400 to 3000 seconds. Lastly, we compare the values estimated from the unit hydrograph to measured values; we found that the estimated value is usually lower than the measured value because of evaporation and transpiration. In this study, hydrograph analysis was performed using GIS software and an outflow analysis system, and based on these we could clarify the flood time and the amount of surface outflow.
Keywords: disaster prevention, water disaster, river flood, GIS software
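The unit-hydrograph step described above is, numerically, a discrete convolution of effective rainfall with the catchment's unit response; here is a minimal sketch with invented 10-minute data (the response shape is an assumption chosen only to echo the 2400-3000 s peak noted above):

```python
import numpy as np

# Effective rainfall depth per 10-minute step (mm), an invented storm.
rain = np.array([0.0, 2.0, 5.0, 3.0, 1.0, 0.0])

# Assumed unit hydrograph: runoff response (m^3/s per mm of effective
# rain) per 10-minute step, peaking around 40-50 minutes.
uh = np.array([0.02, 0.08, 0.15, 0.20, 0.17, 0.10, 0.05, 0.02])

# Direct runoff hydrograph = rainfall convolved with the unit response.
runoff = np.convolve(rain, uh)
t_peak = 10 * np.argmax(runoff)
print(f"peak discharge {runoff.max():.2f} m^3/s at ~{t_peak} minutes")
```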
Procedia PDF Downloads 143
2006 Assessment of Solid Waste Management in General Mohammed Inuwa Wushishi Housing Estate, Minna, Niger State, Nigeria
Authors: Garba Inuwa Kuta, Mohammed Adamu, Mohammed Ahmed Emigilati, Ibrahim Ishiaku, Kudu Dangana
Abstract:
The study sought to identify the problems of solid waste management in the General Mohammed Inuwa Wushishi Housing Estate. Both broad types of data, secondary and primary, were used in the study, and questionnaires and personal observations were used to collect some of the data. Factors impeding effective and efficient solid waste management were identified. The study revealed that the sack disposal method and open dumping are the most commonly used methods of disposal: about 30.0% of the respondents use the sack disposal method in the estate, while 24.9% dump their refuse on the ground. Wrong attitudes and perceptions of the people about sanitation issues contributed to the solid waste management problems of the General Mohammed Inuwa Wushishi Housing Estate. The majority of households did not educate their members on the need to clean their surroundings and refused to buy drums for waste disposal from the Niger State Environmental Protection Agency (NISEPA) on the basis that the drums are expensive. Virtually all the people depended on NISEPA facilities for the disposal of their household refuse. Solid waste management problems were partly the result of NISEPA's inability to cope with the situation because of a lack of equipment. It was recommended that enlightenment of the people on domestic waste disposal be increased to keep the surroundings clean.
Keywords: housing estate, assessment, solid waste, disposal, management
Procedia PDF Downloads 658
2005 Productive Safety Net Program and Rural Livelihood in Ethiopia
Authors: Desta Brhanu Gebrehiwot
Abstract:
The purpose of this review was to analyze the overall, or combined, effect found by scholarly studies of the impacts of Food for Work (FFW) and the Productive Safety Net Program (PSNP) on farm households' livelihoods in Ethiopia (agricultural investment in the adoption of fertilizer, food security, livestock holding, nutrition, and the program's disincentive effect). In addition to making a critical assessment of the internal and external validity of the existing studies, the review also indicates the possibility of redesigning the program. The method of selecting eligible studies for review was the PICOS (Participants, Intervention, Comparison, Outcomes, and Settings) framework, and the method of analysis was the fixed-effects model under meta-analysis. The findings of this systematic review confirm an overall, or combined, positive and significant impact of the PSNP on fertilizer adoption (combined point estimate = 0.015, standard error = 0.005, variance = 0.000, confidence interval from a lower limit of 0.004 up to an upper limit of 0.026, z-value = 2.726, and p-value = 0.006). The program also had a significant positive impact on the child nutrition of rural households and no significant disincentive effect. However, the program had no significant impact on livestock holdings. Thus, the PSNP is important for households whose livelihoods depend on rain-fed agriculture and which are exposed to rainfall shocks, and it would be better to integrate the program into the national agricultural policy. In addition, most of the studies suggested that the PSNP needs more attention to design and targeting issues in order to be effective and efficient in social protection.
Keywords: meta-analysis, fixed effect model, PSNP, rural-livelihood, Ethiopia
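For readers unfamiliar with the fixed-effects pooling used here, the combined estimate is an inverse-variance weighted mean of the study-level estimates; the sketch below reproduces the mechanics on invented study inputs (only the pooled statistics quoted above come from the review itself):

```python
import numpy as np
from scipy import stats

# Hypothetical per-study effect estimates and standard errors; the method
# mirrors the review, the numbers do not.
est = np.array([0.010, 0.022, 0.013, 0.018])
se = np.array([0.008, 0.010, 0.007, 0.012])

w = 1.0 / se**2                        # inverse-variance weights
pooled = np.sum(w * est) / np.sum(w)   # fixed-effects combined estimate
pooled_se = np.sqrt(1.0 / np.sum(w))
z = pooled / pooled_se
p = 2 * (1 - stats.norm.cdf(abs(z)))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled={pooled:.3f}, se={pooled_se:.3f}, z={z:.2f}, "
      f"p={p:.3f}, 95% CI=({lo:.3f}, {hi:.3f})")
```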
Procedia PDF Downloads 76
2004 Design of Replication System for Computer-Generated Hologram in Optical Component Application
Authors: Chih-Hung Chen, Yih-Shyang Cheng, Yu-Hsin Tu
Abstract:
Holographic optical elements (HOEs) have recently become some of the most suitable components in optoelectronic technology, owing to the requirement for product systems with compact size. Computer-generated holography (CGH) is a well-known technology for HOE production. In some cases, a well-designed diffractive optical element with multifunctional components is also an important need for an advanced optoelectronic system. The spatial light modulator (SLM) is one of the key components with a great capability to display CGH patterns, and it is widely used in various applications, such as image projection systems. As for multifunctional components, such as phase and amplitude modulation of light, a high-resolution hologram produced with a multiple-exposure procedure is also one of the suitable candidates. However, in holographic recording under multiple exposures, the diffraction efficiency of the final hologram is inevitably lower than that obtained with a single-exposure process. In this study, a two-step holographic recording method, comprising master hologram fabrication and replicated hologram production, is designed. Since the diffraction efficiency of multiple-exposure holograms suffers a reduction factor of M² (for M exposures), single exposure is more efficient for hologram replication. In the second step of holographic replication, a stable optical system with one-shot copying is introduced. For commercial application, one may utilize this concept of holographic copying to obtain duplicates of HOEs with higher optical performance.
Keywords: holographic replication, holography, one-shot copying, optical element
Procedia PDF Downloads 157
2003 Life Cycle Assessment of Today's and Future Electricity Grid Mixes of EU27
Authors: Johannes Gantner, Michael Held, Rafael Horn, Matthias Fischer
Abstract:
At the United Nations Climate Change Conference 2015, a global agreement on combating climate change was achieved, stating CO₂ reduction targets for all countries. For instance, the EU targets a reduction of 40 percent in emissions by 2030 compared to 1990. In order to achieve this ambitious goal, the environmental performance of the different European electricity grid mixes is crucial. First, electricity is directly needed for everyone's daily life (e.g., heating, plug loads, mobility), and therefore a reduction of the environmental impacts of the electricity grid mix reduces the overall environmental impacts of a country. Secondly, the manufacturing of every product depends on electricity, so a reduction of the environmental impacts of the electricity mix results in a further decrease in the environmental impacts of every product. As a result, the achievement of the two-degree goal depends strongly on the decarbonization of the European electricity mixes. Currently, the production of electricity in the EU27 is based on fossil fuels and therefore bears a high GWP impact per kWh. Because of the importance of the environmental impacts of the electricity mix, not only today but also in the future, time-dynamic Life Cycle Assessment models for all EU27 countries were set up within the European research projects CommONEnergy and Senskin. As a methodology, a combination of scenario modeling and life cycle assessment according to ISO 14040 and ISO 14044 was conducted. Based on EU27 trends regarding energy, transport, and buildings, the different national electricity mixes were investigated, taking into account future changes such as the amount of electricity generated in each country, changes in electricity carriers, the COP of the power plants, distribution losses, and imports and exports. As results, time-dynamic environmental profiles for the electricity mixes of each country, and for Europe overall, were set up. For each European country, the decarbonization strategies of the electricity mix are critically investigated in order to identify decisions that can lead to negative environmental effects, for instance on the global warming potential of the electricity mix. For example, the withdrawal from the nuclear energy program in Germany, with simultaneous compensation of the missing energy by non-renewable energy carriers like lignite and natural gas, results in an increase in the global warming potential of the electricity grid mix; after just two years, this increase is countervailed by the higher share of renewable energy carriers such as wind power and photovoltaics. Finally, as an outlook, a first qualitative picture is provided illustrating, from an environmental perspective, which country has the highest potential for low-carbon electricity production and, therefore, how investments in a connected European electricity grid could decrease the environmental impacts of the electricity mix in Europe.
Keywords: electricity grid mixes, EU27 countries, environmental impacts, future trends, life cycle assessment, scenario analysis
Procedia PDF Downloads 187
2002 The Performance Improvement of Solar Aided Power Generation System by Introducing the Second Solar Field
Authors: Junjie Wu, Hongjuan Hou, Eric Hu, Yongping Yang
Abstract:
Solar aided power generation (SAPG) technology has been proven to be an efficient way to use solar energy for power generation. In an SAPG plant, a solar field consisting of parabolic solar collectors is normally used to supply solar heat in order to displace the high-pressure/high-temperature extraction steam. To understand the performance of such an SAPG plant, a new simulation model was recently developed by the authors, in which the boiler is treated as a series of heat exchangers, unlike in other previous models. Through simulations using the new model, it was found that the outlet properties of the reheated steam, e.g., its temperature, decrease due to the introduction of the solar heat. These changes make the (lower-stage) turbines work under off-design conditions, and as a result the whole plant's performance may not be optimal. In this paper, a second solar field is proposed to increase the inlet temperature of the steam to be reheated, in order to bring the outlet temperature of the reheated steam back to the design condition. A 600 MW SAPG plant was simulated as a case study using the new model to understand the impact of the second solar field on the plant performance. It was found that the second solar field improves the plant's performance in terms of cycle efficiency and solar-to-electricity efficiency by 1.91% and 6.01%, respectively. The solar-generated electricity produced per aperture area under the design condition was 187.96 W/m², which was 26.14% higher than in the previous design.
Keywords: solar-aided power generation system, off-design performance, coal-saving performance, boiler modelling, integration schemes
Procedia PDF Downloads 293
2001 Accelerating Decision-Making in Oil and Gas Wells: 'A Digital Transformation Journey for Rapid and Precise Insights from Well History Data'
Authors: Linung Kresno Adikusumo, Ivan Ramos Sampe Immanuel, Liston Sitanggang
Abstract:
An excellent well work program in the oil and gas industry can have numerous positive business impacts, contributing to operational efficiency, increased production, enhanced safety, and improved financial performance. In summary, an excellent well work program not only ensures the immediate success of specific projects but also has a broader positive impact on the overall business performance and reputation of the oil and gas company, positioning it for long-term success in a competitive and dynamic industry. Nevertheless, a number of challenges were encountered when developing a good work program, such as the poor quality and lack of integration of well documentation, the incompleteness of well histories, and the low accessibility of well documentation. As a result, well work programs were delivered less accurately, and well damage was managed slowly. Our solution, implementing digital technology by developing a web-based database and application, not only solves those issues but also provides an easy-to-access report and a user-friendly display for management as well as engineers to analyze the report's content. This application aims to revolutionize the documentation of well history in the field of oil and gas exploration and production. The current lack of a streamlined and comprehensive system for capturing, organizing, and accessing well-related data presents challenges in maintaining accurate and up-to-date records. Our innovative solution introduces a user-friendly and efficient platform designed to capture well history documentation seamlessly.
Keywords: digital, drilling, well work, application
Procedia PDF Downloads 82
2000 Optical Variability of Faint Quasars
Authors: Kassa Endalamaw Rewnu
Abstract:
The variability properties of a quasar sample, spectroscopically complete to magnitude J = 22.0, are investigated on a time baseline of 2 years using three different photometric bands (U, J, and F). The original sample was obtained using a combination of different selection criteria — colors, slitless spectroscopy, and variability — based on a time baseline of 1 yr. The main goals of this work are two-fold: first, to derive the percentage of variable quasars on a relatively short time baseline; secondly, to search for new quasar candidates missed by the other selection criteria and thus to estimate the completeness of the spectroscopic sample. In order to achieve these goals, we extracted all the candidate variable objects from a sample of about 1800 stellar or quasi-stellar objects with limiting magnitude J = 22.50 over an area of about 0.50 deg². We find that > 65% of all the objects selected as possible variables are either confirmed quasars or quasar candidates on the basis of their colors. This percentage increases even further if we exclude from our lists of variable candidates a number of objects equal to that expected on the basis of 'contamination' induced by our photometric errors. The percentage of variable quasars in the spectroscopic sample is also high, reaching about 50%. On the basis of these results, we estimate that the incompleteness of the original spectroscopic sample is < 12%. We conclude that variability analysis of data with small photometric errors can be successfully used as an efficient and independent (or at least auxiliary) selection method in quasar surveys, even when the time baseline is relatively short. Finally, when corrected for the different intrinsic time lags corresponding to a fixed observed time baseline, our data do not show a statistically significant correlation between variability and either absolute luminosity or redshift.
Keywords: nuclear activity, galaxies, active quasars, variability
Procedia PDF Downloads 86
1999 Molecular-Dynamics Study of H₂-C₃H₈-Hydrate Dissociation: Non-Equilibrium Analysis
Authors: Mohammad Reza Ghaani, Niall English
Abstract:
Hydrogen is looked upon as the next-generation clean-energy carrier, and the search for an efficient material and method for storing hydrogen has been, and is, pursued relentlessly. Clathrate hydrates are inclusion compounds wherein guest gas molecules like hydrogen are trapped in a host water-lattice framework. Such materials can be categorised as potentially attractive hosting environments for physical hydrogen storage (i.e., no chemical reaction upon storage). Non-equilibrium molecular dynamics (NEMD) simulations have been performed to investigate the thermally driven break-up of propane-hydrate interfaces with liquid water at 270-300 K, with the propane hydrate containing either one hydrogen molecule or none in each of its small cavities. In addition, two types of hydrate-surface water-lattice molecular termination were adopted at the hydrate edge with water: a direct 001-surface cleavage and one with completed cages. The geometric hydrate-ice-liquid distinction criteria of Báez and Clancy were employed to distinguish between the hydrate and ice lattices and the liquid phase. Consequently, the melting temperatures of the interface were estimated, and the dissociation rates were observed to be strongly dependent on temperature, with higher dissociation rates at larger over-temperatures vis-à-vis melting. The different hydrate-edge terminations of the hydrate-water interface led to statistically significant differences in the observed melting point and dissociation profile: it was found that the clathrate with the planar interface melts at around 280 K, whilst the melting temperature of the cage-completed interface was determined to be circa 270 K.
Keywords: hydrogen storage, clathrate hydrate, molecular dynamics, thermal dissociation
Procedia PDF Downloads 278
1998 FACTS Based Stabilization for Smart Grid Applications
Authors: Adel. M. Sharaf, Foad H. Gandoman
Abstract:
Nowadays, photovoltaic (PV) farms/parks and large PV-smart grid interface schemes are emerging and commonly utilized in renewable-energy distributed generation. However, PV hybrid DC-AC schemes using interface power electronic converters usually have a negative impact on power quality and on the stabilization of the modern electrical network under load excursions and network fault conditions in the smart grid. Consequently, robust FACTS-based interface schemes are required to ensure efficient energy utilization and stabilization of bus voltages, as well as to limit switching/fault inrush current conditions. FACTS devices are also used in smart grid battery interface and storage schemes with PV-battery storage hybrid systems, as an elegant alternative for renewable energy utilization with backup battery storage for electric utility energy and demand-side management, providing the needed energy and power capacity under heavy load conditions. The paper presents a robust PV-Li-ion battery storage interface scheme for low-voltage distribution/utilization, using FACTS stabilization enhancement and dynamic maximum PV power tracking controllers. Digital simulation and validation of the proposed scheme are done in the MATLAB/Simulink software environment for a low-voltage distribution/utilization system feeding hybrid linear, motorized-inrush, and nonlinear loads from a DC-AC interface VSC six-pulse inverter fed from the PV park/farm with a backup Li-ion storage battery.
Keywords: AC FACTS, smart grid, stabilization, PV-battery storage, switched filter-compensation (SFC)
Procedia PDF Downloads 415
1997 Iris Cancer Detection System Using Image Processing and Neural Classifier
Authors: Abdulkader Helwan
Abstract:
Iris cancer, or intraocular melanoma, is a cancer that starts in the iris, the colored part of the eye that surrounds the pupil. There is a need for an accurate and cost-effective iris cancer detection system, since the techniques currently available are still not efficient. The combination of image processing and artificial neural networks offers great efficiency for the diagnosis and detection of iris cancer. Image processing techniques improve the diagnosis by enhancing the quality of the images, so that physicians can diagnose properly, while neural networks can help in making the decision as to whether the eye is cancerous or not. This paper aims to develop an intelligent system that emulates human visual detection of intraocular melanoma, or iris cancer. The suggested system combines both image processing techniques and neural networks. The images are first converted to grayscale, filtered, and then segmented using the Prewitt edge detection algorithm to detect the iris and sclera circles and the cancer. Principal component analysis is used to reduce the image size and to extract features. Those features are then used as inputs to a neural network, which is capable of deciding whether the eye is cancerous or not based on the experience it acquires over many training iterations on different normal and abnormal eye images during the training phase. Normal images were obtained from a public database available on the internet, 'Mile Research', while the abnormal ones were obtained from another database, 'eyecancer'. The experimental results for the proposed system show a high accuracy of 100% for detecting cancer and making the right decision.
Keywords: iris cancer, intraocular melanoma, cancerous, Prewitt edge detection algorithm, sclera
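The pipeline described — grayscale, Prewitt edges, PCA features, neural classifier — maps directly onto standard library calls; the sketch below wires them together on random stand-in images (the dataset, image size, and network shape are all assumptions, not the paper's configuration):

```python
import numpy as np
from scipy.ndimage import prewitt
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Stand-in dataset: 40 grayscale eye images, 64x64, with binary labels
# (0 = normal, 1 = cancerous); real data would be loaded from disk.
images = rng.random((40, 64, 64))
labels = rng.integers(0, 2, size=40)

def edge_features(img):
    # Prewitt gradients along both axes, combined into an edge magnitude.
    gx, gy = prewitt(img, axis=0), prewitt(img, axis=1)
    return np.hypot(gx, gy).ravel()

X = np.array([edge_features(im) for im in images])

# PCA for dimensionality reduction / feature extraction.
X_small = PCA(n_components=10).fit_transform(X)

# Small feed-forward neural classifier for the cancerous/normal decision.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_small, labels)
print(f"training accuracy: {clf.score(X_small, labels):.2f}")
```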
Procedia PDF Downloads 505
1996 In-Silico Antimicrobial Activity of Bioactive Compounds of Ricinus communis against DNA Gyrase of Staphylococcus aureus as Molecular Target
Authors: S. Rajeswari
Abstract:
Medicinal plant extracts and their bioactive compounds have been used for antimicrobial activities and have significant remedial properties. In recent years, a wide range of investigations has been carried out throughout the world to confirm the antimicrobial properties of different medicinally important plants. A number of plants have shown efficient antimicrobial activities comparable to those of synthetic standard drugs or antimicrobial agents. The large family Euphorbiaceae contains nearly 300 genera and 7,500 species, one of which is Ricinus communis, the castor plant, which has high traditional and medicinal value for a disease-free, healthy life. Traditionally the plant is used as a laxative, purgative, fertilizer, fungicide, etc., whereas it also possesses beneficial effects such as antioxidant, antihistamine, antinociceptive, antiasthmatic, antiulcer, immunomodulatory, antidiabetic, hepatoprotective, anti-inflammatory, antimicrobial, and many other medicinal properties. These activities of the plant are due to important phytochemical constituents like flavonoids, saponins, glycosides, alkaloids, and steroids. The present study covers the phytochemical properties of Ricinus communis and the prediction of its antimicrobial activity using DNA gyrase of Staphylococcus aureus as the molecular target. Docking results for various chemical compounds of Ricinus communis against DNA gyrase of Staphylococcus aureus, obtained with Maestro 9.8 from Schrödinger, show that the phytochemicals are effective against the target protein DNA gyrase. Our studies suggest that phytochemicals from Ricinus communis such as indican (G-score 4.98) and suplopin-2 (G-score 5.74) can be used as lead molecules against Staphylococcus infections.
Keywords: Euphorbiaceae, antimicrobial activity, Ricinus communis, Staphylococcus aureus
Procedia PDF Downloads 482
1995 Heuristic Algorithms for Time Based Weapon-Target Assignment Problem
Authors: Hyun Seop Uhm, Yong Ho Choi, Ji Eun Kim, Young Hoon Lee
Abstract:
Weapon-target assignment (WTA) is the problem of assigning available launchers to appropriate targets in order to defend assets. Various algorithms for WTA have been developed over the past years for both static and dynamic environments (denoted SWTA and DWTA, respectively). Because the problem must be solved within a relevant computational time, WTA has suffered from issues of solution efficiency; as a result, SWTA and DWTA problems have been solved only for limited battlefield situations. In this paper, the general situation under continuous time is considered as the Time-based Weapon-Target Assignment (TWTA) problem. TWTA is studied using a mixed integer programming model, and three heuristic algorithms are suggested: a decomposed opt-opt algorithm, a decomposed opt-greedy algorithm, and a greedy algorithm. Although the TWTA optimization model works inefficiently for large problem sizes, the decomposed opt-opt algorithm, based on the linearization and decomposition method, extracted efficient solutions in a reasonable computation time. Because the computation time of the scheduling part is too long for the optimization model, several greedy-based algorithms are proposed; these show lower performance values than the decomposed opt-opt algorithm but need very short computation times. Hence, this paper proposes an improved method that applies decomposition to TWTA, so that more practical and effective methods can be developed for using TWTA on the battlefield.
Keywords: air and missile defense, weapon target assignment, mixed integer programming, piecewise linearization, decomposition algorithm, military operations research
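For context, the greedy baseline for classical WTA is only a few lines: repeatedly give the next weapon to the target where it destroys the most surviving value. The kill probabilities and target values below are invented, and this static greedy omits the time dimension that distinguishes TWTA:

```python
import numpy as np

# p[i, j]: assumed kill probability of weapon i against target j.
p = np.array([[0.7, 0.4, 0.2],
              [0.5, 0.6, 0.3],
              [0.3, 0.5, 0.8],
              [0.6, 0.3, 0.4]])
value = np.array([10.0, 6.0, 8.0])   # assumed target values
survival = np.ones(3)                # probability each target survives so far

assignment = []
for i in range(p.shape[0]):
    # Marginal expected value destroyed by sending weapon i to target j.
    gain = value * survival * p[i]
    j = int(np.argmax(gain))
    assignment.append((i, j))
    survival[j] *= 1.0 - p[i, j]     # update target survival probability

expected_destroyed = float(np.sum(value * (1.0 - survival)))
print(assignment, f"expected value destroyed: {expected_destroyed:.2f}")
```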
Procedia PDF Downloads 339
1994 A Generalised Propensity Score Analysis to Investigate the Influence of Agricultural Research Systems on Greenhouse Gas Emissions
Authors: Spada Alessia, Fiore Mariantonietta, Lamonaca Emilia, Contò Francesco
Abstract:
The bioeconomy offers the chance to face new global challenges and can advance the transition from a waste economy to an economy based on renewable resources and sustainable consumption. Air pollution is a grave issue among green challenges, mainly caused by anthropogenic factors. The agriculture sector is a great contributor to global greenhouse gas (GHG) emissions, due to the lack of efficient management of the resources involved and of research policies; in particular, the livestock sector contributes to GHG emissions, deforestation, and nutrient imbalances. More effective agricultural research systems and technologies are crucial in order to improve farm productivity but also to reduce GHG emissions. Using data from FAOSTAT statistics concerning the EU countries, the aim of this research is to evaluate the impact of ASTI R&D (Agricultural Science and Technology Indicators) on GHG emissions for the EU countries in 2015 by generalized propensity score procedures, estimating a dose-response function while also conditioning on a set of covariates. Expected results show the existence of an influence of ASTI R&D on GHGs across EU countries. The implications are crucial: reducing GHG emissions by means of R&D-based policies and, correlatively, reaching eco-friendly management of the required resources by means of available green practices could play a crucial role, with fair intra-generational implications.
Keywords: agricultural research systems, dose-response function, generalized propensity score, GHG emissions
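To make the generalized-propensity-score machinery concrete, here is a minimal Hirano-Imbens-style sketch on synthetic data: model the continuous treatment given covariates, compute the GPS, fit an outcome model in treatment and GPS, then average predictions over a treatment grid to trace the dose-response function. Every number and variable name is illustrative, not drawn from the study:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 500

# Synthetic data: X covariates, T continuous treatment (e.g., R&D
# intensity), Y outcome (e.g., GHG emissions).
X = rng.normal(size=(n, 2))
T = 1.0 + X @ np.array([0.5, -0.3]) + rng.normal(scale=0.5, size=n)
Y = 5.0 - 0.8 * T + 0.2 * T**2 + X.sum(axis=1) + rng.normal(size=n)

# Step 1: treatment model T | X, then GPS = density of T at observed X.
Xd = np.column_stack([np.ones(n), X])
beta = np.linalg.lstsq(Xd, T, rcond=None)[0]
sigma = np.std(T - Xd @ beta)
gps = norm.pdf(T, loc=Xd @ beta, scale=sigma)

# Step 2: flexible outcome model in (T, GPS).
design = lambda t, r: np.column_stack([np.ones(len(t)), t, t**2, r, r**2, t * r])
alpha = np.linalg.lstsq(design(T, gps), Y, rcond=None)[0]

# Step 3: dose-response = average prediction over units at each dose t.
for t0 in [0.5, 1.0, 1.5, 2.0]:
    r0 = norm.pdf(t0, loc=Xd @ beta, scale=sigma)  # GPS at counterfactual dose
    mu = design(np.full(n, t0), r0) @ alpha
    print(f"E[Y(t={t0:.1f})] ~ {mu.mean():.2f}")
```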
Procedia PDF Downloads 279