Search results for: illumination techniques
6277 Quantitative Characterization of Single Orifice Hydraulic Flat Spray Nozzle
Authors: Y. C. Khoo, W. T. Lai
Abstract:
The single orifice hydraulic flat spray nozzle was evaluated with two global imaging techniques to characterize various aspects of the resulting spray. The two techniques were high resolution flow visualization and Particle Image Velocimetry (PIV). A CCD camera with 29 million pixels was used to capture shadowgraph images to reveal ligament formation and collapse as well as droplet interaction. Quantitative analysis was performed to give the sizing information of the droplets and ligaments. This camera was then used with a PIV system to evaluate the overall velocity field of the spray, from nozzle exit to droplet discharge. PIV images were further post-processed to determine the included angle of the spray. The results from these investigations provided significant quantitative understanding of the spray structure. Based on the quantitative results, detailed understanding of the spray behavior was achieved.
Keywords: spray, flow visualization, PIV, shadowgraph, quantitative sizing, velocity field
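A hedged sketch of the droplet/ligament sizing step described above: in a backlit shadowgraph the objects appear as dark regions, so a threshold-and-label pass yields equivalent diameters. The file name and pixel calibration are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy import ndimage
from imageio.v2 import imread

PIXEL_SIZE_MM = 0.005                           # assumed calibration: mm per pixel

img = imread("shadowgraph.png").astype(float)   # assumed file name
if img.ndim == 3:                               # collapse RGB to grayscale if needed
    img = img.mean(axis=2)

mask = img < 0.5 * img.max()                    # dark objects (droplets, ligaments) on a bright background
labels, n = ndimage.label(mask)                 # connected-component labelling
areas_px = ndimage.sum(mask, labels, index=np.arange(1, n + 1))

diameters_mm = 2.0 * np.sqrt(areas_px / np.pi) * PIXEL_SIZE_MM   # equivalent-circle diameters
print(f"{n} objects detected, mean diameter {diameters_mm.mean():.3f} mm")
```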
Procedia PDF Downloads 382
6276 Short-Term Physiological Evaluation of Augmented Reality System for Thanatophobia Psychotherapy
Authors: Kais Siala, Mohamed Kharrat, Mohamed Abid
Abstract:
Exposure therapies encourage patients to gradually begin facing their painful memories of the trauma in order to reduce fear and anxiety. In this context, virtual reality techniques are widely used for the treatment of different kinds of phobia. The particular case of fear of death phobia (thanatophobia) is addressed in this paper. For this purpose, we propose to simulate a Near-Death Experience (NDE) using augmented reality techniques. We propose in particular to simulate the Out-of-Body Experience (OBE), which is the first step of a Near-Death Experience (NDE). In this paper, we present technical aspects of this simulation as well as its short-term impact in terms of physiological measures. The non-linear Poincaré plot is used to describe the difference in Heart Rate Variability between In-Body and Out-of-Body conditions.
Keywords: Out-of-Body simulation, physiological measure, augmented reality, phobia psychotherapy, HRV, Poincaré plot
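A minimal sketch of the Poincaré-plot descriptors mentioned above (SD1/SD2 computed from successive RR intervals), assuming placeholder RR series in place of the recorded ECG data:

```python
import numpy as np

def poincare_descriptors(rr_ms):
    """SD1/SD2 of the Poincaré plot built from successive RR intervals (ms)."""
    x, y = rr_ms[:-1], rr_ms[1:]                     # (RR_n, RR_n+1) pairs
    sd1 = np.std((y - x) / np.sqrt(2), ddof=1)       # short-term variability (plot width)
    sd2 = np.std((y + x) / np.sqrt(2), ddof=1)       # long-term variability (plot length)
    return sd1, sd2

# Placeholder RR series for the two conditions (real data would come from the ECG recordings)
rng = np.random.default_rng(0)
rr_in_body = rng.normal(800, 50, 300)
rr_obe = rng.normal(780, 35, 300)
for name, rr in [("In-Body", rr_in_body), ("Out-of-Body", rr_obe)]:
    sd1, sd2 = poincare_descriptors(rr)
    print(f"{name}: SD1 = {sd1:.1f} ms, SD2 = {sd2:.1f} ms, SD1/SD2 = {sd1 / sd2:.2f}")
```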
Procedia PDF Downloads 307
6275 Crashworthiness Optimization of an Automotive Front Bumper in Composite Material
Authors: S. Boria
Abstract:
In recent years, the crashworthiness of an automotive body structure can be improved from the beginning of the design stage thanks to the development of specific optimization tools. It is well known that finite element codes can help the designer to investigate the crashing performance of structures under dynamic impact. Therefore, by coupling nonlinear mathematical programming procedures and statistical techniques with FE simulations, it is possible to optimize the design with a reduced number of analytical evaluations. In engineering applications, many optimization methods which are based on statistical techniques and utilize estimated models, called meta-models, are quickly spreading. A meta-model is an approximation of a detailed simulation model based on a dataset of inputs identified by the design of experiments (DOE); the number of simulations needed to build it depends on the number of variables. Among the various types of meta-modeling techniques, the Kriging method seems to be excellent in accuracy, robustness and efficiency compared to the other ones when applied to crashworthiness optimization. Therefore such a meta-model was used in this work in order to improve the structural optimization of a bumper for a racing car in composite material subjected to frontal impact. The specific energy absorption represents the objective function to maximize, and the geometrical parameters subjected to some design constraints are the design variables. LS-DYNA codes were interfaced with the LS-OPT tool in order to find the optimized solution through the use of a domain reduction strategy. With the use of the Kriging meta-model the crashworthiness characteristic of the composite bumper was improved.
Keywords: composite material, crashworthiness, finite element analysis, optimization
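The Kriging-based workflow can be illustrated with a hedged sketch: fit a Gaussian-process (Kriging) surrogate to a small design of experiments and search the surrogate instead of running more crash simulations. The study itself used LS-DYNA coupled with LS-OPT; here a toy response function and hypothetical design variables stand in for the FE model.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def specific_energy_absorption(x):
    """Placeholder for an expensive FE crash simulation (LS-DYNA in the paper)."""
    thickness, taper = x
    return -(thickness - 2.2) ** 2 - (taper - 12.0) ** 2 / 40.0 + 30.0

# Design of experiments over two hypothetical geometric variables (bounds assumed)
sampler = qmc.LatinHypercube(d=2, seed=0)
doe = qmc.scale(sampler.random(20), [1.0, 5.0], [4.0, 20.0])
sea = np.array([specific_energy_absorption(x) for x in doe])

# Kriging (Gaussian process) meta-model fitted to the DOE results
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(doe, sea)

# Dense search on the surrogate instead of running additional FE simulations
grid = qmc.scale(qmc.LatinHypercube(d=2, seed=1).random(5000), [1.0, 5.0], [4.0, 20.0])
best = grid[np.argmax(gp.predict(grid))]
print("surrogate optimum (thickness, taper):", best)
```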
Procedia PDF Downloads 256
6274 An Information-Based Approach for Preference Method in Multi-Attribute Decision Making
Authors: Serhat Tuzun, Tufan Demirel
Abstract:
Multi-Criteria Decision Making (MCDM) is the modelling of real-life problems we encounter. It is a discipline that aids decision makers who are faced with conflicting alternatives to make an optimal decision. MCDM problems can be classified into two main categories, Multi-Attribute Decision Making (MADM) and Multi-Objective Decision Making (MODM), based on their different purposes and different data types. Although various MADM techniques have been developed for the problems encountered, their methodology is limited in modelling real life. Moreover, objective results are hard to obtain, and the findings are generally derived from subjective data. Although new and modified techniques have been developed using approaches such as fuzzy logic, these comprehensive techniques, even though they are better at modelling real life, have not found a place in real-world applications because their complex structure makes them hard to apply. These constraints restrict the development of MADM. This study aims to conduct a comprehensive analysis of preference methods in MADM and propose an approach based on information. For this purpose, a detailed literature review has been conducted, and current approaches with their advantages and disadvantages have been analyzed. Then, the approach has been introduced. In this approach, performance values of the criteria are calculated in two steps: first by determining the distribution of each attribute and standardizing them, then calculating the information of each attribute as informational energy.
Keywords: literature review, multi-attribute decision making, operations research, preference method, informational energy
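A minimal sketch of the two-step calculation described at the end of the abstract, assuming Onicescu's informational energy (the sum of squared probabilities of a discretized distribution) and a hypothetical decision matrix:

```python
import numpy as np

def informational_energy(values, bins=10):
    """Informational energy of one attribute: sum of squared bin probabilities."""
    z = (values - values.mean()) / values.std(ddof=1)   # step 1: standardize the attribute
    hist, _ = np.histogram(z, bins=bins)                 # step 2: estimate its distribution
    p = hist / hist.sum()
    return float(np.sum(p ** 2))

# Hypothetical decision matrix: rows = alternatives, columns = attributes
rng = np.random.default_rng(0)
matrix = rng.normal(size=(50, 4))
for j in range(matrix.shape[1]):
    print(f"attribute {j}: informational energy = {informational_energy(matrix[:, j]):.3f}")
```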
Procedia PDF Downloads 224
6273 Effect of Relaxation Techniques on Immunological Properties of Breast Milk
Authors: Ahmed Ali Torad
Abstract:
Background: Breast feeding maintains the maternal-fetal immunological link, favours the transmission of immune competence from the mother to her infant and is considered an important contributory factor to the neonatal immune defense system. Purpose: This study was conducted to investigate the effect of relaxation techniques on immunological properties of breast milk. Subjects and Methods: Thirty breast feeding mothers with a single, mature infant without any complications participated in the study. Subjects were recruited from the outpatient clinic of the obstetric department of El Kasr El-Aini university hospital in Cairo. Mothers were randomly divided into two equal groups using the coin toss method: Group (A) (relaxation training group) (experimental group), composed of 15 women who received a relaxation training program in addition to breast feeding and nutritional advice, and Group (B) (control group), composed of 15 women who received breast feeding and nutritional advice only. Results: The results showed that the mean mother’s age was 28.4 ± 3.68 and 28.07 ± 4.09 for groups A and B, respectively; there were statistically significant differences between pre and post values regarding cortisol level, IgA level, leucocyte count and infant’s weight and height, and there were statistically significant differences between the two groups only in the post values of all immunological variables (cortisol – IgA – leucocyte count). Conclusion: We could conclude that there is a statistically significant effect of relaxation techniques on immunological properties of breast milk.
Keywords: relaxation, breast, milk, immunology, lactation
Procedia PDF Downloads 118
6272 Comparison Between Fuzzy and P&O Control for MPPT for Photovoltaic System Using Boost Converter
Authors: M. Doumi, A. Miloudi, A. G. Aissaoui, K. Tahir, C. Belfedal, S. Tahir
Abstract:
Studies on photovoltaic systems are increasing extensively because they represent a large, secure, essentially inexhaustible and broadly available resource as a future energy supply. However, the output power induced in the photovoltaic modules is influenced by the intensity of solar radiation, the temperature of the solar cells, and so on. Therefore, to maximize the efficiency of the photovoltaic system, it is necessary to track the maximum power point of the PV array; for this purpose, a Maximum Power Point Tracking (MPPT) technique is used. Among the available MPPT techniques are perturb and observe (P&O) and the fuzzy logic controller (FLC). The fuzzy control method has been compared with the perturb and observe (P&O) method, one of the most widely used conventional methods in this area. Both techniques have been analyzed and simulated. MPPT using fuzzy logic shows superior performance and more reliable control with respect to the P&O technique for this application.
Keywords: photovoltaic system, MPPT, perturb and observe, fuzzy logic
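A minimal sketch of the conventional P&O algorithm compared in the paper: perturb the operating voltage and keep moving in whichever direction increased the power. The PV curve below is a toy placeholder for the boost-converter/PV model used in the simulations.

```python
def perturb_and_observe(v, p, v_prev, p_prev, v_ref, step=0.5):
    """One P&O iteration: move the voltage reference in the direction that increased power."""
    dp, dv = p - p_prev, v - v_prev
    if dp != 0:
        if (dp > 0) == (dv > 0):   # power rose in the direction of the last perturbation
            v_ref += step          # keep perturbing the same way
        else:
            v_ref -= step          # otherwise reverse the perturbation
    return v_ref

def pv_power(v):
    """Toy PV curve with a maximum power point near 17.5 V (placeholder model)."""
    return max(0.0, -0.4 * (v - 17.5) ** 2 + 60.0)

v_ref, v_prev, p_prev = 12.0, 11.5, pv_power(11.5)
for _ in range(60):
    v = v_ref
    p = pv_power(v)
    v_ref = perturb_and_observe(v, p, v_prev, p_prev, v_ref)
    v_prev, p_prev = v, p
print(f"operating point settles near V = {v_ref:.1f} V, P = {pv_power(v_ref):.1f} W")
```

Note that a fixed-step P&O tracker oscillates around the maximum power point, which is one of the drawbacks that motivates the fuzzy logic controller compared in the paper.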
Procedia PDF Downloads 604
6271 Synchronous Reference Frame and Instantaneous P-Q Theory Based Control of Unified Power Quality Conditioner for Power Quality Improvement of Distribution System
Authors: Ambachew Simreteab Gebremedhn
Abstract:
Context: The paper explores the use of synchronous reference frame theory (SRFT) and instantaneous reactive power theory (IRPT) based control of the Unified Power Quality Conditioner (UPQC) for improving power quality in distribution systems. Research Aim: To investigate the performance of different control configurations of the UPQC using SRFT and IRPT for mitigating power quality issues in distribution systems. Methodology: The study compares three control techniques (SRFT-IRPT, SRFT-SRFT, IRPT-IRPT) implemented in the series and shunt active filters of the UPQC. Data is collected under various control algorithms to analyze UPQC performance. Findings: Results indicate the effectiveness of SRFT and IRPT based control techniques in addressing power quality problems such as voltage sags, swells, unbalance, harmonics, and current harmonics in distribution systems. Theoretical Importance: The study provides insights into the application of SRFT and IRPT in improving power quality, specifically in mitigating unbalanced voltage sags, where conventional methods fall short. Data Collection: Data is collected under various control algorithms using simulation in MATLAB Simulink and real-time operation, with experimental results obtained using RT-LAB. Analysis Procedures: Performance analysis of the UPQC under different control algorithms is conducted to evaluate the effectiveness of SRFT and IRPT based control techniques in mitigating power quality issues. Questions Addressed: How do SRFT and IRPT based control techniques compare in improving power quality in distribution systems? What is the impact of using different control configurations on the performance of the UPQC? Conclusion: The study demonstrates the efficacy of SRFT and IRPT based control of the UPQC in mitigating power quality issues in distribution systems, highlighting their potential for enhancing voltage and current quality.
Keywords: power quality, UPQC, shunt active filter, series active filter, non-linear load, RT-LAB, MATLAB
Procedia PDF Downloads 9
6270 Dynamic Bandwidth Allocation in Fiber-Wireless (FiWi) Networks
Authors: Eman I. Raslan, Haitham S. Hamza, Reda A. El-Khoribi
Abstract:
Fiber-Wireless (FiWi) networks are a promising candidate for future broadband access networks. These networks combine the optical network as the back end, where different passive optical network (PON) technologies are realized, and the wireless network as the front end, where different wireless technologies are adopted, e.g. LTE, WiMAX, Wi-Fi, and Wireless Mesh Networks (WMNs). The convergence of both optical and wireless technologies requires designing architectures with robust, efficient and effective bandwidth allocation schemes. Different bandwidth allocation algorithms have been proposed in FiWi networks aiming to enhance the different segments of FiWi networks, including the wireless and optical subnetworks. In this survey, we focus on differentiating between the bandwidth allocation algorithms according to the segment of the FiWi network they enhance. We classify these techniques into wireless, optical and hybrid bandwidth allocation techniques.
Keywords: fiber-wireless (FiWi), dynamic bandwidth allocation (DBA), passive optical networks (PON), media access control (MAC)
Procedia PDF Downloads 531
6269 Iris Cancer Detection System Using Image Processing and Neural Classifier
Authors: Abdulkader Helwan
Abstract:
Iris cancer, so-called intraocular melanoma, is a cancer that starts in the iris, the colored part of the eye that surrounds the pupil. There is a need for an accurate and cost-effective iris cancer detection system, since the techniques currently available are still not efficient. The combination of image processing and artificial neural networks is highly effective for the diagnosis and detection of iris cancer. Image processing techniques improve the diagnosis of the cancer by enhancing the quality of the images, so the physicians can diagnose properly, while neural networks can help in deciding whether the eye is cancerous or not. This paper aims to develop an intelligent system that simulates human visual detection of intraocular melanoma, the so-called iris cancer. The suggested system combines both image processing techniques and neural networks. The images are first converted to grayscale, filtered, and then segmented using the Prewitt edge detection algorithm to detect the iris, the sclera circles and the cancer. Principal component analysis is used to reduce the image size and for extracting features. Those features are then used as inputs for a neural network which is capable of deciding whether the eye is cancerous or not, through the experience acquired over many training iterations on different normal and abnormal eye images during the training phase. Normal images are obtained from a public database available on the internet, “Mile Research”, while the abnormal ones are obtained from another database, “eyecancer”. The experimental results for the proposed system show a high accuracy of 100% for detecting cancer and making the right decision.
Keywords: iris cancer, intraocular melanoma, cancerous, Prewitt edge detection algorithm, sclera
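A hedged sketch of the described pipeline (grayscale conversion, filtering, Prewitt edge detection, PCA feature reduction, neural-network classification). File names, image sizes and layer sizes are illustrative assumptions, not the authors' exact configuration.

```python
import numpy as np
from scipy.ndimage import median_filter
from skimage import color, filters, io
from skimage.transform import resize
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

def extract_features(path, size=(64, 64)):
    img = io.imread(path)
    gray = color.rgb2gray(img) if img.ndim == 3 else img.astype(float)  # convert to grayscale
    gray = median_filter(gray, size=3)                                  # simple noise filtering
    edges = filters.prewitt(resize(gray, size))                         # Prewitt edge map of iris/sclera boundaries
    return edges.ravel()

# Hypothetical file lists standing in for the "Mile Research" and "eyecancer" image sets
normal_paths = ["normal_01.jpg", "normal_02.jpg"]
abnormal_paths = ["melanoma_01.jpg", "melanoma_02.jpg"]
X = np.array([extract_features(p) for p in normal_paths + abnormal_paths])
y = np.array([0] * len(normal_paths) + [1] * len(abnormal_paths))

X_reduced = PCA(n_components=min(10, len(X))).fit_transform(X)          # PCA for feature reduction
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X_reduced, y)
print("training accuracy:", clf.score(X_reduced, y))
```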
Procedia PDF Downloads 503
6268 Use of Multivariate Statistical Techniques for Water Quality Monitoring Network Assessment, Case of Study: Jequetepeque River Basin
Authors: Jose Flores, Nadia Gamboa
Abstract:
Proper water quality management requires the establishment of a monitoring network. Therefore, evaluation of the efficiency of water quality monitoring networks is needed to ensure high-quality data collection of critical chemical parameters. Unfortunately, in some Latin American countries water quality monitoring programs are not sustainable in terms of recording historical data or environmentally representative sites, wasting time, money and valuable information. In this study, multivariate statistical techniques, such as principal components analysis (PCA) and hierarchical cluster analysis (HCA), are applied to identify the most significant monitoring sites as well as the critical water quality parameters in the monitoring network of the Jequetepeque River basin, in northern Peru. The Jequetepeque River basin, like others in Peru, shows socio-environmental conflicts due to the economic activities developed in this area. Water pollution by trace elements in the upper part of the basin is mainly related to mining activity, while agricultural land lost to salinization is caused by the extensive use of groundwater in the lower part of the basin. Since the 1980s, the water quality in the basin has been assessed non-continuously by public and private organizations, and recently the National Water Authority has established permanent water quality networks in 45 basins in Peru. Although many countries use multivariate statistical techniques for assessing water quality monitoring networks, those instruments have never been applied for that purpose in Peru. For this reason, the main contribution of this study is to demonstrate that the application of multivariate statistical techniques can serve as an instrument for optimizing monitoring networks using the smallest number of monitoring sites as well as the most significant water quality parameters, which would reduce cost concerns and improve water quality management in Peru. The main socio-economic activities developed in the basin and the principal stakeholders related to water management are also identified. Finally, water quality management programs are also discussed in terms of their efficiency and sustainability.
Keywords: PCA, HCA, Jequetepeque, multivariate statistical
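A minimal sketch of how PCA and HCA can be applied to a site-by-parameter matrix to rank parameters and group monitoring sites; the matrix below is a synthetic placeholder for the Jequetepeque measurements, and the parameter names are assumed for illustration.

```python
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical matrix: rows = monitoring sites, columns = water quality parameters
rng = np.random.default_rng(1)
data = pd.DataFrame(rng.lognormal(size=(12, 6)),
                    index=[f"site_{i}" for i in range(12)],
                    columns=["pH", "EC", "As", "Cu", "Pb", "Zn"])

z = StandardScaler().fit_transform(data)             # standardize parameters before PCA/HCA

pca = PCA().fit(z)                                    # principal components analysis
print("explained variance ratio:", pca.explained_variance_ratio_.round(2))
print("loadings of PC1:", dict(zip(data.columns, pca.components_[0].round(2))))

# Hierarchical cluster analysis (Ward linkage) to group similar sites
groups = fcluster(linkage(z, method="ward"), t=3, criterion="maxclust")
print("site clusters:", dict(zip(data.index, groups)))
```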
Procedia PDF Downloads 355
6267 AI-Powered Models for Real-Time Fraud Detection in Financial Transactions to Improve Financial Security
Authors: Shanshan Zhu, Mohammad Nasim
Abstract:
Financial fraud continues to be a major threat to financial institutions across the world, causing colossal monetary losses and undermining public trust. Fraud prevention techniques based on hard rules have become ineffective due to the evolving patterns of fraud in recent times. Against such a background, the present study probes into distinct methodologies that exploit emergent AI-driven techniques to further strengthen fraud detection. We compare the performance of generative adversarial networks and graph neural networks with other popular techniques, like gradient boosting, random forests, and neural networks, and recommend integrating all these state-of-the-art models into one robust, flexible, and smart system for real-time anomaly and fraud detection. To overcome the challenge, we designed synthetic data and then conducted pattern recognition and unsupervised and supervised learning analyses on the transaction data to identify which activities were suspicious. Using actual financial statistics, we compared the performance of our model in accuracy, speed, and adaptability against conventional models. The results of this study illustrate a strong need to integrate state-of-the-art, AI-driven fraud detection solutions into frameworks that are highly relevant to the financial domain. They highlight the urgency for banks and related financial institutions to rapidly implement these advanced technologies in order to maintain a high level of security.
Keywords: AI-driven fraud detection, financial security, machine learning, anomaly detection, real-time fraud detection
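A hedged sketch of such a comparison on synthetic, imbalanced transaction data, covering only the conventional baselines named above (random forests, gradient boosting, a neural network) plus an unsupervised anomaly detector; the GAN and graph-neural-network components of the study are not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, IsolationForest, RandomForestClassifier
from sklearn.metrics import average_precision_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic, heavily imbalanced transactions: roughly 1% of samples labelled as fraud
X, y = make_classification(n_samples=20000, n_features=20, weights=[0.99], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "gradient boosting": GradientBoostingClassifier(random_state=0),
    "neural network": MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    score = average_precision_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: average precision = {score:.3f}")

# Unsupervised anomaly-detection baseline (no fraud labels used for training)
iso = IsolationForest(random_state=0).fit(X_tr)
print("isolation forest:", average_precision_score(y_te, -iso.score_samples(X_te)).round(3))
```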
Procedia PDF Downloads 42
6266 Auteur 3D Filmmaking: From Hitchcock’s Protrusion Technique to Godard’s Immersion Aesthetic
Authors: Delia Enyedi
Abstract:
Throughout film history, the regular return of 3D cinema has been discussed in connection with crises caused by the advent of television or the competition of the Internet. In addition, the three waves of stereoscopic 3D (from 1952 up to 1983) and its current digital version have been blamed for adding a challenging technical distraction to the viewing experience. By discussing the films Dial M for Murder (1954) and Goodbye to Language (2014), the paper aims to analyze the response of recognized auteurs to the use of 3D techniques in filmmaking. For Alfred Hitchcock, the solution to attaining perceptual immersion paradoxically resided in restraining the signature effect of 3D, namely protrusion. In Jean-Luc Godard’s vision, 3D techniques allowed him to explore perceptual absorption by means of depth of field, which he had long advocated as being central to cinema. Thus, both directors contribute to the foundation of an auteur aesthetic in 3D filmmaking.
Keywords: Alfred Hitchcock, authorship, 3D filmmaking, Jean-Luc Godard, perceptual absorption, perceptual immersion
Procedia PDF Downloads 290
6265 Forecasting Equity Premium Out-of-Sample with Sophisticated Regression Training Techniques
Authors: Jonathan Iworiso
Abstract:
Forecasting the equity premium out-of-sample is a major concern to researchers in finance and emerging markets. The quest for a superior model that can forecast the equity premium with significant economic gains has resulted in several controversies among scholars on the choice of variables and suitable techniques. This research focuses mainly on the application of Regression Training (RT) techniques to forecast the monthly equity premium out-of-sample recursively with an expanding window method. A broad category of sophisticated regression models involving model complexity was employed. The RT models, including Ridge, Forward-Backward (FOBA) Ridge, Least Absolute Shrinkage and Selection Operator (LASSO), Relaxed LASSO, Elastic Net, and Least Angle Regression, were trained and used to forecast the equity premium out-of-sample. The empirical investigation of the RT models demonstrates significant evidence of equity premium predictability, both statistically and economically, relative to the benchmark historical average, delivering significant utility gains. The forecasts provide meaningful economic information on mean-variance portfolio investment for investors who are timing the market to earn future gains at minimal risk. Thus, the forecasting models appear to benefit an investor in a market setting who optimally reallocates a monthly portfolio between equities and risk-free treasury bills using equity premium forecasts, at minimal risk.
Keywords: regression training, out-of-sample forecasts, expanding window, statistical predictability, economic significance, utility gains
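A minimal sketch of recursive out-of-sample forecasting with an expanding window, using Ridge and LASSO from scikit-learn and a Campbell-Thompson-style out-of-sample R² against the historical-average benchmark; the predictors and premium series below are simulated placeholders, not the paper's data.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import Lasso, Ridge

# Placeholder monthly data: predictor matrix X (e.g. valuation ratios) and equity premium y
rng = np.random.default_rng(0)
n = 240
X = pd.DataFrame(rng.normal(size=(n, 5)), columns=[f"x{i}" for i in range(5)])
y = pd.Series(0.2 * X["x0"] + rng.normal(scale=1.0, size=n), name="premium")

def expanding_oos_forecasts(model, start=120):
    """Recursive out-of-sample forecasts: refit on all data up to t, predict t+1."""
    preds = []
    for t in range(start, n - 1):
        model.fit(X.iloc[:t + 1], y.iloc[:t + 1])
        preds.append(model.predict(X.iloc[[t + 1]])[0])
    return np.array(preds)

actual = y.iloc[121:].to_numpy()
bench = np.array([y.iloc[:t + 1].mean() for t in range(120, n - 1)])   # historical-average benchmark
for name, mdl in [("Ridge", Ridge(alpha=1.0)), ("LASSO", Lasso(alpha=0.01))]:
    f = expanding_oos_forecasts(mdl)
    r2_oos = 1 - np.sum((actual - f) ** 2) / np.sum((actual - bench) ** 2)
    print(f"{name}: out-of-sample R^2 vs historical average = {r2_oos:.3f}")
```

A positive out-of-sample R² here means the regression forecasts beat the historical-average benchmark, which is the sense of predictability the abstract refers to.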
Procedia PDF Downloads 107
6264 Determination of Weathering at Kilistra Ancient City by Using Non-Destructive Techniques, Central Anatolia, Turkey
Authors: İsmail İnce, Osman Günaydin, Fatma Özer
Abstract:
Stones used in the construction of historical structures are exposed to various direct or indirect atmospheric effects depending on climatic conditions. Building stones deteriorate partially or fully as a result of this exposure. Historic structures are important symbols of any cultural heritage; therefore, it is important to protect and restore them. The aim of this study is to determine the weathering conditions at the Kilistra ancient city. It is located southwest of the city of Konya, Central Anatolia, and was built by carving into pyroclastic rocks during the Byzantine Era. For this purpose, the petrographic and mechanical properties of the pyroclastic rocks were determined. In the assessment of weathering of structures in the ancient city, in-situ non-destructive testing methods (i.e., Schmidt hardness rebound value and relative humidity measurement) were applied.
Keywords: cultural heritage, Kilistra ancient city, non-destructive techniques, weathering
Procedia PDF Downloads 359
6263 Neutral Heavy Scalar Searches via Standard Model Gauge Boson Decays at the Large Hadron Electron Collider with Multivariate Techniques
Authors: Luigi Delle Rose, Oliver Fischer, Ahmed Hammad
Abstract:
In this article, we study the prospects of the proposed Large Hadron electron Collider (LHeC) in the search for heavy neutral scalar particles. We consider a minimal model with one additional complex scalar singlet that interacts with the Standard Model (SM) via mixing with the Higgs doublet, giving rise to an SM-like Higgs boson and a heavy scalar particle. Both scalar particles are produced via vector boson fusion and can be tested via their decays into pairs of SM particles, analogously to the SM Higgs boson. Using multivariate techniques, we show that the LHeC is sensitive to heavy scalars with masses between 200 and 800 GeV down to scalar mixing of order 0.01.
Keywords: beyond the standard model, large hadron electron collider, multivariate analysis, scalar singlet
Procedia PDF Downloads 137
6262 Attack Redirection and Detection using Honeypots
Authors: Chowduru Ramachandra Sharma, Shatunjay Rawat
Abstract:
A false positive occurs when the IDS/IPS identifies an activity as an attack, but the activity is acceptable behavior in the system. False positives in a Network Intrusion Detection System (NIDS) are an issue because they desensitize the administrator. They waste computational power and valuable resources when rules are not tuned properly, which is the main issue with anomaly-based NIDS. Furthermore, most false-positive reduction techniques are not applied in real time during attempted intrusions; instead, they are applied afterward to collected traffic data to generate alerts. Of course, false-positive detection in ‘offline mode’ is tremendously valuable. Nevertheless, there is room for improvement here; automated techniques still need to reduce false positives in real time. This paper uses the Snort signature detection model to redirect alerted attacks to honeypots and verify the attacks.
Keywords: honeypot, TPOT, snort, NIDS, honeybird, iptables, netfilter, redirection, attack detection, docker, snare, tanner
Procedia PDF Downloads 156
6261 Study on Control Techniques for Adaptive Impact Mitigation
Authors: Rami Faraj, Cezary Graczykowski, Błażej Popławski, Grzegorz Mikułowski, Rafał Wiszowaty
Abstract:
Progress in the fields of sensors, electronics and computing results in increasingly frequent application of adaptive techniques for dynamic response mitigation. When it comes to systems excited with mechanical impacts, the control system has to take into account the significant limitations of the actuators responsible for system adaptation. The paper provides a comprehensive discussion of the problem of appropriate design and implementation of adaptation techniques and mechanisms. Two case studies are presented in order to compare completely different adaptation schemes. The first example concerns a double-chamber pneumatic shock absorber with a fast piezoelectric valve and parameters corresponding to the suspension of a small unmanned aerial vehicle, whereas the second considered system is a safety air cushion applied for the evacuation of people from heights during a fire. For both systems, it is possible to ensure adaptive performance, but the realization of the system’s adaptation is completely different. The reason for this is the technical limitations corresponding to the specific types of shock-absorbing devices and their parameters. Impact mitigation using a pneumatic shock absorber involves much higher pressures and small mass flow rates, which can be achieved with minimal changes of valve opening. In turn, mass flow rates in safety air cushions relate to gas release areas counted in thousands of square centimetres. Because of these facts, both shock-absorbing systems are controlled based on completely different approaches. The pneumatic shock absorber takes advantage of real-time control with the valve opening recalculated at least every millisecond. In contrast, the safety air cushion is controlled using a semi-passive technique, where adaptation is provided using prediction of the entire impact mitigation process. Similarities of both approaches, including the applied models, algorithms and equipment, are discussed. The entire study is supported by numerical simulations and experimental tests, which prove the effectiveness of both adaptive impact mitigation techniques.
Keywords: adaptive control, adaptive system, impact mitigation, pneumatic system, shock-absorber
Procedia PDF Downloads 90
6260 Optimization Techniques for Microwave Structures
Authors: Malika Ourabia
Abstract:
A new and efficient method is presented for the analysis of arbitrarily shaped discontinuities. The discontinuities are characterized using a hybrid spectral/numerical technique. The structure presents an arbitrary number of ports, each one with different orientation and dimensions. This article presents a hybrid method based on multimode contour integral and mode matching techniques. The process is based on segmentation, dividing the structure into key building blocks. We use the multimode contour integral method to analyze the blocks, including irregularly shaped discontinuities. Finally, the multimode scattering matrix of the whole structure can be found by cascading the blocks. The new method is therefore suitable for the analysis of a wide range of waveguide problems and can be applied easily to the analysis of any multiport junctions and cascaded blocks. The accuracy of the method is validated by comparison with results for several complex problems found in the literature. CPU times are also included to show the efficiency of the new method proposed.
Keywords: segmentation, s parameters, simulation, optimization
Procedia PDF Downloads 528
6259 Quantitative Elemental Analysis of Cyperus rotundus Medicinal Plant by Particle Induced X-Ray Emission and ICP-MS Techniques
Authors: J. Chandrasekhar Rao, B. G. Naidu, G. J. Naga Raju, P. Sarita
Abstract:
Particle Induced X-ray Emission (PIXE) and Inductively Coupled Plasma Mass Spectrometry (ICP-MS) techniques have been employed in this work to determine the elements present in the root of the Cyperus rotundus medicinal plant used in the treatment of rheumatoid arthritis. The elements V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Rb, and Sr were commonly identified and quantified by both PIXE and ICP-MS, whereas the elements Li, Be, Al, As, Se, Ag, Cd, Ba, Tl, Pb and U were determined by ICP-MS, and Cl, K, Ca, Ti and Br were determined by PIXE. The regional variation of elemental content has also been studied by analyzing the same plant collected from different geographical locations. Information on the elemental content of the medicinal plant would be helpful in correlating its efficacy in the treatment of rheumatoid arthritis and also in deciding the dosage of this herbal medicine from the metal toxicity point of view. Principal component analysis and cluster analysis were also applied to the data matrix to understand the correlation among the elements.
Keywords: PIXE, ICP-MS, elements, Cyperus rotundus, rheumatoid arthritis
Procedia PDF Downloads 333
6258 A Review on Modeling and Optimization of Integration of Renewable Energy Resources (RER) for Minimum Energy Cost, Minimum CO₂ Emissions and Sustainable Development, in Recent Years
Authors: M. M. Wagh, V. V. Kulkarni
Abstract:
Rising economic activity, a growing population and improving living standards around the world have led to a steady growth in the appetite for quality and quantity of energy services. As the economy expands, electricity demand is going to grow further, increasing the challenges of additional generation and the stresses on utility grids. An appropriate energy model will help in the proper utilization of locally available renewable energy sources such as solar, wind, biomass, and small hydro for integration into the available grid, reducing investments in energy infrastructure. Further to these, new technologies and practices such as smart grids, decentralized energy planning, energy management and energy efficiency are emerging. In this paper, an attempt has been made to study and review recent energy planning models, energy forecasting models, and renewable energy integration models. In addition, various modeling techniques and tools are reviewed and discussed.
Keywords: energy modeling, integration of renewable energy, energy modeling tools, energy modeling techniques
Procedia PDF Downloads 345
6257 Count of Trees in East Africa with Deep Learning
Authors: Nubwimana Rachel, Mugabowindekwe Maurice
Abstract:
Trees play a crucial role in maintaining biodiversity and providing various ecological services. Traditional methods of counting trees are time-consuming, and there is a need for more efficient techniques. Deep learning makes it feasible to identify the multi-scale elements hidden in aerial imagery. This research focuses on the application of deep learning techniques for automated tree detection and counting in both forest and non-forest areas using satellite imagery. The objective is to identify the most effective model for automated tree counting. We used different deep learning models such as YOLOV7, SSD, and UNET, along with Generative Adversarial Networks to generate synthetic samples for training, and other augmentation techniques, including Random Resized Crop, AutoAugment, and Linear Contrast Enhancement. These models were trained and fine-tuned using satellite imagery to identify and count trees. The performance of the models was assessed through multiple trials; after training and fine-tuning, UNET demonstrated the best performance, with a validation loss of 0.1211, a validation accuracy of 0.9509, and a validation precision of 0.9799. This research showcases the success of deep learning in accurate tree counting through remote sensing, particularly with the UNET model. It represents a significant contribution to the field by offering an efficient and precise alternative to conventional tree-counting methods.
Keywords: remote sensing, deep learning, tree counting, image segmentation, object detection, visualization
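Once a segmentation model such as U-Net outputs a crown-probability mask, counting reduces to labelling connected components. A hedged sketch follows, with a synthetic mask standing in for the model output and the threshold and minimum-size values assumed for illustration.

```python
import numpy as np
from scipy import ndimage

def count_trees(prob_mask, threshold=0.5, min_pixels=9):
    """Count crowns in a segmentation probability mask by labelling connected components."""
    binary = prob_mask > threshold                      # binarize the predicted crown probability
    labels, n = ndimage.label(binary)                   # connected-component labelling
    sizes = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
    return int(np.sum(sizes >= min_pixels))             # discard tiny spurious detections

# Placeholder mask standing in for a U-Net prediction over a satellite tile
rng = np.random.default_rng(0)
mask = rng.random((256, 256)) * 0.3
for cy, cx in rng.integers(20, 236, size=(40, 2)):      # paint 40 synthetic crowns
    yy, xx = np.ogrid[:256, :256]
    mask[(yy - cy) ** 2 + (xx - cx) ** 2 <= 16] = 0.9
print("estimated tree count:", count_trees(mask))
```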
Procedia PDF Downloads 71
6256 Infrared Lightbox and iPhone App for Improving Detection Limit of Phosphate Detecting Dip Strips
Authors: H. Heidari-Bafroui, B. Ribeiro, A. Charbaji, C. Anagnostopoulos, M. Faghri
Abstract:
In this paper, we report the development of a portable and inexpensive infrared lightbox for improving the detection limits of paper-based phosphate devices. Commercial paper-based devices utilize the molybdenum blue protocol to detect phosphate in the environment. Although these devices are easy to use and have a long shelf life, their main deficiency is their low sensitivity based on the qualitative results obtained via a color chart. To improve the results, we constructed a compact infrared lightbox that communicates wirelessly with a smartphone. The system measures the absorbance of radiation for the molybdenum blue reaction in the infrared region of the spectrum. It consists of a lightbox illuminated by four infrared light-emitting diodes, an infrared digital camera, a Raspberry Pi microcontroller, a mini-router, and an iPhone to control the microcontroller. An iPhone application was also developed to analyze images captured by the infrared camera in order to quantify phosphate concentrations. Additionally, the app connects to an online data center to present a highly scalable worldwide system for tracking and analyzing field measurements. In this study, the detection limits for two popular commercial devices were improved by a factor of 4 for the Quantofix devices (from 1.3 ppm using visible light to 300 ppb using infrared illumination) and a factor of 6 for the Indigo units (from 9.2 ppm to 1.4 ppm) with repeatability of less than or equal to 1.2% relative standard deviation (RSD). The system also provides more granular concentration information compared to the discrete color chart used by commercial devices and it can be easily adapted for use in other applications.
Keywords: infrared lightbox, paper-based device, phosphate detection, smartphone colorimetric analyzer
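A minimal sketch of the colorimetric step, assuming absorbance is computed from mean region-of-interest intensities relative to a blank and mapped to concentration through a linear calibration; the intensities and standard concentrations below are illustrative, not the paper's data.

```python
import numpy as np

def absorbance(sample_intensity, blank_intensity):
    """Absorbance of the molybdenum-blue spot relative to a blank, A = -log10(I / I0)."""
    return -np.log10(sample_intensity / blank_intensity)

# Hypothetical calibration: mean infrared pixel intensities of standards of known concentration
conc_ppb = np.array([0, 250, 500, 1000, 2000])            # phosphate standards
intensity = np.array([210.0, 188.0, 168.0, 135.0, 88.0])  # mean ROI intensity from the IR camera
A = absorbance(intensity, intensity[0])

slope, intercept = np.polyfit(conc_ppb, A, 1)              # linear calibration fit
unknown_A = absorbance(150.0, intensity[0])                 # intensity measured on an unknown strip
print(f"estimated concentration: {(unknown_A - intercept) / slope:.0f} ppb")
```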
Procedia PDF Downloads 123
6255 Load Balancing Technique for Energy-Efficiency in Cloud Computing
Authors: Rani Danavath, V. B. Narsimha
Abstract:
Cloud computing is emerging as a new paradigm of large-scale distributed computing. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models. Load balancing is one of the main challenges in cloud computing; it is required to distribute the dynamic workload across multiple nodes to ensure that no single node is overloaded. It helps in the optimal utilization of resources, enhancing the performance of the system. The goal of load balancing is to minimize resource consumption and the carbon emission rate, which is a direct need of cloud computing. This determines the need for new metrics, energy consumption and carbon emission, for energy-efficient load balancing techniques in cloud computing. Existing load balancing techniques mainly focus on reducing overhead, service response time and improving performance, but none of them have considered energy consumption and carbon emission. Therefore, in this paper we introduce a technique oriented towards energy efficiency. This energy-efficient load balancing technique can be used to improve the performance of cloud computing by balancing the workload across all the nodes in the cloud with minimum resource utilization, in turn reducing energy consumption and carbon emission to an extent, which will help to achieve green computing.
Keywords: cloud computing, distributed computing, energy efficiency, green computing, load balancing, energy consumption, carbon emission
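An illustrative greedy heuristic (not the paper's exact scheme) showing how an energy metric can drive placement decisions: each incoming task goes to the node with the lowest marginal energy cost, so lightly loaded, energy-frugal nodes are preferred. All node names and cost figures are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    power_per_unit: float          # assumed energy cost per unit of load
    load: float = 0.0
    tasks: list = field(default_factory=list)

def assign(task_id, demand, nodes):
    """Greedy energy-aware placement: send the task where the marginal energy cost is lowest."""
    best = min(nodes, key=lambda n: (n.load + demand) * n.power_per_unit)
    best.load += demand
    best.tasks.append(task_id)
    return best

nodes = [Node("vm-a", 1.0), Node("vm-b", 0.7), Node("vm-c", 1.3)]
for i, demand in enumerate([4, 2, 6, 3, 5, 1]):
    chosen = assign(i, demand, nodes)
    print(f"task {i} (demand {demand}) -> {chosen.name}")
print("total energy proxy:", sum(n.load * n.power_per_unit for n in nodes))
```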
Procedia PDF Downloads 449
6254 Health of Riveted Joints with Active and Passive Structural Health Monitoring Techniques
Authors: Javad Yarmahmoudi, Alireza Mirzaee
Abstract:
Many active and passive structural health monitoring (SHM) techniques have been developed for detecting defects in plates. Generally, riveted joints hold the plates together, and their failure may cause accidents. In this study, well-known active and passive methods were modified for the evaluation of the health of the riveted joints between the plates. The active method generated Lamb waves and monitored their propagation by using lead zirconate titanate (PZT) disks. The signal was analyzed by using wavelet transformations. The passive method used Fiber Bragg Grating (FBG) sensors and evaluated the spectral characteristics of the signals by using the Fast Fourier Transform (FFT). The results indicated that the existing methods designed for the evaluation of the health of individual plates may be used for the inspection of riveted joints with software modifications.
Keywords: structural health monitoring, SHM, active SHM, passive SHM, fiber Bragg grating sensor, lead zirconate titanate, PZT
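A minimal sketch of the passive-method signal processing, assuming a synthetic strain signal in place of FBG interrogator data and using a windowed FFT to expose the dominant spectral peak; the sampling rate and mode frequency are assumptions for illustration.

```python
import numpy as np

fs = 10_000.0                               # assumed sampling rate of the FBG interrogator, Hz
t = np.arange(0, 1.0, 1 / fs)
# Placeholder strain signal: a structural mode at 180 Hz plus noise (stands in for FBG data)
signal = np.sin(2 * np.pi * 180 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(signal * np.hanning(t.size)))   # windowed FFT magnitude
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print(f"dominant spectral peak at {freqs[np.argmax(spectrum[1:]) + 1]:.1f} Hz")
```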
Procedia PDF Downloads 327
6253 Heritage Buildings: An Inspiration for Energy Conservation under Solar Control – A Case Study of Hadoti Region of India
Authors: Abhinav Chaturvedi, Joohi Chaturvedi, Renu Chaturvedi
Abstract:
With rapid urbanization and population growth, more buildings are required to meet the increasing demand for shelter. 80% of the world's population lives in developing countries, but adequate energy is supplied to only 30% of it. In India the situation is even more difficult, as the majority of Indian villages are still deprived of energy and one-third of Indian households do not have an energy supply, so there is a big gap between energy demand and supply. Moreover, India produces around 65% of its energy from non-renewable sources, 25% of the energy is imported in the form of oil and gas, and only 10% of the total is generated from other sources like solar power, wind power, etc. Present-day structures are big energy consumers, consuming 40% of the total energy in providing comfort conditions to the users in the form of heating and cooling, 5% in building construction, 20% in transportation, 20% in industrial processes and 10% in other processes. If we minimize the heating, cooling and lighting load of the building, we can conserve a huge amount of energy for the future. Historically, buildings did not have artificial systems of cooling or heating. These buildings, especially in the Hadoti region, which has semi-arid climatic conditions, incorporate solar passive design techniques, which is the reason for the comfort inside them. So if we use appropriate elements of these heritage structures in present-day building design, we can find certain solutions to the energy crisis. The present paper describes various solar passive design techniques used in the past, which could be used in the present to reduce energy consumption.
Keywords: energy conservation, Hadoti region, solar passive design techniques, semi-arid climatic condition
Procedia PDF Downloads 475
6252 Techniques of Construction Management in Civil Engineering
Authors: Mamoon M. Atout
Abstract:
The Middle East Gulf region has witnessed rapid growth and development in many areas over the last two decades. The development of the real-estate sector, the construction industry and infrastructure projects constitutes a major share of the development that has contributed to the advancement of the Gulf countries. Construction industry projects were planned and managed by different types of experts, who came from all over the world with different types of experience in construction management and industry. Some of these projects were completed on time, while many were not, due to many accumulating factors. These accumulated factors are considered the principal reason for the problems experienced at the project construction stage, which reflected negatively on project success. Specific causes of delay have been identified by construction managers so that unexpected delays can be avoided through proper analysis and consideration of implications such as risk assessment and analysis of potential problems, to ensure that projects will be delivered on time. Construction management implications were adopted and considered by project managers who have experience and knowledge in applying the techniques of the engineering construction management system. The aim of this research is to determine the benefits of the implications of construction management for the construction team and the level of consideration of these techniques and processes during the project development and construction phases to avoid any delay in the projects. It also aims to determine the factors that contribute to project completion delays in case project managers are not well committed to their roles and responsibilities. The results of the analysis will determine the necessity of the applications required by the project team to avoid the causes of delays and help them deliver projects on time, e.g. verifying tender documents and quantities and preparing the construction method of the project.
Keywords: construction management, control process, cost control, planning and scheduling
Procedia PDF Downloads 247
6251 The Geometry of Natural Formation: an Application of Geometrical Analysis for Complex Natural Order of Pomegranate
Authors: Anahita Aris
Abstract:
Geometry always plays a key role in natural structures, which can be a source of inspiration for architects and urban designers to create spaces. By understanding formative principles in nature, a variety of options can be provided that lead to freedom of formation. The main purpose of this paper is to analyze the geometrical order found in the pomegranate to find formative principles explaining its complex structure. The point is how the spherical arils of the pomegranate press together inside the fruit and fill the space as they expand during growth, forming a self-organized system in which each aril is unique in size, topology and shape. The main challenge of this paper is to use advanced architectural modeling techniques to discover these principles.
Keywords: advanced modeling techniques, architectural modeling, computational design, the geometry of natural formation, geometrical analysis, the natural order of pomegranate, Voronoi diagrams
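A hedged sketch of one way to study such a packing computationally, assuming hypothetical aril centres in a 2-D cross-section: a Voronoi tessellation (as named in the keywords) assigns each aril a unique cell whose shape and area can be measured.

```python
import numpy as np
from scipy.spatial import Voronoi

# Hypothetical aril centres: points packed inside a unit-square cross-section of the fruit
rng = np.random.default_rng(2)
centres = rng.random((40, 2))

vor = Voronoi(centres)                        # Voronoi tessellation of the packing
for i in range(5):                            # inspect the first few cells
    region = vor.regions[vor.point_region[i]]
    if -1 in region or not region:            # skip unbounded cells on the boundary
        continue
    verts = vor.vertices[region]
    d = verts - verts.mean(axis=0)
    verts = verts[np.argsort(np.arctan2(d[:, 1], d[:, 0]))]   # order vertices counterclockwise
    # Shoelace formula for the cell area: each aril gets a unique cell shape and size
    area = 0.5 * abs(np.dot(verts[:, 0], np.roll(verts[:, 1], 1)) -
                     np.dot(verts[:, 1], np.roll(verts[:, 0], 1)))
    print(f"aril {i}: cell with {len(region)} vertices, area {area:.4f}")
```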
Procedia PDF Downloads 220
6250 Algorithms used in Spatial Data Mining GIS
Authors: Vahid Bairami Rad
Abstract:
Extracting knowledge from spatial data such as GIS data is important to reduce the data and extract information. Therefore, the development of new techniques and tools that support the human in transforming data into useful knowledge has been the focus of the relatively new and interdisciplinary research area ‘knowledge discovery in databases’. Thus, we introduce a set of database primitives, or basic operations, for spatial data mining which are sufficient to express most of the spatial data mining algorithms from the literature. This approach has several advantages. Similar to the relational standard language SQL, the use of standard primitives will speed up the development of new data mining algorithms and will also make them more portable. We introduce a database-oriented framework for spatial data mining which is based on the concepts of neighborhood graphs and paths. A small set of basic operations on these graphs and paths is defined as database primitives for spatial data mining. Furthermore, techniques to efficiently support the database primitives by a commercial DBMS are presented.
Keywords: spatial data base, knowledge discovery database, data mining, spatial relationship, predictive data mining
Procedia PDF Downloads 460
6249 Quantum Statistical Machine Learning and Quantum Time Series
Authors: Omar Alzeley, Sergey Utev
Abstract:
Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of optimization, and this optimization is central to learning theory. Time series analysis is one approach to complex systems, in which the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging. The quantum probabilistic technique is used to motivate the construction of our QTS model. The QTS model resembles the quantum dynamic model which was applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter algorithm, are applied to estimate and analyse the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo have been used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of the application of QTS in the field of quantum chaos was to find a model that explains chaotic behaviour. Perhaps this model will reveal another insight into quantum chaos.
Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series
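A minimal sketch of the Kalman filtering step mentioned above, assuming a scalar linear-Gaussian (AR(1)-like) latent state in place of the paper's full QTS model; the parameter values are illustrative only.

```python
import numpy as np

# Simulate a latent AR(1) state observed through additive noise
rng = np.random.default_rng(0)
phi, q, r, n = 0.9, 0.1, 0.5, 200             # transition coefficient, process noise, observation noise
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(scale=np.sqrt(q))
y = x + rng.normal(scale=np.sqrt(r), size=n)  # observed series

# Scalar Kalman filter recursion: predict, then update with each observation
x_hat, p = 0.0, 1.0
estimates = []
for obs in y:
    x_pred, p_pred = phi * x_hat, phi ** 2 * p + q               # predict step
    k = p_pred / (p_pred + r)                                    # Kalman gain
    x_hat, p = x_pred + k * (obs - x_pred), (1 - k) * p_pred     # update step
    estimates.append(x_hat)
print("RMSE of filtered state:", np.sqrt(np.mean((np.array(estimates) - x) ** 2)).round(3))
```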
Procedia PDF Downloads 469
6248 A Comprehensive Study on Quality Assurance in Game Development
Authors: Maria Komal, Zaineb Khalil, Mehreen Sirshar
Abstract:
Due to recent technological advancements, games have become one of the most in-demand applications. The gaming industry is growing rapidly, and the key to success in this industry is the development of good-quality games, which is a highly competitive issue. The ultimate goal of game developers is to provide player satisfaction by developing high-quality games. This research is a comprehensive survey of the techniques followed by game industries to ensure game quality. After analysis of various techniques, it has been found that quality simulation according to ISO standards and playtest methods are used to ensure game quality. Because game development requires a cross-disciplinary team, an increasing trend towards distributed game development has been observed. This paper evaluates the strengths and weaknesses of current methodologies used in the game industry and draws a conclusion. We have also proposed quality parameters which can be used as a heuristic framework to identify those attributes which have high testing priorities.
Keywords: game development, computer games, video games, gaming industry, quality assurance, playability, user experience
Procedia PDF Downloads 534