Search results for: hazard of noise
958 Multi-Spectral Deep Learning Models for Forest Fire Detection
Authors: Smitha Haridasan, Zelalem Demissie, Atri Dutta, Ajita Rattani
Abstract:
Aided by the wind, all it takes is one ember and a few minutes to create a wildfire. Wildfires are growing in frequency and size due to climate change. Wildfires and their consequences are among the major environmental concerns. Every year, millions of hectares of forest are destroyed around the world, causing mass destruction and human casualties. Thus, early detection of wildfires becomes critical to mitigating this threat. Many computer vision-based techniques have been proposed for the early detection of forest fires using video surveillance. Several computer vision-based methods have been proposed to predict and detect forest fires in various spectrums, namely RGB, HSV, and YCbCr. The aim of this paper is to propose a multi-spectral deep learning model that combines information from different spectrums at intermediate layers for accurate fire detection. A heterogeneous dataset assembled from publicly available datasets is used for model training and evaluation in this study. The experimental results show that multi-spectral deep learning models can obtain an improvement of about 4.68% over those based on a single spectrum for fire detection.
Keywords: deep learning, forest fire detection, multi-spectral learning, natural hazard detection
Procedia PDF Downloads 241
957 Inverse Heat Transfer Analysis of a Melting Furnace Using Levenberg-Marquardt Method
Authors: Mohamed Hafid, Marcel Lacroix
Abstract:
This study presents a simple inverse heat transfer procedure for predicting the wall erosion and the time-varying thickness of the protective bank that covers the inside surface of the refractory brick wall of a melting furnace. The direct problem is solved by using the Finite-Volume model. The melting/solidification process is modeled using the enthalpy method. The inverse procedure rests on the Levenberg-Marquardt method combined with the Broyden method. The effect of the location of the temperature sensors and of the measurement noise on the inverse predictions is investigated. Recommendations are made concerning the location of the temperature sensors.
Keywords: melting furnace, inverse heat transfer, enthalpy method, Levenberg-Marquardt method
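The inverse procedure above hinges on Levenberg-Marquardt updates. As a rough illustration of how such damped Gauss-Newton steps recover model parameters from measurements, here is a minimal sketch fitting a toy exponential cooling curve; the furnace model, the Broyden combination and all numbers are stand-ins, not the paper's.

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, p0, n_iter=50, lam=1e-3):
    """Minimal Levenberg-Marquardt loop: damped Gauss-Newton steps."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iter):
        r = residual(p)
        J = jacobian(p)
        JtJ = J.T @ J
        # Marquardt damping: scale the diagonal of the normal equations.
        step = np.linalg.solve(JtJ + lam * np.diag(np.diag(JtJ)), J.T @ r)
        p_new = p - step
        if np.sum(residual(p_new) ** 2) < np.sum(r ** 2):
            p, lam = p_new, lam / 2   # accept step, relax damping
        else:
            lam *= 10                 # reject step, increase damping
    return p

# Toy "measurement": an exponential cooling curve with known parameters.
t = np.linspace(0.0, 2.0, 30)
y = 3.0 * np.exp(-1.5 * t)

def residual(p):
    return p[0] * np.exp(-p[1] * t) - y

def jacobian(p):
    return np.column_stack([np.exp(-p[1] * t),
                            -p[0] * t * np.exp(-p[1] * t)])

p_est = levenberg_marquardt(residual, jacobian, [1.0, 1.0])
```

In the paper's setting the residual would compare measured and simulated wall temperatures, and the Jacobian would come from the finite-volume direct problem (or Broyden updates) rather than a closed form.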
Procedia PDF Downloads 324
956 Experimental Study of the Sound Absorption of a Geopolymer Panel with a Textile Component Designed for a Railway Corridor
Authors: Ludmila Fridrichová, Roman Knížek, Pavel Němeček, Katarzyna Ewa Buczkowska
Abstract:
The design of the sound absorption panel, which consists of three layers, is presented in this study. The first layer of the panel is perforated and provides sound transmission to the inner part of the panel. The second layer is composed of a bulk material whose purpose is to absorb as much noise as possible. The third layer of the panel has two functions: the first is to ensure the strength of the panel, and the second is to reflect the sound back into the bulk layer. Experimental results have shown that the size of the holes in the perforated panel affects the sound absorption at the required frequency, and the percentage of filling of the perforated area affects the quantity of sound absorbed.
Keywords: sound absorption, railway corridor, health, textile waste, natural fibres, concrete
Procedia PDF Downloads 14
955 Multi-Robotic Partial Disassembly Line Balancing with Robotic Efficiency Difference via HNSGA-II
Authors: Tao Yin, Zeqiang Zhang, Wei Liang, Yanqing Zeng, Yu Zhang
Abstract:
To accelerate the remanufacturing process of electronic waste products, this study designs a partial disassembly line with multi-robotic stations to effectively dispose of excessive waste. The multi-robotic partial disassembly line is a technical upgrade to the existing manual disassembly line. Balancing optimization can make the disassembly line smoother and more efficient. For partial disassembly line balancing with the multi-robotic station (PDLBMRS), a mixed-integer programming model (MIPM) considering the robotic efficiency differences is established to minimize cycle time, energy consumption and hazard index and to calculate their optimal global values. Besides, an enhanced NSGA-II algorithm (HNSGA-II) is proposed to optimize PDLBMRS efficiently. Finally, MIPM and HNSGA-II are applied to an actual mixed disassembly case of two types of computers. The comparison of the results obtained by GUROBI and HNSGA-II verifies the correctness of the model and the excellent performance of the algorithm, and the obtained Pareto solution set provides multiple options for decision-makers.
Keywords: waste disposal, disassembly line balancing, multi-robot station, robotic efficiency difference, HNSGA-II
Procedia PDF Downloads 237
954 Stochastic Pi Calculus in Financial Markets: An Alternate Approach to High Frequency Trading
Authors: Jerome Joshi
Abstract:
The paper presents the modelling of financial markets using the Stochastic Pi Calculus model. The Stochastic Pi Calculus model is mainly used for biological applications; however, the features of this model promote its use in financial markets, most prominently in high frequency trading. The trading system can be broadly classified into the exchange, market makers or intermediary traders, and fundamental traders. The exchange is where the action of the trade is executed, and the two types of traders act as market participants in the exchange. High frequency trading, with its complex networks and numerous market participants (intermediary and fundamental traders), poses a difficulty while modelling. It involves participants seeking the advantage of complex trading algorithms and high execution speeds to carry out large volumes of trades. To earn profits from each trade, the trader must be at the top of the order book quite frequently by executing or processing multiple trades simultaneously. This requires highly automated systems as well as the right sentiment to outperform other traders. However, always being at the top of the book is also not best for the trader, since it was the reason for the outbreak of the 'Hot-Potato Effect,' which in turn demands a better and more efficient model. The characteristics of the model should be such that it is flexible and has diverse applications. Therefore, a model which has its application in a similar field characterized by such difficulty should be chosen. It should also be flexible in its simulation so that it can be further extended and adapted for future research, as well as be equipped with tools that allow it to be used in the field of finance. In this case, the Stochastic Pi Calculus model seems to be an ideal fit for financial applications, owing to its proven record in the field of biology.
It is an extension of the original Pi Calculus model and acts as a solution and an alternative to the previously flawed algorithm, provided the application of this model is further extended. This model focuses on solving the problem which led to the 'Flash Crash', namely the 'Hot-Potato Effect.' The model consists of small sub-systems, which can be integrated to form a large system. It is designed in such a way that the behavior of 'noise traders' is considered a random process or noise in the system. While modelling, to get a better understanding of the problem, a broader picture is taken into consideration with the trader, the system, and the market participants. The paper goes on to explain trading in exchanges, types of traders, high frequency trading, the 'Flash Crash,' the 'Hot-Potato Effect,' evaluation of orders and time delay in further detail. For the future, there is a need to focus on the calibration of the modules so that they interact perfectly with one another. This model, with its application extended, would provide a basis for further research in the fields of finance and computing.
Keywords: concurrent computing, high frequency trading, financial markets, stochastic pi calculus
Procedia PDF Downloads 77
953 Robust Noisy Speech Identification Using Frame Classifier Derived Features
Authors: Punnoose A. K.
Abstract:
This paper presents an approach for identifying noisy speech recordings using a multi-layer perceptron (MLP) trained to predict phonemes from acoustic features. Characteristics of the MLP posteriors are explored for clean speech and noisy speech at the frame level. Appropriate density functions are used to fit the softmax probability of the clean and noisy speech. A function that takes into account the ratio of the softmax probability density of noisy speech to clean speech is formulated. This phoneme-independent scoring is weighted using phoneme-specific weights to make the scoring more robust. Simple thresholding is used to identify the noisy speech recordings from the clean speech recordings. The approach is benchmarked on standard databases, with a focus on precision.
Keywords: noisy speech identification, speech pre-processing, noise robustness, feature engineering
Procedia PDF Downloads 127
952 Omni-Relay (OR) Scheme-Aided LTE-A Communication Systems
Authors: Hassan Mahasneh, Abu Sesay
Abstract:
We propose the use of relay terminals at the cell edge of an LTE-based cellular system. Each relay terminal is equipped with an omni-directional antenna. We refer to this scheme as the Omni-Relay (OR) scheme. The OR scheme coordinates the inter-cell interference (ICI) stemming from adjacent cells and increases the desired signal level at cell-edge regions. To validate the performance of the OR scheme, we derive the average signal-to-interference plus noise ratio (SINR) and the average capacity and compare them with those of the conventional universal frequency reuse factor (UFRF) scheme. The results show that the proposed OR scheme provides higher average SINR and average capacity compared to the UFRF due to the assistance of the distributed relay nodes.
Keywords: the UFRF scheme, the OR scheme, ICI, relay terminals, SINR, spectral efficiency
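To make the comparison metric concrete, the SINR and Shannon capacity per unit bandwidth can be sketched as below. The power values are invented placeholders for a cell-edge user, not the paper's link budget; they only illustrate how coordinating interference and boosting the desired signal raises both metrics.

```python
import math

def sinr_db(signal_mw, interference_mw, noise_mw):
    """SINR in dB from received signal, per-interferer powers and noise (mW)."""
    return 10 * math.log10(signal_mw / (sum(interference_mw) + noise_mw))

def capacity_bps_hz(sinr_linear):
    """Shannon capacity per unit bandwidth, log2(1 + SINR)."""
    return math.log2(1 + sinr_linear)

# Cell-edge user: with a relay, the desired signal is boosted and part of the
# inter-cell interference is coordinated away (illustrative numbers only).
noise = 0.01
without_relay = sinr_db(1.0, [0.5, 0.3], noise)   # two interfering cells
with_relay = sinr_db(2.0, [0.2, 0.1], noise)
```

Averaging such per-user values over the cell-edge region would give the average SINR and average capacity the abstract compares.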
Procedia PDF Downloads 341
951 New Features for Copy-Move Image Forgery Detection
Authors: Michael Zimba
Abstract:
A novel set of features for copy-move image forgery (CMIF) detection is proposed. The proposed set presents a new approach which relies on electrostatic field theory (EFT). Solely for the purpose of reducing the dimension of a suspicious image, the method first performs a discrete wavelet transform (DWT) of the suspicious image and extracts only the approximation subband. The extracted subband is then bijectively mapped onto a virtual electrostatic field where concepts of EFT are utilised to extract robust features. The extracted features are shown to be invariant to additive noise, JPEG compression, and affine transformation. The proposed features can also be used in general object matching.
Keywords: virtual electrostatic field, features, affine transformation, copy-move image forgery
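The DWT dimensionality-reduction step can be sketched with a single-level Haar transform that keeps only the LL (approximation) subband; this is a generic illustration, not the author's implementation, and the tiny test image is made up.

```python
import numpy as np

def haar_approximation(image):
    """One level of a 2-D Haar DWT, keeping only the LL (approximation) subband.

    With orthonormal Haar filters the LL coefficient of each 2x2 block is the
    block sum divided by 2, so each image dimension is halved and the detail
    subbands (LH, HL, HH) are discarded, as described in the abstract.
    """
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    img = img[: h - h % 2, : w - w % 2]          # crop to even size
    return (img[0::2, 0::2] + img[0::2, 1::2] +
            img[1::2, 0::2] + img[1::2, 1::2]) / 2.0

img = np.arange(16, dtype=float).reshape(4, 4)   # toy 4x4 "suspicious image"
ll = haar_approximation(img)                      # 2x2 approximation subband
```

The resulting LL subband is the reduced representation that would then be mapped onto the virtual electrostatic field.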
Procedia PDF Downloads 543
950 Image Steganography Using Predictive Coding for Secure Transmission
Authors: Baljit Singh Khehra, Jagreeti Kaur
Abstract:
In this paper, a steganographic strategy is used to hide a text file inside an image. To increase the storage limit, predictive coding is utilized to embed the information. In the proposed scheme, one can exchange secure information by means of the predictive coding methodology. The predictive coding produces a high-quality stego-image, whose pixels are utilized to embed the secret information. The proposed information hiding scheme is powerful compared with the existing methodologies, and by applying this strategy, users can conceal information efficiently. Entropy, standard deviation, mean square error and peak signal-to-noise ratio are the parameters used to evaluate the proposed methodology. The results of the proposed approach are quite promising.
Keywords: cryptography, steganography, reversible image, predictive coding
Procedia PDF Downloads 417
949 Survival Analysis Based Delivery Time Estimates for Display FAB
Authors: Paul Han, Jun-Geol Baek
Abstract:
In the flat panel display industry, the scheduling and dispatching system, which meets production target quantities and production deadlines, is the major production management system controlling each facility's production order and the distribution of WIP (Work in Process). In the dispatching system, delivery time is a key factor for determining when a lot can be supplied to a facility. In this paper, we use survival analysis methods to identify the main factors and build a forecasting model of delivery time. Of the survival analysis techniques for selecting important explanatory variables, the Cox proportional hazards model is used. To make a prediction model, the Accelerated Failure Time (AFT) model was used. Performance comparisons were conducted with two other models: the technical statistics model based on transfer history and a linear regression model using the same explanatory variables as the AFT model. Under the Mean Square Error (MSE) criterion, the AFT model decreased error by 33.8% compared to the existing prediction model and by 5.3% compared to the linear regression model. This survival analysis approach is applicable to implementing a delivery time estimator in display manufacturing, and it can contribute to improving the productivity and reliability of the production management system.
Keywords: delivery time, survival analysis, Cox PH model, accelerated failure time model
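Survival analysis treats lots still in transit as censored observations rather than discarding them. The paper fits Cox PH and AFT models; as a simpler self-contained illustration of the same censoring-aware idea, a Kaplan-Meier estimate of a delivery-time survival curve might look like this (the durations and censoring flags are made up):

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  : observed durations (e.g. hours until a lot is delivered)
    events : 1 if delivery was observed, 0 if censored (still waiting)
    Returns a list of (time, survival probability) steps.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    for t, grp in groupby(data, key=lambda x: x[0]):
        grp = list(grp)
        d = sum(e for _, e in grp)       # deliveries observed at time t
        if d > 0:
            s *= 1.0 - d / n_at_risk
            curve.append((t, s))
        n_at_risk -= len(grp)            # events and censored leave the risk set
    return curve

times = [2, 3, 3, 5, 8, 8]
events = [1, 1, 0, 1, 1, 0]
curve = kaplan_meier(times, events)
```

An AFT model goes one step further by regressing the logarithm of such durations on explanatory variables, which is what makes it usable as a per-lot delivery time predictor.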
Procedia PDF Downloads 543
948 Research on Sensitivity of Geological Disasters in Road Area Based on Analytic Hierarchy Process
Authors: Li Yongyi
Abstract:
In order to explore the distribution of geological disasters within the expressway areas of Shaanxi Province, the Analytic Hierarchy Process is applied on a geographic information system technology platform, and indicators such as ground elevation, rainfall and vegetation coverage are selected for analysis in a sensitivity evaluation of the expressway areas. The results show that disasters in the highway areas of Shaanxi Province are mainly distributed in the southern mountainous areas and are dominated by landslides. The disaster area ratio basically increases with increasing ground elevation, surface slope, surface undulation, rainfall, and vegetation coverage, and shows a decreasing trend with increasing distance from the river. After grading the disaster sensitivity within 5 km of the expressways, the extremely sensitive, highly sensitive, medium sensitive, low sensitive, and extremely low sensitive areas account for 8.17%, 15.80%, 22.99%, 26.22%, and 26.82%, respectively. Highly sensitive road areas are mainly distributed in southern Shaanxi.
Keywords: highway engineering, sensitivity, analytic hierarchy process, geological hazard, road area
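The AHP weighting step can be sketched as follows: a pairwise comparison matrix over the indicators is reduced to priority weights via its principal eigenvector, with a consistency ratio as a sanity check. The comparison values below are hypothetical judgements, not the study's calibrated ones.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights and consistency ratio from an AHP pairwise matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                  # principal eigenpair
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                 # normalized priority weights
    lam_max = eigvals[k].real
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]   # random index table
    cr = 0.0 if ri == 0 else (lam_max - n) / (n - 1) / ri
    return w, cr

# Hypothetical judgements: elevation vs rainfall vs vegetation coverage,
# on the usual 1-9 importance scale.
A = [[1, 3, 5],
     [1 / 3, 1, 3],
     [1 / 5, 1 / 3, 1]]
w, cr = ahp_weights(A)
```

A consistency ratio below 0.1 is the conventional threshold for accepting the judgements; the weights would then multiply the rasterized indicator layers on the GIS platform.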
Procedia PDF Downloads 101
947 Design of Target Selection for Pedestrian Autonomous Emergency Braking System
Authors: Tao Song, Hao Cheng, Guangfeng Tian, Chuang Xu
Abstract:
An autonomous emergency braking (AEB) system is an advanced driver assistance system that enables vehicle collision avoidance and pedestrian collision avoidance to improve vehicle safety. At present, because pedestrian targets are small and highly mobile, the pedestrian AEB system faces greater technical difficulties and higher functional requirements. In this paper, a method of pedestrian target selection based on a variable-width funnel is proposed. Based on the current and predicted positions of pedestrians, the relative position of vehicle and pedestrian at the time of collision is calculated, and different braking strategies are adopted according to the hazard level of the pedestrian collision. Under the CNCAP standard operating conditions, the method considering only the current position of pedestrians is compared with the method considering the predicted pedestrian position, and the fixed-width funnel is compared with the variable-width funnel. The results show that, based on the variable-width funnel, the choice of pedestrian target is more accurate and the intervention of the AEB system is timed more reasonably when the predicted position of the pedestrian target and the vehicle's lateral motion are considered.
Keywords: automatic emergency braking system, pedestrian target selection, TTC, variable width funnel
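TTC-keyed staging of the kind the abstract describes can be sketched in a few lines. The thresholds below are invented for illustration only (they are not CNCAP or production calibration values), and real systems stage on predicted trajectories rather than a constant closing speed.

```python
def time_to_collision(rel_distance_m, closing_speed_mps):
    """TTC: time until impact if the relative speed stays constant."""
    if closing_speed_mps <= 0:
        return float("inf")          # not closing in on the pedestrian
    return rel_distance_m / closing_speed_mps

def braking_stage(ttc_s):
    """Illustrative staged AEB response keyed to TTC thresholds (hypothetical)."""
    if ttc_s > 2.6:
        return "no action"
    if ttc_s > 1.6:
        return "warning"
    if ttc_s > 0.6:
        return "partial braking"
    return "full braking"

ttc = time_to_collision(20.0, 10.0)   # 20 m gap, closing at 10 m/s
stage = braking_stage(ttc)
```

The variable-width funnel refines the target-selection step before this: it decides which pedestrian, if any, is on a collision course and therefore feeds the TTC computation.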
Procedia PDF Downloads 157
946 Analysis and Performance of Handover in Universal Mobile Telecommunications System (UMTS) Network Using OPNET Modeller
Authors: Latif Adnane, Benaatou Wafa, Pla Vicent
Abstract:
Handover is of great significance for achieving seamless connectivity in wireless networks. This paper gives an overview of the main factors affected by the soft and hard handover techniques. To understand the handover process in the Universal Mobile Telecommunications System (UMTS) network, different statistics are calculated. This paper focuses on the quality of service (QoS) of soft and hard handover in the UMTS network, including the analysis of received power, signal-to-noise ratio, throughput, traffic delay, traffic received, total transmit load, end-to-end delay and upload response time using the OPNET simulator.
Keywords: handover, UMTS, mobility, simulation, OPNET modeler
Procedia PDF Downloads 321
945 Improving Decision Support for Organ Transplant
Authors: Ian McCulloh, Andrew Placona, Darren Stewart, Daniel Gause, Kevin Kiernan, Morgan Stuart, Christopher Zinner, Laura Cartwright
Abstract:
An estimated 22-25% of viable deceased donor kidneys are discarded every year in the US, while waitlisted candidates are dying every day. As many as 85% of transplanted organs were refused at least once for a patient who scored higher on the match list. There are hundreds of clinical variables involved in making a clinical transplant decision, and there is rarely an ideal match. Decision makers exhibit an optimism bias, whereby they may refuse an organ offer assuming a better match is imminent. We propose a semi-parametric Cox proportional hazards model, augmented by an accelerated failure time model based on patient-specific suitable organ supply and demand, to estimate a time-to-next-offer. Performance is assessed with Cox-Snell residuals and decision curve analysis, demonstrating improved decision support for up to a 5-year outlook. Providing clinical decision makers with quantitative evidence of likely patient outcomes (e.g., time to next offer and the mortality associated with waiting) may improve decisions and reduce optimism bias, thus reducing discarded organs and matching more patients on the waitlist.
Keywords: decision science, KDPI, optimism bias, organ transplant
Procedia PDF Downloads 105
944 Empirical Green’s Function Technique for Accelerogram Synthesis: The Problem of the Use for Marine Seismic Hazard Assessment
Authors: Artem A. Krylov
Abstract:
Instrumental seismological research in water areas is complicated and expensive, which leads to a lack of strong motion records in most offshore regions. At the same time, the number of offshore industrial infrastructure objects, such as oil rigs and subsea pipelines, is constantly increasing. The empirical Green’s function technique has proved to be very effective for accelerogram synthesis under the conditions of a poorly described seismic wave propagation medium. But the selection of a suitable small-earthquake record as an empirical Green’s function is a problem in offshore regions, because seafloor instrumental seismological investigations are usually short and yield only weak micro-earthquake recordings. An approach based on moving-average smoothing in the frequency domain is presented for preliminary processing of weak micro-earthquake records before using them as empirical Green’s functions. The method results in significant waveform correction for the modeled event. The case study of the 2009 L’Aquila earthquake was used to demonstrate the suitability of the method. This work was supported by the Russian Foundation of Basic Research (project № 18-35-00474 mol_a).
Keywords: accelerogram synthesis, empirical Green's function, marine seismology, microearthquakes
Procedia PDF Downloads 324
943 Bridges Seismic Isolation Using CNT Reinforced Polymer Bearings
Authors: Mohamed Attia, Vissarion Papadopoulos
Abstract:
There is no doubt that structures continuously deteriorate as a result of multiple hazards, which can be divided into natural hazards (e.g., earthquakes, floods, winds) and hazards due to human behavior (e.g., ship collisions, excessive traffic, terrorist attacks). There have been numerous attempts to address the catastrophic consequences of these hazards through structural design and the safety factors within design codes, but there has not been much research addressing solutions through the use of new high-performance materials that can be more effective than usual materials such as reinforced concrete and steel. To illustrate the effect of one of these new high-performance materials, carbon nanotube-reinforced polymer (CNT/polymer) bearings with different weight fractions were simulated using ABAQUS as seismic isolation components in the connection between a bridge superstructure and the substructure. The results of the analyses showed a significant increase in the time period of the bridge and a clear decrease in the bending moment at the base of the bridge piers at each time step of the time-history analysis when CNT/polymer bearings were used, compared to the case of direct contact between the superstructure of the bridge and the substructure.
Keywords: seismic isolation, bridges damage, earthquake hazard, earthquake resistant structures
Procedia PDF Downloads 195
942 A Bayesian Model with Improved Prior in Extreme Value Problems
Authors: Eva L. Sanjuán, Jacinto Martín, M. Isabel Parra, Mario M. Pizarro
Abstract:
In Extreme Value Theory, inference for the parameters of the distribution is made employing only a small part of the observed values. When block maxima are taken, many data are discarded. We developed a new Bayesian inference model to seize all the information provided by the data, introducing informative priors and using the relations between baseline and limit parameters. Firstly, we studied the accuracy of the new model for three baseline distributions that lead to a Gumbel extreme distribution: Exponential, Normal and Gumbel. Secondly, we considered mixtures of Normal variables, to simulate practical situations when data do not adjust to pure distributions because of perturbations (noise).
Keywords: Bayesian inference, extreme value theory, Gumbel distribution, highly informative prior
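To illustrate the block-maxima setting the abstract starts from (though not the Bayesian model itself), the following sketch draws from an Exponential baseline, keeps only each block's maximum, and fits Gumbel parameters by the method of moments. For an Exp(1) baseline and blocks of size m, the limiting Gumbel has location ln m and unit scale; the sample size and block size below are arbitrary.

```python
import math
import random

def block_maxima(series, block_size):
    """Split a series into consecutive blocks and keep each block's maximum."""
    return [max(series[i:i + block_size])
            for i in range(0, len(series) - block_size + 1, block_size)]

def gumbel_fit_moments(maxima):
    """Method-of-moments Gumbel fit (location mu, scale beta)."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi
    mu = mean - 0.5772156649 * beta          # Euler-Mascheroni constant
    return mu, beta

random.seed(0)
data = [random.expovariate(1.0) for _ in range(5000)]   # Exponential baseline
maxima = block_maxima(data, 100)                        # 50 block maxima
mu, beta = gumbel_fit_moments(maxima)                   # expect mu near ln 100
```

Note how only 50 of the 5000 observations survive the block-maxima step; the paper's point is that an informative prior linking baseline and limit parameters recovers some of the discarded information.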
Procedia PDF Downloads 198
941 Synthesis and Characterization of Renewable Resource Based Green Epoxy Coating
Authors: Sukanya Pradhan, Smita Mohanty, S. K Nayak
Abstract:
Plant oils are a great renewable source and a reliable starting material for accessing new products with a wide spectrum of structural and functional variations. Even though petroleum products might render the same, they would also impose a high risk of environmental and health hazards. Since epoxidized vegetable oils are easily available, eco-compatible, non-toxic and renewable, they have drawn much attention in the polymer industry, especially for the development of eco-friendly coating materials. In this study, a waterborne epoxy coating was prepared from epoxidized soybean oil by using triethanolamine. Because of its hydrophobic nature, it was a tough and tedious task to make it hydrophilic. The hydrophobic biobased epoxy was modified into a waterborne epoxy with the help of a plant-based anhydride as the curing agent. Physico-mechanical, chemical resistance and thermal analyses of the green coating material were carried out, which showed that it has good physico-mechanical and chemical resistance properties and is environmentally friendly. The complete characterization of the final material was done in terms of scratch hardness, gloss test, impact resistance, adhesion and bend test.
Keywords: epoxidized soybean oil, waterborne, curing agent, green coating
Procedia PDF Downloads 541
940 Perceptual Organization within Temporal Displacement
Authors: Michele Sinico
Abstract:
The psychological present has an actual extension. When a sequence of instantaneous stimuli falls within this short interval of time, observers perceive a compresence of events in succession, and the temporal order depends on the qualitative relationships between the perceptual properties of the events. Two experiments were carried out to study the influence of perceptual grouping, with and without temporal displacement, on the duration of auditory sequences. The psychophysical method of adjustment was adopted. The first experiment investigated the effect of temporal displacement of a white noise on sequence duration. The second experiment investigated the effect of temporal displacement, along the pitch dimension, on the temporal shortening of the sequence. The results suggest that the temporal order of sounds, in the case of temporal displacement, is organized along the pitch dimension.
Keywords: time perception, perceptual present, temporal displacement, Gestalt laws of perceptual organization
Procedia PDF Downloads 251
939 Combined Safety and Cybersecurity Risk Assessment for Intelligent Distributed Grids
Authors: Anders Thorsén, Behrooz Sangchoolie, Peter Folkesson, Ted Strandberg
Abstract:
As more parts of the power grid become connected to the internet, the risk of cyberattacks increases. To identify the cybersecurity threats and subsequently reduce vulnerabilities, the common practice is to carry out a cybersecurity risk assessment. For safety classified systems and products, there is also a need for safety risk assessments in addition to the cybersecurity risk assessment in order to identify and reduce safety risks. These two risk assessments are usually done separately, but since cybersecurity and functional safety are often related, a more comprehensive method covering both aspects is needed. Some work addressing this has been done for specific domains like the automotive domain, but more general methods suitable for, e.g., intelligent distributed grids, are still missing. One such method from the automotive domain is the Security-Aware Hazard Analysis and Risk Assessment (SAHARA) method that combines safety and cybersecurity risk assessments. This paper presents an approach where the SAHARA method has been modified in order to be more suitable for larger distributed systems. The adapted SAHARA method has a more general risk assessment approach than the original SAHARA. The proposed method has been successfully applied on two use cases of an intelligent distributed grid.
Keywords: intelligent distribution grids, threat analysis, risk assessment, safety, cybersecurity
Procedia PDF Downloads 153
938 Planning Strategies for Urban Flood Mitigation through Different Case Studies of Best Practices across the World
Authors: Bismina Akbar, Smitha M. V.
Abstract:
Flooding is a global phenomenon that causes widespread devastation, economic damage, and loss of human lives. In the past twenty years, the number of reported flood events has increased significantly. Millions of people around the globe are at risk of flooding from coastal, dam-break, groundwater, and urban surface water and wastewater sources. Climate change is one of the important causes, since it affects the river network both directly and indirectly. Although the contribution of climate change is undeniable, human activities also increase the frequency of floods. There are different types of floods, such as flash floods, coastal floods, urban floods, river (or fluvial) floods, and ponding (or pluvial) floods. This study focuses on formulating mitigation strategies for urban flood risk reduction through the analysis of best-practice case studies from China, Japan, Indonesia, and Brazil. The mitigation measures suggest that, apart from structural and non-structural measures, environmental considerations like blue-green solutions are beneficial for flood risk reduction. Risk-informed master plans are also essential nowadays to support risk-based decision processes that enable greater sustainability and resilience.
Keywords: hazard, mitigation, risk reduction, urban flood
Procedia PDF Downloads 77
937 Evaluation of Expected Annual Loss Probabilities of RC Moment Resisting Frames
Authors: Saemee Jun, Dong-Hyeon Shin, Tae-Sang Ahn, Hyung-Joon Kim
Abstract:
Building loss estimation methodologies, which have advanced considerably in recent decades, are usually used to estimate the social and economic impacts resulting from seismic structural damage. In accordance with these methods, this paper presents the evaluation of the annual loss probability of a reinforced concrete moment resisting frame designed according to the Korean Building Code. The annual loss probability is defined by (1) a fragility curve obtained from a capacity spectrum method similar to that adopted in HAZUS, and (2) a seismic hazard curve derived from annual frequencies of exceedance per peak ground acceleration. Seismic fragilities are computed to calculate the annual loss probability of a structure using functions depending on structural capacity, seismic demand, structural response and the probability of exceeding damage state thresholds. This study carried out a nonlinear static analysis to obtain the capacity of a RC moment resisting frame selected as a prototype building. The analysis results show that the probability of extensive structural damage in the prototype building is expected to be 0.004% in a year.
Keywords: expected annual loss, loss estimation, RC structure, fragility analysis
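Combining the fragility curve with the hazard curve can be sketched as a discrete convolution: a lognormal fragility gives the damage-state probability at each shaking level, and differencing the hazard curve gives the annual frequency of each level. All numbers below are hypothetical placeholders, not values for the paper's Korean-code frame.

```python
import math

def lognormal_cdf(x, median, beta):
    """Fragility: probability of reaching a damage state given demand x."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

def annual_damage_probability(pga_grid, exceedance_freq, median, beta):
    """Convolve a fragility curve with a seismic hazard curve.

    exceedance_freq[i] is the annual frequency of exceeding pga_grid[i];
    differencing adjacent values gives the annual frequency of each PGA bin.
    """
    p = 0.0
    for i in range(len(pga_grid) - 1):
        occ = exceedance_freq[i] - exceedance_freq[i + 1]   # bin frequency
        pga_mid = 0.5 * (pga_grid[i] + pga_grid[i + 1])
        p += lognormal_cdf(pga_mid, median, beta) * occ
    return p

# Hypothetical hazard curve (PGA in g vs annual exceedance frequency)
# and fragility parameters for an "extensive damage" state.
pga = [0.05, 0.1, 0.2, 0.4, 0.8]
freq = [0.4, 0.1, 0.02, 0.002, 0.0001]
p_annual = annual_damage_probability(pga, freq, median=0.5, beta=0.5)
```

In the paper, the fragility median and dispersion would come from the capacity spectrum analysis of the prototype frame rather than being assumed.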
Procedia PDF Downloads 397
936 Developing an AI-Driven Application for Real-Time Emotion Recognition from Human Vocal Patterns
Authors: Sayor Ajfar Aaron, Mushfiqur Rahman, Sajjat Hossain Abir, Ashif Newaz
Abstract:
This study delves into the development of an artificial intelligence application designed for real-time emotion recognition from human vocal patterns. Utilizing advanced machine learning algorithms, including deep learning and neural networks, the paper highlights both the technical challenges and potential opportunities in accurately interpreting emotional cues from speech. Key findings demonstrate the critical role of diverse training datasets and the impact of ambient noise on recognition accuracy, offering insights into future directions for improving robustness and applicability in real-world scenarios.
Keywords: artificial intelligence, convolutional neural network, emotion recognition, vocal patterns
Procedia PDF Downloads 52
935 Probabilistic Model for Evaluating Seismic Soil Liquefaction Based on Energy Approach
Authors: Hamid Rostami, Ali Fallah Yeznabad, Mohammad H. Baziar
Abstract:
The energy-based method for evaluating seismic soil liquefaction has two main components. The first is the demand energy, the energy dissipated by an earthquake at a site; the second is the capacity energy, a representation of soil resistance against liquefaction hazard. In this study, using a statistical analysis of data recorded by 14 down-hole array sites in California, an empirical equation was developed to estimate the demand energy at sites. Because determining the capacity energy at a site requires calculating several site calibration factors, which are obtained by experimental tests, in this study the standard penetration test (SPT) N-value was assumed as an alternative to the capacity energy at a site. Based on this assumption, the empirical equation was employed to calculate the demand energy for 193 liquefied and non-liquefied sites, and these amounts were then plotted against the corresponding SPT numbers for all sites. Subsequently, a discriminant analysis was employed to determine the equations of several boundary curves for various liquefaction likelihoods. Finally, a comparison was made between the probabilistic model and the commonly used stress method. In conclusion, the results clearly showed that the energy-based method can be more reliable than the conventional stress-based method in evaluating liquefaction occurrence.
Keywords: energy demand, liquefaction, probabilistic analysis, SPT number
Procedia PDF Downloads 367
934 Risk Reduction of Household Refuse, a Case Study of Shagari Low-Cost, Mubi North (LGA) Adamawa State, Nigeria
Authors: Maryam Tijjani Kolo
Abstract:
The lack of refuse dumping points has exposed the residents of Shagari low-cost to a number of health and environmental hazards. This study investigates the effect of household refuse on the residents of Shagari low-cost. A well-structured questionnaire was administered to elicit the views of respondents in the study area, using a cluster sampling method. A total of 100 questionnaires were distributed, 50 to each of the two sections of the study area. An interview was also conducted with each household head. The data obtained were analyzed using simple percentages to determine the major hazards in the area. Results showed that the majority of household heads are civil servants and traders earning a reasonable monthly income. 68% of the respondents have experienced the effects of living close to waste dumping areas, which include unpleasant smells and polluted smoke when refuse is burnt, causing eye and respiratory irritation; injury from broken bottles and sharp objects; and water-, insect- and air-borne diseases. Hence, the need to urgently address this menace before it overwhelms the capacity of the community becomes paramount. The community should therefore be given more enlightenment, and refuse dumping sites should be created by the local government area.
Keywords: household, refuse, refuse dumping points, Shagari low-cost
Procedia PDF Downloads 319
933 In-situ Acoustic Emission Analysis of a Polymer Electrolyte Membrane Water Electrolyser
Authors: M. Maier, I. Dedigama, J. Majasan, Y. Wu, Q. Meyer, L. Castanheira, G. Hinds, P. R. Shearing, D. J. L. Brett
Abstract:
Increasing the efficiency of electrolyser technology is commonly seen as one of the main challenges on the way to the Hydrogen Economy. There is a significant lack of understanding of the different states of operation of polymer electrolyte membrane water electrolysers (PEMWE) and how these influence the overall efficiency. This applies in particular to the two-phase flow through the membrane, gas diffusion layers (GDL), and flow channels. In order to increase the efficiency of PEMWE and facilitate their spread as a commercial hydrogen production technology, new analytical approaches have to be found. Acoustic emission (AE) offers the possibility of analysing the processes within a PEMWE in a non-destructive, fast, and cheap in-situ way. This work describes the generation and analysis of AE data from a PEM water electrolyser for, to the best of our knowledge, the first time in the literature. Several experiments are carried out, each designed so that only specific physical processes occur and AE related solely to one process can be measured. A range of experimental conditions is therefore used to induce different flow regimes within the flow channels and GDL. The resulting AE data is first separated into events, defined as excursions above the noise threshold. Each acoustic event consists of a number of consecutive peaks and ends when the wave falls below the noise threshold. For each acoustic event the following key attributes are extracted: maximum peak amplitude, duration, number of peaks, number of peaks before the maximum, average peak intensity, and time until the maximum is reached. Each event is then expressed as a vector containing the normalized values for all criteria. Principal Component Analysis is performed on the resulting data, ordering the criteria by the eigenvalues of their covariance matrix. This offers a simple way of determining which criteria convey the most information on the acoustic data.
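The event-extraction and PCA steps described above can be sketched as follows; the signal, threshold, and feature subset here are illustrative assumptions, not the authors' actual measurement setup.

```python
import numpy as np

# Synthetic AE trace: background noise with two injected acoustic bursts.
rng = np.random.default_rng(1)
signal = rng.normal(0, 0.1, 5000)
signal[1000:1100] += np.abs(rng.normal(1.0, 0.3, 100))
signal[3000:3050] += np.abs(rng.normal(2.0, 0.5, 50))

threshold = 0.5
above = np.abs(signal) > threshold

def extract_events(mask, sig):
    """Split the signal into contiguous above-threshold events."""
    events, start = [], None
    for i, flag in enumerate(mask):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            events.append(sig[start:i])
            start = None
    if start is not None:
        events.append(sig[start:])
    return events

events = extract_events(above, signal)

# Per-event feature vector (a subset of the attributes listed above):
# maximum peak amplitude, duration, and average intensity.
features = np.array([[np.max(np.abs(e)), len(e), np.mean(np.abs(e))]
                     for e in events])

# Normalise each criterion, then order the feature directions by the
# eigenvalues of the covariance matrix (PCA).
norm = (features - features.mean(0)) / (features.std(0) + 1e-12)
eigvals, eigvecs = np.linalg.eigh(np.cov(norm.T))
order = np.argsort(eigvals)[::-1]        # largest eigenvalue first
scores = norm @ eigvecs[:, order]        # events in principal-component space
```

The rows of `scores` are the events projected onto the principal axes; plotting the first two or three columns gives the low-dimensional space in which the abstract looks for process-specific clusters.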
In the following, the data is ordered in the two- or three-dimensional space formed by the most relevant criteria axes. By finding regions of this space occupied only by acoustic events originating from one of the three experiments, it is possible to relate physical processes to certain acoustic patterns. Due to the complex nature of the AE data, modern machine learning techniques are needed to recognize these patterns in-situ. The AE data produced in this way can be used to train a self-learning algorithm and develop an analytical tool to diagnose different operational states of a PEMWE. Combining this technique with the measurement of polarization curves and electrochemical impedance spectroscopy allows for in-situ optimization and recognition of suboptimal states of operation.
Keywords: acoustic emission, gas diffusion layers, in-situ diagnosis, PEM water electrolyser
Procedia PDF Downloads 156
932 Image Steganography Using Least Significant Bit Technique
Authors: Preeti Kumari, Ridhi Kapoor
Abstract:
In any communication, security is the most important issue in today's world. Steganography is the process of hiding important data within other data, such as text, audio, video, and images. The interest in this topic is to provide availability, confidentiality, integrity, and authenticity of data. A steganographic technique embeds hidden content in unremarkable cover media so as not to arouse the suspicion of eavesdroppers, third parties, or hackers. Many methods of compression, encryption, decryption, and embedding are applied in digital image steganography. Compression introduces noise into the image; to withstand this noise, the LSB insertion technique is used. The performance of the proposed embedding system with respect to the security of the secret message and its robustness is discussed. We also demonstrate the maximum steganographic capacity and visual distortion.
Keywords: steganography, LSB, encoding, information hiding, color image
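A minimal sketch of the LSB insertion idea (illustrative, not necessarily the paper's exact scheme): each secret bit replaces the least significant bit of one 8-bit pixel value, so no pixel changes by more than 1 and the distortion stays visually negligible.

```python
import numpy as np

def embed(cover, message: bytes):
    """Write the message bits into the LSBs of the cover image's pixels."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.flatten()                 # flatten() copies, cover is untouched
    if bits.size > flat.size:
        raise ValueError("message too large for cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite LSBs
    return flat.reshape(cover.shape)

def extract(stego, n_bytes: int):
    """Read the first n_bytes back out of the stego image's LSBs."""
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

# Hypothetical grayscale cover image for demonstration.
cover = np.random.default_rng(2).integers(0, 256, (64, 64), dtype=np.uint8)
stego = embed(cover, b"secret")
recovered = extract(stego, 6)
```

The maximum capacity of this scheme is one bit per pixel channel, which is one way to read the abstract's "maximum steganography capacity"; plain LSB insertion is, however, fragile under lossy compression, which is why the abstract pairs it with compression-aware processing.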
Procedia PDF Downloads 474
931 Investigating the Demand of Short-Shelf Life Food Products for SME Wholesalers
Authors: Yamini Raju, Parminder S. Kang, Adam Moroz, Ross Clement, Alistair Duffy, Ashley Hopwell
Abstract:
Accurate prediction of fresh produce demand is one of the challenges faced by Small and Medium Enterprise (SME) wholesalers. Current research in this area has focused on a limited number of factors specific to a single product or business type. This paper gives an overview of the current literature on the variability factors used to predict demand and on the existing forecasting techniques for short shelf-life products. It then extends this work by adding new factors and investigating whether there is a time lag and a possibility of noise in the orders. It also identifies the most important factors using correlation and Principal Component Analysis (PCA).
Keywords: demand forecasting, deteriorating products, food wholesalers, principal component analysis, variability factors
Procedia PDF Downloads 520
930 Fractal-Wavelet Based Techniques for Improving the Artificial Neural Network Models
Authors: Reza Bazargan lari, Mohammad H. Fattahi
Abstract:
Natural resources management, including water resources, requires reliable estimations of time-variant environmental parameters. Small improvements in the estimation of environmental parameters would have great effects on management decisions. Noise reduction using wavelet techniques is an effective approach for the pre-processing of practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based pre-processing. Time series correlation and persistency, the minimum sufficient length for training the predicting model, and the maximum valid length of predictions were also investigated through a fractal assessment.
Keywords: wavelet, de-noising, predictability, time series fractal analysis, valid length, ANN
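The wavelet de-noising step can be sketched with a one-level Haar transform and soft thresholding of the detail coefficients. This is a minimal pure-NumPy illustration under assumed parameters; real studies typically use deeper decompositions and library wavelets (e.g. PyWavelets).

```python
import numpy as np

def haar_denoise(x, threshold):
    """One-level Haar wavelet de-noising with soft thresholding."""
    x = np.asarray(x, dtype=float)
    if x.size % 2:                                # pad to even length
        x = np.append(x, x[-1])
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)     # low-pass: smooth trend
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)     # high-pass: noise + edges
    # Soft threshold: shrink detail coefficients toward zero.
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2)    # inverse Haar transform
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

# Synthetic stand-in for a river flow series: smooth signal plus noise.
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 256)
clean = np.sin(2 * np.pi * 3 * t)
noisy = clean + rng.normal(0, 0.3, t.size)
denoised = haar_denoise(noisy, threshold=0.4)
```

The de-noised series can then be fed to the fractal predictability assessment and the ANN predictor in place of the raw record; the threshold trades noise suppression against smoothing of genuine flow fluctuations.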
Procedia PDF Downloads 368
929 DeClEx-Processing Pipeline for Tumor Classification
Authors: Gaurav Shinde, Sai Charan Gongiguntla, Prajwal Shirur, Ahmed Hambaba
Abstract:
Health issues are increasing significantly, putting a substantial strain on healthcare services. This has accelerated the integration of machine learning in healthcare, particularly following the COVID-19 pandemic. We introduce DeClEx, a pipeline that ensures data mirrors real-world settings by incorporating Gaussian noise and blur, and that employs autoencoders to learn intermediate feature representations. Subsequently, our convolutional neural network, paired with spatial attention, provides accuracy comparable to state-of-the-art pre-trained models while achieving a threefold improvement in training speed. Furthermore, we provide interpretable results using explainable AI techniques. Denoising and deblurring, classification, and explainability are thus integrated in a single pipeline, DeClEx.
Keywords: machine learning, healthcare, classification, explainability
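The Gaussian noise and blur degradation that makes training data mirror real-world settings can be sketched as below; the kernel size, sigmas, and stand-in image are illustrative assumptions, not DeClEx's published parameters.

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """1-D Gaussian kernel, normalised to sum to one."""
    ax = np.arange(size) - size // 2
    k = np.exp(-ax**2 / (2 * sigma**2))
    return k / k.sum()

def degrade(image, noise_sigma=0.05, blur_sigma=1.0, seed=0):
    """Apply a separable Gaussian blur, then additive Gaussian noise."""
    rng = np.random.default_rng(seed)
    k = gaussian_kernel(sigma=blur_sigma)
    # Separable blur: convolve each row, then each column.
    blurred = np.apply_along_axis(
        lambda r: np.convolve(r, k, mode="same"), 1, image)
    blurred = np.apply_along_axis(
        lambda c: np.convolve(c, k, mode="same"), 0, blurred)
    noisy = blurred + rng.normal(0, noise_sigma, image.shape)
    return np.clip(noisy, 0.0, 1.0)       # keep intensities in [0, 1]

# Hypothetical grayscale scan in [0, 1] for demonstration.
image = np.random.default_rng(4).random((32, 32))
augmented = degrade(image)
```

Training the downstream autoencoder and classifier on `augmented` rather than `image` is what forces the pipeline's intermediate representations to be robust to acquisition noise and blur.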
Procedia PDF Downloads 55