Search results for: noise mapping
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2209

799 Failure Analysis of Low Relaxation Prestressed High Carbon Steel Wire During Drawing Operation: A Metallurgical Investigation

Authors: Souvik Das, Sandip Bhattacharya, Goutam Mukhopadhyay, Manashi Adhikary

Abstract:

Wire breakage during cold drawing is a complex phenomenon; breaks may be induced by poor wire-rod quality, an inappropriate heat-treated microstructure, and/or lubrication breakdown on the wire surface. A comprehensive metallurgical investigation of failed/broken wire samples is therefore essential for understanding the origin of failure. Frequent breakage of wires during drawing is a matter of serious concern to wire drawers, as it erodes their already slim margins through reduced productivity and loss in yield. The present paper highlights the failure investigation of wires of Low Relaxation Prestressed High Carbon grade that broke during cold drawing due to entrapment of hard constituents detached from the roller entry guide during rolling operations. Hardness at the entrapped location measured 54.9 Rockwell hardness, against 33.4 for the rest of the wire. Microstructural examination, chemical analysis, and X-ray mapping of the entrapment location confirmed a complex chromium carbide originating from the D2 steel used in the entry guide during the rolling process. Since the harder entrapped phase could not deform in the same manner as the parent material, the wire failed during drawing.

Keywords: LRPC, D2-steel, chromium carbide, roller guide

Procedia PDF Downloads 139
798 Optimization of Parameters for Electrospinning of PAN Nanofibers by Taguchi Method

Authors: Gamze Karanfil Celep, Kevser Dincer

Abstract:

The effects of polymer concentration and electrospinning process parameters on the average diameter of electrospun polyacrylonitrile (PAN) nanofibers were experimentally investigated. Mechanical and thermal properties of the PAN nanofibers were also examined, by tensile testing and thermogravimetric analysis (TGA), respectively. Polymer concentration, solution feed rate, supply voltage, and tip-to-collector distance were selected as the control factors, and Taguchi's L16 orthogonal design (4 parameters, 4 levels) was employed for the experimental design. Optimal electrospinning conditions were identified using the signal-to-noise (S/N) ratio calculated from the diameters of the electrospun PAN nanofibers according to the "smaller-the-better" criterion. In addition, analysis of variance (ANOVA) was performed to assess the statistical significance of the process parameters. The factor levels producing the smallest nanofiber diameter were thus determined, and the S/N response results identified the parameter with the strongest effect on nanofiber diameter. The Taguchi design-of-experiments method proved to be an effective way to statistically optimize the critical electrospinning parameters used in nanofiber production. With the optimum process parameters established, the electrical conductivity and fuel cell performance of electrospun PAN nanofibers on carbon papers will be evaluated next.
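The "smaller-the-better" S/N ratio used above is straightforward to compute; a minimal sketch in Python, where the replicate diameters for the two trial conditions are invented for illustration and not taken from the study:

```python
import math

def sn_smaller_is_better(diameters):
    """Taguchi 'smaller-the-better' signal-to-noise ratio (dB):
    S/N = -10 * log10(mean(y_i^2)). A larger S/N is better."""
    mean_sq = sum(d * d for d in diameters) / len(diameters)
    return -10.0 * math.log10(mean_sq)

# Hypothetical replicate fiber diameters (nm) for two trial conditions
trial_a = [310.0, 295.0, 320.0]
trial_b = [480.0, 455.0, 470.0]

sn_a = sn_smaller_is_better(trial_a)
sn_b = sn_smaller_is_better(trial_b)
# The condition with the higher S/N ratio yields the thinner fibers
best = "A" if sn_a > sn_b else "B"
```

Averaging such S/N values per factor level gives the response table from which the most influential parameter is read off.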

Keywords: nanofiber, electrospinning, polyacrylonitrile, Taguchi method

Procedia PDF Downloads 184
797 The Trajectory of the Ball in a Football Game

Authors: Mahdi Motahari, Mojtaba Farzaneh, Ebrahim Sepidbar

Abstract:

Tracking of moving and flying targets is one of the most important topics in image processing, and estimating the trajectory of a desired object over short- and long-term horizons is more important still. In this paper, a new way of identifying and estimating the future long-term trajectory of a moving ball is presented, combining image processing algorithms (noise removal and image segmentation), a Kalman filter for short-term trajectory estimation of the ball in a football game, and an intelligent adaptive neuro-fuzzy algorithm based on the time series of traversed distance. Using these interacting algorithms on a video database, the proposed system attains better than 96% identification accuracy. Although the method is highly precise, it is time-consuming; comparison with other methods confirms its accuracy and efficiency.
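A constant-velocity Kalman filter of the kind used for the short-term ball track can be sketched in a few lines; this is a generic one-dimensional illustration under standard assumptions, not the authors' implementation:

```python
def kalman_track(zs, dt=1.0, q=1e-3, r=4.0):
    """Constant-velocity Kalman filter for one image coordinate.
    zs: noisy position measurements; q: process noise; r: measurement noise."""
    x, v = zs[0], 0.0                          # state: position, velocity
    p00, p01, p10, p11 = 1.0, 0.0, 0.0, 1.0    # state covariance
    estimates = []
    for z in zs:
        # predict step (x' = x + v*dt)
        x = x + dt * v
        p00, p01, p10, p11 = (p00 + dt * (p01 + p10) + dt * dt * p11 + q,
                              p01 + dt * p11,
                              p10 + dt * p11,
                              p11 + q)
        # update step with measurement z (measurement matrix H = [1, 0])
        s = p00 + r                            # innovation variance
        k0, k1 = p00 / s, p10 / s              # Kalman gain
        innov = z - x
        x, v = x + k0 * innov, v + k1 * innov
        p00, p01, p10, p11 = ((1 - k0) * p00, (1 - k0) * p01,
                              p10 - k1 * p00, p11 - k1 * p01)
        estimates.append(x)
    return estimates
```

Running one such filter per coordinate (row and column) gives a smoothed short-term track that a longer-horizon predictor can extrapolate.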

Keywords: tracking, signal processing, moving and flying targets, artificial intelligence systems, trajectory estimation, Kalman filter

Procedia PDF Downloads 444
796 Video Object Segmentation for Automatic Image Annotation of Ethernet Connectors with Environment Mapping and 3D Projection

Authors: Marrone Silverio Melo Dantas, Pedro Henrique Dreyer, Gabriel Fonseca Reis de Souza, Daniel Bezerra, Ricardo Souza, Silvia Lins, Judith Kelner, Djamel Fawzi Hadj Sadok

Abstract:

The creation of a dataset is time-consuming and often discourages researchers from pursuing their goals. To overcome this problem, we present and discuss two solutions adopted for the automation of this process. Both optimize valuable user time and resources and support video object segmentation with object tracking and 3D projection. In our scenario, we acquire images from a moving robotic arm and, for each approach, generate distinct annotated datasets. We evaluated the precision of the annotations by comparing them with a manually annotated dataset, as well as their efficiency in the context of detection and classification problems. For detection support, we used YOLO and obtained, for the projection dataset, F1-score, accuracy, and mAP values of 0.846, 0.924, and 0.875, respectively. Concerning the tracking dataset, we achieved an F1-score of 0.861 and an accuracy of 0.932, whereas mAP reached 0.894. To evaluate the quality of the annotated images for classification problems, we employed deep learning architectures, adopting accuracy and F1-score as metrics for VGG, DenseNet, MobileNet, Inception, and ResNet. The VGG architecture outperformed the others on both datasets, reaching an accuracy and F1-score of 0.997 and 0.993 on the projection dataset, and an accuracy of 0.991 and an F1-score of 0.981 on the tracking dataset.
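The F1, accuracy, and precision-style scores reported above follow directly from the standard confusion-matrix definitions; a small sketch, where the counts are made up for illustration:

```python
def detection_scores(tp, fp, fn, tn):
    """Precision, recall, F1, and accuracy from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, f1, accuracy

# Hypothetical detection counts for one evaluation run
p, r, f1, acc = detection_scores(tp=90, fp=10, fn=20, tn=80)
```

(mAP additionally averages precision over recall levels and classes, which is beyond this sketch.)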

Keywords: RJ45, automatic annotation, object tracking, 3D projection

Procedia PDF Downloads 142
795 Solar Radiation Time Series Prediction

Authors: Cameron Hamilton, Walter Potter, Gerrit Hoogenboom, Ronald McClendon, Will Hobbs

Abstract:

A model was constructed to predict the amount of solar radiation that will reach the surface of the earth at a given location an hour into the future. This project was supported by the Southern Company to determine at what specific times during a given day of the year solar panels could be relied upon to produce energy in sufficient quantities. Because of their ability as universal function approximators, artificial neural networks were used to estimate the nonlinear pattern of solar radiation, with measurements of weather conditions collected at the Griffin, Georgia weather station as inputs. A number of network configurations and training strategies were tried, though a multilayer perceptron with a variety of hidden nodes trained with the resilient propagation algorithm consistently yielded the most accurate predictions. In addition, a modeled DNI field and adjacent weather station data were used to bolster prediction accuracy. In later trials, the solar radiation field was preprocessed with a discrete wavelet transform with the aim of removing noise from the measurements. The current model provides predictions of solar radiation with a mean square error of 0.0042, though ongoing efforts are being made to further improve the model's accuracy.
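The wavelet preprocessing step can be illustrated with a one-level Haar transform and hard thresholding of the detail coefficients; a toy sketch under simple assumptions, not the transform actually used in the study:

```python
def haar_denoise(signal, threshold):
    """One-level Haar wavelet shrinkage: keep the pairwise averages,
    zero out detail coefficients whose magnitude is below the threshold."""
    n = len(signal) - len(signal) % 2          # drop a trailing odd sample
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, n, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, n, 2)]
    detail = [0.0 if abs(d) < threshold else d for d in detail]
    out = []
    for a, d in zip(approx, detail):           # inverse transform
        out.extend((a + d, a - d))
    return out
```

Small pairwise fluctuations (noise) are removed while large jumps in the signal survive.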

Keywords: artificial neural networks, resilient propagation, solar radiation, time series forecasting

Procedia PDF Downloads 368
794 Internal Cycles from Hydrometric Data and Variability Detected Through Hydrological Modelling Results, on the Niger River, over 1901-2020

Authors: Salif Koné

Abstract:

We analyze hydrometric data at the Koulikoro station on the Niger River; the basin drains 120,600 km² and covers three countries in West Africa: Guinea, Mali, and Ivory Coast. Two successive decadal cycles are highlighted (1925-1936 and 1929-1939) instead of the single decadal cycle presumed in the literature. Moreover, the observed hydrometric data show a multidecadal 40-year period that is confirmed when graphing the spatial coefficient of variation of runoff over decades (starting with 1901-1910). Spatial runoff data are produced on 48 grid cells (0.5 degree by 0.5 degree) through semi-distributed versions of both the SimulHyd model and the GR2M model, variants of a French hydrologic model (Génie Rural, 2 parameters, monthly time step). The two extremal decades in terms of runoff coefficient of variation are compared: 1951-1960 has the minimal coefficient of variation, and 1981-1990 shows the maximal value during the three months of high water level (August, September, and October). Mapping the relative variation between these two decadal situations supports the following hypothesis: the scale of variation between the two extremes could serve to fix boundary conditions for further simulations using climate-scenario data.
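The decadal runoff coefficient of variation used to detect the 40-year period is simply the standard deviation over the mean for each 10-year block; a sketch with invented annual runoff values:

```python
import statistics

def decadal_cv(annual_runoff, start_year):
    """Coefficient of variation (std/mean) of annual runoff per decade."""
    cvs = {}
    for i in range(0, len(annual_runoff) - 9, 10):
        decade = annual_runoff[i:i + 10]
        label = f"{start_year + i}-{start_year + i + 9}"
        cvs[label] = statistics.pstdev(decade) / statistics.mean(decade)
    return cvs
```

Applied per grid cell, the spread of these CVs across a decade gives the spatial coefficient of variation graphed in the study.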

Keywords: internal cycles, hydrometric data, Niger River, GR2M and SimulHyd framework, runoff coefficient of variation

Procedia PDF Downloads 77
793 Implementation of Lean Management in Non-Governmental Organizations: A Case Study on Wrocław Food Bank

Authors: Maciej Pieńkowski

Abstract:

Lean Management is nowadays one of the dominant management concepts in industrial and service environments, providing compelling business benefits to many companies. At the same time, its application in non-governmental organizations has not yet been extensively researched. Filling this gap addresses the clear need for an efficient management system in the NGO environment and could significantly improve the operational performance of many organizations. The goal of the research is to verify the effectiveness of Lean Management implementation in non-governmental organizations, based on the Wrocław Food Bank case study. The case study describes a Lean Management implementation project within the analyzed organization. During the project, Wrocław Food Bank went through the full 5-step Lean Thinking process, which consists of value identification, value stream mapping, creation of flow, establishing pull, and seeking perfection. The research contains a detailed summary of each of these steps and reports the results of their implementation. The major findings indicate that application of Lean Management in the NGO environment is possible; however, physical implementation of its guidelines can be strongly impeded by the multiple constraints non-governmental organizations face. Due to challenges like limited resources, project-based activities, and the lack of a traditional supplier-customer relationship, many NGOs may fail in their efforts to implement Lean Management. Successful Lean application therefore requires strong leadership commitment to drive the transformation and remove barriers and obstacles.

Keywords: lean management, non-governmental organizations, continuous improvement, lean thinking

Procedia PDF Downloads 281
792 Destination Port Detection for Vessels: An Analytic Tool for Optimizing Port Authorities' Resources

Authors: Lubna Eljabu, Mohammad Etemad, Stan Matwin

Abstract:

Port authorities in congested ports face many challenges in allocating their resources to provide a safe and secure loading/unloading procedure for cargo vessels. Selecting a destination port is the decision of a vessel's master, based on many factors such as weather, wave conditions, and changes of priorities. Access to a tool that leverages AIS messages to monitor vessels' movements and accurately predict their next destination port promotes an effective resource allocation process for port authorities. In this research, we propose a method, Reference Route of Trajectory (RRoT), to assist port authorities in predicting inflow and outflow traffic in their local environment by monitoring Automatic Identification System (AIS) messages. Our RRoT method creates a reference route based on historical AIS messages and uses trajectory similarity measures to identify the destination of a vessel from its recent movement. We evaluated five similarity measures: Discrete Fréchet Distance (DFD), Dynamic Time Warping (DTW), Partial Curve Mapping (PCM), Area between two curves (Area), and Curve Length (CL). Our experiments show that the method identifies the destination port with an accuracy of 98.97% and an F-measure of 99.08% using the Dynamic Time Warping (DTW) similarity measure.
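Of the five measures evaluated, Dynamic Time Warping performed best; a minimal pure-Python version plus a nearest-reference-route lookup, where the route names and one-dimensional coordinates are hypothetical stand-ins for real AIS tracks:

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D trajectories."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def nearest_route(track, reference_routes):
    """Pick the reference route (destination) with the smallest DTW distance."""
    return min(reference_routes,
               key=lambda name: dtw_distance(track, reference_routes[name]))
```

Real tracks are 2-D (latitude/longitude); replacing the scalar cost with a point distance generalizes the sketch.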

Keywords: spatial temporal data mining, trajectory mining, trajectory similarity, resource optimization

Procedia PDF Downloads 98
791 An Improved Image Steganography Technique Based on Least Significant Bit Insertion

Authors: Olaiya Folorunsho, Comfort Y. Daramola, Joel N. Ugwu, Lawrence B. Adewole, Olufisayo S. Ekundayo

Abstract:

In today's world, there is a tremendous rise in the usage of the internet, since almost all communication and information sharing is done over the web. Conversely, there is continuous growth in unauthorized access to confidential data. This has posed a challenge to information security experts, whose major goal is to curtail the menace. One approach to securing the delivery of data to the rightful destination without any modification is steganography, the art of hiding information inside other information. This research paper aimed at designing a secure algorithm using an image steganographic technique based on the Least Significant Bit (LSB) method for embedding data into a bitmap (BMP) image, in order to enhance security and reliability. In the LSB approach, the basic idea is to replace the LSBs of the pixels of the cover image with the bits of the message to be hidden, without significantly destroying the properties of the cover image. The system was implemented in C# on the Microsoft .NET framework. The performance of the proposed system was evaluated by benchmarking parameters such as Mean Squared Error (MSE) and Peak Signal-to-Noise Ratio (PSNR). The results showed that image steganography performed considerably well in securing data hiding and information transmission over networks.
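The LSB idea, replacing the lowest bit of each cover byte with one message bit, can be sketched independently of the BMP container; this toy version uses a NUL terminator to mark the end of the message, which is an assumption of the illustration, not a detail from the paper:

```python
def embed_lsb(cover_bytes, message):
    """Hide message bits in the least significant bit of each cover byte."""
    bits = []
    for ch in message.encode() + b"\x00":      # NUL terminator marks the end
        bits.extend((ch >> k) & 1 for k in range(7, -1, -1))
    if len(bits) > len(cover_bytes):
        raise ValueError("cover too small for message")
    stego = bytearray(cover_bytes)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit     # overwrite only the LSB
    return bytes(stego)

def extract_lsb(stego_bytes):
    """Recover the hidden message by reading LSBs until the NUL terminator."""
    chars = bytearray()
    for i in range(0, len(stego_bytes) - 7, 8):
        byte = 0
        for j in range(8):
            byte = (byte << 1) | (stego_bytes[i + j] & 1)
        if byte == 0:
            break
        chars.append(byte)
    return chars.decode()
```

Each cover byte changes by at most one intensity level, which is why MSE stays low and PSNR high.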

Keywords: steganography, image steganography, least significant bits, bit map image

Procedia PDF Downloads 243
790 Information Theoretic Approach for Beamforming in Wireless Communications

Authors: Syed Khurram Mahmud, Athar Naveed, Shoaib Arif

Abstract:

Beamforming is a signal processing technique extensively utilized in wireless communications and radar for intensifying desired signals and minimizing interfering signals through spatial selectivity. In this paper, we present a method for calculating optimal weight vectors for a smart antenna array, to achieve a directive pattern during transmission and selective reception in an interference-prone environment. In the proposed scheme, Mutual Information (MI) extrema are evaluated through an energy-constrained objective function based on a-priori information about the interference source and the desired array factor. Signal-to-Interference-plus-Noise Ratio (SINR) performance is evaluated for both transmission and reception. In our scheme, MI serves as an index of the trade-off between information gain, SINR, illumination time, and spatial selectivity in an energy-constrained optimization problem. The method yields lower computational complexity, demonstrated through comparative analysis with conventional methods. MI-based beamforming thus enhances signal integrity in degraded environments while reducing computational intricacy and correlating key performance indicators.
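The spatial selectivity underlying any such weight design comes from phasing the array elements; a generic uniform-linear-array sketch (conjugate phase steering, not the MI-based optimization itself):

```python
import cmath
import math

def steering_weights(n_elements, theta_deg, spacing=0.5):
    """Conjugate-phase weights steering a uniform linear array toward theta.
    spacing is in wavelengths (0.5 = half-wavelength elements)."""
    phase = 2 * math.pi * spacing * math.sin(math.radians(theta_deg))
    return [cmath.exp(-1j * k * phase) for k in range(n_elements)]

def array_gain(weights, theta_deg, spacing=0.5):
    """Magnitude of the array response in direction theta for given weights."""
    phase = 2 * math.pi * spacing * math.sin(math.radians(theta_deg))
    resp = sum(w * cmath.exp(1j * k * phase) for k, w in enumerate(weights))
    return abs(resp)
```

At the steered angle the element phases align (gain equal to the element count); off-beam directions partially cancel, which is what suppresses interference.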

Keywords: beamforming, interference, mutual information, wireless communications

Procedia PDF Downloads 262
789 Peak Data Rate Enhancement Using Switched Micro-Macro Diversity in Cellular Multiple-Input-Multiple-Output Systems

Authors: Jihad S. Daba, J. P. Dubois, Yvette Antar

Abstract:

With the exponential growth of cellular users, a new generation of cellular networks is needed to enhance the required peak data rates. The co-channel interference between neighboring base stations inhibits peak data rate increase. To overcome this interference, multi-cell cooperation known as coordinated multipoint transmission is proposed. Such a solution makes use of multiple-input-multiple-output (MIMO) systems under two different structures: Micro- and macro-diversity. In this paper, we study the capacity and bit error rate in cellular networks using MIMO technology. We analyse both micro- and macro-diversity schemes and develop a hybrid model that switches between macro- and micro-diversity in the case of hard handoff based on a cut-off range of signal-to-noise ratio values. We conclude that our hybrid switched micro-macro MIMO system outperforms classical MIMO systems at the cost of increased hardware and software complexity.
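The hard-handoff switch between diversity modes reduces to a threshold test on SNR, and the capacity stakes can be gauged with the Shannon formula; a simplified equal-power sketch in which the cut-off value and stream model are illustrative assumptions, not the paper's parameters:

```python
import math

def ergodic_capacity(snr_db, n_streams=1):
    """Shannon capacity (bits/s/Hz) with total power split equally across
    n spatial streams (idealized unit-gain channels, no fading averaging)."""
    snr = 10 ** (snr_db / 10)
    return n_streams * math.log2(1 + snr / n_streams)

def select_diversity(snr_db, cutoff_db=10.0):
    """Hard-handoff rule: macro-diversity (cooperating base stations) below
    the SNR cut-off, micro-diversity (co-located antennas) above it."""
    return "macro" if snr_db < cutoff_db else "micro"
```

At high SNR the multiplexing term dominates, which is why micro-diversity MIMO pays off once the link is strong enough.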

Keywords: cooperative multipoint transmission, ergodic capacity, hard handoff, macro-diversity, micro-diversity, multiple-input-multiple-output systems, orthogonal frequency division multiplexing

Procedia PDF Downloads 289
788 Effects of Diluent Gas Velocity on Formation of Moderate or Intense Low-Oxygen Dilution Combustion with Fuel Spray for Gas Turbine

Authors: ChunLoon Cha, HoYeon Lee, SangSoon Hwang

Abstract:

MILD combustion is characterized by distinguished features such as suppressed pollutant emission, homogeneous temperature distribution, and reduced noise and thermal stress. However, most studies of MILD combustion have focused on gaseous fuels, so further study using liquid fuel is needed, especially for application to liquid-fuelled gas turbines. In this work, we focus on numerical simulation of the effects of diluent gas velocity on the formation of liquid-fuel MILD combustion for gas turbine applications. A series of numerical simulations using ANSYS Fluent 18.2 was carried out to investigate in detail the effect of the flow field in the furnace on the formation of MILD combustion. The operating conditions were fixed at a relatively low heat intensity of 1.28 MW/m³·atm, and various global equivalence ratios were examined. The results show that the local high-temperature region shrank and the flame temperature became uniformly distributed owing to the high velocity of the diluted burnt gas, which can be controlled through the opening ratio of the adapter. The maximum temperature remained below 1800 K and the average temperature below 1500 K, so thermal NO formation was suppressed.

Keywords: MILD combustion, spray combustion, liquid fuel, diluent gas velocity, low NOx emission

Procedia PDF Downloads 219
787 Automated Classification of Hypoxia from Fetal Heart Rate Using Advanced Data Models of Intrapartum Cardiotocography

Authors: Malarvizhi Selvaraj, Paul Fergus, Andy Shaw

Abstract:

Uterine contractions produced during labour have the potential to damage the foetus by diminishing maternal blood flow to the placenta. To observe this phenomenon, labour and delivery are routinely monitored using cardiotocography. An obstetrician usually diagnoses foetal hypoxia by interpreting cardiotocography recordings. However, cardiotocography capture and interpretation are time-consuming and subjective, often leading to misclassification that causes damage to the foetus and unnecessary caesarean sections. Both outcomes have a high impact on the foetus and on the cost to national healthcare services. Automatic analysis of the foetal heart rate may be an objective solution that helps reduce unnecessary medical interventions, as reported in several studies. The aim of this paper is to provide a system for better identification and interpretation of abnormalities in the foetal heart rate using RStudio. An open dataset of 552 intrapartum recordings was filtered with a 0.034 Hz filter in an attempt to remove noise while keeping as much of the discriminative data as possible. Features were chosen following an extensive literature review, which settled on FIGO features such as accelerations, decelerations, mean, variance, and standard deviation. These five features were extracted from the 552 recordings, and each recording is then classified as either normal or abnormal; an abnormal recording carries a higher likelihood of hypoxia.
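The five FIGO-style features can be computed directly from the beat samples; a simplified sketch that counts threshold crossings per sample rather than the sustained 15-second, 15-bpm episodes used clinically (an assumption of this illustration):

```python
import statistics

def fhr_features(fhr, baseline=None, accel_delta=15.0, decel_delta=15.0):
    """FIGO-style summary features from a fetal-heart-rate trace (bpm samples).
    The baseline defaults to the trace median; deltas are in bpm."""
    baseline = statistics.median(fhr) if baseline is None else baseline
    return {
        "mean": statistics.mean(fhr),
        "variance": statistics.pvariance(fhr),
        "std": statistics.pstdev(fhr),
        "accelerations": sum(1 for x in fhr if x >= baseline + accel_delta),
        "decelerations": sum(1 for x in fhr if x <= baseline - decel_delta),
    }
```

Such a feature vector per recording is what a normal/abnormal classifier would consume.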

Keywords: cardiotocography, foetus, intrapartum, hypoxia

Procedia PDF Downloads 201
786 Modeling of UAV Longitudinal Dynamics through System Identification Technique

Authors: Asadullah I. Qazi, Mansoor Ahsan, Zahir Ashraf, Uzair Ahmad

Abstract:

System identification of an Unmanned Aerial Vehicle (UAV), to acquire its mathematical model, is a significant step in the process of aircraft flight automation. A reliable mathematical model is an established requirement for autopilot design, flight simulator development, aircraft performance appraisal, analysis of aircraft modifications, preflight testing of prototype aircraft, and investigation of fatigue life and stress distribution. This research is aimed at system identification of a fixed-wing UAV by means of a specifically designed flight experiment. Purposely designed flight maneuvers were performed on the UAV, and aircraft states were recorded during these flights. The acquired data were preprocessed for noise filtering and bias removal, followed by parameter estimation of the longitudinal dynamics transfer functions using the MATLAB System Identification Toolbox. Black-box transfer function models, in response to elevator and throttle inputs, were estimated using the least-squares error technique. The identification results show a high confidence level and good fit between the estimated model and the actual aircraft response.
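Black-box least-squares estimation of a low-order model amounts to solving the normal equations; a first-order ARX sketch in place of the MATLAB toolbox workflow, with the model order, true parameters, and excitation all invented for illustration:

```python
def fit_arx1(u, y):
    """Least-squares fit of y[k+1] = a*y[k] + b*u[k] (first-order ARX sketch)."""
    syy = sum(v * v for v in y[:-1])
    suu = sum(v * v for v in u[:-1])
    syu = sum(yk * uk for yk, uk in zip(y[:-1], u[:-1]))
    sy1y = sum(y1 * yk for y1, yk in zip(y[1:], y[:-1]))
    sy1u = sum(y1 * uk for y1, uk in zip(y[1:], u[:-1]))
    det = syy * suu - syu * syu          # 2x2 normal-equation determinant
    a = (sy1y * suu - sy1u * syu) / det
    b = (sy1u * syy - sy1y * syu) / det
    return a, b

# Synthetic noiseless "flight" data from a known system (a=0.8, b=0.5)
u = [1.0, 0.0] * 10                      # elevator-like excitation (illustrative)
y = [0.0]
for k in range(len(u) - 1):
    y.append(0.8 * y[k] + 0.5 * u[k])
a_hat, b_hat = fit_arx1(u, y)
```

With noiseless, persistently exciting data the true parameters are recovered exactly; with real flight data the same equations give the least-squares estimate.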

Keywords: fixed wing UAV, system identification, black box modeling, longitudinal dynamics, least square error

Procedia PDF Downloads 307
785 Soil Characteristics and Liquefaction Potential of the Bengkulu Region Based on the Microtremor Method

Authors: Aditya Setyo Rahman, Dwikorita Karnawati, Muzli, Dadang Permana, Sigit Pramono, Fajri Syukur Rahmatullah, Oriza Sativa, Moehajirin, Edy Santoso, Nur Hidayati Oktavia, Ardian Yudhi Octantyo, Robby Wallansha, Juwita Sari Pradita, Nur Fani Habibah, Audia Kaluku, Amelia Chelcea, Yoga Dharma Persada, Anton Sugiharto

Abstract:

Earthquake vibrations at the surface are affected not only by the magnitude of the earthquake and the distance from the hypocenter but also by the characteristics of the local soil. Variations and changes in soil characteristics from the depth of the bedrock to the surface can amplify earthquake vibrations and so affect their impact at the surface. Soil characteristics vary widely even at relatively close distances, so earthquake hazard mapping in cities under earthquake threat requires studying the local soil at a detailed or micro scale (microzonation). This study proposes seismic microzonation and liquefaction potential assessment based on microtremor observations. We carried out 143 microtremor observations at sites spread across all populated sub-districts of Bengkulu City. The results show that Bengkulu City is dominated by medium soil types with a dominant period of 0.4 < T₀ < 0.6, with one location along the river showing soft-soil characteristics with T₀ > 0.6. These results correlate with the potential for liquefaction as indicated by a seismic vulnerability index (Kg) greater than 5.
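The vulnerability index referenced here is commonly computed Nakamura-style from H/V spectral data, and the abstract's period thresholds give a simple site classification; the Kg formula and the liquefaction cut-off of 5 are standard assumptions of this sketch, not spelled out in the abstract:

```python
def seismic_vulnerability_index(amplification, dominant_freq_hz):
    """Nakamura-style vulnerability index Kg = A^2 / F0,
    from the H/V peak amplification A and dominant frequency F0 (Hz)."""
    return amplification ** 2 / dominant_freq_hz

def classify_site(dominant_period_s):
    """Three-way soil classification by dominant period T0 (s),
    using the thresholds quoted in the abstract."""
    if dominant_period_s > 0.6:
        return "soft"
    if dominant_period_s > 0.4:
        return "medium"
    return "stiff"

def liquefaction_flag(kg, threshold=5.0):
    """Flag a site as liquefaction-prone when Kg exceeds the threshold."""
    return kg > threshold
```

Mapping Kg and T₀ per observation point is what produces the microzonation layers.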

Keywords: microtremor, dominant period, microzonation, seismic vulnerability index

Procedia PDF Downloads 92
784 Spatio-Temporal Pest Risk Analysis with ‘BioClass’

Authors: Vladimir A. Todiras

Abstract:

Spatio-temporal models provide new possibilities for real-time action in pest risk analysis. It should be noted that estimating the possibility and probability of introduction of a pest, and its economic consequences, involves many uncertainties. We present a new mapping technique that assesses pest invasion risk using the online BioClass software. BioClass is a GIS tool designed to solve multiple-criteria classification and optimization problems based on fuzzy logic and level-set methods. This research describes a method for predicting the potential establishment and spread of plant pests into new areas using case studies: corn rootworm (Diabrotica spp.), tomato leaf miner (Tuta absoluta), and plum fruit moth (Grapholita funebrana). Our study demonstrated that in BioClass we can combine fuzzy logic and geographic information systems with knowledge of pest biology and environmental data to derive new information for decision making. Pests are sensitive to a warming climate, as temperature greatly affects their survival and their reproductive rate and capacity; changes have been observed in the distribution, frequency, and severity of outbreaks of Helicoverpa armigera on tomato. BioClass has proved to be a powerful tool for applying dynamic models to map the potential future distribution of a species, enabling resource managers to make decisions about the management and control of dangerous and invasive species.
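The fuzzy multiple-criteria classification at the core of such a tool can be illustrated with triangular membership functions combined by a min (fuzzy AND); the climate criteria and breakpoints below are invented for illustration and are not BioClass's actual rules:

```python
def tri_membership(x, a, b, c):
    """Triangular fuzzy membership: 0 at a and c, rising to 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def risk_score(temp_c, rainfall_mm):
    """Hypothetical two-criteria fuzzy establishment risk for a grid cell:
    the min (AND) of 'warm enough' and 'wet enough' memberships."""
    warm = tri_membership(temp_c, 10.0, 25.0, 40.0)
    wet = tri_membership(rainfall_mm, 200.0, 600.0, 1000.0)
    return min(warm, wet)
```

Evaluating such a score over every grid cell of a climate layer yields a continuous risk map rather than a hard in/out boundary.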

Keywords: classification, model, pest, risk

Procedia PDF Downloads 268
783 An Assessment of the Writing Skills of Reflective Essay of Grade 10 Students in Selected Secondary Schools in Valenzuela City

Authors: Reynald Contreras, Shaina Marie Bho, Kate Roan Dela Cruz, Marvin Dela Cruz

Abstract:

This study was conducted to determine the skill level of grade ten (Grade 10) students in writing a reflective essay in selected secondary schools of Valenzuela. The research used descriptive qualitative-quantitative methods to systematically and accurately describe the level of students' writing skills, and used a convenience sampling technique to select forty (40) Grade 10 students each at Polo, Wawang Pulo, and Arkong Bato high schools, for a total of one hundred twenty (120) students, whose written reflective essays were assessed using modified rubrics developed from Ruth Culham's 6+1 writing traits. According to the findings, students at Polo and Wawang Pulo National High Schools have low, not yet proficient levels of writing skill that need to be developed, while Arkong Bato National High School achieved a high degree of writing proficiency. Based on these findings, the researchers devised a suggested curriculum map of intervention activities to aid in developing and cultivating the writing skills of Grade 10 students.

Keywords: writing skills, reflective essay, intervention activity, 6+1 writing traits, modified rubrics

Procedia PDF Downloads 94
782 An Optimization of Machine Parameters for Modified Horizontal Boring Tool Using Taguchi Method

Authors: Thirasak Panyaphirawat, Pairoj Sapsmarnwong, Teeratas Pornyungyuen

Abstract:

This paper presents the findings of an experimental investigation of important machining parameters for a horizontal boring tool modified to mount on a horizontal lathe machine for boring over-length workpieces. To verify the usability of the modified tool, a design of experiments based on the Taguchi method was performed. The parameters investigated are spindle speed, feed rate, depth of cut, and length of workpiece. A Taguchi L9 orthogonal array was selected for the four factors at three levels each, in order to minimize the surface roughness (Ra and Rz) of S45C steel tubes. Signal-to-noise ratio analysis and analysis of variance (ANOVA) were performed to study the effects of these parameters and to optimize the machine settings for the best surface finish. The controlling factors with the most effect are, in order, depth of cut, spindle speed, length of workpiece, and feed rate. A confirmation test of the optimal settings obtained from the Taguchi method gave satisfactory results.

Keywords: design of experiment, Taguchi design, optimization, analysis of variance, machining parameters, horizontal boring tool

Procedia PDF Downloads 421
781 Uncontrolled Urbanization Leads to Main Challenge for Sustainable Development of Mongolia

Authors: Davaanyam Surenjav, Chinzolboo Dandarbaatar, Ganbold Batkhuyag

Abstract:

Rapid urbanization driven by the primate city has become one of the main challenges to sustainable development in Mongolia, as in other developing countries, since the transition to a market economy in 1990. According to the statistical yearbook, the population of Ulaanbaatar has increased from 0.5 million to 1.5 million over the last 30 years and now accounts for almost half (47%) of the total Mongolian population. Migration from rural areas and local cities to Ulaanbaatar leads to social issues such as uncontrolled urbanization, income inequality, poverty, overloaded public services, excessive redevelopment costs, transport limitations, and environmental degradation, including air, noise, water, and soil pollution. In Ulaanbaatar, most of the main and sub-indicators of sustainable urban development have crossed their thresholds from safe to unsafe levels. There is therefore an urgent need to remove migration pull factors, including some administrative and higher-education functions, from Ulaanbaatar to its satellite or secondary cities. Moreover, a smart urban transport system and green and renewable energy technologies should be introduced into the urban development master plan of Ulaanbaatar.

Keywords: challenge for sustainable urban development, migration factors, primate city, urban safety thresholds

Procedia PDF Downloads 111
780 Energy Detection Based Sensing and Primary User Traffic Classification for Cognitive Radio

Authors: Urvee B. Trivedi, U. D. Dalal

Abstract:

As wireless communication services grow quickly, the seriousness of spectrum utilization has been rising gradually. Cognitive radio, an emerging technology, has come forward to solve today's spectrum scarcity problem. To support the spectrum reuse functionality, secondary users are required to sense the radio frequency environment, and once the primary users are found to be active, the secondary users must vacate the channel within a certain amount of time. Spectrum sensing is therefore of significant importance. Once sensing is done, different prediction rules apply to classify the traffic pattern of the primary user, which follows one of two types of traffic patterns: periodic or stochastic ON-OFF. A cognitive radio can learn the patterns in different channels over time. Two classification methods are discussed in this paper: one based on edge detection and one using the autocorrelation function. The edge detection method has high accuracy but cannot tolerate sensing errors. Autocorrelation-based classification is applicable in real environments, as it can tolerate some amount of sensing errors.
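Both stages, energy sensing and autocorrelation-based traffic classification, are compact to express; a sketch in which the threshold rule (a factor of 2 over the noise floor) is an illustrative assumption, not the paper's setting:

```python
def energy_detector(samples, noise_power, threshold_factor=2.0):
    """Declare the primary user present when the average sample energy
    exceeds a threshold set relative to the noise floor."""
    test_stat = sum(s * s for s in samples) / len(samples)
    return test_stat > threshold_factor * noise_power, test_stat

def autocorrelation(x, lag):
    """Normalized sample autocorrelation of a channel-occupancy sequence;
    a strong peak at some lag indicates periodic ON-OFF traffic."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    den = sum((xi - mean) ** 2 for xi in x)
    return num / den
```

A periodic ON-OFF pattern shows a near-unity autocorrelation at its period even when some occupancy samples are flipped by sensing errors, which is the tolerance the abstract refers to.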

Keywords: cognitive radio (CR), probability of detection (PD), probability of false alarm (PF), primary user (PU), secondary user (SU), fast Fourier transform (FFT), signal to noise ratio (SNR)

Procedia PDF Downloads 333
779 Coding Structures for Seated Row Simulation of an Active Controlled Vibration Isolation and Stabilization System for Astronaut’s Exercise Platform

Authors: Ziraguen O. Williams, Shield B. Lin, Fouad N. Matari, Leslie J. Quiocho

Abstract:

Simulation of the seated row exercise was a continued task to assist NASA in analyzing a one-dimensional vibration isolation and stabilization system for an astronaut’s exercise platform. Feedback delay and signal noise were added to the model, as previously done in the simulation of the squat exercise. Simulation runs for this study were conducted in two software simulation tools, Trick and MBDyn, simulation environments developed at the NASA Johnson Space Center. The exciter force in the simulation was calculated from motion capture of an exerciser during a seated row exercise. The simulation runs include passive control, active control using a proportional-integral-derivative (PID) controller, and active control using a piecewise linear integral derivative (PWLID) controller. Output parameters include the displacements of the exercise platform, the exerciser, and the counterweight; the force transmitted to the wall of the spacecraft; and the actuator force applied to the platform. The simulation results showed excellent force reduction in the actively controlled system compared to the passively controlled system.
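The PID control law used in the active runs can be sketched in a few lines of discrete-time code. The gains, time step, and error value below are hypothetical placeholders, not those of the NASA simulation:

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        # Accumulate the integral and approximate the derivative
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Hypothetical gains driving a platform displacement error back to zero
pid = PID(kp=100.0, ki=10.0, kd=5.0, dt=0.001)
u = pid.update(error=0.02)  # actuator command for a 2 cm displacement
```

A PWLID controller would replace the proportional term with a piecewise linear map of the error, which is one way to avoid actuator saturation near large displacements.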

Keywords: control, counterweight, isolation, vibration

Procedia PDF Downloads 121
778 A Hybrid Combustion Chamber Design for Diesel Engines

Authors: R. Gopakumar, G. Nagarajan

Abstract:

Both direct injection (DI) and indirect injection (IDI) systems possess inherent advantages as well as disadvantages. The objective of the present work is to obtain the maximum advantages of both systems by implementing a hybrid design. The hybrid combustion chamber design consists of two combustion chambers, viz., the main combustion chamber and an auxiliary combustion chamber. A fuel injector supplies the major quantity of fuel to the auxiliary chamber. Due to the increased swirl motion in the auxiliary chamber, mixing becomes more efficient, which contributes to a reduction in soot/particulate emissions. Also, by increasing the fuel injection pressure, NOx emissions can be reduced. The main objective of the hybrid combustion chamber design is to merge the positive features of both DI and IDI combustion chamber designs, providing increased swirl motion and improved thermal efficiency. Due to the efficient utilization of fuel, low specific fuel consumption can be ensured. This system also aids in increasing the power output for the same compression ratio and injection timing compared with conventional combustion chamber designs. The present system further reduces the heat transfer and fluid dynamic losses encountered in IDI diesel engines; since these losses are reduced, the overall efficiency of the engine increases. It also minimizes the combustion noise and NOx emissions of conventional DI diesel engines.

Keywords: DI, IDI, hybrid combustion, diesel engines

Procedia PDF Downloads 504
777 Magnetocaloric Effect in Ho₂O₃ Nanopowder at Cryogenic Temperature

Authors: K. P. Shinde, M. V. Tien, H. Lin, H.-R. Park, S.-C. Yu, K. C. Chung, D.-H. Kim

Abstract:

Magnetic refrigeration provides an attractive alternative cooling technology due to its potential advantages, such as high cooling efficiency, environmental friendliness, low noise, and compactness, over conventional cooling techniques based on gas compression. The magnetocaloric effect (MCE) manifests as changes in entropy (ΔS) and temperature (ΔT) under external magnetic fields. We have focused on identifying materials with a large MCE in two temperature regimes, not only room temperature but also cryogenic temperature, for specific technological applications such as space science and the liquefaction of hydrogen in the fuel industry. To date, the materials commonly used for cryogenic refrigeration are based on hydrated salts. In the present work, we report a giant MCE in rare-earth Ho₂O₃ nanopowder at cryogenic temperature. HoN nanoparticles with an average size of 30 nm were prepared using the plasma arc discharge method with a gas composition of N₂/H₂ (80%/20%). The prepared HoN was sintered in an air atmosphere at 1200 °C for 24 h to convert it into the oxide. Structural and morphological properties were studied by XRD and SEM. XRD confirms the pure phase and cubic crystal structure of Ho₂O₃ without any impurity within the error range. Holmium oxide was found to exhibit a giant MCE at low temperature without magnetic hysteresis loss, showing a second-order antiferromagnetic phase transition at a Néel temperature of around 2 K. The maximum entropy change was found to be 25.2 J/kg·K at an applied field of 6 T.
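Entropy changes of this kind are conventionally extracted from isothermal magnetization curves through the Maxwell relation ΔS(T) = ∫₀ᴴ (∂M/∂T)ₕ dH′. A minimal numerical sketch of that extraction, using a toy Curie-like magnetization surface rather than measured Ho₂O₃ data, might look like:

```python
import numpy as np

def entropy_change(T, H, M):
    """Magnetic entropy change via the Maxwell relation
    dS(T) = integral over H of (dM/dT); M is an M(T, H) grid
    with T in K (rows) and H in T (columns)."""
    dM_dT = np.gradient(M, T, axis=0)          # dM/dT at each field value
    dH = H[1] - H[0]                            # uniform field grid assumed
    # Trapezoidal rule over the field axis
    return np.sum(0.5 * (dM_dT[:, 1:] + dM_dT[:, :-1]) * dH, axis=1)

# Toy magnetization surface M = C*H/T (paramagnet-like), NOT measured data
T = np.linspace(2.0, 20.0, 10)                 # temperature grid, K
H = np.linspace(0.0, 6.0, 13)                  # field grid, up to 6 T
M = 50.0 * H[None, :] / T[:, None]
dS = entropy_change(T, H, M)                   # negative: entropy decreases
```

For a paramagnet-like M ∝ H/T, ∂M/∂T is negative everywhere, so ΔS is negative and largest in magnitude at the lowest temperature, which is qualitatively how the 25.2 J/kg·K peak at low temperature arises.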

Keywords: magnetocaloric effect, Ho₂O₃, magnetic entropy change, nanopowder

Procedia PDF Downloads 132
776 Cognitive Semantics Study of Conceptual and Metonymical Expressions in Johnson's Speeches about COVID-19

Authors: Hussain Hameed Mayuuf

Abstract:

The study is an attempt to investigate the conceptual metonymies used in political discourse about COVID-19. Thus, it analyzes how the conceptual metonymies in Johnson's speeches about the coronavirus are constructed. The study aims to identify how metonymies are relevant to understanding the messages in Boris Johnson's speeches, to find out how conceptual blending theory can help people understand the messages in political speech about COVID-19, and, lastly, to point out which kinds of integration networks are common in political speech. The study is based on the hypotheses that conceptual blending theory is a powerful tool for investigating the intended messages in Johnson's speeches and that different processes of blending networks and conceptual mapping enable listeners to identify the messages in political speech. This study presents a qualitative and quantitative analysis of four speeches about COVID-19 delivered by Boris Johnson. The selected data have been tackled from a cognitive-semantic perspective by adopting Conceptual Blending Theory (CBT) as the model for analysis. The study concludes that CBT is applicable to the analysis of metonymies in political discourse; its mechanisms enable listeners to analyze and understand these speeches. Listeners can also identify and understand the hidden messages in Johnson's discourse about COVID-19 by using different conceptual networks. Finally, it is concluded that double-scope networks are the most common type of blending of metonymies in political speech.

Keywords: cognitive, semantics, conceptual, metonymical, Covid-19

Procedia PDF Downloads 101
775 An Improved Data Aided Channel Estimation Technique Using Genetic Algorithm for Massive Multi-Input Multiple-Output

Authors: M. Kislu Noman, Syed Mohammed Shamsul Islam, Shahriar Hassan, Raihana Pervin

Abstract:

With the increasing number of wireless devices and high-bandwidth operations, wireless networks and communications are becoming overcrowded. To cope with this congestion, massive MIMO is designed to work with hundreds of low-cost serving antennas at a time while also improving spectral efficiency. TDD has been used to enable beamforming, a major part of massive MIMO, by transmitting and receiving pilot sequences. All of these benefits are only possible if the channel state information, i.e., the channel estimate, is obtained properly. The common methods used so far to estimate the channel matrix are LS, MMSE, and a linear version of MMSE, also proposed in many research works. We have optimized these methods using a genetic algorithm (GA) to minimize the mean squared error (MSE) and find the best channel matrix among the existing algorithms with less computational complexity. Our simulation results show that the GA works well on the existing algorithms in a Rayleigh slow-fading channel in the presence of additive white Gaussian noise. We found that the GA-optimized LS is better than the existing algorithms, as the GA reaches an optimal result in a few iterations in terms of MSE with respect to SNR and computational complexity.
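The LS baseline that the GA starts from can be sketched as follows: given a known pilot matrix X and received pilots Y = HX + N, the least-squares estimate is H_LS = Y Xᴴ (X Xᴴ)⁻¹. The dimensions and pilot matrix below are toy assumptions, far smaller than a massive MIMO array and not the paper's simulation setup:

```python
import numpy as np

rng = np.random.default_rng(1)

def ls_estimate(Y, X):
    """Least-squares channel estimate H_ls = Y X^H (X X^H)^-1
    from received pilots Y and the known pilot matrix X."""
    return Y @ X.conj().T @ np.linalg.inv(X @ X.conj().T)

def crandn(*shape):
    """Unit-variance circularly symmetric complex Gaussian samples."""
    return (rng.normal(size=shape) + 1j * rng.normal(size=shape)) / np.sqrt(2)

n_rx, n_tx, n_pilot = 8, 4, 16          # toy dimensions, not massive scale
H = crandn(n_rx, n_tx)                   # Rayleigh (slow) fading channel
X = crandn(n_tx, n_pilot)                # known pilot sequences
N = 0.01 * crandn(n_rx, n_pilot)         # additive white Gaussian noise
Y = H @ X + N                            # received pilot observations

H_ls = ls_estimate(Y, X)
mse = np.mean(np.abs(H - H_ls) ** 2)     # small when noise is weak
```

A GA layered on top of this would treat candidate channel matrices as chromosomes and use the pilot-domain MSE as the fitness function, which is the optimization target described in the abstract.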

Keywords: channel estimation, LMMSE, LS, MIMO, MMSE

Procedia PDF Downloads 173
774 A Selection Approach: Discriminative Model for Nominal Attributes-Based Distance Measures

Authors: Fang Gong

Abstract:

Distance measures are an indispensable part of many instance-based learning (IBL) and machine learning (ML) algorithms. The value difference metric (VDM) and the inverted specific-class distance measure (ISCDM) are among the top-performing distance measures for nominal attributes. VDM performs well in some domains owing to its simplicity, but poorly in domains with missing values and non-class attribute noise. ISCDM, however, typically works better than VDM on such domains. To maximize their advantages and avoid their disadvantages, this paper proposes a selection approach: a discriminative model for nominal-attribute-based distance measures. More concretely, VDM and ISCDM are built independently on the training dataset at the training stage, and the more credible of the two is recorded for each training instance. At the test stage, the nearest neighbor of each test instance is first found by each of VDM and ISCDM, and the more reliable model recorded for that nearest neighbor is then chosen to predict the instance's class label. This is simply denoted as the discriminative distance measure (DDM). Experiments are conducted on 34 University of California at Irvine (UCI) machine learning repository datasets, and the results show that DDM retains the interpretability and simplicity of VDM and ISCDM but significantly outperforms the original VDM and ISCDM, as well as other state-of-the-art competitors, in terms of accuracy.
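The VDM component of the selection approach can be sketched as follows. VDM compares two nominal values by the difference of their class-conditional probabilities, d(v1, v2) = Σ_c |P(c|v1) − P(c|v2)|^q, summed over attributes. This is an illustrative implementation of the standard metric on a toy dataset, not the authors' code:

```python
from collections import defaultdict

def train_vdm(X, y):
    """Estimate P(class | attribute, value) tables needed by the
    Value Difference Metric (VDM)."""
    classes = sorted(set(y))
    counts = defaultdict(lambda: defaultdict(int))  # (attr, value) -> class counts
    for row, label in zip(X, y):
        for a, v in enumerate(row):
            counts[(a, v)][label] += 1
    probs = {}
    for key, cc in counts.items():
        total = sum(cc.values())
        probs[key] = {c: cc.get(c, 0) / total for c in classes}
    return probs, classes

def vdm_distance(x1, x2, probs, classes, q=2):
    """VDM: sum over attributes and classes of |P(c|v1) - P(c|v2)|^q."""
    d = 0.0
    for a, (v1, v2) in enumerate(zip(x1, x2)):
        p1 = probs.get((a, v1), {})
        p2 = probs.get((a, v2), {})
        for c in classes:
            d += abs(p1.get(c, 0.0) - p2.get(c, 0.0)) ** q
    return d

# Toy nominal dataset: two attributes, two classes
X = [("red", "round"), ("red", "square"), ("blue", "round"), ("blue", "square")]
y = ["pos", "pos", "neg", "neg"]
probs, classes = train_vdm(X, y)
```

DDM would wrap a second table of ISCDM distances around this and, per training instance, record which of the two metrics classified it correctly more often.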

Keywords: distance measure, discriminative model, nominal attributes, nearest neighbor

Procedia PDF Downloads 98
773 Artificial Neural Network Modeling and Genetic Algorithm Based Optimization of Hydraulic Design Related to Seepage under Concrete Gravity Dams on Permeable Soils

Authors: Muqdad Al-Juboori, Bithin Datta

Abstract:

Hydraulic structures such as gravity dams are classified as essential structures and play a vital role in providing strong and safe water resource management. Three major aspects must be considered to achieve an effective design of such a structure: 1) the building cost, 2) safety, and 3) accurate analysis of seepage characteristics. Due to the complexity and non-linearity of the seepage process, many approximation theories have been developed; however, applying these theories results in noticeable errors. The analytical solution, which involves a difficult conformal mapping procedure, can be applied only to simple and symmetrical problems. Therefore, the objectives of this paper are to: 1) develop a surrogate model, based on data numerically simulated with SEEP/W software, to approximately simulate the seepage process related to a hydraulic structure, and 2) develop and solve a linked simulation-optimization model, based on the developed surrogate model, that describes the seepage occurring under a concrete gravity dam, in order to obtain an optimum and safe design at minimum cost. The results show that the linked simulation-optimization model provides an efficient and optimum design for concrete gravity dams.
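The linked simulation-optimization idea can be sketched with a minimal genetic algorithm driving a stand-in for the trained surrogate. Everything below is hypothetical for illustration: the cost function, design variables (floor length and cutoff depth), and safety constraint are invented, not the paper's ANN trained on SEEP/W output:

```python
import random

random.seed(0)

def surrogate_cost(x):
    """Stand-in for the trained ANN surrogate: penalized construction
    cost for a design x = (floor length, cutoff depth), both in metres.
    Hypothetical formulas, not the paper's model."""
    length, depth = x
    cost = 2.0 * length + 5.0 * depth                    # material cost
    exit_gradient = 4.0 / (1.0 + 0.3 * length + 0.8 * depth)
    penalty = 1000.0 if exit_gradient > 1.0 else 0.0     # unsafe seepage
    return cost + penalty

def genetic_minimize(fitness, bounds, pop=30, gens=60, mut=0.3):
    """Minimal real-coded GA: elitism, midpoint crossover,
    Gaussian mutation clamped to the variable bounds."""
    population = [[random.uniform(lo, hi) for lo, hi in bounds]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness)                     # best first
        elite = population[: pop // 2]
        children = []
        while len(children) < pop - len(elite):
            a, b = random.sample(elite, 2)
            child = [(u + v) / 2 for u, v in zip(a, b)]  # crossover
            child = [min(max(v + random.gauss(0, mut), lo), hi)
                     for v, (lo, hi) in zip(child, bounds)]  # mutation
            children.append(child)
        population = elite + children
    return min(population, key=fitness)

best = genetic_minimize(surrogate_cost, bounds=[(1.0, 30.0), (0.5, 10.0)])
```

The key design choice mirrored here is that the GA never calls the expensive numerical seepage solver inside the loop; it queries only the cheap surrogate, which is what makes the linked model tractable.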

Keywords: artificial neural network, concrete gravity dam, genetic algorithm, seepage analysis

Procedia PDF Downloads 209
772 Research Methods and Design Strategies to Improve Resilience in Coastal and Estuary Cities

Authors: Irene Perez Lopez

Abstract:

Delta and estuary cities are spaces in constant evolution, incessantly altered by the ever-changing actions of water. Strategies that incorporate comprehensive and integrated approaches to planning and designing with water will play a powerful role in defining new types of flood defense. These strategies will encourage more resilient and active urban environments, allowing for new spatial and functional programs. This abstract presents ongoing research in Newcastle, the first urbanized delta in New South Wales (Australia) and the region's second-biggest catchment and estuary. The research methodology is organized in three phases: 1) a projective cartography that analyzes maps and data across the region's recorded history, identifying past and present constraints and predicting future conditions; this cartography helps identify worst-case scenarios, revealing the implications of land reclamation undertaken without considering the evolution of climate change and its conflicts with inhabitation; 2) the cartographic studies identify the areas under threat and form the basis for further interdisciplinary research, complemented by community consultation, to reduce flood risk and increase urban resilience and livability; 3) a speculative or prospective phase of design with water to generate evidence-based guidelines that strengthen the urban resilience of shorelines and flood-prone areas.

Keywords: coastal defense, design, urban resilience, mapping

Procedia PDF Downloads 114
771 Tool Wear of Aluminum/Chromium/Tungsten Based Coated Cemented Carbide Tools in Cutting Sintered Steel

Authors: Tadahiro Wada, Hiroyuki Hanyu

Abstract:

In this study, to clarify the effectiveness of aluminum/chromium/tungsten-based coated tools for cutting sintered steel, tool wear was experimentally investigated. The sintered steel was turned with (Al60,Cr25,W15)N-, (Al60,Cr25,W15)(C,N)- and (Al64,Cr28,W8)(C,N)-coated cemented carbide tools, the coatings being deposited by the physical vapor deposition (PVD) method. Moreover, the tool wear of the aluminum/chromium/tungsten-based coated tools was compared with that of an (Al,Cr)N-coated tool. Furthermore, to clarify the wear mechanism of the aluminum/chromium/tungsten coating films in cutting sintered steel, scanning electron microscope (SEM) observation and energy-dispersive X-ray spectroscopy (EDS) mapping analysis were conducted on the worn surface. The following results were obtained: (1) The wear progress of the (Al64,Cr28,W8)(C,N)-coated tool was the slowest among the five coated tools. (2) Adding carbon (C) to the aluminum/chromium/tungsten-based coating film was effective in improving wear resistance. (3) The main wear mechanism of the (Al60,Cr25,W15)N, (Al60,Cr25,W15)(C,N) and (Al64,Cr28,W8)(C,N) coating films was abrasive wear.

Keywords: cutting, physical vapor deposition coating method, tool wear, tool wear mechanism, (Al, Cr, W)N-coating film, (Al, Cr, W)(C, N)-coating film, sintered steel

Procedia PDF Downloads 360
770 Image Segmentation Using Active Contours Based on Anisotropic Diffusion

Authors: Shafiullah Soomro

Abstract:

Active contours are one of the main image segmentation techniques; their goal is to capture the required object boundaries within an image. In this paper, we propose a novel image segmentation method using an active contour model based on an anisotropic diffusion feature enhancement technique. Traditional active contour methods use only pixel information to perform segmentation, which produces inaccurate results when an image contains noise or a complex background. We use the Perona-Malik diffusion scheme for feature enhancement, which sharpens object boundaries and blurs background variations. Our main contribution is the formulation of a new signed pressure force (SPF) function, which uses global intensity information across the regions. By minimizing an energy function within a partial differential equation framework, the proposed method captures semantically meaningful boundaries instead of catching uninteresting regions. Finally, we use a Gaussian kernel, which eliminates the problem of re-initializing the level set function. We use several synthetic and real images from different modalities to validate the performance of the proposed method. In the experimental section, the proposed method performs better both qualitatively and quantitatively, yielding results with higher accuracy than other state-of-the-art methods.
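The Perona-Malik feature enhancement step can be sketched as follows: each pixel is updated by fluxes toward its four neighbours, each flux weighted by an edge-stopping conduction coefficient g(|∇I|) = exp(−(|∇I|/κ)²), so flat (noisy) regions are smoothed while strong edges survive. The parameters and test image below are illustrative, not those used in the paper:

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.5, step=0.2):
    """Perona-Malik anisotropic diffusion: smooth within regions while
    preserving edges via the conduction g(d) = exp(-(d/kappa)^2)."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Finite differences toward the four neighbours (wrap at borders)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # Edge-stopping conduction: near 1 in flat areas, near 0 at edges
        cn = np.exp(-(dn / kappa) ** 2)
        cs = np.exp(-(ds / kappa) ** 2)
        ce = np.exp(-(de / kappa) ** 2)
        cw = np.exp(-(dw / kappa) ** 2)
        u += step * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

# Noisy step image: diffusion should suppress the noise, keep the edge
rng = np.random.default_rng(0)
img = np.zeros((32, 32))
img[:, 16:] = 1.0
noisy = img + rng.normal(0, 0.05, img.shape)
smoothed = perona_malik(noisy)
```

The enhanced image, rather than the raw one, then feeds the SPF-driven level set evolution, which is why the contour is less distracted by background variation.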

Keywords: active contours, anisotropic diffusion, level-set, partial differential equations

Procedia PDF Downloads 146