Search results for: optimal digital signal processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10279

8869 Albumin-Induced Turn-on Fluorescence in Molecular Engineered Fluorescent Probe for Biomedical Application

Authors: Raja Chinnappan, Huda Alanazi, Shanmugam Easwaramoorthi, Tanveer Mir, Balamurugan Kanagasabai, Ahmed Yaqinuddin, Sandhanasamy Devanesan, Mohamad S. AlSalhi

Abstract:

Serum albumin (SA) is the most abundant water-soluble protein in plasma. It helps keep living organisms healthy, maintaining proper liver function, kidney function, and plasma osmolality in the body. Low levels of serum albumin are an indication of liver failure and chronic hepatitis. Therefore, it is important to have a low-cost, accurate, and rapid method for its quantification. In this study, we designed a fluorescent probe, triphenylamine rhodanine-3-acetic acid (mRA), which triggers a fluorescence signal upon binding with serum albumin. mRA is a bifunctional molecule with twisted intramolecular charge transfer (TICT)-induced emission characteristics. An aqueous solution of mRA shows an insignificant fluorescence signal; however, when mRA binds to SA, it undergoes TICT and turns on the fluorescence emission. A dose-dependent fluorescence response to SA was recorded, and the limit of detection was found to be below the ng/mL level. The specificity of SA binding was tested in a cross-reactivity study using structurally or functionally similar proteins.

Keywords: serum albumin, fluorescent sensing probe, liver diseases, twisted intramolecular charge transfer

Procedia PDF Downloads 21
8868 Using Jumping Particle Swarm Optimization for Optimal Operation of Pump in Water Distribution Networks

Authors: R. Rajabpour, N. Talebbeydokhti, M. H. Ahmadi

Abstract:

Carefully scheduling pump operations can result in significant energy savings. Schedules can be defined either implicitly, in terms of other elements of the network such as tank levels, or explicitly, by specifying the times during which each pump is on or off. In this study, two new explicit representations based on time-controlled triggers were analyzed, in which the maximum number of pump switches is established beforehand and the schedule may contain fewer switches than the maximum. The optimal operation of pumping stations was determined using a Jumping Particle Swarm Optimization (JPSO) algorithm to achieve the minimum energy cost. The model integrates the JPSO optimizer and the EPANET hydraulic network solver. The optimal pump operation schedule of the VanZyl water distribution system was determined using the proposed model and compared with those from Genetic and Ant Colony algorithms. The results indicate that the proposed model utilizing the JPSO algorithm outperformed the others and is a versatile management model for the operation of real-world water distribution systems.
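
As a rough illustration of the discrete PSO idea behind such schedulers (not the authors' exact JPSO/EPANET implementation), the sketch below evolves binary on/off schedules for a single pump against a toy time-of-use tariff; the tariff, demand constraint, and jump probabilities are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: schedule one pump over 24 hourly slots (1 = on, 0 = off).
# A time-of-use tariff and a minimum daily pumping requirement stand in
# for the EPANET hydraulic constraints of the real model.
hours = np.arange(24)
tariff = np.where((hours >= 8) & (hours < 20), 0.15, 0.08)  # peak / off-peak
min_on_hours = 10

def cost(schedule):
    energy = np.sum(schedule * tariff)
    shortfall = max(0, min_on_hours - int(schedule.sum()))
    return energy + 10.0 * shortfall        # penalised constraint violation

n_particles, n_iter, dim = 30, 200, 24
X = rng.integers(0, 2, size=(n_particles, dim))
pbest = X.copy()
pbest_cost = np.array([cost(x) for x in X])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(n_iter):
    for i in range(n_particles):
        # "Jumping" move: each bit jumps towards the personal best, the
        # global best, or flips at random (a discrete analogue of velocity).
        r = rng.random(dim)
        X[i] = np.where(r < 0.4, pbest[i], np.where(r < 0.8, gbest, 1 - X[i]))
        c = cost(X[i])
        if c < pbest_cost[i]:
            pbest[i], pbest_cost[i] = X[i].copy(), c
    gbest = pbest[pbest_cost.argmin()].copy()

print("best schedule:", gbest, "cost:", round(cost(gbest), 3))
```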

Keywords: JPSO, operation, optimization, water distribution system

Procedia PDF Downloads 247
8867 Human Identification Using Local Roughness Patterns in Heartbeat Signal

Authors: Md. Khayrul Bashar, Md. Saiful Islam, Kimiko Yamashita, Yano Midori

Abstract:

Despite some progress in human authentication, conventional biometrics (e.g., facial features, fingerprints, retinal scans, gait, voice patterns) are not robust against falsification because they are neither confidential nor secret to an individual. As a non-invasive tool, the electrocardiogram (ECG) has recently shown great potential in human recognition due to its unique rhythms, which characterize the variability of human heart structures (chest geometry, sizes, and positions). Moreover, ECG has a real-time vitality characteristic that signifies live signs, ensuring that only a legitimate, living individual is identified. However, the detection accuracy of current ECG-based methods is insufficient due to the high variability of an individual's heartbeats at different instants of time. These variations may occur due to muscle flexure, changes of mental or emotional state, and changes of sensor position or long-term baseline shift during the recording of the ECG signal. In this study, a new method is proposed for human identification based on the extraction of the local roughness of ECG heartbeat signals. First, the ECG signal is preprocessed using a second-order band-pass Butterworth filter with cut-off frequencies of 0.00025 and 0.04. A set of local binary patterns is then extracted by moving a neighborhood window along the ECG signal. At each instant, the pattern is formed by comparing the ECG intensities at neighboring time points with the central intensity in the moving window. Binary weights are then multiplied with the pattern to obtain the local roughness description of the signal. Finally, histograms are constructed that describe the heartbeat signals of the individual subjects in the database. One advantage of the proposed feature is that, unlike conventional methods, it does not depend on the accuracy of QRS-complex detection. Supervised recognition methods are then designed using minimum-distance-to-mean and Bayesian classifiers to identify authentic human subjects. An experiment with sixty (60) ECG signals from sixty adult subjects in the PTB database of the National Metrology Institute of Germany showed that the proposed method is promising compared with a conventional interval- and amplitude-feature-based method.
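
A minimal sketch of the local-roughness idea, assuming the quoted cut-offs are normalised frequencies and using an illustrative window radius (the paper's exact window size and weighting are not given):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def local_roughness_histogram(ecg, radius=4):
    """1-D local binary pattern histogram of a heartbeat signal (sketch)."""
    # Band-pass filter; treating the abstract's cut-offs as normalised
    # frequencies (fractions of Nyquist) is an assumption on our part.
    b, a = butter(2, [0.00025, 0.04], btype="band")
    ecg = filtfilt(b, a, ecg)

    n = len(ecg)
    codes = np.zeros(n - 2 * radius, dtype=int)
    weights = 2 ** np.arange(2 * radius)           # binary weights
    for t in range(radius, n - radius):
        centre = ecg[t]
        neighbours = np.concatenate([ecg[t - radius:t],
                                     ecg[t + 1:t + radius + 1]])
        bits = (neighbours >= centre).astype(int)  # compare with the centre
        codes[t - radius] = np.dot(bits, weights)
    hist, _ = np.histogram(codes, bins=2 ** (2 * radius),
                           range=(0, 2 ** (2 * radius)))
    return hist / hist.sum()

# Example on a noisy synthetic "heartbeat" trace
rng = np.random.default_rng(1)
signal = np.sin(np.linspace(0, 20 * np.pi, 5000)) + 0.1 * rng.standard_normal(5000)
print(local_roughness_histogram(signal)[:10])
```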

Keywords: human identification, ECG biometrics, local roughness patterns, supervised classification

Procedia PDF Downloads 405
8866 A Genetic Algorithm for the Load Balance of Parallel Computational Fluid Dynamics Computation with Multi-Block Structured Mesh

Authors: Chunye Gong, Ming Tie, Jie Liu, Weimin Bao, Xinbiao Gan, Shengguo Li, Bo Yang, Xuguang Chen, Tiaojie Xiao, Yang Sun

Abstract:

Large-scale CFD simulation relies on high-performance parallel computing, and load balance plays a key role in parallel efficiency. This paper focuses on the load-balancing problem of parallel CFD simulation with structured mesh. A mathematical model for this load-balancing problem is presented. A genetic algorithm, its fitness computation, and a two-level encoding are designed, along with an optimal selector, a robust operator, and a local optimization operator. The properties of the presented genetic algorithm are discussed in depth, and the effects of the optimal selector, robust operator, and local optimization operator are demonstrated by experiments. The experimental results on different test sets, DLR-F4, and aircraft design applications show that the presented load-balancing algorithm is robust, converges quickly, and is useful in real engineering problems.
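
The abstract does not give the encoding details, but a minimal GA for the underlying block-to-processor assignment problem might look like the following sketch, with elitist selection standing in for the paper's optimal selector and all problem sizes invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical problem: assign 40 mesh blocks of known cell counts to
# 8 processors so that the most loaded processor is minimised.
n_blocks, n_procs = 40, 8
work = rng.integers(1_000, 50_000, size=n_blocks)   # cells per block

def fitness(assign):
    loads = np.bincount(assign, weights=work, minlength=n_procs)
    return loads.max()                               # makespan: lower is better

pop_size, n_gen, mut_rate = 60, 300, 0.05
pop = rng.integers(0, n_procs, size=(pop_size, n_blocks))

for _ in range(n_gen):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[scores.argsort()[: pop_size // 2]]   # elitist selection
    # One-point crossover between random elite parents
    parents = elite[rng.integers(0, len(elite), size=(pop_size, 2))]
    cut = rng.integers(1, n_blocks, size=pop_size)
    children = np.where(np.arange(n_blocks) < cut[:, None],
                        parents[:, 0], parents[:, 1])
    # Mutation: reassign a block to a random processor
    mask = rng.random(children.shape) < mut_rate
    children[mask] = rng.integers(0, n_procs, size=mask.sum())
    pop = children

best = pop[np.argmin([fitness(ind) for ind in pop])]
print("max processor load:", fitness(best), "ideal:", work.sum() / n_procs)
```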

Keywords: genetic algorithm, load-balancing algorithm, optimal variation, local optimization

Procedia PDF Downloads 185
8865 Signal Strength Based Multipath Routing for Mobile Ad Hoc Networks

Authors: Chothmal

Abstract:

In this paper, we present a route discovery process that uses the signal strength on a link as a parameter for its inclusion in a route. The proposed signal-to-interference-and-noise ratio (SINR) based multipath reactive routing protocol is named the SINR-MP protocol. The SINR-MP protocol has the following two features: a) it selects routes based on the SINR of the links during the route discovery process, so it selects routes with long lifetimes and low frame error rates for data transmission, and b) its route discovery process is multipath, discovering more than one SINR-based route between a given source-destination pair. The multiple routes selected by the SINR-MP protocol are node-disjoint in nature, which increases their robustness against link failures, as the failure of one route will not affect the other. The secondary route is very useful when the primary route is broken, because it can be used without triggering a new route discovery process. The network overhead caused by a route discovery process is thereby avoided, which greatly improves network performance. The proposed SINR-MP routing protocol is implemented in the trial version of the network simulator QualNet.
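
A small sketch of the underlying idea, selecting a best route by link SINR and then a node-disjoint backup; the topology and SINR values are invented, and route discovery in a real ad hoc protocol is distributed rather than computed on a global graph as here:

```python
import networkx as nx

# Toy ad hoc topology: edge weight = 1/SINR so that shortest paths
# prefer high-SINR links (the link SINR values are illustrative).
sinr = {("S", "A"): 18, ("A", "B"): 15, ("B", "D"): 20,
        ("S", "C"): 9,  ("C", "E"): 12, ("E", "D"): 10,
        ("A", "E"): 6}
G = nx.Graph()
for (u, v), s in sinr.items():
    G.add_edge(u, v, weight=1.0 / s, sinr=s)

def best_route(graph, src, dst):
    return nx.shortest_path(graph, src, dst, weight="weight")

primary = best_route(G, "S", "D")
# Remove the primary route's intermediate nodes to force a
# node-disjoint secondary route.
H = G.copy()
H.remove_nodes_from(primary[1:-1])
secondary = best_route(H, "S", "D")

print("primary:  ", primary)    # S-A-B-D: the highest-SINR links
print("secondary:", secondary)  # node-disjoint backup: S-C-E-D
```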

Keywords: ad hoc networks, quality of service, video streaming, H.264/SVC, multiple routes, video traces

Procedia PDF Downloads 251
8864 Study for an Optimal Cable Connection within an Inner Grid of an Offshore Wind Farm

Authors: Je-Seok Shin, Wook-Won Kim, Jin-O Kim

Abstract:

An offshore wind farm needs to be designed carefully, considering both economic and reliability aspects. Among the many decision-making problems in designing an entire offshore wind farm, this paper focuses on the inner grid layout, i.e., the connections between wind turbines as well as between wind turbines and an offshore substation. The methodology proposed in this paper determines the connections and the cable type for each connection section using K-clustering, minimum spanning tree, and cable selection algorithms. A cost evaluation is then performed in terms of investment, power loss, and reliability. Through the cost evaluation, the optimal inner grid layout is determined as the one with the lowest total cost. In order to demonstrate the validity of the methodology, a case study is conducted on a 240 MW offshore wind farm, and the results show that the methodology is helpful in optimally designing an offshore wind farm.
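
A minimal sketch of the minimum-spanning-tree step on an invented turbine layout (the full methodology also involves K-clustering and cable-type selection, which are omitted here):

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(0)

# Hypothetical layout: 12 turbines plus an offshore substation at the origin.
turbines = rng.uniform(0, 5_000, size=(12, 2))      # coordinates in metres
nodes = np.vstack([[0.0, 0.0], turbines])           # node 0 = substation

# Full distance matrix -> minimum spanning tree as a cheap cable layout.
dist = squareform(pdist(nodes))
mst = minimum_spanning_tree(dist).toarray()

print(f"total cable length: {mst.sum() / 1000:.2f} km")
for i, j in zip(*np.nonzero(mst)):
    print(f"cable {i:2d} -> {j:2d}: {mst[i, j]:7.1f} m")
```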

Keywords: offshore wind farm, optimal layout, k-clustering algorithm, minimum spanning algorithm, cable type selection, power loss cost, reliability cost

Procedia PDF Downloads 386
8863 The Teaching and Learning Process and Information and Communication Technologies from the Remote Perspective

Authors: Rosiris Maturo Domingues, Patricia Luissa Masmo, Cibele Cavalheiro Neves, Juliana Dalla Martha Rodriguez

Abstract:

This article reports the experience of the pedagogical consultants responsible for curriculum development in Senac São Paulo courses when facing the emergency need to maintain the pedagogical process in their schools during the Covid-19 pandemic. The urgent adjustment to distance education resulted in the improvement of the process and the adoption of new teaching and learning strategies mediated by technologies. The processes for preparing and providing guidelines for professional education courses were also readjusted. Thus, a bank of teaching-learning strategies linked to digital resources was developed, categorized, and identified by didactic-pedagogical potential, with didactic planning based on learning objectives drawn from Bloom's (revised) taxonomy as the common thread, given its convergence with the competency approach adopted by Senac. Methodologically, a relationship was established between connectivity, digital networks, and digital evolution in school environments, culminating in new paradigms and processes of educational communication and new trends in teaching and learning. As a result, teachers adopted digital tools in their practices, transposing face-to-face classroom methodologies and practices to online media; the main criticism was the use of ICTs in a merely instrumental way, reducing methodologies and practices to purely transmissive teaching. Technology came to be recognized as a genuine, rather than palliative, facilitator of the educational process, and a web curriculum was developed, now fully carried out in contexts of ubiquity.

Keywords: technologies, education, teaching-learning strategies, Bloom taxonomy

Procedia PDF Downloads 91
8862 Impact of Hard Limited Clipping Crest Factor Reduction Technique on Bit Error Rate in OFDM Based Systems

Authors: Theodore Grosch, Felipe Koji Godinho Hoshino

Abstract:

In wireless communications, 3GPP LTE is one of the solutions for meeting the growing demand for transmission data rate. One issue inherent to this technology is the high PAPR (Peak-to-Average Power Ratio) of OFDM (Orthogonal Frequency Division Multiplexing) modulation, which reduces the efficiency of power amplifiers. One approach to mitigating this effect is the Crest Factor Reduction (CFR) technique. In this work, we simulate the impact of the hard-limited clipping CFR technique on BER (Bit Error Rate) in OFDM-based systems. In general, the results show that CFR affects higher-order digital modulation schemes more strongly, as expected. More importantly, we show the worst-case degradation due to CFR on QPSK, 16-QAM, and 64-QAM signals in a linear system. For example, hard clipping at 9 dB results in a 2 dB increase in the signal-to-noise ratio required for a 1% BER with 64-QAM modulation.
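
A minimal numpy sketch of hard-limited clipping on one OFDM symbol, using an illustrative 16-QAM constellation and clipping ratio (not the paper's simulation chain):

```python
import numpy as np

rng = np.random.default_rng(0)

def papr_db(x):
    return 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))

# One OFDM symbol: 64 random 16-QAM subcarriers -> time domain via IFFT.
levels = np.array([-3, -1, 1, 3])
symbols = rng.choice(levels, 64) + 1j * rng.choice(levels, 64)
x = np.fft.ifft(symbols)

# Hard-limited clipping: cap the envelope at a threshold set relative
# to the RMS level (the clipping ratio), preserving the phase.
clip_ratio_db = 6.0
threshold = np.sqrt(np.mean(np.abs(x) ** 2)) * 10 ** (clip_ratio_db / 20)
clipped = np.where(np.abs(x) > threshold,
                   threshold * x / np.abs(x),   # keep phase, cap magnitude
                   x)

print(f"PAPR before: {papr_db(x):5.2f} dB, after: {papr_db(clipped):5.2f} dB")
```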

Keywords: bit error rate, crest factor reduction, OFDM, physical layer simulation

Procedia PDF Downloads 366
8861 Compilation of Load Spectrum of Loader Drive Axle

Authors: Wei Yongxiang, Zhu Haoyue, Tang Heng, Yuan Qunwei

Abstract:

In order to study the preparation of gear fatigue load spectra for loaders, load signals for four typical working conditions of a loader were collected. Preprocessing the raw signals yields signals that reflect the pattern of load variation. Torque cycles of the drive axle are counted using the rain-flow counting method. According to the operating time ratio of each working condition, a two-dimensional load spectrum based on the real working conditions of the loader drive axle is established by cycle extrapolation and synthesis. The two-dimensional load spectrum is converted into a one-dimensional load spectrum by applying the equal-damage method to the torque means. The torque is amplified up to the maximum load torque of the main reduction gear, and the accelerated cycles are calculated based on the theory of equal damage. In this way, a load spectrum reflecting the actual loading of the loader drive axle is prepared. The load spectrum can provide a reference for fatigue life testing and life prediction of loader drive axles.
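
A very small three-point rainflow counter, counting full cycles only and leaving the residual uncounted; this is a simplified sketch of the standard method, not the authors' processing chain:

```python
def turning_points(series):
    """Keep only the local peaks and valleys of a load-time series."""
    tp = [series[0]]
    for x in series[1:]:
        if len(tp) >= 2 and (tp[-1] - tp[-2]) * (x - tp[-1]) > 0:
            tp[-1] = x                  # same direction: extend the excursion
        elif x != tp[-1]:
            tp.append(x)
    return tp

def rainflow(series):
    """Minimal three-point rainflow counter (full cycles only)."""
    stack, cycles = [], []
    for point in turning_points(series):
        stack.append(point)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])   # most recent range
            y = abs(stack[-2] - stack[-3])   # previous range
            if x < y:
                break
            # The previous range is closed: count one full cycle
            cycles.append((y, (stack[-2] + stack[-3]) / 2))  # (range, mean)
            del stack[-3:-1]
    return cycles, stack                 # counted cycles, residual sequence

torque = [0, 5, 1, 6, 2, 8, 0, 7, 3, 9, 1]   # illustrative torque samples
cycles, residual = rainflow(torque)
print("cycles (range, mean):", cycles)
print("residual:", residual)
```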

Keywords: load spectrum, axle, torque, rain-flow counting method, extrapolation

Procedia PDF Downloads 366
8860 Pantograph-Catenary Contact Force: Features Evaluation for Catenary Diagnostics

Authors: Mehdi Brahimi, Kamal Medjaher, Noureddine Zerhouni, Mohammed Leouatni

Abstract:

Prognostics and Health Management (PHM) is a systems engineering discipline that provides solutions and models for the implementation of predictive maintenance. The approach is based on extracting useful information from monitoring data to assess the “health” state of industrial equipment or assets. In this paper, we examine multiple features extracted from the pantograph-catenary contact force in order to select the most relevant ones for a diagnostics function. The feature extraction methodology is applied both to simulation data, generated with the pantograph-catenary simulation software INPAC, and to measurement data. Feature extraction is based on both statistical and signal processing analyses, while feature selection is based on statistical criteria.
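
A sketch of the kind of statistical features one might extract from a contact force record; the feature list and sampling rate are assumptions, not the paper's selected feature set:

```python
import numpy as np
from scipy import stats

def contact_force_features(force, fs=200.0):
    """Statistical features of a pantograph-catenary contact force record."""
    f = np.asarray(force, dtype=float)
    feats = {
        "mean": f.mean(),
        "std": f.std(ddof=1),
        "skewness": stats.skew(f),
        "kurtosis": stats.kurtosis(f),
        "p2p": np.ptp(f),                     # peak-to-peak range
        "rms": np.sqrt(np.mean(f ** 2)),
    }
    # A simple spectral feature: frequency of the dominant oscillation,
    # often linked to span/dropper passage.
    spec = np.abs(np.fft.rfft(f - f.mean()))
    freqs = np.fft.rfftfreq(len(f), d=1 / fs)
    feats["dominant_freq_hz"] = freqs[spec.argmax()]
    return feats

rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 200)
force = 120 + 15 * np.sin(2 * np.pi * 1.3 * t) + 5 * rng.standard_normal(len(t))
print(contact_force_features(force))
```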

Keywords: catenary/pantograph interaction, diagnostics, Prognostics and Health Management (PHM), quality of current collection

Procedia PDF Downloads 290
8859 A Digital Twin Approach for Sustainable Territories Planning: A Case Study on District Heating

Authors: Ahmed Amrani, Oussama Allali, Amira Ben Hamida, Felix Defrance, Stephanie Morland, Eva Pineau, Thomas Lacroix

Abstract:

The energy planning process is a very complex task that involves several stakeholders and requires the consideration of several local and global factors and constraints. In order to optimize and simplify this process, we propose a tool-based iterative approach applied to district heating planning. We built our tool in collaboration with a French territory, using actual district data and implementing the European incentives. We set up an iterative process including data visualization and analysis, identification and extraction of information related to the area concerned by the operation, design of sustainable planning scenarios leveraging local renewable and recoverable energy sources, and, finally, the evaluation of scenarios. The last step is performed by a dynamic digital twin replica of the city. The territory's energy experts confirm that the tool provides them with valuable support towards sustainable energy planning.

Keywords: climate change, data management, decision support, digital twin, district heating, energy planning, renewables, smart city

Procedia PDF Downloads 174
8858 Efficient Alias-Free Level Crossing Sampling

Authors: Negar Riazifar, Nigel G. Stocks

Abstract:

This paper proposes strategies in level crossing (LC) sampling and reconstruction that provide alias-free, high-fidelity reconstruction of speech signals without the number of samples increasing exponentially with bit-depth. We introduce LC sampling methods that reduce the sampling rate close to the Nyquist frequency even for large bit-depths. The results indicate that larger variation in the sampling intervals leads to an alias-free sampling scheme; this is achieved either by reducing the bit-depth or by adding jitter to the system for high bit-depths. In conjunction with windowing, the signal is reconstructed from the LC samples using an efficient Toeplitz reconstruction algorithm.
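
A minimal sketch of level-crossing sampling with uniform levels, linear interpolation of crossing times, and optional jitter (the paper's Toeplitz reconstruction step is not reproduced here):

```python
import numpy as np

def level_crossing_sample(signal, t, bit_depth=4, jitter=0.0, rng=None):
    """Record (time, level) pairs where the signal crosses quantiser levels."""
    rng = rng or np.random.default_rng(0)
    levels = np.linspace(signal.min(), signal.max(), 2 ** bit_depth)
    samples = []
    for i in range(1, len(signal)):
        a, b = signal[i - 1], signal[i]
        crossed = levels[(levels > min(a, b)) & (levels <= max(a, b))]
        for lev in crossed:
            # Linear interpolation of the crossing instant, optionally
            # perturbed by jitter to decorrelate the sampling intervals.
            tc = t[i - 1] + (t[i] - t[i - 1]) * (lev - a) / (b - a)
            samples.append((tc + jitter * rng.standard_normal(), lev))
    return np.array(samples)

t = np.linspace(0, 1, 10_000)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)
s = level_crossing_sample(x, t, bit_depth=4, jitter=1e-4)
print(f"{len(s)} LC samples vs {len(t)} uniform samples")
```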

Keywords: alias-free, level crossing sampling, spectrum, trigonometric polynomial

Procedia PDF Downloads 212
8857 Biogas Control: Methane Production Monitoring Using Arduino

Authors: W. Ait Ahmed, M. Aggour, M. Naciri

Abstract:

Extracting energy from biomass is an important alternative for producing different types of energy (heat, electricity, or both) with low pollution and better efficiency. It is a new yet reliable approach to reducing greenhouse gas emissions by extracting methane from industrial effluents and using it to power machinery. Our project focuses on paper mill effluents treated in a UASB reactor; the methane produced is used in the factory's power supply. The aim of this work is to develop an electronic system, based on the Arduino platform connected to a gas sensor, to measure and display the curve of daily methane production. The sensor sends the gas values in ppm to the Arduino board, which then transmits them over the RS232 serial link. The code developed in Processing transforms the values into a curve and displays it on the computer screen.
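
The authors display the curve with Processing; an analogous host-side sketch in Python using pyserial is shown below, with the serial port, baud rate, and line format all assumed:

```python
import serial                    # pyserial
import matplotlib.pyplot as plt

# Port name and baud rate are assumptions; adjust for your setup.
PORT, BAUD = "/dev/ttyUSB0", 9600

readings = []
with serial.Serial(PORT, BAUD, timeout=2) as link:
    for _ in range(300):                      # ~5 min at one reading/second
        line = link.readline().decode(errors="ignore").strip()
        if not line:
            continue
        try:
            readings.append(float(line))      # CH4 concentration in ppm
        except ValueError:
            pass                              # skip malformed lines

plt.plot(readings)
plt.xlabel("sample")
plt.ylabel("CH4 concentration (ppm)")
plt.title("Daily methane production (sketch)")
plt.show()
```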

Keywords: biogas, Arduino, processing, code, methane, gas sensor, program

Procedia PDF Downloads 324
8856 Morphological Processing of Punjabi Text for Sentiment Analysis of Farmer Suicides

Authors: Jaspreet Singh, Gurvinder Singh, Prabhsimran Singh, Rajinder Singh, Prithvipal Singh, Karanjeet Singh Kahlon, Ravinder Singh Sawhney

Abstract:

Morphological evaluation of Indian languages is one of the burgeoning fields in the area of Natural Language Processing (NLP). The evaluation of a language is an eminent task in the era of information retrieval and text mining, and the extraction and classification of knowledge from text can be exploited for sentiment analysis and morphological evaluation. This study coalesces morphological evaluation and sentiment analysis for the task of classifying farmer suicide cases reported in the Punjab state of India. The pre-processing of Punjabi text involves morphological evaluation and normalization of Punjabi word tokens, followed by the training of the proposed model using deep learning classification on Punjabi text extracted from online Punjabi news reports. The class-wise accuracies of sentiment prediction for the four negatively oriented classes of farmer suicide cases are 93.85%, 88.53%, 83.3%, and 95.45%, respectively. The overall accuracy of sentiment classification obtained using the proposed framework on 275 Punjabi text documents is found to be 90.29%.

Keywords: deep neural network, farmer suicides, morphological processing, punjabi text, sentiment analysis

Procedia PDF Downloads 328
8855 Mathematical Modeling and Optimization of Burnishing Parameters for 15NiCr6 Steel

Authors: Tarek Litim, Ouahiba Taamallah

Abstract:

The present paper investigates the effect of burnishing on the surface integrity of a component made of 15NiCr6 steel. A statistical study based on regression and Taguchi's design allowed the development of mathematical models to predict the output responses as a function of the technological parameters studied. The response surface methodology (RSM) revealed the simultaneous influence of the burnishing parameters and identified the optimal processing parameters. ANOVA validated the prediction models with determination coefficients of 90.60% and 92.41% for roughness and hardness, respectively. Furthermore, a multi-objective optimization identified a regime characterized by P = 10 kgf, i = 3 passes, and f = 0.074 mm/rev, which favours minimum roughness and maximum hardness. The result was validated by desirability values of 0.99 and 0.95 for roughness and hardness, respectively.
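
A minimal sketch of the Derringer-style desirability calculation behind such multi-objective validation; the response values and acceptable ranges below are illustrative, not the paper's fitted models:

```python
import numpy as np

def d_smaller_is_better(y, y_min, y_max):
    """Desirability for a response to minimise (e.g. roughness Ra)."""
    return np.clip((y_max - y) / (y_max - y_min), 0.0, 1.0)

def d_larger_is_better(y, y_min, y_max):
    """Desirability for a response to maximise (e.g. hardness HV)."""
    return np.clip((y - y_min) / (y_max - y_min), 0.0, 1.0)

# Hypothetical predicted responses at a candidate burnishing regime
ra, hv = 0.21, 412.0                       # roughness (um), hardness (HV)
d_ra = d_smaller_is_better(ra, 0.2, 1.2)
d_hv = d_larger_is_better(hv, 250.0, 420.0)

# Composite desirability: geometric mean of the individual desirabilities
D = (d_ra * d_hv) ** 0.5
print(f"d_Ra={d_ra:.3f}, d_HV={d_hv:.3f}, composite D={D:.3f}")
```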

Keywords: 15NiCr6 steel, burnishing, surface integrity, Taguchi, RSM, ANOVA

Procedia PDF Downloads 194
8854 Image Rotation Using an Augmented 2-Step Shear Transform

Authors: Hee-Choul Kwon, Heeyong Kwon

Abstract:

Image rotation is one of the main pre-processing steps in image processing and image pattern recognition. It is conventionally implemented as a rotation matrix multiplication, which requires many floating-point arithmetic operations and trigonometric calculations and therefore takes a long time to execute. There has thus been a need for a high-speed image rotation algorithm that avoids these two major time-consuming operations. Shear-based rotation avoids them, but the rotated image suffers from a drawback, i.e., distortions. We solved this problem using an augmented two-step shear transform. We compare the presented algorithm with conventional rotation on images of various sizes. Experimental results show that the presented algorithm is superior to conventional rotation.
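
For background, the classic decomposition R(θ) = Shx(−tan(θ/2)) · Shy(sin θ) · Shx(−tan(θ/2)) rotates an image using only row and column shifts; the sketch below implements this three-shear version (the paper's augmented two-step variant is not reproduced):

```python
import numpy as np

def shear_rotate(img, theta):
    """Rotate a 2-D image by three integer shears (classic decomposition)."""
    a = -np.tan(theta / 2.0)
    b = np.sin(theta)
    h, w = img.shape
    cy, cx = h // 2, w // 2

    def shear_x(im, k):                    # shift each row horizontally
        res = np.zeros_like(im)
        for y in range(h):
            res[y] = np.roll(im[y], int(round(k * (y - cy))))
        return res

    def shear_y(im, k):                    # shift each column vertically
        res = np.zeros_like(im)
        for x in range(w):
            res[:, x] = np.roll(im[:, x], int(round(k * (x - cx))))
        return res

    # x-shear, y-shear, x-shear: only shifts, no per-pixel trigonometry
    # and no interpolation.  np.roll wraps at the borders; a real
    # implementation would pad the image instead.
    return shear_x(shear_y(shear_x(img, a), b), a)

img = np.zeros((101, 101))
img[30:70, 45:55] = 1.0                    # a vertical bar
rotated = shear_rotate(img, np.deg2rad(30))
print(img.sum(), rotated.sum())            # pixel mass preserved by np.roll
```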

Keywords: high-speed rotation operation, image rotation, transform matrix, image processing, pattern recognition

Procedia PDF Downloads 278
8853 Implementation of an Image Processing System Using Artificial Intelligence for the Diagnosis of Malaria Disease

Authors: Mohammed Bnebaghdad, Feriel Betouche, Malika Semmani

Abstract:

Image processing has become more sophisticated over time due to technological advances, especially artificial intelligence (AI). Currently, AI image processing is used in many areas, including surveillance, industry, science, and medicine. In medical image processing, AI can help doctors diagnose diseases faster, with fewer mistakes and less effort. Among these diseases is malaria, which remains a major public health challenge in many parts of the world. It affects millions of people every year, particularly in tropical and subtropical regions, and its early detection is essential to prevent serious complications and reduce the burden of the disease. In this paper, we propose and implement a scheme based on AI image processing to enhance malaria diagnosis through automated analysis of blood smear images. The scheme is based on a convolutional neural network (CNN). We developed a model that classifies infected and uninfected single red cells using images available on Kaggle, as well as real blood smear images obtained from the Central Laboratory of Medical Biology EHS Laadi Flici (formerly El Kettar) in Algeria. The real images were segmented into individual cells using the watershed algorithm in order to match the images from the Kaggle dataset. The model was trained and tested, achieving 99% accuracy on the test set and 97% accuracy on new real images. This validates that the model performs well on new real images, although with slightly lower accuracy. Additionally, the model has been embedded in a Raspberry Pi 4, and a graphical user interface (GUI) was developed to visualize the malaria diagnostic results and facilitate user interaction.
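
A minimal Keras sketch of a binary red-cell classifier; the input size, layer widths, and training setup are assumptions, not the paper's exact architecture:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Minimal CNN sketch for infected/uninfected red-cell classification.
model = models.Sequential([
    layers.Input(shape=(64, 64, 3)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),     # infected vs uninfected
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()

# Training would use a folder of segmented single-cell images, e.g.:
# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "cells/", image_size=(64, 64), batch_size=32)
# model.fit(train_ds, epochs=10)
```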

Keywords: medical image processing, malaria parasite, classification, CNN, artificial intelligence

Procedia PDF Downloads 23
8852 Adoption of Digital Storytelling Tool to Teach 21st Century Skills by Malaysian Pre-service Teachers

Authors: Siti Aisyah binti Jumpaan

Abstract:

Integration of 21ˢᵗ century skills (PAK-21) made its way into the Malaysian curriculum when the Ministry of Education introduced its implementation in 2016. This study was conducted to explore pre-service teachers' readiness to integrate 21st century skills in the classroom via the digital storytelling (DST) method and to find gaps between theory and practice that can be integral to pre-service teachers' professional growth. A qualitative research method was used, involving six respondents selected through purposive sampling. Their interview responses and lesson plans were analysed using narrative analysis. The findings showed that pre-service teachers had a moderate level of readiness in integrating 21st century skills using DST. They demonstrated a high level of preparedness in writing their lesson plans, but the interviews revealed that they struggled with implementation due to several factors, such as a lack of technology and failure to obtain students' participation. This study further strengthens the case for a specialised curriculum for pre-service teachers on teaching 21st century skills via DST.

Keywords: digital storytelling, 21ˢᵗ century skills, preservice teachers, teacher training

Procedia PDF Downloads 93
8851 A Computer-Aided System for Tooth Shade Matching

Authors: Zuhal Kurt, Meral Kurt, Bilge T. Bal, Kemal Ozkan

Abstract:

Shade matching and reproduction is the most important element of success in prosthetic dentistry. Until recently, the shade matching procedure was carried out by the dentist's visual perception with the help of shade guides. Since many factors influence visual perception, tooth shade matching using visual devices (shade guides) is highly subjective and inconsistent. The subjective nature of this process has led to the development of instrumental devices: nowadays, colorimeters, spectrophotometers, spectroradiometers, and digital image analysis systems are used for instrumental shade selection. Instrumental devices have the advantage that readings are quantifiable and can be obtained more rapidly, simply, objectively, and precisely. However, these devices have noticeable drawbacks. For example, the translucent structure and irregular surfaces of teeth lead to measurement defects, and results acquired by devices with different measurement principles may be inconsistent. It is therefore necessary to search for new methods for the dental shade matching process. Meanwhile, a computer-aided device, the digital camera, has developed rapidly, and advances in image processing and computing have resulted in the extensive use of digital cameras for color imaging. This is a much cheaper approach than the use of traditional contact-type color measurement devices; digital cameras can take the place of contact-type instruments for shade selection and overcome their disadvantages. Images taken of teeth show the morphology and color texture of the teeth. A method proposed in recent years compares the color of shade tabs captured by a digital camera using color features, and it showed that visual and computer-aided shade matching systems should be used together. Most recent feature extraction techniques are based on shape description and do not use color information. However, color is mostly experienced as an essential property in depicting and extracting features from the objects in the world around us. When local feature descriptors are extended by concatenating a color descriptor with the shape descriptor, the resulting descriptor is effective for visual object recognition and classification tasks. Since the color descriptor used in combination with a shape descriptor does not need to contain any spatial information, local color histograms can be used; the local color histogram remains reliable under photometric changes, geometric changes, and variations in image quality. Color-based local feature extraction is therefore combined with the Scale-Invariant Feature Transform (SIFT) descriptor for shape description, yielding the state-of-the-art descriptor known as Color-SIFT, which is used in this study. Finally, the image feature vectors obtained from the quantization algorithm are fed to classifiers such as k-Nearest Neighbor (KNN), Naive Bayes, or Support Vector Machines (SVM) to determine the label(s) of the visual object category or matching; in this study, SVMs are used as the classifiers for color determination and shade matching. Experimental results of this method are compared with those of other recent studies, and it is concluded that the proposed method is a remarkable advance in computer-aided tooth shade determination.
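
A compact sketch of a Color-SIFT-style pipeline with an SVM, using mean-pooled SIFT descriptors instead of a learned codebook for brevity; image paths and labels are placeholders:

```python
import cv2
import numpy as np
from sklearn.svm import SVC

sift = cv2.SIFT_create()   # requires OpenCV >= 4.4

def color_sift_vector(bgr_img, bins=8):
    """Concatenate a shape part (mean SIFT descriptor) with a color part
    (local HS histogram) into one fixed-length feature vector."""
    gray = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2GRAY)
    _, desc = sift.detectAndCompute(gray, None)
    shape_part = desc.mean(axis=0) if desc is not None else np.zeros(128)
    hsv = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2HSV)
    color_part = cv2.calcHist([hsv], [0, 1], None, [bins, bins],
                              [0, 180, 0, 256]).flatten()
    color_part /= color_part.sum() + 1e-9
    return np.concatenate([shape_part, color_part])

# Hypothetical training loop over labelled shade-tab images:
# X = np.stack([color_sift_vector(cv2.imread(p)) for p in image_paths])
# clf = SVC(kernel="rbf").fit(X, shade_labels)
# shade = clf.predict(color_sift_vector(cv2.imread(test_path))[None])
```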

Keywords: classifiers, color determination, computer-aided system, tooth shade matching, feature extraction

Procedia PDF Downloads 448
8850 Identification of Landslide Features Using Back-Propagation Neural Network on LiDAR Digital Elevation Model

Authors: Chia-Hao Chang, Geng-Gui Wang, Jee-Cheng Wu

Abstract:

The prediction of a landslide is a difficult task because it requires a detailed study of past activity using a complete range of investigative methods to determine the changing conditions. In this research, a LiDAR digital elevation model (DEM) with 1-meter by 1-meter resolution was first used to generate six environmental factors related to landslides. Back-propagation neural networks (BPNN) were then adopted to identify scarps, landslide areas, and non-landslide areas. The BPNN takes the six environmental factors as its input layer and has one output layer; six landslide areas are used as training areas and four landslide areas as test areas. The number of hidden layers is set to 1 and 2; the numbers of hidden-layer neurons are set to 4, 5, 6, 7, and 8; and the learning rates are set to 0.01, 0.1, and 0.5. When using one hidden layer with 7 neurons and a learning rate of 0.5, the network training root mean square error is 0.001388. Finally, evaluation of the BPNN classification accuracy with a confusion matrix shows that the overall accuracy reaches 94.4%, with a Kappa value of 0.7464.
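
A sketch of the reported best configuration (one hidden layer, 7 neurons, learning rate 0.5) using scikit-learn's MLPClassifier on synthetic stand-in data, since the six real environmental factors are not available here:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix, cohen_kappa_score

rng = np.random.default_rng(0)

# Stand-in data: six environmental factors per DEM cell, three classes
# (0 = non-landslide, 1 = landslide body, 2 = scarp).  Synthetic here.
X_train = rng.standard_normal((600, 6))
y_train = rng.integers(0, 3, 600)
X_test = rng.standard_normal((200, 6))
y_test = rng.integers(0, 3, 200)

bpnn = MLPClassifier(hidden_layer_sizes=(7,), solver="sgd",
                     learning_rate_init=0.5, max_iter=2000, random_state=0)
bpnn.fit(X_train, y_train)

pred = bpnn.predict(X_test)
print(confusion_matrix(y_test, pred))
print("kappa:", cohen_kappa_score(y_test, pred))
```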

Keywords: digital elevation model, DEM, environmental factors, back-propagation neural network, BPNN, LiDAR

Procedia PDF Downloads 145
8849 Iterative Linear Quadratic Regulator (iLQR) vs LQR Controllers for Quadrotor Path Tracking

Authors: Wesam Jasim, Dongbing Gu

Abstract:

This paper presents an iterative linear quadratic regulator (iLQR) optimal control technique to solve the problem of quadrotor path tracking. The dynamic motion equations are represented using a unit quaternion representation and include some modelled aerodynamic effects as a nonlinear part. Simulation results prove the ability and effectiveness of iLQR to stabilize the quadrotor and successfully track different paths. They also show that the iLQR controller outperforms the LQR controller in terms of convergence speed and tracking error.
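
For reference, the LQR baseline on a linear stand-in model is a few lines; iLQR repeats essentially this backward Riccati pass around successively re-linearised nonlinear (quaternion) dynamics. All model values below are illustrative:

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# LQR on a double integrator, a stand-in for one translational axis
# of a quadrotor.
dt = 0.02
A = np.array([[1.0, dt], [0.0, 1.0]])     # position, velocity
B = np.array([[0.0], [dt]])
Q = np.diag([10.0, 1.0])                  # penalise tracking error
R = np.array([[0.1]])                     # penalise control effort

P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)   # optimal gain

# Track a step reference in position
x = np.array([[0.0], [0.0]])
ref = np.array([[1.0], [0.0]])
for _ in range(500):
    u = -K @ (x - ref)
    x = A @ x + B @ u
print("final state:", x.ravel())          # converges toward the reference
```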

Keywords: iLQR controller, optimal control, path tracking, quadrotor UAVs

Procedia PDF Downloads 448
8848 Monitoring Deforestation Using Remote Sensing and GIS

Authors: Tejaswi Agarwal, Amritansh Agarwal

Abstract:

Forest ecosystems play a very important role in the global carbon cycle, storing about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social, and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques over the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990–1995 cite 116,756 km²/yr globally. Remote sensing can prove to be a very useful tool for monitoring forests and the associated deforestation to a sufficient level of accuracy without the need to physically survey the forest areas, many of which are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured in digital format from the Indian Institute of Remote Sensing (IIRS), Dehradun. While procuring the satellite data, care was taken to ensure that the data were cloud-free and did not belong to the dry, leafless season. The Normalized Difference Vegetation Index (NDVI), defined as NDVI = (NIR − Red) / (NIR + Red), has been used as a numerical indicator of the reduction in ground biomass. After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals. With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories.
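
The NDVI computation itself is a one-liner per pixel; a small numpy sketch on synthetic bands (real inputs would be co-registered satellite rasters) is shown below:

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-9)   # epsilon avoids division by zero

# Illustrative reflectance bands (real inputs would be satellite rasters,
# e.g. read with a library such as rasterio).
rng = np.random.default_rng(0)
nir_band = rng.uniform(0.2, 0.6, size=(100, 100))
red_band = rng.uniform(0.05, 0.3, size=(100, 100))

index = ndvi(nir_band, red_band)
forest_mask = index > 0.4        # hypothetical dense-vegetation threshold
print(f"mean NDVI: {index.mean():.3f}, forest fraction: {forest_mask.mean():.1%}")
```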

Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection

Procedia PDF Downloads 1207
8847 Flood Mapping Using Height above the Nearest Drainage Model: A Case Study in Fredericton, NB, Canada

Authors: Morteza Esfandiari, Shabnam Jabari, Heather MacGrath, David Coleman

Abstract:

Flooding is a severe issue in many places in the world, including the city of Fredericton, New Brunswick, Canada. The downtown area of Fredericton is close to the Saint John River, which is susceptible to flooding around May every year. Recently, the frequency of flooding appears to have increased, especially after the downtown area and surrounding urban/agricultural lands were flooded in two consecutive years, 2018 and 2019. In order to have an explicit picture of flood extent and damage to affected areas, it is necessary to use either flood inundation modelling or satellite data. Owing to the contingent availability and weather dependency of optical satellites, and to the limited existing data given the high cost of hydrodynamic models, it is not always feasible to rely on these sources to generate quality flood maps during or after a catastrophe. Height Above the Nearest Drainage (HAND), a state-of-the-art topo-hydrological index, normalizes the height of a basin based on the relative elevation along the stream network and specifies the gravitational, or relative, drainage potential of an area. HAND is the relative height difference between the stream network and each cell of a Digital Terrain Model (DTM). The stream layer is produced through a multi-step, time-consuming process that does not always result in an optimal representation of the river centerline, depending on the topographic complexity of the region. HAND has been used in numerous case studies with quite acceptable, and sometimes unexpected, results because of natural and human-made features on the surface of the earth. Some of these features might disturb the generated model, and consequently, the model might not predict the flow simulation accurately. We propose to include a pre-existing stream layer generated by the Province of New Brunswick and to benefit from culvert maps to improve the water flow simulation and, accordingly, the accuracy of the HAND model. By considering these parameters in our processing, we were able to increase the accuracy of the model from nearly 74% to almost 92%. The improved model can be used for generating highly accurate flood maps, which are necessary for future urban planning and flood damage estimation, without any need for satellite imagery or hydrodynamic computations.
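
A toy sketch of the HAND idea using the Euclidean-nearest stream cell; true HAND traces flow paths to the nearest drainage, so this is only a quick approximation on an invented valley:

```python
import numpy as np
from scipy import ndimage

def hand_simplified(dtm, stream_mask):
    """Simplified HAND: height of each cell above its Euclidean-nearest
    stream cell (real HAND follows flow paths to the nearest drainage)."""
    # Indices of the nearest stream cell for every DTM cell
    _, (iy, ix) = ndimage.distance_transform_edt(~stream_mask,
                                                 return_indices=True)
    return dtm - dtm[iy, ix]

# Tiny synthetic valley: elevation rises 2 m per cell away from a stream
# running along column 10.
dtm = np.abs(np.arange(21) - 10)[None, :] * np.ones((15, 1)) * 2.0
stream = np.zeros_like(dtm, dtype=bool)
stream[:, 10] = True

hand = hand_simplified(dtm, stream)
flood_depth = 3.0                          # hypothetical water rise (m)
print("flooded cells:", int((hand < flood_depth).sum()))
```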

Keywords: HAND, DTM, rapid floodplain, simplified conceptual models

Procedia PDF Downloads 153
8846 Multi-Objective Optimization of Assembly Manufacturing Factory Setups

Authors: Andreas Lind, Aitor Iriondo Pascual, Dan Hogberg, Lars Hanson

Abstract:

Factory setup lifecycles are most often described and prepared in CAD environments; the preparation is based on experience and inputs from several cross-disciplinary processes. Early in factory setup preparation, a so-called block layout is created. The intention is to describe a high-level view of the intended factory setup and to claim area reservations and allocations. Factory areas are then blocked, i.e., targeted for specific intended resources and processes, and later redefined with detailed factory setup layouts. Each detailed layout is based on the block layout and inputs from cross-disciplinary preparation processes, such as manufacturing sequence, productivity, workers' workplace requirements, and resource setup preparation. However, this activity is often not carried out with all variables considered simultaneously, which entails a risk of sub-optimizing the detailed layout based on manual decisions. Therefore, this work aims to realize a digital method for assembly manufacturing layout planning in which productivity, area utilization, and ergonomics can be considered simultaneously in a cross-disciplinary manner. The purpose of the digital method is to support engineers in finding optimized designs of detailed layouts for assembly manufacturing factories, thereby facilitating better decisions regarding setups of future factories. Input datasets are company-specific descriptions of required dimensions for specific area reservations, such as defined dimensions of a worker's workplace, material façades, and aisles, together with the sequence required to realize the product assembly manufacturing process. To test and iteratively develop the digital method, a demonstrator has been developed by adapting existing software that simulates and proposes optimized designs of detailed layouts. Since the method must consider productivity, ergonomics, area utilization, and constraints from the automatically generated block layout, a multi-objective optimization approach is utilized. In the demonstrator, the input data are sent to the simulation software Industrial Path Solutions (IPS). Based on the input and Lua scripts, the IPS software generates a block layout in compliance with the company's defined dimensions of area reservations. Communication is then established between IPS and the Ergonomics in Productivity Platform (EPP) software, including intended resource descriptions, the assembly manufacturing process, and manikin (digital human) resources. Using multi-objective optimization approaches, the EPP software then calculates layout proposals that are iteratively sent, simulated, and rendered in IPS, following the rules and regulations defined in the block layout as well as productivity and ergonomics constraints and objectives. The software demonstrator is promising: it can handle several parameters simultaneously while optimizing the detailed layout and can put forward several proposals. It can optimize multiple parameters or weight the parameters to fine-tune the optimal result of the detailed layout. The intention of the demonstrator is to make the preparation between cross-disciplinary silos transparent and to achieve a common preparation of the assembly manufacturing factory setup, thereby facilitating better decisions.
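
A small sketch of the multi-objective core: filtering candidate layouts to the Pareto front and then picking one trade-off. All candidate scores and weights are invented, and the real demonstrator delegates this to the IPS/EPP tool chain:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layout candidates scored on three objectives, all to be
# minimised: cycle time (s), area used (m^2), and an ergonomic load score.
candidates = rng.uniform([50, 200, 1], [120, 600, 10], size=(200, 3))

def pareto_front(scores):
    """Return a mask of non-dominated candidates (all objectives minimised)."""
    n = len(scores)
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        if mask[i]:
            # i dominates j if it is <= everywhere and < somewhere
            dominated = (np.all(scores[i] <= scores, axis=1) &
                         np.any(scores[i] < scores, axis=1))
            mask &= ~dominated
    return mask

front = candidates[pareto_front(candidates)]
print(f"{len(front)} non-dominated layouts out of {len(candidates)}")

# A weighted sum can then pick one trade-off from the front:
weights = np.array([0.5, 0.2, 0.3])
best = front[np.argmin(front @ weights)]
print("selected layout scores:", best.round(1))
```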

Keywords: factory setup, multi-objective, optimization, simulation

Procedia PDF Downloads 153
8845 NDVI as a Measure of Change in Forest Biomass

Authors: Amritansh Agarwal, Tejaswi Agarwal

Abstract:

Forest ecosystems play a very important role in the global carbon cycle, storing about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social, and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques over the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990–1995 cite 116,756 km²/yr globally. Remote sensing can prove to be a very useful tool for monitoring forests and the associated deforestation to a sufficient level of accuracy without the need to physically survey the forest areas, many of which are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured in digital format from the USGS website. While procuring the satellite data, care was taken to ensure that the data were cloud- and aerosol-free by making use of the FLAASH atmospheric correction technique. The Normalized Difference Vegetation Index (NDVI), defined as NDVI = (NIR − Red) / (NIR + Red), has been used as a numerical indicator of the reduction in ground biomass. After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals. With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories.

Keywords: remote sensing, deforestation, supervised classification, NDVI change detection

Procedia PDF Downloads 402
8844 Short Text Classification Using Part of Speech Feature to Analyze Students' Feedback of Assessment Components

Authors: Zainab Mutlaq Ibrahim, Mohamed Bader-El-Den, Mihaela Cocea

Abstract:

Students' textual feedback can hold unique patterns and useful information about the learning process; it can hold information about the advantages and disadvantages of teaching methods, assessment components, facilities, and other aspects of teaching. The results of analysing such feedback can form a key input for institutions' decision makers to advance and update their systems accordingly. This paper proposes a data mining framework for analysing end-of-unit general textual feedback using the part-of-speech (PoS) feature with four machine learning algorithms: support vector machines, decision trees, random forests, and naive Bayes. The proposed framework has two tasks: first, to use the above algorithms to build an optimal model that automatically classifies the whole data set into two subsets, one tailored to assessment practices (assessment-related) and the other containing the non-assessment-related data; second, to use the same algorithms to build an optimal model, for the whole data set and the new data subsets, that automatically detects their sentiment. The significance of this paper lies in comparing the performance of the above four algorithms using the part-of-speech feature with the performance of the same algorithms using n-gram features. The paper follows the Knowledge Discovery and Data Mining (KDDM) framework to construct the classification and sentiment analysis models: understanding the assessment domain, cleaning and pre-processing the data set, selecting and running the data mining algorithms, interpreting the mined patterns, and consolidating the discovered knowledge. The experiments show that models using either feature performed very well on the first task. On the second task, however, models that used the part-of-speech feature underperformed in comparison with models that used unigram and bigram features.
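
A minimal sketch of the PoS-feature idea: each token is replaced by its part-of-speech tag before vectorisation, here with NLTK's English tagger and a linear SVM on a few invented feedback snippets (the study works on real student feedback):

```python
import nltk
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Resource names may vary slightly across NLTK versions.
nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

def pos_sequence(text):
    """Replace each token with its part-of-speech tag."""
    tags = nltk.pos_tag(nltk.word_tokenize(text))
    return " ".join(tag for _, tag in tags)

# Tiny illustrative feedback snippets (labels: 1 = assessment-related)
docs = ["The exam questions were far too long for the time given",
        "Lecture rooms were cold and the projector kept failing",
        "Coursework deadlines clashed with other modules",
        "The cafeteria food was great this term"]
labels = [1, 0, 1, 0]

pos_docs = [pos_sequence(d) for d in docs]
clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(pos_docs, labels)
print(clf.predict([pos_sequence("The quiz marking scheme felt unfair")]))
```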

Keywords: assessment, part of speech, sentiment analysis, student feedback

Procedia PDF Downloads 143
8843 Bridge Healthcare Access Gap with Artifical Intelligence

Authors: Moshmi Sangavarapu

Abstract:

The US healthcare industry has undergone tremendous digital transformation in recent years, but critical care access for lower-income ethnic groups is still in its nascency. This population has historically shown substantial hesitation to seek any medical assistance. While the lack of sufficient financial resources plays a critical role, existing cultural and knowledge barriers also contribute significantly to widening the access gap. It is imperative to break these barriers to ensure timely access to therapeutic procedures that can save lives. Based on ongoing research, healthcare access barriers can best be addressed by first tapping the untapped potential of caregiver communities, who play a critical role in patients' diagnoses, building healthcare knowledge, and instilling confidence in required therapeutic procedures. Recent technological advancements have opened many avenues for reaching the large caregiver community. A digitized go-to-market strategy featuring connected media, coupled with smart IoT devices and geo-location targeting, can be leveraged to reach this key audience group. AI/ML algorithms can be trained to identify relevant signals in users' location and browsing behavior and to determine useful marketing touchpoints. Web behavior can be further assimilated with natural language processing to identify contextually relevant interest topics and to find potential caregivers on digital avenues for serving the brand message. In conclusion, grasping the true healthcare access journey of any lower-income ethnic group is important in order to design beneficial touchpoints that can alleviate patients' concerns and allow them to break their own access barriers and opt for timely, quality healthcare.

Keywords: healthcare access, market access, diversity barriers, patient journey

Procedia PDF Downloads 56
8842 Performants: A Digital Event Manager-Organizer

Authors: Ioannis Andrianakis, Manolis Falelakis, Maria Pavlidou, Konstantinos Papakonstantinou, Ermioni Avramidou, Dimitrios Kalogiannis, Nikolaos Milios, Katerina Bountakidou, Kiriakos Chatzidimitriou, Panagiotis Panagiotopoulos

Abstract:

Artistic events, such as concerts and performances, are challenging to organize because they involve many people with different skill sets. Small and medium venues often struggle to afford the costs and overheads of booking and hosting remote artists, especially if they lack sponsors or subsidies. This limits the opportunities for both venues and artists, especially those outside of big cities. However, more and more research shows that audiences prefer smaller-scale events and concerts, which benefit local economies and communities. To address this challenge, our project “PerformAnts: Digital Event Manager-Organizer” aims to develop a smart digital tool that automates and optimizes the processes and costs of live shows and tours. By using machine learning, applying best practices and training users through workshops, our platform offers a comprehensive solution for a growing market, enhances the mobility of artists and the accessibility of venues and allows professionals to focus on the creative aspects of concert production.

Keywords: event organization, creative industries, event promotion, machine learning

Procedia PDF Downloads 87
8841 The Commercialization of eSports and the Emergence of Fan Hierarchies: Gender Dynamics, Emotional Engagement, and Community Tensions in Digital Fandom

Authors: Anwen Ren

Abstract:

This study explores the commercialization of eSports and its impact on fan hierarchies, focusing on gender dynamics and emotional engagement. Through mixed-methods research, it examines the divide between "traditional fans" and "groupie fans," highlighting how gendered stereotypes marginalize female fans. Using the case of professional eSports player Scout, the paper analyzes parasocial relationships and their role in the fan economy. The findings reveal the need for inclusivity in fan culture to address gender bias and optimize eSports' commercial potential. This work contributes to understanding the intersection of gender, representation, and digital fandoms.

Keywords: eSports, fan culture, gender dynamics, commercialization

Procedia PDF Downloads 14
8840 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring

Authors: Zheng Wang, Zhenhong Li, Jon Mills

Abstract:

Ground-based synthetic aperture radar (GBSAR) is a powerful remote sensing tool for deformation monitoring of various geohazards, e.g., landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise in processing high temporal-resolution continuous GBSAR data, including the extreme demand on random-access memory (RAM), the delay in producing displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline-subset concept and processes continuous GBSAR images unit by unit, where the images within a window form a basic unit. With this strategy, the RAM requirement is reduced to a single unit of images, and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected because the chain keeps temporarily-coherent pixels that are present only in certain units rather than over the whole observation period. The chain supports real-time processing of the continuous data, and the delay in creating displacement maps can be shortened because there is no need to wait for the entire dataset. The other chain aims to measure deformation between discontinuous campaigns. Temporal averaging is carried out on the stack of images from a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence. The temporally-averaged images are then processed by a dedicated interferometry procedure integrating advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and selection of partially-coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the sub-millimetre level are achieved in several applications (e.g., a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring in a wide range of scientific and practical applications.
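
A small sketch of one building block, windowed coherence estimation between two co-registered complex images; the window size and synthetic data are illustrative, not the package's robust estimator:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def coherence(slc1, slc2, win=5):
    """Sample coherence magnitude between two co-registered complex
    (single-look) images, estimated over a moving window."""
    # uniform_filter works on real arrays, so filter real and imaginary
    # parts of the interferogram separately.
    ifg = slc1 * np.conj(slc2)
    num = uniform_filter(ifg.real, win) + 1j * uniform_filter(ifg.imag, win)
    den = np.sqrt(uniform_filter(np.abs(slc1) ** 2, win) *
                  uniform_filter(np.abs(slc2) ** 2, win))
    return np.abs(num) / (den + 1e-12)

rng = np.random.default_rng(0)
shape = (200, 200)
base = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)
noise = 0.3 * (rng.standard_normal(shape) + 1j * rng.standard_normal(shape))
slc_a, slc_b = base, base + noise          # a mostly coherent image pair

gamma = coherence(slc_a, slc_b)
print("mean coherence:", gamma.mean().round(3))
mask = gamma > 0.7                         # keep coherent pixels downstream
print("coherent fraction:", mask.mean().round(3))
```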

Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring

Procedia PDF Downloads 163