Search results for: sub-pixel accuracy
1011 Applications of Out-of-Sequence Thrust Movement for Earthquake Mitigation: A Review
Authors: Rajkumar Ghosh
Abstract:
The study presents an overview of the many uses of, and approaches to, estimating out-of-sequence thrust movement for earthquake mitigation. It investigates how understanding and forecasting thrust movement during seismic events can contribute to effective earthquake mitigation measures. The review begins by discussing out-of-sequence thrust movement and its importance in earthquake mitigation strategies. It explores how typical techniques for estimating thrust movement may not capture the full complexity of seismic occurrences and emphasizes the benefits of including out-of-sequence data in the analysis. A thorough review of existing research and studies on out-of-sequence thrust movement estimation for earthquake mitigation is provided. The study demonstrates how to estimate out-of-sequence thrust movement using multiple data sources such as GPS measurements, satellite imagery, and seismic recordings. It also examines the use of out-of-sequence thrust movement estimates in earthquake mitigation measures, investigating how precise calculation of thrust movement may help improve structural design, analyse infrastructure risk, and develop early warning systems, and highlighting the potential advantages of using out-of-sequence data in these applications to improve the efficiency of earthquake mitigation techniques. The difficulties and limits of estimating out-of-sequence thrust movement for earthquake mitigation are also discussed, including data quality difficulties, modelling uncertainties, and computational complications. To address these obstacles and increase the accuracy and reliability of out-of-sequence thrust movement estimates, the authors recommend topics for further study and improvement. The study is a helpful resource for researchers, engineers, and policymakers in seismic monitoring and earthquake risk assessment, supporting innovations in earthquake mitigation measures based on a better knowledge of thrust movement dynamics.
Keywords: earthquake mitigation, out-of-sequence thrust, satellite imagery, seismic recordings, GPS measurements
Procedia PDF Downloads 90
1010 Automatic Detection and Filtering of Negative Emotion-Bearing Contents from Social Media in Amharic Using Sentiment Analysis and Deep Learning Methods
Authors: Derejaw Lake Melie, Alemu Kumlachew Tegegne
Abstract:
The increasing prevalence of social media in Ethiopia has exacerbated societal challenges by fostering the proliferation of negative emotional posts and comments. Illicit use of social media has further deepened divisions among the population. Addressing these issues through manual identification and aggregation of emotions from millions of users for swift decision-making poses significant challenges, particularly given the rapid growth of Amharic language usage on social platforms. Consequently, there is a critical need to develop an intelligent system capable of automatically detecting and categorizing negative emotional content into social, religious, and political categories while also filtering out toxic online content. This paper leverages sentiment analysis techniques to achieve automatic detection and filtering of negative emotional content from Amharic social media texts, employing a comparative study of deep learning algorithms. The study utilized a dataset comprising 29,962 comments collected from social media platforms using comment-exporter software. Data pre-processing techniques were applied to enhance data quality, followed by the implementation of deep learning methods for training, testing, and evaluation. The results showed that CNN, GRU, LSTM, and Bi-LSTM classification models achieved accuracies of 83%, 50%, 84%, and 86%, respectively; Bi-LSTM thus demonstrated the highest accuracy in the experiment.
Keywords: negative emotion, emotion detection, social media filtering, sentiment analysis, deep learning
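A minimal Keras sketch of a Bi-LSTM text classifier of the kind this abstract reports as best-performing; the layer sizes, vocabulary limit, and the tiny inline placeholder texts standing in for Amharic comments are illustrative assumptions, not the study's configuration or data.

```python
# Sketch of a binary Bi-LSTM filter (negative emotion-bearing vs. not); all
# hyperparameters below are assumptions for illustration.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, LSTM, Dense
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

texts = ["example negative comment", "example neutral comment"]  # stand-ins for Amharic comments
labels = np.array([1, 0])  # 1 = negative emotion-bearing, 0 = otherwise

tokenizer = Tokenizer(num_words=5000)
tokenizer.fit_on_texts(texts)
x = pad_sequences(tokenizer.texts_to_sequences(texts), maxlen=50)

model = Sequential([
    Embedding(input_dim=5000, output_dim=64),  # learned word embeddings
    Bidirectional(LSTM(64)),                   # reads each comment in both directions
    Dense(1, activation="sigmoid"),            # binary: negative vs. non-negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, labels, epochs=2, verbose=0)
```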
Procedia PDF Downloads 40
1009 Design Study on a Contactless Material Feeding Device for Electro Conductive Workpieces
Authors: Oliver Commichau, Richard Krimm, Bernd-Arno Behrens
Abstract:
A growing demand on the production rate of modern presses leads to higher stroke rates. Commonly used material feeding devices for presses, such as grippers and roll-feeding systems, can only achieve high stroke rates along with high gripper forces to avoid stick-slip. These forces are limited by the sensitivity of the workpiece surfaces: stick-slip leads to scratches on the surface and false positioning of the workpiece. In this paper, a new contactless feeding device is presented, which develops a higher feeding force without damaging the surface of the workpiece through gripping forces. It is based on the principle of the linear induction motor. A primary part creates a magnetic field and induces eddy currents in the electrically conductive material. A Lorentz force acts on the workpiece in the feeding direction as a mutual reaction between the eddy currents and the magnetic induction. In this study, the FEA model of this approach is shown. The calculation of this model was used to identify the influence of various design parameters on the performance of the feeder, thus showing the promising capabilities and limits of this technology. In order to validate the study, a prototype of the feeding device has been built. An experimental setup was used to measure pulling forces and placement accuracy of the experimental feeder in order to give an outlook on a potential industrial application of this approach.
Keywords: conductive material, contactless feeding, linear induction, Lorentz-Force
Procedia PDF Downloads 183
1008 Assessing NYC's Single-Family Housing Typology for Urban Heat Vulnerability and Occupants’ Health Risk under the Climate Change Emergency
Authors: Eleni Stefania Kalapoda
Abstract:
Recurring heat waves due to the global climate change emergency pose continuous risks to human health and urban resources. Local and state decision-makers incorporate Heat Vulnerability Indices (HVIs) to quantify and map the relative impact on human health in emergencies. These maps enable government officials to identify the highest-risk districts and to concentrate emergency planning efforts and available resources accordingly (e.g., to reevaluate the location and the number of heat-relief centers). Even though the framework for conducting an HVI is unique to each municipality, its accuracy in assessing heat risk is limited. To resolve this issue, varied housing-related metrics should be included. This paper quantifies and classifies NYC’s single detached housing typology within highly vulnerable NYC districts using detailed energy simulations and post-processing calculations. The results show that the variation in indoor heat risk depends significantly on the dwelling’s design and operation characteristics, concluding that low-ventilated dwellings are the most vulnerable ones. They also confirm that when building-level determinants of exposure are excluded from the assessment, the HVI fails to capture important components of heat vulnerability. Lastly, the overall vulnerability ratio of the housing units was calculated to be between 0.11 and 1.6 indoor heat degrees in terms of ventilation and shading capacity, insulation degree, and other building attributes.
Keywords: heat vulnerability index, energy efficiency, urban heat, resiliency to heat, climate adaptation, climate mitigation, building energy
Procedia PDF Downloads 86
1007 New Off-Line SPE-GC-MS/MS Method for Determination of Mineral Oil Saturated Hydrocarbons/Mineral Oil Hydrocarbons in Animal Feed, Foods, Infant Formula and Vegetable Oils
Authors: Ovanes Chakoyan
Abstract:
MOH (mineral oil hydrocarbons), which consist of mineral oil saturated hydrocarbons (MOSH) and mineral oil aromatic hydrocarbons (MOAH), are present in various products such as vegetable oils, animal feed, foods, and infant formula. Contamination of foods with mineral oil hydrocarbons is of particular concern because MOAH exhibit carcinogenic, mutagenic, and hormone-disruptive effects. Identifying toxic substances among the many thousands comprising mineral oils in food samples is a difficult analytical challenge. A method based on an offline solid phase extraction approach coupled with gas chromatography-triple quadrupole mass spectrometry (GC-MS/MS) was developed for the determination of MOSH/MOAH in various products such as vegetable oils, animal feed, foods, and infant formula. A glass solid phase extraction cartridge was loaded with 7 g of activated silica gel impregnated with 10% silver nitrate for removal of olefins and lipids. The MOSH and MOAH fractions were eluted with hexane and hexane:dichloromethane:toluene, respectively. Each eluate was concentrated to 50 µl in toluene and injected in splitless mode into the GC-MS/MS. Accuracy of the method was estimated by measuring the recovery of spiked oil samples at 2.0, 15.0, and 30.0 mg kg⁻¹; recoveries varied from 85 to 105%. The method was applied to different types of samples (sunflower meal, chocolate chips, Santa milk chocolate, biscuits, infant milk, cornflakes, refined sunflower oil, crude sunflower oil), detecting MOSH up to 56 mg/kg and MOAH up to 5 mg/kg. The limit of quantification (LOQ) of the proposed method was estimated at 0.5 mg/kg and 0.3 mg/kg for MOSH and MOAH, respectively.
Keywords: MOSH, MOAH, GC-MS/MS, foods, solid phase extraction
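The recovery check described here is a simple ratio of measured to spiked concentration; a short sketch, with hypothetical measured values rather than the paper's raw data:

```python
# Recovery of spiked oil samples at the three levels named above; the measured
# values are invented stand-ins for illustration.
spiked = {2.0: 1.8, 15.0: 14.1, 30.0: 30.9}  # spike level -> measured MOSH, mg/kg
for level, measured in spiked.items():
    recovery = 100.0 * measured / level
    print(f"spike {level:5.1f} mg/kg -> recovery {recovery:5.1f} %")  # paper: 85-105 %
```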
Procedia PDF Downloads 96
1006 Analytical Study and Conservation Processes of a Wooden Coffin of Middle Kingdom, Ancient Egypt
Authors: Mohamed Ahmed Abd El Kader
Abstract:
This paper describes the conservation processes of an ancient Egyptian wooden coffin dating back to the Middle Kingdom, using several scientific and analytical methods in order to provide a deeper understanding of the deterioration status and a greater awareness of how well preserved the object is. Visual observation and 2D programs, as well as Optical Microscopy (OM), Environmental Scanning Electron Microscopy (ESEM), X-ray Diffraction (XRD) and Fourier Transform Infrared Spectroscopy (FTIR), were used in our study. The wood species were identified, and the composition of the pigments and previous restoration materials was determined. The coffin had previously been conserved and stored in improper conditions, which led to its further deterioration; the surface of the lid was covered with dust, which obscured the decorations. All necessary restoration work was promptly carried out as soon as the coffin was transferred from the display hall of the Egyptian Museum to the Wood Conservation Laboratory of the Grand Egyptian Museum-Conservation Center (GEM-CC). The analyses provided detailed information concerning the original materials and the materials added during previous treatment interventions, which was considered when formulating the conservation plan. Conservation procedures were applied with high accuracy, including cleaning, consolidation of fragile painted layers, and reassembly of the wooden boards forming the sides of the coffin in their original positions. The materials and methods applied were extremely effective in stabilizing and reinforcing the coffin without harming the original materials, and the coffin was successfully conserved and is ready for display in the Grand Egyptian Museum (GEM).
Keywords: coffin, Middle Kingdom, deterioration, 2D program
Procedia PDF Downloads 57
1005 Opinion Mining to Extract Community Emotions on Covid-19 Immunization Possible Side Effects
Authors: Yahya Almurtadha, Mukhtar Ghaleb, Ahmed M. Shamsan Saleh
Abstract:
The world witnessed a fierce attack from the Covid-19 virus, which affected public life socially, economically, healthily and psychologically. The world's governments tried to confront the pandemic by imposing a number of precautionary measures such as general closure, curfews and social distancing. Scientists also made strenuous efforts to develop an effective vaccine to train the immune system to develop antibodies to combat the virus, thus reducing its symptoms and limiting its spread. Artificial intelligence, along with researchers and medical authorities, accelerated the vaccine development process through big data processing and simulation. On the other hand, one of the most important negative impacts of Covid-19 was the state of anxiety and fear caused by the spread of rumors through social media, which prompted governments to try to reassure the public with the available means. This study proposes using sentiment analysis (also known as opinion mining) and deep learning as efficient artificial intelligence techniques to retrieve public tweets from Twitter and then analyze them automatically to extract opinions, expressions and feelings, negative or positive, about the symptoms people may feel after vaccination. Sentiment analysis is characterized by its ability to access what the public posts on social media in record time and at a lower cost than traditional means such as questionnaires and interviews, not to mention the accuracy of the information, as it comes from what the public expresses voluntarily.
Keywords: deep learning, opinion mining, natural language processing, sentiment analysis
Procedia PDF Downloads 175
1004 A Novel Hybrid Deep Learning Architecture for Predicting Acute Kidney Injury Using Patient Record Data and Ultrasound Kidney Images
Authors: Sophia Shi
Abstract:
Acute kidney injury (AKI) is the sudden onset of kidney damage in which the kidneys cannot filter waste from the blood, requiring emergency hospitalization. AKI patient mortality is high in the ICU, and the condition is virtually impossible for doctors to predict because it is so unexpected. Currently, there is no hybrid model for predicting AKI that takes advantage of two types of data. De-identified patient data from the MIMIC-III database and de-identified kidney images with corresponding patient records from the Beijing Hospital of the Ministry of Health were collected. Using data features including serum creatinine, among others, two numeric models using MIMIC and Beijing Hospital data were built, and with the hospital ultrasounds, an image-only model was built. Convolutional neural networks (CNNs) were used, VGG and ResNet for numeric data and ResNet for image data, and they were combined into a hybrid model by concatenating the feature maps of both types of models to create a new input. This input enters another CNN block and then two fully connected layers, ending in a binary output after a Softmax layer and additional post-processing. The hybrid model successfully predicted AKI: the highest AUROC of the model was 0.953, with an accuracy of 90% and an F1-score of 0.91. This model could be implemented in urgent clinical settings such as the ICU and aid doctors by assessing the risk of AKI shortly after the patient's admission, so that doctors can take preventative measures and diminish mortality risks and severe kidney damage.
Keywords: acute kidney injury, convolutional neural network, hybrid deep learning, patient record data, ResNet, ultrasound kidney images, VGG
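A compact sketch of the fusion step this abstract describes: features from an image branch and a patient-record branch are concatenated and passed through further layers to a binary output. The small backbones and dimensions below are assumed stand-ins, not the paper's VGG/ResNet configuration.

```python
import torch
import torch.nn as nn

class HybridAKINet(nn.Module):
    def __init__(self, n_record_features=20):
        super().__init__()
        self.image_branch = nn.Sequential(          # stand-in for a ResNet backbone
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),  # -> 16-dim image feature
        )
        self.record_branch = nn.Sequential(         # stand-in for the numeric model
            nn.Linear(n_record_features, 16), nn.ReLU(),
        )
        self.head = nn.Sequential(                  # fused features -> AKI / no-AKI logits
            nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 2),
        )

    def forward(self, image, record):
        # the concatenation of the two feature vectors is the "new input"
        fused = torch.cat([self.image_branch(image), self.record_branch(record)], dim=1)
        return self.head(fused)

logits = HybridAKINet()(torch.randn(4, 1, 64, 64), torch.randn(4, 20))
print(logits.shape)  # (4, 2): softmax over these gives the binary prediction
```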
Procedia PDF Downloads 135
1003 Time-Domain Analysis Approaches of Soil-Structure Interaction: A Comparative Study
Authors: Abdelrahman Taha, Niloofar Malekghaini, Hamed Ebrahimian, Ramin Motamed
Abstract:
This paper compares the substructure and direct methods for soil-structure interaction (SSI) analysis in the time domain. In the substructure SSI method, the soil domain is replaced by a set of springs and dashpots, also referred to as the impedance function, derived through the study of the behavior of a massless rigid foundation. The impedance function is inherently frequency dependent, i.e., it varies as a function of the frequency content of the structural response. To use the frequency-dependent impedance function for time-domain SSI analysis, the impedance function is approximated at the fundamental frequency of the structure-soil system. To explore the potential limitations of the substructure modeling process, a two-dimensional reinforced concrete frame structure is modeled using substructure and direct methods in this study. The results show discrepancies between the simulated responses of the substructure and the direct approaches. To isolate the effects of higher modal responses, the same study is repeated using a harmonic input motion, in which a similar discrepancy is still observed between the substructure and direct approaches. It is concluded that the main source of discrepancy between the substructure and direct SSI approaches is likely attributed to the way the impedance functions are calculated, i.e., assuming a massless rigid foundation without considering the presence of the superstructure. Hence, a refined impedance function, considering the presence of the superstructure, shall be developed. This refined impedance function is expected to significantly improve the simulation accuracy of the substructure approach for structural systems whose behavior is dominated by the fundamental mode response.
Keywords: direct approach, impedance function, soil-structure interaction, substructure approach
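A small sketch of the substructure step discussed here: evaluating a frequency-dependent impedance K(ω) at the fundamental frequency and converting it to an equivalent spring-dashpot pair for the time-domain model. The functional form of K(ω) and all values are generic illustrations, not the paper's foundation model.

```python
import numpy as np

def impedance(omega, k0=5.0e8, c0=2.0e6, m_f=1.0e5):
    # generic complex dynamic impedance: static spring + radiation damping - inertia
    return (k0 - m_f * omega**2) + 1j * c0 * omega

f1 = 2.5                     # fundamental frequency of the structure-soil system, Hz (assumed)
omega1 = 2 * np.pi * f1
K1 = impedance(omega1)

k_eq = K1.real               # equivalent spring constant, N/m
c_eq = K1.imag / omega1      # equivalent dashpot coefficient, N.s/m
print(f"k_eq = {k_eq:.3e} N/m, c_eq = {c_eq:.3e} N.s/m")
```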
Procedia PDF Downloads 122
1002 Hybrid Approach for Face Recognition Combining Gabor Wavelet and Linear Discriminant Analysis
Authors: A. Annis Fathima, V. Vaidehi, S. Ajitha
Abstract:
Face recognition systems find many applications in surveillance and human-computer interaction systems. As the applications using face recognition are of much importance and demand more accuracy, more robustness is expected in the face recognition system, with less computation time. In this paper, a hybrid approach for face recognition combining Gabor Wavelet and Linear Discriminant Analysis (HGWLDA) is proposed. The normalized input grayscale image is approximated and reduced in dimension to lower the processing overhead for the Gabor filters. This image is convolved with a bank of Gabor filters with varying scales and orientations. LDA, a subspace analysis technique, is used to reduce the intra-class space and maximize the inter-class space. The variants used are 2-dimensional Linear Discriminant Analysis (2D-LDA), 2-dimensional bidirectional LDA ((2D)2LDA), and weighted 2-dimensional bidirectional Linear Discriminant Analysis (Wt(2D)2LDA). LDA reduces the feature dimension by extracting the features with greater variance. A k-Nearest Neighbour (k-NN) classifier is used to classify and recognize the test image by comparing its features with each of the training set features. The HGWLDA approach is robust to varying illumination conditions, as the Gabor features are illumination invariant. This approach also aims at a better recognition rate using fewer features for varying expressions. The performance of the proposed HGWLDA approach is evaluated using the AT&T database, the MIT-India face database and the faces94 database. It is found that the proposed HGWLDA approach provides better results than the existing Gabor approach.
Keywords: face recognition, Gabor wavelet, LDA, k-NN classifier
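A minimal sketch of the pipeline shape described above: Gabor features, an LDA projection, then k-NN. It uses scikit-image Gabor filters and the standard 1-D LDA from scikit-learn rather than the paper's 2D-LDA variants, and random arrays in place of face databases.

```python
import numpy as np
from skimage.filters import gabor
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
images = rng.random((20, 32, 32))          # stand-ins for grayscale face images
labels = np.repeat(np.arange(4), 5)        # 4 subjects, 5 images each

def gabor_features(img):
    feats = []
    for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):  # 4 orientations
        for frequency in (0.1, 0.3):                        # 2 scales
            real, _ = gabor(img, frequency=frequency, theta=theta)
            feats += [real.mean(), real.var()]              # simple per-filter statistics
    return feats

X = np.array([gabor_features(im) for im in images])
X = LinearDiscriminantAnalysis(n_components=3).fit_transform(X, labels)  # maximize class separation
clf = KNeighborsClassifier(n_neighbors=1).fit(X, labels)
print(clf.predict(X[:5]))
```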
Procedia PDF Downloads 470
1001 Analysis of Vibration of Thin-Walled Parts During Milling Made of EN AW-7075 Alloy
Authors: Jakub Czyżycki, Paweł Twardowski
Abstract:
Thin-walled components made of aluminum alloys are increasingly found in many fields of industry, and they dominate the aerospace industry. The machining of thin-walled structures encounters many difficulties related to the high flexibility of the workpiece, which causes vibrations, including the most unfavorable kind, called chatter. The effect of these phenomena is difficulty in obtaining the required geometric dimensions and surface quality. The purpose of this study is to analyze vibrations arising during machining of thin-walled workpieces made of aluminum alloy EN AW-7075. Samples representing actual thin-walled workpieces were examined over a range of dimensions characteristic of thin-walled workpieces. The tests were carried out under high-speed machining (HSM) conditions (cutting speed vc = 1400 m/min) using a monolithic solid carbide end mill. Vibration was measured using a single-component piezoelectric accelerometer 4508C from Brüel & Kjær, mounted directly on the sample before machining; the measurement was made in the normal feed direction AfN. In addition, the natural frequency of the tested thin-walled components was investigated using a laser vibrometer for a broader analysis of the tested samples. The effect of vibrations on machining accuracy was presented in the form of surface images taken with an optical measuring device from Alicona. The vibrations produced during the tests were classified and analyzed in both the time and frequency domains. A significant influence of the thickness of the thin-walled component on the course of vibrations during machining was observed.
Keywords: high-speed machining, thin-walled elements, thin-walled components, milling, vibrations
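A short sketch of the time/frequency-domain analysis step: the amplitude spectrum of an acceleration signal reveals its dominant component. The sampling rate and the synthetic 900 Hz signal are assumptions, not the study's measurements.

```python
import numpy as np

fs = 25_600                                  # sampling rate, Hz (assumed)
t = np.arange(0, 0.5, 1 / fs)
# stand-in for the AfN acceleration signal: one tonal component plus noise
a = np.sin(2 * np.pi * 900 * t) + 0.3 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(a)) / t.size   # one-sided amplitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print(f"dominant component: {freqs[spectrum.argmax()]:.0f} Hz")  # ~900 Hz
```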
Procedia PDF Downloads 62
1000 The Concept of Accounting in Islamic Transactions
Authors: Ahmad Abdulkadir Ibrahim
Abstract:
The Islamic law of transactions laid down the methods and instruments of accounting and analyzed its basic assumptions in the modern world. There is a need to examine the implications of accounting initiatives in the Muslim world and attempt to outline the important characteristics of Islamic accounting, and how Islamic accounting resolves the problem of measuring the cost of Murabaha goods in case of exchange rate variation. The research discusses an analytical approach to the Islamic accounting concept as well as elaborating the jurisprudential matter and practical aspects of accounting in Islamic financial transactions. It also aims to alert practitioners of accounting in the Islamic world to the concept of accounting in Islamic jurisprudence and its historical development. The methodology adopted in this research is the qualitative method, through the consultation of relevant literature, focusing on a thematic study of the subject matter, followed by an analysis and discussion of the contents of the materials used. It is concluded that Islamic accounting is unique in its norms, as it has been characterized by fairness, accuracy in measuring tools, truthfulness, mutual trust, moderation in making a profit, and tolerance. It was also marked by capacity and flexibility in the tools and terminology used and invented by Islamic jurisprudence in the accounting system, which indicates its validity and consistency at any time and in any place. An important conclusion of the research also lies in the refutation of the popular idea that the Italian writer Luca Pacioli was the first to develop the basis of double-entry bookkeeping, given the proofs presented by Muslim scholars of earlier critical accounting developments, which cannot be ignored. It concludes further that Islamic jurisprudence draws an accounting system codified in the foundations of a market that is far from usury, fraud, cheating, and unfair competition in all areas.
Keywords: accounting, Islamic accounting, Islamic transactions, Islamic jurisprudence, double entry, Murabaha, characteristics
Procedia PDF Downloads 67
999 Monitoring Urban Green Space Cover Change Using GIS and Remote Sensing in Two Rapidly Urbanizing Cities, Debre Berhan and Debre Markos, Ethiopia
Authors: Alemaw Kefale, Aramde Fetene, Hayal Desta
Abstract:
Monitoring the amount of green space in urban areas is important for ensuring sustainable development and proper management. This study analyzed changes in urban green space coverage over the past 20 years in two rapidly urbanizing cities in Ethiopia, Debre Berhan and Debre Markos, using GIS and remote sensing. The researchers used Landsat 5 and 8 data with a spatial resolution of 30 m to determine different land use and land cover classes, including urban green spaces, barren and croplands, built-up areas, and water bodies. The classification accuracy ranged between 90% and 91.4%, with a Kappa statistic of 0.85 to 0.88. The results showed that both cities experienced significant decreases in vegetation cover in their urban cores between 2000 and 2020, with radical changes observed from green spaces and croplands to built-up areas. In Debre Berhan, barren and croplands decreased by 32.96%, while built-up areas and green spaces increased by 357.9% and 37.4%, respectively, in 2020. In Debre Markos, built-up areas increased by 224.2%, while green spaces and barren and croplands decreased by 41% and 5.71%, respectively. The spatial structure of the cities and their planning policies were identified as the major factors behind the large green cover change, which has implications for other rapidly urbanizing cities in Africa and Asia. Overall, rapid urbanization threatens green spaces and agricultural areas, highlighting the need for ecology-based spatial planning in rapidly urbanizing cities.
Keywords: green space coverage, GIS and remote sensing, Landsat, LULC, Ethiopia
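The accuracy figures quoted here (overall accuracy and Kappa) are standard classification metrics; a minimal sketch with invented toy labels rather than the study's reference data:

```python
from sklearn.metrics import accuracy_score, cohen_kappa_score

# per-pixel (or per-sample) land-cover labels: reference vs. classified
reference = ["green", "built-up", "cropland", "green", "water", "built-up"]
predicted = ["green", "built-up", "green",    "green", "water", "built-up"]

print("overall accuracy:", accuracy_score(reference, predicted))    # study: 0.90-0.914
print("kappa statistic :", cohen_kappa_score(reference, predicted)) # study: 0.85-0.88
```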
Procedia PDF Downloads 66
998 Performance Analysis of New Types of Reference Targets Based on Spaceborne and Airborne SAR Data
Authors: Y. S. Zhou, C. R. Li, L. L. Tang, C. X. Gao, D. J. Wang, Y. Y. Guo
Abstract:
The triangular trihedral corner reflector (CR) has been widely used as a point target for synthetic aperture radar (SAR) calibration and image quality assessment. The additional “tip” of the triangular plate does not contribute to the reflector’s theoretical RCS, and if it interacts with a perfectly reflecting ground plane, it will yield an increase of RCS at the radar bore-sight and decrease the accuracy of SAR calibration and image quality assessment. To address this problem, two types of CRs were manufactured. One was the hexagonal trihedral CR, a self-illuminating CR with relatively small plate edge length, whereas a large edge length usually introduces unexpected edge diffraction error. The other was the triangular trihedral CR with an extended bottom plate, which incorporates the effect of the “tip” into the total RCS. In order to assess the performance of the two new types of CRs, a flight campaign over the National Calibration and Validation Site for High Resolution Remote Sensors was carried out. Six hexagonal trihedral CRs and two bottom-extended trihedral CRs, as well as several traditional triangular trihedral CRs, were deployed. A KOMPSAT-5 X-band SAR image was acquired for the performance analysis of the hexagonal trihedral CRs, and C-band airborne SAR images were acquired for the performance analysis of the bottom-extended trihedral CRs. The analysis results showed that the impulse response functions of both the hexagonal trihedral CRs and the bottom-extended trihedral CRs were much closer to the ideal sinc function than those of the traditional triangular trihedral CRs. The flight campaign results validated the advantages of the new types of CRs, which might be useful in future SAR calibration missions.
Keywords: synthetic aperture radar, calibration, corner reflector, KOMPSAT-5
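For context, the theoretical peak RCS of a triangular trihedral CR that calibration work of this kind relies on is the standard relation σ = 4πa⁴/(3λ²), with a the inner leg length; the leg length and wavelength below are illustrative values, not those of the deployed reflectors.

```python
import math

a = 0.7                      # inner leg length, m (illustrative)
wavelength = 0.031           # X-band (~9.7 GHz), m
sigma = 4 * math.pi * a**4 / (3 * wavelength**2)   # peak theoretical RCS
print(f"peak RCS = {sigma:.1f} m^2 = {10 * math.log10(sigma):.1f} dBsm")
```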
Procedia PDF Downloads 277
997 An IoT-Enabled Crop Recommendation System Utilizing Message Queuing Telemetry Transport (MQTT) for Efficient Data Transmission to AI/ML Models
Authors: Prashansa Singh, Rohit Bajaj, Manjot Kaur
Abstract:
In the modern agricultural landscape, precision farming has emerged as a pivotal strategy for enhancing crop yield and optimizing resource utilization. This paper introduces an innovative Crop Recommendation System (CRS) that leverages Internet of Things (IoT) technology and the Message Queuing Telemetry Transport (MQTT) protocol to collect critical environmental and soil data via sensors deployed across agricultural fields. The system is designed to address the challenges of real-time data acquisition, efficient data transmission, and dynamic crop recommendation through the application of advanced Artificial Intelligence (AI) and Machine Learning (ML) models. The CRS architecture encompasses a network of sensors that continuously monitor environmental parameters such as temperature, humidity, soil moisture, and nutrient levels. This sensor data is transmitted to a central MQTT server, ensuring reliable and low-latency communication even in the bandwidth-constrained scenarios typical of rural agricultural settings. Upon reaching the server, the data is processed and analyzed by AI/ML models trained to correlate specific environmental conditions with optimal crop choices and cultivation practices. These models consider historical crop performance data, current agricultural research, and real-time field conditions to generate tailored crop recommendations. This implementation achieves 99% accuracy.
Keywords: IoT, MQTT protocol, machine learning, sensor, publish/subscribe, agriculture, humidity
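A minimal sketch of the publish/subscribe flow described here, using the paho-mqtt convenience helpers; the broker address, topic name, and payload fields are assumptions for illustration, not the paper's deployment.

```python
import json
import paho.mqtt.publish as publish
import paho.mqtt.subscribe as subscribe

BROKER = "broker.example.com"          # hypothetical MQTT broker
TOPIC = "farm/field1/sensors"          # hypothetical topic

# Field sensor node: publish one reading (QoS 1 = at-least-once delivery).
reading = {"temperature": 27.4, "humidity": 61.0, "soil_moisture": 0.32}
publish.single(TOPIC, json.dumps(reading), hostname=BROKER, qos=1)

# Server side: block until one message arrives, then hand it to the ML model.
msg = subscribe.simple(TOPIC, hostname=BROKER, qos=1)
features = json.loads(msg.payload)
print("features for the crop-recommendation model:", features)
```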
Procedia PDF Downloads 72
996 EEG Correlates of Trait and Mathematical Anxiety during Lexical and Numerical Error-Recognition Tasks
Authors: Alexander N. Savostyanov, Tatiana A. Dolgorukova, Elena A. Esipenko, Mikhail S. Zaleshin, Margherita Malanchini, Anna V. Budakova, Alexander E. Saprygin, Tatiana A. Golovko, Yulia V. Kovas
Abstract:
EEG correlates of mathematical and trait anxiety levels were studied in 52 healthy Russian speakers during the execution of error-recognition tasks with lexical, arithmetic and algebraic conditions. Event-related spectral perturbations (ERSPs) were used as a measure of brain activity. The ERSP plots revealed alpha/beta desynchronizations within a 500-3000 ms interval after task onset and slow-wave synchronization within an interval of 150-350 ms. Amplitudes in these intervals reflected the accuracy of error recognition and were differently associated with the three conditions. The correlates of anxiety were found in the theta (4-8 Hz) and beta2 (16-20 Hz) frequency bands. In the theta band, the effects of mathematical anxiety were more strongly expressed in the lexical than in the arithmetic and algebraic conditions. The mathematical anxiety effects in the theta band were associated with differences between anterior and posterior cortical areas, whereas the effects of trait anxiety were associated with inter-hemispheric differences. In the beta1 and beta2 bands, the effects of trait and mathematical anxiety were oppositely directed: trait anxiety was associated with an increase in desynchronization amplitude, whereas mathematical anxiety was associated with a decrease in this amplitude. The effect of mathematical anxiety in the beta2 band was insignificant for the lexical condition but was strongest in the algebraic condition. EEG correlates of anxiety in the theta band can be interpreted as indexes of task emotionality, whereas the reaction in the beta2 band is related to the tension of intellectual resources.
Keywords: EEG, brain activity, lexical and numerical error-recognition tasks, mathematical and trait anxiety
Procedia PDF Downloads 563
995 Optimization Modeling of the Hybrid Antenna Array for the DoA Estimation
Authors: Somayeh Komeylian
Abstract:
Direction of arrival (DoA) estimation is a crucial aspect of radar technologies for detecting and separating several signal sources. In this scenario, the antenna array output modeling involves numerous parameters, including noise samples, signal waveform, signal directions, signal number, and signal-to-noise ratio (SNR), and the methods of DoA estimation therefore rely heavily on generalization capability, requiring a large number of training data sets. Hence, we have comparatively represented two different optimization models for DoA estimation: (1) the implementation of the decision directed acyclic graph (DDAG) for the multiclass least-squares support vector machine (LS-SVM), and (2) the optimization method of the deep neural network (DNN) radial basis function (RBF). We have rigorously verified that the LS-SVM DDAG algorithm is capable of accurately classifying DoAs for three classes. However, the accuracy and robustness of DoA estimation are still highly sensitive to technological imperfections of antenna arrays, such as non-ideal array design and manufacture, array implementation, mutual coupling effects, and background radiation, and the method may therefore fail to achieve high precision for DoA estimation. This work makes a further contribution by developing the DNN-RBF model for DoA estimation to overcome the limitations of non-parametric and data-driven methods in terms of array imperfection and generalization. The numerical results of implementing the DNN-RBF model confirm better DoA estimation performance compared with the LS-SVM algorithm. Consequently, we have comparatively evaluated the performance of the two aforementioned optimization methods for DoA estimation using the mean squared error (MSE).
Keywords: DoA estimation, adaptive antenna array, deep neural network, LS-SVM optimization model, radial basis function, MSE
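A toy RBF-network classifier in the spirit of the DNN-RBF model above: RBF activations around centers chosen by k-means feed a linear read-out. The centers, widths, and the stand-in "array snapshot" features are all assumptions, not the paper's architecture.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
# three DoA classes, each a cluster of 4-dim stand-in array features
X = np.vstack([rng.normal(m, 0.3, (60, 4)) for m in (-1.0, 0.0, 1.0)])
y = np.repeat([0, 1, 2], 60)

centers = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X).cluster_centers_

def rbf(X, centers, gamma=2.0):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)            # radial basis activations

clf = LogisticRegression(max_iter=1000).fit(rbf(X, centers), y)  # linear read-out layer
print("training accuracy:", clf.score(rbf(X, centers), y))
```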
Procedia PDF Downloads 104
994 Determination of a Novel Artificial Sweetener Advantame in Food by Liquid Chromatography Tandem Mass Spectrometry
Authors: Fangyan Li, Lin Min Lee, Hui Zhu Peh, Shoet Harn Chan
Abstract:
Advantame, a derivative of aspartame, is the latest addition to a family of low-caloric and highly potent dipeptide sweeteners, which includes aspartame, neotame and alitame. The use of advantame as a high-intensity sweetener in food was first accepted by Food Standards Australia New Zealand in 2011 and subsequently by US and EU food authorities in 2014, with the results from toxicity and exposure studies showing advantame poses no safety concern to the public at regulated levels. To our knowledge, there is currently barely any detailed information on the analytical method for advantame in food matrices, except for one report published in Japanese, describing a high performance liquid chromatography (HPLC) and liquid chromatography/mass spectrometry (LC-MS) method with a detection limit at the ppm level. However, the use of acid in sample preparation and instrumental analysis in that report raises doubt over the reliability of the method, as there is indication that the stability of advantame is compromised under acidic conditions. Besides, the method may not be suitable for analyzing food matrices containing advantame at low-ppm or sub-ppm levels. In this presentation, a simple, specific and sensitive method for the determination of advantame in food is described. The method involves extraction with water and clean-up via solid phase extraction (SPE), followed by detection using liquid chromatography tandem mass spectrometry (LC-MS/MS) in negative electrospray ionization mode. No acid was used in the entire procedure. Single-laboratory validation of the method was performed in terms of linearity, precision and accuracy. A low detection limit at the ppb level was achieved. Satisfactory recoveries were obtained using spiked samples at three different concentration levels. This validated method could be used in the routine inspection of advantame levels in food.
Keywords: advantame, food, LC-MS/MS, sweetener
Procedia PDF Downloads 479
993 Incorporating Lexical-Semantic Knowledge into Convolutional Neural Network Framework for Pediatric Disease Diagnosis
Authors: Xiaocong Liu, Huazhen Wang, Ting He, Xiaozheng Li, Weihan Zhang, Jian Chen
Abstract:
The utilization of electronic medical record (EMR) data to establish disease diagnosis models has become an important research topic in biomedical informatics. Deep learning can automatically extract features from massive data, which has brought about breakthroughs in the study of EMR data. The challenge is that deep learning lacks semantic knowledge, which limits its practicability in medical science. This research proposes a method of incorporating lexical-semantic knowledge from abundant entities into a convolutional neural network (CNN) framework for pediatric disease diagnosis. Firstly, medical terms are vectorized into Lexical Semantic Vectors (LSV), which are concatenated with the embedded word vectors of word2vec to enrich the feature representation. Secondly, the semantic distribution of medical terms serves as a Semantic Decision Guide (SDG) for the optimization of deep learning models. The study evaluates the performance of the LSV-SDG-CNN model on four Chinese EMR datasets, with CNN, LSV-CNN, and SDG-CNN designed as baseline models for comparison. The experimental results show that the LSV-SDG-CNN model outperforms the baseline models on all four datasets. The best configuration of the model yielded an F1 score of 86.20%. The results clearly demonstrate that the CNN has been effectively guided and optimized by lexical-semantic knowledge, and that the LSV-SDG-CNN model improves disease classification accuracy by a clear margin.
Keywords: convolutional neural network, electronic medical record, feature representation, lexical semantics, semantic decision
Procedia PDF Downloads 128
992 Life Prediction Method of Lithium-Ion Battery Based on Grey Support Vector Machines
Authors: Xiaogang Li, Jieqiong Miao
Abstract:
To address the low prediction accuracy of the grey forecasting model, an improved grey prediction model is put forward. Firstly, a trigonometric function is used to transform the original data sequence in order to improve the smoothness of the data; this model is called SGM (smoothed grey prediction model). The improved grey model is then combined with a support vector machine to give the grey support vector machine model (SGM-SVM). Before establishing the model, the data are pre-processed with the trigonometric function and the accumulated generating operation in order to enhance their smoothness and weaken their randomness; a support vector machine (SVM) is then used to establish a prediction model for the pre-processed data, with model parameters selected by a genetic algorithm to obtain the global optimum. Finally, the data are restored through the "regressive generation" operation to obtain the forecast. To show that the SGM-SVM model is superior to other models, battery life data from CALCE are selected; the presented model is used to predict battery life, and the predicted result is compared with those of the grey model and the support vector machine. For a more intuitive comparison, this paper presents the root mean square error of the three models. The results show that the grey support vector machine (SGM-SVM) gives the best life prediction, with a root mean square error of only 3.18%.
Keywords: grey prediction model, trigonometric functions, support vector machines, genetic algorithms, root mean square error
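One plausible reading of the SGM-SVM chain, sketched below: the exact trigonometric transform is not specified in the abstract, so the sine mapping, the synthetic capacity series, and the SVR hyperparameters (which the paper tunes with a genetic algorithm) are all assumptions.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
capacity = 1.1 * np.exp(-0.004 * np.arange(100)) + 0.01 * rng.standard_normal(100)

z = np.sin(0.5 * np.pi * capacity / capacity.max())  # trigonometric smoothing (assumed form)
z_ago = np.cumsum(z)                                  # accumulated generating operation (1-AGO)

k = (np.arange(len(z_ago)) / len(z_ago)).reshape(-1, 1)
svr = SVR(kernel="rbf", C=100.0, gamma=1.0, epsilon=0.01).fit(k, z_ago)  # a GA would tune C, gamma

z_hat = np.diff(svr.predict(k), prepend=0.0)          # inverse AGO ("regressive generation")
cap_hat = np.arcsin(np.clip(z_hat, -1.0, 1.0)) * capacity.max() / (0.5 * np.pi)  # invert the trig map

rmse = np.sqrt(np.mean((cap_hat - capacity) ** 2))    # the paper's comparison metric
print(f"RMSE: {rmse:.4f}")
```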
Procedia PDF Downloads 467
991 Effect of Minerals in Middlings on the Reactivity of Gasification-Coke by Blending a Large Proportion of Long Flame Coal
Authors: Jianjun Wu, Fanhui Guo, Yixin Zhang
Abstract:
In this study, gasification-cokes were produced by blending middlings (MC), coking coal (CC) and a large proportion of long flame coal (Shenfu coal, SC), and the effects of the blending ratio were investigated. Mineral evolution and crystalline order obtained by XRD methods were reproduced within reasonable accuracy. Structural characteristics of the partially gasified cokes, such as surface area and porosity, were determined using N₂ adsorption and mercury porosimetry. Gasification behavior was characterized by TGA results, and reactivity differences between gasification-cokes are discussed in terms of structural characteristics, crystallinity, and the alkali index (AI). The first-order reaction equation, represented by the volumetric reaction model, was suitable for describing the gasification kinetics in a CO₂ atmosphere, with linear correlation coefficients above 0.985. Differences in the microporous structure of the gasification-coke, and catalysis caused by the minerals in the parent coals, are supposed to be the main factors affecting its reactivity. The addition of MC enriched the samples with a large amount of ash, giving the gasification-coke a higher surface area and a lower crystalline order, which was beneficial to the gasification reaction. Higher SiO₂ and Al₂O₃ contents decreased the AI value and increased the activation energy, which reduced the gasification reactivity. It was found that increasing the amount of MC improved coke gasification reactivity when blending > 30% SC in this coking process.
Keywords: low-rank coal, middlings, structure characteristic, mineral evolution, alkali index, gasification-coke, gasification kinetics
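The volumetric (first-order) reaction model named here is dx/dt = k(1 − x), which linearizes to ln(1 − x) = −kt; a short sketch fitting the rate constant, with synthetic conversion data standing in for the TGA results:

```python
import numpy as np

t = np.linspace(0, 60, 13)                      # time, min (assumed grid)
k_true = 0.05
x = 1 - np.exp(-k_true * t) + 0.005 * np.random.randn(t.size)  # carbon conversion stand-in

y = np.log(1 - np.clip(x, 0.0, 0.999))          # linearized form ln(1 - x)
k_fit = -np.polyfit(t, y, 1)[0]                 # slope of the fit gives k
r = np.corrcoef(t, y)[0, 1]
print(f"k = {k_fit:.4f} 1/min, |r| = {abs(r):.4f}")  # paper reports |r| > 0.985
```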
Procedia PDF Downloads 178
990 Reconstruction of Visual Stimuli Using Stable Diffusion with Text Conditioning
Authors: ShyamKrishna Kirithivasan, Shreyas Battula, Aditi Soori, Richa Ramesh, Ramamoorthy Srinath
Abstract:
The human brain, among the most complex and mysterious aspects of the body, harbors vast potential for exploration. Unraveling these enigmas, especially within neural perception and cognition, is the realm of neural decoding. Harnessing advancements in generative AI, particularly in visual computing, this work seeks to elucidate how the brain comprehends visual stimuli observed by humans. The paper endeavors to reconstruct human-perceived visual stimuli from functional magnetic resonance imaging (fMRI) data, which are processed through pre-trained deep-learning models to recreate the stimuli. Introducing a new architecture named LatentNeuroNet, the aim is to achieve the utmost semantic fidelity in stimulus reconstruction. The approach employs a Latent Diffusion Model (LDM), Stable Diffusion v1.5, emphasizing semantic accuracy and generating superior-quality outputs. This addresses the limitations of prior methods such as GANs, which are known for poor semantic performance and inherent instability. Text conditioning within the LDM's denoising process is handled by text extracted from the signal of the brain's ventral visual cortex region; this extracted text is processed through a Bootstrapping Language-Image Pre-training (BLIP) encoder before being injected into the denoising process. In conclusion, an architecture is developed that successfully reconstructs the perceived visual stimuli, and the research provides evidence for identifying the regions of the brain most influential in cognition and perception.
Keywords: BLIP, fMRI, latent diffusion model, neural perception
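A conceptual sketch only: once a text description has been decoded from the ventral-visual-cortex signal, Stable Diffusion v1.5 can be conditioned on it to reconstruct the stimulus. The decoded prompt below is a placeholder, the paper's BLIP-based conditioning pathway is not reproduced, and the snippet assumes the diffusers package, a GPU, and access to the model weights.

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

decoded_prompt = "a red sports car parked on a city street"  # stand-in for fMRI-decoded text
image = pipe(decoded_prompt, num_inference_steps=50).images[0]  # text-conditioned denoising
image.save("reconstructed_stimulus.png")
```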
Procedia PDF Downloads 72
989 Inversion of the Spectral Analysis of Surface Waves Dispersion Curves through the Particle Swarm Optimization Algorithm
Authors: A. Cerrato Casado, C. Guigou, P. Jean
Abstract:
In this investigation, the particle swarm optimization (PSO) algorithm is used to perform the inversion of the dispersion curves in the spectral analysis of surface waves (SASW) method. This inverse problem usually presents complicated solution spaces with many local minima that make convergence to the correct solution difficult. PSO is a metaheuristic method that was originally designed to simulate social behavior but has demonstrated powerful capabilities for solving inverse problems with complex solution spaces and a high number of variables. The dispersion curves of the synthetic soils are constructed by the vertical flexibility coefficient method, which is especially convenient for soils where the stiffness does not increase gradually with depth. The reason is that these types of soil profiles are not normally dispersive, since the dominant mode of Rayleigh waves is usually not coincident with the fundamental mode. Multiple synthetic soil profiles have been tested to show the characteristics of the convergence process and assess the accuracy of the final soil profile. In addition, the inversion procedure is applied to multiple real soils and the final profile compared with the available information. The combination of the vertical flexibility coefficient method to obtain the dispersion curve and the PSO algorithm to carry out the inversion process proves to be a robust procedure that is able to provide good solutions for complex soil profiles, even with scarce prior information.
Keywords: dispersion, inverse problem, particle swarm optimization, SASW, soil profile
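A bare-bones PSO loop of the kind used for this inversion; the "misfit" below is a toy stand-in for the distance between measured and computed dispersion curves, and all coefficients are textbook defaults rather than the paper's settings.

```python
import numpy as np

def misfit(profile):
    target = np.array([180.0, 240.0, 310.0])     # hypothetical layer shear velocities, m/s
    return np.sum((profile - target) ** 2)       # stand-in for the dispersion-curve misfit

rng = np.random.default_rng(0)
n_particles, n_dim, iters = 30, 3, 200
w, c1, c2 = 0.7, 1.5, 1.5                        # inertia and acceleration coefficients

x = rng.uniform(100, 400, (n_particles, n_dim))  # candidate Vs profiles
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.apply_along_axis(misfit, 1, x)
gbest = pbest[pbest_f.argmin()]

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, n_dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)  # velocity update
    x = x + v
    f = np.apply_along_axis(misfit, 1, x)
    better = f < pbest_f                          # keep each particle's personal best
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmin()]               # global best attracts the swarm

print("inverted profile:", gbest.round(1))
```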
Procedia PDF Downloads 188
988 Flow and Heat Transfer Analysis of Copper-Water Nanofluid with Temperature Dependent Viscosity past a Riga Plate
Authors: Fahad Abbasi
Abstract:
The flow of electrically conducting nanofluids is of pivotal importance in countless industrial and medical applications. Fluctuations in the thermophysical properties of such fluids due to variations in temperature have not received due attention in the available literature. The present investigation aims to fill this void by analyzing the flow of copper-water nanofluid with temperature-dependent viscosity past a Riga plate. Strong wall suction and viscous dissipation have also been taken into account. Numerical solutions for the resulting nonlinear system have been obtained. Results are presented in graphical and tabular format to facilitate the physical analysis. Estimated expressions for the skin friction coefficient and Nusselt number are obtained by performing linear regression on the numerical data for the embedded parameters. Results indicate that the temperature-dependent viscosity alters the velocity as well as the temperature of the nanofluid and is of considerable importance in processes where high accuracy is desired. The addition of copper nanoparticles makes the momentum boundary layer thinner, whereas the viscosity parameter does not affect the boundary layer thickness. Moreover, the regression expressions indicate that the magnitude of the rate of change in the effective skin friction coefficient and Nusselt number with respect to nanoparticle volume fraction is prominent when compared with the rate of change with the variable viscosity parameter and the modified Hartmann number.
Keywords: heat transfer, peristaltic flows, radially varying magnetic field, curved channel
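A sketch of the regression step mentioned here: fitting a linear expression for the Nusselt number in the embedded parameters from numerical data. The sample points and the underlying toy response are synthetic stand-ins, not the study's solutions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
phi = rng.uniform(0.0, 0.2, 50)        # nanoparticle volume fraction
alpha = rng.uniform(0.0, 0.5, 50)      # variable-viscosity parameter
M = rng.uniform(0.0, 2.0, 50)          # modified Hartmann number
# toy response consistent with the abstract's finding that phi dominates
Nu = 1.5 + 4.0 * phi + 0.3 * alpha + 0.8 * M + 0.01 * rng.standard_normal(50)

X = np.column_stack([phi, alpha, M])
reg = LinearRegression().fit(X, Nu)
print("estimated Nu ~ {:.2f} + {:.2f}*phi + {:.2f}*alpha + {:.2f}*M".format(
    reg.intercept_, *reg.coef_))
```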
Procedia PDF Downloads 167
987 Assessment of Air Pollutant Dispersion and Soil Contamination: The Critical Role of MATLAB Modeling in Evaluating Emissions from the Covanta Municipal Solid Waste Incineration Facility
Authors: Jadon Matthias, Cindy Dong, Ali Al Jibouri, Hsin Kuo
Abstract:
The environmental impact of emissions from the Covanta Waste-to-Energy facility in Burnaby, BC, was comprehensively evaluated, focusing on the dispersion of air pollutants and the subsequent assessment of heavy metal contamination in surrounding soils. A Gaussian Plume Model, implemented in MATLAB, was utilized to simulate the dispersion of key pollutants to understand their atmospheric behaviour and potential deposition patterns. The MATLAB code developed for this study enhanced the accuracy of pollutant concentration predictions and provided capabilities for visualizing pollutant dispersion in 3D plots. Furthermore, the code could predict the maximum concentration of pollutants at ground level, eliminating the need to use the Ranchoux model for predictions. Complementing the modelling approach, empirical soil sampling and analysis were conducted to evaluate heavy metal concentrations in the vicinity of the facility. This integrated methodology underscored the importance of computational modelling in air pollution assessment and highlighted the necessity of soil analysis to obtain a holistic understanding of environmental impacts. The findings emphasized the effectiveness of current emissions controls while advocating for ongoing monitoring to safeguard public health and environmental integrity.
Keywords: air emissions, Gaussian Plume Model, MATLAB, soil contamination, air pollution monitoring, waste-to-energy, pollutant dispersion visualization, heavy metal analysis, environmental impact assessment, emission control effectiveness
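The standard Gaussian plume relation behind a model of this kind is sketched below (in Python rather than the study's MATLAB); the emission rate, stack height, wind speed, and dispersion coefficients are assumed values, not the facility's parameters.

```python
import numpy as np

def plume_concentration(y, z, Q, u, H, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration (g/m^3) with ground reflection."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))  # image source models ground reflection
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# sigma_y and sigma_z would normally come from Pasquill-Gifford stability curves
# evaluated at the downwind distance; the values below are illustrative.
c = plume_concentration(y=0.0, z=0.0, Q=100.0, u=4.0, H=50.0, sigma_y=80.0, sigma_z=40.0)
print(f"ground-level centerline concentration: {c:.2e} g/m^3")
```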
Procedia PDF Downloads 24
986 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem
Authors: Ouafa Amira, Jiangshe Zhang
Abstract:
Clustering is an unsupervised machine learning technique; its aim is to extract the data structures in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means clustering (FCM) is one of the most well-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where the dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function taking values in the interval [0, 1]. In FCM clustering, the membership degree is constrained by the condition that the sum of a data object's memberships in all clusters must equal one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization has been applied to the fuzzy c-means clustering technique; it introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where the optimization problem aims to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because an appropriate membership degree leads to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy.
Keywords: clustering, fuzzy c-means, regularization, relative entropy
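A minimal entropy-regularized fuzzy clustering loop in the spirit of this approach: with an entropy penalty on the memberships, the closed-form update is u_ij ∝ exp(−‖x_i − c_j‖²/λ), and centers become membership-weighted means. This is the plain-entropy special case rather than the paper's full relative-entropy formulation; λ and the toy 2-D data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
k, lam = 2, 1.0
centers = X[rng.choice(len(X), k, replace=False)]

for _ in range(30):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)  # squared distances
    u = np.exp(-d2 / lam)
    u /= u.sum(axis=1, keepdims=True)                          # memberships sum to 1 per point
    centers = (u.T @ X) / u.sum(axis=0)[:, None]               # membership-weighted centroids

print(centers.round(2))   # should land near (0, 0) and (3, 3)
```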
Procedia PDF Downloads 264
985 Grammarly: Great Writings Get Work Done Using AI
Authors: Neha Intikhab Khan, Alanoud AlBalwi, Farah Alqazlan, Tala Almadoudi
Abstract:
Background: Grammarly, a widely utilized writing assistant launched in 2009, leverages advanced artificial intelligence and natural language processing to enhance writing quality across various platforms. Methods: To collect data on user perceptions of Grammarly, a structured survey was designed and distributed via Google Forms. The survey included a series of quantitative and qualitative questions aimed at assessing various aspects of Grammarly's performance. The survey comprised multiple-choice questions, Likert scale items (ranging from "strongly disagree" to "strongly agree"), and open-ended questions to capture detailed user feedback. The target population included students, friends, and family members. The collected responses were analyzed using statistical methods to quantify user satisfaction. Participation in the survey was voluntary, and respondents were assured anonymity and confidentiality. Results: The survey of 28 respondents revealed a generally favorable perception of Grammarly's AI capabilities. A significant 39.3% strongly agreed that it effectively improves text tone, with an additional 46.4% agreeing, while 10.7% remained neutral. For clarity suggestions, 28.6% strongly agreed, and 57.1% agreed, totaling 85.7% recognition of its value. Regarding grammatical accuracy across various genres, 46.4% rated it a perfect score of 5, contributing to 78.5% who found it highly effective. Conclusion: The evolution of Grammarly from a basic grammar checker to a robust AI-driven application underscores its adaptability and commitment to helping users develop their writing skills.
Keywords: Grammarly, writing tool, user engagement, AI capabilities, effectiveness
Procedia PDF Downloads 9
984 Design and Analysis of a Piezoelectric Linear Motor Based on Rigid Clamping
Authors: Chao Yi, Cunyue Lu, Lingwei Quan
Abstract:
Piezoelectric linear motors have the characteristics of great electromagnetic compatibility, high positioning accuracy, compact structure and no deceleration mechanism, which make them promising for application in micro-miniature precision drive systems. However, most piezoelectric motors employ flexible clamping, which has insufficient rigidity and is difficult to use for rapid positioning. Another problem is that this clamping method seriously affects the vibration efficiency of the vibrating unit. In order to solve these problems, this paper proposes a piezoelectric stack linear motor based on double-end rigid clamping. First, a piezoelectric linear motor with a length of only 35.5 mm is designed. This motor is mainly composed of a motor stator, a driving foot, a ceramic friction strip, a linear guide, a pre-tightening mechanism and a base. This structure is much simpler and smaller than most similar motors, and it is easy to assemble as well as to realize precise control. In addition, the properties of the piezoelectric stack are reviewed and, in order to obtain an elliptical motion trajectory of the driving head, a driving scheme based on a longitudinal-shear composite stack is innovatively proposed. Finally, impedance analysis and speed performance testing were performed on the piezoelectric linear motor prototype. The motor reached speeds of up to 25.5 mm/s under the excitation of a signal voltage of 120 V at a frequency of 390 Hz. The results show that the proposed piezoelectric stack linear motor achieves good performance: it runs smoothly over a large speed range, which makes it suitable for precision control in medical imaging, aerospace, precision machinery and many other fields.
Keywords: piezoelectric stack, linear motor, rigid clamping, elliptical trajectory
Procedia PDF Downloads 157
983 An MIPSSTWM-based Emergency Vehicle Routing Approach for Quick Response to Highway Incidents
Authors: Siliang Luan, Zhongtai Jiang
Abstract:
The risk of highway incidents is commonly recognized as a major concern for transportation authorities due to their hazardous consequences and negative influence. For emergency management decision-makers, it is crucial to respond to these unpredictable events as soon as possible. In this paper, we focus on path planning for emergency vehicles, one of the most significant processes for avoiding congestion and reducing rescue time. A Mixed-Integer Linear Programming with Semi-Soft Time Windows Model (MIPSSTWM) is formulated to plan an optimal route, considering the time consumption of the arcs and nodes of the urban road network and the highway network, especially in developing countries with enormous populations. Here, the arcs represent the road segments, and the nodes include the intersections of the urban road network and the on-ramps and off-ramps of the highway network. An attempt has been made in this research to develop a comprehensive and executable strategy for emergency vehicle routing in heavy traffic conditions. The proposed Cuckoo Search (CS) algorithm, designed by imitating the obligate brood-parasitic behavior of cuckoos and using Lévy flights (LF), is employed to solve this hard combinatorial problem. Using a Chinese city as our case study, the numerical results demonstrate that the approach applied in this paper outperforms a previous method that does not consider the nodes of the road network in a real-world situation. Meanwhile, the CS algorithm also shows better accuracy and validity than the traditional algorithm.
Keywords: emergency vehicle, path planning, CS algorithm, urban traffic management and urban planning
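The Lévy-flight step at the heart of Cuckoo Search is commonly generated with Mantegna's algorithm, sketched below; the exponent β and the step scale are typical textbook choices, and the "nest" vector is a hypothetical encoding of a candidate route, not the paper's representation.

```python
import numpy as np
from math import gamma

def levy_step(beta=1.5, size=1, rng=np.random.default_rng()):
    # Mantegna's algorithm: ratio of two Gaussians yields heavy-tailed steps
    sigma_u = (gamma(1 + beta) * np.sin(np.pi * beta / 2)
               / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma_u, size)
    v = rng.normal(0, 1, size)
    return u / np.abs(v) ** (1 / beta)

# New candidate = current nest + scaled Lévy step; occasional large jumps
# help the search escape local optima in the routing solution space.
nest = np.array([0.4, 0.7, 0.1])
candidate = nest + 0.01 * levy_step(size=3)
print(candidate)
```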
Procedia PDF Downloads 86
982 Object-Based Image Analysis for Gully-Affected Area Detection in the Hilly Loess Plateau Region of China Using Unmanned Aerial Vehicle
Authors: Hu Ding, Kai Liu, Guoan Tang
Abstract:
The Chinese Loess Plateau suffers from serious gully erosion induced by natural and human causes. Gully feature detection, including the gully-affected area and its two-dimensional parameters (length, width, area, etc.), is a significant task not only for researchers but also for policy-makers. This study aims at gully-affected area detection in three catchments of the Chinese Loess Plateau, selected in Changwu, Ansai, and Suide, using an unmanned aerial vehicle (UAV). The methodology comprises a sequence of UAV data generation, image segmentation, feature calculation and selection, and random forest classification. Two experiments were conducted to investigate the influence of the segmentation strategy and feature selection. Results showed that vertical and horizontal root-mean-square errors were below 0.5 and 0.2 m, respectively, which is ideal for the Loess Plateau region. The segmentation strategy adopted in this paper, which considers topographic information, together with the optimal parameter combination, can improve the segmentation results. The overall extraction accuracies achieved in Changwu, Ansai, and Suide were 84.62%, 86.46%, and 93.06%, respectively, which indicates that the proposed method for detecting gully-affected areas is more objective and effective than traditional methods. This study demonstrates that UAVs can bridge the gap between field measurement and satellite-based remote sensing, obtaining a balance of resolution and efficiency for catchment-scale gully erosion research.
Keywords: unmanned aerial vehicle (UAV), object-based image analysis, gully erosion, gully-affected area, Loess Plateau, random forest
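A minimal sketch of the object-based classification step: per-segment features feed a random forest that labels each segment gully-affected or not. The feature names, the toy labeling rule, and the values are invented stand-ins for the attributes computed from the UAV data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
# hypothetical per-segment features: mean slope, texture, curvature, relative elevation
X = rng.random((300, 4))
y = (X[:, 0] + X[:, 2] > 1.0).astype(int)    # toy rule standing in for field truth

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"extraction accuracy: {rf.score(X_te, y_te):.2%}")   # study: 84.6-93.1 %
```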
Procedia PDF Downloads 221