Search results for: signal classification
813 The Prognostic Prediction Value of Positive Lymph Node Number for Hypopharyngeal Squamous Cell Carcinoma
Authors: Wendu Pang, Yaxin Luo, Junhong Li, Yu Zhao, Danni Cheng, Yufang Rao, Minzi Mao, Ke Qiu, Yijun Dong, Fei Chen, Jun Liu, Jian Zou, Haiyang Wang, Wei Xu, Jianjun Ren
Abstract:
We aimed to compare the prognostic prediction value of positive lymph node number (PLNN) with the American Joint Committee on Cancer (AJCC) tumor, lymph node, and metastasis (TNM) staging system for patients with hypopharyngeal squamous cell carcinoma (HPSCC). A total of 826 patients with HPSCC from the Surveillance, Epidemiology, and End Results database (2004–2015) were identified and split into two independent cohorts: training (n=461) and validation (n=365). Univariate and multivariate Cox regression analyses were used to evaluate the prognostic effects of PLNN in patients with HPSCC. We further applied six Cox regression models to compare the survival predictive values of PLNN and the AJCC TNM staging system. PLNN showed a significant association with overall survival (OS) and cancer-specific survival (CSS) (P < 0.001) in both univariate and multivariable analyses, and patients were divided into three groups (PLNN 0, PLNN 1-5, and PLNN > 5). In the training cohort, multivariate analysis revealed that an increased PLNN gave rise to significantly poorer OS and CSS after adjusting for age, sex, tumor size, and cancer stage; this trend was also verified in the validation cohort. Additionally, the survival model incorporating a composite of PLNN and TNM classification (C-index 0.705 and 0.734) performed better than the PLNN and AJCC TNM models alone. PLNN can serve as a powerful survival predictor for patients with HPSCC and is a useful supplement to cancer staging systems.
Keywords: hypopharyngeal squamous cell carcinoma, positive lymph nodes number, prognosis, prediction models, survival predictive values
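The survival-modelling step described above can be sketched with the lifelines library on synthetic data; the column names, the PLNN grouping, and the combined PLNN + TNM model shown below are illustrative stand-ins, not the SEER variables or the six models used by the authors.
```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "time": rng.exponential(40, n),        # follow-up time in months (synthetic)
    "event": rng.integers(0, 2, n),        # 1 = death observed
    "plnn_group": rng.integers(0, 3, n),   # 0 nodes, 1-5 nodes, >5 nodes
    "tnm_stage": rng.integers(1, 5, n),    # AJCC stage I-IV
})

# Fit PLNN-only, TNM-only, and composite models and compare their C-indices.
for covariates in (["plnn_group"], ["tnm_stage"], ["plnn_group", "tnm_stage"]):
    cph = CoxPHFitter().fit(df[covariates + ["time", "event"]],
                            duration_col="time", event_col="event")
    print(covariates, "C-index =", round(cph.concordance_index_, 3))
```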
Procedia PDF Downloads 154
812 Crude Glycerol Affects Canine Spermatozoa Motility: Computer-Assisted Semen Analysis in Vitro
Authors: P. Massanyi, L. Kichi, T. Slanina, E. Kolesar, J. Danko, N. Lukac, E. Tvrda, R. Stawarz, A. Kolesarova
Abstract:
The target of this study was the analysis of the impact of crude glycerol on canine spermatozoa motility, morphology, viability, and membrane integrity. The experiments were performed in vitro on semen from five large dog breeds, all typical representatives of large breeds coming from healthy rearing, regularly vaccinated, and included in further breeding. Semen was collected at the owners' premises and in a veterinary clinic, and the experiments were subsequently performed at the Department of Animal Physiology of the SUA in Nitra. Spermatozoa motility was evaluated using a CASA analyzer (SpermVisionTM, Minitub, Germany) at 5 and 37°C for 5 hours, and 13 motility parameters were evaluated. Overall, crude glycerol had a negative effect on spermatozoa motility. Morphological analysis was performed using Hancock staining, and the preparations were evaluated at 1000x magnification using classification tables of morphologically changed spermatozoa. The data clearly showed the highest number of morphologically changed spermatozoa in the experimental groups (twisted tails, tail torsion, and tail coiling). Among acrosome alterations, swollen acrosomes, removed acrosomes, and acrosomes with undulated membranes were detected. The effect of crude glycerol on spermatozoa membrane integrity was also analyzed; the highest crude glycerol concentration significantly affected spermatozoa integrity. The results of this study show that crude glycerol affects spermatozoa motility, viability, and membrane integrity, and that the detected changes are related to crude glycerol concentration, temperature, and incubation time.
Keywords: dog, semen, spermatozoa, acrosome, glycerol, CASA, viability
Procedia PDF Downloads 319
811 Risks in the Islamic Banking Model and Methods Adopted to Manage Them
Authors: K. P. Fasalu Rahman
Abstract:
The Islamic financial services industry includes a large number of institutions, such as investment and commercial banks, investment companies, and mutual insurance companies. All of these financial institutions have to deal with many issues and risks in their field of work. Islamic banks should expect to face two types of risk: risks that are similar to those faced by conventional financial intermediaries and risks that are unique to Islamic banks due to their compliance with the Shariah. The use of financial services and products that comply with Shariah principles raises special issues for supervision and risk management. Risks are uncertain future events that could influence the achievement of the bank's objectives, including strategic, operational, financial and compliance objectives. In Islamic banks, effective risk management deserves special attention. As an operational problem, risk management is the identification and classification of methods, processes, and risks in banks in order to supervise, monitor and measure them. In comparison to conventional banks, Islamic banks face greater difficulties in identifying and managing risks due to the greater complexities arising from the profit-and-loss sharing (PLS) concept and the nature of the particular risks of Islamic financing. As the development of risk management tools becomes essential, especially in Islamic banking, where most products depend on the PLS principle, identifying and measuring each type of risk is highly important and critical in any Islamic finance-based system. This paper highlights the special and general risks surrounding Islamic banking and investigates in detail the need for risk management in Islamic banks. In addition to analyzing the effectiveness of the risk management strategies currently adopted by Islamic financial institutions, this research also suggests strategies for improving the risk management process of Islamic banks in the future.
Keywords: Islamic banking, management, risk, risk management
Procedia PDF Downloads 140
810 Regulation Aspects for a Radioisotope Production Installation in Brazil
Authors: Rian O. Miranda, Lidia V. de Sa, Julio C. Suita
Abstract:
The Brazilian Nuclear Energy Commission (CNEN) is the main manufacturer of radiopharmaceuticals in Brazil. The Nuclear Engineering Institute (IEN), located in Rio de Janeiro, is one of its main centers of research and production, serving public and private hospitals in the state. This radiopharmaceutical production is used in diagnostic and therapy procedures and supports one and a half million nuclear medicine procedures annually. Despite this, the country is not self-sufficient in meeting national demand, creating the need for importation and consequent dependence on other countries. However, the IEN facilities were designed in the 1960s, and today their structure is inadequate with respect to the good manufacturing practices established by the sanitary regulator (ANVISA) and to radiological protection requirements, leading to the need for a new project. In order to adapt the facilities and increase production in the country, a new plant will be built and integrated with the existing facilities around a new 30 MeV cyclotron that is currently in the detailed design phase. Thus, it is proposed to survey the current CNEN and ANVISA standards for radiopharmaceutical production facilities and to perform a radiological protection analysis of each area of the plant, following the good manufacturing practice recommendations adopted nationally as well as the licensing requirements for radioactive facilities. In this way, the main requirements for proper operation, equipment location, building materials, area classification, and the maintenance program have been implemented. The access controls, interlocks, segregation zones and pass-through boxes integrated into the project were also analyzed. As a result, IEN will in the future have the flexibility to produce all the radioisotopes necessary for nuclear medicine applications more efficiently by bombarding two targets at once, allowing the simultaneous production of two different radioisotopes, minimizing radiation exposure and saving operating costs.
Keywords: cyclotron, legislation, norms, production, radiopharmaceuticals
Procedia PDF Downloads 135
809 A Comparison of Clinical and Pathological TNM Staging in a COVID-19 Era
Authors: Sophie Mills, Leila L. Touil, Richard Sisson
Abstract:
Introduction: The TNM classification is the global standard for the staging of head and neck cancers. Accurate clinical-radiological staging of tumours (cTNM) is essential to predict prognosis, facilitate surgical planning and determine the need for other therapeutic modalities. This study aims to determine the accuracy of pre-operative cTNM staging using pathological TNM (pTNM) and consider possible causes of TNM stage migration, noting any variation throughout the COVID-19 pandemic. Materials and Methods: A retrospective cohort study examined records of patients with surgical management of head and neck cancer at a tertiary head and neck centre from November 2019 to November 2020. Data were extracted from the Somerset Cancer Registry and histopathology reports. cTNM and pTNM were compared before and during the first wave of COVID-19, as well as with other potential prognostic factors such as tumour site and tumour stage. Results: 119 cases were identified, of which 52.1% (n=62) were male and 47.9% (n=57) were female, with a mean age of 67 years. Clinical and pathological staging differed in 54.6% (n=65) of cases. Of the patients with stage migration, 40.4% (n=23) were up-staged and 59.6% (n=34) were down-staged compared with pTNM. There was no significant difference in the accuracy of cTNM staging compared with age, sex, or tumour site. There was a statistically highly significant (p < 0.001) correlation between cTNM accuracy and tumour stage, with the accuracy of cTNM staging decreasing with the advancement of pTNM staging. No statistically significant variation was noted between patients staged prior to and during COVID-19. Conclusions: Discrepancies in staging can impact management and outcomes for patients. This study found that the higher the pTNM, the more likely stage migration is to occur. These findings are concordant with the oncology literature, which highlights the need to improve the accuracy of cTNM staging for more advanced tumours.
Keywords: COVID-19, head and neck cancer, stage migration, TNM staging
Procedia PDF Downloads 109
808 Biaxial Fatigue Specimen Design and Testing Rig Development
Authors: Ahmed H. Elkholy
Abstract:
An elastic analysis is developed to obtain the distribution of stresses, strains, bending moment and deformation for a thin hollow, variable thickness cylindrical specimen when subjected to different biaxial loadings. The specimen was subjected to a combination of internal pressure, axial tensile loading and external pressure. Several axial to circumferential stress ratios were investigated in detail. The analytical model was then validated using experimental results obtained from a test rig using several biaxial loadings. Based on the preliminary results obtained, the specimen was then modified geometrically to ensure uniform strain distribution through its wall thickness and along its gauge length. The new design of the specimen has a higher buckling strength and a maximum value of equivalent stress according to the maximum distortion energy theory. A cyclic function generator of the standard servo-controlled, electro-hydraulic testing machine is used to generate a specific signal shape (sine, square,…) at a certain frequency. The two independent controllers of the electronic circuit cause an independent movement to each servo-valve piston. The movement of each piston pressurizes the upper and lower sides of the actuators alternately. So, the specimen will be subjected to axial and diametral loads independent of each other. The hydraulic system has two different pressures: one pressure will be responsible for axial stress produced in the specimen and the other will be responsible for the tangential stress. Changing the two pressure ratios will change the stress ratios accordingly. The only restriction on the maximum stress obtained is the capacity of the testing system and specimen instability due to buckling.
Keywords: biaxial, fatigue, stress, testing
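For orientation, the textbook thin-walled-cylinder relations below (assumed here; the paper's variable-thickness elastic analysis is more detailed) show how internal pressure p and axial force F set the axial-to-circumferential stress ratio and the distortion-energy (von Mises) equivalent stress:
```latex
\sigma_{\theta} = \frac{p r}{t}, \qquad
\sigma_{z} = \frac{p r}{2t} + \frac{F}{2\pi r t}, \qquad
\lambda = \frac{\sigma_{z}}{\sigma_{\theta}} = \frac{1}{2} + \frac{F}{2\pi r^{2} p}, \qquad
\sigma_{eq} = \sqrt{\sigma_{z}^{2} - \sigma_{z}\sigma_{\theta} + \sigma_{\theta}^{2}}
```
Here r is the mean radius and t the wall thickness; adjusting the axial and diametral pressures changes F and p, and hence the stress ratio λ.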
Procedia PDF Downloads 128
807 Scalable and Accurate Detection of Pathogens from Whole-Genome Shotgun Sequencing
Authors: Janos Juhasz, Sandor Pongor, Balazs Ligeti
Abstract:
Next-generation sequencing, especially whole genome shotgun sequencing, is becoming a common approach to gain insight into microbiomes in a culture-independent way, even in clinical practice. It not only gives us information about the species composition of an environmental sample but also opens the possibility to detect antimicrobial resistance and novel, or currently unknown, pathogens. Accurately and reliably detecting the microbial strains is a challenging task. Here we present a sensitive approach for detecting pathogens in metagenomics samples with special regard to detecting novel variants of known pathogens. We have developed a pipeline that uses fast, short read aligner programs (i.e., Bowtie2/BWA) and comprehensive nucleotide databases. Taxonomic binning is based on the lowest common ancestor (LCA) principle; each read is assigned to the taxon covering its most significantly hit taxa. This approach helps in balancing between sensitivity and running time. The program was tested both on experimental and synthetic data. The results indicate that our method performs as well as the state-of-the-art BLAST-based ones; furthermore, in some cases, it even proves to be better, while running two orders of magnitude faster. It is sensitive and capable of identifying taxa present only in small abundance. Moreover, it needs two orders of magnitude fewer reads to complete the identification than MetaPhlAn2 does. We analyzed an experimental anthrax dataset (B. anthracis strain BA104). The majority of the reads (96.50%) were classified as Bacillus anthracis, and a small portion, 1.2%, was classified as other species from the Bacillus genus. We demonstrate that the evaluation of high-throughput sequencing data is feasible in a reasonable time with good classification accuracy.
Keywords: metagenomics, taxonomy binning, pathogens, microbiome, B. anthracis
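The lowest-common-ancestor rule can be illustrated with a short sketch; it assumes each read's significant alignment hits have already been mapped to root-to-leaf lineages, and the taxa below are made up for the example rather than taken from the authors' pipeline or databases.
```python
def lca(lineages):
    """Return the deepest taxon shared by all root-to-leaf lineages."""
    if not lineages:
        return "unclassified"
    shared = []
    for ranks in zip(*lineages):        # walk the lineages level by level
        if len(set(ranks)) == 1:        # all significant hits agree at this level
            shared.append(ranks[0])
        else:
            break
    return shared[-1] if shared else "root"

# Example: one read hits two B. anthracis references and one B. cereus reference,
# so it is binned at the genus level rather than assigned to either species.
hits = [
    ["Bacteria", "Firmicutes", "Bacillus", "Bacillus anthracis"],
    ["Bacteria", "Firmicutes", "Bacillus", "Bacillus anthracis"],
    ["Bacteria", "Firmicutes", "Bacillus", "Bacillus cereus"],
]
print(lca(hits))   # -> Bacillus
```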
Procedia PDF Downloads 137
806 Normalizing Flow to Augmented Posterior: Conditional Density Estimation with Interpretable Dimension Reduction for High Dimensional Data
Authors: Cheng Zeng, George Michailidis, Hitoshi Iyatomi, Leo L. Duan
Abstract:
The conditional density characterizes the distribution of a response variable y given a predictor x and plays a key role in many statistical tasks, including classification and outlier detection. Although there has been abundant work on the problem of Conditional Density Estimation (CDE) for a low-dimensional response in the presence of a high-dimensional predictor, little work has been done for a high-dimensional response such as images. The promising performance of normalizing flow (NF) neural networks in unconditional density estimation acts as a motivating starting point. In this work, the authors extend NF neural networks to the case when an external x is present. Specifically, they use the NF to parameterize a one-to-one transform between a high-dimensional y and a latent z that comprises two components [zₚ, zₙ]. The zₚ component is a low-dimensional subvector obtained from the posterior distribution of an elementary predictive model for x, such as logistic/linear regression. The zₙ component is a high-dimensional independent Gaussian vector, which explains the variations in y not or less related to x. Unlike existing CDE methods, the proposed approach, coined Augmented Posterior CDE (AP-CDE), only requires a simple modification of the common normalizing flow framework while significantly improving the interpretation of the latent component, since zₚ represents a supervised dimension reduction. In image analytics applications, AP-CDE shows good separation of x-related variations due to factors such as lighting condition and subject ID from the other random variations. Further, the experiments show that an unconditional NF neural network, based on an unsupervised model of z such as a Gaussian mixture, fails to generate interpretable results.
Keywords: conditional density estimation, image generation, normalizing flow, supervised dimension reduction
Procedia PDF Downloads 96
805 An Analysis of Different Essential Components of Flight Plan Operations at Low Altitude
Authors: Apisit Nawapanpong, Natthapat Boonjerm
Abstract:
This project aims to analyze and identify the flight plan components of low-altitude aviation in Thailand and other countries. The development of UAV technology has driven innovation and revolution in the aviation industry; this includes the development of new modes of passenger and freight transportation, and it has also widely affected other industries. At present, this technology is being developed rapidly and is being tested all over the world to make the most efficient use of the technology or innovation, and it is likely to grow more extensively. However, no flight plan for low-altitude operation has been published by a government organization; compared with high-altitude aviation with manned aircraft, various unique factors differ, whether mission, operation, altitude range or airspace restrictions. In studying the essential components of low-altitude operation measures so that they are practical and tangible, major problems were encountered, so the main consideration of this project is to analyze the components of low-altitude operations conducted up to altitudes of 400 ft or 120 meters above ground level with reference to the terrain, for example, air traffic management, classification of aircraft, basic necessities and safety, and control areas. This research focuses on confirming the theory through qualitative and quantitative research combined with theoretical modeling and a regulatory framework, and by gaining insights from various positions in the aviation industry, including aviation experts, government officials, air traffic controllers, pilots, and airline operators, to identify the critical essential components of low-altitude flight operation. The project's analysis uses computer programs for science and statistics research to prove that the result is consistent with the theory; it will be beneficial for regulating the flight plan for low-altitude operation based on the essential components identified here and can be further developed for future studies and research in the aviation industry.
Keywords: low-altitude aviation, UAV technology, flight plan, air traffic management, safety measures
Procedia PDF Downloads 68
804 Fast Return Path Planning for Agricultural Autonomous Terrestrial Robot in a Known Field
Authors: Carlo Cernicchiaro, Pedro D. Gaspar, Martim L. Aguiar
Abstract:
The agricultural sector is becoming more critical than ever in view of the expected overpopulation of the Earth. The introduction of robotic solutions in this field is an increasingly researched topic aimed at making the most of the Earth's resources, avoiding the wear and tear on the human body caused by harsh agricultural work, and opening the possibility of constant, careful processing 24 hours a day. This project concerns a terrestrial autonomous robot designed to navigate an orchard collecting fallen peaches below the trees. When it receives the signal indicating a low battery, it has to return to the docking station, where it replaces its battery, and then return to the last work point and resume its routine. A preset path through orchard tree rows of variable length is considered, which the robot follows iteratively using the D* algorithm. In case of low battery, the D* algorithm is also used to determine the fastest return path to the docking station as well as the path back from the docking station to the last work point. MATLAB simulations were performed to analyze the flexibility and adaptability of the developed algorithm. The simulation results show an enormous potential for adaptability, particularly in view of the irregularity of the orchard field, which is not flat and changes over time due to fallen branches as well as other obstacles and constraints. The D* algorithm determines the best route in spite of the irregularity of the terrain. Moreover, this work presents a possible solution to improve initial point tracking and reduce the time between movements.
Keywords: path planning, fastest return path, agricultural autonomous terrestrial robot, docking station
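The fastest-return-path idea can be sketched on a 2D occupancy grid; the sketch below uses plain Dijkstra as a stand-in for the D* algorithm described in the abstract (D* adds incremental replanning when the field changes), and the grid, obstacle layout, and docking-station position are invented for the example.
```python
import heapq

def shortest_path(grid, start, goal):
    """Dijkstra on a 4-connected grid with unit move cost."""
    rows, cols = len(grid), len(grid[0])
    dist, prev, heap = {start: 0.0}, {}, [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1.0
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)], prev[(nr, nc)] = nd, node
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:              # rebuild the route from goal back to start
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]

# 0 = free cell, 1 = tree/obstacle; the robot returns from its last work point
# to the docking station and later retraces the route back.
orchard = [[0, 0, 0, 0],
           [0, 1, 1, 0],
           [0, 0, 0, 0]]
last_work_point, docking_station = (2, 3), (0, 0)
print(shortest_path(orchard, last_work_point, docking_station))
```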
Procedia PDF Downloads 134
803 Artificial Intelligence Based Online Monitoring System for Cardiac Patient
Authors: Syed Qasim Gilani, Muhammad Umair, Muhammad Noman, Syed Bilawal Shah, Aqib Abbasi, Muhammad Waheed
Abstract:
Cardiovascular diseases (CVDs) are the major cause of death in the world. The main reason for these deaths is the unavailability of first aid for heart failure; in many cases, patients die before reaching the hospital. In this paper, we present an innovative online health service for cardiac patients. The proposed online health system has two ends. Through a device developed by us, users can communicate with their doctor through a mobile application. This interface provides them with first aid and also gives them an easy way to obtain medical advice from their doctors. As part of the proposed system, we developed a device called Cardiac Care, a portable device which patients can use at home to monitor their heart condition. When a patient checks his/her heart condition, the electrocardiogram (ECG), blood pressure (BP), and temperature are sent to the central database. The severity of the patient's condition is assessed at the database using an artificial intelligence algorithm. If the patient is suffering from a minor problem, the algorithm suggests a prescription; if the patient's condition is severe, the patient's record is sent to the doctor through the mobile Android application. After reviewing the patient's condition, the doctor suggests the next step. If the doctor identifies the patient's condition as critical, a message is sent to the central database to dispatch an ambulance, which brings the patient to the hospital. We have implemented this model at the prototype level. This model can be life-saving for millions of people around the globe, since, according to the proposed model, patients are in contact with their doctors all the time.
Keywords: cardiovascular disease, classification, electrocardiogram, blood pressure
Procedia PDF Downloads 184
802 Beneficial Effect of Chromium Supplementation on Glucose, HbA1C and Lipid Variables in Individuals with Newly Onset Type-2 Diabetes
Authors: Baljinder Singh, Navneet Sharma
Abstract:
Chromium is an essential nutrient involved in normal carbohydrate and lipid metabolism. It influences glucose metabolism by potentiating insulin action, taking part in the insulin signal amplification mechanism. A placebo-controlled, single-blind, prospective study was carried out to investigate the effect of chromium supplementation on blood glucose, HbA1C and lipid profile in patients with newly onset type-2 diabetes. A total of 40 patients with newly onset type-2 diabetes were selected and, after one month of stabilization, randomly divided into two groups: a study group and a placebo group. The study group received 9 g of brewer's yeast (42 μg Cr) daily and the placebo group received yeast devoid of chromium for 3 months. Subjects were instructed not to change their normal eating and living habits. Fasting blood glucose, HbA1C and lipid profile were analyzed at the beginning and completion of the study. Results revealed that the fasting blood glucose level was significantly reduced in the subjects consuming yeast supplemented with chromium (197.65±6.68 to 103.68±6.64 mg/dl; p<0.001). HbA1C values improved significantly from 9.51±0.26% to 6.86±0.28% (p<0.001), indicating better glycaemic control. In the experimental group, total cholesterol, TG and LDL levels were also significantly reduced, from 199.66±3.11 to 189.26±3.01 mg/dl (p<0.02), 144.94±8.31 to 126.01±8.26 (p<0.05) and 119.19±1.71 to 99.58±1.10 (p<0.001), respectively. These data demonstrate a beneficial effect of chromium supplementation on glycaemic control and lipid variables in subjects with newly onset type-2 diabetes.
Keywords: type-2 diabetes, chromium, glucose, HbA1C
Procedia PDF Downloads 242
801 AI-Powered Models for Real-Time Fraud Detection in Financial Transactions to Improve Financial Security
Authors: Shanshan Zhu, Mohammad Nasim
Abstract:
Financial fraud continues to be a major threat to financial institutions across the world, causing colossal monetary losses and undermining public trust. Fraud prevention techniques based on hard rules have become ineffective due to evolving patterns of fraud in recent times. Against such a background, the present study probes into distinct methodologies that exploit emergent AI-driven techniques to further strengthen fraud detection. We compare the performance of generative adversarial networks and graph neural networks with other popular techniques, like gradient boosting, random forests, and neural networks, and recommend integrating all these state-of-the-art models into one robust, flexible, and smart system for real-time anomaly and fraud detection. To overcome the challenge, we designed synthetic data and then conducted pattern recognition and unsupervised and supervised learning analyses on the transaction data to identify which activities were suspicious. With the use of actual financial statistics, we compare the performance of our model in accuracy, speed, and adaptability against conventional models. The results of this study illustrate a strong need to integrate state-of-the-art, AI-driven fraud detection solutions into frameworks that are highly relevant to the financial domain, and they highlight the urgency with which banks and related financial institutions must implement these advanced technologies to continue to provide a high level of security.
Keywords: AI-driven fraud detection, financial security, machine learning, anomaly detection, real-time fraud detection
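The model-comparison step on synthetic transaction data can be sketched with scikit-learn; the feature set, class imbalance, and the two models shown (gradient boosting vs. random forest) are illustrative choices, not the authors' exact setup, and the GAN/GNN components are omitted.
```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic, highly imbalanced "transactions": roughly 1% fraud.
X, y = make_classification(n_samples=20000, n_features=20,
                           weights=[0.99, 0.01], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, model in [("gradient boosting", GradientBoostingClassifier()),
                    ("random forest", RandomForestClassifier(n_estimators=200))]:
    model.fit(X_tr, y_tr)
    score = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: ROC AUC = {score:.3f}")
```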
Procedia PDF Downloads 42
800 Electrochemical Behavior of Cocaine on Carbon Paste Electrode Chemically Modified with Cu(II) Trans 3-MeO Salcn Complex
Authors: Alex Soares Castro, Matheus Manoel Teles de Menezes, Larissa Silva de Azevedo, Ana Carolina Caleffi Patelli, Osmair Vital de Oliveira, Aline Thais Bruni, Marcelo Firmino de Oliveira
Abstract:
Considering the problem of the seizure of illicit drugs, as well as the development of electrochemical sensors using chemically modified electrodes, this work studies the electrochemical activity of cocaine on a carbon paste electrode chemically modified with the Cu(II) trans 3-MeO salcn complex. In this context, cyclic voltammetry was performed in a 0.1 mol L⁻¹ KCl supporting electrolyte at a scan rate of 100 mV s⁻¹, using an electrochemical cell composed of three electrodes: an Ag/AgCl electrode (filled with 3 mol L⁻¹ KCl) from Metrohm® as the reference electrode, a platinum spiral electrode as the auxiliary electrode, and a carbon paste electrode chemically modified with the Cu(II) trans 3-MeO complex as the working electrode. Two forms of cocaine were analyzed: cocaine hydrochloride (pH 3) and the cocaine free base form (pH 8). The PM7 computational method predicted that the hydrochloride form is more stable than the free base form of cocaine, and accordingly, with cyclic voltammetry, we found an electrochemical signal only for cocaine in the hydrochloride form, with an anodic peak at 1.10 V, a linearity range between 2 and 20 μmol L⁻¹, and LD and LQ of 2.39 and 7.26x10⁻⁵ mol L⁻¹, respectively. The study also showed that cocaine is adsorbed on the surface of the working electrode: in an irreversible process in which only anodic peaks are observed, cocaine is oxidized in the hydrophilic region through the loss of two electrons. The mechanism of this reaction was confirmed by the ab-initio quantum method.
Keywords: ab-initio computational method, analytical method, cocaine, Schiff base complex, voltammetry
Procedia PDF Downloads 194
799 Facilitating Written Biology Assessment in Large-Enrollment Courses Using Machine Learning
Authors: Luanna B. Prevost, Kelli Carter, Margaurete Romero, Kirsti Martinez
Abstract:
Writing is an essential scientific practice, yet, in several countries, increasing university science class sizes limit the use of written assessments. Written assessments allow students to demonstrate their learning in their own words and permit faculty to evaluate students' understanding. However, the time and resources required to grade written assessments prohibit their use in large-enrollment science courses. This study examined the use of machine learning algorithms to automatically analyze student writing and provide timely feedback to faculty about students' writing in biology. Written responses to questions about matter and energy transformation were collected from large-enrollment undergraduate introductory biology classrooms. Responses were analyzed using the LightSide text mining and classification software. Cohen's Kappa was used to measure agreement between the LightSide models and human raters. Predictive models achieved agreement with human coding of 0.7 Cohen's Kappa or greater. The models captured that, when writing about matter-energy transformation at the ecosystem level, students focused primarily on the concepts of heat loss, recycling of matter, and conservation of matter and energy. Models were also produced to capture writing about processes such as decomposition and biochemical cycling. The models created in this study can be used to provide automatic feedback about students' understanding of these concepts to biology faculty who wish to use formative written assessments in larger-enrollment biology classes but do not have the time or personnel for manual grading.
Keywords: machine learning, written assessment, biology education, text mining
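The agreement check between an automated coder and human raters can be sketched with Cohen's kappa in scikit-learn; the codes below are invented stand-ins for categories such as "heat loss" or "matter recycling", not the study's coding scheme.
```python
from sklearn.metrics import cohen_kappa_score

human_codes = ["heat_loss", "recycling", "heat_loss", "conservation", "recycling",
               "heat_loss", "conservation", "recycling", "heat_loss", "recycling"]
model_codes = ["heat_loss", "recycling", "heat_loss", "recycling", "recycling",
               "heat_loss", "conservation", "recycling", "conservation", "recycling"]

kappa = cohen_kappa_score(human_codes, model_codes)
print(f"Cohen's kappa = {kappa:.2f}")   # the study required 0.7 or greater
```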
Procedia PDF Downloads 281
798 Chaotic Sequence Noise Reduction and Chaotic Recognition Rate Improvement Based on Improved Local Geometric Projection
Authors: Rubin Dan, Xingcai Wang, Ziyang Chen
Abstract:
A chaotic time series noise reduction method based on the fusion of the local projection method, the wavelet transform, and the particle swarm algorithm (referred to as the LW-PSO method) is proposed to address the problem of false recognition caused by noise when identifying chaotic time series. The method first uses phase space reconstruction to recover the original dynamical system characteristics and removes the noise subspace by selecting the neighborhood radius; it then uses the wavelet transform to remove the D1-D3 high-frequency components to maximize the retention of signal information, while least-squares optimization is performed with the particle swarm algorithm. The Lorenz system containing 30% Gaussian white noise is simulated and verified, and the phase space, SNR value, RMSE value, and K value of the 0-1 test method before and after noise reduction are compared and analyzed for the Schreiber method, the local projection method, the wavelet transform method, and the LW-PSO method, which proves that the LW-PSO method has a better noise reduction effect than the other three common methods. The method is also applied to a classical system to evaluate the noise reduction effect of the four methods and the original system identification effect, which further verifies the superiority of the LW-PSO method. Finally, it is applied to the Chengdu rainfall chaotic sequence, and the results prove that the LW-PSO method can effectively reduce the noise and improve the chaos recognition rate.
Keywords: Schreiber noise reduction, wavelet transform, particle swarm optimization, 0-1 test method, chaotic sequence denoising
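The wavelet step alone (removing the finest detail levels D1-D3 from a noisy Lorenz trace) can be sketched with PyWavelets; the Euler integration, wavelet family, and noise level are illustrative choices, and the local-projection and PSO stages of the LW-PSO method are not reproduced here.
```python
import numpy as np
import pywt

# Generate the x-component of the Lorenz system with a simple Euler integration.
dt, n = 0.01, 4096
x, y, z = 1.0, 1.0, 1.0
signal = np.empty(n)
for i in range(n):
    dx = 10.0 * (y - x)
    dy = x * (28.0 - z) - y
    dz = x * y - (8.0 / 3.0) * z
    x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
    signal[i] = x

# Add roughly 30% Gaussian white noise relative to the signal's standard deviation.
noisy = signal + 0.3 * np.std(signal) * np.random.randn(n)

# Multilevel decomposition; zero the three finest detail levels (D1-D3) and reconstruct.
coeffs = pywt.wavedec(noisy, "db4", level=5)
for level in (-1, -2, -3):          # cD1, cD2, cD3 are the last three coefficient arrays
    coeffs[level] = np.zeros_like(coeffs[level])
denoised = pywt.waverec(coeffs, "db4")[:n]

rmse = np.sqrt(np.mean((denoised - signal) ** 2))
print(f"RMSE after removing D1-D3: {rmse:.3f}")
```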
Procedia PDF Downloads 199
797 Close-Range Remote Sensing Techniques for Analyzing Rock Discontinuity Properties
Authors: Sina Fatolahzadeh, Sergio A. Sepúlveda
Abstract:
This paper presents advanced developments in close-range, terrestrial remote sensing techniques to enhance the characterization of rock masses. The study integrates two state-of-the-art laser-scanning technologies, the HandySCAN and GeoSLAM laser scanners, to extract high-resolution geospatial data for rock mass analysis. These instruments offer high accuracy, precision, low acquisition time, and high efficiency in capturing intricate geological features in small to medium-sized outcrops and slope cuts. Using the HandySCAN and GeoSLAM laser scanners facilitates real-time, three-dimensional mapping of rock surfaces, enabling comprehensive assessments of rock mass characteristics. The collected data provide valuable insights into structural complexities, surface roughness, and discontinuity patterns, which are essential for geological and geotechnical analyses. The synergy of these advanced remote sensing technologies contributes to a more precise and straightforward understanding of rock mass behavior. In this case, the main parameters of RQD, joint spacing, persistence, aperture, roughness, infill, weathering, water condition, and joint orientation in a slope cut along the Sea-to-Sky Highway, BC, were remotely analyzed to calculate and evaluate the Rock Mass Rating (RMR) and Geological Strength Index (GSI) classification systems. Automatic and manual analyses of the acquired data are then compared with field measurements. The results show the usefulness of the proposed remote sensing methods and their good agreement with the actual field data.
Keywords: remote sensing, rock mechanics, rock engineering, slope stability, discontinuity properties
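As a rough sketch of how the remotely measured parameters feed the RMR score, the snippet below sums placeholder ratings (not the values obtained for the Sea-to-Sky Highway slope cut) and applies the common approximation GSI ≈ RMR89 − 5 computed with a dry, orientation-neutral RMR; both the ratings and the GSI relation are assumptions for illustration only.
```python
# Placeholder RMR89 component ratings derived from the scan-based measurements.
ratings = {
    "intact_rock_strength": 7,      # from UCS estimate
    "RQD": 13,                      # from discontinuity spacing statistics
    "discontinuity_spacing": 10,
    "discontinuity_condition": 20,  # persistence, aperture, roughness, infill, weathering
    "groundwater": 10,              # water condition
}
orientation_adjustment = -5         # joint orientation relative to the slope face

rmr = sum(ratings.values()) + orientation_adjustment
# GSI approximation: replace the groundwater rating by 15 and drop the orientation term.
gsi_estimate = (sum(ratings.values()) - ratings["groundwater"] + 15) - 5
print(f"RMR89 = {rmr}, approximate GSI = {gsi_estimate}")
```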
Procedia PDF Downloads 66
796 Electromagnetic Fields Characterization of an Urban Area in Lagos De Moreno Mexico and Its Correlation with Public Health Hazards
Authors: Marco Vinicio Félix Lerma, Efrain Rubio Rosas, Fernando Ricardez Rueda, Victor Manuel Castaño Meneses
Abstract:
This paper reports a spectral analysis of the exposure levels of radiofrequency electromagnetic fields originating from a wide variety of telecommunications sources present in an urban area of Lagos de Moreno, Jalisco, Mexico. The electromagnetic characterization of the urban zone under study was carried out by measurements at 118 sites. Measurements of TETRA, ISM434, LTE800, ISM868, GSM900, GSM1800, 3G UMTS, 4G UMTS, Wlan2.4, LTE2.6, DECT, VHF television, and FM radio signals were performed at distances ranging from 10 to 1000 m from 87 broadcasting towers concentrated in an urban area of about 3 hectares. The aim of these measurements is the evaluation of the electromagnetic field power levels generated by communication systems, given their interaction with the human body. We found that in certain regions the general public exposure limits determined by the ICNIRP (International Commission on Non-Ionizing Radiation Protection) are exceeded by 5% up to 61% of the upper values, indicating an imminent public health hazard, whereas in other regions these limits are not exceeded. This work proposes an electromagnetic pollution classification for urban zones in accordance with ICNIRP standards. We conclude that the urban zone under study presents diverse levels of pollution and that in certain regions an electromagnetic shielding solution is needed in order to safeguard the health of the population that lives there. A practical solution in the form of paint coatings and fiber curtains for the buildings present in this zone is also proposed.
Keywords: electromagnetic field, telecommunication systems, electropollution, health hazards
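The per-band comparison against general-public reference levels can be sketched as a simple quotient check; both the measured field strengths and the band reference levels below are placeholder values for illustration, not the study's data or the exact ICNIRP tables.
```python
# Assumed measured E-fields (V/m) and assumed per-band reference levels (V/m).
measurements_v_per_m = {"FM radio": 1.2, "GSM900": 12.5, "LTE2600": 66.0}
reference_levels_v_per_m = {"FM radio": 28.0, "GSM900": 42.0, "LTE2600": 61.0}

for band, e_field in measurements_v_per_m.items():
    limit = reference_levels_v_per_m[band]
    ratio = 100.0 * e_field / limit
    status = "EXCEEDS the limit" if e_field > limit else "within the limit"
    print(f"{band}: {ratio:.0f}% of the reference level ({status})")
```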
Procedia PDF Downloads 393
795 Hands-off Parking: Deep Learning Gesture-based System for Individuals with Mobility Needs
Authors: Javier Romera, Alberto Justo, Ignacio Fidalgo, Joshue Perez, Javier Araluce
Abstract:
Nowadays, individuals with mobility needs face a significant challenge when docking vehicles. In many cases, after parking, they encounter insufficient space to exit, leading to two undesired outcomes: either avoiding parking in that spot or settling for improperly placed vehicles. To address this issue, the following paper presents a parking control system employing gestural teleoperation. The system comprises three main phases: capturing body markers, interpreting gestures, and transmitting orders to the vehicle. The initial phase is centered around the MediaPipe framework, a versatile tool optimized for real-time gesture recognition. MediaPipe excels at detecting and tracing body markers, with a special emphasis on hand gestures. Hand detection is done by generating 21 reference points for each hand. Subsequently, after data capture, the project employs the MultiPerceptron Layer (MPL) for in-depth gesture classification. This tandem of MediaPipe's extraction prowess and the MPL's analytical capability ensures that human gestures are translated into actionable commands with high precision. Furthermore, the system has been trained and validated on a built-in dataset. To prove the domain adaptation, a framework based on the Robot Operating System (ROS) as the communication backbone, alongside the CARLA simulator, is used. Following successful simulations, the system is transitioned to a real-world platform, marking a significant milestone in the project. This real-vehicle implementation verifies the practicality and efficiency of the system beyond theoretical constructs.
Keywords: gesture detection, mediapipe, multiperceptron layer, robot operating system
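A condensed sketch of the two learning-related phases (extracting the 21 hand landmarks with MediaPipe and classifying the flattened coordinates with a multilayer perceptron) is given below; the file names, gesture labels, and network sizes are assumed for illustration, and the real system works on live video and publishes commands over ROS rather than on still images.
```python
import cv2
import mediapipe as mp
import numpy as np
from sklearn.neural_network import MLPClassifier

def hand_features(image_path):
    """Return a 63-element vector (21 landmarks x 3 coordinates) or None if no hand is found."""
    image = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2RGB)
    with mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        result = hands.process(image)
    if not result.multi_hand_landmarks:
        return None
    points = result.multi_hand_landmarks[0].landmark
    return np.array([[p.x, p.y, p.z] for p in points]).ravel()

# Assumed small labelled set of gesture images; paths and labels are placeholders.
paths = ["gesture_forward.jpg", "gesture_stop.jpg", "gesture_left.jpg"]
labels = ["forward", "stop", "left"]
X, y = [], []
for path, label in zip(paths, labels):
    feats = hand_features(path)
    if feats is not None:
        X.append(feats)
        y.append(label)

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=2000).fit(np.array(X), y)
print(clf.predict(np.array(X[:1])))   # classify the first captured gesture again
```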
Procedia PDF Downloads 100
794 Quantum Engine Proposal using Two-level Atom Like Manipulation and Relativistic Motoring Control
Authors: Montree Bunruangses, Sonath Bhattacharyya, Somchat Sonasang, Preecha Yupapin
Abstract:
A two-level system is manipulated by a microstrip add-drop circuit configured as an atom-like system for wave-particle behavior investigation when its traveling speed along the circuit perimeter is the speed of light. The entangled pair formed by the upper and lower sideband peaks is bound by the angular displacement, which is given by 0≤θ≤π/2. The control signals associated with the 3-peak signal frequencies are applied via the external inputs of the microstrip add-drop multiplexer ports, where they are time functions without the space term involved. When the system satisfies the speed-of-light condition, the mass term is converted to energy according to the relativistic limit described by the Lorentz factor and the Einstein equation. The different applied frequencies can be utilized to form the 3-phase torques that can be applied for quantum engines. The experiment will use the two-level system circuit and will be conducted in the laboratory. The 3-phase torques will be recorded and investigated for quantum engine driving purposes. The obtained results will be compared to the simulation. The optimum amplification of torque can be obtained by the resonant successive filtering operation. Torque vanishes when the system is balanced at the stopped position, where |Time| = 0, which is required as a system stability condition. This will be discussed for future applications. A larger device may be tested in the future for realistic use. Synchronous and asynchronous driven motors are also discussed for warp drive use.
Keywords: quantum engine, relativistic motor, 3-phase torque, atomic engine
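For reference, the relativistic limit invoked above is the standard Lorentz-factor relation, quoted here in its textbook form rather than taken from the paper:
```latex
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}, \qquad E = \gamma m c^{2}
```
As the traveling speed v approaches the speed of light c, γ diverges and the energy term dominates the rest-mass contribution mc², which is the sense in which the mass term is "changed to energy".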
Procedia PDF Downloads 63
793 Modelling the Effect of Biomass Appropriation for Human Use on Global Biodiversity
Authors: Karina Reiter, Stefan Dullinger, Christoph Plutzar, Dietmar Moser
Abstract:
Due to population growth and changing patterns of production and consumption, the demand for natural resources and, as a result, the pressure on Earth's ecosystems are growing. Biodiversity mapping can be a useful tool for assessing species endangerment or detecting hotspots of extinction risks. This paper explores the benefits of using, in biodiversity mapping, the change in trophic energy flows that results from the human alteration of the biosphere. To this end, multiple linear regression models were developed to explain species richness in areas where there is no human influence (i.e. wilderness) for three taxonomic groups (birds, mammals, amphibians). The models were then applied to predict (I) potential global species richness using the NPP of potential natural vegetation (NPPpot) and (II) global 'actual' species richness after biomass appropriation using the NPP remaining in ecosystems after harvest (NPPeco). By calculating the difference between predicted potential and predicted actual species numbers, maps of estimated species richness loss were generated. Results show that biomass appropriation for human use can indeed be linked to biodiversity loss. Areas for which the models predicted high species loss coincide with areas where species endangerment and extinctions are recorded to be particularly high by the International Union for Conservation of Nature and Natural Resources (IUCN). Furthermore, the analysis revealed that while the species distribution maps of the IUCN Red List of Threatened Species used for this research can determine hotspots of biodiversity loss in large parts of the world, the classification system for threatened and extinct species needs to be revised to better reflect local risks of extinction.
Keywords: biodiversity loss, biomass harvest, human appropriation of net primary production, species richness
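The mapping workflow (fit richness against NPP in wilderness cells, then predict richness under NPPpot and NPPeco and take the difference as estimated loss) can be sketched as follows; all numbers are synthetic placeholders, and the real models use several predictors per taxonomic group, not NPP alone.
```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
npp_wilderness = rng.uniform(100, 1200, size=(500, 1))                 # gC m^-2 yr^-1, synthetic
richness = 5 + 0.05 * npp_wilderness[:, 0] + rng.normal(0, 3, 500)     # bird richness, synthetic

model = LinearRegression().fit(npp_wilderness, richness)               # wilderness-only model

npp_pot = np.array([[900.0]])   # potential NPP of one grid cell
npp_eco = np.array([[400.0]])   # NPP remaining after harvest in the same cell
loss = model.predict(npp_pot)[0] - model.predict(npp_eco)[0]
print(f"estimated species richness loss in the cell: {loss:.1f}")
```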
Procedia PDF Downloads 130
792 Analysing “The Direction of Artificial Intelligence Legislation from a Global Perspective” from the Perspective of “AIGC Copyright Protection” Content
Authors: Xiaochen Mu
Abstract:
Due to the diversity of stakeholders and the ambiguity of ownership boundaries, the current protection models for Artificial Intelligence Generated Content (AIGC) have many disadvantages. In response to this situation, there are three different protection models worldwide. The United States Copyright Office stipulates that works autonomously generated by artificial intelligence ‘lack’ the element of human creation, and non-human AI cannot create works. To protect and promote investment in the field of artificial intelligence, UK legislation, through Section 9(3) of the CDPA, designates the author of AI-generated works as ‘the person by whom the arrangements necessary for the creation of the work are undertaken.’ China neither simply excludes the work attributes of AI-generated content based on the lack of a natural person subject as the sole reason, nor does it generalize that AIGC should or should not be protected. Instead, it combines specific case circumstances and comprehensively evaluates the degree of originality of AIGC and the contributions of natural persons to AIGC. In China's first AI drawing case, the court determined that the image in question was the result of the plaintiff's design and selection through inputting prompt words and setting parameters, reflecting the plaintiff's intellectual investment and personalized expression, and should be recognized as a work in the sense of copyright law. Despite opposition, the ruling also established the feasibility of the AIGC copyright protection path. The recognition of the work attributes of AIGC will not lead to overprotection that hinders the overall development of the AI industry. Just as with the legislation and regulation of AI by various countries, there is a need for a balance between protection and development. For example, the provisional agreement reached on the EU AI Act, based on a risk classification approach, seeks a dynamic balance between copyright protection and the development of the AI industry.
Keywords: generative artificial intelligence, originality, works, copyright
Procedia PDF Downloads 42
791 A Knowledge-Based Development of Risk Management Approaches for Construction Projects
Authors: Masoud Ghahvechi Pour
Abstract:
Risk management is a systematic and regular process of identifying, analyzing and responding to risks throughout the project's life cycle in order to achieve the optimal level of elimination, reduction or control of risk. The purpose of project risk management is to increase the probability and effect of positive events and reduce the probability and effect of unpleasant events on the project. Risk management is one of the most fundamental parts of project management, so much so that unmanaged or untransmitted risks can be one of the primary factors of failure in a project. Effective risk management does not mean simply retreating from risk, which is apparently the cheapest option. The main problem with this option is economic sensitivity, because what is potentially profitable is by definition risky, and what does not pose a risk is not economically interesting and does not bring tangible benefits. Therefore, in relation to the implemented project, effective risk management means finding a 'middle ground': on the one hand, protection against risk from a negative direction by means of accurate identification and classification of risk, which leads to a comprehensive analysis; on the other hand, management using all mathematical and analytical tools, based on verifying the maximum benefit of these decisions. Detailed analysis, taking into account all aspects of the company, including stakeholder analysis, will allow us to add to effective risk management what will become tangible benefits for our project in the future. Identifying project risk is based on the theory of which types of risk may affect the project, and also refers to specific parameters and to estimating the probability of their occurrence in the project. These conditions can be divided into three groups: certainty, uncertainty, and risk, which in turn support three types of investment attitude: risk preference, risk neutrality, and specific risk deviation, and its measurement. The result of risk identification and project analysis is a list of events that indicates the cause and probability of an event, and a final assessment of its impact on the environment.
Keywords: risk, management, knowledge, risk management
Procedia PDF Downloads 66
790 Effect of Humor on Pain and Anxiety in Patients with Rheumatoid Arthritis: A Prospective, Randomized Controlled Study
Authors: Burcu Babadağ Savaş, Nihal Orlu, Güler Balcı Alparslan, Ertuğrul Çolak, Cengiz Korkmaz
Abstract:
Introduction/objectives: We aimed to investigate the effect of humor on pain and state anxiety in patients with rheumatoid arthritis (RA) receiving biologic intravenous (IV) infusion therapy. Method: The study sample consisted of 36 patients who met the classification criteria for RA and the inclusion criteria in a rheumatology outpatient clinic at a university hospital between September 2020 and November 2021. Two sample groups were formed: the intervention group (watching a comedy movie) (n=18) and the control group (n=18). In the intervention group, each patient watched a comedy movie of his/her choice from an archive created by the researchers during the biologic IV infusion therapy (approximately 90-120 minutes). The data collection instruments used before and after the test were a descriptive identification form, the visual analog scale (VAS), and the state anxiety scale. Results: The mean VAS scores of patients in the intervention group were 5.05 ± 2.01 in the pre-test and 2.61 ± 1.91 in the post-test. The mean state anxiety scores of patients in the intervention group were 45.94 ± 9.97 in the pre-test and 34.22 ± 6.57 in the post-test. Thus, patients who watched comedy movies during biologic IV infusion therapy in the infusion center had a greater reduction in pain scores than the control group, and the effect size was small. Although there was a decrease in state anxiety scores in both groups, there was no significant difference between groups, and the effect size was not relevant. Conclusions: During IV infusion therapy, watching comedy movies is recommended as a nursing care intervention for reducing pain in patients with RA, in cooperation with other health professionals.
Keywords: watching comedy movie, humor, pain, anxiety, nursing, care
Procedia PDF Downloads 139
789 Functional Feeding Groups and Trophic Levels of Benthic Macroinvertebrates Assemblages in Albertine Rift Rivers and Streams in South Western Uganda
Authors: Peace Liz Sasha Musonge
Abstract:
Behavioral aspects of species nutrition, such as feeding methods and food type, are archetypal biological traits signifying how species have adapted to their environment. The concept of functional feeding group (FFG) analysis is currently used to ascertain the trophic levels of the aquatic food web in a specific microhabitat. However, in Eastern Africa, information about the FFG classification of benthic macroinvertebrates in highland rivers and streams is almost absent, and existing studies have fragmented datasets. For this reason, we carried out a robust study to determine the feeding type, trophic level, and FFGs of 56 macroinvertebrate taxa (identified to family level) from Albertine rift valley streams. Our findings showed that all five major functional feeding groups were represented: gatherer-collectors (GC), predators (PR), shredders (SH), scrapers (SC), and filterer-collectors. The most dominant functional feeding group was the gatherer-collectors (GC), which accounted for 53.5% of the total population. The most abundant GC families were Baetidae (7813 individuals), Chironomidae NTP (5628) and Caenidae (1848). The majority of the macroinvertebrate population feeds on fine particulate organic matter (FPOM) from the stream bottom. In terms of taxa richness, the predators (PR) had the highest value of 24 taxa, and the filterer-collectors group had the smallest number of taxa (3). The families with the highest numbers of predator individuals were Corixidae (1024 individuals), Coenagrionidae (445) and Libellulidae (283). However, predators accounted for only 7.4% of the population. The findings highlight the functional feeding groups and habitat type of macroinvertebrate communities along an altitudinal gradient.
Keywords: trophic levels, functional feeding groups, macroinvertebrates, Albertine rift
Procedia PDF Downloads 235
788 Evolutionary Swarm Robotics: Dynamic Subgoal-Based Path Formation and Task Allocation for Exploration and Navigation in Unknown Environments
Authors: Lavanya Ratnabala, Robinroy Peter, E. Y. A. Charles
Abstract:
This research paper addresses the challenges of exploration and navigation in unknown environments from an evolutionary swarm robotics perspective. Path formation plays a crucial role in enabling cooperative swarm robots to accomplish these tasks. The paper presents a method called the sub-goal-based path formation, which establishes a path between two different locations by exploiting visually connected sub-goals. Simulation experiments conducted in the Argos simulator demonstrate the successful formation of paths in the majority of trials. Furthermore, the paper tackles the problem of inter-collision (traffic) among a large number of robots engaged in path formation, which negatively impacts the performance of the sub-goal-based method. To mitigate this issue, a task allocation strategy is proposed, leveraging local communication protocols and light signal-based communication. The strategy evaluates the distance between points and determines the required number of robots for the path formation task, reducing unwanted exploration and traffic congestion. The performance of the sub-goal-based path formation and task allocation strategy is evaluated by comparing path length, time, and resource reduction against the A* algorithm. The simulation experiments demonstrate promising results, showcasing the scalability, robustness, and fault tolerance characteristics of the proposed approach.
Keywords: swarm, path formation, task allocation, Argos, exploration, navigation, sub-goal
Procedia PDF Downloads 42
787 Advances of Image Processing in Precision Agriculture: Using Deep Learning Convolution Neural Network for Soil Nutrient Classification
Authors: Halimatu S. Abdullahi, Ray E. Sheriff, Fatima Mahieddine
Abstract:
Agriculture is essential to the continuous existence of human life, as humans directly depend on it for food production. The exponential rise in population calls for a rapid increase in food, with the application of technology to reduce laborious work and maximize production. Technology can aid agriculture in several ways, from pre-planning to post-harvest, by using computer vision and image processing to determine soil nutrient composition; to apply farm inputs such as fertilizers, herbicides, and water in the right amount, at the right time, and in the right place; and to detect weeds and provide early detection of pests and diseases. This is precision agriculture, which is thought to be the solution required to achieve these goals. There has been significant improvement in image processing and data processing, which had been a major challenge. A database of images is collected through remote sensing and analyzed, and a model is developed to determine the right treatment plans for different crop types and different regions. Features of vegetation images need to be extracted, classified, segmented and finally fed into the model. Different techniques have been applied to these processes, from neural networks, support vector machines and fuzzy logic approaches to, most recently, the deep learning approach of convolutional neural networks, which has generated excellent results for image classification. A deep convolutional neural network is used to determine the soil nutrients required in a plantation for maximum production. The developed model yielded results with an average accuracy of 99.58%.
Keywords: convolution, feature extraction, image analysis, validation, precision agriculture
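A compact convolutional network for image-based classification can be sketched with TensorFlow/Keras as below; the input size, number of nutrient classes, layer widths, and the random placeholder data are illustrative assumptions, not the architecture or dataset reported in the paper.
```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

num_classes = 4                      # e.g. three nutrient-deficiency classes plus healthy
model = models.Sequential([
    layers.Input(shape=(64, 64, 3)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Placeholder arrays standing in for labelled field images.
X = np.random.rand(32, 64, 64, 3).astype("float32")
y = np.random.randint(0, num_classes, size=32)
model.fit(X, y, epochs=1, verbose=0)
print(model.evaluate(X, y, verbose=0))
```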
Procedia PDF Downloads 316
786 Iron Response Element-mRNA Binding to Iron Response Protein: Metal Ion Sensing
Authors: Mateen A. Khan, Elizabeth J. Theil, Dixie J. Goss
Abstract:
Cellular iron homeostasis is accomplished by the coordinated, regulated expression of iron uptake, storage, and export. Iron regulates the translation of the ferritin and mitochondrial aconitase iron-responsive element (IRE)-mRNAs by interaction with iron regulatory proteins (IRPs), and iron increases the biosynthesis of the proteins encoded by iron-responsive element mRNAs. The noncoding IRE-mRNA structure, approximately 30 nt, folds into a stem-loop to control the synthesis of proteins involved in iron trafficking, cell cycling, and nervous system function. Fluorescence anisotropy measurements showed the presence of one binding site on IRP1 for ferritin and mitochondrial aconitase IRE-mRNA. Scatchard analysis revealed the binding affinity (Kₐ) and average number of binding sites (n); the Kₐ values for ferritin and mitochondrial aconitase IRE-mRNA were 68.7 x 10⁶ M⁻¹ and 9.2 x 10⁶ M⁻¹, respectively. In order to understand the relative importance of equilibrium and stability, we further report the contribution of electrostatic interactions to the overall binding of the two IRE-mRNAs to IRP1. The fluorescence quenching of the IRP1 protein was measured at different ionic strengths. The binding affinity of IRE-mRNA to IRP1 decreases with increasing ionic strength, but the number of binding sites is independent of ionic strength. Such results indicate a differential contribution of electrostatics to the interaction of IRE-mRNA with IRP1, possibly related to helix bending or stem interactions and an overall conformational change. The selective destabilization of ferritin and mitochondrial aconitase RNA/protein complexes reported here explains in part the quantitative differences in signal response to iron in vivo and indicates possible new regulatory interactions.
Keywords: IRE-mRNA, IRP1, binding, ionic strength
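For context, the standard Scatchard relation used to extract Kₐ and n from binding data (quoted here in its textbook form, not reproduced from the paper) is:
```latex
\frac{r}{[L]} = K_a \,(n - r)
```
where r is the moles of ligand bound per mole of protein and [L] is the free ligand concentration; a plot of r/[L] against r gives a slope of −Kₐ and an intercept of n·Kₐ, so the intercept on the r axis yields the number of binding sites n.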
Procedia PDF Downloads 128
785 Effective Dose and Size Specific Dose Estimation with and without Tube Current Modulation for Thoracic Computed Tomography Examinations: A Phantom Study
Authors: S. Gharbi, S. Labidi, M. Mars, M. Chelli, F. Ladeb
Abstract:
The purpose of this study is to reduce the radiation dose for chest CT examinations by adding Tube Current Modulation (TCM) to a standard CT protocol. A scan of an anthropomorphic male Alderson phantom was performed on a 128-slice scanner. The effective dose (ED) in both scans, with and without mAs modulation, was estimated by multiplying the Dose-Length Product (DLP) by a conversion factor, and the results were compared with those obtained with the CT-Expo software. The size-specific dose estimate (SSDE) values were obtained by multiplying the volume CT dose index (CTDIvol) by a size conversion factor related to the phantom's effective diameter. Objective assessment of image quality was performed with Signal-to-Noise Ratio (SNR) measurements in the phantom. SPSS software was used for data analysis. Results showed that, with CARE Dose 4D included, ED was lowered by 48.35% and 51.51% according to the DLP and CT-Expo estimates, respectively. In addition, ED ranged between 7.01 mSv and 6.6 mSv for the standard protocol, and between 3.62 mSv and 3.2 mSv with TCM. Similar results were found for SSDE: the dose without TCM was 16.25 mGy and was lower by 48.8% with TCM. The calculated SNR values were significantly different (p=0.03<0.05); the highest was measured on images acquired with TCM and reconstructed with filtered back projection (FBP). In conclusion, this study demonstrates the potential of the TCM technique to reduce SSDE and ED while conserving image quality at a high diagnostic reference level for thoracic CT examinations.
Keywords: anthropomorphic phantom, computed tomography, CT-expo, radiation dose
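The two dose estimates described above follow the standard relations below; the chest conversion coefficient shown is an assumed typical value for adult chest CT, not a figure taken from the study, and the size factor comes from the phantom's effective diameter (as tabulated, for example, in AAPM Report 204):
```latex
\mathrm{ED} = k \cdot \mathrm{DLP}, \qquad k_{\text{chest}} \approx 0.014\ \mathrm{mSv\,mGy^{-1}\,cm^{-1}}
\qquad\qquad
\mathrm{SSDE} = f_{\text{size}} \cdot \mathrm{CTDI_{vol}}
```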
Procedia PDF Downloads 221
784 An Improved Robust Algorithm Based on Cubature Kalman Filter for Single-Frequency Global Navigation Satellite System/Inertial Navigation Tightly Coupled System
Authors: Hao Wang, Shuguo Pan
Abstract:
The Global Navigation Satellite System (GNSS) signal received by a dynamic vehicle in a harsh environment is frequently interfered with and blocked, which generates gross errors affecting the positioning accuracy of GNSS/Inertial Navigation System (INS) integrated navigation. Therefore, this paper puts forward an improved robust Cubature Kalman Filter (CKF) algorithm for ambiguity resolution in a single-frequency GNSS/INS tightly coupled system. Firstly, the dynamic model and measurement model of a single-frequency GNSS/INS tightly coupled system were established, and the method for GNSS integer ambiguity resolution with INS aiding is studied. Then, we analyzed the influence of pseudo-range observations with gross errors on GNSS/INS integrated positioning accuracy. To reduce the influence of outliers, this paper improved the CKF algorithm and realized an intelligent selection of robust strategies by judging the ill-conditioned matrix. Finally, a field navigation test was performed to demonstrate the effectiveness of the proposed algorithm based on the double-differenced solution mode. The experiment proved that the improved robust algorithm can greatly weaken the influence of isolated, continuous, and hybrid observation anomalies, enhancing the reliability and accuracy of GNSS/INS tightly coupled navigation solutions.
Keywords: GNSS/INS integrated navigation, ambiguity resolution, Cubature Kalman filter, Robust algorithm
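The cubature-point step at the heart of a CKF time update can be sketched as below; the toy constant-velocity state and process noise are placeholders, and the paper's robust weighting and ill-conditioning checks are not shown.
```python
import numpy as np

def cubature_points(mean, cov):
    """Third-degree spherical-radial rule: 2n equally weighted sigma points."""
    n = mean.size
    S = np.linalg.cholesky(cov)
    offsets = np.sqrt(n) * np.hstack((np.eye(n), -np.eye(n)))   # n x 2n
    return mean[:, None] + S @ offsets                           # n x 2n

def ckf_predict(mean, cov, f, Q):
    """Propagate the cubature points through f and recombine mean and covariance."""
    pts = cubature_points(mean, cov)
    prop = np.column_stack([f(pts[:, i]) for i in range(pts.shape[1])])
    mean_pred = prop.mean(axis=1)
    diff = prop - mean_pred[:, None]
    cov_pred = diff @ diff.T / pts.shape[1] + Q
    return mean_pred, cov_pred

# Constant-velocity toy model over one 0.1 s step.
dt = 0.1
f = lambda x: np.array([x[0] + dt * x[1], x[1]])
m0 = np.array([0.0, 1.0])
P0 = np.diag([0.5, 0.1])
Q = 1e-3 * np.eye(2)
print(ckf_predict(m0, P0, f, Q))
```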
Procedia PDF Downloads 100