Search results for: real time anomaly detection
21517 Ultra Wideband Breast Cancer Detection by Using SAR for Indication the Tumor Location
Authors: Wittawat Wasusathien, Samran Santalunai, Thanaset Thosdeekoraphat, Chanchai Thongsopa
Abstract:
This paper presents breast cancer detection based on the specific absorption rate (SAR) intensity, which is used to identify the tumor location in an (x, y, z) coordinate system. Frequencies between 4 and 8 GHz were examined to find the most appropriate operating frequency. Results are simulated across 4-8 GHz for a model comprising a normal breast of 50 mm radius, a tumor of 5 mm diameter, and an ultra-wideband (UWB) bowtie antenna; the models were created and simulated in CST Microwave Studio. In the simulation, the antenna was placed at five locations around the breast, and the tumor can be detected when the antenna is close to the tumor location, since the coordinates of the maximum SAR approximate the tumor position. To check reliability, the tumor (of the same size) was randomly relocated to three positions and the simulation repeated with the antenna again placed at five positions; in each case, the tumor position was detected from the antenna nearest to it through the maximum SAR value, and the tumor could be located with precision at all frequencies between 4 and 8 GHz.
Keywords: specific absorption rate (SAR), ultra wideband (UWB), coordinates, cancer detection
Procedia PDF Downloads 403
21516 Biosorption of Cu (II) and Zn (II) from Real Wastewater onto Cajanus cajan Husk
Authors: Mallappa A. Devani, John U. Kennedy Oubagaranadin, Basudeb Munshi
Abstract:
In this preliminary work, the locally available husk of Cajanus cajan (commonly known in India as Tur or Arhar), a bio-waste, has been used in its physically treated and chemically activated forms for the removal of binary Cu(II) and Zn(II) ions from real wastewater obtained from an electroplating industry in Bangalore, Karnataka, India, and, for comparison, from laboratory-prepared binary solutions of almost similar metal-ion composition. The real wastewater, after filtration and five-fold dilution, was used for biosorption studies at the natural pH of the solutions and at room temperature. Langmuir's binary model was used to calculate the metal uptake capacities of the biosorbents. It was observed that Cu(II) is more competitive than Zn(II) in biosorption. In single-metal biosorption, Cu(II) uptake was found to be higher than that of Zn(II), and a similar trend was observed in binary-metal biosorption from both the real wastewater and the laboratory-prepared solutions. FTIR analysis was carried out to identify the functional groups in the industrial wastewater, and EDAX was used for the elemental analysis of the biosorbents after the experiments.
Keywords: biosorption, Cajanus cajan, multi metal remediation, wastewater
Procedia PDF Downloads 386
21515 DEEPMOTILE: Motility Analysis of Human Spermatozoa Using Deep Learning in Sri Lankan Population
Authors: Chamika Chiran Perera, Dananjaya Perera, Chirath Dasanayake, Banuka Athuraliya
Abstract:
Male infertility is a major problem worldwide, and it is a neglected and sensitive health issue in Sri Lanka. It can be assessed by analyzing human semen samples, and sperm motility is one of several factors used to evaluate a male's fertility potential. In Sri Lanka, this analysis is performed manually. Manual methods are time-consuming and person-dependent, and their reliability depends on the availability of an expert. Machine learning and deep learning technologies are currently being investigated to automate spermatozoa motility analysis, but existing automatic methods remain unreliable, tending to produce false positives and erroneous detections; they rely on a variety of techniques, some of which are very expensive. Owing to geographical variation in spermatozoa characteristics, current automatic methods are not reliable for motility analysis in Sri Lanka. The proposed system, DeepMotile, explores a method to analyze human spermatozoa motility automatically and make it available to andrology laboratories, overcoming the current issues. DeepMotile is a novel deep learning method for analyzing spermatozoa motility parameters in the Sri Lankan population. To implement the approach, Sri Lankan patient data were collected anonymously as a dataset, and glass slides were used as a low-cost technique for analyzing semen samples. The problem was framed as microscopic object detection and tracking. YOLOv5 was customized and used as the object detector, achieving 94% mAP (mean average precision), 86% precision, and 90% recall on the gathered dataset. StrongSORT was used as the object tracker and was validated with andrology experts owing to the unavailability of annotated ground-truth data. Furthermore, this research identifies several directions for further investigation, and andrology experts can use the system to analyze motility parameters with realistic accuracy.
Keywords: computer vision, deep learning, convolutional neural networks, multi-target tracking, microscopic object detection and tracking, male infertility detection, motility analysis of human spermatozoa
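For orientation, the detection stage of such a pipeline can be sketched as below: a custom-trained YOLOv5 model loaded through torch.hub is run frame by frame on a microscope recording, and the per-frame boxes would then be handed to a multi-object tracker such as StrongSORT. The weights file, video name, and confidence threshold are placeholders, not the authors' configuration.

```python
# Minimal sketch (not the authors' code): frame-by-frame spermatozoa detection with a
# custom-trained YOLOv5 model; the weights file and thresholds are assumptions.
import cv2
import torch

model = torch.hub.load("ultralytics/yolov5", "custom", path="sperm_yolov5.pt")  # hypothetical weights
model.conf = 0.4  # detection confidence threshold (assumed)

cap = cv2.VideoCapture("semen_sample.avi")  # hypothetical microscope recording
frame_idx, detections_per_frame = 0, []
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame[..., ::-1])          # BGR -> RGB, run the detector on one frame
    boxes = results.xyxy[0].cpu().numpy()      # rows of [x1, y1, x2, y2, conf, class]
    detections_per_frame.append((frame_idx, boxes))
    frame_idx += 1
cap.release()
# The per-frame boxes would next be passed to a tracker (e.g., StrongSORT) to build tracks,
# from which motility parameters such as curvilinear velocity can be computed.
```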
Procedia PDF Downloads 106
21514 One-Step Synthesis of Fluorescent Carbon Dots in a Green Way as Effective Fluorescent Probes for Detection of Iron Ions and pH Value
Authors: Mostafa Ghasemi, Andrew Urquhart
Abstract:
In this study, fluorescent carbon dots (CDs) were synthesized in a green way using a one-step hydrothermal method. Carbon dots are carbon-based nanomaterials with a size of less than 10 nm, a unique structure, and excellent properties such as low toxicity, good biocompatibility, tunable fluorescence, excellent photostability, and easy functionalization. These properties make them good candidates for use in fields such as biological sensing, photocatalysis, photodynamic therapy, and drug delivery. Fourier-transform infrared (FTIR) spectra confirmed OH/NH groups on the surface of the as-synthesized CDs, and UV-vis spectra showed an excellent fluorescence-quenching effect of Fe(III) ions on the as-synthesized CDs, with highly selective detection compared with other metal ions. The probe showed a linear response over the 0-2.0 mM Fe(III) concentration range, and the limit of detection was calculated to be about 0.50 μM. In addition, the CDs showed good sensitivity to pH in the range from 2 to 14, indicating great potential as a pH sensor.
Keywords: carbon dots, fluorescence, pH sensing, metal ions sensor
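As a reference point, a limit of detection of the kind quoted above is commonly estimated from a linear calibration using the LOD = 3.3·σ/slope convention; the sketch below illustrates the calculation with placeholder calibration points, not the paper's measurements.

```python
# Illustrative only: estimating a limit of detection from a linear fluorescence-quenching
# calibration via LOD = 3.3 * sigma / slope. The arrays are placeholders, not reported data.
import numpy as np

conc = np.array([0.0, 0.4, 0.8, 1.2, 1.6, 2.0])          # Fe(III) concentration, mM (placeholder)
signal = np.array([1.00, 0.91, 0.83, 0.74, 0.66, 0.57])  # relative fluorescence (placeholder)

slope, intercept = np.polyfit(conc, signal, 1)            # linear calibration fit
residuals = signal - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                             # std. deviation of the regression residuals

lod_mM = 3.3 * sigma / abs(slope)                         # limit of detection in mM
print(f"slope = {slope:.3f} per mM, LOD ≈ {lod_mM * 1000:.1f} µM")
```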
Procedia PDF Downloads 75
21513 Rapid Atmospheric Pressure Photoionization-Mass Spectrometry (APPI-MS) Method for the Detection of Polychlorinated Dibenzo-P-Dioxins and Dibenzofurans in Real Environmental Samples Collected within the Vicinity of Industrial Incinerators
Authors: M. Amo, A. Alvaro, A. Astudillo, R. Mc Culloch, J. C. del Castillo, M. Gómez, J. M. Martín
Abstract:
Polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) comprise a range of highly toxic compounds that may exist as particulates in the air or accumulate in water supplies, soil, or vegetation. They may be created naturally and ubiquitously within the environment as a product of forest fires or volcanic eruptions. Since the industrial revolution, however, it has become necessary to closely monitor their generation as a byproduct of manufacturing and combustion processes, in an effort to mitigate widespread contamination events. Environmental concentrations of these toxins are expected to be extremely low; therefore, highly sensitive and accurate methods are required for their determination. Since ionization of non-polar compounds through electrospray and APCI is difficult and inefficient, we evaluate the performance of a novel low-flow Atmospheric Pressure Photoionization (APPI) source for the trace detection of various dioxins and furans using rapid mass spectrometry workflows. Air, soil, and biota (vegetable matter) samples were collected monthly during one year from various locations within the vicinity of an industrial incinerator in Spain. Analytes were extracted by Soxhlet extraction in toluene and concentrated by rotary evaporation and nitrogen flow. Various ionization methods, such as electrospray (ES) and atmospheric pressure chemical ionization (APCI), were evaluated; however, only the low-flow APPI source was capable of providing the sensitivity required for detecting all targeted analytes. In total, 10 analytes including 2,3,7,8-tetrachlorodibenzodioxin (TCDD) were detected and characterized using the APPI-MS method. Both PCDDs and PCDFs were detected most efficiently in negative ionization mode. The most abundant ion always corresponded to the loss of a chlorine and the addition of an oxygen, yielding [M-Cl+O]- ions. MRM methods were created in order to provide selectivity for each analyte. No chromatographic separation was employed; however, matrix effects were determined to have a negligible impact on analyte signals. Triple quadrupole mass spectrometry was chosen because of its potential for high sensitivity and selectivity; the mass spectrometer used was a Sciex QTRAP 3200 operating in negative multiple reaction monitoring (MRM) mode. Typical mass detection limits were determined to be near the 1-pg level. The APPI-MS2 technology applied to the detection of PCDD/Fs allows fast and reliable atmospheric analysis, considerably reducing operational times and costs with respect to other available technologies. In addition, the limit of detection can easily be improved by using a more sensitive mass spectrometer, since the background in the analysis channel is very low. The APPI developed by SEADM allows ionization of polar and non-polar compounds with high efficiency and repeatability.
Keywords: atmospheric pressure photoionization-mass spectrometry (APPI-MS), dioxin, furan, incinerator
Procedia PDF Downloads 208
21512 Robust Segmentation of Salient Features in Automatic Breast Ultrasound (ABUS) Images
Authors: Lamees Nasser, Yago Diez, Robert Martí, Joan Martí, Ibrahim Sadek
Abstract:
Automated 3D breast ultrasound (ABUS) screening is a novel modality in medical imaging because of the characteristics it shares with other ultrasound modalities, in addition to the three orthogonal planes (i.e., axial, sagittal, and coronal) that are useful in the analysis of tumors. In the literature, few automatic approaches exist for typical tasks such as segmentation or registration. In this work, we deal with two problems concerning ABUS images: nipple and rib detection. The nipple and the ribs are the most visible and salient features in ABUS images. Determining the nipple position plays a key role in several applications, for example, the evaluation of registration results or lesion follow-up. We present a nipple detection algorithm based on the color and shape of the nipple, as well as an automatic approach to detect the ribs; rib detection is, in fact, considered one of the main stages in chest wall segmentation. The approach consists of four steps. First, images are normalized in order to minimize the intensity variability for a given set of regions within the same image or across a set of images. Second, the normalized images are smoothed using an anisotropic diffusion filter. Next, the ribs are detected in each slice by analyzing the eigenvalues of the 3D Hessian matrix. Finally, a breast mask and a probability map of regions detected as ribs are used to remove false positives (FP). Qualitative and quantitative evaluation was performed on a total of 22 cases. Over all cases, the average and standard deviation of the root mean square error (RMSE) between manually annotated points placed on the rib surface and detected points on rib borders are 15.1188 mm and 14.7184 mm, respectively.
Keywords: Automated 3D Breast Ultrasound, Eigenvalues of Hessian matrix, Nipple detection, Rib detection
Procedia PDF Downloads 330
21511 In vitro P-Glycoprotein Modulation: Combinatorial Approach Using Natural Products
Authors: Jagdish S. Patel, Piyush Chudasama
Abstract:
Context: Over-expression of P-glycoprotein (P-gp) plays a critical role in the absorption of many drug candidates, resulting in lower bioavailability of the drug. P-glycoprotein is also over-expressed in many pathological conditions, such as diabetes, affecting drug therapy. Modulation of P-gp expression using inhibitors can help in designing novel formulations that enhance the bioavailability of the drug in question. Objectives: The main focus of the study was to establish advanced glycation end product (AGE)-induced P-gp over-expression in Caco-2 cells. Curcumin, piperine, and epigallocatechin gallate were used to evaluate their P-gp inhibitory action using a combinatorial approach. Materials and methods: Methylglyoxal (MG)-induced P-gp over-expression was examined in Caco-2 cells using real-time PCR. P-gp inhibitory effects of the phytochemicals were measured after induction with MG, both for each compound alone and in combinations of any two compounds. The cytotoxicity of each phytochemical was evaluated using the MTT assay. Results: Induction with MG (100 mM) significantly induced the over-expression of P-glycoprotein in Caco-2 cells after 24 h. Curcumin, piperine, and epigallocatechin gallate alone significantly reduced the level of P-gp within 6 h of treatment, as monitored by real-time PCR. Combinations of any two phytochemicals also down-regulated the expression of P-gp in the cells, and the combination of curcumin and epigallocatechin gallate showed significant down-regulation compared with the other two combinations. Conclusions: The combinatorial approach to down-regulating P-gp expression in pathological conditions such as diabetes is a promising approach for therapeutic purposes.
Keywords: p-glycoprotein, curcumin, piperine, epigallocatechin gallate, p-gp inhibition
Procedia PDF Downloads 334
21510 Synthesis and Characterization of CNPs Coated Carbon Nanorods for Cd2+ Ion Adsorption from Industrial Waste Water and Reusable for Latent Fingerprint Detection
Authors: Bienvenu Gael Fouda Mbanga
Abstract:
This study reports a new approach to the preparation of a carbon nanoparticle-coated cerium oxide nanorod (CNPs/CeONRs) nanocomposite and to reusing the spent Cd2+-CNPs/CeONRs adsorbent for latent fingerprint (LFP) detection after removing Cd2+ ions from aqueous solution. The CNPs/CeONRs nanocomposite was prepared from CNPs and CeONRs through adsorption processes. The prepared nanocomposite was characterized using UV-visible spectroscopy, Fourier-transform infrared spectroscopy (FTIR), X-ray diffraction (XRD), scanning electron microscopy (SEM), transmission electron microscopy (TEM), energy-dispersive X-ray spectroscopy (EDS), zeta potential, and X-ray photoelectron spectroscopy (XPS). The average size of the CNPs was 7.84 nm. The synthesized CNPs/CeONRs nanocomposite proved to be a good adsorbent for Cd2+ removal from water, with an optimum pH of 8 and a dosage of 0.5 g/L. The results were best described by the Langmuir model, which indicated a linear fit (R2 = 0.8539-0.9969). The adsorption capacity of the CNPs/CeONRs nanocomposite showed the best removal of Cd2+ ions, with qm = 32.28-59.92 mg/g, when compared with previous reports. The adsorption followed pseudo-second-order kinetics and intra-particle diffusion processes. ∆G and ∆H values indicated spontaneity at high temperature (40 °C) and the endothermic nature of the adsorption process. The CNPs/CeONRs nanocomposite therefore shows potential as an effective adsorbent. Furthermore, the metal-loaded adsorbent, Cd2+-CNPs/CeONRs, proved to be sensitive and selective for LFP detection on various porous substrates. Hence, the Cd2+-CNPs/CeONRs nanocomposite can be reused as a good fingerprint labelling agent in LFP detection, avoiding secondary environmental pollution from disposal of the spent adsorbent.
Keywords: Cd2+-CNPs/CeONRs nanocomposite, cadmium adsorption, isotherm, kinetics, thermodynamics, reusable for latent fingerprint detection
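For readers unfamiliar with the models cited, the Langmuir isotherm and pseudo-second-order kinetics are typically fitted by nonlinear regression, as sketched below; the concentration, time, and uptake arrays are placeholders, not the reported data.

```python
# Sketch of the standard Langmuir-isotherm and pseudo-second-order fits used in adsorption
# studies; Ce/qe and t/qt values below are placeholders, not this paper's measurements.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qm, KL):
    """Langmuir isotherm: qe = qm*KL*Ce / (1 + KL*Ce)."""
    return qm * KL * Ce / (1.0 + KL * Ce)

def pseudo_second_order(t, qe, k2):
    """Pseudo-second-order kinetics: qt = k2*qe^2*t / (1 + k2*qe*t)."""
    return (k2 * qe**2 * t) / (1.0 + k2 * qe * t)

Ce = np.array([5, 10, 20, 40, 80], float)           # equilibrium concentration, mg/L (placeholder)
qe_obs = np.array([18, 28, 40, 50, 56], float)      # uptake at equilibrium, mg/g (placeholder)
(qm, KL), _ = curve_fit(langmuir, Ce, qe_obs, p0=[60, 0.05])

t = np.array([5, 10, 20, 40, 60, 120], float)       # contact time, min (placeholder)
qt_obs = np.array([20, 30, 40, 48, 52, 56], float)  # uptake at time t, mg/g (placeholder)
(qe_fit, k2), _ = curve_fit(pseudo_second_order, t, qt_obs, p0=[60, 0.001])

print(f"Langmuir qm={qm:.1f} mg/g, KL={KL:.3f} L/mg; PSO qe={qe_fit:.1f} mg/g, k2={k2:.4f}")
```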
Procedia PDF Downloads 121
21509 Generating Swarm Satellite Data Using Long Short-Term Memory and Generative Adversarial Networks for the Detection of Seismic Precursors
Authors: Yaxin Bi
Abstract:
Accurate prediction and understanding of the evolution mechanisms of earthquakes remain challenging in the fields of geology, geophysics, and seismology. This study leverages Long Short-Term Memory (LSTM) networks and Generative Adversarial Networks (GANs), generative models applied here to time-series data, for generating synthetic time series based on Swarm satellite data, which will be used for detecting seismic anomalies. The LSTMs demonstrated commendable predictive performance in generating synthetic data across multiple countries. In contrast, the GAN models struggled to generate synthetic data, often producing non-informative values, although they were able to capture the distribution of the time series. These findings highlight both the promise and the challenges associated with applying deep learning techniques to generate synthetic data, underscoring the potential of deep learning for generating synthetic electromagnetic satellite data.
Keywords: LSTM, GAN, earthquake, synthetic data, generative AI, seismic precursors
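A minimal illustration of the LSTM side of such a workflow is sketched below: a network is trained to predict the next sample of a univariate series and then rolled forward autoregressively to produce a synthetic continuation. The window length, layer sizes, and the stand-in series are assumptions, not the study's configuration.

```python
# Minimal sketch (assumed window length and layer sizes): an LSTM learns a univariate
# Swarm-like series and is rolled forward to generate a synthetic continuation.
import numpy as np
import tensorflow as tf

def make_windows(series, w):
    X = np.stack([series[i:i + w] for i in range(len(series) - w)])
    y = series[w:]
    return X[..., None], y            # shape (N, w, 1) and (N,)

series = np.sin(np.linspace(0, 60, 3000)).astype("float32")   # stand-in for a satellite component
window = 48
X, y = make_windows(series, window)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(window, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

# Autoregressive generation: feed each prediction back in to extend the series.
context = series[-window:].copy()
synthetic = []
for _ in range(200):
    nxt = float(model.predict(context[None, :, None], verbose=0)[0, 0])
    synthetic.append(nxt)
    context = np.append(context[1:], nxt)
```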
Procedia PDF Downloads 32
21508 A Novel Nano-Chip Card Assay as Rapid Test for Diagnosis of Lymphatic Filariasis Compared to Nano-Based Enzyme Linked Immunosorbent Assay
Authors: Ibrahim Aly, Manal Ahmed, Mahmoud M. El-Shall
Abstract:
Filariasis is a parasitic disease caused by small roundworms. The filarial worms are transmitted and spread by blood-feeding black flies and mosquitoes. Lymphatic filariasis (elephantiasis) is caused by Wuchereria bancrofti, Brugia malayi, and Brugia timori. Elimination of lymphatic filariasis creates an increasing demand for valid, reliable, and rapid diagnostic kits. Nanodiagnostics involve the use of nanotechnology in clinical diagnosis to meet the demands for increased sensitivity, specificity, and early detection in less time. The aim of this study was to evaluate a nano-based enzyme-linked immunosorbent assay (ELISA) and a novel nano-chip card as rapid tests for the detection of filarial antigen in serum samples of human filariasis, in comparison with traditional ELISA. Serum samples were collected from humans infected with filariae, gathered across Egypt's governorates. After receiving informed consent, a total of 45 blood samples were collected from infected individuals residing in different villages in the Gharbea governorate, a non-endemic region for bancroftian filariasis, together with samples from healthy persons living in non-endemic locations (20 persons) and sera from 20 patients affected by other parasites. Microfilariae were checked in thick smears of 20 µl night blood samples collected between 20:00 and 22:00. All of these individuals underwent the following procedures: history taking, clinical examination, and laboratory investigations, which included examination of blood samples for microfilariae using thick blood films and serological tests for detection of the circulating filarial antigen using polyclonal antibody ELISA, nano-based ELISA, and the nano-chip card. In the present study, a recently reported polyclonal antibody specific to a tegumental filarial antigen was used in developing the nano-chip card and nano-ELISA, compared with traditional ELISA, for the detection of circulating filarial antigen in sera of patients with bancroftian filariasis. The performance of the ELISA was evaluated using 45 serum samples. The ELISA was positive with sera from microfilaremic bancroftian filariasis patients (n = 36), with a sensitivity of 80%. Circulating filarial antigen was detected in 39/45 patients using nano-ELISA, with a sensitivity of 86.6%. On the other hand, 42 out of 45 patients were positive for circulating filarial antigen using the nano-chip card, with a sensitivity of 93.3%. In conclusion, the novel nano-chip assay could potentially be a promising alternative antigen detection test for bancroftian filariasis.
Keywords: lymphatic filariasis, nanotechnology, rapid diagnosis, elisa technique
Procedia PDF Downloads 115
21507 Real-Time Gesture Recognition System Using Microsoft Kinect
Authors: Ankita Wadhawan, Parteek Kumar, Umesh Kumar
Abstract:
A gesture is any body movement that expresses an attitude or sentiment. Gestures, as a sign language, are used by deaf people for conveying messages, which helps in eliminating the communication barrier between deaf people and hearing persons. Nowadays, everybody uses the mobile phone and the computer as very important gadgets in their lives, but for physically challenged people who are blind or deaf, the use of a mobile phone or computer-like device is very difficult. So, there is an immense need for a system that works on body gestures or sign language as input. In this research, the Microsoft Kinect sensor, SDK V2, and the Hidden Markov Toolkit (HTK) are used to recognize objects, object motion, and human body joints through a touchless NUI (Natural User Interface) in real time. The depth data collected from the Microsoft Kinect have been used to recognize gestures of Indian Sign Language (ISL). The recorded clips are analyzed using depth, IR, and skeletal data at different angles and positions. The proposed system has an average accuracy of 85%. The developed touchless NUI provides an interface to recognize gestures and controls the cursor and click operations on a computer simply by waving hand gestures. This research will help deaf people make use of mobile phones and computers and socialize with other persons in society.
Keywords: gesture recognition, Indian sign language, Microsoft Kinect, natural user interface, sign language
Procedia PDF Downloads 306
21506 The Optimal Irrigation in the Mitidja Plain
Authors: Gherbi Khadidja
Abstract:
In the Mediterranean region, water resources are limited and very unevenly distributed in space and time. The main objective of this project is the development of a wireless network for the management of water resources in northern Algeria, in the Mitidja plain, which helps farmers to irrigate in the most optimized way and addresses the problem of water shortage in the region. We will therefore develop an aid tool that can modernize and replace some traditional techniques, according to the real needs of the crops, the soil conditions, and the climatic conditions (soil moisture, precipitation, characteristics of the unsaturated zone). These data are collected in real time by sensors, analyzed by an algorithm, and displayed on a mobile application and a website. The results are essential information and alerts, with recommendations for action, to help farmers ensure the sustainability of the agricultural sector under water-shortage conditions. In the first part, we set up a wireless sensor network for precise management of water resources, using equipment that measures the water content of the soil, such as a Watermark probe connected via an acquisition card to an Arduino Uno, which collects the captured data; the data are then transmitted via a GSM module to a website and stored in a database for later study. In the second part, we display the results on a website or a mobile application using the database, to remotely manage our smart irrigation system; this allows the farmer to use this technology and gives growers remote access, through wireless communication, to the field conditions and the irrigation operation, whether at home or at the office. The tool to be developed will also be based on satellite imagery as regards land use and soil moisture. These tools will make it possible to follow the evolution of crop needs over time and to predict the impact on water resources. According to the references consulted, if such a tool is used, it can reduce irrigation volumes by up to 40%, which represents more than 100 million m3 of savings per year for the Mitidja. This volume is equivalent to a medium-size dam.
Keywords: optimal irrigation, soil moisture, smart irrigation, water management
Procedia PDF Downloads 109
21505 Mineralized Nanoparticles as a Contrast Agent for Ultrasound and Magnetic Resonance Imaging
Authors: Jae Won Lee, Kyung Hyun Min, Hong Jae Lee, Sang Cheon Lee
Abstract:
To date, imaging techniques have attracted much attention in medicine because the detection of diseases at an early stage provides greater opportunities for successful treatment. Consequently, over the past few decades, diverse imaging modalities including magnetic resonance (MR), positron emission tomography, computed tomography, and ultrasound (US) have been developed and applied widely in the field of clinical diagnosis. However, each of the above-mentioned imaging modalities possesses unique strengths and intrinsic weaknesses, which limit its ability to provide accurate information. Therefore, multimodal imaging systems may be a solution that can provide improved diagnostic performance. Among the current medical imaging modalities, US is a widely available real-time imaging modality. It has many advantages, including safety, low cost, and easy access for patients; however, its low spatial resolution precludes accurate discrimination of diseased regions such as cancer sites. In contrast, MR has no tissue-penetration limit and can provide images possessing exquisite soft-tissue contrast and high spatial resolution, but it cannot offer real-time images and needs a comparatively long imaging time. The characteristics of these imaging modalities may be considered complementary, and the modalities have frequently been combined in the clinical diagnostic process. Biominerals such as calcium carbonate (CaCO3) and calcium phosphate (CaP) exhibit pH-dependent dissolution behavior and demonstrate pH-controlled drug release due to the dissolution of the minerals under acidic pH conditions. Recently, the application of this mineralization technique to a US contrast agent has been reported: the CaCO3 mineral reacts with acids and decomposes to generate carbon dioxide (CO2) gas in an acidic environment. Such gas-generating mineralized nanoparticles produce CO2 bubbles in the acidic environment of a tumor, thereby allowing strongly echogenic US imaging of tumor tissues. On the basis of this previous work, it was hypothesized that loading MR contrast agents into CaCO3 mineralized nanoparticles may be a novel strategy for designing a contrast agent for dual imaging. Herein, CaCO3 mineralized nanoparticles capable of generating CO2 bubbles to trigger the release of entrapped MR contrast agents in response to acidic tumor pH were developed for the purposes of US and MR dual-modality imaging of tumors. Gd2O3 nanoparticles were selected as the MR contrast agent. A key strategy employed in this study was to prepare Gd2O3 nanoparticle-loaded mineralized nanoparticles (Gd2O3-MNPs) using block copolymer-templated CaCO3 mineralization in the presence of calcium cations (Ca2+), carbonate anions (CO32-), and positively charged Gd2O3 nanoparticles. The CaCO3 core was considered suitable because it may effectively shield the Gd2O3 nanoparticles from water molecules in the blood (pH 7.4) before decomposing to generate CO2 gas, triggering the release of the Gd2O3 nanoparticles in tumor tissues (pH 6.4~7.4). The kinetics of CaCO3 dissolution and CO2 generation from the Gd2O3-MNPs were examined as a function of pH; in addition, the pH-dependent in vitro magnetic relaxation and echogenic properties were estimated to demonstrate the potential of the particles for tumor-specific US and MR imaging.
Keywords: calcium carbonate, mineralization, ultrasound imaging, magnetic resonance imaging
Procedia PDF Downloads 236
21504 Development and Evaluation of Virtual Basketball Game Using Motion Capture Technology
Authors: Shunsuke Aoki, Taku Ri, Tatsuya Yamazaki
Abstract:
These days, along with the development of e-sports, video games as a competitive sport are attracting attention. In many cases, however, the action on the screen does not match the player's real motion. Incorporating the player's motion is needed to increase the reality and excitement of sports games. Therefore, in this study, the authors propose a method to recognize player motion using motion capture technology and develop a virtual basketball game. The virtual basketball game consists of a screen with nine targets, players, depth sensors, and no ball. The players perform a two-handed basketball shooting motion without a ball, aiming at one of the nine targets on the screen. Time-series data of the three-dimensional coordinates of the player's joints are captured by the depth sensor; data for 20 joints are measured for each player to estimate the shooting motion in real time. The trajectory of the thrown virtual ball is calculated from the time-series data, and hitting the target is judged as success or failure. The virtual basketball game can be played by 2 to 4 players as a competitive game among the players. The developed game was exhibited to the public for evaluation on the authors' university open-campus days. 339 visitors participated in the exhibition and enjoyed the virtual basketball game over the two days. A questionnaire survey on the developed game was conducted among the visitors who experienced it. As a result of the survey, about 97.3% of the players found the game interesting, regardless of whether they had experienced actual basketball before or not. In addition, it was found that women found the shooting motion comfortable to perform. A virtual game with motion capture technology has the potential to become a universal form of entertainment between e-sports and actual sports.
Keywords: basketball, motion capture, questionnaire survey, video game
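One way such a game can compute the virtual ball's flight is sketched below under simple assumptions (depth-sensor frame rate, a y-up metric frame, a fixed hit tolerance): the release velocity is estimated from the last wrist-joint samples and a projectile is propagated toward the targets. This is illustrative only, not the authors' implementation.

```python
# Sketch of a trajectory/hit test for a "no-ball" shooting game: release velocity is taken
# from a finite difference of the wrist-joint track, then simple projectile motion is used.
# Frame rate, coordinate frame, and tolerance are assumptions.
import numpy as np

FPS = 30.0                       # assumed depth-sensor frame rate (samples/s)
G = np.array([0.0, -9.81, 0.0])  # gravity in a y-up camera frame, m/s^2

def release_state(wrist_positions):
    """wrist_positions: (N, 3) array of recent wrist samples in metres."""
    p0 = wrist_positions[-1]
    v0 = (wrist_positions[-1] - wrist_positions[-2]) * FPS   # finite-difference velocity
    return p0, v0

def hits_target(p0, v0, target, tol=0.15, t_max=2.0, dt=1.0 / 120):
    """Propagate projectile motion; report whether the ball passes within tol of the target."""
    for t in np.arange(0.0, t_max, dt):
        p = p0 + v0 * t + 0.5 * G * t**2
        if np.linalg.norm(p - target) < tol:
            return True
    return False

wrist = np.array([[0.00, 1.55, 0.00], [0.02, 1.62, 0.10]])   # placeholder wrist samples
p0, v0 = release_state(wrist)
print(hits_target(p0, v0, target=np.array([0.5, 2.2, 2.5])))
```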
Procedia PDF Downloads 126
21503 Sensor Fault-Tolerant Model Predictive Control for Linear Parameter Varying Systems
Authors: Yushuai Wang, Feng Xu, Junbo Tan, Xueqian Wang, Bin Liang
Abstract:
In this paper, a sensor fault-tolerant control (FTC) scheme using robust model predictive control (RMPC) and set-theoretic fault detection and isolation (FDI) is extended to linear parameter varying (LPV) systems. First, a group of set-valued observers is designed for passive fault detection (FD), and the observer gains are obtained by minimizing the size of the invariant set of the state estimation-error dynamics. Second, an input set for fault isolation (FI) is designed offline through set theory for actively isolating faults after FD. Third, an RMPC controller based on state estimation for LPV systems is designed to control the system in the presence of disturbance and measurement noise and to tolerate faults. Besides, an FTC algorithm is proposed to keep the plant operating in the corresponding mode when a fault occurs. Finally, a numerical example is used to show the effectiveness of the proposed results.
Keywords: fault detection, linear parameter varying, model predictive control, set theory
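For context, set-based FDI and RMPC schemes of this kind are usually formulated on a polytopic discrete-time LPV model; a generic statement of that model (standard notation, not necessarily the paper's exact formulation) is:

```latex
% Generic polytopic discrete-time LPV model with bounded disturbance w_k and noise v_k
\begin{aligned}
x_{k+1} &= A(\theta_k)\,x_k + B(\theta_k)\,u_k + w_k,\\
y_k &= C(\theta_k)\,x_k + v_k,
\end{aligned}
\qquad
A(\theta_k)=\sum_{i=1}^{N}\theta_k^{(i)}A_i,\quad
\theta_k^{(i)}\ge 0,\ \ \sum_{i=1}^{N}\theta_k^{(i)}=1 .
```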
Procedia PDF Downloads 252
21502 Efficiency on the Enteric Viral Removal in Four Potable Water Treatment Plants in Northeastern Colombia
Authors: Raquel Amanda Villamizar Gallardo, Oscar Orlando Ortíz Rodríguez
Abstract:
Enteric viruses are cosmopolitan agents present in several environments, including water. These viruses can cause different diseases, including gastroenteritis, hepatitis, conjunctivitis, and respiratory problems, among others. Although in Colombia there are no regulations concerning routine viral analysis of drinking water, an enhanced understanding of viral pollution and resistance to treatments is desirable in order to assure pure water for the population. Viral detection is often complex due to the need for specialized and time-consuming procedures. In addition, viruses are highly diluted in water, which is a drawback from the analytical point of view. To this end, a fast and selective method for detecting enteric viruses (i.e., Hepatitis A virus and Rotavirus) was applied. Micro-magnetic particles were functionalized with monoclonal antibodies against Hepatitis A virus and Rotavirus, and they were used to capture, concentrate, and separate whole viral particles in raw water and drinking water samples from four treatment plants identified as CAR-01, MON-02, POR-03, and TON-04, located in northeastern Colombia. Viruses were then detected molecularly using One Step SuperScript III RT-PCR. Each plant was analyzed at the entry and exit points, in order to determine the initial presence and eventual reduction of Hepatitis A virus and Rotavirus after disinfection. The results revealed the presence of both enteric viruses in 100% of the raw water analyzed in all plants. This represents a potential health hazard, especially for those people who use this water for agricultural purposes. However, in the drinking water analysis, only CAR-01 tested positive for enteric viruses, where the presence of Rotavirus was found. In conclusion, the results confirm Rotavirus as the best indicator for evaluating the efficacy of potable water treatment plants in eliminating viruses. The CAR plant should improve its disinfection process in order to remove enteric viruses efficiently.
Keywords: drinking water, hepatitis A, rotavirus, virus removal
Procedia PDF Downloads 233
21501 Testing the Change in Correlation Structure across Markets: High-Dimensional Data
Authors: Malay Bhattacharyya, Saparya Suresh
Abstract:
The correlation structure associated with a portfolio varies across time. Studying the structural breaks in the time-dependent correlation matrix associated with a collection of assets has been a subject of interest for a better understanding of market movements, portfolio selection, etc. The current paper proposes a methodology for testing the change in the time-dependent correlation structure of a portfolio with high-dimensional data, using the techniques of the generalized inverse, singular value decomposition, and multivariate distribution theory, which has not been addressed so far. The asymptotic properties of the proposed test are derived. The performance and the validity of the method are also tested on a real data set. The proposed test performs well for detecting the change in the dependence of global markets in the context of high-dimensional data.
Keywords: correlation structure, high dimensional data, multivariate distribution theory, singular value decomposition
Procedia PDF Downloads 125
21500 Fault Detection and Diagnosis of Broken Bar Problem in Induction Motors Base Wavelet Analysis and EMD Method: Case Study of Mobarakeh Steel Company in Iran
Authors: M. Ahmadi, M. Kafil, H. Ebrahimi
Abstract:
Nowadays, induction motors have a significant role in industries. Condition monitoring (CM) of this equipment has gained remarkable importance during recent years due to huge production losses, substantial imposed costs, and increases in vulnerability, risk, and uncertainty levels. Motor current signature analysis (MCSA) is one of the most important techniques in CM; this method can be used for detecting broken rotor bars. Signal processing methods such as the fast Fourier transform (FFT), the wavelet transform, and empirical mode decomposition (EMD) are used for analyzing the MCSA output data. In this study, these signal processing methods are used for broken bar detection in induction motors of the Mobarakeh Steel Company. Based on the wavelet transform, an index for fault detection, CF, is introduced, which relates the maximum of the wavelet transform coefficients to their mean. We find that, in the broken bar condition, the CF factor is greater than in the healthy condition. Based on the EMD method, the energy of the intrinsic mode functions (IMFs) is calculated, and it is found that when motor bars become broken, the energy of the IMFs increases.
Keywords: broken bar, condition monitoring, diagnostics, empirical mode decomposition, fourier transform, wavelet transform
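A rough sketch of the two indicators, applied to a stator-current signal, is given below; the wavelet family, decomposition level, toy signal, and the use of the PyEMD package are assumptions, and the CF index follows the abstract's description (maximum of the wavelet coefficients relative to their mean).

```python
# Sketch of the two indicators for a stator-current signal `i_a`. Wavelet choice, level,
# and the PyEMD dependency are assumptions, not the study's exact configuration.
import numpy as np
import pywt

def cf_index(signal, wavelet="db4", level=4):
    """CF index: maximum of the wavelet detail coefficients relative to their mean."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    detail = np.abs(np.concatenate(coeffs[1:]))
    return detail.max() / detail.mean()

def imf_energies(signal):
    """Energy of each intrinsic mode function obtained by EMD."""
    from PyEMD import EMD                      # assumed dependency (PyEMD package)
    imfs = EMD().emd(signal)
    return np.array([np.sum(imf**2) for imf in imfs])

fs = 10_000
t = np.arange(0, 1, 1 / fs)
i_a = np.sin(2 * np.pi * 50 * t) + 0.02 * np.sin(2 * np.pi * 46 * t)  # toy current with a sideband
print("CF =", cf_index(i_a))
print("IMF energies:", imf_energies(i_a))
```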
Procedia PDF Downloads 150
21499 Scheduling Algorithm Based on Load-Aware Queue Partitioning in Heterogeneous Multi-Core Systems
Authors: Hong Kai, Zhong Jun Jie, Chen Lin Qi, Wang Chen Guang
Abstract:
Current scheduling algorithms suffer from inefficient global scheduling parallelism and from local scheduling parallelism that is prone to processor starvation. To address this issue, this paper proposes a load-aware queue-partitioning scheduling strategy: the queues are first allocated according to the number of processor cores, the load factor is calculated to specify the capacity of each load queue, and the waiting nodes are then assigned to the appropriate perceptual queues based on their precursor nodes and the communication and computation overhead. At the same time, real-time computation of the load factor effectively prevents a processor from being starved for a long time. Experimental comparison with two classical algorithms shows a clear improvement in both performance metrics: scheduling length and task speedup ratio.
Keywords: load-aware, scheduling algorithm, perceptual queue, heterogeneous multi-core
Procedia PDF Downloads 145
21498 Optimizing Electric Vehicle Charging with Charging Data Analytics
Authors: Tayyibah Khanam, Mohammad Saad Alam, Sanchari Deb, Yasser Rafat
Abstract:
Electric vehicles are considered viable replacements for gasoline cars, since they help in reducing harmful emissions and stimulate power generation through renewable energy sources, hence contributing to sustainability. However, one of the significant obstacles to the mass deployment of electric vehicles is charging-time anxiety among users and, thus, the subsequent long waiting times for available chargers at charging stations. Data analytics, on the other hand, has revolutionized the decision-making tasks of management and operating systems since its arrival. In this paper, we attempt to optimize the choice of EV charging stations for users in their vicinity by minimizing the time taken to reach the charging station and the waiting time for an available charger. The time taken to travel to the charging station is calculated with the Google Maps API, and the waiting times are predicted by polynomial regression on the stored historical data. The proposed framework utilizes real-time data and historical data from all operating charging stations in the city, assists the user in finding the charging station best suited to their current situation, and can be implemented in a mobile phone application. The algorithm successfully predicts the most optimal choice of charging station and the minimum required time for various sample data sets.
Keywords: charging data, electric vehicles, machine learning, waiting times
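The station-ranking idea can be sketched as below: a polynomial regression fitted on historical records predicts the waiting time, which is added to the travel time to score each station. Feature names and data are placeholders, and the travel times are assumed to have been obtained separately (e.g., from the Google Maps API).

```python
# Sketch: predicted waiting time (polynomial regression on historical records) plus travel
# time, used to rank candidate stations. Features, data, and travel times are placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Historical records: [hour_of_day, occupied_chargers] -> waiting time in minutes (placeholders).
X_hist = np.array([[8, 1], [9, 3], [12, 2], [17, 4], [18, 4], [21, 1]], float)
y_wait = np.array([5, 22, 12, 35, 40, 6], float)

wait_model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
wait_model.fit(X_hist, y_wait)

def best_station(stations, hour):
    """stations: dict name -> (travel_time_min, occupied_chargers). Returns lowest total time."""
    scores = {}
    for name, (travel, occupied) in stations.items():
        wait = float(wait_model.predict([[hour, occupied]])[0])
        scores[name] = travel + max(wait, 0.0)
    return min(scores, key=scores.get), scores

print(best_station({"ST-A": (7, 3), "ST-B": (15, 1)}, hour=18))
```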
Procedia PDF Downloads 194
21497 Pyramidal Lucas-Kanade Optical Flow Based Moving Object Detection in Dynamic Scenes
Authors: Hyojin Lim, Cuong Nguyen Khac, Yeongyu Choi, Ho-Youl Jung
Abstract:
In this paper, we propose a simple moving-object detection method based on motion vectors obtained from pyramidal Lucas-Kanade optical flow. The proposed method detects moving objects such as pedestrians, other vehicles, and obstacles in front of the host vehicle, and it can provide a warning to the driver. Motion vectors are obtained using pyramidal Lucas-Kanade optical flow, and outliers are eliminated by comparing the amplitude of each vector with a pre-defined threshold value. The background model is obtained by calculating the mean and the variance of the amplitudes of recent motion vectors in a rectangular local region called a cell. The model is used as the reference to classify motion vectors into those of moving objects and those of the background. Motion vectors are clustered into rectangular regions using the unsupervised K-means clustering algorithm, and a labeling method is then applied to merge groups that are close to each other, based on the distance between the center points of the rectangles. Through simulations on four kinds of scenarios, such as a motorbike, a vehicle, and pedestrians approaching the host vehicle, we show that the proposed method is simple but efficient for moving-object detection in parking lots.
Keywords: moving object detection, dynamic scene, optical flow, pyramidal optical flow
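The first stages of such a method map naturally onto OpenCV's pyramidal Lucas-Kanade implementation; the sketch below tracks Shi-Tomasi corners between consecutive frames and keeps motion vectors above an amplitude threshold. The video name, window size, and thresholds are assumptions, not the paper's parameters.

```python
# Sketch of the early stages: pyramidal Lucas-Kanade flow between consecutive frames and
# amplitude thresholding of the resulting motion vectors. Parameter values are assumptions.
import cv2
import numpy as np

cap = cv2.VideoCapture("front_camera.avi")            # hypothetical vehicle camera clip
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500, qualityLevel=0.01, minDistance=7)
    if p0 is None:
        prev_gray = gray
        continue
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None,
                                             winSize=(21, 21), maxLevel=3)
    good_old = p0[status.flatten() == 1].reshape(-1, 2)
    good_new = p1[status.flatten() == 1].reshape(-1, 2)
    vectors = good_new - good_old
    amplitude = np.linalg.norm(vectors, axis=1)
    moving = good_new[amplitude > 2.0]                 # keep vectors above an assumed threshold
    # `moving` points would next be compared with the per-cell background model and
    # clustered with K-means into rectangular candidate regions.
    prev_gray = gray
cap.release()
```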
Procedia PDF Downloads 349
21496 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks
Authors: Wang Yichen, Haruka Yamashita
Abstract:
In recent years, in the field of sports, decision-making such as selecting the members in a game and the strategy of the game, based on the analysis of accumulated sports data, has been widely attempted. In fact, in the NBA basketball league, where the world's highest-level players gather, teams analyze data using various statistical techniques in order to win games. However, it is difficult to analyze the game data of each play, such as ball tracking or the motion of the players, because the situation of the game changes rapidly and the structure of the data is complicated. Therefore, an analysis method for real-time game play data is proposed. In this research, we propose an analytical model for "determining the optimal lineup composition" using real-time play data, which is considered to be difficult for all coaches. Because replacing the entire lineup is too complicated, the actual questions for player replacement are "whether or not the lineup should be changed" and "whether or not a Small Ball lineup should be adopted"; we therefore propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data can be accumulated for each play, indicating a player's contribution to the game, and this scoring data can be considered time-series data. In order to compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time-series data, with an NN (Neural Network) model, which can analyze the situation on the court, to build a prediction model of the score. This model is capable of identifying the current optimal lineup for different situations. In this research, we collected all accumulated NBA data from the 2019-2020 season and applied the method to actual basketball play data to verify the reliability of the proposed model.
Keywords: recurrent neural network, players lineup, basketball data, decision making model
Procedia PDF Downloads 133
21495 Advanced Data Visualization Techniques for Effective Decision-making in Oil and Gas Exploration and Production
Authors: Deepak Singh, Rail Kuliev
Abstract:
This research article explores the significance of advanced data visualization techniques in enhancing decision-making processes within the oil and gas exploration and production domain. With the oil and gas industry facing numerous challenges, effective interpretation and analysis of vast and diverse datasets are crucial for optimizing exploration strategies, production operations, and risk assessment. The article highlights the importance of data visualization in managing big data, aiding the decision-making process, and facilitating communication with stakeholders. Various advanced data visualization techniques, including 3D visualization, augmented reality (AR), virtual reality (VR), interactive dashboards, and geospatial visualization, are discussed in detail, showcasing their applications and benefits in the oil and gas sector. The article presents case studies demonstrating the successful use of these techniques in optimizing well placement, real-time operations monitoring, and virtual reality training. Additionally, the article addresses the challenges of data integration and scalability, emphasizing the need for future developments in AI-driven visualization. In conclusion, this research emphasizes the immense potential of advanced data visualization in revolutionizing decision-making processes, fostering data-driven strategies, and promoting sustainable growth and improved operational efficiency within the oil and gas exploration and production industry.
Keywords: augmented reality (AR), virtual reality (VR), interactive dashboards, real-time operations monitoring
Procedia PDF Downloads 86
21494 Detection of Atrial Fibrillation Using Wearables via Attentional Two-Stream Heterogeneous Networks
Authors: Huawei Bai, Jianguo Yao, Fellow, IEEE
Abstract:
Atrial fibrillation (AF) is the most common form of heart arrhythmia and is closely associated with mortality and morbidity in heart failure, stroke, and coronary artery disease. The development of single-spot optical sensors enables widespread photoplethysmography (PPG) screening, especially for AF, since it represents a more convenient and noninvasive approach. To our knowledge, most existing studies, based on public and unbalanced datasets, can barely handle the multiple noise sources present in the real world and also lack interpretability. In this paper, we construct a large-scale PPG dataset using measurements collected from PPG wristwatch devices worn by volunteers and propose an attention-based two-stream heterogeneous neural network (TSHNN). The first stream is a hybrid neural network consisting of a three-layer one-dimensional convolutional neural network (1D-CNN) and a two-layer attention-based bidirectional long short-term memory (Bi-LSTM) network to learn representations from temporally sampled signals. The second stream extracts latent representations from the PPG time-frequency spectrogram using a five-layer CNN. The outputs from both streams are fed into a fusion layer to produce the outcome. Visualization of the learned attention weights demonstrates the effectiveness of the attention mechanism against noise. The experimental results show that the TSHNN outperforms all the competitive baseline approaches and, with 98.09% accuracy, achieves state-of-the-art performance.
Keywords: PPG wearables, atrial fibrillation, feature fusion, attention mechanism, hybrid network
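A structural sketch of a two-stream model of this kind is given below; the layer sizes, input lengths, and the omission of the attention blocks are simplifications and assumptions, not the authors' exact architecture.

```python
# Structural sketch only: a raw-signal stream (1D-CNN + Bi-LSTM) and a spectrogram stream
# (2D-CNN) fused for binary AF classification. Sizes are assumed; attention is omitted.
import tensorflow as tf
from tensorflow.keras import layers

raw_in = layers.Input(shape=(1500, 1), name="ppg_segment")        # e.g., 30 s at 50 Hz (assumed)
x = raw_in
for filters in (16, 32, 64):                                      # three-layer 1D-CNN
    x = layers.Conv1D(filters, 7, padding="same", activation="relu")(x)
    x = layers.MaxPooling1D(2)(x)
x = layers.Bidirectional(layers.LSTM(32, return_sequences=True))(x)
x = layers.Bidirectional(layers.LSTM(32))(x)                      # two-layer Bi-LSTM

spec_in = layers.Input(shape=(64, 64, 1), name="ppg_spectrogram") # time-frequency image (assumed)
y = spec_in
for filters in (16, 16, 32, 32):                                  # first four conv blocks
    y = layers.Conv2D(filters, 3, padding="same", activation="relu")(y)
    y = layers.MaxPooling2D(2)(y)
y = layers.Conv2D(64, 3, padding="same", activation="relu")(y)    # fifth conv layer
y = layers.GlobalAveragePooling2D()(y)

fused = layers.concatenate([x, y])                                # fusion layer
out = layers.Dense(1, activation="sigmoid")(fused)

model = tf.keras.Model([raw_in, spec_in], out)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```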
Procedia PDF Downloads 121
21493 Trend Analysis of Annual Total Precipitation Data in Konya
Authors: Naci Büyükkaracığan
Abstract:
Hydroclimatic observation values are used in the planning of water resources projects, and climate variables are among the first values used in such planning. The climate system is a complex and interactive system involving the atmosphere, land surfaces, snow and ice, the oceans, and other bodies of water. The amount and distribution of precipitation, an important climate parameter, is a limiting environmental factor for living things. Trend analysis is applied to detect the presence of a pattern or trend in a data set, and many trend studies in different parts of the world are carried out to identify climate change. The detection and attribution of past trends and variability in climatic variables is essential for explaining potential future alterations resulting from anthropogenic activities. Parametric and non-parametric tests are used for determining the trends in climatic variables. In this study, trend tests were applied to annual total precipitation data obtained over the period 1972-2012 in the Konya Basin. Non-parametric trend tests (Sen's T, Spearman's Rho, Mann-Kendall, Sen's T trend, Wald-Wolfowitz) and a parametric test (mean square) were applied to the annual total precipitation of 15 stations for trend analysis. The linear slopes (change per unit time) of the trends were calculated using the non-parametric estimator developed by Sen, and the beginning of the trends was determined using the Mann-Kendall rank correlation test. In addition, the homogeneity of the precipitation trends was tested using the method developed by Van Belle and Hughes. As a result of the tests, negative linear slopes were found in the annual total precipitation in Konya.
Keywords: trend analysis, precipitation, hydroclimatology, Konya
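For reference, the Mann-Kendall statistic and Sen's slope estimator used in such studies can be computed as sketched below (normal approximation, without tie correction); the precipitation series is a placeholder, not the Konya data.

```python
# Minimal implementations of the Mann-Kendall test (normal approximation, no tie correction)
# and Sen's slope estimator; the annual precipitation series below is a placeholder.
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    x = np.asarray(x, float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0        # variance of S, ignoring ties
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))                  # two-sided p-value
    return s, z, p

def sens_slope(x):
    x = np.asarray(x, float)
    slopes = [(x[j] - x[i]) / (j - i) for i in range(len(x) - 1) for j in range(i + 1, len(x))]
    return np.median(slopes)                        # median of all pairwise slopes

annual_precip = np.array([420, 395, 410, 380, 372, 401, 365, 355, 349, 361], float)  # placeholder, mm
print(mann_kendall(annual_precip), sens_slope(annual_precip))
```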
Procedia PDF Downloads 219
21492 Survey of Intrusion Detection Systems and Their Assessment of the Internet of Things
Authors: James Kaweesa
Abstract:
The Internet of Things (IoT) has become a critical component of modern technology, enabling the connection of numerous devices to the internet. The interconnected nature of IoT devices, along with their heterogeneous and resource-constrained nature, makes them vulnerable to various types of attacks, such as malware, denial-of-service attacks, and network scanning. Intrusion Detection Systems (IDSs) are a key mechanism for protecting IoT networks from attacks by identifying suspicious activities and alerting administrators to them. In this review, the paper discusses the different types of IDSs available for IoT systems and evaluates their effectiveness in detecting and preventing attacks. It also examines the various evaluation methods used to assess the performance of IDSs and the challenges associated with evaluating them in IoT environments. The review highlights the need for effective and efficient IDSs that can cope with the unique characteristics of IoT networks, including their heterogeneity, dynamic topology, and resource constraints. The paper concludes by indicating where further research is needed to develop IDSs that can address these challenges and effectively protect IoT systems from cyber threats.
Keywords: cyber-threats, iot, intrusion detection system, networks
Procedia PDF Downloads 80
21491 A Study of Relational Factors Associated with Online Celebrity Business and Consumer Purchase Intention
Authors: Sixing Chen, Shuai Yang
Abstract:
Online celebrity business, also known as Internet celebrity business (or Wanghong business in Chinese), is an emerging relational C2C business model and an alternative to traditional transactional C2C business models. There are already millions of these consumers, and this number is growing. In this model, consumer purchase decisions are driven by recommendations and endorsements in videos posted online by celebrities. The purpose of this paper is to determine the relational constructs within consumer relationships in the Internet celebrity business model and to investigate the relationships between these constructs and consumer purchase intention. A questionnaire-based study was conducted with consumers who had an awareness of, or prior purchase experience with, online celebrities. The results of exploratory factor analysis (EFA) and multiple regression analysis revealed three valid relational constructs: product experience sharing, lifestyle association, and real-time interaction. The study indicated that these constructs had a direct effect on consumer preference and purchase intention. The findings provide insight into a business model in which online shopping is driven by celebrities. They suggest that online celebrities should pay more attention to product experience sharing, lifestyle association, and real-time interaction when managing their product promotions, as these are the most salient factors with respect to the relational constructs identified in this study.
Keywords: customer relationship, customer to customer, Internet celebrity, online celebrity, online marketing, purchase intention
Procedia PDF Downloads 318
21490 Application of Data Driven Based Models as Early Warning Tools of High Stream Flow Events and Floods
Authors: Mohammed Seyam, Faridah Othman, Ahmed El-Shafie
Abstract:
The early warning of high stream flow events (HSF) and floods is an important aspect of the management of surface water and river systems. This process can be performed using either process-based models or data-driven models such as artificial intelligence (AI) techniques. The main goal of this study is to develop an efficient AI-based model for predicting the real-time hourly stream flow (Q) and to apply it as an early warning tool for HSF and floods in the downstream area of the Selangor River basin, taken here as a paradigm of humid tropical rivers in Southeast Asia. The performance of the AI-based models has been improved through the integration of lag time (Lt) estimation in the modelling process. A total of 8753 patterns of hourly Q, water level, and rainfall records representing a one-year period (2011) were utilized in the modelling process. Six hydrological scenarios were arranged as hypothetical cases of the input variables to investigate how changes in rainfall (RF) intensity at the upstream stations can lead to the formation of floods; the initial stream flow was changed for each scenario in order to include a wide range of hydrological situations in this study. The performance evaluation of the developed AI-based model shows that a high correlation coefficient (R) between the observed and predicted Q is achieved. The AI-based model has been successfully employed for early warning through the advance detection of hydrological conditions that could lead to the formation of floods and HSF, represented by three levels of severity (i.e., alert, warning, and danger). Based on the results of the scenarios, reaching the danger level in the downstream area required high RF intensity in at least two upstream areas. According to the results of the applications, it can be concluded that AI-based models are beneficial tools for local authorities for flood control and awareness.
Keywords: floods, stream flow, hydrological modelling, hydrology, artificial intelligence
Procedia PDF Downloads 248
21489 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks
Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone
Abstract:
Seizures are the main factor affecting the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made using continuous electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, and this process is usually very long and error-prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an artificial neural network classifier, trained by applying the multilayer perceptron algorithm, and on a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding-window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% in tests on data of a single patient retrieved from a publicly available EEG dataset.
Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing
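The overall pipeline shape, sliding-window feature extraction followed by a multilayer-perceptron classifier, can be sketched as below; the window length, the three features, and the synthetic signal and labels are illustrative assumptions, not the paper's configuration.

```python
# Sketch of the pipeline shape: sliding-window feature extraction from one EEG channel
# followed by an MLP classifier. Window length, features, and data are placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def window_features(signal, fs=256, win_s=2.0, step_s=1.0):
    win, step = int(win_s * fs), int(step_s * fs)
    feats = []
    for start in range(0, len(signal) - win, step):
        seg = signal[start:start + win]
        feats.append([seg.std(),                        # amplitude spread
                      np.mean(np.abs(np.diff(seg))),    # mean absolute first difference
                      np.sum(seg**2) / win])            # mean power
    return np.array(feats)

rng = np.random.default_rng(0)
eeg = rng.normal(0, 1, 256 * 600)                       # 10 minutes of synthetic "EEG"
X = window_features(eeg)
y = rng.integers(0, 2, len(X))                          # placeholder seizure/non-seizure labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0).fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```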
Procedia PDF Downloads 188
21488 Blockchain Technology for Secure and Transparent Oil and Gas Supply Chain Management
Authors: Gaurav Kumar Sinha
Abstract:
The oil and gas industry, characterized by its complex and global supply chains, faces significant challenges in ensuring security, transparency, and efficiency. Blockchain technology, with its decentralized and immutable ledger, offers a transformative solution to these issues. This paper explores the application of blockchain technology in the oil and gas supply chain, highlighting its potential to enhance data security, improve transparency, and streamline operations. By leveraging smart contracts, blockchain can automate and secure transactions, reducing the risk of fraud and errors. Additionally, the integration of blockchain with IoT devices enables real-time tracking and monitoring of assets, ensuring data accuracy and integrity throughout the supply chain. Case studies and pilot projects within the industry demonstrate the practical benefits and challenges of implementing blockchain solutions. The findings suggest that blockchain technology can significantly improve trust and collaboration among supply chain participants, ultimately leading to more efficient and resilient operations. This study provides valuable insights for industry stakeholders considering the adoption of blockchain technology to address their supply chain management challenges.
Keywords: blockchain technology, oil and gas supply chain, data security, transparency, smart contracts, IoT integration, real-time tracking, asset monitoring, fraud reduction, supply chain efficiency, data integrity, case studies, industry implementation, trust, collaboration.
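The tamper-evidence property that underlies such a ledger can be illustrated with a toy hash chain, sketched below; this is not a production blockchain or a smart contract, only a demonstration that editing a historical custody record invalidates every later entry.

```python
# Toy illustration of why a hash-chained ledger makes supply-chain records tamper-evident:
# each entry stores the hash of the previous one, so editing an old custody record breaks
# verification of everything after it. A sketch, not a production blockchain.
import hashlib, json, time

def make_block(prev_hash, payload):
    block = {"time": time.time(), "prev_hash": prev_hash, "payload": payload}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    for prev, cur in zip(chain, chain[1:]):
        body = {k: v for k, v in cur.items() if k != "hash"}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if cur["prev_hash"] != prev["hash"] or cur["hash"] != recomputed:
            return False
    return True

chain = [make_block("0" * 64, {"event": "genesis"})]
for event in ({"event": "crude loaded", "volume_bbl": 10_000},      # hypothetical custody events
              {"event": "custody transfer", "carrier": "VesselX"}):
    chain.append(make_block(chain[-1]["hash"], event))

print(verify(chain))                        # True
chain[1]["payload"]["volume_bbl"] = 9_000   # tamper with a historical record
print(verify(chain))                        # False
```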
Procedia PDF Downloads 36