Search results for: Motion Detection
2945 Fusion Models for Cyber Threat Defense: Integrating Clustering, Random Forests, and Support Vector Machines to Defend Against Windows Malware
Authors: Azita Ramezani, Atousa Ramezani
Abstract:
In the ever-escalating landscape of Windows malware, the need for new defense strategies has become undeniable. This study introduces an approach fusing the capabilities of clustering, random forests, and support vector machines (SVM) to combat the intricate web of cyber threats. The fusion model achieves an accuracy of 98.67% and an F1-score of 98.68%, a testament to its effectiveness in Windows malware defense. By deciphering the patterns within malicious code, the model not only raises the bar for detection precision but also strengthens cybersecurity preparedness. This result underscores the potential of fusing diverse analytical methodologies and signals a shift in fortifying against the continuing evolution of Windows malware threats. As the cybersecurity terrain evolves, this research points toward a resilient future in which fusion models stand at the forefront of cyber threat defense.
Keywords: fusion models, cyber threat defense, windows malware, clustering, random forests, support vector machines (SVM), accuracy, f1-score, cybersecurity, malicious code detection
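The fused pipeline described above (cluster labels used as extra features, then random-forest and SVM predictions combined) can be sketched with scikit-learn on synthetic data; the dataset, feature counts, and hyperparameters below are illustrative assumptions, not the authors' actual setup:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

# Synthetic stand-in for Windows malware feature vectors (hypothetical data).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Step 1: augment the features with a cluster-membership column.
km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X)
X_aug = np.column_stack([X, km.labels_])

X_tr, X_te, y_tr, y_te = train_test_split(X_aug, y, test_size=0.3, random_state=0)

# Step 2: fuse random forest and SVM by soft voting on predicted probabilities.
fusion = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("svm", SVC(probability=True, random_state=0))],
    voting="soft",
).fit(X_tr, y_tr)

pred = fusion.predict(X_te)
print(round(accuracy_score(y_te, pred), 2), round(f1_score(y_te, pred), 2))
```

On this synthetic task the fused model scores well above chance; the 98.67% figure in the abstract refers to the authors' own malware dataset.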
Procedia PDF Downloads 71
2944 Bridging Urban Planning and Environmental Conservation: A Regional Analysis of Northern and Central Kolkata
Authors: Tanmay Bisen, Aastha Shayla
Abstract:
This study introduces an advanced approach to tree canopy detection in urban environments and a regional analysis of Northern and Central Kolkata that delves into the intricate relationship between urban development and environmental conservation. Leveraging high-resolution drone imagery from diverse urban green spaces in Kolkata, we fine-tuned the deep forest model to enhance its precision and accuracy. Our results, characterized by an impressive Intersection over Union (IoU) score of 0.90 and a mean average precision (mAP) of 0.87, underscore the model's robustness in detecting and classifying tree crowns amidst the complexities of aerial imagery. This research not only emphasizes the importance of model customization for specific datasets but also highlights the potential of drone-based remote sensing in urban forestry studies. The study investigates the spatial distribution, density, and environmental impact of trees in Northern and Central Kolkata. The findings underscore the significance of urban green spaces in metropolitan cities, emphasizing the need for sustainable urban planning that integrates green infrastructure for ecological balance and human well-being.
Keywords: urban greenery, advanced spatial distribution analysis, drone imagery, deep learning, tree detection
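The Intersection over Union (IoU) score the study reports is computed per detected crown as intersection area over union area; a minimal sketch for axis-aligned boxes (the box coordinates below are made up):

```python
def iou(box_a, box_b):
    # Boxes given as (x1, y1, x2, y2) corner coordinates.
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.143
```

An IoU of 0.90 means a predicted crown overlaps its ground-truth crown almost completely.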
Procedia PDF Downloads 56
2943 Detection of Some Drugs of Abuse from Fingerprints Using Liquid Chromatography-Mass Spectrometry
Authors: Ragaa T. Darwish, Maha A. Demellawy, Haidy M. Megahed, Doreen N. Younan, Wael S. Kholeif
Abstract:
The testing of drug abuse is essential in order to confirm the misuse of drugs. Several analytical approaches have been developed for the detection of drugs of abuse in pharmaceutical and common biological samples, but few methodologies have been created to identify them from fingerprints. Liquid Chromatography-Mass Spectrometry (LC-MS) plays a major role in this field. The current study aimed at assessing the possibility of detecting some drugs of abuse (tramadol, clonazepam, and phenobarbital) from fingerprints using LC-MS in drug abusers. The aim was extended to assess the possibility of detecting the above-mentioned drugs in fingerprints of drug handlers up to three days after handling the drugs. The study was conducted on randomly selected adult individuals who were either drug abusers seeking treatment at centers of drug dependence in Alexandria, Egypt, or normal volunteers who were asked to handle the studied drugs (drug handlers). An informed consent was obtained from all individuals. Participants were classified into three groups: a control group consisting of 50 normal individuals (neither abusing nor handling drugs); a drug abuser group consisting of 30 individuals who abused tramadol, clonazepam, or phenobarbital (10 individuals for each drug); and a drug handler group consisting of 50 individuals who touched either the powder of the drugs of abuse (tramadol, clonazepam, or phenobarbital; 10 individuals for each drug) or the powder of control substances of similar appearance (white powders) that might be used in the adulteration of drugs of abuse (acetylsalicylic acid and acetaminophen; 10 individuals for each drug). Samples were taken from the handler individuals for three consecutive days for the same individual. The diagnosis of drug abusers was based on the current Diagnostic and Statistical Manual of Mental Disorders (DSM-V) and urine screening tests using an immunoassay technique.
Preliminary drug screening tests of urine samples were also done for the drug handler and control groups to indicate the presence or absence of the studied drugs of abuse. Fingerprints of all participants were then taken on a filter paper previously soaked with methanol, to be analyzed by LC-MS using a SCIEX Triple Quad or QTRAP 5500 system. The concentration of drugs in each sample was calculated using the regression equations between concentration in ng/ml and peak area of each reference standard. All fingerprint samples from drug abusers showed positive results with LC-MS for the tested drugs, while all samples from the control individuals showed negative results. A significant difference was noted between the concentration of the drugs and the duration of abuse. Tramadol, clonazepam, and phenobarbital were also successfully detected from fingerprints of drug handlers up to 3 days after handling the drugs. The mean concentration of the chosen drugs of abuse among the handler group decreased as the number of days since handling increased.
Keywords: drugs of abuse, fingerprints, liquid chromatography–mass spectrometry, tramadol
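The quantification step described above (concentration derived from peak area via a reference-standard regression) can be sketched as follows; the calibration points are hypothetical, not the study's data:

```python
import numpy as np

# Hypothetical calibration: reference-standard concentrations (ng/ml) vs. peak areas.
conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
area = np.array([230.0, 1150.0, 2300.0, 11500.0, 23000.0])

# Linear regression: concentration = slope * peak_area + intercept.
slope, intercept = np.polyfit(area, conc, 1)

def quantify(peak_area):
    return slope * peak_area + intercept

print(round(quantify(4600.0), 2))  # 20.0 ng/ml for this made-up calibration
```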
Procedia PDF Downloads 121
2942 On the Use of Machine Learning for Tamper Detection
Authors: Basel Halak, Christian Hall, Syed Abdul Father, Nelson Chow Wai Kit, Ruwaydah Widaad Raymode
Abstract:
The attack surface on computing devices is becoming very sophisticated, driven by the sheer increase in interconnected devices, expected to reach 50B in 2025, which makes it easier for adversaries to gain direct access and perform well-known physical attacks. The impact of increased security vulnerability of electronic systems is exacerbated for devices that are part of critical infrastructure or those used in military applications, where the likelihood of being targeted is very high. This continuously evolving landscape of security threats calls for a new generation of defense methods that are equally effective and adaptive. This paper proposes an intelligent defense mechanism to protect against physical tampering. It consists of a tamper detection system enhanced with machine learning capabilities, which allows it to recognize normal operating conditions, classify known physical attacks, and identify new types of malicious behavior. A prototype of the proposed system has been implemented, and its functionality has been successfully verified for two types of normal operating conditions and a further four forms of physical attacks. In addition, a systematic threat modeling analysis and security validation was carried out, which indicated that the proposed solution provides better protection against threats including information leakage, loss of data, and disruption of operation.
Keywords: anti-tamper, hardware, machine learning, physical security, embedded devices, IoT
Procedia PDF Downloads 153
2941 Steel Bridge Coating Inspection Using Image Processing with Neural Network Approach
Authors: Ahmed Elbeheri, Tarek Zayed
Abstract:
Steel bridge deterioration has been a problem in North America for years, mainly attributable to harsh weather conditions. Steel bridges suffer fatigue cracks and corrosion, which necessitate immediate inspection. Visual inspection is the most common technique for steel bridge inspection, but it depends on the inspector's experience, conditions, and work environment. Many Non-destructive Evaluation (NDE) models have therefore been developed that use non-destructive technologies to be more accurate, reliable, and less human-dependent. Non-destructive techniques such as the eddy current method, the radiographic method (RT), the ultrasonic method (UT), infrared thermography, and laser technology have been used. Digital image processing is used here for corrosion detection as an alternative to visual inspection. Earlier models used grey-level and colored digital images for processing; however, color images proved better, as the color of rust distinguishes it from different backgrounds. Rust detection is an important process, as rust is the first warning of corrosion and a sign of coating erosion. To decide which steel element should be repainted, and how urgently, the percentage of rust must be calculated. In this paper, an image processing approach is developed to detect corrosion and its severity. Two models were developed: the first to detect rust and the second to quantify the rust percentage.
Keywords: steel bridge, bridge inspection, steel corrosion, image processing
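A minimal sketch of the color-based rust detection and rust-percentage calculation the paper describes; the RGB thresholds and the synthetic image are illustrative assumptions, not the authors' trained models:

```python
import numpy as np

def rust_percentage(rgb):
    """Flag rust-like pixels (reddish-brown: red clearly above green and blue)
    and return their share of the image, as a rough stand-in for the paper's
    two-stage detect-then-quantify pipeline. Thresholds are hypothetical."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    rust = (r > 100) & (r - g > 40) & (r - b > 40)
    return 100.0 * rust.sum() / rust.size

# Synthetic 4x4 image: top half rust-colored, bottom half grey steel.
img = np.zeros((4, 4, 3), dtype=np.uint8)
img[:2] = (150, 70, 40)    # rust-like pixels
img[2:] = (120, 120, 120)  # steel-like pixels
print(rust_percentage(img))  # 50.0
```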
Procedia PDF Downloads 306
2940 Carbon-Nanodots Modified Glassy Carbon Electrode for the Electroanalysis of Selenium in Water
Authors: Azeez O. Idris, Benjamin O. Orimolade, Potlako J. Mafa, Alex T. Kuvarega, Usisipho Feleni, Bhekie B. Mamba
Abstract:
We report a simple and low-cost method for the electrochemical detection of Se(IV) using carbon nanodots (CNDTs) prepared from oat. The carbon nanodots were synthesised by a green and facile approach and characterised using scanning electron microscopy, high-resolution transmission electron microscopy, Fourier transform infrared spectroscopy, X-ray diffraction, and Raman spectroscopy. The CNDT was used to fabricate an electrochemical sensor for the quantification of Se(IV) in water. The modification of a glassy carbon electrode (GCE) with carbon nanodots led to an increase in the electroactive surface area of the electrode, which enhances the redox peak current of [Fe(CN)₆]³⁻/⁴⁻ in comparison to the bare GCE. Using square wave voltammetry, a detection limit and quantification limit of 0.05 and 0.167 ppb were obtained under the optimised parameters: a deposition potential of -200 mV, 0.1 M HNO₃ electrolyte, an electrodeposition time of 60 s, and pH 1. The results further revealed that the GCE-CNDT was not susceptible to many interfering cations, except Cu(II), Pb(II), and Fe(II). The sensor fabrication involves a one-step electrode modification, and the sensor was used to detect Se(IV) in a real water sample; the result obtained is in agreement with the inductively coupled plasma technique. Overall, the electrode offers a cheap, fast, and sensitive way of detecting selenium in environmental matrices.
Keywords: carbon nanodots, square wave voltammetry, nanomaterials, selenium, sensor
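The detection and quantification limits reported above follow the usual 3σ/S and 10σ/S conventions; a sketch with hypothetical blank noise and calibration slope chosen so the numbers match the reported 0.05 and 0.167 ppb:

```python
# Limit of detection (LOD) and quantification (LOQ) from blank noise and
# calibration slope, using the standard 3σ/S and 10σ/S conventions.
# The numbers below are illustrative, not the paper's raw data.
sigma_blank = 0.005   # standard deviation of the blank signal (hypothetical, µA)
slope = 0.30          # calibration sensitivity (hypothetical, µA per ppb)

lod = 3 * sigma_blank / slope
loq = 10 * sigma_blank / slope
print(round(lod, 3), round(loq, 3))  # 0.05 0.167
```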
Procedia PDF Downloads 91
2939 Ischemic Stroke Detection in Computed Tomography Examinations
Authors: Allan F. F. Alves, Fernando A. Bacchim Neto, Guilherme Giacomini, Marcela de Oliveira, Ana L. M. Pavan, Maria E. D. Rosa, Diana R. Pina
Abstract:
Stroke is a worldwide concern; in Brazil alone it accounts for 10% of all registered deaths. There are two stroke types, ischemic (87%) and hemorrhagic (13%). Early diagnosis is essential to avoid irreversible cerebral damage. Non-enhanced computed tomography (NECT) is one of the main diagnostic techniques used due to its wide availability and rapid diagnosis. Detection depends on the size and severity of lesions and the time elapsed between the first symptoms and examination. The Alberta Stroke Program Early CT Score (ASPECTS) is a subjective method that increases the detection rate. The aim of this work was to implement an image segmentation system to enhance ischemic stroke and to quantify the area of ischemic and hemorrhagic stroke lesions in CT scans. We evaluated 10 patients with NECT examinations diagnosed with ischemic stroke. Analyses were performed on two axial slices, one at the level of the thalamus and basal ganglia and one adjacent to the top edge of the ganglionic structures, with a window width between 80 and 100 Hounsfield Units. We used different image processing techniques such as morphological filters, the discrete wavelet transform, and Fuzzy C-means clustering. Subjective analyses were performed by a neuroradiologist according to the ASPECTS scale to quantify ischemic areas in the middle cerebral artery region. These subjective analysis results were compared with objective analyses performed by the computational algorithm. Preliminary results indicate that the morphological filters actually improve the visibility of ischemic areas for subjective evaluations. The comparison between the area of the ischemic region contoured by the neuroradiologist and the area defined by the computational algorithm showed no deviations greater than 12% in any of the 10 examinations, although the areas contoured by the neuroradiologist tended to be smaller than those obtained by the algorithm.
These results show the importance of computer-aided diagnosis software to assist neuroradiology decisions, especially in critical situations such as the choice of treatment for ischemic stroke.
Keywords: ischemic stroke, image processing, CT scans, Fuzzy C-means
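A minimal Fuzzy C-means sketch on one-dimensional Hounsfield-unit values, illustrating how hypodense ischemic pixels separate from normal tissue; the implementation and HU values are simplified assumptions, not the authors' algorithm:

```python
import numpy as np

def fuzzy_cmeans_1d(x, c=2, m=2.0, iters=50, seed=0):
    """Minimal fuzzy C-means on 1-D intensity values.
    u[i, k] is the membership of sample k in cluster i; memberships are
    updated with the standard 1 / sum_j (d_ik / d_jk)^(2/(m-1)) rule."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)
    for _ in range(iters):
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)
        d = np.abs(centers[:, None] - x[None, :]) + 1e-12
        p = 2.0 / (m - 1.0)
        u = 1.0 / (d ** p * (1.0 / d ** p).sum(axis=0))
    return centers, u

# Hounsfield-unit-like values: normal tissue (~35 HU) vs. hypodense ischemic core (~20 HU).
hu = np.array([34.0, 36.0, 35.0, 33.0, 21.0, 19.0, 20.0, 22.0])
centers, u = fuzzy_cmeans_1d(hu)
print(np.sort(centers))  # two centers: low (ischemic-like) and high (normal-like)
```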
Procedia PDF Downloads 366
2938 Hamilton-Jacobi Treatment of Damped Motion
Authors: Khaled I. Nawafleh
Abstract:
In this work, we apply the Hamilton-Jacobi method to obtain solutions of Hamiltonian systems in classical mechanics with two particular structures: the first plays a central role in the theory of time-dependent Hamiltonians, whilst the second is used to treat classical Hamiltonians that include dissipation terms. It is proved that the generalization of problems from the calculus of variations to the nonstationary case can be obtained naturally in the Hamilton-Jacobi formalism. Then, another expression of the geometry of the Hamilton-Jacobi equation is derived for Hamiltonians with time-dependent and frictional terms. Both approaches are applied to many physical examples.
Keywords: Hamilton-Jacobi, time-dependent Lagrangians, dissipative systems, variational principle
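As an illustrative instance of the second structure (a Hamiltonian with a dissipation term), the classic Caldirola-Kanai treatment of the damped harmonic oscillator fits this framework; the notation (mass m, frequency ω, damping γ) is assumed here, and this is a standard textbook example rather than necessarily the authors' own system:

```latex
L = e^{\gamma t}\left(\tfrac{1}{2} m \dot{x}^{2} - \tfrac{1}{2} m \omega^{2} x^{2}\right),
\qquad
H = e^{-\gamma t}\,\frac{p^{2}}{2m} + e^{\gamma t}\,\tfrac{1}{2} m \omega^{2} x^{2},
```

with the corresponding Hamilton-Jacobi equation for the principal function S(x, t):

```latex
\frac{\partial S}{\partial t}
+ \frac{e^{-\gamma t}}{2m}\left(\frac{\partial S}{\partial x}\right)^{2}
+ e^{\gamma t}\,\tfrac{1}{2} m \omega^{2} x^{2} = 0,
```

whose characteristics reproduce the damped equation of motion \ddot{x} + \gamma \dot{x} + \omega^{2} x = 0.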
Procedia PDF Downloads 179
2937 An Experimental Study on the Optimum Installation of Fire Detector for Early Stage Fire Detecting in Rack-Type Warehouses
Authors: Ki Ok Choi, Sung Ho Hong, Dong Suck Kim, Don Mook Choi
Abstract:
Rack type warehouses differ from general buildings in the kinds, amount, and arrangement of stored goods, so their fire risk is also different. The fire pattern of rack type warehouses differs in the combustion characteristics and storage conditions of the stored goods. The initial burning rate depends on the surface condition of the materials, but the spread of fire is closely related to the kinds of stored materials and the storage conditions. The stored goods of a warehouse consist of diverse combustibles, combustible liquids, and so on. Fire detection may be delayed because occupancy is lower than in office and commercial buildings. If the fire detectors installed in rack type warehouses are unsuitable, a warehouse fire may grow into a major fire because of delayed detection. In this paper, we studied which kinds of fire detectors are best suited to the early detection of rack type warehouse fires through real-scale fire tests. The fire detectors used in the tests were rate-of-rise, fixed-temperature, photoelectric, and aspirating type detectors. We propose an optimum fire detection method for rack type warehouses based on the response characteristics and a comparative analysis of the fire detectors.
Keywords: fire detector, rack, response characteristic, warehouse
Procedia PDF Downloads 745
2936 Self-Supervised Learning for Hate-Speech Identification
Authors: Shrabani Ghosh
Abstract:
Automatic offensive language detection in social media has become a pressing task in today's NLP. Manual offensive language detection is tedious and laborious work, for which automatic methods based on machine learning are the only alternative. Previous works have performed sentiment analysis over social media in different ways, such as in supervised, semi-supervised, and unsupervised manners. Domain adaptation in a semi-supervised way has also been explored in NLP, where the source domain and the target domain are different. In domain adaptation, the source domain usually has a large amount of labeled data, while only a limited amount of labeled data is available in the target domain. Pretrained transformers like BERT and RoBERTa are further pre-trained with masked language modeling (MLM) and then fine-tuned to perform text classification. In previous work, hate speech detection has been explored on Gab.ai, a free speech platform described as hosting extremist content in varying degrees. In the domain adaptation process, Twitter data is used as the source domain, and Gab data is used as the target domain. The performance of domain adaptation also depends on the cross-domain similarity. Different distance measures such as L2 distance, cosine distance, Maximum Mean Discrepancy (MMD), Fisher Linear Discriminant (FLD), and CORAL have been used to estimate domain similarity. Naturally, in-domain distances are small, and between-domain distances are expected to be large. Previous findings show that a pretrained masked language model (MLM) fine-tuned with a mixture of posts from the source and target domains gives higher accuracy. However, the in-domain accuracy of the hate classifier on Twitter data is 71.78%, and the out-of-domain performance of the hate classifier on Gab data drops to 56.53%. Recently, self-supervised learning has received a lot of attention, as it is more applicable when labeled data are scarce.
A few works have already explored applying self-supervised learning to NLP tasks such as sentiment classification. The self-supervised language representation model ALBERT focuses on modeling inter-sentence coherence and helps downstream tasks with multi-sentence inputs. A self-supervised attention learning approach shows better performance, as it exploits extracted context words in the training process. In this work, a self-supervised attention mechanism is proposed to detect hate speech on Gab.ai. The framework initially classifies the Gab dataset in an attention-based self-supervised manner. In the next step, a semi-supervised classifier is trained on the combination of labeled data from the first step and unlabeled data. The performance of the proposed framework will be compared with the results described earlier and also with optimized outcomes obtained from different optimization techniques.
Keywords: attention learning, language model, offensive language detection, self-supervised learning
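One of the domain-similarity measures mentioned above, a linear-kernel Maximum Mean Discrepancy, reduces to the squared distance between mean embedding vectors; a sketch on synthetic embeddings (the data and dimensions are made up):

```python
import numpy as np

def mmd_linear(X, Y):
    """Linear-kernel Maximum Mean Discrepancy between two embedding sets:
    the squared Euclidean distance between their mean vectors (a sketch of
    the simplest MMD variant, not the kernelized version)."""
    diff = X.mean(axis=0) - Y.mean(axis=0)
    return float(diff @ diff)

rng = np.random.default_rng(0)
source = rng.normal(0.0, 1.0, (500, 16))      # e.g. Twitter post embeddings
target_near = rng.normal(0.1, 1.0, (500, 16)) # a similar domain
target_far = rng.normal(1.0, 1.0, (500, 16))  # e.g. a dissimilar Gab-like domain

# In-domain-like distances should be smaller than between-domain distances.
print(mmd_linear(source, target_near) < mmd_linear(source, target_far))  # True
```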
Procedia PDF Downloads 105
2935 Noninvasive Disease Diagnosis through Breath Analysis Using DNA-functionalized SWNT Sensor Array
Authors: W. J. Zhang, Y. Q. Du, M. L. Wang
Abstract:
Noninvasive diagnostics of diseases via breath analysis has attracted considerable scientific and clinical interest for many years and has become more and more promising with the rapid advancement of nanotechnology and biotechnology. The volatile organic compounds (VOCs) in exhaled breath, which are mainly blood borne, provide particularly valuable information about an individual's physiological and pathophysiological condition. Additionally, breath analysis is noninvasive, real-time, painless, and agreeable to patients. We have developed a wireless sensor array based on single-stranded DNA (ssDNA)-decorated single-walled carbon nanotubes (SWNT) for the detection of a number of physiological indicators in breath. Eight DNA sequences were used to functionalize SWNT sensors to detect trace amounts of methanol, benzene, dimethyl sulfide, hydrogen sulfide, acetone, and ethanol, which are indicators of heavy smoking, excessive drinking, and diseases such as lung cancer, breast cancer, cirrhosis, and diabetes. Our tests indicated that DNA functionalized SWNT sensors exhibit great selectivity, sensitivity, reproducibility, and repeatability. Furthermore, different molecules can be distinguished through pattern recognition enabled by this sensor array. Thus, the DNA-SWNT sensor array has great potential to be applied in chemical or biomolecular detection for the noninvasive diagnosis of diseases and health monitoring.
Keywords: breath analysis, diagnosis, DNA-SWNT sensor array, noninvasive
Procedia PDF Downloads 348
2934 PPB-Level H₂ Gas-Sensor Based on Porous Ni-MOF Derived NiO@CuO Nanoflowers for Superior Sensing Performance
Authors: Shah Sufaid, Hussain Shahid, Tianyan You, Liu Guiwu, Qiao Guanjun
Abstract:
Nickel oxide (NiO) is an optimal material for precise detection of hydrogen (H₂) gas due to its high catalytic activity and low resistivity. However, the response kinetics of H₂ gas molecules at the NiO surface are limited by its dense solid structure, leading to a diminished gas response value and slow electron-hole transport. Herein, NiO@CuO NFs with a porous sharp-tip and nanosphere morphology were successfully synthesized using a metal-organic framework (MOF) as a precursor. The fabricated porous 2 wt% NiO@CuO NFs present outstanding selectivity towards H₂ gas, including a high response value (170 toward 20 ppm at 150 °C), much higher than that of porous Ni-MOF (6), a low detection limit (300 ppb) with a notable response (21), short response and recovery times (40/63 s at 300 ppb and 100/167 s at 20 ppm), and exceptional long-term stability and repeatability. Furthermore, an understanding of the NiO@CuO sensor's functioning in a realistic environment was obtained by examining the impact of relative humidity as well. The boosted hydrogen sensing properties may be attributed to the synergistic effects of numerous factors, including the p-p heterojunction at the interface between the NiO and CuO nanoflowers. In particular, the porous Ni-MOF-derived structure combined with the chemical sensitization effect of NiO on the rough surface of the CuO nanospheres is examined. This research presents an effective method for the development of Ni-MOF derived metal oxide semiconductor (MOS) heterostructures with well-defined morphology and composition, suitable for gas sensing applications.
Keywords: NiO@CuO NFs, metal organic framework, porous structure, H₂, gas sensing
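For a p-type oxide such as NiO@CuO exposed to a reducing gas, the response value is commonly reported as a resistance ratio; a sketch with hypothetical resistances chosen to reproduce the reported response of 170 (this Rg/Ra definition is an assumption, since the paper may use a different convention):

```python
# p-type sensor response to a reducing gas, commonly reported as Rg/Ra
# (resistance in gas over resistance in air). All values are hypothetical.
def response(r_air, r_gas):
    return r_gas / r_air

r_air = 1.0e5        # baseline resistance in air (ohm, hypothetical)
r_gas_20ppm = 1.7e7  # resistance in 20 ppm H2 (ohm, hypothetical)
print(response(r_air, r_gas_20ppm))  # 170.0, matching the reported response value
```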
Procedia PDF Downloads 44
2933 Experimental Device for Fluorescence Measurement by Optical Fiber Combined with Dielectrophoretic Sorting in Microfluidic Chips
Authors: Jan Jezek, Zdenek Pilat, Filip Smatlo, Pavel Zemanek
Abstract:
We present a device that combines fluorescence spectroscopy with fiber optics and dielectrophoretic micromanipulation in PDMS (poly(dimethylsiloxane)) microfluidic chips. The device allows high speed detection (in the order of kHz) of the fluorescence signal collected from the sample by an inserted optical fiber, e.g. from a micro-droplet flow in a microfluidic chip, or even from liquid flowing in a transparent capillary. The device uses a laser diode at a wavelength suitable for excitation of fluorescence, excitation and emission filters, optics for focusing the laser radiation into the optical fiber, and a highly sensitive fast photodiode for detection of fluorescence. The device is combined with on-chip dielectrophoretic sorting of micro-droplets according to their fluorescence intensity. The electrodes are created by lift-off technology on a glass substrate, or by using channels filled with a soft metal alloy or an electrolyte. This device has found use in the screening of enzymatic reactions and the sorting of individual fluorescently labelled microorganisms. The authors acknowledge the support from the Grant Agency of the Czech Republic (GA16-07965S) and the Ministry of Education, Youth and Sports of the Czech Republic (LO1212) together with the European Commission (ALISI No. CZ.1.05/2.1.00/01.0017).
Keywords: dielectrophoretic sorting, fiber optics, laser, microfluidic chips, microdroplets, spectroscopy
Procedia PDF Downloads 719
2932 Using MALDI-TOF MS to Detect Environmental Microplastics (Polyethylene, Polyethylene Terephthalate, and Polystyrene) within a Simulated Tissue Sample
Authors: Kara J. Coffman-Rea, Karen E. Samonds
Abstract:
Microplastic pollution is an urgent global threat to our planet and human health. Microplastic particles have been detected within our food, water, and atmosphere, and found within the human stool, placenta, and lung tissue. However, most spectrometric microplastic detection methods require chemical digestion which can alter or destroy microplastic particles and makes it impossible to acquire information about their in-situ distribution. MALDI TOF MS (Matrix-assisted laser desorption ionization-time of flight mass spectrometry) is an analytical method using a soft ionization technique that can be used for polymer analysis. This method provides a valuable opportunity to both acquire information regarding the in-situ distribution of microplastics and also minimizes the destructive element of chemical digestion. In addition, MALDI TOF MS allows for expanded analysis of the microplastics including detection of specific additives that may be present within them. MALDI TOF MS is particularly sensitive to sample preparation and has not yet been used to analyze environmental microplastics within their specific location (e.g., biological tissues, sediment, water). In this study, microplastics were created using polyethylene gloves, polystyrene micro-foam, and polyethylene terephthalate cable sleeving. Plastics were frozen using liquid nitrogen and ground to obtain small fragments. An artificial tissue was created using a cellulose sponge as scaffolding coated with a MaxGel Extracellular Matrix to simulate human lung tissue. Optimal preparation techniques (e.g., matrix, cationization reagent, solvent, mixing ratio, laser intensity) were first established for each specific polymer type. The artificial tissue sample was subsequently spiked with microplastics, and specific polymers were detected using MALDI-TOF-MS. 
This study presents a novel method for the detection of environmental polyethylene, polyethylene terephthalate, and polystyrene microplastics within a complex sample. The results of this study provide an effective method that can be used in future microplastics research and can aid in determining the potential threats that microplastics pose to environmental and human health.
Keywords: environmental plastic pollution, MALDI-TOF MS, microplastics, polymer identification
Procedia PDF Downloads 256
2931 Analysis and Design of Exo-Skeleton System Based on Multibody Dynamics
Authors: Jatin Gupta, Bishakh Bhattacharya
Abstract:
With the aging process, many people start suffering from weak limbs, resulting in mobility disorders and loss of sensory and motor function of the limbs. Wearable robotic devices are viable solutions to help people suffering from these issues by augmenting their strength. These robotic devices, popularly known as exoskeletons, aid the user by providing external power and controlling the dynamics so as to achieve the desired motion. The present work studies a simplified dynamic model of the human gait. A four-link open chain kinematic model is developed to describe the dynamics of the Single Support Phase (SSP) of the human gait cycle. The dynamic model is developed by integrating mathematical models of the motion of inverted and triple pendulums. The stance leg is modeled as an inverted pendulum having a single degree of freedom, and the swing leg as a triple pendulum having three degrees of freedom, viz. the thigh, knee, and ankle joints. The kinematic model is formulated using a forward kinematics approach. A Lagrangian approach is used to formulate the governing dynamic equations of the model. For the resulting system of nonlinear differential equations, a numerical method is employed to obtain the system response. The reference trajectory is generated using the human body simulator LifeMOD. For the optimal mechanical design and controller design of an exoskeleton system, it is imperative to study the parameter sensitivity of the system. Six different parameters, viz. the thigh, shank, and foot masses and lengths, are varied from 85% to 115% of their original values in the present work. It is observed that the hip joint of the swing leg is the most sensitive and the ankle joint of the swing leg is the least sensitive. Changing link lengths causes more deviation in the system response than changing link masses. Also, the shank length and thigh mass are the most sensitive parameters.
Finally, the present study gives an insight into the different factors that should be considered while designing a lower extremity exoskeleton.
Keywords: lower limb exoskeleton, multibody dynamics, energy based formulation, optimal design
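The stance-leg model described above (a single-degree-of-freedom inverted pendulum) can be integrated numerically; a minimal RK4 sketch with illustrative parameters, not the paper's LifeMOD-derived values:

```python
import math

# Stance leg as an inverted pendulum (point mass at length l), with theta
# measured from the upright vertical: theta_ddot = (g / l) * sin(theta).
# Parameters are illustrative assumptions.
g, l = 9.81, 0.9

def deriv(state):
    theta, omega = state
    return (omega, (g / l) * math.sin(theta))

def rk4_step(state, dt):
    # Classic fourth-order Runge-Kutta step for the 2-state system.
    k1 = deriv(state)
    k2 = deriv((state[0] + 0.5 * dt * k1[0], state[1] + 0.5 * dt * k1[1]))
    k3 = deriv((state[0] + 0.5 * dt * k2[0], state[1] + 0.5 * dt * k2[1]))
    k4 = deriv((state[0] + dt * k3[0], state[1] + dt * k3[1]))
    return (state[0] + dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6,
            state[1] + dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6)

state = (0.05, 0.0)  # small initial lean, starting at rest
for _ in range(100):  # integrate 0.5 s with dt = 0.005 s
    state = rk4_step(state, 0.005)
print(round(state[0], 4))  # the lean angle grows: the upright posture is unstable
```

The growing angle illustrates why the stance phase requires active control, which is exactly where the exoskeleton contributes.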
Procedia PDF Downloads 200
2930 Image Processing Techniques for Surveillance in Outdoor Environment
Authors: Jayanth C., Anirudh Sai Yetikuri, Kavitha S. N.
Abstract:
This paper explores the development and application of computer vision and machine learning techniques for real-time pose detection, facial recognition, and number plate extraction. Utilizing MediaPipe for pose estimation, the research presents methods for detecting hand raises and ducking postures through real-time video analysis. Complementarily, facial recognition is employed to compare and verify individual identities using the face recognition library. Additionally, the paper demonstrates a robust approach for extracting and storing vehicle number plates from images, integrating Optical Character Recognition (OCR) with a database management system. The study highlights the effectiveness and versatility of these technologies in practical scenarios, including security and surveillance applications. The findings underscore the potential of combining computer vision techniques to address diverse challenges and enhance automated systems for both individual and vehicular identification. This research contributes to the fields of computer vision and machine learning by providing scalable solutions and demonstrating their applicability in real-world contexts.
Keywords: computer vision, pose detection, facial recognition, number plate extraction, machine learning, real-time analysis, OCR, database management
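The hand-raise detection described above reduces to a simple rule on pose landmarks; a sketch assuming MediaPipe-style normalized coordinates where y grows downward (the landmark values below are made up):

```python
# Hand-raise rule on normalized pose landmarks. In MediaPipe's convention
# the y coordinate grows downward, so a wrist ABOVE its shoulder has a
# SMALLER y. The landmark dictionary is a simplified stand-in for the full
# landmark set, not the library's actual output type.
def hand_raised(landmarks):
    return (landmarks["left_wrist"][1] < landmarks["left_shoulder"][1] or
            landmarks["right_wrist"][1] < landmarks["right_shoulder"][1])

pose = {"left_shoulder": (0.40, 0.35), "right_shoulder": (0.60, 0.35),
        "left_wrist": (0.38, 0.20),    "right_wrist": (0.62, 0.55)}
print(hand_raised(pose))  # True: the left wrist is above the left shoulder
```

A ducking rule can be built the same way, e.g. by comparing the nose landmark's y against a running baseline.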
Procedia PDF Downloads 26
2929 Diagnosis and Analysis of Automated Liver and Tumor Segmentation on CT
Authors: R. R. Ramsheeja, R. Sreeraj
Abstract:
A wide range of medical imaging modalities is available nowadays for viewing the internal structures of the human body, such as the liver, brain, and kidneys. Computed Tomography (CT) is one of the most significant medical imaging modalities. In this paper, CT liver images are used to study automatic computer-aided techniques for calculating liver tumor volume. A segmentation method for the detection of tumors from CT scans is proposed. A Gaussian filter is used for denoising the liver image, and an adaptive thresholding algorithm is used for segmentation. A multiple Region Of Interest (ROI) based method helps to characterize feature differences and has a significant impact on classification performance. Due to the characteristics of liver tumor lesions, feature selection presents inherent difficulties. For better performance, a novel system is introduced in which multiple ROI based feature selection and classification are performed. Obtaining relevant features for the Support Vector Machine (SVM) classifier is important for better generalization performance. The proposed system improves classification performance while using a significantly reduced set of features. The diagnosis of liver cancer from computed tomography images is very difficult in nature; early detection of liver tumors is very helpful for saving human life.
Keywords: computed tomography (CT), multiple region of interest (ROI), feature values, segmentation, SVM classification
Procedia PDF Downloads 509
2928 Automatic Detection of Defects in Ornamental Limestone Using Wavelets
Authors: Maria C. Proença, Marco Aniceto, Pedro N. Santos, José C. Freitas
Abstract:
A methodology based on wavelets is proposed for the automatic location and delimitation of defects in limestone plates. Natural defects include dark colored spots, crystal zones trapped in the stone, areas of abnormal contrast colors, cracks or fracture lines, and fossil patterns. Although some of these may or may not be considered defects depending on the intended use of the plate, the goal is to pair each stone with a map of defects that can be overlaid on a computer display. These layers of defects constitute a database that will allow the preliminary selection of matching tiles of a particular variety, with specific dimensions, for a requirement of N square meters, to be done on a desktop computer rather than by a two-hour search in the storage park, with human operators manipulating stone plates as large as 3 m x 2 m, weighing about one ton. Accident risks and work times are reduced, with a consequent increase in productivity. The base for the algorithm is wavelet decomposition executed on two instances of the original image, to detect both hypotheses: dark and clear defects. The existence and/or size of these defects are the gauge used to classify the quality grade of the stone products. The tuning of parameters possible within the wavelet framework corresponds to different levels of accuracy in the drawing of the contours and in the selection of the defect size, which allows the map of defects to be used to cut a selected stone into tiles with minimum waste, according to the dimensions of the defects allowed.
Keywords: automatic detection, defects, fracture lines, wavelets
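The wavelet step can be sketched with a single level of the 2-D Haar transform; the detail-energy threshold is an assumed tuning parameter, and the paper's actual wavelet family and number of levels may differ.

```python
import numpy as np

def haar_level1(img):
    """One level of 2-D Haar decomposition: approximation + 3 detail bands."""
    a = img[0::2, 0::2].astype(float)
    b = img[0::2, 1::2].astype(float)
    c = img[1::2, 0::2].astype(float)
    d = img[1::2, 1::2].astype(float)
    LL = (a + b + c + d) / 4.0   # approximation
    LH = (a + b - c - d) / 4.0   # horizontal detail
    HL = (a - b + c - d) / 4.0   # vertical detail
    HH = (a - b - c + d) / 4.0   # diagonal detail
    return LL, LH, HL, HH

def defect_mask(img, thresh):
    """Mark coarse cells whose detail energy exceeds a tuning threshold.
    Run once on the image and once on its negative to cover both
    hypotheses (dark and clear defects), as the abstract describes."""
    _, LH, HL, HH = haar_level1(img)
    energy = LH**2 + HL**2 + HH**2
    return energy > thresh
```

A uniform plate yields an empty mask, while a sharp spot or crack produces high detail energy in the corresponding cell; the threshold controls contour accuracy and the minimum defect size retained.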
Procedia PDF Downloads 248
2927 The Use of Artificial Intelligence in Diagnosis of Mastitis in Cows
Authors: Djeddi Khaled, Houssou Hind, Miloudi Abdellatif, Rabah Siham
Abstract:
In the field of veterinary medicine, there is a growing application of artificial intelligence (AI) for diagnosing bovine mastitis, a prevalent inflammatory disease in dairy cattle. AI technologies, such as automated milking systems, have streamlined the assessment of key metrics crucial for managing cow health during milking and identifying prevalent diseases, including mastitis. These automated milking systems empower farmers to implement automatic mastitis detection by analyzing indicators like milk yield, electrical conductivity, fat, protein, lactose, blood content in the milk, and milk flow rate. Furthermore, reports highlight the integration of somatic cell count (SCC), thermal infrared thermography, and diverse systems utilizing statistical models and machine learning techniques, including artificial neural networks, to enhance the overall efficiency and accuracy of mastitis detection. According to a review of 15 publications, machine learning technology can predict the risk and detect mastitis in cattle with an accuracy ranging from 87.62% to 98.10% and sensitivity and specificity ranging from 84.62% to 99.4% and 81.25% to 98.8%, respectively. Additionally, machine learning algorithms and microarray meta-analysis are utilized to identify mastitis genes in dairy cattle, providing insights into the underlying functional modules of mastitis disease. Moreover, AI applications can assist in developing predictive models that anticipate the likelihood of mastitis outbreaks based on factors such as environmental conditions, herd management practices, and animal health history. This proactive approach supports farmers in implementing preventive measures and optimizing herd health. By harnessing the power of artificial intelligence, the diagnosis of bovine mastitis can be significantly improved, enabling more effective management strategies and ultimately enhancing the health and productivity of dairy cattle. 
The integration of artificial intelligence presents valuable opportunities for the precise and early detection of mastitis, providing substantial benefits to the dairy industry.
Keywords: artificial intelligence, automatic milking system, cattle, machine learning, mastitis
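A minimal rule-based sketch of automatic mastitis flagging from the milking-system indicators named above; the thresholds below are assumed rules of thumb for illustration only, not the machine-learning models reviewed in the abstract (which reached 87.62% to 98.10% accuracy).

```python
# Illustrative only: threshold values are assumptions, not study results.

def mastitis_risk(scc_cells_per_ml, conductivity_ms_cm, yield_drop_pct):
    """Score one milking session from three automated-milking-system
    indicators; 0 = low risk, 3 = flag the cow for inspection."""
    score = 0
    if scc_cells_per_ml > 200_000:   # elevated somatic cell count
        score += 1
    if conductivity_ms_cm > 6.5:     # raised milk electrical conductivity
        score += 1
    if yield_drop_pct > 15.0:        # sudden drop in milk yield
        score += 1
    return score
```

The reviewed systems replace such hand-set thresholds with learned models (e.g. artificial neural networks) over the same indicators, which is where the reported gains in sensitivity and specificity come from.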
Procedia PDF Downloads 65
2926 Effect of Different Knee-Joint Positions on Passive Stiffness of Medial Gastrocnemius Muscle and Aponeuroses during Passive Ankle Motion
Authors: Xiyao Shan, Pavlos Evangelidis, Adam Kositsky, Naoki Ikeda, Yasuo Kawakami
Abstract:
The human triceps surae (two bi-articular gastrocnemii and one mono-articular soleus) have aponeuroses in the posterior and anterior aspects of each muscle, where the anterior aponeuroses of the gastrocnemii adjoin the posterior aponeurosis of the soleus, possibly contributing to the intermuscular force transmission between gastrocnemii and soleus. Since the mechanical behavior of these aponeuroses at different knee- and ankle-joint positions remains unclear, the purpose of this study was to clarify this through observations of the localized changes in passive stiffness of the posterior aponeuroses, muscle belly and adjoining aponeuroses of the medial gastrocnemius (MG) induced by different knee and ankle angles. Eleven healthy young males (25 ± 2 yr, 176.7 ± 4.7 cm, 71.1 ± 11.1 kg) participated in this study. Each subject took either a prone position on an isokinetic dynamometer while the knee joint was fully extended (K180) or a kneeling position while the knee joint was 90° flexed (K90), in a randomized and counterbalanced order. The ankle joint was then passively moved through a 50° range of motion (ROM) by the dynamometer from 30° of plantar flexion (PF) to 20° of dorsiflexion (DF) at 2°/s and the ultrasound shear-wave velocity was measured to obtain shear moduli of the posterior aponeurosis, MG belly, and adjoining aponeuroses. The main findings were: 1) shear modulus in K180 was significantly higher (p < 0.05) than K90 for the posterior aponeurosis (across all ankle angles, 10.2 ± 5.7 kPa-59.4 ± 28.7 kPa vs. 5.4 ± 2.2 kPa-11.6 ± 4.1 kPa), MG belly (from PF10° to DF20°, 9.7 ± 2.2 kPa-53.6 ± 18.6 kPa vs. 8.0 ± 2.7 kPa-9.5 ± 3.7 kPa), and adjoining aponeuroses (across all ankle angles, 17.3 ± 7.8 kPa-80 ± 25.7 kPa vs. 
12.2 ± 4.5 kPa-52.4 ± 23.0 kPa); 2) shear modulus of the posterior aponeurosis significantly increased (p < 0.05) from PF10° to PF20° in K180, while shear modulus of the MG belly significantly increased (p < 0.05) from 0° to PF20° only in K180, and shear modulus of the adjoining aponeuroses significantly increased (p < 0.05) across the whole ankle ROM in both K180 and K90. These results suggest that different knee-joint positions affect not only the bi-articular gastrocnemius itself but also the mechanical behavior of the aponeuroses. In addition, compared to the gradual stiffening of the adjoining aponeuroses across the whole ankle ROM, the posterior aponeurosis became slack in the plantar flexed positions and then stiffened gradually when the knee was fully extended. This suggests distinct, joint-position-dependent stiffening of the posterior and adjoining aponeuroses.
Keywords: aponeurosis, plantar flexion and dorsiflexion, shear modulus, shear wave elastography
Procedia PDF Downloads 190
2925 Multi-Layer Multi-Feature Background Subtraction Using Codebook Model Framework
Authors: Yun-Tao Zhang, Jong-Yeop Bae, Whoi-Yul Kim
Abstract:
Background modeling and subtraction in video analysis has been widely shown to be an effective method for moving-object detection in many computer vision applications. Over the past years, a large number of approaches have been developed to tackle different types of challenges in this field. However, dynamic backgrounds and illumination variations are two of the most frequently occurring issues in practical situations. This paper presents a new two-layer model based on the codebook algorithm incorporating a local binary pattern (LBP) texture measure, targeted at handling dynamic background and illumination variation problems. More specifically, the first layer is a block-based codebook combining an LBP histogram with the mean values of the RGB color channels. Because LBP features are invariant with respect to monotonic gray-scale changes, this layer produces block-wise detection results with considerable tolerance to illumination variations. A pixel-based codebook is then employed to reinforce the precision of the first layer's outputs and to further eliminate false positives. As a result, the proposed approach greatly improves accuracy under dynamic background and illumination changes. Experimental results on several popular background subtraction datasets demonstrate very competitive performance compared to previous models.
Keywords: background subtraction, codebook model, local binary pattern, dynamic background, illumination change
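The illumination tolerance of the first layer rests on the gray-scale invariance of LBP codes, which a short sketch makes concrete (basic 8-neighbour LBP only; the paper's block histograms and codebook matching are omitted):

```python
import numpy as np

def lbp_image(gray):
    """Basic 8-neighbour local binary pattern for each interior pixel.
    Each neighbour contributes one bit: 1 if it is >= the center pixel.
    The codes are invariant to monotonic gray-scale changes, which is
    what gives the block layer its tolerance to illumination shifts."""
    g = gray.astype(int)
    c = g[1:-1, 1:-1]
    codes = np.zeros_like(c)
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(shifts):
        nb = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        codes |= ((nb >= c).astype(int) << bit)
    return codes
```

Adding a constant brightness offset to a frame (a global illumination change) leaves every LBP code unchanged, so block-level LBP histograms match the codebook even when raw intensities drift.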
Procedia PDF Downloads 217
2924 Towards Real-Time Classification of Finger Movement Direction Using Encephalography Independent Components
Authors: Mohamed Mounir Tellache, Hiroyuki Kambara, Yasuharu Koike, Makoto Miyakoshi, Natsue Yoshimura
Abstract:
This study explores the practicality of using electroencephalographic (EEG) independent components to predict eight-direction finger movements in pseudo-real-time. Six healthy participants with individual-head MRI images performed finger movements in eight directions with two different arm configurations. The analysis was performed in two stages. The first stage consisted of using independent component analysis (ICA) to separate the signals representing brain activity from non-brain activity signals and to obtain the unmixing matrix. The resulting independent components (ICs) were checked, and those reflecting brain-activity were selected. Finally, the time series of the selected ICs were used to predict eight finger-movement directions using Sparse Logistic Regression (SLR). The second stage consisted of using the previously obtained unmixing matrix, the selected ICs, and the model obtained by applying SLR to classify a different EEG dataset. This method was applied to two different settings, namely the single-participant level and the group-level. For the single-participant level, the EEG dataset used in the first stage and the EEG dataset used in the second stage originated from the same participant. For the group-level, the EEG datasets used in the first stage were constructed by temporally concatenating each combination without repetition of the EEG datasets of five participants out of six, whereas the EEG dataset used in the second stage originated from the remaining participants. The average test classification results across datasets (mean ± S.D.) were 38.62 ± 8.36% for the single-participant, which was significantly higher than the chance level (12.50 ± 0.01%), and 27.26 ± 4.39% for the group-level which was also significantly higher than the chance level (12.49% ± 0.01%). 
The classification accuracy within [–45°, 45°] of the true direction was 70.03 ± 8.14% at the single-participant level and 62.63 ± 6.07% at the group level, which may be promising for some real-life applications. Clustering and contribution analyses further revealed the brain regions involved in finger movement and the temporal aspect of their contribution to the classification. These results show the possibility of using the ICA-based method in combination with other methods to build a real-time system to control prostheses.
Keywords: brain-computer interface, electroencephalography, finger motion decoding, independent component analysis, pseudo real-time motion decoding
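The two-stage pipeline (unmix with ICA, then classify the IC time series with a sparse logistic model) can be sketched on synthetic data; scikit-learn's FastICA and an L1-penalized LogisticRegression stand in for the paper's ICA and Sparse Logistic Regression, and the two-class toy problem below stands in for the eight movement directions.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for multi-channel EEG: two latent sources mixed into
# 8 channels, where the first source's amplitude depends on the class.
n = 400
labels = rng.integers(0, 2, size=n)
sources = rng.normal(size=(n, 2))
sources[:, 0] += 2.0 * labels          # class-dependent "brain" component
mixing = rng.normal(size=(2, 8))
eeg = sources @ mixing                 # observed channel-space data

# Stage 1: estimate the unmixing matrix, recover ICs, and fit a sparse
# (L1-penalized) logistic model on the IC time series.
ica = FastICA(n_components=2, random_state=0)
ics = ica.fit_transform(eeg)
clf = LogisticRegression(penalty="l1", solver="liblinear").fit(ics, labels)

# Stage 2 reuses the same unmixing matrix and model on further data;
# here it is applied back to the training set for a quick sanity check.
acc = clf.score(ica.transform(eeg), labels)
```

The key point mirrored from the paper is that stage 2 applies the stage-1 unmixing matrix and classifier unchanged, which is what makes a pseudo-real-time setting feasible.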
Procedia PDF Downloads 138
2923 Comparing Different Frequency Ground Penetrating Radar Antennas for Tunnel Health Assessment
Authors: Can Mungan, Gokhan Kilic
Abstract:
Structural engineers and tunnel owners have good reason to attach importance to the assessment and inspection of tunnels. Regular inspection is necessary to maintain and monitor the health of the structure not only at the present time but throughout its life cycle. Detection of flaws within the structure, such as corrosion and the formation of cracks within the internal elements of the structure, can go a long way to ensuring that the structure maintains its integrity over the course of its life. Other issues that may be detected earlier through regular assessment include tunnel surface delamination and corrosion of the rebar. One advantage of new technology such as ground penetrating radar (GPR) is the early detection of imperfections. This study aims to discuss and present the effectiveness of GPR as a tool for assessing the structural integrity of a heavily used tunnel. GPR is used with antennae that differ in frequency and application method (2 GHz and 500 MHz). The paper attempts to produce a greater understanding of structural defects and to identify the correct tool for such purposes. Conquest View with 3D scanning capabilities was involved throughout the analysis, reporting, and interpretation of the results. This study illustrates GPR mapping and its effectiveness in providing valuable information on rebar position (lower and upper reinforcement). It also shows how such techniques can detect structural features that would otherwise remain unseen, as well as moisture ingress.
Keywords: tunnel, GPR, health monitoring, moisture ingress, rebar position
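Two textbook GPR relations underlie such surveys: reflector depth from two-way travel time, and vertical resolution from wavelength, which is why a 2 GHz antenna resolves finer detail than a 500 MHz one at the cost of penetration. A sketch (c approximated as 0.3 m/ns; the permittivity value is an assumed example for concrete, not a measurement from this study):

```python
# Textbook GPR relations: wave speed in the medium is v = c / sqrt(eps_r).

C_M_PER_NS = 0.3  # speed of light in m/ns (approximate)

def reflector_depth_m(two_way_travel_ns, eps_r):
    """Depth of a reflector (e.g. a rebar) from two-way travel time: d = v t / 2."""
    v = C_M_PER_NS / eps_r ** 0.5
    return v * two_way_travel_ns / 2.0

def vertical_resolution_m(freq_ghz, eps_r):
    """Quarter-wavelength vertical resolution: higher frequency resolves
    finer detail but attenuates faster, so it penetrates less."""
    wavelength_m = (C_M_PER_NS / eps_r ** 0.5) / freq_ghz  # f in GHz, v in m/ns
    return wavelength_m / 4.0
```

With eps_r = 6.25, a reflection arriving 5 ns after transmission sits about 0.3 m deep, and the 2 GHz antenna's nominal resolution is four times finer than the 500 MHz antenna's.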
Procedia PDF Downloads 119
2922 Guided Energy Theory of a Particle: Answered Questions Arise from Quantum Foundation
Authors: Desmond Agbolade Ademola
Abstract:
This work introduces a theory, called the Guided Energy Theory of a particle, that addresses questions arising from the foundations of quantum mechanics and its interpretations, such as: What is the nature of the wavefunction? Is the mathematical formalism of the wavefunction correct? Does the wavefunction collapse during measurement? Do quantum physical entanglement and the many-worlds interpretation really exist? In addition, is there uncertainty in the physical reality of our nature, as concluded in quantum theory? The fundamental analysis presented in this work argues that the way quantum mechanics and its interpretations describe nature is not correlated with physical reality, because, among other findings: (1) The guided energy theory of a particle provides a complete series of physically observable, quantized measurements of a particle's momentum, force, energy, etc., over a given distance and time. In contrast, the quantum mechanical wavefunction ascribes to nature inherently probabilistic and indeterministic physical quantities, resulting in unobservable physical quantities that lead to the many-worlds interpretation. (2) The guided energy theory predicts that it is mathematically possible to determine precise quantized measurements of the position and momentum of a particle simultaneously, because there is no uncertainty in nature; nature naturally guards itself against uncertainty. This is contrary to the conclusion of quantum mechanics that it is mathematically impossible to determine the position and momentum of a particle simultaneously. Furthermore, the theory shows that it is mathematically possible to determine quantized measurements of the force acting on a particle simultaneously, which is not possible on the premises of quantum mechanics.
(3) The theory shows that guided energy does not collapse; it only describes the lopsided behavior of a particle in motion. This offers insight into the gradual process of engagement (convergence) and disengagement (divergence) of guided energy holders, which further illustrates how wave-like behavior returns to particle-like behavior and how particle-like behavior returns to wave-like behavior. This further indicates that a particle's behavior in motion is oscillatory in nature. The mathematical formalism of the guided energy theory shows that nature is certain, whereas the mathematical formalism of quantum mechanics shows that nature is absolutely probabilistic. In addition, the nature of the wavefunction is the guided energy of the wave. In conclusion, the work argues that the fundamental mathematical formalism of quantum mechanics is wrong.
Keywords: momentum, physical entanglement, wavefunction, uncertainty
Procedia PDF Downloads 295
2921 Evaluation of Beam Structure Using Non-Destructive Vibration-Based Damage Detection Method
Authors: Bashir Ahmad Aasim, Abdul Khaliq Karimi, Jun Tomiyama
Abstract:
Material aging is a vital issue for the civil, mechanical, and aerospace engineering communities. The sustenance and reliability of concrete, the most widely used material in the world, is a focal point in civil engineering. For a few decades, researchers have presented algorithms that can evaluate a structure globally rather than locally, without harming its serviceability or interfering with traffic. Such algorithms offer methods for evaluating structures non-destructively. In this paper, a non-destructive vibration-based damage detection method is adopted to evaluate two concrete beams, one in a healthy state while the second contains a crack near its bottom. The study shows that damage in a structure affects the modal parameters (natural frequency, mode shape, and damping ratio), which are functions of its physical properties (mass, stiffness, and damping). The assessment is carried out by acquiring the natural frequency of the sound beam; next, the vibration response is recorded from the cracked beam; eventually, both results are compared to determine the variation in the natural frequencies of the two beams. The study concludes that damage can be detected from the vibration characteristics of a structural member, given the decline observed in the natural frequency of the cracked beam.
Keywords: concrete beam, natural frequency, non-destructive testing, vibration characteristics
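The comparison of natural frequencies can be sketched as an FFT peak search on each beam's free-vibration record (an illustration of the general idea, not the authors' processing chain; window lengths and sampling rates are assumed):

```python
import numpy as np

def natural_frequency_hz(signal, fs):
    """Dominant frequency of a free-vibration record via the FFT peak."""
    spectrum = np.abs(np.fft.rfft(signal))
    spectrum[0] = 0.0                       # ignore the DC component
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return float(freqs[int(np.argmax(spectrum))])

def frequency_drop_pct(f_healthy, f_cracked):
    """Relative drop in natural frequency, the damage indicator here."""
    return 100.0 * (f_healthy - f_cracked) / f_healthy
```

A crack reduces local stiffness, and since the natural frequency scales with sqrt(stiffness/mass), the cracked beam's peak shifts downward relative to the sound beam's.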
Procedia PDF Downloads 112
2920 Wireless Sensor Network for Forest Fire Detection and Localization
Authors: Tarek Dandashi
Abstract:
WSNs may provide a fast and reliable solution for the early detection of environmental events like forest fires, which is crucial for alerting and calling for fire brigade intervention. Sensor nodes communicate sensor data to a host station, which enables a global analysis and the generation of a reliable decision on a potential fire and its location. A WSN implemented with TinyOS and nesC is presented for capturing and transmitting a variety of sensor information with controlled source, data rates, and duration, and for recording and displaying activity traces. We propose a similarity distance (SD) between the distribution of currently sensed data and that of a reference. At any given time, a fire causes diverging opinions in the reported data, which alters the usual data distribution. Basically, SD consists of a metric on the Cumulative Distribution Function (CDF). SD is designed to be invariant to day-to-day changes of temperature, changes due to the surrounding environment, and normal changes in weather, which preserve data locality. Evaluation shows that SD sensitivity is quadratic with respect to an increase in sensor node temperature for groups of sensors of different sizes and neighborhoods. Simulation of fire spreading, with ignition placed at random locations and some wind speed, shows that SD takes a few minutes to reliably detect fires and locate them. We also discuss the cases of false negatives and false positives and their impact on decision reliability.
Keywords: forest fire, WSN, wireless sensor network, algorithm
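A Kolmogorov-Smirnov-style gap between empirical CDFs is one concrete way to realize a metric "on the Cumulative Distribution Function"; the paper's exact SD functional may differ, so treat this as an assumed illustration:

```python
import numpy as np

def similarity_distance(current, reference):
    """Largest vertical gap between the empirical CDFs of the currently
    sensed data and a reference distribution (KS-style statistic).
    0 = identical distributions, 1 = fully disjoint supports."""
    grid = np.sort(np.concatenate([current, reference]))
    cdf_cur = np.searchsorted(np.sort(current), grid, side="right") / len(current)
    cdf_ref = np.searchsorted(np.sort(reference), grid, side="right") / len(reference)
    return float(np.max(np.abs(cdf_cur - cdf_ref)))
```

Because the statistic compares distribution shapes rather than absolute values, a uniform day-to-day temperature shift applied to both samples leaves it near zero, while a fire's localized divergence in the reported readings drives it up.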
Procedia PDF Downloads 262
2919 Effect of Birks Constant and Defocusing Parameter on Triple-to-Double Coincidence Ratio Parameter in Monte Carlo Simulation-GEANT4
Authors: Farmesk Abubaker, Francesco Tortorici, Marco Capogni, Concetta Sutera, Vincenzo Bellini
Abstract:
This project concerns the detection efficiency of the portable triple-to-double coincidence ratio (TDCR) system at the National Institute of Metrology of Ionizing Radiation (INMRI-ENEA), which allows direct activity measurement and radionuclide standardization for pure beta-emitting or pure electron-capture radionuclides. The dependency of the simulated detection efficiency of the TDCR, computed with the Monte Carlo simulation code Geant4, on the Birks factor (kB) and the defocusing parameter has been examined, especially for low-energy beta-emitting radionuclides such as 3H and 14C, for which this dependency is relevant. The results of this analysis can be used to select the best kB factor and defocusing parameter for computing the theoretical TDCR parameter value. The theoretical results were compared with the available ones measured by the ENEA portable TDCR detector for some pure beta-emitting radionuclides. This analysis improved knowledge of the characteristics of the ENEA TDCR detector, which can be used as a traveling instrument for in-situ measurements, with particular benefits in many applications in the field of nuclear medicine and in the nuclear energy industry.
Keywords: Birks constant, defocusing parameter, GEANT4 code, TDCR parameter
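The role of kB can be illustrated with Birks' law for scintillation light, dL/dx = S (dE/dx) / (1 + kB dE/dx), integrated stepwise as a Monte Carlo code does when scoring light; a larger kB quenches more light and so lowers the simulated counting efficiency, most strongly for low-energy emitters such as 3H. The step values below are illustrative only.

```python
# Sketch of Birks' law: dL/dx = S * (dE/dx) / (1 + kB * dE/dx),
# summed over discrete track steps as in Monte Carlo light scoring.

def birks_light_yield(steps, kB, S=1.0):
    """steps: iterable of (step_length, stopping_power) pairs along a track.
    With kB = 0 the yield reduces to S times the deposited energy."""
    return sum(S * dedx * dx / (1.0 + kB * dedx) for dx, dedx in steps)
```

This is why the simulated TDCR efficiency is sensitive to the chosen kB: the quenching term reshapes the single-photoelectron statistics that the triple and double coincidence rates are built from.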
Procedia PDF Downloads 148
2918 Two Degree of Freedom Spherical Mechanism Design for Exact Sun Tracking
Authors: Osman Acar
Abstract:
Sun tracking systems follow the sun's rays at a right angle or at a predetermined angle. In this study, we used the theoretical trajectory of the sun for the latitude of central Anatolia in Turkey. A two-degree-of-freedom spherical mechanism was designed with a workspace large enough to follow the sun's theoretical motion at a right angle throughout the whole year. An inverse kinematic analysis was performed to find the positions of the mechanism links for the predicted trajectory. Force and torque analyses are shown for the first day of the year.
Keywords: sun tracking, theoretical sun trajectory, spherical mechanism, inverse kinematic analysis
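The theoretical trajectory rests on standard solar-position formulas; a sketch of the elevation angle is below. The declination expression is the common first-order approximation, and latitude 39 N is an assumed stand-in for central Anatolia, not a value from the paper.

```python
import math

def solar_declination_deg(day_of_year):
    """First-order approximation of the sun's declination angle."""
    return -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

def solar_elevation_deg(latitude_deg, day_of_year, hour_angle_deg):
    """sin(h) = sin(lat)sin(decl) + cos(lat)cos(decl)cos(H), H = 0 at solar noon."""
    lat = math.radians(latitude_deg)
    dec = math.radians(solar_declination_deg(day_of_year))
    ha = math.radians(hour_angle_deg)
    sin_h = (math.sin(lat) * math.sin(dec)
             + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(sin_h))
```

Sampling this trajectory over the year gives the target orientations that the inverse kinematic analysis converts into positions of the spherical mechanism's links.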
Procedia PDF Downloads 419
2917 Remote Assessment and Change Detection of GreenLAI of Cotton Crop Using Different Vegetation Indices
Authors: Ganesh B. Shinde, Vijaya B. Musande
Abstract:
Timely identification of cotton crops has significant advantages for food, economic, and environmental concerns. Despite these advantages, accurate detection of cotton crop regions using a supervised learning procedure remains a challenging problem in remote sensing. Classifiers applied directly to the image play a major role here, but their results are not very satisfactory. To further improve effectiveness, a variety of vegetation indices have been proposed in the literature, and the major challenge is to find the better vegetation indices for cotton crop identification through the proposed methodology. Accordingly, fuzzy c-means clustering is combined with a neural network trained by the Levenberg-Marquardt algorithm for cotton crop classification. To test the proposed method, five LISS-III satellite images were taken, and the experimentation was carried out with six vegetation indices: Simple Ratio, Normalized Difference Vegetation Index, Enhanced Vegetation Index, Green Atmospherically Resistant Vegetation Index, Wide-Dynamic Range Vegetation Index, and Green Chlorophyll Index. Along with these indices, the Green Leaf Area Index is also considered for investigation. From the research outcomes, the Green Atmospherically Resistant Vegetation Index outperformed all other indices, reaching an average accuracy of 95.21%.
Keywords: Fuzzy C-Means clustering (FCM), neural network, Levenberg-Marquardt (LM) algorithm, vegetation indices
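The band arithmetic behind three of the indices can be sketched directly (reflectance inputs in [0, 1]; the gamma weight in GARI is an assumed common default, and the study's exact formulations may differ):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def gari(nir, green, red, blue, gamma=1.7):
    """Green Atmospherically Resistant Vegetation Index, the best
    performer in the study. gamma weights the blue-red atmospheric
    correction term (value assumed here)."""
    corrected = green - gamma * (blue - red)
    return (nir - corrected) / (nir + corrected)

def green_chlorophyll_index(nir, green):
    """Green Chlorophyll Index: NIR / green - 1."""
    return nir / green - 1.0
```

Healthy vegetation reflects strongly in the NIR band and weakly in red, so vegetated pixels score high on these indices; such per-pixel index maps are the features fed to the FCM-plus-neural-network classifier.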
Procedia PDF Downloads 318
2916 Automatic Detection of Sugarcane Diseases: A Computer Vision-Based Approach
Authors: Himanshu Sharma, Karthik Kumar, Harish Kumar
Abstract:
The major problem in crop cultivation is the occurrence of multiple crop diseases. During the growth stage, timely identification of crop diseases is paramount to ensure high crop yields, lower production costs, and minimal pesticide usage. In most cases, crop diseases produce observable characteristics and symptoms. Surveyors usually diagnose crop diseases as they walk through the fields. However, surveyor inspections tend to be biased and error-prone due to the monotonous nature of the task and the subjectivity of individuals. In addition, visual inspection of each leaf or plant is costly, time-consuming, and labour-intensive. Furthermore, the plant pathologists and experts who can often identify a disease in its early stages from its symptoms are not readily available in remote regions. Therefore, this study specifically addresses early detection of the leaf scald, red rot, and eyespot diseases in sugarcane plants. The study proposes a computer vision-based approach using a convolutional neural network (CNN) for automatic identification of crop diseases. To facilitate this, images of sugarcane diseases were first collected from Google without modifying the scene, background, or illumination to build the training dataset. The testing dataset was then developed from images collected in real time from a sugarcane field in India. The image dataset was pre-processed for feature extraction and selection. Finally, a CNN-based Visual Geometry Group (VGG) model was deployed on the training and testing datasets to classify the images into diseased and healthy sugarcane plants, and the model's performance was measured using various parameters, i.e., accuracy, sensitivity, specificity, and F1-score. The promising results of the proposed model lay the groundwork for the automatic early detection of sugarcane disease.
The proposed research directly supports an increase in crop yield.
Keywords: automatic classification, computer vision, convolutional neural network, image processing, sugarcane disease, visual geometry group
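The four reported performance parameters all follow from a binary confusion matrix (taking diseased as the positive class); a sketch with illustrative counts:

```python
def classification_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity, specificity, and F1 from a binary confusion
    matrix, the four parameters the study reports for its VGG classifier.
    tp/fn count diseased plants, tn/fp count healthy plants."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)       # recall on diseased plants
    specificity = tn / (tn + fp)       # recall on healthy plants
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return accuracy, sensitivity, specificity, f1
```

Reporting sensitivity and specificity alongside accuracy matters here because the two error types are not symmetric: a missed diseased plant (low sensitivity) spreads infection, while a false alarm (low specificity) only costs an inspection.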
Procedia PDF Downloads 116