Search results for: weather detection
1620 Assessment of Cellular Metabolites and Impedance for Early Diagnosis of Oral Cancer among Habitual Smokers
Authors: Ripon Sarkar, Kabita Chaterjee, Ananya Barui
Abstract:
Smoking is one of the leading causes of oral cancer. Cigarette smoke affects various cellular parameters and alters the molecular metabolism of cells. With long exposure to cigarette smoke, epithelial cells lose their cytoskeletal structure, membrane integrity, and cellular polarity, which subsequently initiates the epithelial-to-mesenchymal transition. Smoke also changes normal cellular metabolic activity, which induces oxidative stress and enhances the formation of reactive oxygen species (ROS). Excessive ROS and the associated oxidative stress are considered a driving force in the alteration of cellular phenotypes, polarity distribution, and mitochondrial metabolism. Noninvasive assessment of such parameters plays an essential role in the development of a routine screening system for early diagnosis of oral cancer. Electrical cell-substrate impedance sensing (ECIS) is one such method, applied for the detection of cellular membrane impedance, which can be correlated to cell membrane integrity. The present study intends to explore the alteration in cellular impedance, along with the expression of cellular polarity molecules and cytoskeleton distribution, in oral epithelial cells of habitual smokers, and to correlate the outcome with that of clinically diagnosed oral leukoplakia and oral squamous cell carcinoma patients. A total of 80 subjects were categorized into four study groups: nonsmoker (NS), cigarette smoker (CS), oral leukoplakia (OLPK) and oral squamous cell carcinoma (OSCC). Cytoskeleton distribution was analyzed by staining of actin filaments, and generation of ROS was measured with an assay kit using a standard protocol. Cell impedance was measured through the ECIS method at different frequencies. Expression of E-cadherin and protease-activated receptor (PAR) proteins was observed through the immunofluorescence method. The distribution of actin filaments is well organized in the NS group; however, the distribution pattern varied grossly in CS, OLPK and OSCC.
Generation of ROS was low in NS and subsequently increased towards OSCC. Expression of E-cadherin and changes in cellular electrical impedance in the different study groups indicated the hallmarks of cancer progression from NS to OSCC. Expression of E-cadherin and PAR protein, as well as cell impedance, decreased from NS to CS and further to OSCC. Generally, oral epithelial cells exhibit apico-basal polarity; however, with cancer progression these cells lose their characteristic polarity distribution. In this study, the expression of polarity molecules and the ECIS observations indicate such an altered pattern of polarity among the smoker group. Overall, the present study monitored the alterations in intracellular ROS generation, cell metabolic function, and membrane integrity in oral epithelial cells of cigarette smokers. The present study thus has clinical significance, and it may help in developing a noninvasive technique for early diagnosis of oral cancer amongst susceptible individuals.
Keywords: cigarette smoking, early oral cancer detection, electric cell-substrate impedance sensing, noninvasive screening
Procedia PDF Downloads 176
1619 A Comparative Analysis of Hyper-Parameters Using Neural Networks for E-Mail Spam Detection
Authors: Syed Mahbubuz Zaman, A. B. M. Abrar Haque, Mehedi Hassan Nayeem, Misbah Uddin Sagor
Abstract:
Every day, e-mails are used by millions of people as an effective form of communication over the Internet. Although e-mails allow high-speed communication, there is a constant threat known as spam. Spam e-mails, often called junk e-mails, are unsolicited and sent in bulk. These unsolicited e-mails cause security concerns among internet users because they expose users to inappropriate content. There is no guaranteed way to stop spammers with static filters, as these are bypassed very easily. In this paper, a smart system is proposed that uses neural networks to approach spam in a different way, while also detecting the most relevant features that help to design the spam filter. In addition, a comparison of different parameters for different neural network models is shown, to determine which model works best with suitable parameters.
Keywords: long short-term memory, bidirectional long short-term memory, gated recurrent unit, natural language processing
Procedia PDF Downloads 205
1618 A Radiofrequency Spectrophotometer Device to Detect Liquids in Gastroesophageal Ways
Authors: R. Gadea, J. M. Monzó, F. J. Puertas, M. Castro, A. Tebar, P. J. Fito, R. J. Colom
Abstract:
A wide array of ailments impacts the structural soundness of the esophageal walls, predominantly linked to digestive issues. At present, the techniques employed for identifying esophageal tract complications are excessively invasive and uncomfortable, subjecting patients to prolonged discomfort in order to achieve an accurate diagnosis. This study proposes the creation of a sensor with deep measuring capabilities designed to detect fluids coursing through the esophageal tract. The multi-sensor detection system relies on radiofrequency photospectrometry. During experimentation, participants of diverse genders and ages were measured, with the sensors positioned between the trachea and diaphragm, and measurements were assessed under vacuum conditions and with water, orange juice, and saline solutions. The findings enabled the identification of various liquid media within the esophagus, distinguishing them based on their ionic composition.
Keywords: radiofrequency spectrophotometry, medical device, gastroesophageal disease, photonics
Procedia PDF Downloads 81
1617 Incident Management System: An Essential Tool for Oil Spill Response
Authors: Ali Heyder Alatas, D. Xin, L. Nai Ming
Abstract:
An oil spill emergency can vary in size and complexity, subject to factors such as the volume and characteristics of the spilled oil, incident location, impacted sensitivities and resources required. A major incident typically involves numerous stakeholders; these include the responsible party, response organisations, government authorities across multiple jurisdictions, local communities, and a spectrum of technical experts. An incident management team will encounter numerous challenges. Factors such as limited access to the location, adverse weather, poor communication, and lack of pre-identified resources can impede a response; delays caused by an inefficient response can exacerbate impacts caused to the wider environment and to socio-economic and cultural resources. It is essential that all parties work based on defined roles, responsibilities and authority, and ensure the availability of sufficient resources. To promote steadfast coordination and overcome the challenges highlighted, an Incident Management System (IMS) offers an essential tool for oil spill response. It provides clarity in command and control, improves communication and coordination, facilitates cooperation between stakeholders, and integrates the resources committed. Following this discussion, a comprehensive review of existing literature serves to illustrate the application of IMS in oil spill response to overcome common challenges faced in a major incident. With a primary audience of practitioners in mind, this study will discuss key principles of incident management which enable an effective response, along with pitfalls and challenges, particularly the tension between government and industry; case studies will be used to frame learning and issues consolidated from previous research, and provide the context to link practice with theory.
It will also feature the industry approach to incident management, which was further crystallized as part of a review by the Joint Industry Project (JIP) established in the wake of the Macondo well control incident. The authors posit that a common IMS which can be adopted across the industry not only enhances response capacity towards a major oil spill incident but is essential to the global preparedness effort.
Keywords: command and control, incident management system, oil spill response, response organisation
Procedia PDF Downloads 156
1616 Improved Pitch Detection Using Fourier Approximation Method
Authors: Balachandra Kumaraswamy, P. G. Poonacha
Abstract:
Automatic music information retrieval has been one of the challenging topics of research for a few decades now, with several interesting approaches reported in the literature. In this paper we have developed a pitch extraction method based on a finite Fourier series approximation to the given window of samples. We then estimate pitch as the fundamental period of this finite Fourier series approximation. The method uses analysis of the strength of the harmonics present in the signal to reduce octave as well as harmonic errors. The performance of our method is compared with three of the best-known methods for pitch extraction, namely, the Yin, Windowed Special Normalization of the Auto-Correlation Function, and Harmonic Product Spectrum methods. Our study with artificially created signals as well as music files shows that the Fourier approximation method gives a much better estimate of pitch, with fewer octave and harmonic errors.
Keywords: pitch, Fourier series, Yin, normalization of the auto-correlation function, harmonic product, mean square error
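The idea of estimating pitch as a fundamental period can be sketched with a plain autocorrelation search. This is a simplification standing in for the paper's Fourier-series approach; the sampling rate, search range, and test tone below are arbitrary illustrative choices.

```python
import numpy as np

def estimate_pitch(signal, sample_rate, fmin=50.0, fmax=1000.0):
    """Estimate the fundamental frequency as the lag that maximizes
    the autocorrelation of the windowed signal."""
    n = len(signal)
    # Autocorrelation via FFT, zero-padded to avoid circular wrap-around
    spectrum = np.fft.rfft(signal, 2 * n)
    acf = np.fft.irfft(spectrum * np.conj(spectrum))[:n]
    # Search only lags corresponding to the plausible pitch range
    lo = int(sample_rate / fmax)
    hi = int(sample_rate / fmin)
    lag = lo + int(np.argmax(acf[lo:hi]))
    return sample_rate / lag

# Synthetic 220 Hz tone with a strong second harmonic, sampled at 8 kHz
sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)
pitch = estimate_pitch(tone, sr)
```

The added harmonic does not shift the autocorrelation peak away from the fundamental period, which is the same property a harmonic-strength analysis exploits to suppress octave errors.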
Procedia PDF Downloads 412
1615 Multi-Sensor Target Tracking Using Ensemble Learning
Authors: Bhekisipho Twala, Mantepu Masetshaba, Ramapulana Nkoana
Abstract:
Multiple classifier systems combine several individual classifiers to deliver a final classification decision. An increasingly controversial question, however, is whether such systems can outperform the single best classifier and, if so, what form of multiple classifier system yields the greatest benefit. Multi-target tracking using multiple sensors is also an important research field in mobile techniques and military applications. In this paper, several multiple classifier systems are evaluated in terms of their ability to predict a system's failure or success on multi-sensor target tracking tasks. The Bristol Eden project dataset is utilised for this task. Experimental and simulation results show that the human activity identification system can fulfil the requirements of target tracking, owing to the improved sensor classification performance of multiple classifier systems constructed using boosting, which achieve higher accuracy rates.
Keywords: single classifier, ensemble learning, multi-target tracking, multiple classifiers
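A minimal sketch of the multiple-classifier idea is a plain majority vote over per-classifier predictions. The classifier outputs below are hypothetical, and boosting (as used in the paper) would weight the voters rather than treat them equally.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier predictions into one decision per sample."""
    combined = []
    for sample_votes in zip(*predictions):  # one tuple of votes per sample
        combined.append(Counter(sample_votes).most_common(1)[0][0])
    return combined

# Three hypothetical classifiers voting on five tracking outcomes
clf_a = ["hit", "hit", "miss", "hit", "miss"]
clf_b = ["hit", "miss", "miss", "hit", "hit"]
clf_c = ["miss", "hit", "miss", "hit", "miss"]
ensemble = majority_vote([clf_a, clf_b, clf_c])
```

Even when every individual classifier makes a mistake somewhere, the combined decision can be correct on every sample, which is the motivation for ensembles.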
Procedia PDF Downloads 268
1614 Botnet Detection with ML Techniques by Using the BoT-IoT Dataset
Authors: Adnan Baig, Ishteeaq Naeem, Saad Mansoor
Abstract:
Internet of Things (IoT) gadgets have advanced quickly in recent years, and their use is rising daily. However, cyber-attackers can target these gadgets due to their distributed nature. Additionally, many IoT devices have significant security flaws in their implementation and design, making them vulnerable to security threats. These threats can cause substantial data security and privacy loss from a single attack on network devices or systems. Botnets are a significant security risk that can harm the IoT network; hence, sophisticated techniques are required to mitigate the risk. This work uses a machine learning-based method to identify IoT attacks orchestrated by botnets. The proposed technique identifies the network attack by distinguishing between legitimate and malicious traffic. This article proposes a hyperparameter tuning model to improve the accuracy of existing processes. The results demonstrated an improved and more accurate indication of botnet-based cyber-attacks.
Keywords: Internet of Things, botnet, BoT-IoT dataset, ML techniques
Procedia PDF Downloads 11
1613 A Statistical-Algorithmic Approach for the Design and Evaluation of a Fresnel Solar Concentrator-Receiver System
Authors: Hassan Qandil
Abstract:
Using a statistical algorithm incorporated in MATLAB, four types of non-imaging Fresnel lenses are designed: spot-flat, linear-flat, dome-shaped and semi-cylindrical-shaped. The optimization employs a statistical ray-tracing methodology for the incident light, mainly considering the effects of chromatic aberration, varying focal lengths, solar inclination and azimuth angles, lens and receiver apertures, and the optimum number of prism grooves. While adopting an equal-groove-width assumption for the poly-methyl-methacrylate (PMMA) prisms, the main target is to maximize the ray intensity on the receiver's aperture and therefore achieve higher values of heat flux. The algorithm outputs prism angles and 2D sketches. 3D drawings are then generated via AutoCAD and linked to the COMSOL Multiphysics software to simulate the lenses under solar ray conditions, which provides optical and thermal analysis at both the lens' and the receiver's apertures, with conditions set as per the Dallas, TX weather data. Once the lenses' characterization is finalized, receivers are designed based on the optimized aperture size. Several cavity shapes, including triangular, arc-shaped and trapezoidal, are tested while coupled with a variety of receiver materials, working fluids, heat transfer mechanisms, and enclosure designs. A vacuum-reflective enclosure is also simulated for enhanced thermal absorption efficiency. Each receiver type is simulated via COMSOL while coupled with the optimized lens. A lab-scale prototype for the optimum lens-receiver configuration is then fabricated for experimental evaluation. Application-based testing is also performed for the selected configuration, including that of a photovoltaic-thermal cogeneration system and a solar furnace system.
Finally, some future research work is pointed out, including the coupling of the collector-receiver system with an end-user power generator, and the use of a multi-layered genetic algorithm for comparative studies.
Keywords: COMSOL, concentrator, energy, Fresnel, optics, renewable, solar
Procedia PDF Downloads 155
1612 Automatic Diagnosis of Electrical Equipment Using Infrared Thermography
Authors: Y. Laib Dit Leksir, S. Bouhouche
Abstract:
Analysis and processing of databases resulting from infrared thermal measurements made on electrical installations require the development of new tools in order to obtain correct information additional to that of visual inspections. Consequently, methods based on the capture of infrared digital images show great potential and are employed increasingly in various fields. However, there is an enormous need for the development of effective techniques to analyse these databases in order to extract relevant information relating to the state of the equipment. Our goal consists in introducing recent modeling techniques based on new methods of image and signal processing to develop mathematical models in this field. The aim of this work is to capture the anomalies existing in electrical equipment during an inspection of some machines using an A40 Flir camera. Afterwards, we use binarisation techniques in order to select the region of interest, and we compare these methods on the thermal images obtained in order to choose the best one.
Keywords: infrared thermography, defect detection, troubleshooting, electrical equipment
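The binarisation step for selecting a region of interest can be sketched as follows, with a synthetic array standing in for a Flir thermogram. The mid-range threshold used here is one illustrative choice, not necessarily among the methods the paper compares.

```python
import numpy as np

def binarize(image):
    """Two-level binarisation: pixels above the mid-range value become 1,
    the rest 0, isolating the hottest region of interest."""
    threshold = (image.min() + image.max()) / 2.0
    return (image >= threshold).astype(np.uint8)

# Synthetic 'thermogram': a cool 20 degC background with a hot 4x4 anomaly
thermogram = np.full((32, 32), 20.0)
thermogram[10:14, 10:14] = 75.0  # overheating component

mask = binarize(thermogram)
anomaly_pixels = int(mask.sum())  # size of the detected hot region
```

The resulting mask keeps only the overheating component, which can then be passed to downstream defect analysis.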
Procedia PDF Downloads 476
1611 Estimating Algae Concentration Based on Deep Learning from Satellite Observation in Korea
Authors: Heewon Jeong, Seongpyo Kim, Joon Ha Kim
Abstract:
Over the last few decades, the coastal regions of Korea have experienced red tide algal blooms, which are harmful and toxic to both humans and marine organisms. These blooms have been accelerated by eutrophication from human activities, certain oceanic processes, and climate change. Previous studies have tried to monitor and predict the algae concentration of the ocean with bio-optical algorithms applied to satellite color images. However, accurate estimation of algal blooms remains a challenge because of the complexity of coastal waters. Therefore, this study suggests a new method to identify the concentration of red tide algal blooms from images of the geostationary ocean color imager (GOCI), which represent the water environment of the sea around Korea. The method employed GOCI images of the water-leaving radiances centered at 443 nm, 490 nm and 660 nm, respectively, as well as observed weather data (i.e., humidity, temperature and atmospheric pressure), as the database with which to apply the optical characteristics of algae and train the deep learning algorithm. A convolutional neural network (CNN) was used to extract the significant features from the images, and an artificial neural network (ANN) was then used to estimate the concentration of algae from the extracted features. For training of the deep learning model, a backpropagation learning strategy was developed. The established methods were tested and compared with the performance of the GOCI data processing system (GDPS), which is based on standard image processing algorithms and optical algorithms. The model performed better at estimating algae concentration than GDPS, which cannot estimate concentrations greater than 5 mg/m³. Thus, the deep learning model was trained successfully to assess algae concentration in spite of the complexity of the water environment. Furthermore, the results of this system and methodology can be used to improve the performance of remote sensing.
Acknowledgement: This work was supported by the 'Climate Technology Development and Application' research project (#K07731) through a grant provided by GIST in 2017.
Keywords: deep learning, algae concentration, remote sensing, satellite
Procedia PDF Downloads 183
1610 Biosensors for Parathion Based on Au-Pd Nanoparticles Modified Electrodes
Authors: Tian-Fang Kang, Chao-Nan Ge, Rui Li
Abstract:
An electrochemical biosensor for the determination of organophosphorus pesticides was developed based on electrochemical co-deposition of Au and Pd nanoparticles on a glassy carbon electrode (GCE). Energy-dispersive spectroscopy (EDS) analysis was used for characterization of the surface structure. Scanning electron microscopy (SEM) demonstrates that the films are uniform and the nanoclusters are homogeneously distributed on the GCE surface. Acetylcholinesterase (AChE) was immobilized on the Au-Pd nanoparticle modified electrode (Au-Pd/GCE) by cross-linking with glutaraldehyde. The electrochemical behavior of thiocholine at the biosensor (AChE/Au-Pd/GCE) was studied. The biosensor exhibited a substantial electrocatalytic effect on the oxidation of thiocholine. The peak current of linear scan voltammetry (LSV) of thiocholine at the biosensor is proportional to the concentration of acetylthiocholine chloride (ATCl) over the range of 2.5 × 10⁻⁶ to 2.5 × 10⁻⁴ M in 0.1 M phosphate buffer solution (pH 7.0). The percent inhibition of acetylcholinesterase was proportional to the logarithm of parathion concentration in the range of 4.0 × 10⁻⁹ to 1.0 × 10⁻⁶ M. The detection limit of parathion was 2.6 × 10⁻⁹ M. The proposed method exhibited high sensitivity and good reproducibility.
Keywords: acetylcholinesterase, Au-Pd nanoparticles, electrochemical biosensors, parathion
Procedia PDF Downloads 407
1609 Use of Predictive Food Microbiology to Determine the Shelf-Life of Foods
Authors: Fatih Tarlak
Abstract:
Predictive microbiology can be considered an important field in food microbiology, as it uses predictive models to describe microbial growth in different food products. Predictive models estimate the growth of microorganisms quickly, efficiently, and in a cost-effective way compared to traditional methods of enumeration, which are long-lasting, expensive, and time-consuming. The mathematical models used in predictive microbiology are mainly categorised as primary and secondary models. Primary models are mathematical equations that define the growth data as a function of time under a constant environmental condition. Secondary models describe the effects of environmental factors, such as temperature, pH, and water activity (aw), on the parameters of the primary models, including the maximum specific growth rate and the lag phase duration, which are the most critical growth kinetic parameters. The combination of primary and secondary models provides valuable information with which to set limits for the quantitative detection of microbial spoilage and to assess product shelf-life.
Keywords: shelf-life, growth model, predictive microbiology, simulation
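A primary/secondary model combination of the kind described can be sketched as below, using the modified Gompertz equation as a common primary model and the Ratkowsky square-root equation as a common secondary model. All parameter values here are hypothetical, not fitted to any real product.

```python
import math

def gompertz(t, a, mu_max, lag):
    """Modified Gompertz primary model (Zwietering form): increase in
    log cell count y(t) over time t under constant conditions."""
    return a * math.exp(-math.exp(mu_max * math.e / a * (lag - t) + 1.0))

def ratkowsky(temp, b, t_min):
    """Square-root (Ratkowsky) secondary model: maximum specific growth
    rate as a function of storage temperature."""
    return (b * (temp - t_min)) ** 2

# Hypothetical spoilage organism stored at 10 degC: asymptote 6 log CFU/g,
# 5 h lag phase, growth rate supplied by the secondary model
mu = ratkowsky(10.0, b=0.03, t_min=-5.0)  # 1/h
early = gompertz(1.0, 6.0, mu, 5.0)       # still within the lag phase
late = gompertz(200.0, 6.0, mu, 5.0)      # approaching the asymptote
```

Shelf-life then follows by solving for the time at which the predicted count crosses a spoilage limit.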
Procedia PDF Downloads 211
1608 Face Tracking and Recognition Using Deep Learning Approach
Authors: Degale Desta, Cheng Jian
Abstract:
The most important factor in identifying a person is their face. Even identical twins have their own distinct faces. As a result, identification and face recognition are needed to tell one person from another. A face recognition system is a verification tool used to establish a person's identity using biometrics. Nowadays, face recognition is a common technique used in a variety of applications, including home security systems, criminal identification, and phone unlock systems. Such a system is more secure because it only requires a facial image instead of other dependencies like a key or card. Face detection and face identification are the two phases that typically make up a human recognition system. This paper explains the idea behind designing and creating a face recognition system using deep learning with Azure ML and Python's OpenCV. Face recognition is a task that can be accomplished using deep learning, and given the accuracy of this method, it appears to be a suitable approach. To show how accurate the suggested face recognition system is, experimental results are given: the system achieves 98.46% accuracy using Fast R-CNN, with algorithm performance evaluated under different training conditions.
Keywords: deep learning, face recognition, identification, Fast R-CNN
Procedia PDF Downloads 140
1607 Hardware Error Analysis and Severity Characterization in Linux-Based Server Systems
Authors: Nikolaos Georgoulopoulos, Alkis Hatzopoulos, Konstantinos Karamitsios, Konstantinos Kotrotsios, Alexandros I. Metsai
Abstract:
In modern server systems, business-critical applications run on different types of infrastructure, such as cloud systems, physical machines, and virtualization. Often, due to high load and over time, various hardware faults occur in servers that translate into errors, resulting in malfunction or even server breakdown. CPU, RAM, and hard drive (HDD) are the hardware parts that concern server administrators the most regarding errors. In this work, selected RAM, HDD, and CPU errors, which have been observed or can be simulated in kernel ring buffer log files from two groups of Linux servers, are investigated. Moreover, a severity characterization is given for each error type. A better understanding of such errors can lead to more efficient analysis of the kernel logs that are usually exploited for fault diagnosis and prediction. In addition, this work summarizes ways of simulating hardware errors in RAM and HDD, in order to test the error detection and correction mechanisms of a Linux server.
Keywords: hardware errors, kernel logs, Linux servers, RAM, hard disk, CPU
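The kind of kernel-ring-buffer analysis described can be sketched with a simple rule table mapping log patterns to a component and severity. The patterns and labels below are illustrative stand-ins, not the paper's actual taxonomy.

```python
import re

# Illustrative rules: regex over a kernel log line -> (component, severity)
SEVERITY_RULES = [
    (re.compile(r"EDAC .*(UE|Uncorrected)", re.I), ("RAM", "critical")),
    (re.compile(r"EDAC .*(CE|Corrected)", re.I), ("RAM", "warning")),
    (re.compile(r"I/O error, dev sd[a-z]", re.I), ("HDD", "critical")),
    (re.compile(r"Machine Check Event", re.I), ("CPU", "critical")),
]

def classify_line(line):
    """Return (component, severity) for a log line, or None if no rule matches."""
    for pattern, label in SEVERITY_RULES:
        if pattern.search(line):
            return label
    return None

log = [
    "[12.3] EDAC MC0: 1 CE memory read error",
    "[99.1] blk_update_request: I/O error, dev sda, sector 120",
    "[101.4] systemd[1]: Started Daily apt activities.",
]
results = [classify_line(entry) for entry in log]
```

Lines that match no rule are left unlabeled, so benign log traffic does not inflate the error counts.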
Procedia PDF Downloads 154
1606 Data Quality Enhancement with String Length Distribution
Authors: Qi Xiu, Hiromu Hota, Yohsuke Ishii, Takuya Oda
Abstract:
Recently, the amount of collectable manufacturing data has been increasing rapidly. At the same time, large-scale recalls are becoming a serious social problem. Under such circumstances, there is an increasing need to prevent such recalls through defect analysis, such as root cause analysis and anomaly detection, utilizing manufacturing data. However, the time needed to classify the strings in manufacturing data with traditional methods is too long to meet the requirements of quick defect analysis. Therefore, we present the String Length Distribution Classification method (SLDC) to correctly classify strings in a short time. This method learns character features, especially the string length distribution, from Product IDs and Machine IDs in the BOM and the asset list. By applying the proposal to strings in actual manufacturing data, we verified that the classification time for strings can be reduced by 80%. As a result, it can be estimated that the requirement of quick defect analysis can be fulfilled.
Keywords: string classification, data quality, feature selection, probability distribution, string length
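The core of the SLDC idea, classifying a column of strings by how close its length distribution is to learned references, can be sketched in a few lines. The reference columns and ID formats below are hypothetical.

```python
from collections import Counter

def length_distribution(strings):
    """Empirical string-length distribution of a column of values."""
    counts = Counter(len(s) for s in strings)
    total = sum(counts.values())
    return {length: n / total for length, n in counts.items()}

def distribution_distance(p, q):
    """Total-variation distance between two length distributions."""
    lengths = set(p) | set(q)
    return 0.5 * sum(abs(p.get(l, 0.0) - q.get(l, 0.0)) for l in lengths)

# Hypothetical reference columns learned from a BOM and an asset list
product_ids = ["PRD-00142", "PRD-09811", "PRD-00007"]
machine_ids = ["M01", "M17", "M99"]

# Classify an unknown column by the nearest length distribution
unknown = ["PRD-12345", "PRD-00910"]
d_product = distribution_distance(length_distribution(unknown),
                                  length_distribution(product_ids))
d_machine = distribution_distance(length_distribution(unknown),
                                  length_distribution(machine_ids))
label = "Product ID" if d_product < d_machine else "Machine ID"
```

Because only lengths are counted, a whole column can be classified in a single pass over its values, which is what makes the approach fast.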
Procedia PDF Downloads 318
1605 Rapid Method for the Determination of Acid Dyes by Capillary Electrophoresis
Authors: Can Hu, Huixia Shi, Hongcheng Mei, Jun Zhu, Hongling Guo
Abstract:
Textile fibers are important trace evidence and are frequently encountered in criminal investigations. A significant aspect of fiber evidence examination is the determination of fiber dyes. Although several instrumental methods have been developed for dye detection, the analysis speed is not yet fast enough. A rapid dye analysis method is still needed to further improve the efficiency of case handling. Capillary electrophoresis has the advantages of high separation speed and high separation efficiency and is an ideal method for the rapid analysis of fiber dyes. In this paper, acid dyes used for protein fiber dyeing were determined by a newly developed short-end injection capillary electrophoresis technique. Five acid red dyes with similar structures were successfully baseline-separated within 5 min. The separation reproducibility is fairly good, as the relative standard deviation of the retention time is 0.51%. The established method is rapid and accurate and has great potential to be applied in forensic settings.
Keywords: acid dyes, capillary electrophoresis, fiber evidence, rapid determination
Procedia PDF Downloads 144
1604 A Semi-Markov Chain-Based Model for the Prediction of Deterioration of Concrete Bridges in Quebec
Authors: Eslam Mohammed Abdelkader, Mohamed Marzouk, Tarek Zayed
Abstract:
Infrastructure systems are crucial to every aspect of life on Earth. Existing infrastructure is subjected to degradation, while the demands are growing for a better infrastructure system in response to high standards of safety, health, population growth, and environmental protection. Bridges play a crucial role in urban transportation networks. Moreover, they are subjected to a high level of deterioration because of variable traffic loading, extreme weather conditions, cycles of freeze and thaw, etc. The development of Bridge Management Systems (BMSs) has become a fundamental imperative nowadays, especially in large transportation networks, due to the huge variance between the need for maintenance actions and the available funds to perform such actions. Deterioration models represent a very important aspect of the effective use of BMSs. This paper presents a probabilistic time-based model that is capable of predicting the condition ratings of concrete bridge decks along their service life. The deterioration process of the concrete bridge decks is modeled as a semi-Markov process. One of the main challenges of the Markov chain decision process is the construction of the transition probability matrix; the proposed model overcomes this issue by modeling the sojourn times with probability density functions. The sojourn times of each condition state are fitted to probability density functions based on goodness-of-fit tests such as the Kolmogorov-Smirnov test, Anderson-Darling test, and chi-squared test. The parameters of the probability density functions are obtained using maximum likelihood estimation (MLE). The condition ratings obtained from the Ministry of Transportation in Quebec (MTQ) are utilized as a database to construct the deterioration model.
Finally, a comparison is conducted between the Markov chain and the semi-Markov chain to select the most feasible prediction model.
Keywords: bridge management system, bridge decks, deterioration model, semi-Markov chain, sojourn times, maximum likelihood estimation
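The semi-Markov idea, state transitions whose timing is drawn from fitted sojourn-time distributions rather than from a fixed-step transition matrix, can be sketched as below. The Weibull parameters and the five-state rating scale are illustrative, not the MTQ-fitted values.

```python
import random

random.seed(42)

# Hypothetical Weibull sojourn-time parameters (scale in years, shape) for
# each condition state of a bridge deck; state 5 = new, state 1 = worst
SOJOURN = {5: (12.0, 2.0), 4: (10.0, 2.0), 3: (8.0, 1.5), 2: (6.0, 1.5)}

def simulate_deck(horizon=60.0):
    """Simulate one deck's condition-rating trajectory over a service life."""
    state, clock, history = 5, 0.0, [(0.0, 5)]
    while state > 1 and clock < horizon:
        scale, shape = SOJOURN[state]
        clock += random.weibullvariate(scale, shape)  # time spent in state
        state -= 1                                    # one-step deterioration
        history.append((clock, state))
    return history

trajectory = simulate_deck()
states = [s for _, s in trajectory]
```

Repeating the simulation many times yields the distribution of condition ratings at any age, which is the quantity a BMS needs for maintenance planning.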
Procedia PDF Downloads 211
1603 A Network-Theoretical Perspective on Music Analysis
Authors: Alberto Alcalá-Alvarez, Pablo Padilla-Longoria
Abstract:
The present paper describes a framework for constructing mathematical networks that encode relevant musical information from a music score for structural analysis. These graphs encompass statistical information about music elements such as notes, chords, rhythms, intervals, etc., and the relations among them, and thus become helpful in visualizing and understanding important stylistic features of a music fragment. In order to build such networks, musical data is parsed out of a digital symbolic music file. This data undergoes different analytical procedures from graph theory, such as measuring the centrality of nodes, community detection, and entropy calculation. The resulting networks reflect important structural characteristics of the fragment in question: predominant elements, connectivity between them, and complexity of the information contained in it. Music pieces in different styles are analyzed, and the results are contrasted with the outcome of traditional analysis in order to show the consistency and potential utility of this method for music analysis.
Keywords: computational musicology, mathematical music modelling, music analysis, style classification
Procedia PDF Downloads 102
1602 A Passive Digital Video Authentication Technique Using Wavelet Based Optical Flow Variation Thresholding
Authors: R. S. Remya, U. S. Sethulekshmi
Abstract:
Detecting the authenticity of a video is an important issue in digital forensics, as video is used as silent evidence in court, for example in child pornography and movie piracy cases, insurance claims, cases involving scientific fraud, traffic monitoring, etc. The biggest threat to video data is the availability of modern open video editing tools, which enable easy editing of videos without leaving any trace of tampering. In this paper, we propose an efficient passive method for detecting inter-frame video tampering, together with its type and location, by estimating the optical flow of wavelet features of adjacent frames and thresholding the variation in the estimated feature. The performance of the algorithm is compared with z-score thresholding, and an efficiency above 95% is achieved on all the tested databases. The proposed method works well for videos with dynamic (forensics) as well as static (surveillance) backgrounds.
Keywords: discrete wavelet transform, optical flow, optical flow variation, video tampering
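The underlying detection principle, a spike in inter-frame variation at a tampered transition, can be sketched with a much simpler feature than the wavelet optical flow: the mean absolute inter-frame difference, scored with the z-score thresholding the paper compares against. The synthetic clip and inserted-frame index are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def tamper_scores(frames):
    """Per-transition z-scores of inter-frame change; spikes suggest tampering."""
    diffs = np.array([np.abs(b - a).mean() for a, b in zip(frames, frames[1:])])
    return (diffs - diffs.mean()) / diffs.std()

# Synthetic surveillance clip: near-static frames, with a foreign frame
# inserted at index 10 (the tampering we want to localize)
frames = [rng.normal(0.5, 0.01, (32, 32)) for _ in range(20)]
frames[10] = rng.normal(0.9, 0.01, (32, 32))  # inserted frame

scores = tamper_scores(frames)
suspect = int(np.argmax(scores))  # transition with the largest spike
```

Both transitions bordering the inserted frame (indices 9 and 10) spike, which is how the location of an insertion is narrowed down.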
Procedia PDF Downloads 359
1601 Automatic Target Recognition in SAR Images Based on Sparse Representation Technique
Authors: Ahmet Karagoz, Irfan Karagoz
Abstract:
Synthetic Aperture Radar (SAR) is a radar mechanism that can be integrated into manned and unmanned aerial vehicles to create high-resolution images in all weather conditions, regardless of day and night. In this study, SAR images of military vehicles with different azimuth and depression angles are pre-processed in the first stage. The main purpose here is to reduce the high speckle noise found in SAR images. For this, the adaptive Wiener filter, the mean filter, and the median filter are used to reduce the amount of speckle noise in the images without causing loss of data. During the image segmentation phase, pixel values are ordered so that the target vehicle region is separated from other regions containing unnecessary information. The target image is binarized, with the brightest 20% of pixels set to a value of 255 and the other pixels to 0. In addition, a segmentation comparison is performed using appropriate parameters of the statistical region merging algorithm. In the feature extraction step, the feature vectors belonging to the vehicles are obtained by using Gabor filters with different orientation, frequency, and angle values. A number of Gabor filters are created by changing their orientation, frequency, and angle parameters to extract important features of the images that form the distinctive parts. Finally, images are classified by the sparse representation method; in this study, the l₁-norm analysis of sparse representation is used. A joint database of the feature vectors generated from the training images of the military vehicle types is assembled and transformed into matrix form. To classify the vehicles, the test image of each vehicle is converted to vector form, and the l₁-norm analysis of the sparse representation method is applied against the existing database matrix.
As a result, correct recognition is achieved by matching the target images of military vehicles with the test images by means of the sparse representation method, and a classification success rate of 97% is obtained on SAR images of different military vehicle types.
Keywords: automatic target recognition, sparse representation, image classification, SAR images
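The segmentation step described above (brightest 20% of pixels set to 255, the rest to 0) can be sketched in a few lines of Python; the function name and flat-list image representation are illustrative, not taken from the paper:

```python
def segment_target(pixels, top_fraction=0.20):
    """Binarize an image: the brightest `top_fraction` of pixels -> 255, the rest -> 0.

    `pixels` is a flat list of grayscale intensities. Ties at the threshold
    value may push the bright region slightly above `top_fraction`.
    """
    flat = sorted(pixels, reverse=True)
    k = max(1, int(len(flat) * top_fraction))
    threshold = flat[k - 1]  # intensity of the k-th brightest pixel
    return [255 if p >= threshold else 0 for p in pixels]
```

For a 10-pixel ramp `list(range(10))`, the two brightest pixels (values 8 and 9) survive as 255 and the rest become 0.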
Procedia PDF Downloads 366
1600 A Network Approach to Analyzing Financial Markets
Authors: Yusuf Seedat
Abstract:
The need to understand global financial markets has increased following the spread of the recent financial crisis around the world. Financial markets are considered complex systems consisting of highly volatile movements whose indexes fluctuate without any clear pattern. Analytic methods for stock prices have been proposed in which financial markets are modeled using common network analysis tools and methods. It has been found that two key components of social network analysis are relevant to modeling financial markets, enabling accurate predictions of stock prices within the financial market. Financial markets have a number of interacting components, leading to complex behavioral patterns. This paper describes a social network approach to analyzing financial markets as a viable way of studying how complex stock markets function. We also look at how social network analysis techniques and metrics are used to gauge the evolution of financial markets, and at how community detection can be used to qualify and quantify influence within a network.
Keywords: network analysis, social networks, financial markets, stocks, nodes, edges, complex networks
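One minimal way to model stocks as a network is to draw an edge between two stocks whose price series are strongly correlated, then group stocks by connectivity. The sketch below uses connected components as a crude stand-in for community detection; the correlation threshold and all names are illustrative choices, not the paper's method:

```python
def pearson(x, y):
    """Pearson correlation of two equal-length numeric series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def correlation_network(series, threshold=0.8):
    """Edge between two stocks when |correlation| of their series exceeds threshold."""
    names = list(series)
    edges = set()
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if abs(pearson(series[a], series[b])) >= threshold:
                edges.add((a, b))
    return edges

def components(nodes, edges):
    """Connected components via union-find, a crude proxy for market communities."""
    parent = {n: n for n in nodes}
    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]  # path halving
            n = parent[n]
        return n
    for a, b in edges:
        parent[find(a)] = find(b)
    groups = {}
    for n in nodes:
        groups.setdefault(find(n), set()).add(n)
    return list(groups.values())
```

Real market studies would use log returns rather than raw prices and a proper community algorithm (e.g. modularity-based), but the pipeline shape is the same: series, pairwise similarity, graph, groups.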
Procedia PDF Downloads 191
1599 Artificial Intelligence-Based Detection of Individuals Suffering from Vestibular Disorder
Authors: Dua Hişam, Serhat İkizoğlu
Abstract:
Identifying the problem behind a balance disorder is one of the most interesting topics in the medical literature. This study advances the application of artificial intelligence (AI) by training multiple machine learning (ML) models on sensory gait data collected from humans to distinguish healthy individuals from those suffering from Vestibular System (VS) problems. Although AI is widely utilized as a diagnostic tool in medicine, AI models have not previously been trained on raw data to perform feature extraction and identify VS disorders. In this study, three ML models, the Random Forest Classifier (RF), Extreme Gradient Boosting (XGB), and K-Nearest Neighbor (KNN), have been trained to detect VS disorder, and the algorithms have been compared using accuracy, recall, precision, and F1-score. With an accuracy of 95.28%, the Random Forest Classifier (RF) was the most accurate model.
Keywords: vestibular disorder, machine learning, random forest classifier, k-nearest neighbor, extreme gradient boosting
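The four comparison metrics named above can be computed directly from true and predicted labels; this is a generic sketch of the standard definitions, not the authors' evaluation code:

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Accuracy, precision, recall, and F1 for a binary classifier."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = len(y_true) - tp - fp - fn
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"accuracy": accuracy, "precision": precision, "recall": recall, "f1": f1}
```

F1 is the harmonic mean of precision and recall, which is why a model can have high accuracy yet a poor F1 when the positive (disorder) class is rare.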
Procedia PDF Downloads 69
1598 GPU Based High Speed Error Protection for Watermarked Medical Image Transmission
Authors: Md Shohidul Islam, Jongmyon Kim, Ui-pil Chong
Abstract:
Medical images are an integral part of e-health care and e-diagnosis systems. Medical image watermarking is widely used to protect patients’ information from malicious alteration and manipulation. The watermarked medical images are transmitted over the internet among patients and primary and referred physicians. The images are highly prone to corruption in the wireless transmission medium due to various noises, deflections, and refractions. Distortion in the received images leads to faulty watermark detection and inappropriate disease diagnosis. To address this issue, this paper incorporates an error correction code (ECC), the (8, 4) Hamming code, into an existing watermarking system. In addition, we implement the computationally complex ECC on a graphics processing unit (GPU) to accelerate it and meet real-time requirements. Experimental results show that the GPU achieves considerable speedup over the sequential CPU implementation while maintaining 100% ECC efficiency.
Keywords: medical image watermarking, e-health system, error correction, Hamming code, GPU
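A pure-Python sketch of an (8, 4) extended Hamming code, which corrects any single-bit error and detects double-bit errors, follows. The bit layout (parity bits at positions 1, 2, and 4 plus an overall parity bit) is one standard convention and not necessarily the exact layout of the paper's GPU implementation:

```python
def hamming84_encode(d):
    """Encode 4 data bits as [p1, p2, d1, p3, d2, d3, d4, p_overall]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    code = [p1, p2, d1, p3, d2, d3, d4]
    code.append(code[0] ^ code[1] ^ code[2] ^ code[3] ^ code[4] ^ code[5] ^ code[6])
    return code

def hamming84_decode(c):
    """Return (data_bits, status): corrects single errors, flags double errors."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # parity check over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # parity check over positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of a single-bit error
    overall = c[0] ^ c[1] ^ c[2] ^ c[3] ^ c[4] ^ c[5] ^ c[6] ^ c[7]
    if syndrome and overall:          # one error inside the Hamming part: fix it
        c[syndrome - 1] ^= 1
        status = "corrected"
    elif syndrome:                    # syndrome set but overall parity clean: two errors
        status = "double-error"
    elif overall:                     # the overall parity bit itself was flipped
        status = "corrected"
    else:
        status = "ok"
    return [c[2], c[4], c[5], c[6]], status
```

Since every codeword is independent, the decoder maps naturally onto one GPU thread per 8-bit block, which is the kind of parallelism the paper exploits.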
Procedia PDF Downloads 290
1597 Comparison of Concentration of Heavy Metals in PM2.5 Analyzed in Three Different Global Research Institutions Using X-Ray Fluorescence
Authors: Sungroul Kim, Yeonjin Kim
Abstract:
This study compared the concentrations of heavy metals analyzed from the same samples with three X-ray fluorescence (XRF) spectrometers at three different global research institutions: PAN (a branch of Malvern Panalytical, Seoul, South Korea), RTI (Research Triangle Institute, NC, U.S.A.), and the aerosol laboratory at Harvard University, Boston, U.S.A. To achieve our research objectives, indoor air filter samples were collected at the homes (n=24) of adult or child asthmatics and then analyzed at PAN, followed consecutively by Harvard University and RTI. Descriptive statistics were computed for data comparison, along with correlation and simple regression analyses, using R version 4.0.3. As a result, detection rates of most heavy metals analyzed at the three institutions were about 90%. Of the 25 elements commonly analyzed among the institutions, 16 showed an R² (coefficient of determination) of 0.7 or higher (10 were 0.9 or higher). The findings of this study demonstrate that XRF is a useful device ensuring reproducibility and compatibility for measuring heavy metals in PM2.5 collected from the indoor air of asthmatics’ homes.
Keywords: heavy metals, indoor air quality, PM2.5, X-ray fluorescence
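The R² agreement statistic used above comes from a simple least-squares fit of one instrument's readings against another's; a minimal sketch of that computation (generic, not the study's R script) is:

```python
def r_squared(x, y):
    """Coefficient of determination for a least-squares line fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1 - ss_res / ss_tot  # 1.0 means the two instruments agree perfectly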
Procedia PDF Downloads 200
1596 A Review of HVDC Modular Multilevel Converters Subjected to DC and AC Faults
Authors: Jude Inwumoh, Adam P. R. Taylor, Kosala Gunawardane
Abstract:
Modular multilevel converters (MMC) exhibit a highly scalable and modular characteristic with good voltage/power expansion, fault tolerance capability, low output harmonic content, good redundancy, and a flexible front-end configuration. Fault detection, location, and isolation, as well as maintaining fault ride-through (FRT), are major challenges to MMC reliability and power supply sustainability. Different papers have been reviewed to identify the MMC configuration with the best fault capability. DC faults are the most common fault type; the probability of an AC fault occurring in an MMC is low, although the consequences of AC faults are severe. This paper reviews several MMC topologies and modulation techniques for tackling faults. These fault control strategies are compared based on cost, complexity, controllability, and power loss. A meshed network of half-bridge (HB) MMC topology was optimal in providing fault ride-through compared with other MMC topologies, but only when combined with DC circuit breakers (CBs), AC CBs, and fault current limiters (FCL).
Keywords: MMC-HVDC, DC faults, fault current limiters, control scheme
Procedia PDF Downloads 139
1595 Single Cell and Spatial Transcriptomics: A Beginner's Viewpoint from the Conceptual Pipeline
Authors: Leo Nnamdi Ozurumba-Dwight
Abstract:
Messenger ribonucleic acid (mRNA) molecules encode proteins, and collectively they constitute the transcriptome. When analyzed by RNA sequencing (RNAseq), they unveil the nature of gene expression in the cell. The obtained gene expression provides clues about cellular traits and their dynamics, which can be studied in relation to function and responses. RNAseq is a practical concept in genomics, as it enables detection and quantitative analysis of mRNA molecules. Single cell and spatial transcriptomics both present avenues for exposing the genomic characteristics of single cells and pooled cells in disease conditions such as cancer, auto-immune diseases, and hematopoietic diseases, among others, from investigated biological tissue samples. Single cell transcriptomics enables direct assessment of each building unit of a tissue (the cell) during diagnosis and molecular gene expression studies. A typical technique to achieve this is single-cell RNA sequencing (scRNAseq), which supports high-throughput gene expression studies. However, this technique generates expression data for many cells while lacking the cells’ positional coordinates within the tissue. As science develops, the use of complementary pre-established tissue reference maps built with molecular and bioinformatics techniques has sprung forth and is now used to resolve this setback, producing both levels of data in one shot of scRNAseq analysis. This is an emerging conceptual approach for integrative and progressively dependable transcriptomics analysis. It can support in-situ-fashioned analysis for better understanding of tissue functional organization, unveil new biomarkers for early-stage detection of diseases and for therapeutic targets in drug development, and expose the nature of cell-to-cell interactions.
These are also vital genomic signatures and characterizations for clinical applications. Over the past decades, RNAseq has generated a wide array of information that is igniting bespoke breakthroughs and innovations in biomedicine. On the other side, spatial transcriptomics operates at the tissue level and is utilized to study biological specimens with heterogeneous features. It exposes the gross identity of the investigated mammalian tissues, which can then be used to study cell differentiation, track cell-line trajectory patterns and behavior, and examine regulatory homeostasis in disease states. It also requires referenced positional analysis to build up the genomic signatures assessed from the single cells in the tissue sample. Given these two approaches to RNA transcriptomics, applicable at different scales of cell populations and with avenues for appropriate resolution, both have made the study of gene expression from mRNA molecules interesting, progressive, and developmental, helping to tackle health challenges head-on.
Keywords: transcriptomics, RNA sequencing, single cell, spatial, gene expression
Procedia PDF Downloads 122
1594 Walking in a Weather rather than a Climate: Critique on the Meta-Narrative of Buddhism in Early India
Authors: Yongjun Kim
Abstract:
Since the agreement on the historicity of the historical Buddha in eastern India, the beginning, heyday, and decline of Buddhism in Early India have been discussed in the context of urbanization, commercialism, and state formation: in short, a Weberian socio-political frame. Recent scholarship, notably in archaeology and anthropology, has proposed a 're-materialization' of Buddhism in Early India based on what Buddhists actually did rather than what they should have done according to canonical teachings or philosophies. But its historical narrations still remain within a socio-political meta-narrative, which tends to unjustifiably dismiss the naturally existing heterogeneity and often chaotic dynamics of diverse agencies, landscape perceptions, localized traditions, etc. The author will argue for a multiplicity of theoretical standpoints in reconstructing Buddhism in Early India. For this, first, the diverse agencies, localized traditions, and landscape patterns of Buddhist communities and monasteries in Trans-Himalayan regions, focusing on the Zanskar Valley and Spiti Valley in India, will be illustrated based on the author's fieldwork. The author will then discuss how this anthropological landscape analysis is better integrated with textual and archaeological evidence on the tension between urban monastic and forest Buddhism; on the phenomena of sacred landscapes, cemeteries, gardens, and natural caves alongside socio-economic landscapes; and on the demographic heterogeneity of Early India. Finally, a comparison will be attempted between the anthropological landscape of the present Trans-Himalaya and the archaeological landscape of ancient Western India. The study of Buddhism in Early India has hardly been discussed through multivalent theoretical archaeology and the anthropology of religion; thus traditional and recent scholarship alike have produced historical meta-narratives, however heterogeneous among themselves.
The multidisciplinary approaches of textual criticism, archaeology, and anthropology will surely help to deconstruct the grand, all-encompassing historical description of Buddhism in Early India and then to reconstruct localized, behavioral, and multivalent narratives. This paper expects to highlight the importance of lesser-studied Buddhist archaeological sites and dynamic views on the religious landscape of Early India with the help of a critical anthropology of religion.
Keywords: analogy by living traditions, Buddhism in Early India, landscape analysis, meta-narrative
Procedia PDF Downloads 333
1593 Deep Learning for Renewable Power Forecasting: An Approach Using LSTM Neural Networks
Authors: Fazıl Gökgöz, Fahrettin Filiz
Abstract:
Load forecasting has become crucial in recent years and has grown popular in the forecasting field. Many different power forecasting models have been tried out for this purpose. Electricity load forecasting is necessary for energy policies and for healthy, reliable grid systems. Effective forecasting of renewable energy load helps decision makers minimize the costs of electric utilities and power plants. Forecasting tools are required that can predict how much renewable energy can be utilized. The purpose of this study is to explore the effectiveness of LSTM-based neural networks for estimating renewable energy loads. We present models for predicting renewable energy loads based on deep neural networks, especially Long Short-Term Memory (LSTM) algorithms. Deep learning allows multiple layers of models to learn representations of data, and LSTM algorithms are able to store information over long periods of time. Deep learning models have recently been used to forecast renewable energy sources, such as predicting wind and solar power. Historical load and weather information represent the most important input variables for power forecasting models. The dataset contains power consumption measurements gathered between January 2016 and December 2017 at one-hour resolution. The models use publicly available data from the Turkish Renewable Energy Resources Support Mechanism. Forecasting studies have been carried out on these data via a deep neural networks approach, including the LSTM technique, for Turkish electricity markets. 432 different models were created by varying the layer count, cell count, and dropout. The adaptive moment estimation (ADAM) algorithm is used as a gradient-based optimizer for training, instead of SGD (stochastic gradient descent); ADAM performed better than SGD in terms of faster convergence and lower error rates. Model performance is compared according to MAE (Mean Absolute Error) and MSE (Mean Squared Error).
The best MAE results among the 432 tested models are 0.66, 0.74, 0.85, and 1.09. The forecasting performance of the proposed LSTM models yields successful results compared with those reported in the literature.
Keywords: deep learning, long short term memory, energy, renewable energy load forecasting
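The ADAM update rule preferred over SGD above can be illustrated on a toy one-dimensional problem; the hyperparameter values below are common defaults, not those tuned in the study:

```python
def adam_minimize(grad, x0, steps=2000, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    """Minimize a 1-D function given its gradient, via the ADAM update rule."""
    x, m, v = x0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = b1 * m + (1 - b1) * g        # first-moment (mean) estimate
        v = b2 * v + (1 - b2) * g * g    # second-moment (uncentered variance) estimate
        m_hat = m / (1 - b1 ** t)        # bias correction for the zero-initialized moments
        v_hat = v / (1 - b2 ** t)
        x -= lr * m_hat / (v_hat ** 0.5 + eps)  # per-parameter adaptive step
    return x
```

Because the step is normalized by the running gradient magnitude, ADAM takes roughly constant-size steps regardless of gradient scale, which is one intuition for its faster convergence than plain SGD on poorly scaled problems.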
Procedia PDF Downloads 266
1592 Detecting Cyberbullying, Spam and Bot Behavior and Fake News in Social Media Accounts Using Machine Learning
Authors: M. D. D. Chathurangi, M. G. K. Nayanathara, K. M. H. M. M. Gunapala, G. M. R. G. Dayananda, Kavinga Yapa Abeywardena, Deemantha Siriwardana
Abstract:
Due to the growing popularity of social media platforms, there are various concerns, chiefly cyberbullying, spam, bot accounts, and the spread of incorrect information. To develop a risk score calculation system as a thorough method for deciphering and exposing unethical social media profiles, this research explores the most suitable algorithms, to the best of our knowledge, for detecting these concerns. Multiple models, such as Naïve Bayes, CNN, KNN, Stochastic Gradient Descent, and the Gradient Boosting Classifier, were examined, and the best results were carried into the development of the risk score system. For cyberbullying, the Logistic Regression algorithm achieved an accuracy of 84.9%, while the spam-detecting MLP model reached 98.02% accuracy. For identifying bot accounts, the Random Forest algorithm obtained 91.06% accuracy, and 84% accuracy was achieved for fake news detection using SVM.
Keywords: cyberbullying, spam behavior, bot accounts, fake news, machine learning
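One simple way such a risk score could combine the four detectors is a weighted average of their per-profile probabilities; the paper does not specify its aggregation formula, so the weights and signal names below are purely hypothetical:

```python
def risk_score(signals, weights):
    """Weighted risk score in [0, 100] from per-detector probabilities.

    `signals` maps detector name -> probability in [0, 1]; missing
    detectors count as 0. `weights` expresses each detector's importance.
    """
    total = sum(weights.values())
    score = sum(weights[k] * signals.get(k, 0.0) for k in weights) / total
    return round(100 * score, 1)
```

For example, with equal weights, a profile flagged only as a likely bot (probability 1.0) scores 25.0 out of 100.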
Procedia PDF Downloads 36
1591 Tax Evasion with Mobility between the Regular and Irregular Sectors
Authors: Xavier Ruiz Del Portal
Abstract:
This paper incorporates mobility between the legal and black economies into a model of tax evasion with endogenous labor supply, in which underreporting is possible in one sector but impossible in the other. We find that the effects along the extensive margin (the number of evaders) are more robust and conclusive than those along the intensive margin (hours of illegal work) usually considered in the literature. In particular, it is shown that the following policies reduce the number of evaders: (a) larger and more progressive evasion penalties; (b) higher detection probabilities; (c) an increase in the legal sector wage rate; (d) a decrease in the moonlighting wage rate; (e) higher costs of creating opportunities to evade; (f) fewer opportunities to evade; and (g) greater psychological costs of tax evasion. When tax concealment and illegal work are also taken into account, the effects do not vary significantly under the assumptions in Cowell (1985), except that policies (a) and (b) hold only for low- and middle-income groups and policies (e) and (f) only for high-income groups.
Keywords: income taxation, tax evasion, extensive margin responses, the penalty system
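The deterrence logic behind results (a) and (b) can be sketched under risk neutrality with the textbook expected-payoff setup (the classic Allingham–Sandmo framework, not this paper's richer model with sectoral mobility and endogenous labor supply):

```python
def expected_payoff(w, declared, t, p, f):
    """Risk-neutral expected income when declaring `declared` of true income w.

    t: tax rate, p: detection probability, f: penalty as a multiple of evaded tax.
    If caught, the evader pays the full tax due plus the penalty on evaded tax.
    """
    evaded_tax = t * (w - declared)
    not_caught = w - t * declared
    caught = w - t * w - f * evaded_tax
    return (1 - p) * not_caught + p * caught

def evasion_pays(p, f):
    """Underreporting raises risk-neutral expected income iff p * (1 + f) < 1."""
    return p * (1 + f) < 1
```

Raising either the penalty multiple f or the detection probability p pushes p(1 + f) past 1, at which point a risk-neutral worker stops evading, mirroring the comparative statics in (a) and (b).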
Procedia PDF Downloads 155