Search results for: multi-scale feature extraction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3243

2433 Localization of Geospatial Events and Hoax Prediction in the UFO Database

Authors: Harish Krishnamurthy, Anna Lafontant, Ren Yi

Abstract:

Unidentified Flying Objects (UFOs) have been an interesting topic for most enthusiasts, and people all over the United States report such sightings online at the National UFO Reporting Center (NUFORC). Some of these reports are a hoax, and among those that seem legitimate, our task is not to establish that these events are indeed related to flying objects from aliens in outer space. Rather, we intend to identify whether a report was a hoax, as identified by the UFO database team with their existing curation criteria. The database provides a wealth of information that can be exploited for various analyses and insights, such as social reporting, identifying real-time spatial events and much more. We perform analysis to localize these time-series geospatial events and correlate them with known real-time events. This paper does not confirm any legitimacy of alien activity, but rather attempts to gather information from likely legitimate reports of UFOs by studying the online reports. These events occur in geospatial clusters and are also time-based. We use cluster density and data visualization to search the space of various cluster realizations and decide on the most probable clusters that provide information about the proximity of such activity. A random forest classifier is also presented to distinguish true events from hoax events, using the best features available, such as region, week, time period and duration. Lastly, we show the performance of the scheme on various days and correlate it with real-time events; one of the UFO reports strongly correlates with a missile test conducted in the United States.
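
As a minimal sketch of the classification step described above (the feature and column names such as region, week, time_period and duration are illustrative assumptions taken from the abstract, not the authors' released code or data), a random forest hoax classifier could be set up as follows:

```python
# Minimal sketch of the hoax/true-event classifier described in the abstract.
# Column names (region, week, time_period, duration_s, is_hoax) are illustrative
# assumptions; the NUFORC-derived feature set used by the authors is not published here.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

reports = pd.read_csv("ufo_reports.csv")                  # hypothetical curated export
X = pd.get_dummies(reports[["region", "week", "time_period", "duration_s"]],
                   columns=["region", "time_period"])     # one-hot encode categoricals
y = reports["is_hoax"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    stratify=y, random_state=42)
clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```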

Keywords: time-series clustering, feature extraction, hoax prediction, geospatial events

Procedia PDF Downloads 361
2432 A Comprehensive Review of Artificial Intelligence Applications in Sustainable Building

Authors: Yazan Al-Kofahi, Jamal Alqawasmi

Abstract:

In this study, a systematic literature review (SLR) was conducted with the main goal of assessing the existing literature on how artificial intelligence (AI), machine learning (ML) and deep learning (DL) models are used in sustainable architecture applications and issues, including thermal comfort satisfaction, energy efficiency, cost prediction and many other issues. The search strategy was initiated using different databases, including Scopus, Springer and Google Scholar. The inclusion criteria were applied using two search strings related to DL, ML and sustainable architecture. The timeframe for the inclusion of papers was open, although most of the included papers were published in the previous four years. As a paper filtration strategy, conferences and books were excluded from the database search results. Using these inclusion and exclusion criteria, the search was conducted, and a sample of 59 papers was selected as the final set included in the analysis. The data extraction phase consisted of extracting the needed data from these papers, which were then analyzed and correlated. The results of this SLR showed that there are many applications of ML and DL in sustainable buildings and that this topic is currently trending. It was found that most of the papers focused their discussions on addressing environmental sustainability issues and factors using machine learning predictive models, with a particular emphasis on the use of Decision Tree algorithms. Moreover, it was found that the Random Forest regressor demonstrates strong performance across all feature selection groups in terms of building cost prediction as a machine-learning predictive model.

Keywords: machine learning, deep learning, artificial intelligence, sustainable building

Procedia PDF Downloads 46
2431 Comparative Life Cycle Assessment of High Barrier Polymer Packaging for Selecting Resource Efficient and Environmentally Low-Impact Materials

Authors: D. Kliaugaitė, J. K. Staniškis

Abstract:

In this study, three types of multilayer gas barrier plastic packaging films were compared using life cycle assessment as a tool for selecting resource-efficient and environmentally low-impact materials. The first type of multilayer packaging film (PET-AlOx/LDPE) consists of polyethylene terephthalate with an AlOx barrier layer (PET-AlOx) and low-density polyethylene (LDPE). The second type of polymer film (PET/PE-EVOH-PE) is made of polyethylene terephthalate (PET) and a co-extruded PE-EVOH-PE film as the barrier layer. The third type of multilayer packaging film (PET-PVOH/LDPE) is formed from polyethylene terephthalate with a PVOH barrier layer (PET-PVOH) and low-density polyethylene (LDPE). All of the analyzed packaging has a significant impact on resource depletion because of raw material extraction, energy use and the production of different kinds of plastics. Nevertheless, the impact generated during the life cycle of a functional unit of type II packaging (PET/PE-EVOH-PE) was about 25% lower than the impact generated by type I (PET-AlOx/LDPE) and type III (PET-PVOH/LDPE) packaging. The results revealed that the contribution of the different gas barrier types to the overall environmental impact of the packaging is not significant. The impacts are mostly generated by the energy and materials used during raw material extraction and the production of the bulk plastic polymers such as PE, LDPE and PET, rather than by the gas barrier materials such as AlOx, PVOH and EVOH. The LCA results could be useful in different decision-making processes for selecting resource-efficient and environmentally low-impact materials.

Keywords: life cycle assessment, polymer packaging, resource efficiency, materials extraction, polyethylene terephthalate

Procedia PDF Downloads 343
2430 Artificial Intelligence Based Abnormality Detection System and Real Valu™ Product Design

Authors: Junbeom Lee, Jaehyuck Cho, Wookyeong Jeong, Jonghan Won, Jungmin Hwang, Youngseok Song, Taikyeong Jeong

Abstract:

This paper investigates and analyzes meta-learning technologies that use multiple cameras to monitor and check abnormal behavior in people in real time in the healthcare field. Advances in artificial intelligence and computer vision technologies have confirmed that cameras can be useful for individual health monitoring and abnormal behavior detection. This makes it possible to establish a system that can respond early by automatically detecting abnormal behavior in vulnerable groups such as patients and the elderly. In this paper, we use a technique called meta-learning to analyze image data collected from cameras and develop a commercial product to determine abnormal behavior. Meta-learning applies machine learning algorithms to help systems learn and adapt quickly to new real data. Through this, the accuracy and reliability of the abnormal behavior discrimination system can be improved. In addition, this study proposes a meta-learning-based abnormal behavior detection system that includes steps such as data collection and preprocessing, feature extraction and selection, and classification model development. Various healthcare scenarios and experiments are used to analyze the performance of the proposed system and demonstrate its superiority over other existing methods. Through this study, we present the possibility that camera-based meta-learning technology can be useful for monitoring and detecting abnormal behavior in the healthcare area.
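
The abstract does not specify which meta-learning algorithm is used. Purely as an illustration of the "adapt quickly from a few labeled examples" idea, and not the authors' system, an episodic nearest-prototype classifier over pre-extracted behavior features (all names and data here are hypothetical) might look like this:

```python
# Illustrative episodic (few-shot) classification over pre-extracted behavior features.
# This is NOT the authors' system; it only sketches the meta-learning idea of adapting
# from a small labeled support set, using nearest class prototypes.
import numpy as np

def episode_accuracy(support_x, support_y, query_x, query_y):
    """support_x: (n_support, d) features with labels support_y; likewise for query."""
    classes = np.unique(support_y)
    # Class prototype = mean feature vector of the few labeled support examples.
    prototypes = np.stack([support_x[support_y == c].mean(axis=0) for c in classes])
    # Assign each query to the nearest prototype (Euclidean distance).
    d = np.linalg.norm(query_x[:, None, :] - prototypes[None, :, :], axis=-1)
    pred = classes[d.argmin(axis=1)]
    return (pred == query_y).mean()

rng = np.random.default_rng(0)
support_x = rng.normal(size=(10, 64)); support_y = np.repeat([0, 1], 5)   # 0=normal, 1=abnormal
query_x = rng.normal(size=(20, 64));   query_y = np.repeat([0, 1], 10)
print("episode accuracy:", episode_accuracy(support_x, support_y, query_x, query_y))
```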

Keywords: artificial intelligence, abnormal behavior, early detection, health monitoring

Procedia PDF Downloads 66
2429 Actually Existing Policy Mobilities in Czechia: Comparing Creative and Smart Cities

Authors: Ondrej Slach, Jan Machacek, Jan Zenka, Lucie Hyllova, Petr Rumpel

Abstract:

The aim of the paper is to identify and assess the different trajectories of two fashionable urban policies, creative and smart cities, in a specific post-socialist context. Drawing on the case of Czechia, we employ the concept of policy mobility research. More specifically, we employ discourse analysis in order to identify the so-called 'infrastructure' of both policies (such as principal actors, journals, conferences, events), with a special focus on 'agents of transfer' in a multiscale perspective. The preliminary results indicate a faster and more aggressive spatial penetration of smart cities policy compared to creative cities policy in Czechia. Further, it seems that the existing translation and implementation of smart cities policy into the national and urban context has resulted in a deliberately fragmented smart cities policy in Czechia (a purely technocratic view), which might be a threat to the future development of social sustainability, especially in cities that are facing increasing social polarisation. Last but not least, due to the fast spatial penetration of the concept and policies of smart cities, it seems that creative cities policy has almost been crowded out of the Czech urban agenda.

Keywords: policy mobility, smart cities, creative cities, Czechia

Procedia PDF Downloads 151
2428 Parameters of Validation Method of Determining Polycyclic Aromatic Hydrocarbons in Drinking Water by High Performance Liquid Chromatography

Authors: Jonida Canaj

Abstract:

A simple method for the extraction and determination of fifteen priority polycyclic aromatic hydrocarbons (PAHs) in drinking water using high performance liquid chromatography (HPLC) has been validated with respect to limits of detection (LOD) and limits of quantification (LOQ), method recovery and reproducibility, and other factors. HPLC parameters, such as mobile phase composition and flow rate, were standardized for the determination of PAHs using a fluorescence detector (FLD). Extraction of PAHs was carried out by liquid-liquid extraction using dichloromethane. The linearity of the calibration curves was good for all PAHs (R², 0.9954-1.0000) in the concentration range 0.1-100 ppb. Analysis of standard spiked water samples resulted in recoveries between 78.5-150% (0.1 ppb) and 93.04-137.47% (10 ppb). The estimated LOD and LOQ ranged between 0.0018-0.98 ppb. The method described has been used for the determination of the fifteen PAHs in drinking water samples.
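
For reference, LOD and LOQ values of the kind reported above are commonly estimated from the calibration curve (ICH approach) as 3.3·σ/S and 10·σ/S, where σ is the standard deviation of the response and S the calibration slope. A small sketch with made-up calibration data (not the paper's measurements) is shown below:

```python
# Sketch of LOD/LOQ estimation from a calibration curve (ICH: 3.3*sigma/S and 10*sigma/S).
# The concentration/area values below are made-up illustration data, not the paper's.
import numpy as np

conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0, 100.0])        # ppb
area = np.array([1.2, 5.9, 12.1, 60.4, 119.8, 601.3, 1203.5])   # detector response

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)           # std. dev. of the regression residuals

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"R^2 = {np.corrcoef(conc, area)[0, 1]**2:.4f}, LOD = {lod:.3f} ppb, LOQ = {loq:.3f} ppb")
```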

Keywords: high performance liquid chromatography, HPLC, method validation, polycyclic aromatic hydrocarbons, PAHs, water

Procedia PDF Downloads 86
2427 Algorithm Research on Traffic Sign Detection Based on Improved EfficientDet

Authors: Ma Lei-Lei, Zhou You

Abstract:

Aiming at the low detection accuracy of deep learning algorithms in traffic sign detection, this paper proposes an improved EfficientDet-based traffic sign detection algorithm. Multi-head self-attention is introduced in the minimum-resolution layer of the EfficientDet backbone to achieve effective aggregation of local and global depth information. This study also proposes an improved feature fusion pyramid with additional vertical cross-layer connections, which improves the performance of the model while introducing only a small amount of complexity. Balanced L1 Loss is introduced to replace the original regression loss function, Smooth L1 Loss, which addresses the balance problem in the loss function. Experimental results show that the algorithm proposed in this study is suitable for the task of traffic sign detection. Compared with other models, the improved EfficientDet has the best detection accuracy. Although its speed is not completely dominant, it still meets the real-time requirement.
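
As a generic sketch of the first idea, applying multi-head self-attention to the lowest-resolution backbone feature map to mix local and global information, the block below is an assumption-based illustration (channel count, head count and feature shapes are invented), not the authors' exact EfficientDet modification:

```python
# Generic sketch: multi-head self-attention over the lowest-resolution backbone feature
# map, with a residual connection and layer norm. Shapes and hyperparameters are assumptions.
import torch
import torch.nn as nn

class GlobalContextBlock(nn.Module):
    def __init__(self, channels: int, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim=channels, num_heads=num_heads,
                                          batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x):                      # x: (B, C, H, W), lowest-resolution map
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)  # (B, H*W, C) token sequence
        out, _ = self.attn(tokens, tokens, tokens)
        out = self.norm(tokens + out)          # residual + layer norm
        return out.transpose(1, 2).reshape(b, c, h, w)

feat = torch.randn(2, 160, 8, 8)               # e.g. an 8x8 top-level feature map
print(GlobalContextBlock(160)(feat).shape)     # torch.Size([2, 160, 8, 8])
```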

Keywords: convolutional neural network, transformer, feature pyramid networks, loss function

Procedia PDF Downloads 83
2426 A Computational Framework for Load Mediated Patellar Ligaments Damage at the Tropocollagen Level

Authors: Fadi Al Khatib, Raouf Mbarki, Malek Adouni

Abstract:

In various sport and recreational activities, the patellofemoral joint undergoes large forces and moments while accommodating significant knee joint movement. In doing so, this joint is commonly the source of anterior knee pain related to instability in normal patellar tracking and excessive pressure syndrome. One well-observed explanation of the instability of normal patellar tracking is damage to the patellofemoral ligaments and patellar tendon. Improved knowledge of the damage mechanism mediating ligament and tendon injuries can be a great help not only in rehabilitation and prevention procedures but also in the design of better reconstruction systems in the management of knee joint disorders. This damage mechanism, specifically due to excessive mechanical loading, has been linked to the micro level of the fibred structure, precisely to the tropocollagen molecules and their connection density. We argue that defining a clear framework from the bottom (micro level) up (macro level) in the hierarchies of the soft tissue may elucidate the essential underpinnings of the state of ligament damage. To do so, in this study a multiscale fibril-reinforced hyper-elastoplastic finite element model that accounts for the synergy between molecular and continuum syntheses was developed to determine the short-term stress/strain response of the patellofemoral ligaments and tendon. The plasticity of the proposed model is associated only with the uniaxial deformation of the collagen fibril. The yield strength of the fibril is a function of the cross-link density between tropocollagen molecules, defined here by a density function. This function was obtained through a coarse-graining procedure linking nanoscale collagen features and tissue-level material properties using molecular dynamics simulations. The hierarchies of the soft tissues were implemented using the rule of mixtures. Thereafter, the model was calibrated using a statistical calibration procedure, implemented into a real structure of the patellofemoral ligaments and patellar tendon (OpenKnee) and simulated under realistic loading conditions. With the calibrated material parameters, the calculated axial stress agrees well with the experimental measurements, with a coefficient of determination (R²) equal to 0.91 and 0.92 for the patellofemoral ligaments and the patellar tendon, respectively. The 'best' prediction of the yield strength and strain, as compared with the reported experimental data, was obtained when the cross-link density between the tropocollagen molecules of the fibril was equal to 5.5 ± 0.5 (patellofemoral ligaments) and 12 (patellar tendon). Damage initiation in the patellofemoral ligaments was located at the femoral insertions, while damage of the patellar tendon occurred in the middle of the structure. These predicted findings showed a meaningful correlation between the cross-link density of the tropocollagen molecules and the stiffness of the connective tissues of the extensor mechanism. Damage initiation and propagation were also documented with this model, in satisfactory agreement with earlier observations. To the best of our knowledge, this is the first attempt to model ligaments from the bottom up, with predictions depending on the tropocollagen cross-link density. This approach appears more meaningful towards a realistic simulation of a damage process or repair attempt compared with certain published studies.

Keywords: tropocollagen, multiscale model, fibrils, knee ligaments

Procedia PDF Downloads 115
2425 Internet of Things Networks: Denial of Service Detection in Constrained Application Protocol Using Machine Learning Algorithm

Authors: Adamu Abdullahi, On Francisca, Saidu Isah Rambo, G. N. Obunadike, D. T. Chinyio

Abstract:

The paper discusses the potential threat of Denial of Service (DoS) attacks in Internet of Things (IoT) networks on the Constrained Application Protocol (CoAP). As billions of IoT devices are expected to be connected to the internet in the coming years, the security of these devices is vulnerable to attacks, disrupting their functioning. This research aims to tackle this issue by applying mixed qualitative and quantitative methods for feature selection, feature extraction, and clustering algorithms to detect DoS attacks in the Constrained Application Protocol (CoAP) using a Machine Learning Algorithm (MLA). The main objective of the research is to enhance the security scheme for CoAP in the IoT environment by analyzing the nature of DoS attacks and identifying a new set of features for detecting them in the IoT network environment. The aim is to demonstrate the effectiveness of the MLA in detecting DoS attacks and to compare it with conventional intrusion detection systems for securing the CoAP in the IoT environment. Findings: The research identifies the appropriate node for detecting DoS attacks in the IoT network environment and demonstrates how to detect the attacks through the MLA. The detection accuracy in both the classification and network simulation environments shows that the k-means algorithm scored the highest percentage in the training and testing of the evaluation. The network simulation platform also achieved the highest overall accuracy of 99.93%. This work reviews conventional intrusion detection systems for securing the CoAP in the IoT environment, and the DoS security issues associated with the CoAP are discussed.
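
A minimal sketch of the clustering step, k-means over CoAP traffic features to separate normal from DoS-like behavior, is given below; the feature names and the CSV file are illustrative assumptions, not the authors' dataset:

```python
# Sketch of the clustering step: k-means over CoAP traffic features to separate
# normal from DoS-like behavior. Feature names are illustrative assumptions.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

traffic = pd.read_csv("coap_traffic_features.csv")     # hypothetical feature export
X = StandardScaler().fit_transform(
    traffic[["packet_rate", "avg_payload", "retransmissions", "distinct_sources"]])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
traffic["cluster"] = km.labels_

# Heuristic: the cluster with the higher mean packet rate is flagged as DoS-like.
dos_cluster = traffic.groupby("cluster")["packet_rate"].mean().idxmax()
print("suspected DoS flows:", (traffic["cluster"] == dos_cluster).sum())
```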

Keywords: algorithm, CoAP, DoS, IoT, machine learning

Procedia PDF Downloads 54
2424 Phylogenetic Differential Separation of Environmental Samples

Authors: Amber C. W. Vandepoele, Michael A. Marciano

Abstract:

Biological analyses frequently focus on single organisms; however, the biological sample often consists of more than the target organism. For example, human microbiome research targets bacterial DNA, yet most samples consist largely of human DNA. Therefore, there would be an advantage to removing these contaminating organisms. Conversely, some analyses focus on a single organism but would greatly benefit from additional information regarding the other organismal components of the sample. Forensic analysis is one such example: in most forensic casework, human DNA is targeted; however, it typically exists in complex, non-pristine sample substrates such as soil or unclean surfaces. These complex samples are commonly comprised of not just human tissue but also microbial and plant life, and these organisms may help gain more forensically relevant information about a specific location or interaction. This project aims to optimize a 'phylogenetic' differential extraction method that will separate mammalian, bacterial and plant cells in a mixed sample. This is accomplished through the use of size exclusion separation, whereby the different cell types are separated through multiple filtrations using 5 μm filters. The components are then lysed via differential enzymatic sensitivities among the cells and extracted with minimal contribution from the preceding component. This extraction method will then allow complex DNA samples to be more easily interpreted through non-targeted sequencing, since the data will not be skewed toward the smaller and usually more numerous bacterial DNAs. This research project has demonstrated that this 'phylogenetic' differential extraction method successfully separates the epithelial and bacterial cells from each other with minimal cell loss. We will take this one step further, showing that when plant cells are added to the mixture, they can be separated and extracted from the sample. Research is ongoing, and results are pending.

Keywords: DNA isolation, geolocation, non-human, phylogenetic separation

Procedia PDF Downloads 99
2423 Speciation Analysis by Solid-Phase Microextraction and Application to Atrazine

Authors: K. Benhabib, X. Pierens, V-D Nguyen, G. Mimanne

Abstract:

The main hypothesis of the dynamics of solid phase microextraction (SPME) is that steady-state mass transfer is respected throughout the SPME extraction process: steady-state diffusion is established in the two phases, with fast exchange of the analyte at the solid phase film/water interface. An improved model is proposed in this paper to handle the situation in which the analyte (atrazine) is in contact with colloid suspensions (carboxylate latex in aqueous solution). A mathematical solution is obtained by substituting the diffusion coefficient with the mean diffusion coefficient of the analyte and carboxylate latex, and the layer thickness with the mean thickness in aqueous solution. This solution provides an equation relating the extracted amount of the analyte to the extraction time that is slightly more complicated than previous models, and it gives a better description of the experimental observations. Moreover, the rate constant of the analyte obtained is in satisfactory agreement with that obtained from the initial curve fitting.

Keywords: pesticide, solid-phase microextraction (SPME) methods, steady state, analytical model

Procedia PDF Downloads 474
2422 Recovery of Au and Other Metals from Old Electronic Components by Leaching and Liquid Extraction Process

Authors: Tomasz Smolinski, Irena Herdzik-Koniecko, Marta Pyszynska, M. Rogowski

Abstract:

Old electronic components can easily be found nowadays. Significant quantities of valuable metals such as gold, silver or copper are used for the production of advanced electronic devices. Old, useless electronic devices have slowly become a new source of precious metals, very often more efficient than natural ores. For example, it is possible to recover more gold from one ton of personal computers than from seventeen tons of gold ore. This makes the urban mining industry very profitable and necessary for sustainable development. For the recovery of metals from waste electronic equipment, various treatment options based on conventional physical, hydrometallurgical and pyrometallurgical processes are available. In this group, hydrometallurgical processes, with their relatively low capital cost, low environmental impact, potential for high metal recoveries and suitability for small-scale applications, are very promising options. The Institute of Nuclear Chemistry and Technology has great experience in hydrometallurgical processes, especially focused on recovering metals from industrial and agricultural wastes. At the moment, an urban mining project is being carried out. A method for the effective recovery of valuable metals from central processing unit (CPU) components has been developed. The principal processes, acidic leaching and solvent extraction, were used for precious metal recovery from old processors and graphic cards. Electronic components were treated with acidic solutions under various conditions. The optimal acid concentration, process time and temperature were selected. Precious metals were extracted into the aqueous phase. In the next step, the metals were selectively extracted by organic solvents such as oximes or tributyl phosphate (TBP). Multistage mixer-settler equipment was used, and the process was optimized.

Keywords: electronic waste, leaching, hydrometallurgy, metal recovery, solvent extraction

Procedia PDF Downloads 125
2421 Organic Matter Distribution in Bazhenov Source Rock: Insights from Sequential Extraction and Molecular Geochemistry

Authors: Margarita S. Tikhonova, Alireza Baniasad, Anton G. Kalmykov, Georgy A. Kalmykov, Ralf Littke

Abstract:

There is a high complexity in the pore structure of organic-rich rocks caused by the combination of inter-particle porosity from inorganic mineral matter and ultrafine intra-particle porosity from both organic matter and clay minerals. Fluids are retained in that pore space, but there are major uncertainties in how and where the fluids are stored and to what extent they are accessible or trapped in 'closed' pores. A large degree of tortuosity may lead to fractionation of the organic matter, so that the lighter and more flexible compounds diffuse to the reservoir whereas more complex compounds may be locked in place. Additionally, part of the hydrocarbons could be bound to solid organic matter (kerogen) and the mineral matrix during expulsion and migration. Larger compounds can occupy thin channels so that clogging or oil and gas entrapment will occur. Sequential extraction applying different solvents is a powerful tool to provide more information about the characteristics of the trapped organic matter distribution. The Upper Jurassic – Lower Cretaceous Bazhenov shale is one of the most petroliferous source rocks, extending across West Siberia, Russia. Concerning its variable mineral composition, pore space distribution and thermal maturation, there are high uncertainties in the distribution and composition of organic matter in this formation. In order to address this issue, the geological and geochemical properties of 30 samples, including mineral composition (XRD and XRF), structure and texture (thin-section microscopy), organic matter content, type and thermal maturity (Rock-Eval), as well as the molecular composition (GC-FID and GC-MS) of the different materials extracted during sequential extraction, were considered. Sequential extraction was performed with a Soxhlet apparatus using different solvents, i.e., n-hexane, chloroform and ethanol-benzene (1:1 v:v), first on core plugs and later on pulverized materials. The results indicate that the studied samples are mainly composed of type II kerogen with TOC contents varying from 5 to 25%. The thermal maturity ranged from immature to late oil window. Whereas clay content decreased with increasing maturity, the amount of silica increased in the studied samples. According to the molecular geochemistry, hydrocarbons stored in open and closed pore space reveal different geochemical fingerprints. The results improve our understanding of hydrocarbon expulsion and migration in the organic-rich Bazhenov shale and therefore allow a better estimation of the hydrocarbon potential of this formation.

Keywords: Bazhenov formation, bitumen, molecular geochemistry, sequential extraction

Procedia PDF Downloads 154
2420 Predicting Match Outcomes in Team Sport via Machine Learning: Evidence from National Basketball Association

Authors: Jacky Liu

Abstract:

This paper develops a team sports outcome prediction system with potential for wide-ranging applications across various disciplines. Despite significant advancements in predictive analytics, existing studies on sports outcome prediction have considerable limitations, including insufficient feature engineering and underutilization of advanced machine learning techniques, among others. To address these issues, we extend the Sports Cross Industry Standard Process for Data Mining (SRP-CRISP-DM) framework and propose a unique, comprehensive predictive system, using National Basketball Association (NBA) data as an example to test this extended framework. Our approach follows a holistic methodology in feature engineering, employing both time series and non-time-series data, as well as conducting exploratory data analysis and feature selection. Furthermore, we contribute to the discourse on target variable choice in team sports outcome prediction, asserting that point spread prediction yields higher profits than game-winner prediction. Using machine learning algorithms, particularly XGBoost, results in a significant improvement in the predictive accuracy of team sports outcomes. Applied to point spread betting strategies, it offers an annual return of approximately 900% on an initial investment of $100. Our findings not only contribute to the academic literature but also have practical implications for sports betting. Our study advances the understanding of team sports outcome prediction, a burgeoning area in complex system prediction, and paves the way for potential profitability and more informed decision-making in sports betting markets.
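
A minimal sketch of point-spread regression with XGBoost is shown below; the feature columns and the CSV file are illustrative assumptions, not the paper's engineered feature set:

```python
# Sketch of point-spread regression with XGBoost; feature columns are illustrative
# assumptions, not the paper's feature engineering pipeline.
import pandas as pd
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

games = pd.read_csv("nba_games_features.csv")          # hypothetical feature table
features = ["home_off_rating", "home_def_rating", "away_off_rating",
            "away_def_rating", "rest_days_diff", "elo_diff"]
X, y = games[features], games["point_spread"]          # home score minus away score

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)
model = XGBRegressor(n_estimators=500, learning_rate=0.05, max_depth=4)
model.fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```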

Keywords: machine learning, team sports, game outcome prediction, sports betting, profits simulation

Procedia PDF Downloads 81
2419 Exploiting the Potential of Fabric Phase Sorptive Extraction for Forensic Food Safety: Analysis of Food Samples in Cases of Drug Facilitated Crimes

Authors: Bharti Jain, Rajeev Jain, Abuzar Kabir, Torki Zughaibi, Shweta Sharma

Abstract:

Drug-facilitated crimes (DFCs) entail the use of a single drug or a mixture of drugs to incapacitate a victim. Traditionally, biological samples have been collected from victims and analyzed to establish evidence of drug administration. Nevertheless, the rapid metabolism of various drugs and delays in analysis can impede the identification of such substances. For this reason, the present article describes a rapid, sustainable, highly efficient and miniaturized protocol for the identification and quantification of three sedative-hypnotic drugs, namely diazepam, chlordiazepoxide and ketamine, in alcoholic beverages and complex food samples (cream biscuit, flavored milk, juice, cake, tea, sweets and chocolate). The methodology involves utilizing fabric phase sorptive extraction (FPSE) to extract diazepam (DZ), chlordiazepoxide (CDP), and ketamine (KET). Subsequently, the extracted samples are analyzed using gas chromatography-mass spectrometry (GC-MS). Several parameters, including the type of membrane, pH, agitation time and speed, ionic strength, sample volume, elution volume and time, and type of elution solvent, were screened and thoroughly optimized. Sol-gel Carbowax 20M (CW-20M) demonstrated the most effective extraction efficiency for the target analytes among all evaluated membranes. Under optimal conditions, the method displayed linearity within the range of 0.3–10 µg mL⁻¹ (or µg g⁻¹), with a coefficient of determination (R²) ranging from 0.996 to 0.999. The limits of detection (LODs) and limits of quantification (LOQs) for liquid samples ranged between 0.020-0.069 µg mL⁻¹ and 0.066-0.22 µg mL⁻¹, respectively. Correspondingly, the LODs for solid samples ranged from 0.056-0.090 µg g⁻¹, while the LOQs ranged from 0.18-0.29 µg g⁻¹. Notably, the method showed good precision, with repeatability and reproducibility below 5% and 10%, respectively. Furthermore, the FPSE-GC-MS method proved effective in determining diazepam (DZ) in forensic food samples connected to drug-facilitated crimes (DFCs). Additionally, the proposed method was evaluated for its whiteness using the RGB12 algorithm.

Keywords: drug facilitated crime, fabric phase sorptive extraction, food forensics, white analytical chemistry

Procedia PDF Downloads 52
2418 Multidirectional Product Support System for Decision Making in Textile Industry Using Collaborative Filtering Methods

Authors: A. Senthil Kumar, V. Murali Bhaskaran

Abstract:

In the information technology domain, people use various tools and software for official and personal purposes. Nowadays, people worry about choosing data access and extraction tools when buying and selling their products. In addition, they worry about various quality factors such as price, durability, color, size, and availability of the product. The main purpose of the research study is to find solutions to these unsolved existing problems. The proposed algorithm is a Multidirectional Rank Prediction (MDRP) decision-making algorithm designed to support effective strategic decisions at all levels of data extraction; it uses a real-time textile dataset and analyzes the results. Finally, the results are obtained and compared with existing measurement methods such as PCC, SLCF, and VSS. The resulting accuracy is higher than that of the existing rank prediction methods.
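
For context, a compact sketch of the Pearson-correlation-based (PCC) collaborative filtering baseline that rank prediction methods like MDRP are compared against is given below; the user-item rating matrix is a made-up toy example, not the textile dataset used in the paper:

```python
# Sketch of Pearson-correlation (PCC) user-based collaborative filtering rating prediction.
# The rating matrix is a made-up toy example (users x textile products), 0 = unrated.
import numpy as np

R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [1, 0, 4, 4]], dtype=float)

def predict(R, user, item, k=2):
    mask = R > 0
    means = np.array([R[u, mask[u]].mean() for u in range(R.shape[0])])
    sims = []
    for v in range(R.shape[0]):
        if v == user or not mask[v, item]:
            continue
        common = mask[user] & mask[v]              # items rated by both users
        if common.sum() < 2:
            continue
        sims.append((np.corrcoef(R[user, common], R[v, common])[0, 1], v))
    top = sorted(sims, reverse=True)[:k]           # k most similar neighbours
    if not top:
        return means[user]
    num = sum(s * (R[v, item] - means[v]) for s, v in top)
    den = sum(abs(s) for s, _ in top)
    return means[user] + num / den

print(round(predict(R, user=0, item=2), 2))        # predicted rating for an unrated item
```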

Keywords: Knowledge Discovery in Databases (KDD), Multidirectional Rank Prediction (MDRP), Pearson’s Correlation Coefficient (PCC), Vector Space Similarity (VSS)

Procedia PDF Downloads 269
2417 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis

Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya

Abstract:

In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, i.e., non-small cell lung carcinoma (NSCLC). In particular, we introduced a texture analysis approach, Laws' texture filters, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated. The number of patients for each tumor stage, i.e., I-II, III or IV, was 14. The patients had ~45% adenocarcinoma (ADC) and ~55% squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed in the extraction of 51 features using first order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws' texture filters. The feature selection method employed was sequential forward selection (SFS). The selected textural features were used in automatic classification by k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with SVM (one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with SVM (one vs. one). Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and in the automatic classification of tumor stage and subtype.
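
The study was implemented in MATLAB; purely as an illustration of the GLCM-feature-plus-SVM step in Python (toy random regions of interest, not the 51-feature FOS/GLCM/GLRLM/Laws pipeline or the SFS selection used by the authors), one could write:

```python
# Sketch of GLCM texture features + SVM classification of PET tumor ROIs.
# ROIs and labels below are toy random data; this is not the authors' MATLAB pipeline.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # 'greycomatrix' in older scikit-image
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def glcm_features(roi_8bit):
    glcm = graycomatrix(roi_8bit, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# rois: list of uint8 2D arrays; labels: 0 = ADC, 1 = SqCC (toy data)
rng = np.random.default_rng(0)
rois = [rng.integers(0, 256, (32, 32), dtype=np.uint8) for _ in range(40)]
labels = np.repeat([0, 1], 20)

X = np.array([glcm_features(r) for r in rois])
clf = SVC(kernel="rbf", decision_function_shape="ovo")   # one-vs-one, as in the abstract
print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```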

Keywords: cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis

Procedia PDF Downloads 308
2416 Structuring of Multilayer Aluminum Nickel by Lift-off Process Using Cheap Negative Resist

Authors: Muhammad Talal Asghar

Abstract:

The photoresist lift-off technique for metal patterning in integrated circuit (IC) packaging has been widely utilized in the fields of microelectromechanical systems and semiconductor component manufacturing. Its main advantages lie in cost savings, reduced complexity, and the maturity of the process. The selection of a photoresist depends upon many factors, such as cost, resist thickness, and convenient extraction of useful process parameters. In the present study, an extremely cheap 38-micrometer-thick dry film photoresist, E8015, is processed for edge profiling for the first time, to the best of the author's knowledge. A useful parameter range for resist processing is successfully extracted. An undercut angle of 66 to 73 degrees is realized by varying parameters such as exposure energy and development time. Finally, a 10-micrometer-thick aluminum-nickel metallic multilayer is lifted off on a plain silicon wafer. Possible applications lie in controlled self-propagating reactions within structured metallic multilayers, which may be utilized for IC packaging in the future.

Keywords: lift-off, IC packaging, photoresist, multilayer

Procedia PDF Downloads 198
2415 Performance Evaluation and Comparison between the Empirical Mode Decomposition, Wavelet Analysis, and Singular Spectrum Analysis Applied to the Time Series Analysis in Atmospheric Science

Authors: Olivier Delage, Hassan Bencherif, Alain Bourdier

Abstract:

Signal decomposition approaches represent an important step in time series analysis, providing useful knowledge and insight into the data and the characteristics of the underlying dynamics, while also facilitating tasks such as noise removal and feature extraction. As most observational time series are nonlinear and nonstationary, resulting from the interaction of several physical processes at different time scales, experimental time series exhibit fluctuations at all time scales and require the development of specific signal decomposition techniques. The most commonly used techniques are data-driven, enabling well-behaved signal components to be obtained without making any prior assumptions about the input data. Among the most popular time series decomposition techniques, and the most cited in the literature, are the empirical mode decomposition and its variants, the empirical wavelet transform and singular spectrum analysis. With the increasing popularity and utility of these methods in wide-ranging applications, it is imperative to gain a good understanding of and insight into the operation of these algorithms. In this work, we describe all of the techniques mentioned above as well as their ability to denoise signals, capture trends, identify components corresponding to the physical processes involved in the evolution of the observed system, and deduce the dimensionality of the underlying dynamics. Results obtained with all of these methods on experimental total ozone column and rainfall time series will be discussed and compared.
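
As a minimal, self-contained sketch of one of the techniques discussed, basic singular spectrum analysis (embedding into a trajectory matrix, SVD, and diagonal averaging back into elementary series), the code below uses a synthetic signal and is illustrative only, not the analysis applied to the ozone and rainfall data:

```python
# Minimal singular spectrum analysis (SSA) sketch: embed, decompose by SVD, and
# reconstruct elementary components by anti-diagonal averaging. Illustrative only.
import numpy as np

def ssa(x, L):
    N = len(x)
    K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])       # L x K trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for i in range(len(s)):
        Xi = s[i] * np.outer(U[:, i], Vt[i])                  # rank-1 elementary matrix
        # Anti-diagonal averaging back to a length-N series
        comp = np.array([np.mean(Xi[::-1].diagonal(k)) for k in range(-L + 1, K)])
        comps.append(comp)
    return np.array(comps)

t = np.arange(200)
signal = (np.sin(2 * np.pi * t / 25) + 0.02 * t
          + 0.3 * np.random.default_rng(1).normal(size=200))  # oscillation + trend + noise
components = ssa(signal, L=50)
trend_plus_cycle = components[:3].sum(axis=0)                 # leading components ~ trend + oscillation
print(components.shape)                                       # (50, 200)
```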

Keywords: denoising, empirical mode decomposition, singular spectrum analysis, time series, underlying dynamics, wavelet analysis

Procedia PDF Downloads 89
2414 The Comparison of Depression Level of Male Athlete Students with Non-Athlete Students

Authors: Seyed Hossein Alavi, Farshad Ghazalian, Soghra Jamshidi

Abstract:

The present study was conducted with the purpose of examining mental health and, more specifically, describing and comparing the depression levels of athlete and non-athlete male students in the 2012 educational year. In keeping with the topic, the research method is descriptive and causal-comparative. The research sample of 500 students was selected randomly from B.A. students of different fields. The mean age of the research sample was between 20 and 25 years. The data collection tool was Aaron Beck's Depression Inventory (BDI), which analyzes and measures 21 aspects of depression across 6 ranges. Statistical analysis of the data was performed with SPSS software. Comparison of the mean depression levels shows that the research hypothesis (H1), based on the existence of a significant difference, was supported: there is a significant difference between the depression level of athlete male students and that of non-athlete male students. Thus, the depression level of athlete male students was lower than that of non-athlete male students.

Keywords: depression, athlete students, non-athlete students

Procedia PDF Downloads 456
2413 Drug-Drug Interaction Prediction in Diabetes Mellitus

Authors: Rashini Maduka, C. R. Wijesinghe, A. R. Weerasinghe

Abstract:

Drug-drug interactions (DDIs) can happen when two or more drugs are taken together. Today, DDIs have become a serious health issue due to adverse drug effects. In vivo and in vitro methods for identifying DDIs are time-consuming and costly; therefore, in silico approaches are preferred in DDI identification. Most machine learning models for DDI prediction use chemical and biological drug properties as features. However, some drug features are not available and are costly to extract, so it is better to perform automatic feature engineering. Furthermore, people who have diabetes often suffer from other diseases and take more than one medicine together, so adverse drug effects may occur in diabetic patients and cause unpleasant reactions in the body. In this study, we present a model with a graph convolutional autoencoder and a graph decoder, using a dataset from DrugBank version 5.1.3. The main objective of the model is to identify unknown interactions between antidiabetic drugs and the drugs taken by diabetic patients for other diseases. We rely on automatic feature engineering and use known DDIs only as the input for the model. Our model achieved an AUC of 0.86 and an AP of 0.86.
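
A minimal sketch of a graph convolutional encoder with an inner-product decoder for DDI link prediction, in the spirit of a graph autoencoder, is shown below; it uses a toy random DDI graph and is not the authors' exact architecture or their DrugBank 5.1.3 preprocessing:

```python
# Sketch of a graph convolutional encoder + inner-product decoder for DDI link prediction.
# Illustrative only: toy random adjacency, not the authors' model or DrugBank data.
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    def __init__(self, in_dim, out_dim, act=True):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)
        self.act = act

    def forward(self, A_hat, H):             # A_hat: normalized adjacency with self-loops
        H = self.lin(A_hat @ H)
        return torch.relu(H) if self.act else H

class GraphAutoEncoder(nn.Module):
    def __init__(self, n_drugs, hidden=64, emb=32):
        super().__init__()
        self.feat = nn.Parameter(torch.randn(n_drugs, hidden))      # learned node features
        self.gc1 = GCNLayer(hidden, hidden)
        self.gc2 = GCNLayer(hidden, emb, act=False)

    def forward(self, A_hat):
        Z = self.gc2(A_hat, self.gc1(A_hat, self.feat))
        return torch.sigmoid(Z @ Z.T)         # decoder: predicted DDI probabilities

n = 100
A = (torch.rand(n, n) < 0.05).float(); A = ((A + A.T) > 0).float()  # toy symmetric DDI graph
A_tilde = A + torch.eye(n)
D_inv_sqrt = torch.diag(A_tilde.sum(1).pow(-0.5))
A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt

model = GraphAutoEncoder(n)
pred = model(A_hat)                           # train with BCE against known DDIs (not shown)
print(pred.shape)                             # torch.Size([100, 100])
```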

Keywords: drug-drug interaction prediction, graph embedding, graph convolutional networks, adverse drug effects

Procedia PDF Downloads 81
2412 Detection of Abnormal Process Behavior in Copper Solvent Extraction by Principal Component Analysis

Authors: Kirill Filianin, Satu-Pia Reinikainen, Tuomo Sainio

Abstract:

Frequent measurements of product stream quality create a data overload that becomes more and more difficult to handle. In the current study, plant history data with multiple variables were successfully treated by principal component analysis to detect abnormal process behavior, particularly in copper solvent extraction. The multivariate model is based on the concentration levels of the main process metals recorded by the industrial on-stream X-ray fluorescence analyzer. After mean-centering and normalization of the concentration data set, a two-dimensional multivariate model based on the principal component analysis algorithm was constructed. Normal operating conditions were defined through control limits assigned to squared score values on the x-axis and to residual values on the y-axis. Eighty percent of the data set was taken as the training set, and the multivariate model was tested with the remaining 20 percent of the data. Model testing showed successful application of the control limits to detect abnormal behavior of the copper solvent extraction process as early warnings. Compared to the conventional technique of analyzing one variable at a time, the proposed model allows on-line detection of a process failure using information from all process variables simultaneously. Complex industrial equipment combined with advanced mathematical tools may be used for on-line monitoring of both process stream composition and final product quality. Defining the normal operating conditions of the process supports reliable decision-making in the process control room. Thus, industrial X-ray fluorescence analyzers equipped with an integrated data processing toolbox allow more flexibility in copper plant operation. Additional multivariate process control and monitoring procedures are recommended to be applied separately for the major components and for the impurities. Principal component analysis may be utilized not only for controlling the content of major elements in process streams, but also for continuous monitoring of the plant feed. The proposed approach has potential in on-line instrumentation, providing a fast, robust and cheap application with automation capabilities.
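
A minimal sketch of this kind of PCA-based monitoring, scores limited by a Hotelling's T² statistic and residuals by a Q (SPE) statistic, with empirical control limits from the 80% training set, is given below; the data are synthetic stand-ins, not the plant history data:

```python
# Sketch of PCA-based process monitoring with Hotelling's T^2 (score) and Q (residual)
# statistics and empirical control limits. Synthetic data stand in for on-stream XRF values.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))              # stand-in for metal concentration measurements
X_train, X_test = X[:800], X[800:]          # 80/20 split, as in the abstract

scaler = StandardScaler().fit(X_train)
pca = PCA(n_components=2).fit(scaler.transform(X_train))

def t2_and_q(pca, scaler, X):
    Xs = scaler.transform(X)
    scores = pca.transform(Xs)
    t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)     # Hotelling's T^2
    resid = Xs - pca.inverse_transform(scores)
    q = np.sum(resid**2, axis=1)                                  # Q / SPE statistic
    return t2, q

t2_tr, q_tr = t2_and_q(pca, scaler, X_train)
t2_lim, q_lim = np.percentile(t2_tr, 99), np.percentile(q_tr, 99)  # empirical 99% limits

t2_te, q_te = t2_and_q(pca, scaler, X_test)
alarms = (t2_te > t2_lim) | (q_te > q_lim)                         # early-warning flags
print("flagged samples:", alarms.sum(), "of", len(X_test))
```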

Keywords: abnormal process behavior, failure detection, principal component analysis, solvent extraction

Procedia PDF Downloads 293
2411 Multiple Fusion Based Single Image Dehazing

Authors: Joe Amalraj, M. Arunkumar

Abstract:

Haze is an atmospheric phenomenon that significantly degrades the visibility of outdoor scenes, mainly due to atmospheric particles that absorb and scatter light. This paper introduces a novel single-image approach that enhances the visibility of such degraded images. The method is a fusion-based strategy that derives two inputs from the original hazy image by applying a white balance and a contrast-enhancing procedure. To blend the information of the derived inputs effectively and preserve the regions with good visibility, we filter their important features by computing three measures (weight maps): luminance, chromaticity, and saliency. To minimize artifacts introduced by the weight maps, our approach is designed in a multiscale fashion, using a Laplacian pyramid representation. This paper demonstrates the utility and effectiveness of a fusion-based technique for dehazing based on a single degraded image. The method performs in a per-pixel fashion, which is straightforward to implement. The experimental results demonstrate that the method yields results comparable to, and even better than, more complex state-of-the-art techniques, with the advantage of being appropriate for real-time applications.

Keywords: single image de-hazing, outdoor images, enhancing, DSP

Procedia PDF Downloads 394
2410 Sequential Pulsed Electric Field and Ultrasound Assisted Extraction of Bioactive Enriched Fractions from Button Mushroom Stalks

Authors: Bibha Kumari, Nigel P. Brunton, Dilip K. Rai, Brijesh K. Tiwari

Abstract:

Edible mushrooms possess numerous functional components, such as homo- and hetero-β-glucans [β(1→3), β(1→4) and β(1→6) glucosidic linkages], chitins, ergosterols, and bioactive polysaccharides and peptides, imparting health-beneficial properties to mushrooms. Some of the proven biological activities of mushroom extracts are antioxidant, antimicrobial and immunomodulatory activity, cholesterol-lowering activity through inhibition of a key cholesterol metabolism enzyme, 3-hydroxy-3-methyl-glutaryl CoA reductase (HMGCR), and angiotensin I-converting enzyme (ACE) inhibition. Application of novel extraction technologies like pulsed electric field (PEF) and high-power ultrasound offers clean, green, faster and more efficient extraction alternatives with enhanced, good-quality extracts. Sequential PEF followed by ultrasound assisted extraction (UAE) was applied to recover bioactive-enriched fractions from industrial white button mushroom (Agaricus bisporus) stalk waste using environmentally friendly and GRAS solvents, i.e., water and water/ethanol combinations. The PEF treatment was carried out at 60% output voltage and 2 Hz frequency for 500 pulses of 20 microseconds pulse width, using a KCl salt solution of 0.6 mS/cm conductivity, by placing 35 g of chopped fresh mushroom stalks and 25 g of salt solution in the 4x4x4 cm³ treatment chamber. Sequential UAE was carried out on the PEF pre-treated samples using an ultrasonic water bath (USB) at three frequencies (25 kHz, 35 kHz and 45 kHz) for various treatment times (15-120 min) at 80°C. Individual treatments using either PEF or UAE were also investigated to compare the effect of each treatment alone with the combined effect on the recovery and bioactivity of the crude extracts. The freeze-dried mushroom stalk powder was characterised for proximate compositional parameters (dry weight basis), showing 64.11% total carbohydrate, 19.12% total protein, 7.21% total fat, 31.2% total dietary fiber, 7.9% chitin (as glucosamine equivalent) and 1.02% β-glucan content. The total phenolic contents (TPC) were determined by the Folin-Ciocalteu procedure and expressed as gallic acid equivalents (GAE). The antioxidant properties were ascertained using DPPH and FRAP assays and expressed as trolox equivalents (TE). HMGCR activity and the molecular mass of β-glucans will be measured using the commercial HMG-CoA Reductase Assay kit (Sigma-Aldrich) and size exclusion chromatography (HPLC-SEC), respectively. The effects of PEF, UAE and their combination on the antioxidant capacity, HMGCR inhibition and β-glucan content will be presented.

Keywords: β-glucan, mushroom stalks, pulsed electric field (PEF), ultrasound assisted extraction (UAE)

Procedia PDF Downloads 275
2409 Finding Related Scientific Documents Using Formal Concept Analysis

Authors: Nadeem Akhtar, Hira Javed

Abstract:

An important aspect of research is the literature survey. The availability of a large amount of literature across different domains triggers the need for optimized systems that provide relevant literature to researchers. We propose a keyword-based search system for text documents. This experimental approach provides a hierarchical structure to the document corpus. The documents are labelled with keywords using KEA (Keyword Extraction Algorithm) and are automatically organized in a lattice structure using Formal Concept Analysis (FCA), which groups semantically related documents together. The keyword-based hierarchical structure returns only those documents that precisely contain the queried keywords. This approach opens doors for multi-domain research: documents across multiple domains that are indexed by similar keywords are grouped together, and a hierarchical relationship between keywords is obtained. To demonstrate the effectiveness of the approach, we carried out experiments and evaluation on the SemEval-2010 dataset. The results show that the presented method is considerably successful in indexing scientific papers.

Keywords: formal concept analysis, keyword extraction algorithm, scientific documents, lattice

Procedia PDF Downloads 314
2408 Microwave Assisted Extractive Desulfurization of Gas Oil Feedstock

Authors: Hamida Y. Mostafa, Ghada E. Khedr, Dina M. Abd El-Aty

Abstract:

The removal of sulfur compounds from petroleum fractions is a critical component of environmental protection demands. Solvent extraction, oxidative desulfurization, and hydrotreatment techniques have traditionally been used as removal processes. While all of these methods are capable of eliminating sulfur compounds at moderate rates, they have some limitations. A major problem with these routes is their high running expenses, which are caused by their prolonged operation times and high energy consumption. Therefore, new methods for removing sulfur are still necessary. In the current study, a simple assisted desulfurization system for a gas oil fraction has been successfully developed using acetonitrile and methanol as solvents under microwave irradiation. The key variables affecting sulfur removal have been studied, including microwave power, irradiation time, and solvent to gas oil volume ratio. The presented research found promising results: the microwave-assisted extractive desulfurization method removed sulfur with a high degree of efficiency under suitable conditions.

Keywords: extractive desulfurization, microwave assisted extraction, petroleum fractions, acetonitrile and methanol

Procedia PDF Downloads 83
2407 Multiclass Analysis of Pharmaceuticals in Fish and Shrimp Tissues by High-Performance Liquid Chromatography-Tandem Mass Spectrometry

Authors: Reza Pashaei, Reda Dzingelevičienė

Abstract:

An efficient, reliable, and sensitive multiclass analytical method has been developed to simultaneously determine 15 human pharmaceutical residues in fish and shrimp tissue samples by ultra-high-performance liquid chromatography-tandem mass spectrometry. The investigated compounds comprise ten classes, namely analgesics, antibacterials, anticonvulsants, cardiovascular drugs, fluoroquinolones, macrolides, nonsteroidal anti-inflammatories, penicillins, stimulants, and sulfonamides. A simple liquid extraction procedure based on 0.1% formic acid in methanol was developed. Chromatographic conditions were optimized with a mobile phase of 0.1% ammonium acetate (A) and acetonitrile (B) under the following gradient: 0–2 min, 15% B; 2–5 min, linear to 95% B; 5–10 min, 95% B; 10–12 min. Limits of detection and quantification ranged from 0.017 to 1.371 μg/kg and 0.051 to 4.113 μg/kg, respectively. Finally, amoxicillin, azithromycin, caffeine, carbamazepine, ciprofloxacin, clarithromycin, diclofenac, erythromycin, furosemide, ibuprofen, ketoprofen, naproxen, sulfamethoxazole, tetracycline, and triclosan were quantifiable in fish and shrimp samples.

Keywords: fish, liquid chromatography, mass spectrometry, pharmaceuticals, shrimp, solid-phase extraction

Procedia PDF Downloads 239
2406 An Improved Tracking Approach Using Particle Filter and Background Subtraction

Authors: Amir Mukhtar, Likun Xia

Abstract:

An improved, robust and efficient visual target tracking algorithm using particle filtering is proposed. Particle filtering has proven very successful in estimating non-Gaussian and non-linear problems. In this paper, the particle filter is used with a color feature to estimate the target state over time. Color distributions are applied because this feature is scale- and rotation-invariant, shows robustness to partial occlusion, and is computationally efficient. The performance is made more robust by choosing the YIQ color scheme. Tracking is performed by comparing the chrominance histograms of the target and candidate positions (particles). Color-based particle filter tracking often leads to inaccurate results when the light intensity changes during a video stream. Furthermore, a background subtraction technique is used for size estimation of the target. A qualitative evaluation of the proposed algorithm is performed on several real-world videos. The experimental results demonstrate that the improved algorithm can track moving objects very well under illumination changes, occlusion and moving backgrounds.
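
A compact sketch of the core loop of color-histogram particle filter tracking (predict with a random-walk motion model, weight by histogram similarity via the Bhattacharyya coefficient, resample) is shown below; it works in RGB on a synthetic frame and omits the YIQ conversion and background subtraction steps, so it is an illustration rather than the authors' full system:

```python
# Compact sketch of color-histogram particle filter tracking: predict, weight by
# Bhattacharyya coefficient between color histograms, and resample. Synthetic frame only.
import numpy as np

def patch_hist(frame, cx, cy, half=15, bins=8):
    h, w, _ = frame.shape
    x0, x1 = max(0, cx - half), min(w, cx + half)
    y0, y1 = max(0, cy - half), min(h, cy + half)
    patch = frame[y0:y1, x0:x1].reshape(-1, 3)
    hist, _ = np.histogramdd(patch, bins=(bins, bins, bins), range=[(0, 256)] * 3)
    return hist.ravel() / (hist.sum() + 1e-9)

def track_step(frame, particles, target_hist, rng):
    particles = particles + rng.normal(0, 5, particles.shape)            # motion model
    weights = np.array([np.sum(np.sqrt(patch_hist(frame, int(x), int(y)) * target_hist))
                        for x, y in particles]) + 1e-12                  # Bhattacharyya coeff.
    weights /= weights.sum()
    estimate = (particles * weights[:, None]).sum(axis=0)                # weighted mean state
    idx = rng.choice(len(particles), size=len(particles), p=weights)     # resampling
    return particles[idx], estimate

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (240, 320, 3)).astype(float)
target_hist = patch_hist(frame, 160, 120)
particles = np.tile([160.0, 120.0], (100, 1)) + rng.normal(0, 10, (100, 2))
particles, estimate = track_step(frame, particles, target_hist, rng)
print("estimated position:", estimate.round(1))
```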

Keywords: tracking, particle filter, histogram, corner points, occlusion, illumination

Procedia PDF Downloads 362
2405 The Convolution Recurrent Network of Using Residual LSTM to Process the Output of the Downsampling for Monaural Speech Enhancement

Authors: Shibo Wei, Ting Jiang

Abstract:

Convolutional recurrent neural networks (CRN) have recently achieved much success in the speech enhancement field. The common processing method is to use convolution layers to compress the feature space through multiple downsampling steps and then model the compressed features with an LSTM layer. Finally, the enhanced speech is obtained by deconvolution operations that integrate the global information of the speech sequence. However, the feature space compression process may cause a loss of information, so we propose to model the downsampling result of each step with a residual LSTM layer, then join it with the output of the corresponding deconvolution layer and feed them into the next deconvolution layer. In this way, we aim to better integrate the global information of the speech sequence. The experimental results show that the network model we introduce (RES-CRN) can achieve better performance than using LSTM without residual connections or simply stacking LSTM layers in the original CRN, in terms of scale-invariant signal-to-distortion ratio (SI-SNR), speech quality (PESQ), and intelligibility (STOI).

Keywords: convolutional-recurrent neural networks, speech enhancement, residual LSTM, SI-SNR

Procedia PDF Downloads 181
2404 Optimal Configuration for Polarimetric Surface Plasmon Resonance Sensors

Authors: Ibrahim Watad, Ibrahim Abdulhalim

Abstract:

Conventional spectroscopic surface plasmon resonance (SPR) sensors are widely used, both in fundamental research and in environmental monitoring as well as healthcare diagnostics. However, they still lack a low limit of detection (LOD), and there is still room for improvement. Conventional SPR sensors are based on the detection of a dip in the reflectivity spectrum, which is relatively wide. To improve the performance of these sensors, many techniques and methods have been proposed, either to reduce the width of the dip or to increase the sensitivity. In addition, profiting from the sharp jump in the phase spectrum under SPR, several works have suggested extracting the phase of the reflected wave. However, existing phase measurement setups are in general more complicated than conventional setups, require more stability, and are very sensitive to external vibrations and noise. In this study, a simple polarimetric technique for phase extraction under SPR is presented, followed by a theoretical error analysis and an experimental verification. The advantages of the proposed technique over existing techniques will be elaborated, together with conclusions regarding the best polarimetric function and its corresponding optimal range of metal layer thicknesses for use with the conventional Kretschmann-Raether configuration.

Keywords: plasmonics, polarimetry, thin films, optical sensors

Procedia PDF Downloads 387