Search results for: feature extraction method for tremor classification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 22384

21034 A Scientific Method of Drug Development Based on Ayurvedic Bhaishajya Knowledge

Authors: Rajesh S. Mony, Vaidyaratnam Oushadhasala

Abstract:

An attempt is made in this study to evolve a drug development modality based on the classical Ayurvedic knowledge base as well as on modern scientific methodology. The study involves (a) identification of a specific ailment condition, (b) selection of a polyherbal formulation, (c) deciding a suitable extraction procedure, (d) confirming the efficacy of the combination by in-vitro trials and (e) fixing the recommended dose. The ailment segment selected is the arthritic condition. The selected herbal combination comprises Kunturushka, Vibhitaki, Guggulu, Haridra, Maricha and Nirgundi. The herbs were selected as per classical Ayurvedic references and authenticated as per the Ayurvedic Pharmacopeia of India (API). Each drug was extracted with hydroalcoholic menstruums of different ratios. After removal of residual solvent, each extract was assessed in vitro for anti-inflammatory and anti-arthritic activities and for COX enzyme inhibition (by UV-Vis spectrophotometry with a positive control). The extracts showing good in-vitro activity were selected, and QC testing of each selected extract, including HPTLC, was performed to establish the in-process QC specifications. A single dose of the mixture of selected extracts was decided based on the level of in-vitro activity and the available toxicology data. Major groups such as phenolics, flavonoids, alkaloids and bitters were quantified by standard spectrophotometric and gravimetric methods. A marker assay was developed and validated by HPTLC, and a well-resolved HPTLC fingerprint was developed for the single-dosage API (Active Pharmaceutical Ingredient, the mixture of extracts). Three batches were prepared to fix the in-process and API QC specifications.

Keywords: drug development, anti-inflammatory, quality standardisation, planar chromatography

Procedia PDF Downloads 99
21033 Attention-Based ResNet for Breast Cancer Classification

Authors: Abebe Mulugojam Negash, Yongbin Yu, Ekong Favour, Bekalu Nigus Dawit, Molla Woretaw Teshome, Aynalem Birtukan Yirga

Abstract:

Breast cancer remains a significant health concern, necessitating advancements in diagnostic methodologies. Addressing this, our paper confronts the notable challenges in breast cancer classification, particularly the imbalance in datasets and the constraints on the accuracy and interpretability of prevailing deep learning approaches. We propose an attention-based residual neural network (ResNet), which effectively combines the robust features of ResNet with an advanced attention mechanism. Enhanced through strategic data augmentation and positive weight adjustments, this approach specifically targets the issue of data imbalance. The proposed model was tested on the BreakHis dataset and achieved accuracies of 99.00%, 99.04%, 98.67%, and 98.08% at different magnifications (40X, 100X, 200X, and 400X), respectively. We evaluated the performance using different evaluation metrics, such as precision, recall, and F1-score, and made comparisons with other state-of-the-art methods. Our experiments demonstrate that the proposed model outperforms existing approaches, achieving higher accuracy in breast cancer classification.
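
The abstract does not give implementation details, so the following minimal sketch only illustrates one plausible reading of the approach: a squeeze-and-excitation style channel-attention block placed on top of a ResNet backbone, trained with class weighting to counter the dataset imbalance (the backbone size, reduction ratio and class weights below are assumptions, not values from the paper).

```python
import torch
import torch.nn as nn
from torchvision import models

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (one plausible choice)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                       # x: (B, C, H, W)
        w = x.mean(dim=(2, 3))                  # global average pooling -> (B, C)
        w = self.fc(w).unsqueeze(-1).unsqueeze(-1)
        return x * w                            # re-weight feature maps channel-wise

class AttentionResNet(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        backbone = models.resnet18()            # smaller stand-in for the paper's ResNet
        self.features = nn.Sequential(*list(backbone.children())[:-2])
        self.attention = ChannelAttention(512)
        self.head = nn.Linear(512, num_classes)

    def forward(self, x):
        f = self.attention(self.features(x))
        return self.head(f.mean(dim=(2, 3)))

# Positive-class weighting to counter dataset imbalance (weight value is illustrative).
criterion = nn.CrossEntropyLoss(weight=torch.tensor([1.0, 2.5]))
model = AttentionResNet()
logits = model(torch.randn(4, 3, 224, 224))
loss = criterion(logits, torch.tensor([0, 1, 1, 0]))
```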

Keywords: residual neural network, attention mechanism, positive weight, data augmentation

Procedia PDF Downloads 102
21032 Quantification of Polychlorinated Biphenyls (PCBs) in Soil Samples of Electrical Power Substations from Different Cities in Nigeria

Authors: Omasan Urhie Urhie, Adenipekun C. O, Eke W., Ogwu K., Erinle K. O

Abstract:

Polychlorinated biphenyls (PCBs) are persistent organic pollutants (POPs) that are very toxic; they possess the ability to accumulate in soil and in human tissues, resulting in health issues such as birth defects, reproductive disorders and cancer. The air is polluted by PCBs through volatilization and dispersion; they also contaminate soil and sediments and are not easily degraded. Soil samples were collected from a depth of 0-15 cm at three substations (Warri, Ughelli and Ibadan) of the Power Holding Company of Nigeria (PHCN) where old transformers were dumped. Extraction and cleanup of the soil samples were conducted using Accelerated Solvent Extraction (ASE) with Pressurized Liquid Extraction (PLE). The concentration of PCBs was determined using gas chromatography/mass spectrometry (GC/MS). Mean total PCB concentrations in the soil samples increased in the order Ughelli < Ibadan < Warri: 2.457757 ppm at the Ughelli substation, 4.198926 ppm at the Ibadan substation and 14.05065 ppm at the Warri substation. In the Warri samples, PCB-167 was the most abundant at about 30% (4.28086 ppm) of the total PCB concentration (14.05065 ppm), followed by PCB-157 at about 20% (2.77871 ppm). Of the total PCBs in the Ughelli and Ibadan samples, PCB-156 was the most abundant at about 44% and 40%, respectively. This study provides a baseline report on the presence of PCBs in the vicinity of abandoned electrical power facilities in different cities in Nigeria.

Keywords: polychlorinated biphenyls, persistent organic pollutants, soil, transformer

Procedia PDF Downloads 139
21031 An Efficient Machine Learning Model to Detect Metastatic Cancer in Pathology Scans Using Principal Component Analysis Algorithm, Genetic Algorithm, and Classification Algorithms

Authors: Bliss Singhal

Abstract:

Machine learning (ML) is a branch of artificial intelligence (AI) in which computers analyze data and find patterns in the data. The study focuses on the detection of metastatic cancer using ML. Metastatic cancer is the stage at which cancer has spread to other parts of the body and is the cause of approximately 90% of cancer-related deaths. Normally, pathologists spend hours each day manually classifying whether tumors are benign or malignant. This tedious task contributes to metastases being mislabeled over 60% of the time and emphasizes the importance of being aware of human error and other inefficiencies. ML is a good candidate for improving the correct identification of metastatic cancer, potentially saving thousands of lives, and can also improve the speed and efficiency of the process, thereby requiring fewer resources and less time. So far, the deep learning methodology of AI has been used in research to detect cancer. This study is a novel approach to determining the potential of combining preprocessing algorithms with classification algorithms in detecting metastatic cancer. The study used two preprocessing algorithms, principal component analysis (PCA) and the genetic algorithm, to reduce the dimensionality of the dataset, and then used three classification algorithms, logistic regression, decision tree classifier, and k-nearest neighbors, to detect metastatic cancer in the pathology scans. The highest accuracy of 71.14% was produced by the ML pipeline comprising PCA, the genetic algorithm, and the k-nearest neighbor algorithm, suggesting that preprocessing and classification algorithms have great potential for detecting metastatic cancer.
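
As a hedged illustration of the pipeline described above, the sketch below wires PCA, a small genetic algorithm over binary component masks, and a k-nearest-neighbors classifier together on synthetic data; the GA settings, population size and dataset are placeholders, since the abstract does not specify them.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=200, n_informative=20, random_state=0)
Z = PCA(n_components=40, random_state=0).fit_transform(X)    # step 1: PCA

def fitness(mask):                                            # CV accuracy of a component subset
    if mask.sum() == 0:
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(knn, Z[:, mask.astype(bool)], y, cv=3).mean()

# Step 2: a minimal genetic algorithm over binary masks of PCA components.
pop = rng.integers(0, 2, size=(20, Z.shape[1]))
for _ in range(15):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]                   # keep the best half
    children = []
    for _ in range(10):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        cut = rng.integers(1, Z.shape[1])
        child = np.concatenate([a[:cut], b[cut:]])            # one-point crossover
        flip = rng.random(Z.shape[1]) < 0.05                  # mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.vstack([parents, children])

# Step 3: k-nearest neighbors on the selected components.
best = pop[np.argmax([fitness(m) for m in pop])]
print("selected components:", best.sum(), "CV accuracy:", fitness(best))
```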

Keywords: breast cancer, principal component analysis, genetic algorithm, k-nearest neighbors, decision tree classifier, logistic regression

Procedia PDF Downloads 82
21030 Filling the Gap of Extraction of Digital Evidence from Emerging Platforms Without Forensics Tools

Authors: Yi Anson Lam, Siu Ming Yiu, Kam Pui Chow

Abstract:

Digital evidence has been tendered to courts at an exponential rate in recent years. As an industry practice, most digital evidence is extracted and preserved using specialized and well-accepted forensics tools. On the other hand, advances in technology have enabled the creation of quite a few emerging platforms such as Telegram and Signal. Existing (well-accepted) forensics tools were not designed to extract evidence from these emerging platforms. Since new forensics tools require a significant amount of time and effort to be developed and verified, this paper addresses how to fill this gap using quick-fix alternative methods for digital evidence collection (e.g., based on APIs provided by the apps) and discusses issues related to the admissibility of such evidence in court, with support from international courts' stances and the circumstances under which digital evidence collected with these proposed alternatives has been accepted.

Keywords: extraction, digital evidence, laws, investigation

Procedia PDF Downloads 68
21029 Pretreatment of Cattail (Typha domingensis) Fibers to Obtain Cellulose Nanocrystals

Authors: Marivane Turim Koschevic, Maycon dos Santos, Marcello Lima Bertuci, Farayde Matta Fakhouri, Silvia Maria Martelli

Abstract:

Natural fibers are raw materials rich in cellulose and abundant worldwide, and their use for the extraction of cellulose nanocrystals is promising; one example is cattail, a macrophyte weed native to South America. This study deals with the pre-treatment of crushed cattail fibers by six different mercerization methods, followed by bleaching. The positive effects of the treatments on the fibers were confirmed by optical microscopy and Fourier transform infrared spectroscopy (FTIR). The sample selected for future cellulose nanocrystal extraction tests was treated with 2.5% NaOH for 2 h at 60 °C in the first stage and with 30 vol H2O2 / 5% NaOH in a 30/70% (v/v) proportion for 1 hour at 60 °C, followed by treatment at 50/50% (v/v) for 15 minutes at 50 °C with the same solution constituents.

Keywords: cellulose nanocrystal, chemical treatment, mercerization, natural fibers

Procedia PDF Downloads 293
21028 Modeling and Simulation of Ship Structures Using Finite Element Method

Authors: Javid Iqbal, Zhu Shifan

Abstract:

The development of unconventional ship construction and the implementation of lightweight materials have given a strong impulse to the finite element (FE) method, making it a general tool for ship design. This paper briefly presents the modeling and analysis techniques of ship structures using the FE method for complex boundary conditions that are difficult to analyze by existing Ship Classification Societies' rules. During operation, all ships experience complex loading conditions. These loads are generally categorized into thermal, linear static, dynamic and non-linear loads. The general strength of the ship structure is analyzed using static FE analysis. The FE method is also suitable for considering the local loads generated by ballast tanks and cargo in addition to hydrostatic and hydrodynamic loads. Vibration analysis of a ship structure and its components can be performed using the FE method, which helps in obtaining the dynamic stability of the ship. The FE method has yielded better techniques for calculating natural frequencies and the different mode shapes of a ship structure to avoid resonance both globally and locally. There has been considerable progress towards ideal design in the ship industry over the past few years, achieved by employing the data stored in the FE model to solve complex engineering problems. This paper provides an overview of ship modeling methodology for FE analysis and its general application. Historical background, the basic concept of FE, and the advantages and disadvantages of FE analysis are also reported, along with examples related to hull strength and structural components.

Keywords: dynamic analysis, finite element methods, ship structure, vibration analysis

Procedia PDF Downloads 137
21027 Integrating Wound Location Data with Deep Learning for Improved Wound Classification

Authors: Mouli Banga, Chaya Ravindra

Abstract:

Wound classification is a crucial step in wound diagnosis. An effective classifier can aid wound specialists in identifying wound types with reduced financial and time investments, facilitating the determination of optimal treatment procedures. This study presents a deep neural network-based classifier that leverages wound images and their corresponding locations to categorize wounds into various classes, such as diabetic, pressure, surgical, and venous ulcers. By incorporating a developed body map, the process of tagging wound locations is significantly enhanced, providing healthcare specialists with a more efficient tool for wound analysis. We conducted a comparative analysis between two prominent convolutional neural network models, ResNet50 and MobileNetV2, utilizing a dataset of 730 images. Our findings reveal that ResNet50 outperforms MobileNetV2, achieving an accuracy of approximately 90%, compared to MobileNetV2’s 83%. This disparity highlights the superior capability of ResNet50 in the context of this dataset. The results underscore the potential of integrating deep learning with spatial data to improve the precision and efficiency of wound diagnosis, ultimately contributing to better patient outcomes and reducing healthcare costs.
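
The body-map encoding is not described in the abstract, so the following sketch simply assumes the wound location arrives as a one-hot vector that is concatenated with CNN image features before the classification head; the feature sizes and the number of location regions are illustrative assumptions.

```python
import torch
import torch.nn as nn
from torchvision import models

class WoundClassifier(nn.Module):
    def __init__(self, n_locations=20, n_classes=4):
        super().__init__()
        backbone = models.resnet50()                 # randomly initialised stand-in backbone
        backbone.fc = nn.Identity()                  # expose the 2048-d image feature vector
        self.backbone = backbone
        self.head = nn.Sequential(
            nn.Linear(2048 + n_locations, 256),      # fuse image features with location encoding
            nn.ReLU(),
            nn.Linear(256, n_classes),
        )

    def forward(self, image, location_onehot):
        feats = self.backbone(image)                               # (B, 2048)
        return self.head(torch.cat([feats, location_onehot], dim=1))

model = WoundClassifier()
img = torch.randn(2, 3, 224, 224)
loc = torch.zeros(2, 20); loc[0, 3] = loc[1, 7] = 1.0              # hypothetical body-map regions
print(model(img, loc).shape)                                       # -> torch.Size([2, 4])
```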

Keywords: wound classification, MobileNetV2, ResNet50, multimodal

Procedia PDF Downloads 32
21026 Numerical Investigation of Nanofluid Based Thermosyphon System

Authors: Kiran Kumar K., Ramesh Babu Bejjam, Atul Najan

Abstract:

A thermosyphon system is a heat transfer loop that operates on the basis of gravity and buoyancy forces. It guarantees good reliability and low maintenance cost, as it does not involve any mechanical pump. Therefore, it can be used in many industrial applications such as refrigeration and air conditioning, electronic cooling, nuclear reactors, geothermal heat extraction, etc. However, flow instabilities and loop configuration are the major problems in this system. Several previous studies have reported that these instabilities can be suppressed by using nanofluids as the loop fluid. In the present study, a rectangular thermosyphon loop with end heat exchangers is considered. This configuration is appropriate for many practical applications such as solar water heaters, geothermal heat extraction, etc. In the present work, a steady-state analysis is carried out on a thermosyphon loop with parallel-flow coaxial heat exchangers at the heat source and heat sink. In this loop, nanofluid is considered as the loop fluid, and water is considered as the external fluid in both the hot and cold heat exchangers. For this analysis, a one-dimensional homogeneous model is developed, in which the conservation equations for mass, momentum and energy are discretized using the finite difference method. A computer code is written in MATLAB to simulate the flow in the thermosyphon loop. A comparison in terms of heat transfer is made between water and nanofluid as working fluids in the loop.
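
A heavily simplified sketch of the kind of one-dimensional, finite-difference energy balance described above is shown below; it assumes a prescribed mass flow rate, a uniform heat input over the heater section and a UA-type sink over the cooler section, whereas the full model also couples the momentum equation and buoyancy to determine the flow rate (all numerical values are placeholders).

```python
import numpy as np

N = 400                      # nodes around the loop
L = 4.0                      # loop length [m]
ds = L / N
m_dot, cp = 0.02, 4180.0     # mass flow rate [kg/s], specific heat [J/kg K] (water-like nanofluid)
q_heater = 800.0             # heat input per unit length [W/m] on the heater section
UA_per_m = 60.0              # cooler conductance per unit length [W/m K]
T_ext = 300.0                # external coolant temperature [K]

heater = np.zeros(N, bool); heater[: N // 4] = True          # first quarter of the loop
cooler = np.zeros(N, bool); cooler[N // 2 : 3 * N // 4] = True

T = np.full(N, 320.0)
for _ in range(200):                                          # iterate to a converged loop profile
    for i in range(N):
        source = q_heater if heater[i] else 0.0
        sink = UA_per_m * (T[i - 1] - T_ext) if cooler[i] else 0.0
        # upwind energy balance: m_dot*cp*(T_i - T_{i-1})/ds = source - sink
        T[i] = T[i - 1] + ds * (source - sink) / (m_dot * cp)

print(f"hot-leg max {T.max():.1f} K, cold-leg min {T.min():.1f} K")
```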

Keywords: heat exchanger, heat transfer, nanofluid, thermosyphon loop

Procedia PDF Downloads 477
21025 Ligandless Extraction and Determination of Trace Amounts of Lead in Pomegranate, Zucchini and Lettuce Samples after Dispersive Liquid-Liquid Microextraction with Ultrasonic Bath and Optimization of Extraction Condition with RSM Design

Authors: Fariba Tadayon, Elmira Hassanlou, Hasan Bagheri, Mostafa Jafarian

Abstract:

Heavy metals are released into water, plants, soil, and food by natural and human activities. Lead is toxic to the human body and may cause serious problems even at low concentrations, since it may have several adverse effects on humans. Therefore, determination of lead in different samples is an important procedure in studies of environmental pollution. In this work, an ultrasound-assisted, ionic-liquid-based dispersive liquid-liquid microextraction (UA-IL-DLLME) procedure for the determination of lead in zucchini, pomegranate, and lettuce has been established and developed using flame atomic absorption spectrometry (FAAS). For the UA-IL-DLLME procedure, 10 mL of the sample solution containing Pb2+ was adjusted to pH=5 in a glass test tube with a conical bottom; then, 120 μL of 1-hexyl-3-methylimidazolium hexafluorophosphate (CMIM)(PF6) was rapidly injected into the sample solution with a microsyringe. After that, the resulting cloudy mixture was treated ultrasonically for 5 min, the two phases were then separated by centrifugation for 5 min at 3000 rpm, the IL phase was diluted with 1 mL of ethanol, and the analytes were determined by FAAS. The effects of different experimental parameters in the extraction step, including ionic liquid volume, sonication time and pH, were studied and optimized simultaneously using Response Surface Methodology (RSM) employing a central composite design (CCD). The optimal conditions were determined to be an ionic liquid volume of 120 μL, a sonication time of 5 min, and pH=5. The linear range of the calibration curve for the determination of lead by FAAS was 0.1-4 ppm with R2=0.992. Under optimized conditions, the limit of detection (LOD) for lead was 0.062 μg.mL-1, the enrichment factor (EF) was 93, and the relative standard deviation (RSD) for lead was calculated as 2.29%. The levels of lead for pomegranate, zucchini, and lettuce were calculated as 2.88 μg.g-1, 1.54 μg.g-1, and 2.18 μg.g-1, respectively. Therefore, this method has been successfully applied for the analysis of the lead content in different food samples by FAAS.
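
The sketch below illustrates the RSM/CCD step only, fitting a full quadratic response surface over coded levels of the three optimized factors (ionic-liquid volume, sonication time, pH); the response values are synthetic stand-ins for the measured recoveries, so the fitted coefficients are not those of the study.

```python
import itertools
import numpy as np

# Coded central composite design: 8 factorial points, 6 axial points, 1 centre point.
factorial = np.array(list(itertools.product([-1, 1], repeat=3)), float)
axial = np.vstack([v * np.eye(3)[i] for i in range(3) for v in (-1.682, 1.682)])
design = np.vstack([factorial, axial, np.zeros((1, 3))])

def quadratic_terms(x):        # [1, x1, x2, x3, x1^2, x2^2, x3^2, x1x2, x1x3, x2x3]
    x1, x2, x3 = x
    return [1, x1, x2, x3, x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3]

X = np.array([quadratic_terms(row) for row in design])
rng = np.random.default_rng(1)
# Synthetic response standing in for the measured recovery at each design point.
y = 90 - 4*design[:, 0]**2 - 3*design[:, 1]**2 - 5*design[:, 2]**2 + rng.normal(0, 1, len(design))

beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # fitted response-surface coefficients
print("intercept %.1f, quadratic terms %s" % (beta[0], np.round(beta[4:7], 2)))
# The stationary point of the fitted surface gives the optimum coded settings, which are then
# mapped back to real units (e.g. 120 uL IL, 5 min sonication, pH 5 reported in the abstract).
```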

Keywords: dispersive liquid-liquid microextraction, central composite design, food samples, flame atomic absorption spectrometry

Procedia PDF Downloads 283
21024 A Machine Learning Approach for the Leakage Classification in the Hydraulic Final Test

Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter

Abstract:

The widespread use of machine learning applications in production is significantly accelerated by improved computing power and increasing data availability. Predictive quality enables the assurance of product quality by using machine learning models as a basis for decisions on test results. The use of real Bosch production data based on geometric gauge blocks from machining, mating data from assembly and hydraulic measurement data from final testing of directional valves is a promising approach to classifying the quality characteristics of workpieces.

Keywords: machine learning, classification, predictive quality, hydraulics, supervised learning

Procedia PDF Downloads 213
21023 Money Laundering and Governance in Cryptocurrencies: The Double-Edged Sword of Blockchain Technology

Authors: Jiaqi Yan, Yani Shi

Abstract:

With the growing popularity of bitcoin transactions, criminals have exploited bitcoin-like cryptocurrencies, and cybercrimes such as money laundering have thrived. Unlike traditional currencies, Internet-based virtual currencies can be used anonymously via the underpinning blockchain technology. In this paper, we analyze the double-edged sword features of blockchain technology in the context of money laundering. In particular, the traceability feature of blockchain-based systems facilitates a level of governance, while the decentralization feature of such systems may bring governing difficulties. Based on the analysis, we propose guidelines for policymakers in governing blockchain-based cryptocurrency systems.

Keywords: cryptocurrency, money laundering, blockchain, decentralization, traceability

Procedia PDF Downloads 202
21022 Green Synthesis of Magnetic, Silica Nanocomposite and Its Adsorptive Performance against Organochlorine Pesticides

Authors: Waleed A. El-Said, Dina M. Fouad, Mohamed H. Aly, Mohamed A. El-Gahami

Abstract:

Green synthesis of nanomaterials has received increasing attention as an eco-friendly technology in materials science. Here, we have used two types of green tea leaf extracts (i.e., total extract and tannin extract) as reducing agents in a rapid, simple, one-step synthesis of a mesoporous silica nanoparticle (MSNP)/iron oxide (Fe3O4) nanocomposite based on deposition of Fe3O4 onto MSNPs. The MSNP/Fe3O4 nanocomposite was characterized by X-ray diffraction, Fourier transform infrared spectroscopy, scanning electron microscopy, energy dispersive X-ray spectroscopy, vibrating sample magnetometry, N2 adsorption, and high-resolution transmission electron microscopy. The average mesoporous silica particle diameter was found to be around 30 nm, with a high surface area (818 m2/g). The MSNP/Fe3O4 nanocomposite was used for removing the pesticide lindane (an environmentally hazardous material) from aqueous solutions. Fourier transform infrared, UV-Vis, high-performance liquid chromatography and gas chromatography techniques were used to confirm the high ability of the MSNP/Fe3O4 nanocomposite to sense and capture lindane molecules with high sorption capacity (more than 89%), which could provide a new eco-friendly strategy for detecting and removing pesticides and a promising material for water treatment applications.

Keywords: green synthesis, mesoporous silica, magnetic iron oxide NPs, lindane adsorption

Procedia PDF Downloads 436
21021 Effect of Ethanol Concentration and Enzyme Pre-Treatment on Bioactive Compounds from Ginger Extract

Authors: S. Lekhavat, T. Kajsongkram, S. Sang-han

Abstract:

Dried ginger was extracted, and the effects of ethanol concentration and enzyme pre-treatment on its bioactive compounds were investigated in a solvent extraction process. Sliced fresh ginger was dried in an oven dryer at 70 °C for 24 hours and ground to a powder whose particle size was controlled by passing it through a 20-mesh sieve. In the enzyme pre-treatment process, the ginger powder was sprayed with 1% (w/w) cellulase and then incubated at 45 °C for 2 hours, followed by an extraction process using ethanol at concentrations of 0, 20, 40, 60 and 80% (v/v), respectively. The ratio of ginger powder to ethanol was 1:9, and the extraction conditions were controlled at 80 °C for 2 hours. Bioactive compounds extracted from ginger, such as total phenolic content (TPC), 6-gingerol (6G) and 6-shogaol (6S), as well as antioxidant activity (IC50 using the DPPH assay), were examined for both enzyme-treated and non-enzyme-treated samples. Regardless of enzyme treatment, the results showed that 60% ethanol provided the highest TPC (20.36 GAE mg/g dried ginger), 6G (0.77%) and 6S (0.036%) and the lowest IC50 (625 μg/ml) compared to the other ethanol ratios. Considering the effect of the enzyme on bioactive compounds and antioxidant activity, it was found that the enzyme-treated samples contained more 6G (0.17-0.77%) and 6S (0.020-0.036%) than the non-enzyme-treated samples (0.13-0.77% 6G, 0.015-0.036% 6S). However, the non-enzyme-treated extracts provided higher TPC (6.76-20.36 GAE mg/g dried ginger) and lower IC50 values (625-1494 μg/ml) than the enzyme-treated extracts (TPC 5.36-17.50 GAE mg/g dried ginger, IC50 793-2146 μg/ml).

Keywords: antioxidant activity, enzyme, extraction, ginger

Procedia PDF Downloads 256
21020 Web Search Engine Based Naming Procedure for Independent Topic

Authors: Takahiro Nishigaki, Takashi Onoda

Abstract:

In recent years, the amount of document data has been increasing with the spread of the Internet. Many methods have been studied for extracting topics from large document data. We proposed Independent Topic Analysis (ITA) to extract topics independent of each other from large document data such as newspaper data. ITA is a method for extracting independent topics from document data by using Independent Component Analysis. A topic extracted by ITA is represented by a set of words. However, such a set of words can be quite different from the topic the user imagines. For example, the top five words with high independence of a topic are as follows: Topic1 = {"scor", "game", "lead", "quarter", "rebound"}. This Topic 1 is considered to represent the topic of "SPORTS", but the topic name "SPORTS" has to be attached by the user; ITA cannot name topics. Therefore, in this research, we propose a method that uses a web search engine to obtain names that are easy for people to understand for the topics given by the sets of words produced by Independent Topic Analysis. In particular, we search for a topic's set of words, and the title of the top-ranked page in the search results is taken as the topic name. We also apply the proposed method to some data and verify its effectiveness.
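
A minimal sketch of the topic-extraction and query-building steps is given below, using FastICA on a TF-IDF term-document matrix of a public corpus; the web-search step is left as a placeholder comment because no specific search API is named in the abstract, and the corpus, component count and word count are assumptions.

```python
import numpy as np
from sklearn.datasets import fetch_20newsgroups
from sklearn.decomposition import FastICA
from sklearn.feature_extraction.text import TfidfVectorizer

# Downloads the 20 Newsgroups corpus on first use (stand-in for newspaper data).
docs = fetch_20newsgroups(subset="train", remove=("headers", "footers", "quotes")).data[:2000]
tfidf = TfidfVectorizer(max_features=3000, stop_words="english")
X = tfidf.fit_transform(docs).toarray()                 # documents x terms

ica = FastICA(n_components=10, random_state=0)
ica.fit(X)                                              # components_ : topics x terms
terms = np.array(tfidf.get_feature_names_out())

def topic_query(component, k=5):
    top = np.argsort(np.abs(component))[::-1][:k]       # k most strongly weighted words of the topic
    return " ".join(terms[top])

for i, comp in enumerate(ica.components_):
    query = topic_query(comp)
    # name = title_of_first_search_result(query)        # placeholder for the web-search naming step
    print(f"topic {i}: {query}")
```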

Keywords: independent topic analysis, topic extraction, topic naming, web search engine

Procedia PDF Downloads 119
21019 Development of a New Characterization Method to Analyse Cypermethrin Penetration in Wood Material by Immunolabelling

Authors: Sandra Tapin-Lingua, Katia Ruel, Jean-Paul Joseleau, Daouia Messaoudi, Olivier Fahy, Michel Petit-Conil

Abstract:

The preservative efficacy of organic biocides is strongly related to their capacity for penetration and retention within wood tissues. Specific detection of the pyrethroid insecticide is currently obtained after extraction followed by chemical analysis using chromatography techniques. However, visualizing the insecticide molecule within the wood structure requires specific probes together with microscopy techniques. Therefore, the aim of the present work was to apply a new methodology based on antibody-antigen recognition and electron microscopy to directly visualize pyrethroids in the wood material. A polyclonal antibody directed against cypermethrin was developed and applied to Pinus sylvestris wood samples coated with technical cypermethrin. The antibody was tested on impregnated wood, and the specific recognition of the insecticide was visualized by transmission electron microscopy (TEM). The immunogold-TEM assay evidenced the capacity of the synthetic biocide to penetrate the wood. The depth of penetration was measured on sections taken at increasing distances from the coated surface of the wood. These results correlated with chemical analyses carried out by GC-ECD after extraction. In addition, the immuno-TEM investigation allowed visualizing, for the first time at the ultrastructural scale of resolution, that cypermethrin was able to diffuse within the secondary wood cell walls.

Keywords: cypermethrin, insecticide, wood penetration, wood retention, immuno-transmission electron microscopy, polyclonal antibody

Procedia PDF Downloads 413
21018 An Approach to Solving Some Inverse Problems for Parabolic Equations

Authors: Bolatbek Rysbaiuly, Aliya S. Azhibekova

Abstract:

Problems concerning the interpretation of well testing results belong to the class of inverse problems of subsurface hydromechanics. The distinctive feature of such problems is that the available additional information depends on the capabilities of oilfield experiments. Another factor that should not be overlooked is the existence of errors in the test data. To determine reservoir properties, some inverse problems for parabolic equations were investigated. An approach to solving these inverse problems based on the method of regularization is proposed.

Keywords: iterative approach, inverse problem, parabolic equation, reservoir properties

Procedia PDF Downloads 428
21017 Evaluation of the CRISP-DM Business Understanding Step: An Approach for Assessing the Predictive Power of Regression versus Classification for the Quality Prediction of Hydraulic Test Results

Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter

Abstract:

Digitalisation in production technology is a driver for the application of machine learning methods. Through the application of predictive quality, the great potential for saving necessary quality control can be exploited through the data-based prediction of product quality and states. However, the serial use of machine learning applications is often prevented by various problems. Fluctuations occur in real production data sets, which are reflected in trends and systematic shifts over time. To counteract these problems, data preprocessing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets to extract stable features. Successful process control of the target variables aims to centre the measured values around a mean and minimise variance. Competitive leaders claim to have mastered their processes. As a result, much of the real data has a relatively low variance. For the training of prediction models, the highest possible generalisability is required, which is at least made more difficult by this data availability. The implementation of a machine learning application can be interpreted as a production process. The CRoss Industry Standard Process for Data Mining (CRISP-DM) is a process model with six phases that describes the life cycle of data science. As in any process, the costs to eliminate errors increase significantly with each advancing process phase. For the quality prediction of hydraulic test steps of directional control valves, the question arises in the initial phase whether a regression or a classification is more suitable. In the context of this work, the initial phase of the CRISP-DM, the business understanding, is critically compared for the use case at Bosch Rexroth with regard to regression and classification. The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predict the quality characteristics of workpieces. Suitable methods for leakage volume flow regression and classification for inspection decision are applied. Impressively, classification is clearly superior to regression and achieves promising accuracies.
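
A hedged illustration of the regression-versus-classification comparison is sketched below on synthetic data standing in for the proprietary hydraulic test data: a regressor predicts a leakage-like quantity and is thresholded into an inspection decision, while a classifier predicts the decision directly, and both are scored on the same accuracy metric (models, features and failure rate are assumptions).

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(3000, 12))                       # stand-in process features
leakage = np.abs(X[:, 0] * 0.4 + X[:, 1] ** 2 * 0.3 + rng.normal(0, 0.2, 3000))
limit = np.quantile(leakage, 0.9)                     # low variance: only ~10% exceed the limit
fail = (leakage > limit).astype(int)

X_tr, X_te, y_tr, y_te, f_tr, f_te = train_test_split(X, leakage, fail, random_state=0)

# Route 1: regress the leakage volume flow, then threshold into the inspection decision.
reg = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
reg_decision = (reg.predict(X_te) > limit).astype(int)

# Route 2: classify the inspection decision directly.
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, f_tr)
clf_decision = clf.predict(X_te)

print("regression -> threshold accuracy:", accuracy_score(f_te, reg_decision))
print("direct classification accuracy:  ", accuracy_score(f_te, clf_decision))
```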

Keywords: classification, CRISP-DM, machine learning, predictive quality, regression

Procedia PDF Downloads 144
21016 Noninvasive Brain-Machine Interface to Control Both Mecha TE Robotic Hands Using Emotiv EEG Neuroheadset

Authors: Adrienne Kline, Jaydip Desai

Abstract:

Electroencephalography (EEG) is a noninvasive technique that registers signals originating from the firing of neurons in the brain. The Emotiv EEG Neuroheadset is a consumer product comprising 14 EEG channels and was used to record the reactions of the neurons within the brain to two forms of stimuli in 10 participants. These stimuli consisted of auditory and visual formats that provided directions of ‘right’ or ‘left.’ Participants were instructed to raise their right or left arm in accordance with the instruction given. A scenario in OpenViBE was generated to stimulate the participants while recording their data. In OpenViBE, the Graz Motor BCI Stimulator algorithm was configured to govern the duration and number of visual stimuli. Utilizing EEGLAB under the cross-platform MATLAB®, the electrodes most stimulated during the study were identified. Data outputs from EEGLAB were analyzed using IBM SPSS Statistics® Version 20. This aided in determining the electrodes to use in the development of a brain-machine interface (BMI) using real-time EEG signals from the Emotiv EEG Neuroheadset. Signal processing and feature extraction were accomplished via the Simulink® signal processing toolbox. An Arduino™ Duemilanove microcontroller was used to link the Emotiv EEG Neuroheadset and the right and left Mecha TE™ Hands.
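
The Simulink pipeline itself is not reproduced here; instead, the following simplified sketch shows one common way such a BMI feature-extraction stage can be laid out, computing per-channel band power in the mu and beta bands with Welch's method and feeding it to a linear classifier (the trial data are synthetic and the serial command to the Arduino is only indicated as a comment).

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 128                                   # Emotiv headset sampling rate [Hz]
BANDS = [(8, 12), (13, 30)]                # mu and beta bands

def band_power_features(trial):            # trial: (n_channels, n_samples)
    f, psd = welch(trial, fs=FS, nperseg=FS)
    feats = []
    for lo, hi in BANDS:
        idx = (f >= lo) & (f <= hi)
        feats.append(psd[:, idx].mean(axis=1))       # mean power per channel in the band
    return np.concatenate(feats)

rng = np.random.default_rng(0)
trials = rng.normal(size=(60, 14, 2 * FS))           # 60 synthetic trials, 14 channels, 2 s each
labels = rng.integers(0, 2, 60)                      # 0 = left, 1 = right (synthetic)

X = np.array([band_power_features(t) for t in trials])
clf = LinearDiscriminantAnalysis().fit(X[:40], labels[:40])
print("held-out accuracy:", clf.score(X[40:], labels[40:]))
# serial.Serial('/dev/ttyUSB0', 9600).write(b'L')    # hypothetical command to the robotic hands
```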

Keywords: brain-machine interface, EEGLAB, emotiv EEG neuroheadset, OpenViBE, simulink

Procedia PDF Downloads 502
21015 From User's Requirements to UML Class Diagram

Authors: Zeineb Ben Azzouz, Wahiba Ben Abdessalem Karaa

Abstract:

The automated extraction of UML class diagrams from natural language requirements is a highly challenging task. Many approaches, frameworks and tools have been presented in this field. Nonetheless, experiments with these tools have shown that no single approach works best all the time. In this context, we propose a new, accurate approach to facilitate the automatic mapping from textual requirements to a UML class diagram. Our approach integrates the best properties of statistical Natural Language Processing (NLP) techniques to reduce ambiguity when analysing natural language requirements text. In addition, our approach follows the best practices defined by conceptual modelling experts to determine patterns indispensable for the extraction of the basic elements and concepts of the class diagram. Once the relevant class diagram information is captured, an XMI document is generated and imported into a CASE tool to build the corresponding UML class diagram.
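
As a toy illustration of the first extraction step only (candidate classes from requirement text), the sketch below uses spaCy noun chunks as class candidates; the statistical disambiguation, the conceptual-modelling patterns and the XMI generation described above are not reproduced, and the example requirements are invented.

```python
# Requires the spaCy model en_core_web_sm to be installed.
import spacy
from collections import Counter

nlp = spacy.load("en_core_web_sm")
requirements = (
    "A customer places one or more orders. "
    "Each order contains several order lines. "
    "An order line refers to exactly one product."
)

doc = nlp(requirements)
candidates = Counter(
    chunk.root.lemma_.lower()
    for chunk in doc.noun_chunks          # noun phrases are the usual class candidates
    if chunk.root.pos_ == "NOUN"
)
print(candidates.most_common())           # e.g. order, customer, line, product ...
# A CASE tool would then receive these candidates as class elements inside a generated XMI file.
```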

Keywords: class diagram, user’s requirements, XMI, software engineering

Procedia PDF Downloads 471
21014 A Method for the Extraction of the Character's Tendency from Korean Novels

Authors: Min-Ha Hong, Kee-Won Kim, Seung-Hoon Kim

Abstract:

Characters in story-based content, such as novels and movies, are among the core elements for understanding the story. In particular, a character's tendency is an important factor in analyzing story-based content, because it has a significant influence on the storyline. If readers have knowledge of the characters' tendencies before reading a novel, it will help them understand the structure of conflict, the episodes and the relationships between characters in the novel, and it may therefore help readers select the novels they want to read. In this paper, we propose a method of extracting the tendencies of the characters from a novel written in Korean. In advance, we build a dictionary of paired Korean and English emotion words, since the emotion words in the novel's sentences express the characters' feelings. We rate the degree of polarity (positive or negative) of the words in our emotion word dictionary based on SenticNet. Then we extract characters and emotion words from the sentences in a novel. Since the polarity of a word grows stronger or weaker due to sentence features such as quotations and modifiers, our proposed method considers them when calculating the polarity of characters. The information on the extracted characters' polarity can be used in book search or book recommendation services.
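
A minimal sketch of the scoring idea follows; the Korean emotion dictionary, the SenticNet-based polarity ratings and the exact sentence-feature weights are not available here, so the lexicon, intensifiers and weights below are invented solely to show how quotation and modifier features can strengthen or weaken a word's polarity.

```python
# Hypothetical lexicon and weights (stand-ins for the Korean/English emotion dictionary).
EMOTION_LEXICON = {"kind": 0.8, "brave": 0.6, "cruel": -0.9, "afraid": -0.4}
INTENSIFIERS = {"very": 1.5, "extremely": 2.0}

def character_polarity(sentences, character):
    score, hits = 0.0, 0
    for text, is_quote in sentences:                  # (sentence, spoken-in-quotation flag)
        tokens = text.lower().split()
        if character.lower() not in tokens:
            continue
        weight = 1.3 if is_quote else 1.0             # quotations weighted more strongly
        for i, tok in enumerate(tokens):
            if tok in EMOTION_LEXICON:
                mod = INTENSIFIERS.get(tokens[i - 1], 1.0) if i > 0 else 1.0
                score += EMOTION_LEXICON[tok] * mod * weight
                hits += 1
    return score / hits if hits else 0.0              # average polarity of the character

sentences = [("Minsu was very brave", False), ("Minsu is cruel", True)]
print(character_polarity(sentences, "Minsu"))
```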

Keywords: character tendency, data mining, emotion word, Korean novel

Procedia PDF Downloads 334
21013 A Methodology for Characterising the Tail Behaviour of a Distribution

Authors: Serge Provost, Yishan Zang

Abstract:

Following a review of various approaches that are utilized for classifying the tail behavior of a distribution, an easily implementable methodology that relies on an arctangent transformation is presented. The classification criterion is based on the difference between two specific quantiles of the transformed distribution. The resulting categories enable one to classify distributional tails as distinctly short, short, nearly medium, medium, extended medium and somewhat long, provided that at least two moments exist. Distributions possessing a single moment are said to be long-tailed, while those failing to have any finite moments are classified as having an extremely long tail. Several illustrative examples will be presented.

Keywords: arctangent transformation, tail classification, heavy-tailed distributions, distributional moments

Procedia PDF Downloads 120
21012 Approach to Honey Volatiles' Profiling by Gas Chromatography and Mass Spectrometry

Authors: Igor Jerkovic

Abstract:

Biodiversity of flora provides many different nectar sources for the bees. Unifloral honeys possess distinctive flavours, mainly derived from their nectar sources (characteristic volatile organic components (VOCs)). Specific or nonspecific VOCs (chemical markers) could be used for unifloral honey characterisation in addition to melissopalynological analysis. The main honey volatiles belong, in general, to three principal categories: terpenes, norisoprenoids, and benzene derivatives. Some of these substances have been described as characteristic of the floral source, while other compounds, like several alcohols, branched aldehydes, and furan derivatives, may be related to the microbial purity of the honey, its processing and its storage conditions. Selection of the extraction method for honey volatiles profiling should consider that heating the honey produces artefacts, and therefore conventional methods of VOC isolation (such as hydrodistillation) cannot be applied to honey. A two-way approach for the isolation of the honey VOCs was applied, using headspace solid-phase microextraction (HS-SPME) and ultrasonic solvent extraction (USE). The extracts were analysed by gas chromatography and mass spectrometry (GC-MS). HS-SPME (with fibers of different polarity, such as polydimethylsiloxane/divinylbenzene (PDMS/DVB) or divinylbenzene/carboxene/polydimethylsiloxane (DVB/CAR/PDMS)) enabled isolation of the highly volatile headspace VOCs of the honey samples. Among them, some characteristic or specific compounds can be found, such as 3,4-dihydro-3-oxoedulan (in Centaurea cyanus L. honey) or 1H-indole, methyl anthranilate, and cis-jasmone (in Citrus unshiu Marc. honey). USE with different solvents (mainly dichloromethane or the mixture pentane : diethyl ether 1 : 2 v/v) enabled isolation of the less volatile and semi-volatile VOCs of the honey samples. Characteristic compounds from C. unshiu honey extracts were caffeine, 1H-indole, 1,3-dihydro-2H-indol-2-one, methyl anthranilate, and phenylacetonitrile. Sometimes, the selection of a solvent sequence was useful for more complete profiling, such as sequence I: pentane → diethyl ether, or sequence II: pentane → pentane/diethyl ether (1:2, v/v) → dichloromethane. The extracts with diethyl ether contained hydroquinone and 4-hydroxybenzoic acid as the major compounds, while (E)-4-(r-1’,t-2’,c-4’-trihydroxy-2’,6’,6’-trimethylcyclo-hexyl)but-3-en-2-one predominated in the dichloromethane extracts of Allium ursinum L. honey. With this two-way approach, it was possible to obtain a more detailed insight into the honey volatile and semi-volatile compounds and to minimize the risk of compound discrimination due to partial extraction, which is of significant importance for complete honey profiling and for the identification of chemical biomarkers that can complement the pollen analysis.

Keywords: honey chemical biomarkers, honey volatile compounds profiling, headspace solid-phase microextraction (HS-SPME), ultrasonic solvent extraction (USE)

Procedia PDF Downloads 203
21011 A Comparative Study of Deep Learning Methods for COVID-19 Detection

Authors: Aishrith Rao

Abstract:

COVID-19 is a pandemic that has resulted in thousands of deaths around the world and a huge impact on the global economy. Testing is a major issue, as the test kits have limited availability and are expensive to manufacture. Using deep learning methods on radiology images to detect the coronavirus is extremely economical and time-saving, since these images contain information about the spread of the virus in the lungs, and the approach can be used in areas with a lack of testing facilities. This paper focuses on binary classification and multi-class classification of COVID-19 and other diseases such as pneumonia, tuberculosis, etc. Different deep learning methods such as VGG-19, COVID-Net, ResNet + SVM, Deep CNN, DarkCovidNet, etc., have been used, and their accuracy has been compared using the Chest X-Ray dataset.

Keywords: deep learning, computer vision, radiology, COVID-19, ResNet, VGG-19, deep neural networks

Procedia PDF Downloads 160
21010 Fatigue Life Estimation of Tubular Joints - A Comparative Study

Authors: Jeron Maheswaran, Sudath C. Siriwardane

Abstract:

In fatigue analysis, the structural detail of tubular joints has attracted great attention among engineers. DNV-RP-C203 covers this topic quite well for simple and clear joint cases. For complex joints and geometries, where a joint classification is not available and the validity range of the non-dimensional geometric parameters is limited, engineers face real challenges. Joint classification is important to carry out throughout the fatigue analysis; the joint configurations are identified by the connectivity and the load distribution of the tubular joints. To overcome these problems to some extent, this paper compares the fatigue life of tubular joints in an offshore jacket obtained from the stress concentration factors (SCFs) in DNV-RP-C203 with that obtained from the finite element method using Abaqus/CAE. The paper presents the geometric details, material properties and considered load history of the jacket structure, and describes the global structural analysis and the identification of critical tubular joints for fatigue life estimation. Fatigue life is first determined based on the guidelines provided by the design codes. Fatigue analysis of the tubular joints is then conducted using finite element analysis in Abaqus/CAE [4] as the next major step. Finally, the obtained SCFs and fatigue lives are compared, and their significance is discussed.
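
As a worked sketch of the S-N/Miner calculation that underlies both routes of the comparison above, the snippet below turns an SCF into a fatigue life; the S-N constants and the stress-range histogram are illustrative placeholders, not the actual DNV-RP-C203 parameters or the jacket's load history.

```python
import math

LOG_A, M = 12.0, 3.0                        # illustrative one-slope S-N curve: log10(N) = LOG_A - M*log10(S)

def cycles_to_failure(hot_spot_range_mpa):
    return 10 ** (LOG_A - M * math.log10(hot_spot_range_mpa))

def fatigue_life_years(scf, nominal_ranges_mpa, annual_counts):
    """Miner's rule: damage per year = sum(n_i / N_i); life = 1 / damage."""
    damage = sum(n / cycles_to_failure(scf * s)
                 for s, n in zip(nominal_ranges_mpa, annual_counts))
    return 1.0 / damage

# Hypothetical long-term stress-range histogram for one brace/chord hot spot.
ranges = [10, 20, 40, 80]                   # nominal stress ranges [MPa]
counts = [2e6, 5e5, 5e4, 2e3]               # cycles per year in each block

for scf in (1.8, 2.5):                      # e.g. code-based SCF vs FE-based SCF
    print(f"SCF {scf}: fatigue life ~ {fatigue_life_years(scf, ranges, counts):.1f} years")
```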

Keywords: fatigue life, stress-concentration factor, finite element analysis, offshore jacket structure

Procedia PDF Downloads 453
21009 Societal Acceptance of Trombe Wall in Buildings in Mediterranean Region: A Case Cyprus

Authors: Soad Abokhamis Mousavi

Abstract:

The Trombe wall is an ancient technique that continues to serve as an effective feature of passive solar systems. In practice, however, architects and their clients are not opting for the Trombe wall because of its appearance on the facades of buildings. This study therefore has two main goals: the first is to find out why the Trombe wall is not considered in buildings in the Mediterranean region, and the second is to find a solution that facilitates the societal acceptance of Trombe walls in buildings. To address these goals, the present work attempts to develop and design Trombe walls with different materials and appearances on building facades. A qualitative method was used in this article, based on observation and on questionnaires with different clients and expert architects in the selected region. Results indicate that the Trombe wall can be incorporated into building facades with different designs so as not to affect the beauty of the buildings.

Keywords: trombe wall, societal acceptance, building, energy efficacy

Procedia PDF Downloads 81
21008 Identification and Classification of Stakeholders in the Transition to 3D Cadastre

Authors: Qiaowen Lin

Abstract:

The 3D cadastre is an inevitable choice to meet the needs of real cadastral management. Nowadays, more attention is given to the technical aspects of the 3D cadastre, resulting in an imbalance within this field. To fill this research gap, the stakeholder, which has been regarded as the determining factor in cadastral change, is studied. The Delphi method, the Michael rating, and stakeholder mapping are used to identify and classify the stakeholders in the 3D cadastre. It is concluded that project managers should pay more attention to the interests and appeals of the key stakeholders, and that different coping strategies should be adopted to facilitate the transition to the 3D cadastre.

Keywords: stakeholders, three dimensions, cadastre, transition

Procedia PDF Downloads 290
21007 Liver Lesion Extraction with Fuzzy Thresholding in Contrast Enhanced Ultrasound Images

Authors: Abder-Rahman Ali, Adélaïde Albouy-Kissi, Manuel Grand-Brochier, Viviane Ladan-Marcus, Christine Hoeffl, Claude Marcus, Antoine Vacavant, Jean-Yves Boire

Abstract:

In this paper, we present a new segmentation approach for focal liver lesions in contrast enhanced ultrasound imaging. This approach, based on a two-cluster Fuzzy C-Means methodology, considers type-II fuzzy sets to handle uncertainty due to the image modality (presence of speckle noise, low contrast, etc.), and to calculate the optimum inter-cluster threshold. Fine boundaries are detected by a local recursive merging of ambiguous pixels. The method has been tested on a representative database. Compared to both Otsu and type-I Fuzzy C-Means techniques, the proposed method significantly reduces the segmentation errors.
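
The following standalone sketch shows only the core two-cluster (type-I) Fuzzy C-Means step and the resulting inter-cluster threshold on synthetic grey levels; the type-II extension, the speckle handling and the recursive merging of ambiguous boundary pixels described above are not reproduced.

```python
import numpy as np

def fuzzy_cmeans_threshold(intensities, m=2.0, iters=100, tol=1e-5):
    """Cluster grey levels into two fuzzy clusters and return the centres and mid threshold."""
    x = intensities.astype(float).ravel()
    centres = np.array([x.min(), x.max()])
    for _ in range(iters):
        d = np.abs(x[:, None] - centres[None, :]) + 1e-12        # distances to the 2 centres
        u = 1.0 / (d ** (2 / (m - 1)))
        u /= u.sum(axis=1, keepdims=True)                        # fuzzy memberships
        new = (u ** m * x[:, None]).sum(axis=0) / (u ** m).sum(axis=0)
        if np.abs(new - centres).max() < tol:
            centres = new
            break
        centres = new
    return centres, centres.mean()                               # threshold between the clusters

rng = np.random.default_rng(0)
fake_image = np.concatenate([rng.normal(60, 10, 5000), rng.normal(150, 20, 2000)])
centres, thr = fuzzy_cmeans_threshold(fake_image)
mask = fake_image > thr                                          # 'lesion' vs background pixels
print("centres:", np.round(centres, 1), "threshold:", round(thr, 1))
```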

Keywords: defuzzification, fuzzy clustering, image segmentation, type-II fuzzy sets

Procedia PDF Downloads 485
21006 Image Segmentation Using 2-D Histogram in RGB Color Space in Digital Libraries

Authors: El Asnaoui Khalid, Aksasse Brahim, Ouanan Mohammed

Abstract:

This paper presents an unsupervised color image segmentation method. It is based on a hierarchical analysis of 2-D histogram in RGB color space. This histogram minimizes storage space of images and thus facilitates the operations between them. The improved segmentation approach shows a better identification of objects in a color image and, at the same time, the system is fast.
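
A small sketch of the underlying building block, the 2-D histogram over pairs of RGB channels, is given below; the hierarchical peak analysis and region labelling described above are not reproduced, and the input image is a random stand-in.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(120, 160, 3), dtype=np.uint8)   # stand-in RGB image

def pairwise_histograms(img, bins=32):
    """Return 2-D histograms for the (R,G), (R,B) and (G,B) channel pairs."""
    flat = img.reshape(-1, 3).astype(float)
    pairs = {(0, 1): "RG", (0, 2): "RB", (1, 2): "GB"}
    hists = {}
    for (a, b), name in pairs.items():
        h, _, _ = np.histogram2d(flat[:, a], flat[:, b],
                                 bins=bins, range=[[0, 256], [0, 256]])
        hists[name] = h
    return hists

for name, h in pairwise_histograms(image).items():
    peak = np.unravel_index(np.argmax(h), h.shape)                 # dominant joint colour bin
    print(f"{name}: dominant bin {peak} with {int(h[peak])} pixels")
```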

Keywords: image segmentation, hierarchical analysis, 2-D histogram, classification

Procedia PDF Downloads 380
21005 Characterizing and Developing the Clinical Grade Microbiome Assay with a Robust Bioinformatics Pipeline for Supporting Precision Medicine Driven Clinical Development

Authors: Danyi Wang, Andrew Schriefer, Dennis O'Rourke, Brajendra Kumar, Yang Liu, Fei Zhong, Juergen Scheuenpflug, Zheng Feng

Abstract:

Purpose: It has been recognized that the microbiome plays critical roles in disease pathogenesis, including cancer, autoimmune disease, and multiple sclerosis. To develop a clinical-grade assay for exploring microbiome-derived clinical biomarkers across disease areas, a two-phase approach was implemented: 1) identification of the optimal sample preparation reagents using pre-mixed bacteria and healthy donor stool samples coupled with the proprietary Sigma-Aldrich® bioinformatics solution, and 2) exploratory analysis of patient samples for enabling precision medicine. Study Procedure: In the phase 1 study, we first compared the 16S sequencing results of two ATCC® microbiome standards (MSA 2002 and MSA 2003) across five different extraction kits (Kits A, B, C, D and E). Both microbiome standard samples were extracted in triplicate with all extraction kits. Following isolation, DNA quantity was determined by the Qubit assay. DNA quality was assessed to determine purity and to confirm that the extracted DNA is of high molecular weight. Bacterial 16S ribosomal ribonucleic acid (rRNA) amplicons were generated via amplification of the V3/V4 hypervariable region of the 16S rRNA. Sequencing was performed using a 2x300 bp paired-end configuration on the Illumina MiSeq. Fastq files were analyzed using the Sigma-Aldrich® Microbiome Platform. The Microbiome Platform is a cloud-based service that offers best-in-class 16S-seq and WGS analysis pipelines and databases. The Platform and its methods have been extensively benchmarked using microbiome standards generated internally by MilliporeSigma and by other external providers. Data Summary: The DNA yield using extraction kits D and E was below the limit of detection (100 pg/µl) of the Qubit assay, as both extraction kits are intended for samples with low bacterial counts; the pre-mixed bacterial pellets at high concentrations, with an input of 2 × 10⁶ cells for MSA-2002 and 1 × 10⁶ cells for MSA-2003, were not compatible with these kits. Among the remaining three extraction kits, kit A produced the greatest yield, whereas kit B provided the least yield (Kit-A/MSA-2002: 174.25 ± 34.98; Kit-A/MSA-2003: 179.89 ± 30.18; Kit-B/MSA-2002: 27.86 ± 9.35; Kit-B/MSA-2003: 23.14 ± 6.39; Kit-C/MSA-2002: 55.19 ± 10.18; Kit-C/MSA-2003: 35.80 ± 11.41 (mean ± SD)). The PCoA 3D visualization of the weighted UniFrac beta diversity shows that kits A and C cluster closely together, while kit B appears as an outlier. The kit A sequencing samples cluster more closely together than those of the other kits. The taxonomic profiles of kit B have lower recall when compared to the known mixture profiles, indicating that kit B was inefficient at detecting some of the bacteria. Conclusion: Our data demonstrate that the DNA extraction method impacts the DNA concentration, purity, and microbial communities detected by next-generation sequencing analysis. A further comparison of microbiome analysis performance using healthy stool samples is underway, and colorectal cancer patient samples will also be acquired to further explore the clinical utility. Collectively, our comprehensive qualification approach, including the evaluation of optimal DNA extraction conditions, the inclusion of positive controls, and the implementation of a robust, qualified bioinformatics pipeline, ensures accurate characterization of the microbiota in a complex matrix for deciphering the deep biology and enabling precision medicine.

Keywords: 16S rRNA sequencing, analytical validation, bioinformatics pipeline, metagenomics

Procedia PDF Downloads 170