Search results for: iterative extraction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2288

938 VIAN-DH: Computational Multimodal Conversation Analysis Software and Infrastructure

Authors: Teodora Vukovic, Christoph Hottiger, Noah Bubenhofer

Abstract:

The development of VIAN-DH aims at bridging two linguistic approaches: conversation analysis/interactional linguistics (IL), so far a dominantly qualitative field, and computational/corpus linguistics with its quantitative and automated methods. Contemporary IL investigates the systematic organization of conversations and interactions composed of speech, gaze, gestures, and body positioning, among others. This highly integrated multimodal behaviour is analysed based on video data, with the aim of uncovering so-called "multimodal gestalts", patterns of linguistic and embodied conduct that recur in specific sequential positions and are employed for specific purposes. Multimodal analyses (and other disciplines using video) so far depend on time- and resource-intensive manual transcription of each component from the video materials. Automating these tasks requires advanced programming skills, which often lie outside the scope of IL. Moreover, the use of different tools makes the integration and analysis of different formats challenging. Consequently, IL research often deals with relatively small samples of annotated data, which are suitable for qualitative analysis but insufficient for making generalized empirical claims derived quantitatively. VIAN-DH aims to create a workspace where the many annotation layers required for the multimodal analysis of videos can be created, processed, and correlated in one platform. VIAN-DH will provide a graphical interface that operates state-of-the-art tools for automating parts of the data processing. The integration of tools that already exist in computational linguistics and computer vision facilitates data processing for researchers lacking programming skills, speeds up the overall research process, and enables the processing of large amounts of data. The main features to be introduced are automatic speech recognition for the transcription of language, automatic image recognition for the extraction of gestures and other visual cues, and grammatical annotation for adding morphological and syntactic information to the verbal content. In the ongoing instance of VIAN-DH, we focus on gesture extraction (pointing gestures in particular), making use of existing models created for sign language and adapting them for this specific purpose. In order to view and search the data, VIAN-DH will provide a unified format and enable the import of the main existing formats of annotated video data and the export to other formats used in the field, while integrating different data source formats so that they can be combined in research. VIAN-DH will adapt querying methods from corpus linguistics to enable the parallel search of many annotation levels, combining token-level and chronological search for various types of data. VIAN-DH strives to bring crucial and potentially revolutionary innovation to the field of IL (one that can also extend to other fields using video materials). It will allow large amounts of data to be processed automatically, and quantitative analyses to be implemented and combined with the qualitative approach. It will facilitate the investigation of correlations of linguistic patterns (lexical or grammatical) with conversational aspects (turn-taking or gestures). Users will be able to automatically transcribe and annotate visual, spoken, and grammatical information from videos, correlate those different levels, and perform queries and analyses.
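
As a rough illustration of the combined token-level and chronological search described above, the sketch below intersects two hypothetical annotation layers by time overlap. The layer contents and field names are invented for illustration and do not reflect VIAN-DH's actual data model.

```python
# Illustrative sketch only: a minimal interval-overlap query over two
# hypothetical annotation layers (tokens and gestures). All labels and
# timings below are invented placeholders.

from dataclasses import dataclass

@dataclass
class Annotation:
    start: float   # seconds from video start
    end: float
    label: str

# Hypothetical layers, as produced by ASR and gesture recognition.
tokens = [Annotation(0.0, 0.4, "look"), Annotation(0.5, 0.9, "there"),
          Annotation(1.2, 1.6, "okay")]
gestures = [Annotation(0.45, 1.0, "pointing")]

def overlapping(layer_a, layer_b):
    """Yield pairs of annotations from two layers that overlap in time."""
    for a in layer_a:
        for b in layer_b:
            if a.start < b.end and b.start < a.end:
                yield a, b

# Query: which tokens co-occur with a pointing gesture?
for tok, ges in overlapping(tokens, gestures):
    print(f"'{tok.label}' co-occurs with {ges.label} "
          f"({max(tok.start, ges.start):.2f}-{min(tok.end, ges.end):.2f}s)")
```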

Keywords: multimodal analysis, corpus linguistics, computational linguistics, image recognition, speech recognition

Procedia PDF Downloads 96
937 Architectural Robotics in Micro Living Spaces: An Approach to Enhancing Wellbeing

Authors: Timothy Antoniuk

Abstract:

This paper will demonstrate why the most successful and livable cities of the future will require multi-disciplinary designers to develop a deep understanding of peoples’ changing lifestyles, and why new generations of deeply integrated products, services, and experiences need to be created. Disseminating research from the UNEP Creative Economy Reports and a variety of other consumption- and economics-based statistics, a compelling argument will be made that it is peoples’ living spaces that offer the easiest and most significant affordances for inducing positive changes to their wellbeing and to a city’s economic and environmental prosperity. This idea of leveraging happiness, wellbeing, and prosperity by creating new concepts and typologies of ‘home’ puts people and their needs, wants, desires, aspirations, and lifestyles at the beginning of the design process, not at the end, as so often occurs with current-day multi-unit housing construction. As an important part of the creative-reflective and statistical comparisons that are necessary for this on-going body of research and practice, Professor Antoniuk created the Micro Habitation Lab (mHabLab) in 2016. By focusing on testing the functional and economic feasibility of activating small spaces with different types of architectural robotics, a variety of movable, expandable, and interactive objects have been hybridized and integrated into the architectural structure of the Lab. A series of on-going open houses allows the team to test new ideas continually and accumulate thousands of points of feedback from everyday consumers, letting the public-at-large see, physically engage with, and give feedback on the items they find most and least valuable. This iterative approach to testing has exposed two key findings: firstly, that there is a clear opportunity to improve the macro and micro functionality of small living spaces; and secondly, that allowing people to physically alter smaller elements of their living space lessens feelings of frustration and enhances feelings of pride and a deeper perception of “home”. Equally interesting is a group of new research questions being exposed, which relate to the duality of space, how people can be in two living spaces at one time, and how small living spaces are moving the Extended Home into the public realm.

Keywords: architectural robotics, extended home, interactivity, micro living spaces

Procedia PDF Downloads 158
936 Computationally Efficient Electrochemical-Thermal Li-Ion Cell Model for Battery Management System

Authors: Sangwoo Han, Saeed Khaleghi Rahimian, Ying Liu

Abstract:

Vehicle electrification is gaining momentum, and many car manufacturers promise to deliver more electric vehicle (EV) models to consumers in the coming years. In controlling the battery pack, the battery management system (BMS) must maintain optimal battery performance while ensuring the safety of the battery pack. Tasks related to battery performance include determining state-of-charge (SOC), state-of-power (SOP), state-of-health (SOH), cell balancing, and battery charging. Safety-related functions include making sure cells operate within the specified static and dynamic voltage windows and temperature range, derating power, detecting faulty cells, and warning the user if necessary. The BMS often utilizes an RC circuit model to model a Li-ion cell because of its robustness and low computation cost, among other benefits. Because an equivalent circuit model such as the RC model is not a physics-based model, it can never be a prognostic model that predicts battery state-of-health and averts a safety risk before it occurs. A physics-based Li-ion cell model, on the other hand, is more capable, at the expense of computation cost. To avoid the high computation cost associated with a full-order model, many researchers have demonstrated the use of a single particle model (SPM) for BMS applications. One drawback of the single particle modeling approach is that it forces the use of the average current density in the calculation. The SPM would be appropriate for simulating drive cycles where there is insufficient time to develop a significant current distribution within an electrode. However, under a continuous or high-pulse electrical load, the model may fail to predict cell voltage or Li⁺ plating potential. To overcome this issue, a multi-particle reduced-order model is proposed here. The use of multiple particles combined with either linear or nonlinear charge-transfer reaction kinetics enables the model to capture the current density distribution within an electrode under any type of electrical load. To keep the computational complexity comparable to that of an SPM, the governing equations are solved sequentially to minimize iterative solving processes. Furthermore, the model is validated against a full-order model implemented in COMSOL Multiphysics.
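
To make the sequential-solution idea concrete, here is a deliberately simplified sketch: several representative spherical particles, each carrying its own share of the current, are advanced one after another by an explicit finite-difference step of Fickian diffusion. All parameter values are illustrative placeholders, and the scheme stands in for, rather than reproduces, the authors' reduced-order model.

```python
# Minimal sketch (not the authors' implementation): solving Fickian diffusion
# in several representative spherical particles sequentially, each with its
# own local current density, instead of one coupled system.

import numpy as np

D, R, F = 1e-14, 5e-6, 96485.0          # diffusivity (m^2/s), radius (m), Faraday
N, dt, steps = 30, 0.05, 2000           # radial nodes, time step (s), steps
dr = R / (N - 1)
r = np.linspace(0.0, R, N)

def step_particle(c, j_loc):
    """One explicit time step of spherical diffusion with surface flux j_loc (A/m^2)."""
    cn = c.copy()
    cn[1:-1] = c[1:-1] + D * dt * (
        (c[2:] - 2*c[1:-1] + c[:-2]) / dr**2
        + (2.0 / r[1:-1]) * (c[2:] - c[:-2]) / (2*dr))
    cn[0] = cn[1]                                   # symmetry at the centre
    cn[-1] = cn[-2] - (j_loc / F) * dr / D          # surface flux boundary condition
    return cn

# Three particles carrying different local current densities; each is
# advanced independently (sequentially), keeping cost near SPM level.
currents = [8.0, 10.0, 12.0]                        # local current densities (A/m^2)
concs = [np.full(N, 25000.0) for _ in currents]     # initial Li concentration (mol/m^3)
for _ in range(steps):
    concs = [step_particle(c, j) for c, j in zip(concs, currents)]

for j, c in zip(currents, concs):
    print(f"j = {j:4.1f} A/m^2 -> surface concentration {c[-1]:.0f} mol/m^3")
```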

Keywords: battery management system, physics-based li-ion cell model, reduced-order model, single-particle and multi-particle model

Procedia PDF Downloads 97
935 Economic Evaluation of Bowland Shale Gas Wells Development in the UK

Authors: Elijah Acquah-Andoh

Abstract:

The UK has had its fair share of the shale gas revolutionary wave blowing across the global oil and gas industry at present. Although its exploitation is widely agreed to have been delayed, shale gas was looked upon favorably by the UK Parliament when they recognized it as a genuine energy source and granted licenses to industry to search for and extract the resource. Although this is significant progress for industry, there remains another test the UK fracking resource must pass in order to render shale gas extraction feasible: it must be economically extractable, and sustainably so. Developing unconventional resources is much more expensive and risky, and for shale gas wells, producing in commercial volumes is conditional upon drilling horizontal wells and hydraulic fracturing, techniques which increase CAPEX. Meanwhile, investment in shale gas development projects is sensitive to gas price and to technical and geological risks. Using a Two-Factor Model, the economics of the Bowland shale wells were analyzed, and the operational conditions under which fracking is profitable in the UK were characterized. We find that there is a great degree of flexibility in Opex spending; hence Opex does not pose much threat to the fracking industry in the UK. However, we discover that Bowland shale gas wells fail to add value at a gas price of $8/MMBtu. A minimum gas price of $12/MMBtu, at an Opex of no more than $2/Mcf and a Capex of no more than $14.95M, is required to create value within the present petroleum tax regime in the UK fracking industry.
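
The value test above is essentially a discounted cash flow screen. The sketch below shows the arithmetic of such a screen; the production profile, tax rate, and discount rate are invented placeholders rather than the paper's Two-Factor Model inputs, so the printed NPVs only illustrate the mechanics, not the reported thresholds.

```python
# Illustrative NPV screen for a single shale well. All production, cost,
# and tax figures here are hypothetical placeholders.

def well_npv(capex_musd, gas_price_mmbtu, opex_mcf,
             annual_prod_mmcf, tax_rate=0.40, discount=0.10):
    """NPV (million USD) of one well over its production profile."""
    npv = -capex_musd
    for year, prod in enumerate(annual_prod_mmcf, start=1):
        revenue = prod * 1000 * gas_price_mmbtu * 1.037 / 1e6   # ~1.037 MMBtu per Mcf
        opex = prod * 1000 * opex_mcf / 1e6
        cash = (revenue - opex) * (1 - tax_rate)
        npv += cash / (1 + discount) ** year
    return npv

# Hypothetical steeply declining production profile (MMcf/year).
profile = [900, 500, 320, 230, 180, 150, 130, 110, 100, 90]
for price in (8, 10, 12):
    print(f"${price}/MMBtu -> NPV = {well_npv(14.95, price, 2.0, profile):6.2f} M$")
```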

Keywords: capex, economical, investment, profitability, shale gas development, sustainable

Procedia PDF Downloads 570
934 Use of the Gas Chromatography Method for Hydrocarbons' Quality Evaluation in the Offshore Fields of the Baltic Sea

Authors: Pavel Shcherban, Vlad Golovanov

Abstract:

Currently, there is active geological exploration and development of the subsoil of the Kaliningrad region shelf. To carry out a comprehensive and accurate assessment of the volumes and degree of extraction of hydrocarbons from open deposits, it is necessary not only to establish a number of geological and lithological characteristics of the structures under study, but also to determine the oil quality (viscosity, density, fractional composition) as accurately as possible. For the works considered, gas chromatography is one of the most productive methods, allowing a significant amount of initial data to be generated rapidly. The article discusses aspects of applying the gas chromatography method to determine the chemical characteristics of the hydrocarbons of the Kaliningrad shelf fields, as well as a correlation-regression analysis of these parameters in comparison with the previously obtained chemical characteristics of hydrocarbon deposits located on the land of the region. In the course of the research, a number of methods of mathematical statistics and computer processing of large data sets have been applied, which makes it possible to evaluate the identity of the deposits, to refine the amount of reserves, and to make a number of assumptions about the genesis of the hydrocarbons under analysis.
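
As a minimal sketch of such a correlation-regression comparison, the snippet below relates one chromatographic parameter (density) between onshore and offshore samples. The values are placeholders, not field data.

```python
# Toy correlation-regression comparison between onshore and offshore
# hydrocarbon samples; the densities below are invented for illustration.

import numpy as np

onshore = np.array([0.845, 0.851, 0.858, 0.862, 0.870])    # density, g/cm^3
offshore = np.array([0.848, 0.853, 0.861, 0.866, 0.872])

r = np.corrcoef(onshore, offshore)[0, 1]              # correlation coefficient
slope, intercept = np.polyfit(onshore, offshore, 1)   # least-squares regression

print(f"r = {r:.3f}; offshore ~= {slope:.3f} * onshore + {intercept:.3f}")
# A high r with slope near 1 would support the identity of the deposits.
```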

Keywords: computer processing of large databases, correlation-regression analysis, hydrocarbon deposits, method of gas chromatography

Procedia PDF Downloads 147
933 Association of Genetic Variants of Apolipoprotein A5 Gene with the Metabolic Syndrome in the Pakistani Population

Authors: Muhammad Fiaz, Muhammad Saqlain, Bernard M. Y. Cheung, S. M. Saqlan Naqvi, Ghazala Kaukab Raja

Abstract:

Background: An association of the C allele of the rs662799 SNP of the APOA5 gene with metabolic syndrome (MetS) has been reported in different populations around the world. A case-control study was conducted to explore the relationship of the rs662799 variants (T/C) with MetS and the associated risk phenotypes in a population of Pakistani origin. Methods: MetS was defined according to the IDF criteria. Blood samples were collected from the Pakistan Institute of Medical Sciences, Islamabad, Pakistan, for biochemical profiling and DNA extraction. Genotyping of rs662799 was performed using MassARRAY iPLEX Gold technology. A total of 712 unrelated case and control subjects were genotyped. Data were analyzed using PLINK software and SPSS 16.0. Results: The risk allele C of rs662799 showed a highly significant association with MetS (OR=1.5, p=0.002). Among the risk phenotypes, dyslipidemia and obesity showed a strong association with the SNP (OR=1.49, p=0.03; OR=1.46, p=0.01, respectively) in models adjusted for age and gender. Conclusion: The rs662799 C allele is a significant risk marker for MetS in the local Pakistani population studied. The effect of the SNP is stronger on dyslipidemia than on the other components of the MetS.
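
For readers unfamiliar with allelic odds ratios such as the OR=1.5 reported here, the sketch below shows the computation from a 2x2 case-control table. The allele counts are hypothetical, chosen only to illustrate the arithmetic; they are not the study's genotype data.

```python
# Odds ratio with 95% CI (Woolf's method) from hypothetical allele counts.

import math

# Allele counts: [C allele, T allele]
cases = [310, 402]      # MetS cases
controls = [230, 482]   # controls

odds_ratio = (cases[0] * controls[1]) / (cases[1] * controls[0])
se = math.sqrt(sum(1.0 / n for n in cases + controls))   # SE of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```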

Keywords: metabolic syndrome, APOA5, rs662799, dyslipidemia, obesity

Procedia PDF Downloads 490
932 Use of Chemical Extractions to Estimate the Metals Availability in Bricks Made of Dredged Sediments

Authors: Fabienne Baraud, Lydia Leleyter, Sandra Poree, Melanie Lemoine

Abstract:

SEDIBRIC (valorisation de SEDIments en BRIQues et tuiles; valorisation of sediments in bricks and tiles) is a French project that aims to replace a part of the natural clays used in the preparation of fired bricks with dredged sediments, in order to propose an alternative solution for the management of harbor dredged sediments. The feasibility of such re-use is explored from a technical, economic, and environmental point of view. The present study focuses on the potential environmental impact of various chemical elements (Al, Ca, Cd, Co, Cr, Cu, Fe, Ni, Mg, Mn, Pb, Ti, and Zn) that are initially present in the dredged sediments. The total content (after acid digestion) and the environmental availability (estimated by single extractions with various extractants) of these elements are determined in the raw sediments and in the resulting fired bricks. The possible influence of some steps of the manufacturing process (sediment pre-treatment, firing) is also explored. The first results show that the pre-treatment step, which uses tap water to desalinate the raw sediment, does not influence the environmental availability of the studied elements. However, the firing process, performed at 900°C, can affect the amount of some elements detected in the bricks, as well as their environmental availability. We note that for Cr or Ni, the HCl and EDTA availability was increased in the bricks (compared to the availability in the raw sediment). For Cd, Cu, Pb, and Zn, the HCl and EDTA availability was reduced in the bricks, meaning that these elements were stabilized within the bricks.

Keywords: bricks, chemical extraction, metals, sediment

Procedia PDF Downloads 138
931 Analysis of Wheel Lock-Up Effects on Skidding Distance for Heavy Vehicles

Authors: Mahdieh Zamzamzadeh, Ahmad Abdullah Saifizul, Rahizar Ramli

Abstract:

Road accidents involving heavy vehicles have been showing worrying trends and, year after year, have increased concern and awareness about the safety of roads and transportation, especially in developing countries like Malaysia. Statistics on road crashes continue to show that many factors contribute to the capability of a heavy vehicle to stop within a safe distance and ultimately prevent traffic crashes. However, changes in road condition due to weather variations, together with vehicle dynamic specifications such as loading condition and speed, are the main risk factors, because they affect a heavy vehicle’s braking performance to the point of losing control and failing to stop the vehicle, and in many cases cause wheel lock-up and, accordingly, skidding. Predicting heavy vehicle skidding distance is crucial for accident reconstruction and roadside safety engineers. Despite this, formal tools to study heavy vehicle skidding distance before the vehicle stops completely are very limited, and most researchers have only considered braking distance in their studies. As a possible new tool, this work presents the iterative use of vehicle dynamic simulations to study heavy vehicle-roadway interaction in order to predict the effects of wheel lock-up on skidding distance and safety. This research addresses the influence of the vehicle and road conditions on skidding distance after wheel lock-up and presents a precise analysis of the skidding phenomenon. The vehicle speed, vehicle loading condition, and road friction parameters were all varied in a simulation-based analysis. In order to simulate the wheel lock-up situation, a heavy vehicle model was constructed and simulated using multibody vehicle dynamics simulation software, and careful analysis was made of the conditions which caused the skidding distance to increase or decrease, through a method used to predict skidding distance as part of braking distance. Across many simulations, the results revealed a clear relation between the heavy vehicle’s loading condition, various combinations of speed and road coefficient of friction, and their interaction effect on the skidding distance. A number of results are presented which illustrate how heavy vehicle overloading can seriously affect the skidding distance. Moreover, the simulation results give the skid mark length, which is necessary input data during accident reconstruction involving emergency braking.
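
For orientation, the idealized sliding-friction relation d = v^2 / (2·mu·g) captures the first-order physics that the multibody simulations refine; the sketch below evaluates it for a few speeds and road conditions. It ignores load transfer, tire dynamics, and braking before lock-up, so it is a sanity check rather than a substitute for the simulation results.

```python
# Idealized locked-wheel skid distance: d = v^2 / (2 * mu * g).

G = 9.81  # gravitational acceleration, m/s^2

def skid_distance(speed_kmh: float, mu: float) -> float:
    v = speed_kmh / 3.6                # convert km/h to m/s
    return v**2 / (2 * mu * G)

for v in (60, 80, 100):                # speeds in km/h
    for mu in (0.8, 0.5, 0.3):         # dry, wet, poor road surfaces
        print(f"{v:3d} km/h, mu={mu:.1f}: {skid_distance(v, mu):6.1f} m")
```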

Keywords: accident reconstruction, braking, heavy vehicle, skidding distance, skid mark, wheel lock-up

Procedia PDF Downloads 487
930 Postmortem Analysis of Lidocaine in Women Who Died of Criminal Abortion

Authors: Mohammed A. Arishy, Sultan M. Alharbi, Mohammed A. Hakami, Farid M. Abualsail, Mohammad A. Attafi, Riyadh M. Tobaiqi, Hussain M. Alsalem, Ibraheem M. Attafi

Abstract:

Lidocaine is the most common local anesthetic used for a paracervical block to reduce the pain associated with surgical abortion. A 25-year-old pregnant woman died before reaching hospital while undergoing a criminal abortion during the first trimester. Post-mortem investigation and autopsy showed no clear findings; therefore, toxic substances must be suspected and searched for in routine toxicological analysis. In this case report, the postmortem concentration of lidocaine was determined in blood, brain, liver, kidney, and stomach. For lidocaine identification and quantification, each sample was extracted using solid phase extraction and analyzed by GC-MS (Shimadzu, Japan). Initial screening and confirmatory analysis showed that only lidocaine was detected in all collected samples; no other toxic substances or alcohol were detected. The concentrations of lidocaine were 19, 17, 14, 7, and 3 µg/mL in the brain, blood, kidney, liver, and stomach, respectively. The lidocaine blood concentration (17 µg/mL) was at a toxic level and may have resulted in death. Among the tissues, the brain showed the highest level of lidocaine, followed by the kidney, liver, and stomach.

Keywords: forensic toxicology, GC-MS, lidocaine, postmortem

Procedia PDF Downloads 200
929 A Technique for Image Segmentation Using K-Means Clustering Classification

Authors: Sadia Basar, Naila Habib, Awais Adnan

Abstract:

The paper presents a technique for image segmentation using k-means clustering classification. Previously presented algorithms were task-specific; however, they missed neighboring information and required high-speed machines to run the segmentation. Clustering is the process of partitioning a group of data points into a small number of clusters. The proposed method is a content-aware feature extraction method which is able to run on low-end machines: it is a simple algorithm, requires only low-quality streaming, is efficient, and can be used for security purposes. It has the capability to highlight both the boundary and the object. At first, the user enters the data in the representation of the input. In the next step, the digital image is partitioned into clusters, and the clusters are divided into many regions. Clusters with the same features are assembled within a group, and different clusters are placed in other groups. Finally, the clusters are combined with respect to similar features and then represented in the form of segments. The clustered image depicts a clear representation of the digital image, highlighting the regions and boundaries of the image. At last, the final image is presented in the form of segments, with all colors of the image separated into clusters.
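
A compact sketch of the core clustering step is given below, written in plain NumPy so it runs on low-end machines. The random image stands in for real input, and Lloyd's iterations converge to a local minimum of the within-cluster distance.

```python
# Minimal k-means colour clustering of an image's pixels.

import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64, 3)).astype(float)  # stand-in image
pixels = image.reshape(-1, 3)
k = 4

centers = pixels[rng.choice(len(pixels), k, replace=False)]
for _ in range(20):                               # Lloyd iterations
    d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
    labels = d.argmin(axis=1)                     # nearest-center assignment
    new = np.array([pixels[labels == i].mean(axis=0) if np.any(labels == i)
                    else centers[i] for i in range(k)])
    if np.allclose(new, centers):                 # converged to local minimum
        break
    centers = new

segments = labels.reshape(image.shape[:2])        # per-pixel segment map
print("pixels per segment:", np.bincount(labels, minlength=k))
```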

Keywords: clustering, image segmentation, K-means function, local and global minimum, region

Procedia PDF Downloads 364
928 Human LACE1 Functions Pro-Apoptotically and Interacts with Mitochondrial YME1L Protease

Authors: Lukas Stiburek, Jana Cesnekova, Josef Houstek, Jiri Zeman

Abstract:

Cellular function depends on mitochondrial function and integrity, which are therefore maintained by several classes of proteins possessing chaperone and/or proteolytic activities. In this work, we focused on characterizing the function of LACE1 (lactation elevated 1) in the maintenance of mitochondrial protein homeostasis. LACE1 is the human homologue of the yeast mitochondrial Afg1 ATPase, a member of the SEC18-NSF, PAS1, CDC48-VCP, TBP family. Yeast Afg1 was shown to be involved in mitochondrial complex IV biogenesis, and based on its similarity with CDC48 (p97/VCP), it was suggested to facilitate the extraction of polytopic membrane proteins. Here we show that LACE1, which is a mitochondrial integral membrane protein, exists as part of three complexes of approximately 140, 400, and 500 kDa and is essential for the maintenance of a fused mitochondrial reticulum and lamellar cristae morphology. Using affinity purification of LACE1-FLAG expressed in a LACE1-knockdown background, we show that the protein physically interacts with the mitochondrial inner membrane protease YME1L. We further show that human LACE1 exhibits significant pro-apoptotic activity and that the protein is required for normal function of the mitochondrial respiratory chain. Thus, our work establishes LACE1 as a novel factor with a crucial role in mitochondrial homeostasis maintenance.

Keywords: LACE1, mitochondria, apoptosis, protease

Procedia PDF Downloads 298
927 Characterization of Triterpenoids Antimicrobial Potential in Ethyl Acetate Extracts from Aerial Parts of Deinbollia Pinnata

Authors: Rufai Yakubu and Suleiman Kabiru

Abstract:

Triterpenoids are a diverse class of secondary metabolites with potential antimicrobial properties. In this study, the crude extracts in ethyl acetate were obtained with an ultrasonic extraction method. A combined chromatographic separation method was used to isolate squalene (1), stigmasterol (2), stigmasta-5,22-diene-3-ol acetate (3), γ-sitosterol (4), lupeol (5), taraxasterol (6), and betulinic acid (7) from the ethyl acetate extracts. The ethyl acetate crude extracts and the isolated compounds were both screened for antimicrobial activity and minimum inhibitory concentration (MIC). The ethyl acetate crude extracts at concentrations of 1.5, 0.75, 0.35, and 0.168 mg/mL indicated marginal antibacterial activity, with zones of inhibition of 17, 20, and 14 mm for Staphylococcus aureus, Escherichia coli, and Candida albicans, and low minimum inhibitory concentrations ranging from 18.75 µg/mL to 150 µg/mL. Betulinic acid showed the highest activity against E. coli and C. albicans, at 15 mm and 15 mm, followed by lupeol against S. aureus, E. coli, and C. albicans at 13, 12, and 12 mm. Moreover, squalene showed no antimicrobial activity against S. aureus or C. albicans, although it showed activity against E. coli at 11 mm with an MIC of 300 µg/mL. Thus, the abundant triterpenoids in Deinbollia pinnata make it a promising focus for antimicrobial drug discovery.

Keywords: triterpenoid, antimicrobial potentials, deinbollia pinnata, aerial parts

Procedia PDF Downloads 61
926 Sorption of Crystal Violet from Aqueous Solution Using Chitosan−Charcoal Composite

Authors: Kingsley Izuagbe Ikeke, Abayomi O. Adetuyi

Abstract:

The study investigated the removal efficiency of crystal violet from aqueous solution using a chitosan-charcoal composite as adsorbent. Deproteination was carried out by placing 200 g of powdered snail shell in 4% w/v NaOH for 2 hours. The sample was then placed in 1% HCl for 24 hours to remove CaCO3. Deacetylation was done by boiling in 50% NaOH for 2 hours. 10% oxalic acid was used to dissolve the chitosan before mixing it with charcoal at 55°C to form the composite. The composite was characterized by Fourier transform infrared spectroscopy and scanning electron microscopy measurements. The efficiency of adsorption was evaluated by varying the pH of the solution, contact time, initial concentration, and adsorbent dose. Maximum removal of crystal violet by the composite and by activated charcoal was attained at pH 10, while maximum removal by chitosan was achieved at pH 8. The results showed that adsorption of both dyes followed the pseudo-second-order rate equation and fit the Langmuir and Freundlich isotherms. The data showed that the composite was best suited for crystal violet removal and also did relatively well in the removal of alizarin red. Thermodynamic parameters such as the enthalpy change (ΔH°), free energy change (ΔG°), and entropy change (ΔS°) indicate that the adsorption of crystal violet was endothermic, spontaneous, and feasible.
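
The pseudo-second-order fit mentioned above is usually done through the linear form t/qt = 1/(k2·qe^2) + t/qe. The sketch below demonstrates the procedure on synthetic uptake data generated from assumed qe and k2 values, so the fit simply recovers those assumptions.

```python
# Pseudo-second-order kinetics fit via its linear form t/qt vs t.

import numpy as np

qe_true, k2_true = 45.0, 0.002            # assumed mg/g and g/(mg*min)
t = np.array([5, 10, 20, 40, 60, 90, 120], dtype=float)
qt = (k2_true * qe_true**2 * t) / (1 + k2_true * qe_true * t)  # model data

slope, intercept = np.polyfit(t, t / qt, 1)    # linear regression of t/qt vs t
qe_fit = 1.0 / slope                           # since slope = 1/qe
k2_fit = slope**2 / intercept                  # since intercept = 1/(k2*qe^2)
print(f"qe = {qe_fit:.1f} mg/g, k2 = {k2_fit:.4f} g/(mg*min)")
```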

Keywords: crystal violet, chitosan−charcoal composite, extraction process, sorption

Procedia PDF Downloads 422
925 Determination of Aflatoxins in Edible-Medicinal Plant Samples by HPLC with Fluorescence Detector and KOBRA-Cell

Authors: Isil Gazioglu, Abdulselam Ertas

Abstract:

Aflatoxins (AFs) are secondary toxic metabolites of Aspergillus flavus and A. parasiticus. AFs can be absorbed through the skin. Potent carcinogens like AFs should be completely absent from cosmetics; this can be achieved by careful quality control of the raw plant materials. Regulatory limits for aflatoxins have been established in many countries, and reliable testing methodology is needed to implement and enforce those limits. In this study, ten medicinal plant samples (Gundelia tournefortii, Capsella bursa-pastoris, Carduus tenuiflorus, Cardaria draba, Malva neglecta, Malvella sherardiana, Melissa officinalis, Sideritis libanotica, Stachys thirkei, Thymus nummularius) were investigated for aflatoxin (AF) contamination by employing an HPLC assay for the determination of AFB1, B2, G1, and G2. The samples were extracted with 70% (v/v) methanol in water, cleaned up with an immunoaffinity column, and the AFs were then detected using electrochemical post-column derivatization with a KOBRA-Cell and a fluorescence detector. The extraction procedure was optimized in order to obtain the best recovery. The method was successfully carried out on all medicinal plant samples. The results revealed that five (50%) of the samples were contaminated with AFs. An association between particular samples and AF contamination could not be determined due to the low frequency of positive samples.

Keywords: aflatoxin B1, HPLC-FLD, KOBRA-Cell, mycotoxin

Procedia PDF Downloads 594
924 A Portable Cognitive Tool for Engagement Level and Activity Identification

Authors: Terry Teo, Sun Woh Lye, Yufei Li, Zainuddin Zakaria

Abstract:

Wearable devices such as electroencephalography (EEG) headsets hold immense potential for monitoring and assessing a person’s task engagement. This is especially so at remote or online sites. Research into their use in measuring an individual's cognitive state while performing task activities is therefore expected to increase. Despite the growing body of EEG research into a person's brain functioning, key challenges remain in adopting EEG for real-time operations. These include limited portability, long preparation time, high channel dimensionality, intrusiveness, and the level of accuracy in acquiring neurological data. This paper proposes an approach using 4-6 EEG channels to determine the cognitive state of a subject undertaking a set of passive and active monitoring tasks. Air traffic controller (ATC) dynamic tasks are used as a proxy. The work found that, using the channel reduction and identifier algorithm, good trend adherence of 89.1% can be obtained between a commercially available 14-channel Emotiv EPOC+ BCI EEG headset and a carefully selected reduced set of 4-6 channels. The approach can also identify different levels of engagement activity, ranging from general monitoring to ad hoc and repeated active monitoring activities involving information search, extraction, and memory.
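
One common way to turn a small channel set into an engagement measure is via spectral band powers, for example the classic beta/(alpha+theta) index from the engagement literature. The sketch below computes it from synthetic signals at the EPOC+ sampling rate; the channel names and the index choice are assumptions for illustration, not the paper's identifier algorithm.

```python
# Band-power engagement index over a reduced 4-channel set (synthetic data).

import numpy as np

fs = 128                                   # Hz, EPOC+ sampling rate
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
channels = {name: np.sin(2 * np.pi * f * t) + 0.5 * rng.standard_normal(t.size)
            for name, f in [("AF3", 10), ("AF4", 11), ("F3", 20), ("F4", 22)]}

def band_power(x, lo, hi):
    """Average spectral power of x in the [lo, hi] Hz band."""
    spec = np.abs(np.fft.rfft(x)) ** 2 / x.size
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    return spec[(freqs >= lo) & (freqs <= hi)].mean()

for name, x in channels.items():
    theta = band_power(x, 4, 8)
    alpha = band_power(x, 8, 13)
    beta = band_power(x, 13, 30)
    print(f"{name}: engagement index = {beta / (alpha + theta):.2f}")
```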

Keywords: assessment, neurophysiology, monitoring, EEG

Procedia PDF Downloads 68
923 Effect of Sodium Chloride in the Recovery of Acetic Acid from Aqueous Solutions

Authors: Aidaoui Ahleme, Hasseine Abdelmalek

Abstract:

Acetic acid is one of the simplest and most widely used carboxylic acids, with many important chemical and industrial applications. Total worldwide production of acetic acid is about 6.5 million tonnes per year. A great deal of effort has been made to develop feasible and economical methods for the recovery of carboxylic acids. Among them, liquid-liquid extraction using aqueous two-phase systems (ATPS) has been demonstrated to be a highly efficient separation technique. Efficient separation and recovery of acetic acid from aqueous solutions is of great significance for industry and for environmentally sustainable development. Many research groups in different countries are working in this field, and some methods have been proposed in the literature. In this work, the effect of sodium chloride at different contents (5%, 10%, and 20%) on the liquid-liquid equilibrium data of the (water + acetic acid + DCM) system is investigated. The addition of salt to an aqueous solution introduces ionic forces that affect the liquid-liquid equilibrium and directly influence the distribution coefficient of the solute. From the experimental results, it can be concluded that when the percentage of salt in the aqueous solution increases, the equilibrium between the phases shifts in favor of the extract phase.
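
The distribution coefficient mentioned above is simply K = C(extract) / C(aqueous) at equilibrium. The snippet below illustrates how a salting-out trend would show up in K; the equilibrium concentrations are invented to illustrate the reported trend, not measured values.

```python
# Salting-out effect on the distribution coefficient K = C_extract / C_aqueous.
# All concentrations below are hypothetical.

salt_pct = [0, 5, 10, 20]                     # NaCl content of the aqueous phase
c_aq  = [0.60, 0.52, 0.45, 0.33]              # acetic acid in aqueous phase (mol/L)
c_org = [0.18, 0.22, 0.27, 0.36]              # acetic acid in DCM phase (mol/L)

for s, ca, co in zip(salt_pct, c_aq, c_org):
    print(f"{s:2d}% NaCl: K = {co / ca:.2f}")  # K rises as salt is added
```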

Keywords: acetic acid recovery, aqueous solution, salting-effect, sodium chloride

Procedia PDF Downloads 259
922 3D Simulation of Orthodontic Tooth Movement in the Presence of Horizontal Bone Loss

Authors: Azin Zargham, Gholamreza Rouhi, Allahyar Geramy

Abstract:

One of the most prevalent types of alveolar bone loss is horizontal bone loss (HBL), in which the bone height around teeth is reduced homogeneously. In the presence of HBL, the magnitudes of forces during orthodontic treatment should be altered according to the degree of HBL, so that the desired tooth movement can be obtained without further bone loss. In order to investigate the appropriate orthodontic force system in the presence of HBL, a three-dimensional numerical model capable of simulating orthodontic tooth movement was developed. The main goal of this research was to evaluate the effect of different degrees of HBL on long-term orthodontic tooth movement. Moreover, the effect of different force magnitudes on orthodontic tooth movement in the presence of HBL was studied. Five three-dimensional finite element models of a maxillary lateral incisor with 0 mm, 1.5 mm, 3 mm, 4.5 mm, and 6 mm of HBL were constructed. The long-term orthodontic tooth tipping movements were attained over a 4-week period in an iterative process, through external remodeling of the alveolar bone based on strains in the periodontal ligament as the mechanical stimulus for bone remodeling. To obtain the long-term orthodontic tooth movement, in each iteration the strains in the periodontal ligament under a 1-N tipping force were first calculated using finite element analysis. Then, bone remodeling and the subsequent tooth movement were computed in post-processing software using a custom-written program. Incisal edge, cervical, and apical displacements in the models with different alveolar bone heights (0, 1.5, 3, 4.5, and 6 mm of bone loss) in response to a 1-N tipping force were calculated. The maximum tooth displacement was found to be 2.65 mm, at the top of the crown of the model with 6 mm of bone loss. The minimum tooth displacement was 0.45 mm, at the cervical level of the model with normal bone support. The degrees of tipping in response to different tipping force magnitudes were also calculated for models with different degrees of HBL. The degree of tipping tooth movement increased as the force level was increased, and this increase was more prominent in the models with smaller degrees of HBL. Using the finite element method and bone remodeling theories, this study indicated that in the presence of HBL, under the same load, long-term orthodontic tooth movement will increase. The simulation also revealed that even though tooth movement increases with increasing force, this increase is only prominent in the models with smaller degrees of HBL; tooth models with greater degrees of HBL are less affected by the magnitude of the orthodontic force. Based on our results, the applied force magnitude must be reduced in proportion to the degree of HBL.
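
The iterative strain-remodel-move loop described above can be caricatured in a few lines. The scalar toy model below stands in for the 3D finite element analysis, and every constant in it is invented; it only shows the control flow of strain stimulus, remodeling step, and accumulated displacement.

```python
# Toy strain-driven external remodeling loop (not the study's FE model).

force = 1.0                 # N tipping force
support = 10.0              # mm of remaining alveolar support (less with HBL)
position = 0.0              # accumulated crown displacement (mm)
k_strain = 0.05             # strain per (N / mm of support), toy constant
k_remodel = 0.8             # displacement per unit stimulus, toy constant
lazy_zone = 0.002           # stimulus below this triggers no remodeling

for day in range(28):                        # 4-week simulation, daily steps
    strain = k_strain * force / support      # less support -> higher PDL strain
    stimulus = max(0.0, strain - lazy_zone)  # remodeling stimulus
    displacement = k_remodel * stimulus      # bone resorbs, tooth tips
    position += displacement

print(f"tip displacement after 4 weeks: {position:.2f} mm")
```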

Keywords: bone remodeling, finite element method, horizontal bone loss, orthodontic tooth movement

Procedia PDF Downloads 333
921 Reverse Engineering Genius: Through the Lens of World Language Collaborations

Authors: Cynthia Briggs, Kimberly Gerardi

Abstract:

Over the past six years, the authors have been working together on World Language Collaborations in the Middle School French Program at St. Luke's School in New Canaan, Connecticut, USA. Author 2 brings design expertise to the projects, and both teachers have utilized the fabrication lab, emerging technologies, and collaboration with students. Each year, Author 1 proposes a project scope, and her students are challenged to design and engineer a signature project. Both partners have refined the iterative process to ensure deeper learning and sustained student inquiry. The projects range from a 1:32 scale model of the Eiffel Tower that was CNC-routed to a fully functional jukebox that plays francophone music, lights up, and can hold up to one thousand songs, powered by a Raspberry Pi. The most recent project is a Fragrance Marketplace, culminating with a pop-up store for the entire community to discover. Each student will learn the history of fragrance and the chemistry behind making essential oils. Students then create a unique brand, marketing strategy, and concept for their signature fragrance. They are further tasked with using the industrial design process (bottling, packaging, and creating a brand name) to finalize their product for the public Marketplace. Sometimes these dynamic projects require maintenance and updates. For example, our wall-mounted, three-foot francophone clock is constantly changing. The most recent iteration uses ChatGPT to program the Arduino to reconcile the real-time clock shield and keep perfect time as each hour passes. The lights, motors, and sounds from the clock are authentic to each region, represented with laser-cut embellishments. Inspired by Michel Parmigiani, the history of Swiss watchmaking, and the precision of time instruments, we aim for perfection with each passing minute. The authors aim to share exemplary work that is possible with students of all ages. We implemented the reverse engineering process to focus on student outcomes and to refine our collaborative process. The products that our students create are prime examples of how the design engineering process is applicable across disciplines. The authors firmly believe that the past and present of world cultures inspire innovation.

Keywords: collaboration, design thinking, emerging technologies, world language

Procedia PDF Downloads 33
920 Recovery of Metals from Electronic Waste by Physical and Chemical Recycling Processes

Authors: Muammer Kaya

Abstract:

The main purpose of this article is to provide a comprehensive review of various physical and chemical processes for electronic waste (e-waste) recycling, and of their advantages and shortfalls towards achieving a cleaner process of waste utilization, with special attention to the extraction of metallic values. The current status and future perspectives of waste printed circuit board (PCB) recycling are described. E-waste characterization, dismantling/disassembly methods, liberation and classification processes, and composition determination techniques are covered. Manual selective dismantling and metal-nonmetal liberation at -150 µm with two-step crushing are found to be the best. After size reduction, mainly physical separation/concentration processes employing gravity, electrostatic, and magnetic separators, froth flotation, etc., which are commonly used in mineral processing, are critically reviewed here for the separation of metals and non-metals, along with useful utilizations of the non-metallic materials. The recovery of metals from e-waste material after physical separation through pyrometallurgical, hydrometallurgical, or biohydrometallurgical routes is also discussed, along with purification and refining, and some suitable flowsheets are given. It seems that the hydrometallurgical route will be a key player in the recovery of base and precious metals from e-waste. E-waste recycling will be a very important sector in the near future from both economic and environmental perspectives.

Keywords: e-waste, WEEE, recycling, metal recovery, hydrometallurgy, pyrometallurgy, biometallurgy

Procedia PDF Downloads 335
919 Classification Using Worldview-2 Imagery of Giant Panda Habitat in Wolong, Sichuan Province, China

Authors: Yunwei Tang, Linhai Jing, Hui Li, Qingjie Liu, Xiuxia Li, Qi Yan, Haifeng Ding

Abstract:

The giant panda (Ailuropoda melanoleuca) is an endangered species living mainly in central China, where bamboos act as the main food source of wild giant pandas. Knowledge of the spatial distribution of bamboo therefore becomes important for identifying giant panda habitat. There have been ongoing studies on mapping bamboo and other tree species using remote sensing. WorldView-2 (WV-2) is the first high-resolution commercial satellite with eight multi-spectral (MS) bands. Recent studies demonstrated that WV-2 imagery has high potential for the classification of tree species. Advanced classification techniques are important for utilising high-spatial-resolution imagery. It is generally agreed that object-based image analysis is a more desirable method than pixel-based analysis in processing high-spatial-resolution remotely sensed data. Classifiers that use spatial information combined with spectral information are known as contextual classifiers, and it is suggested that contextual classifiers can achieve greater accuracy than non-contextual ones. Thus, spatial correlation can be incorporated into classifiers to improve classification results. The study area is located in the Wuyipeng area in Wolong, Sichuan Province. The complex environment makes information extraction difficult, since bamboos are sparsely distributed, mixed with brushes, and covered by other trees. Extensive fieldwork in Wuyipeng was carried out twice. The first campaign, on 11th June 2014, aimed at sampling feature locations for geometric correction and collecting training samples for classification; the second, on 11th September 2014, served to test the classification results. In this study, a spectral separability analysis was first performed to select appropriate MS bands for classification; the reflectance analysis also provided information for expanding sample points under the circumstance of knowing only a few. Then, a spatially weighted object-based k-nearest neighbour (k-NN) classifier was applied to the selected MS bands to identify seven land cover types (bamboo, conifer, broadleaf, mixed forest, brush, bare land, and shadow), accounting for spatial correlation within classes using geostatistical modelling. The spatially weighted k-NN method was compared with three alternatives: the traditional k-NN classifier, the Support Vector Machine (SVM) method, and the Classification and Regression Tree (CART). Through field validation, it was shown that the classification result obtained using the spatially weighted k-NN method has the highest overall classification accuracy (77.61%) and Kappa coefficient (0.729); the producer’s accuracy and user’s accuracy reach 81.25% and 95.12% for the bamboo class, respectively, also higher than for the other methods. Photos of tree crowns were taken at sample locations using a fisheye camera so that the canopy density could be estimated. It was found that it is difficult to identify bamboo in areas with a large canopy density (over 0.7); it is possible to extract bamboo in areas with a medium canopy density (from 0.2 to 0.7) and in sparse forest (canopy density less than 0.2). In summary, this study explores the ability of WV-2 imagery for bamboo extraction in a mountainous region of Sichuan. The study successfully identified the bamboo distribution, providing supporting knowledge for assessing giant panda habitat.
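
The sketch below shows one way votes in a k-NN classifier can be weighted by spatial proximity as well as spectral similarity. The paper derives its spatial weights from geostatistical modelling; here a simple exponential distance decay stands in for that, and all the training data are synthetic.

```python
# Toy spatially weighted k-NN: votes weighted by spectral similarity and
# spatial proximity (exponential decay standing in for geostatistical weights).

import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 4))                 # training spectra (4 MS bands)
xy = rng.uniform(0, 1000, size=(100, 2))      # training sample coordinates (m)
y = rng.integers(0, 3, size=100)              # three land-cover classes

def classify(x, pos, k=7, spatial_range=300.0):
    spec_d = np.linalg.norm(X - x, axis=1)            # spectral distance
    nn = np.argsort(spec_d)[:k]                       # k nearest in feature space
    geo_d = np.linalg.norm(xy[nn] - pos, axis=1)      # spatial distance
    w = 1.0 / (spec_d[nn] + 1e-9) * np.exp(-geo_d / spatial_range)
    votes = np.zeros(3)
    for cls, wi in zip(y[nn], w):
        votes[cls] += wi                              # weighted class votes
    return votes.argmax()

print("predicted class:", classify(rng.normal(size=4), np.array([500.0, 500.0])))
```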

Keywords: bamboo mapping, classification, geostatistics, k-NN, worldview-2

Procedia PDF Downloads 301
918 Gait Biometric for Person Re-Identification

Authors: Lavanya Srinivasan

Abstract:

Biometric identification identifies unique features of a person, such as fingerprints, iris, ear shape, and voice, which require the subject's permission and physical contact. Gait biometrics identify the unique gait of a person by extracting motion features. The main advantage of gait biometrics is the ability to identify a person at a distance, without any physical contact. In this work, gait biometrics are used for person re-identification. A person walking naturally is compared with the same person walking with a bag, coat, or case, recorded using longwave infrared, shortwave infrared, mediumwave infrared, and visible cameras. The videos are recorded in rural and in urban environments. The pre-processing pipeline includes human detection using YOLO, background subtraction, silhouette extraction, and synthesis of the Gait Entropy Image by averaging the silhouettes. The motion features are extracted from the Gait Entropy Image. The extracted features are dimensionality-reduced by principal component analysis and recognised using different classifiers. The comparative results of the different classifiers show that linear discriminant analysis outperforms the other classifiers, with 95.8% for visible imagery in the rural dataset and 94.8% for longwave infrared in the urban dataset.
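
Gait Entropy Image synthesis as described above reduces to averaging the binary silhouettes over a gait cycle and taking the per-pixel Shannon entropy of that average, as in the sketch below; random arrays stand in for real silhouette extractions.

```python
# Gait Entropy Image from a stack of binary silhouettes.

import numpy as np

rng = np.random.default_rng(3)
silhouettes = rng.random((20, 128, 88)) > 0.5   # stand-in binary silhouettes

p = silhouettes.mean(axis=0)                    # per-pixel foreground probability
eps = 1e-12                                     # avoid log(0)
gait_entropy_image = -(p * np.log2(p + eps) + (1 - p) * np.log2(1 - p + eps))

# Static pixels (always fore/background) score ~0; pixels that change across
# the cycle (arms, legs) score near 1, which is what makes the feature robust
# to carried items and clothing.
print(gait_entropy_image.shape, gait_entropy_image.max())
```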

Keywords: biometric, gait, silhouettes, YOLO

Procedia PDF Downloads 165
917 On the Implementation of the Pulse Coupled Neural Network (PCNN) in the Vision of Cognitive Systems

Authors: Hala Zaghloul, Taymoor Nazmy

Abstract:

One of the great challenges of the 21st century is to build a robot that can perceive and act within its environment and communicate with people, while also exhibiting the cognitive capabilities that lead to performance like that of people. The Pulse Coupled Neural Network (PCNN) is a relatively new ANN model derived from a mammalian neural model, with great potential in the areas of image processing, target recognition, feature extraction, speech recognition, combinatorial optimization, and compressed encoding. The PCNN has unique features among neural network types, which make it a candidate to be an important approach for perception in cognitive systems. This work shows and emphasizes the potential of the PCNN to perform different tasks related to image processing. The main obstacle preventing the direct implementation of this technique is the need to find a way to control the PCNN parameters so that it performs a specific task. This paper evaluates the performance of the standard PCNN model for processing images with different properties, selects the important parameters that give significant results, and discusses approaches for adapting the PCNN parameters to perform a specific task.
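
For concreteness, below is a sketch of one widely used formulation of the standard PCNN iteration (feeding input F, linking input L, internal activity U = F(1 + βL), a decaying dynamic threshold, and a binary pulse output). The decay constants and weights are typical illustrative values; tuning them per task is exactly the difficulty discussed above.

```python
# Standard PCNN iteration on an image S; firing epochs act as a segmentation map.

import numpy as np
from scipy.ndimage import convolve

def pcnn(S, n_iter=10, alpha_f=0.1, alpha_l=1.0, alpha_t=0.3,
         beta=0.2, V_f=0.5, V_l=0.2, V_t=20.0):
    W = np.array([[0.5, 1.0, 0.5], [1.0, 0.0, 1.0], [0.5, 1.0, 0.5]])
    F = np.zeros_like(S); L = np.zeros_like(S)
    Y = np.zeros_like(S); T = np.ones_like(S) * V_t
    fire_time = np.zeros_like(S)
    for n in range(1, n_iter + 1):
        F = np.exp(-alpha_f) * F + V_f * convolve(Y, W) + S   # feeding input
        L = np.exp(-alpha_l) * L + V_l * convolve(Y, W)       # linking input
        U = F * (1.0 + beta * L)                              # internal activity
        Y = (U > T).astype(float)                             # pulse output
        T = np.exp(-alpha_t) * T + V_t * Y                    # threshold decay/reset
        fire_time[(fire_time == 0) & (Y == 1)] = n            # first firing epoch
    return fire_time

img = np.random.default_rng(4).random((32, 32))
print(pcnn(img))   # pixels with similar intensity tend to fire at the same epoch
```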

Keywords: cognitive system, image processing, segmentation, PCNN kernels

Procedia PDF Downloads 265
916 Building an Ontology for Researchers: An Application of Topic Maps and Social Information

Authors: Yu Hung Chiang, Hei Chia Wang

Abstract:

In the academic area, it is important for researchers to find a proper research domain. Many researchers refer to conference issues to find interesting or new topics. Furthermore, conference issues can help researchers realize current research trends in their field and learn about cutting-edge developments in their specialty. However, online published conference information is widely distributed and not easy to summarize. Many researchers use the search engines of journals or conference issues to filter information in order to get what they want. However, such search engines have limitations; in particular, researchers cannot find associated topics, which may be useful information for them. Hence, Knowledge Management (KM) could be a way to resolve these issues. In KM, ontology is widely adopted, but most existing ontology construction methods do not consider social information among target users. To be effective in academic KM, this study proposes a method of constructing research Topic Maps using the Open Directory Project (ODP) and Social Information Processing (SIP). By capturing social information from conference websites, i.e., co-authorship or collaborator information, research topics can be associated among related researchers. Finally, the experiments show that the Topic Maps successfully help researchers find the information they need more easily and quickly, as well as construct associations between research topics.

Keywords: knowledge management, topic map, social information processing, ontology extraction

Procedia PDF Downloads 277
915 Determination of the Optimum Size of Building Stone Blocks: Case Study of Delichai Travertine Mine

Authors: Hesam Sedaghat Nejad, Navid Hosseini, Arash Nikvar Hassani

Abstract:

Determination of the optimum block size with high profitability is one of the significant parameters in the design of building stone mines. The aim of this study was to determine the optimum dimensions of building stone blocks in the Delichai travertine mine of Damavand in Tehran province, by combining the parameters proven effective in determining optimum dimensions in building stone, such as the spacing of joints and gaps and the constraints of the extraction tools, with the help of modeling in Gemcom software. To this end, following simulation of the topography of the mine, the block model was prepared, and then, in order to use joint spacing and discontinuities as a limiting factor, the existing joint set was added to the model. Since only one almost horizontal joint set with a slope of 5 degrees was available, this factor was effective only in determining the optimum height of the block; thus, to determine the optimum longitudinal and transverse dimensions of the extracted block, the power of the loader available in the mine was considered as the secondary limiting factor. According to the aforementioned factors, the optimal block size in this mine was determined as 3.4×4×7 m.

Keywords: building stone, optimum block size, Delichai travertine mine, loader power

Procedia PDF Downloads 351
914 A Study of Non-Coplanar Imaging Technique in INER Prototype Tomosynthesis System

Authors: Chia-Yu Lin, Yu-Hsiang Shen, Cing-Ciao Ke, Chia-Hao Chang, Fan-Pin Tseng, Yu-Ching Ni, Sheng-Pin Tseng

Abstract:

Tomosynthesis is an imaging technique that generates a 3D image by scanning over a limited angular range. It can provide more depth information than a traditional 2D X-ray single projection, and the radiation dose in tomosynthesis is less than in computed tomography (CT). Because of the limited angular scanning range, many image properties depend on the scanning direction. Therefore, a non-coplanar imaging technique was developed to improve image quality over traditional tomosynthesis. The purpose of this study was to establish the non-coplanar imaging technique for a tomosynthesis system and to evaluate this technique using the reconstructed images. The INER prototype tomosynthesis system contains an X-ray tube, a flat panel detector, and a motion machine. This system can move the X-ray tube in multiple directions during acquisition. In this study, we investigated three different imaging techniques: 2D X-ray single projection, traditional tomosynthesis, and non-coplanar tomosynthesis. An anthropomorphic chest phantom was used to evaluate the image quality. It contained lesions of three different sizes (3 mm, 5 mm, and 8 mm diameter). The traditional tomosynthesis acquired 61 projections over a 30-degree angular range in one scanning direction. The non-coplanar tomosynthesis acquired 62 projections over a 30-degree angular range in two scanning directions. A 3D image was reconstructed by an iterative image reconstruction algorithm (ML-EM). Our qualitative method was to evaluate artifacts in the reconstructed tomosynthesis images. The quantitative method was to calculate a peak-to-valley ratio (PVR), i.e., the intensity ratio of the lesion to the background; we used PVRs to evaluate the contrast of the lesions. The qualitative results showed that in the reconstructed image of the non-coplanar scan, the anatomic structures of the chest and the lesions could be identified clearly, and no significant scanning-direction-dependent artifacts could be discovered. In the 2D X-ray single projection, anatomic structures overlapped and lesions could not be discovered. In the traditional tomosynthesis image, anatomic structures and lesions could be identified clearly, but there were many scanning-direction-dependent artifacts. The quantitative results show that the PVRs did not differ significantly between non-coplanar and traditional tomosynthesis; the PVRs of the non-coplanar technique were slightly higher than those of the traditional technique for the 5 mm and 8 mm lesions. Thus, in non-coplanar tomosynthesis, scanning-direction-dependent artifacts can be reduced without decreasing the PVRs of the lesions, and the reconstructed image is more isotropically uniform than in traditional tomosynthesis. In the future, scan strategy and scan time will be the challenges of the non-coplanar imaging technique.
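
The PVR figure of merit used above is a simple ratio of mean lesion intensity to mean nearby background intensity; the sketch below computes it on a synthetic intensity profile.

```python
# Peak-to-valley ratio (PVR) on a synthetic 1D intensity profile.

import numpy as np

profile = np.array([10, 11, 10, 12, 30, 34, 31, 12, 11, 10], dtype=float)
lesion = profile[4:7]                          # voxels across the lesion (peak)
background = np.r_[profile[:4], profile[7:]]   # surrounding voxels (valley)

pvr = lesion.mean() / background.mean()
print(f"PVR = {pvr:.2f}")   # higher PVR -> better lesion contrast
```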

Keywords: image reconstruction, non-coplanar imaging technique, tomosynthesis, X-ray imaging

Procedia PDF Downloads 355
913 Depolymerization of Lignin in Sugarcane Bagasse by Hydrothermal Liquefaction to Optimize Catechol Formation

Authors: Nirmala Deenadayalu, Kwanele B. Mazibuko, Lethiwe D. Mthembu

Abstract:

Sugarcane bagasse is the residue obtained after the extraction of sugar from sugarcane. The main aim of this work was to produce catechol from sugarcane bagasse. The optimization of catechol production was investigated using a Box-Behnken design of experiments. The sugarcane bagasse was heated in a Parr reactor at a set temperature. The reactions were carried out at different temperatures (100-250 °C), catalyst loadings (1%-10% KOH (m/v)), and reaction times (60-240 min), at 17 bar pressure. The solid and liquid fractions were then separated by vacuum filtration. The liquid fraction was analyzed for catechol using high-pressure liquid chromatography (HPLC) and characterized for functional groups using Fourier transform infrared spectroscopy (FTIR). The optimized condition for catechol production was 175 °C, 240 min, and 10% KOH, with a catechol yield of 79.11 ppm. Since this optimum lay at the maximum tested time (240 min) and catalyst loading (10% KOH), a further series of experiments was conducted at 175 °C, 260 min, and 20% KOH; it yielded 2.46 ppm of catechol, a large reduction in the catechol produced. The HPLC peak for catechol was obtained at 2.5 min for both the standards and the samples. The FTIR peak at 1750 cm⁻¹ was due to the C=C vibration band of the aromatic ring of the catechol present in both the standard and the samples. The peak at 3325 cm⁻¹ was due to the hydrogen-bonded phenolic OH vibration bands of the catechol. An ANOVA was also performed on the experimental data to identify the factors that most affected the amount of catechol produced.
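
A three-factor Box-Behnken design of the kind used above consists of the midpoints of the edges of the factor cube plus replicated centre points. The sketch below lays the design out in coded units and maps it onto the stated factor ranges; the number of centre replicates is an assumption.

```python
# Box-Behnken design for three factors in coded units (-1, 0, +1).

from itertools import combinations, product

factors = {"temp_C": (100, 250), "KOH_pct": (1, 10), "time_min": (60, 240)}
names = list(factors)

def decode(name, level):                      # coded level -> real value
    lo, hi = factors[name]
    return {-1: lo, 0: (lo + hi) / 2, 1: hi}[level]

runs = []
for i, j in combinations(range(3), 2):        # each pair of factors at +/-1
    for a, b in product((-1, 1), repeat=2):
        coded = [0, 0, 0]
        coded[i], coded[j] = a, b
        runs.append(coded)
runs += [[0, 0, 0]] * 3                       # centre point replicates

for run in runs:
    print({n: decode(n, l) for n, l in zip(names, run)})
print(f"{len(runs)} runs in total")           # 12 edge midpoints + 3 centres
```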

Keywords: catechol, sugarcane bagasse, lignin, hydrothermal liquefaction

Procedia PDF Downloads 87
912 A Double Ended AC Series Arc Fault Location Algorithm Based on Currents Estimation and a Fault Map Trace Generation

Authors: Edwin Calderon-Mendoza, Patrick Schweitzer, Serge Weber

Abstract:

Series arc faults appear frequently and unpredictably in low-voltage distribution systems. Many methods have been developed to detect this type of fault, and commercial protection devices such as AFCIs (arc fault circuit interrupters) have been used successfully in electrical networks to prevent damage and catastrophic incidents such as fires. However, these devices do not allow series arc faults to be located on the line while it is operating. This paper presents a location algorithm for series arc faults in a low-voltage indoor power line in an AC 230 V, 50 Hz home network. The method is validated through simulations using MATLAB. The fault location method uses the electrical parameters (resistance, inductance, capacitance, and conductance) of a 49 m indoor power line. The mathematical model of a series arc fault is based on the analysis of the V-I characteristics of the arc and consists basically of two antiparallel diodes and DC voltage sources. In the first step, the arc fault model is inserted at several different positions along the line, which is modeled using lumped parameters. At both ends of the line, currents and voltages are recorded for each arc fault generated at a different distance. In the second step, a fault map trace is created using signature coefficients obtained from the Kirchhoff equations, which allow a virtual decoupling of the line's mutual capacitance. Each signature coefficient, obtained from the subtraction of estimated currents, is calculated taking into account the discrete Fourier transform of the currents and voltages and also the fault distance value. In the third step, the same procedure used to calculate the signature coefficients is employed, but this time considering hypothetical fault distances at which the fault could appear; in this step, the fault distance is unknown. The iterative calculation of the Kirchhoff equations over stepped variations of the fault distance yields a curve with a linear trend. Finally, the fault distance is estimated at the intersection of the two curves obtained in steps 2 and 3. The series arc fault model is validated by comparing the currents obtained from simulation with real recorded currents. The model of the complete circuit is obtained for a 49 m line with a resistive load, and 11 different arc fault positions are considered for the map trace generation. By carrying out the complete simulation, the performance of the method and the perspectives of the work will be presented.
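
As a toy illustration of the final localisation step only: the curve from the recorded currents (step 2) and the curve over hypothetical distances (step 3) are both near-linear in distance, and their intersection estimates the fault position. The coefficients below are invented; the real ones come from the Kirchhoff/FFT signature computation on the 49 m line model.

```python
# Toy intersection of the measured-fault value with the hypothetical-distance
# curve; all coefficients are placeholders.

import numpy as np

d = np.arange(0.0, 49.0, 0.5)                     # hypothetical fault distances (m)
true_fault = 18.0
curve_measured = 2.0 * true_fault + 5.0 + 0 * d   # constant: from recorded currents
curve_hypoth = 2.0 * d + 5.0                      # linear trend over distances

idx = np.argmin(np.abs(curve_hypoth - curve_measured))
print(f"estimated fault distance: {d[idx]:.1f} m")   # ~18 m
```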

Keywords: indoor power line, fault location, fault map trace, series arc fault

Procedia PDF Downloads 127
911 Noninvasive Brain-Machine Interface to Control Both Mecha TE Robotic Hands Using Emotiv EEG Neuroheadset

Authors: Adrienne Kline, Jaydip Desai

Abstract:

Electroencephalography (EEG) is a noninvasive technique that registers signals originating from the firing of neurons in the brain. The Emotiv EEG Neuroheadset is a consumer product comprising 14 EEG channels and was used to record the reactions of the neurons within the brain to two forms of stimuli in 10 participants. These stimuli were presented in auditory and visual formats and provided directions of ‘right’ or ‘left.’ Participants were instructed to raise their right or left arm in accordance with the instruction given. A scenario in OpenViBE was generated to stimulate the participants while recording their data. In OpenViBE, the Graz Motor BCI Stimulator algorithm was configured to govern the duration and number of visual stimuli. Utilizing EEGLAB under cross-platform MATLAB®, the electrodes most stimulated during the study were identified. Data outputs from EEGLAB were analyzed using IBM SPSS Statistics® Version 20. This aided in determining the electrodes to use in the development of a brain-machine interface (BMI) using real-time EEG signals from the Emotiv EEG Neuroheadset. Signal processing and feature extraction were accomplished via the Simulink® signal processing toolbox. An Arduino™ Duemilanove microcontroller was used to link the Emotiv EEG Neuroheadset to the right and left Mecha TE™ Hands.

Keywords: brain-machine interface, EEGLAB, emotiv EEG neuroheadset, OpenViBE, simulink

Procedia PDF Downloads 484
910 MB-SLAM: A SLAM Framework for Construction Monitoring

Authors: Mojtaba Noghabaei, Khashayar Asadi, Kevin Han

Abstract:

Simultaneous Localization and Mapping (SLAM) technology has recently attracted the attention of construction companies for real-time performance monitoring. To use SLAM effectively for construction performance monitoring, SLAM results should be registered to a Building Information Model (BIM). Registering SLAM and BIM can provide essential insights for construction managers to identify construction deficiencies in real time and ultimately reduce rework. Registering SLAM to BIM in real time can also boost the accuracy of SLAM, since SLAM can then use features from both images and 3D models. However, registering SLAM with the BIM in real time is a challenge. In this study, a novel SLAM platform named Model-Based SLAM (MB-SLAM) is proposed, which not only provides automated registration of SLAM and BIM but also improves the localization accuracy of the SLAM system in real time. This framework improves the accuracy of SLAM by aligning perspective features such as depth, vanishing points, and vanishing lines from the BIM to the SLAM system. It extracts depth features from a monocular camera's images and improves the localization accuracy of the SLAM system through a real-time iterative process. Initially, SLAM is used to calculate a rough camera pose for each keyframe. In the next step, each keyframe of the SLAM video sequence is registered to the BIM in real time by aligning the keyframe's perspective with the equivalent BIM view. The alignment method is based on perspective detection, which estimates vanishing lines and points by detecting straight edges in the images. This process generates the associated BIM views from the keyframes' views. The calculated poses are then refined by a real-time gradient-descent-based iterative method. Two case studies were presented to validate MB-SLAM. The validation process demonstrated promising results, accurately registered SLAM to BIM, and significantly improved the SLAM localization accuracy. MB-SLAM also achieved real-time performance in both indoor and outdoor environments. The proposed method can fully automate past studies and generate as-built models that are aligned with BIM. The main contribution of this study is a SLAM framework for both research and commercial usage, which aims to monitor construction progress and performance in a unified framework. Through this platform, users can improve the accuracy of the SLAM by providing a rough 3D model of the environment, and MB-SLAM further boosts the practical application of SLAM.

Keywords: perspective alignment, progress monitoring, SLAM, stereo matching

Procedia PDF Downloads 210
909 Privacy-Preserving Clinical Decision Support System via the C5.0 Algorithm

Authors: Swati Kishor Zode, Rahul Ambekar

Abstract:

Data mining is the extraction of interesting patterns or knowledge from enormous amounts of data, with decisions made according to the relevant information extracted. Recently, with the explosive development of the internet and of data storage and processing techniques, privacy preservation has become one of the major concerns in data mining, and various techniques and methods have been developed for privacy-preserving data mining. In the setting of a clinical decision support system, the decision to diagnose the patient is made on the basis of data extracted from remote servers via the Internet. In this paper, the fundamental idea is to increase the accuracy of the decision support system for multiple diseases while also protecting patient information during communication between the clinician (client) side and the server side. A privacy-preserving protocol for a clinical decision support network is proposed so that patient information always remains encrypted during the diagnosis process while the accuracy is maintained. To enhance the accuracy of the decision support system for various diseases, C5.0 classifiers are used, and to preserve privacy, a homomorphic encryption algorithm, the Paillier cryptosystem, is utilized.
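
The property the protocol relies on is Paillier's additive homomorphism: multiplying ciphertexts adds the underlying plaintexts, so a server can aggregate encrypted values it cannot read. Below is a self-contained toy sketch of the cryptosystem with deliberately tiny primes; a real deployment would use 2048-bit moduli and a vetted library.

```python
# Toy Paillier cryptosystem demonstrating the additive homomorphism.

import math, random

p, q = 293, 433                        # toy primes (never use in practice)
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
g = n + 1                              # standard simple choice of generator

def L(u): return (u - 1) // n
mu = pow(L(pow(g, lam, n2)), -1, n)    # precomputed decryption constant

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 42, 17
c = (encrypt(a) * encrypt(b)) % n2     # multiply ciphertexts ...
print(decrypt(c))                      # ... to add plaintexts: prints 59
```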

Keywords: classification, homomorphic encryption, clinical decision support, privacy

Procedia PDF Downloads 323