Search results for: emotion detection in the text
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4993

973 Using Predictive Analytics to Identify First-Year Engineering Students at Risk of Failing

Authors: Beng Yew Low, Cher Liang Cha, Cheng Yong Teoh

Abstract:

Due to a lack of continual assessment or grade-related data, identifying first-year engineering students in a polytechnic education at risk of failing is challenging. Our experience over the years tells us that there is no strong correlation between having good entry grades in Mathematics and the Sciences and excelling in hardcore engineering subjects. Hence, identifying students at risk of failure cannot be based on entry grades in Mathematics and the Sciences alone. These factors compound the difficulty of early identification and intervention. This paper describes the development of a predictive analytics model for the early detection of students at risk of failing and evaluates its effectiveness. Data from continual assessments conducted in term one, supplemented by data on student psychological profiles such as interests and study habits, were used. Three classification techniques, namely Logistic Regression, K Nearest Neighbour, and Random Forest, were used in our predictive model. Based on our findings, Random Forest was the strongest predictor, with an Area Under the Curve (AUC) value of 0.994; correspondingly, its Accuracy, Precision, Recall, and F-Score were also the highest among the three classifiers. Using this Random Forest classification technique, students at risk of failure could be identified at the end of term one and assigned to a Learning Support Programme at the beginning of term two. This paper presents our findings and proposes further improvements that can be made to the model.
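The headline metric above, the Area Under the ROC Curve (AUC), can be computed directly from classifier scores as the probability that a randomly chosen at-risk student is scored higher than a randomly chosen not-at-risk one. A minimal sketch with invented scores (not the study's data):

```python
# Pairwise definition of AUC: the fraction of (positive, negative) pairs in
# which the positive example receives the higher score (ties count half).
# Labels and scores below are illustrative, not from the study.
def auc(labels, scores):
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# 1 = at risk of failing, 0 = not at risk; scores from any classifier
labels = [1, 1, 0, 0]
scores = [0.9, 0.4, 0.5, 0.2]
print(auc(labels, scores))  # → 0.75
```

An AUC of 0.994, as reported for Random Forest, means the classifier ranks an at-risk student above a not-at-risk student in 99.4% of such pairs.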

Keywords: continual assessment, predictive analytics, random forest, student psychological profile

Procedia PDF Downloads 128
972 Grating Assisted Surface Plasmon Resonance Sensor for Monitoring of Hazardous Toxic Chemicals and Gases in Underground Mines

Authors: Sanjeev Kumar Raghuwanshi, Yadvendra Singh

Abstract:

The objective of this paper is to develop and optimize a Fiber Bragg Grating (FBG) based Surface Plasmon Resonance (SPR) sensor for monitoring hazardous toxic chemicals and gases in underground mines or any industrial area. A fully cladded telecommunication-standard FBG is proposed to produce surface plasmon resonance. A thin gold/silver film of a few nanometres (subject to optimization) is proposed to be applied over the FBG sensing head using the e-beam deposition method. Sensitivity enhancement of the sensor will be achieved by adding a composite nanostructured Graphene Oxide (GO) sensing layer using the spin coating method. Both sensor configurations are expected to demonstrate high responsiveness to changes in resonance wavelength. The GO-enhanced sensor may show a many-fold increase in sensitivity compared to the gold-coated traditional fibre optic sensor. Our work focuses on optimizing the GO multilayer structure and developing fibre coating techniques that will serve well for sensitive and multifunctional detection of hazardous chemicals. This research proposal shows great potential for the future development of optical fibre sensors as highly sensitive chemical sensors, using readily available components such as Bragg gratings, in areas such as environmental sensing.

Keywords: surface plasmon resonance, fibre Bragg grating, sensitivity, toxic gases, MATRIX method

Procedia PDF Downloads 263
971 Study of Cathodic Protection for Trunk Pipeline of Al-Garraf Oil Field

Authors: Maysoon Khalil Askar

Abstract:

The delineation of possible areas of corrosion along the external face of an underground oil pipeline in the trunk line of the Al-Garraf oil field was investigated using the horizontal electrical resistivity profiling technique, together with a study of the contribution of pH, soil moisture content, and the presence of chlorides, sulphates and total dissolved salts in soil and water. The test sites represent the physical and chemical properties of the soils. The hydrogen-ion concentration of soil and groundwater ranges from 7.2 to 9.6, and the resistivity values of the soil along the pipeline, obtained using the YH302B model resistivity meter, lie between 1588 and 720 Ohm-cm. The chloride concentration in soil and groundwater is high (more than 1000 ppm), the total soluble salt content is more than 5000 ppm, and sulphate ranges from 0.17% to 0.98% in soil and exceeds 600 ppm in groundwater. The soil is poorly aerated, its texture is fine (clay and silt), and the water content is high (the groundwater is close to the surface). Since chloride and sulphate levels are high in the soil and groundwater, the total soluble salt content is high in the groundwater, and the soil electrical resistivity is low, the soil is very corrosive and there is a possibility of pipeline failure. The methods applied in this study are quick, economic and efficient for detecting corrosion-prone areas along buried pipelines which need to be protected. Routine electrical geophysical investigations along buried oil pipelines should be undertaken for the early detection and prevention of pipeline failure, with its attendant environmental, human and economic consequences.

Keywords: soil resistivity, corrosion, cathodic protection, chloride concentration, water content

Procedia PDF Downloads 433
970 Molecular Profiles of Microbial Etiologic Agents Forming Biofilm in Urinary Tract Infections of Pregnant Women by RT-PCR Assay

Authors: B. Nageshwar Rao

Abstract:

Urinary tract infection (UTI) represents the most commonly acquired bacterial infection worldwide, with substantial morbidity, mortality, and economic burden. The objective of the study is to characterize the microbial profiles of uropathogens in the obstetric population by RT-PCR. Study design: An observational cross-sectional study was performed at a single tertiary health care hospital among 50 pregnant women with UTIs, including asymptomatic and symptomatic patients attending the outpatient and inpatient departments of Obstetrics and Gynaecology. Methods: Serotyping and gene detection of various uropathogens were carried out using RT-PCR. Pulsed-field gel electrophoresis was used to determine the various genetic profiles. Results: The present study shows that the CsgD protein, involved in biofilm formation in Escherichia coli, and the VIM1 and IMP1 genes for Klebsiella were identified using the RT-PCR method. Our results showed that the prevalence of the VIM1 and IMP1 genes and the CsgD protein in E. coli had a significant relationship with strong biofilm formation, which may be due to the prevalence of these specific genes. Finally, the RT-PCR genetic identification results for both bacteria were correlated with each other, and we concluded that the above uropathogens were the common biofilm-producing isolates in pregnant women suffering from urinary tract infection in our hospital observational study.

Keywords: biofilms, Klebsiella, E.coli, urinary tract infection

Procedia PDF Downloads 113
969 A Sui Generis Technique to Detect Pathogens in Post-Partum Breast Milk Using Image Processing Techniques

Authors: Yogesh Karunakar, Praveen Kandaswamy

Abstract:

Mother’s milk provides the most superior source of nutrition to a child; there is no substitute for it. Postpartum secretions like breast milk can be analyzed on the go, testing for the presence of any harmful pathogen before a mother feeds the child or donates the milk to a milk bank. Since breastfeeding is one of the main routes for transmission of diseases to the newborn, it is mandatory to test the secretions. In this paper, we describe the detection of pathogens like E. coli, Human Immunodeficiency Virus (HIV), Hepatitis B (HBV), Hepatitis C (HCV), Cytomegalovirus (CMV), Zika and Ebola virus through an innovative method in which we are developing a unique chip for testing the mother’s milk sample. The chip will contain an antibody specific to the target pathogen that will show a color change if enough pathogens are present in the fluid to be considered dangerous. A smartphone camera then acquires an image of the strip, and using various image processing techniques we detect the color development due to the antigen-antibody interaction within 5 minutes, thereby adding no delay before the newborn is fed or before the milk is collected for the milk bank. If the test for the target pathogen comes out positive, the health care provider can give adequate treatment to bring down the number of pathogens. This will reduce the postpartum mortality and morbidity which arises from feeding infectious breast milk to one’s own child.
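The image-processing step described above amounts to detecting a colour shift on the strip relative to a negative control. A hypothetical sketch of that read-out (the threshold, colours and region of interest are illustrative stand-ins, not the paper's calibration):

```python
import numpy as np

# Compare the mean colour of the test-strip region against a negative-control
# reference and flag a positive result when the shift exceeds a threshold.
# The threshold and colour values here are assumptions for illustration.
def strip_positive(image, reference_rgb, threshold=30.0):
    """image: HxWx3 array of the strip ROI; reference_rgb: negative control."""
    mean_rgb = image.reshape(-1, 3).mean(axis=0)
    shift = np.linalg.norm(mean_rgb - np.asarray(reference_rgb, dtype=float))
    return shift > threshold, shift

# A strip that turned from near-white toward red (antigen-antibody reaction):
strip = np.full((20, 60, 3), (200, 120, 120), dtype=float)
positive, shift = strip_positive(strip, reference_rgb=(230, 230, 230))
print(bool(positive))  # → True
```

A real pipeline would also correct for lighting (e.g. white-balance against a reference patch) before comparing colours.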

Keywords: postpartum, fluids, camera, HIV, HCV, CMV, Zika, Ebola, smart-phones, breast milk, pathogens, image processing techniques

Procedia PDF Downloads 218
968 Using Cyclic Structure to Improve Inference on Network Community Structure

Authors: Behnaz Moradijamei, Michael Higgins

Abstract:

Identifying community structure is a critical task in analyzing social media data sets often modeled by networks. Statistical models such as the stochastic block model have proven to explain the structure of communities in real-world network data. In this work, we develop a goodness-of-fit test to examine the existence of community structure by using a distinguishing property of networks: cyclic structures are more prevalent within communities than across them. To better understand how communities are shaped by the cyclic structure of the network, rather than just the number of edges, we introduce a novel method for deciding on the existence of communities. We exploit these structures by applying the renewal non-backtracking random walk (RNBRW) to the existing goodness-of-fit test. RNBRW is an important variant of the random walk in which the walk is prohibited from returning to a node in exactly two steps and terminates and restarts once it completes a cycle. We investigate the use of RNBRW to improve the performance of existing goodness-of-fit tests for community detection algorithms based on the spectral properties of the adjacency matrix. Our proposed test of community structure is based on the probability distribution of the eigenvalues of the normalized retracing-probability matrix derived from RNBRW. We make use of asymptotic results on this distribution when there is no community structure, i.e., the asymptotic distribution under the null hypothesis. Moreover, we provide a theoretical foundation for our statistic by obtaining the true mean and a tight lower bound for the variance of the RNBRW edge weights.
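A toy sketch of the RNBRW edge-weighting idea, under our reading of the description above: the walk never immediately backtracks, and when it revisits a node (completing a cycle) the cycle-closing edge is credited and the walk restarts. The graph and walk count are invented examples; the actual method feeds such retracing counts into the normalized retracing-probability matrix.

```python
import random

# Renewal non-backtracking random walk (RNBRW) edge weighting, sketched on a
# toy graph: two triangles joined by a bridge. Intra-community (cycle) edges
# accumulate weight; the bridge edge can never close a cycle.
def rnbrw_weights(adj, n_walks=2000, seed=0):
    rng = random.Random(seed)
    nodes = sorted(adj)
    weights = {}
    for _ in range(n_walks):
        prev, cur = None, rng.choice(nodes)
        visited = {cur}
        while True:
            choices = [v for v in adj[cur] if v != prev]  # no backtracking
            if not choices:
                break  # dead end: abandon this walk
            nxt = rng.choice(choices)
            if nxt in visited:  # cycle completed: credit the closing edge
                e = (min(cur, nxt), max(cur, nxt))
                weights[e] = weights.get(e, 0) + 1
                break
            visited.add(nxt)
            prev, cur = cur, nxt
    return weights

adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
w = rnbrw_weights(adj)
print(w.get((2, 3), 0))  # bridge edge: → 0
```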

Keywords: hypothesis testing, RNBRW, network inference, community structure

Procedia PDF Downloads 144
967 Non-Destructive Testing of Selective Laser Melting Products

Authors: Luca Collini, Michele Antolotti, Diego Schiavi

Abstract:

At present, complex geometries, shrinking production times, rapidly increasing demand, and high quality-standard requirements make non-destructive (ND) control of additively manufactured components an indispensable means of verification. On the other hand, a technology gap and the lack of standards regulating the methods and the acceptance criteria make the NDT of these components a stimulating field still to be fully explored. To date, penetrant testing, acoustic wave, tomography, radiography, and semi-automated ultrasound methods have been tested on metal-powder-based products. External defects, distortion, surface porosity, roughness, texture, internal porosity, and inclusions are the typical defects in the focus of testing. Detection of density and layer compactness has also been tried on stainless steels by the ultrasonic scattering method. In this work, the authors present and discuss radiographic and ultrasound ND testing of additively manufactured Ti₆Al₄V and Inconel parts obtained by the selective laser melting (SLM) technology. In order to test the possibilities offered by the radiographic method, both X-rays and γ-rays are tried on a set of specifically designed specimens produced by SLM. The specimens contain a family of defects representing the most commonly found types, such as cracks and lack of fusion. The tests are also applied to real parts of various complexity and thickness. A set of practical indications and acceptance criteria is finally drawn up.

Keywords: non-destructive testing, selective laser melting, radiography, UT method

Procedia PDF Downloads 142
966 A Straightforward Method for Determining Inorganic Selenium Speciations by Graphite Furnace Atomic Absorption Spectroscopy in Water Samples

Authors: Sahar Ehsani, David James, Vernon Hodge

Abstract:

In this experimental study, total selenium in solution was measured with Graphite Furnace Atomic Absorption Spectroscopy (GFAAS); chemical reactions with sodium borohydride were then used to reduce selenite to hydrogen selenide, which was stripped from the solution by purging with nitrogen gas. Since the two main speciations in oxic waters are usually selenite, Se(IV), and selenate, Se(VI), it was assumed that after Se(IV) was removed, the remaining total selenium was Se(VI). Total selenium measured after stripping gave the Se(VI) concentration, and the difference between total selenium measured before and after stripping gave the Se(IV) concentration. An additional step of reducing Se(VI) to Se(IV) was performed by boiling the stripped solution under acidic conditions, then removing Se(IV) by a chemical reaction with sodium borohydride. This additional procedure of removing Se(VI) from the solution is useful in rare cases where the water sample is reducing and contains selenide speciation. In this study, once Se(IV) and Se(VI) were both removed from the water sample, the remaining total selenium concentration was zero. The method was tested by determining Se(IV) and Se(VI) in both purified water and synthetic irrigation water spiked with Se(IV) and Se(VI). The average recovery in spiked samples of diluted synthetic irrigation water was 99% for Se(IV) and 97% for Se(VI). The detection limits of the method were 0.11 µg L⁻¹ and 0.32 µg L⁻¹ for Se(IV) and Se(VI), respectively.
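The speciation scheme above reduces to a difference calculation between two GFAAS measurements. A sketch with invented concentrations (not the study's data):

```python
# Se(VI) is the total selenium measured after stripping Se(IV) as hydrogen
# selenide; Se(IV) is the drop in total selenium caused by the stripping step.
def speciation_by_difference(total_before, total_after_stripping):
    """Concentrations in ug/L from GFAAS; returns (Se_IV, Se_VI)."""
    se_iv = total_before - total_after_stripping
    se_vi = total_after_stripping
    return se_iv, se_vi

def recovery_percent(measured, spiked):
    return 100.0 * measured / spiked

# Hypothetical spiked sample: 5.0 ug/L Se(IV) + 5.0 ug/L Se(VI)
se_iv, se_vi = speciation_by_difference(total_before=9.8, total_after_stripping=4.85)
print(round(se_iv, 2), se_vi)  # → 4.95 4.85
```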

Keywords: analytical method, graphite furnace atomic absorption spectroscopy, selenate, selenite, selenium speciations

Procedia PDF Downloads 137
965 Disease Level Assessment in Wheat Plots Using a Residual Deep Learning Algorithm

Authors: Felipe A. Guth, Shane Ward, Kevin McDonnell

Abstract:

The assessment of disease levels in crop fields is an important and time-consuming task that generally relies on the expert knowledge of trained individuals. Image classification in agriculture has historically been based on classical machine learning strategies that use hand-engineered features on top of a classification algorithm. This approach tends not to produce results with high accuracy and generalization when the classified elements have significant natural variability. The advent of deep convolutional neural networks has revolutionized the field of machine learning, especially in computer vision tasks. These networks have a great capacity for learning and have been applied successfully to image classification and object detection tasks in recent years. The objective of this work was to propose a new method based on deep convolutional neural networks for the task of disease-level monitoring. Common RGB images of winter wheat were obtained during a growing season. Five categories of disease-level presence were defined, in collaboration with agronomists, for the algorithm to classify. Disease-level assessments performed by experts provided ground truth data for the disease scores of the same winter wheat plots where the RGB images were acquired. The system had an overall accuracy of 84% in discriminating between the disease-level classes.
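The residual networks named in the title add each block's input back to its output, y = relu(F(x) + x), which eases the training of deep models. A toy fully-connected analogue of that skip connection, with random stand-in weights rather than the trained wheat model:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, w1, w2):
    """Two linear layers with a skip connection: y = relu(w2 @ relu(w1 @ x) + x).
    A fully-connected stand-in for the convolutional residual blocks used on
    the RGB wheat images."""
    return relu(w2 @ relu(w1 @ x) + x)

d = 8
x = rng.standard_normal(d)
w1 = rng.standard_normal((d, d)) * 0.1  # small init keeps F(x) a correction
w2 = rng.standard_normal((d, d)) * 0.1
y = residual_block(x, w1, w2)
print(y.shape)  # → (8,)
```

Because the skip path passes x through unchanged, early in training each block is close to the identity, which is what lets very deep stacks of such blocks be optimized.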

Keywords: crop disease assessment, deep learning, precision agriculture, residual neural networks

Procedia PDF Downloads 323
964 Measurements for Risk Analysis and Detecting Hazards by Active Wearables

Authors: Werner Grommes

Abstract:

Intelligent wearables (illuminated vests, hand and foot bands, smart watches with a laser diode, Bluetooth smart glasses) flood the market today. They integrate complex electronics and are worn very close to the body, so optical measurements and limitation of the maximum luminance are needed. Smart watches are equipped with a laser diode or monitor various body currents. Special glasses generate readable text information that is received via radio transmission. Small high-performance batteries (lithium-ion/polymer) supply the electronics. All these products have been tested and evaluated for risk. They must, for example, meet the requirements for electromagnetic compatibility as well as the requirements for electromagnetic fields affecting humans or implant wearers. Extensive analyses and measurements were carried out for this purpose. Many users are not aware of these risks; the result of this study should serve as a suggestion to do better in the future, or simply to point out these risks. Commercial LED warning vests, LED hand and foot bands, illuminated surfaces with an inverter (high voltage), flashlights, smart watches, and Bluetooth smart glasses were checked for risks. The luminance, the electromagnetic emissions in the low-frequency as well as in the high-frequency range, audible noises, and disturbing flashing frequencies were checked by measurement and analyzed. Rechargeable lithium-ion or lithium-polymer batteries can burn or explode under special conditions such as overheating, overcharging, deep discharge, or use outside the temperature specification. The conclusion of this study is that because many smart wearables are worn very close to the body, an extensive risk analysis becomes necessary, and wearers of active implants such as a pacemaker or an implantable cardiac defibrillator must be considered: if the wearable electronics include switching regulators or inverter circuits, active medical implants in the near field can be disturbed.

Keywords: safety and hazards, electrical safety, EMC, EMF, active medical implants, optical radiation, illuminated warning vest, electric luminescent, hand and head lamps, LED, e-light, safety batteries, light density, optical glare effects

Procedia PDF Downloads 105
963 Image Processing of Scanning Electron Microscope Micrograph of Ferrite and Pearlite Steel for Recognition of Micro-Constituents

Authors: Subir Gupta, Subhas Ganguly

Abstract:

In this paper, we demonstrate a new area of application of image processing to metallurgical images, creating more opportunity for structure-property-correlation-based approaches to alloy design. The present exercise focuses on the development of image processing tools suitable for phase segmentation, grain boundary detection, and recognition of micro-constituents in SEM micrographs of ferrite and pearlite steels. A comprehensive set of micrographs has been developed experimentally, encompassing variation of the ferrite and pearlite volume fractions and taking images at different magnifications (500X, 1000X, 1500X, 2000X, 3000X and 5000X) under a scanning electron microscope. The variation in volume fraction was achieved using four different plain carbon steels containing 0.1, 0.22, 0.35 and 0.48 wt% C, heat treated under annealing and normalizing treatments. The obtained pool of micrographs was arbitrarily divided into two parts to develop training and testing sets. The statistical recognition features for the ferrite and pearlite constituents were learned from the training set of micrographs, and the resulting features for microstructure pattern recognition were applied to the test set. The analysis of the results shows that the developed strategy can successfully detect the micro-constituents across the wide range of magnifications and volume fractions of the constituents in the structure, with an accuracy of about +/- 5%.
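As a toy illustration of one building block of such a pipeline, the darker pearlite phase can be separated from the lighter ferrite matrix by intensity thresholding and its area (volume) fraction reported. Real SEM micrographs need the pre-processing and learned features described above; the synthetic image and threshold here are assumptions for illustration only.

```python
import numpy as np

def phase_fraction(gray, threshold=128):
    """gray: 2-D array of pixel intensities; returns the fraction of pixels
    darker than the threshold (a crude stand-in for the pearlite fraction)."""
    mask = gray < threshold
    return mask.mean()

# Synthetic micrograph: a 100x100 bright (ferrite) field with a 30x30
# dark (pearlite) patch.
img = np.full((100, 100), 200, dtype=np.uint8)
img[10:40, 10:40] = 60
print(round(phase_fraction(img), 3))  # 900 dark pixels / 10000 → 0.09
```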

Keywords: SEM micrograph, metallurgical image processing, ferrite pearlite steel, microstructure

Procedia PDF Downloads 195
962 Multi-Temporal Urban Land Cover Mapping Using Spectral Indices

Authors: Mst Ilme Faridatul, Bo Wu

Abstract:

Multi-temporal urban land cover mapping is of paramount importance for monitoring urban sprawl and managing the ecological environment. With diversified urban activities, it is challenging to map land covers in a complex urban environment. Spectral indices have proved to be effective for mapping urban land covers. To improve multi-temporal urban land cover classification and mapping, we evaluate the performance of three spectral indices: the modified normalized difference bare-land index (MNDBI), the tasseled cap water and vegetation index (TCWVI), and the shadow index (ShDI). The MNDBI is developed to evaluate its performance in enhancing urban impervious areas by separating bare lands. The tasseled cap index TCWVI is developed to evaluate its competence in detecting vegetation and water simultaneously. The ShDI is developed to maximize the spectral difference between the shadows of skyscrapers and water, enhancing water detection. First, this paper presents a comparative analysis of the three spectral indices using Landsat Enhanced Thematic Mapper (ETM), Thematic Mapper (TM) and Operational Land Imager (OLI) data. Second, optimized thresholds of the spectral indices are applied to classify land covers, and finally, their performance in enhancing multi-temporal urban land cover mapping is assessed. The results indicate that the spectral indices are competent to enhance multi-temporal urban land cover mapping and achieve an overall classification accuracy of 93-96%.
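Indices of this family follow a common per-pixel pattern: a normalized difference of two bands followed by a threshold. A generic sketch of that pattern (the band choice and threshold below are placeholders; the paper's exact band combinations and optimized thresholds differ per sensor):

```python
import numpy as np

def normalized_difference(band_a, band_b):
    """Per-pixel normalized difference (band_a - band_b) / (band_a + band_b)."""
    a = band_a.astype(float)
    b = band_b.astype(float)
    return (a - b) / (a + b + 1e-10)  # small epsilon avoids divide-by-zero

# Toy reflectance rasters standing in for two Landsat bands:
swir = np.array([[0.45, 0.10], [0.40, 0.05]])
nir = np.array([[0.20, 0.35], [0.15, 0.30]])

index = normalized_difference(swir, nir)
bare_land = index > 0.2  # an optimized threshold would be fit per sensor/date
print(bare_land)
```

The classification step described in the abstract then amounts to stacking such boolean masks from the MNDBI, TCWVI and ShDI and resolving them into land cover classes.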

Keywords: land cover, mapping, multi-temporal, spectral indices

Procedia PDF Downloads 148
961 Evaluation of BRCA1/2 Mutational Status in Algerian Familial Breast Cancer

Authors: Arab M., Ait Abdallah M., Zeraoulia N., Boumaza H., Aoutia M., Griene L., Ait Abdelkader B.

Abstract:

Breast and ovarian cancer are respectively the first and fourth leading causes of cancer among women in Algeria. A family history of cancer is the most important risk factor, and in most families with breast and/or ovarian cancer, the pattern of cancer in the family can be attributed to mutations in the BRCA1/2 genes. Objectives: The aim of our study is to investigate the spectrum of BRCA1/2 germline mutations in familial breast and/or ovarian cancer and to determine the prevalence and nature of BRCA1/2 mutations in Algeria. Methods: We determined the prevalence of BRCA1/2 mutations within a cohort of 161 probands selected according to the Eisinger score. Double-stranded Sanger sequencing of all coding exons of BRCA1/2, including flanking intronic regions, was performed. Results: We identified a total of 23 distinct deleterious mutations (class 5): 12 different mutations in BRCA1 (52%) and 11 in BRCA2 (48%). 78% (18/23) were protein-truncating and 22% (5/23) were missense mutations. Three novel deleterious mutations were identified which have not been described in public mutation databases, and one new mutation was found in two unrelated patients. The overall mutation detection rate in our study is 28.5% (46/161). Moreover, an unclassified variant, c.7783, located in BRCA2, was found in two unrelated probands and segregates in the two families. Conclusion: Our results suggest a large spectrum of BRCA1/2 mutations in Algerian breast/ovarian cancer families. The investigation of the nature and prevalence of BRCA1/2 mutations in Algerian families is ongoing in a larger study, with 80 probands under investigation to date. This study may therefore identify the genetic particularities of Algerian breast/ovarian cancer.

Keywords: BRCA1/2 mutations, hereditary breast cancer, Algerian women, prevalence

Procedia PDF Downloads 172
960 Stability-Indicating High-Performance Thin-Layer Chromatography Method for Estimation of Naftopidil

Authors: P. S. Jain, K. D. Bobade, S. J. Surana

Abstract:

A simple, selective, precise, stability-indicating high-performance thin-layer chromatographic method for the analysis of naftopidil, both in bulk and in pharmaceutical formulation, has been developed and validated. The method employed HPTLC aluminium plates precoated with silica gel as the stationary phase. The solvent system consisted of hexane: ethyl acetate: glacial acetic acid (4:4:2 v/v). The system was found to give a compact spot for naftopidil (Rf value of 0.43±0.02). Densitometric analysis of naftopidil was carried out in the absorbance mode at 253 nm. The linear regression analysis data for the calibration plots showed a good linear relationship, with r²=0.999±0.0001 with respect to peak area, in the concentration range 200-1200 ng per spot. The method was validated for precision, recovery and robustness. The limits of detection and quantification were 20.35 and 61.68 ng per spot, respectively. Naftopidil was subjected to acid and alkali hydrolysis, oxidation and thermal degradation, and was found to degrade under acidic, basic, oxidative and thermal conditions. The degraded product was well resolved from the pure drug, with a significantly different Rf value. Statistical analysis proves that the method is repeatable, selective and accurate for the estimation of the investigated drug. The proposed HPTLC method can be applied for the identification and quantitative determination of naftopidil in bulk drug and pharmaceutical formulation.
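Limits of detection and quantification of this kind are conventionally estimated from the calibration curve as LOD = 3.3σ/S and LOQ = 10σ/S, where S is the calibration slope and σ the residual standard deviation of the fit (the ICH approach). A sketch of that calculation with invented calibration data, not the paper's measurements:

```python
import numpy as np

# Hypothetical calibration over the 200-1200 ng/spot range: concentration
# against densitometric peak area.
conc = np.array([200, 400, 600, 800, 1000, 1200], dtype=float)  # ng per spot
area = np.array([1010, 1985, 3020, 3990, 5015, 5990], dtype=float)

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)  # ddof=2: two fitted parameters

lod = 3.3 * sigma / slope  # limit of detection, ng per spot
loq = 10.0 * sigma / slope  # limit of quantification, ng per spot
print(round(slope, 2), lod < loq)
```

Note LOQ/LOD = 10/3.3 ≈ 3 by construction, close to the reported ratio 61.68/20.35.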

Keywords: naftopidil, HPTLC, validation, stability, degradation

Procedia PDF Downloads 393
959 National Branding through Education: South Korean Image in Romania through the Language Textbooks for Foreigners

Authors: Raluca-Ioana Antonescu

Abstract:

The paper treats Korean public diplomacy and national branding strategies, and how Korean language textbooks have been used to construct the Korean national image. The field of research stands at the intersection of Linguistics and Political Science, and the problem addressed is the role of language and culture in the national branding process. The research goal is to contribute to the literature situated at the intersection of International Relations and Applied Linguistics, while the objective is to conceptualize the idea of national branding by emphasizing a dimension that is not much discussed: education as an instrument of national branding and public diplomacy strategies. In order to examine the importance of language for national branding strategies, the paper will answer one main question, 'How is the Korean language used in the construction of national branding?', and two secondary questions, 'How does the literature treat the relations between language and the construction of national branding?' and 'What kind of image of South Korea do the language textbooks for foreigners transmit?'. The paper starts from one main hypothesis, that language is an essential component of culture and is used in the construction of national branding influenced by traditional elements (like Confucianism) but also by modern elements (like Western influence), and from two secondary hypotheses: first, that the connections between language and national branding are little explored in the International Relations literature, and second, that the South Korean image is constructed through the promotion of a traditional society, but also a modern one.
In terms of methodology, the paper analyzes the textbooks used at Romanian universities that offer Korean language classes during the three-year B.A. program, following the dialogues, the descriptive texts and the additional texts about Korean culture. The analysis focuses on rank-status difference, the individual in relation to the collectivity, respect for harmony, and the image of the foreigner. The results of the research show that the South Korean image projected in the textbooks conveys Confucian values and does not emphasize the changes the society has undergone through modernity and globalization. The Westernized aspect of Korean society is conveyed more in an informative way, about Korean international companies and Korean internal development (such as transport and other services), but it does not show the cultural changes the society underwent. Even though the paper uses the textbooks taught in Romania as its material, its findings could be applied at least to other European countries, since the textbooks are those issued by the South Korean language schools, which other European countries also use.

Keywords: confucianism, modernism, national branding, public diplomacy, traditionalism

Procedia PDF Downloads 237
958 Articles, Delimitation of Speech and Perception

Authors: Nataliya L. Ogurechnikova

Abstract:

The paper aims to clarify the function of articles in English speech and to specify their place and role in the English language, taking into account the use of articles for the delimitation of speech. A focus of the paper is the use of the definite and the indefinite articles with different types of noun phrases, which comprise either one noun with or without attributes, such as the King, the Queen, the Lion, the Unicorn, a dimple, a smile, a new language, an unknown dialect, or several nouns with or without attributes, such as the King and Queen of Hearts, the Lion and Unicorn, a dimple or smile, a completely isolated language or dialect. It is stated that the function of delimitation is related to perception: the number of speech units in a text correlates with the way the speaker perceives and segments the denotation. The two combinations of words the house and garden and the house and the garden contain different numbers of speech units, one and two respectively, and reveal two different modes of perception corresponding to the use of the definite article in the examples given. Thus, the function of delimitation is twofold: it is related to perception and cognition, on the one hand, and to grammar, on the other, if the subject of grammar is the structure of speech. The analysis of speech units in the paper is not limited to noun phrases and is amplified by a discussion of peripheral phenomena, which are nevertheless important because they enable us to qualify articles as a syntactic phenomenon, whereas they are not infrequently described in terms of noun morphology. In this regard, attention is given to the history of linguistic studies, specifically to the description of English articles by Niels Haislund, a disciple of Otto Jespersen.
A discrepancy is noted between the initial plan of Jespersen, who intended to describe articles as a syntactic phenomenon in ‘A Modern English Grammar on Historical Principles’, and the interpretation of articles in terms of noun morphology finally given by Haislund. Another issue of the paper is the correlation between description and denotation, a traditional aspect of linguistic studies focused on articles. The overview of relevant studies given in the paper goes back to the works of G. Frege, which gave rise to a series of scientific works in which the meaning of articles was described within the scope of logical semantics. The correlation between denotation and description is treated in the paper as the meaning of the article, i.e. a component of its semantic structure, which differs from the function of delimitation and is similar to the meaning of other quantifiers. The paper further explains why the relation between description and denotation, i.e. the meaning of the English article, is irrelevant to noun morphology and has nothing to do with the nominal categories of the English language.

Keywords: delimitation of speech, denotation, description, perception, speech units, syntax

Procedia PDF Downloads 237
957 Context Detection in Spreadsheets Based on Automatically Inferred Table Schema

Authors: Alexander Wachtel, Michael T. Franzen, Walter F. Tichy

Abstract:

Programming requires years of training. With natural language and end-user development methods, programming could become available to everyone. It enables end users to program their own devices and extend the functionality of an existing system without any knowledge of programming languages. In this paper, we describe an Interactive Spreadsheet Processing Module (ISPM), a natural language interface to spreadsheets that allows users to address ranges within a spreadsheet based on an inferred table schema. Using the ISPM, end users are able to search for values in the schema of the table and to address the data in spreadsheets implicitly. Furthermore, it enables them to select and sort spreadsheet data using natural language. ISPM uses a machine learning technique to automatically infer areas within a spreadsheet, including different kinds of headers and data ranges. Since ranges can be identified from natural language queries, end users can query the data using natural language. During the evaluation, 12 undergraduate students were asked to perform operations (sum, sort, group, and select) using the system and also Excel without the ISPM interface, and the time taken for task completion was compared across the two systems. Only for the selection task did users take less time in Excel than in ISPM, since they directly selected the cells with the mouse. By bringing natural language to end-user software engineering, ISPM helps overcome the present bottleneck of professional developers.
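As a rough illustration of the idea (not the actual ISPM implementation; the table contents, the helper names, and the tiny query grammar are all invented here), a table schema can be inferred from a header row and then used to resolve natural language operations such as sum and sort:

```python
# Hypothetical sketch: infer a table schema from the first row, then
# resolve simple natural language queries against it.

def infer_schema(rows):
    """Treat the first row as the header; the remaining rows as the data range."""
    header, *data = rows
    return {name: [row[i] for row in data] for i, name in enumerate(header)}

def answer(query, schema):
    """Resolve 'sum <column>' / 'sort <column>' style queries by matching
    column names from the inferred schema against the query tokens."""
    tokens = query.lower().split()
    for name, values in schema.items():
        if name.lower() in tokens:
            if "sum" in tokens:
                return sum(values)
            if "sort" in tokens:
                return sorted(values)
    return None

rows = [["Product", "Sales"], ["A", 120], ["B", 80], ["C", 45]]
schema = infer_schema(rows)
print(answer("sum Sales", schema))   # 245
print(answer("sort Sales", schema))  # [45, 80, 120]
```

The point is that once the schema is inferred, the user never addresses cells such as B2:B4 directly; the column name mentioned in the query implicitly selects the data range.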

Keywords: natural language processing, natural language interfaces, human computer interaction, end user development, dialog systems, data recognition, spreadsheet

Procedia PDF Downloads 305
956 Assessment of the Number of Damaged Buildings from a Flood Event Using Remote Sensing Technique

Authors: Jaturong Som-ard

Abstract:

The heavy rainfall from 3rd to 22nd January 2017 swamped much of Ranot district in southern Thailand, causing substantial economic and social losses. The major objective of this study is to detect the flooding extent using Sentinel-1A data and to identify the number of damaged buildings in the area. The data were collected in two stages: pre-flood and during the flood event. Calibration, speckle filtering, geometric correction, and histogram thresholding were performed on the data, based on intensity spectral values, to classify thematic maps. The maps were used to identify the flooding extent using change detection, along with buildings digitized and collected on the JOSM desktop. The numbers of damaged buildings were counted within the flooding extent with respect to the building data. The total flooded area was observed to be 181.45 sq.km. Flooding occurred mostly in the Ban Khao, Ranot, Takhria, and Phang Yang sub-districts, respectively. The Ban Khao sub-district was affected more than the others because it is located at a lower altitude and closer to the Thale Noi and Thale Luang lakes. The numbers of damaged buildings were high in Khlong Daen (726 features), Tha Bon (645 features), and Ranot sub-district (604 features), respectively. The final flood extent map should be very useful for the planning, prevention, and management of flood-prone areas. The map of building damage can be used for quick response, recovery, and mitigation in the affected areas by the organizations concerned.
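The thresholding and change detection steps can be sketched as follows (an illustrative toy example, not the authors' processing chain; the -15 dB threshold and the tiny 2x2 scenes are invented for demonstration):

```python
import numpy as np

# Sketch: low SAR backscatter typically indicates open water, so a simple
# intensity threshold separates water from land; differencing the pre- and
# during-flood water masks yields the newly inundated (flooded) pixels.

def water_mask(intensity_db, threshold=-15.0):
    """Pixels below the backscatter threshold (in dB) are classed as water."""
    return intensity_db < threshold

pre  = np.array([[-10.0, -8.0], [-20.0, -9.0]])   # pre-flood scene (dB)
post = np.array([[-18.0, -8.0], [-20.0, -17.0]])  # during-flood scene (dB)

# change detection: water during the flood that was not water before
flooded = water_mask(post) & ~water_mask(pre)
print(int(flooded.sum()))  # 2 newly inundated pixels
```

In practice the threshold would be chosen from the image histogram after calibration and speckle filtering, and permanent water bodies (here the pixel that is water in both scenes) are excluded from the flood extent exactly as the mask difference does above.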

Keywords: flooding extent, Sentinel-1A data, JOSM desktop, damaged buildings

Procedia PDF Downloads 187
955 Detection and Distribution Pattern of Prevalent Genotypes of Hepatitis C in a Tertiary Care Hospital of Western India

Authors: Upasana Bhumbla

Abstract:

Background: Hepatitis C virus is a major cause of chronic hepatitis, which can further lead to cirrhosis of the liver and hepatocellular carcinoma. Worldwide, the burden of hepatitis C infection has become a serious threat to the human race. Hepatitis C virus (HCV) has population-specific genotypes, which provide valuable epidemiological and therapeutic information. Genotyping and assessment of viral load in HCV patients are important for planning therapeutic strategies. The aim is to study the changing trends of prevalence and genotypic distribution of hepatitis C virus in a tertiary care hospital in Western India. Methods: This is a retrospective study; blood samples were collected and tested for anti-HCV antibodies by ELISA in the Department of Microbiology. In seropositive hepatitis C patients, quantification of HCV-RNA was done by real-time PCR, and in HCV-RNA-positive samples, genotyping was conducted. Results: A total of 114 patients who were seropositive for anti-HCV were recruited into the study, of whom 79 (69.29%) were HCV-RNA positive. Of these positive samples, 54 were further subjected to genotype determination using real-time PCR. Genotype could not be detected in 24 samples due to low viral load; 30 samples were positive for genotype. Conclusion: Knowledge of genotype is crucial for the management of HCV infection and prediction of prognosis. Patients infected with HCV genotypes 1 and 4 have to receive interferon and ribavirin for 48 weeks. Patients with these genotypes show a poor sustained viral response when tested 24 weeks after completion of therapy. In contrast, patients infected with HCV genotypes 2 and 3 are reported to have a better response to therapy.

Keywords: hepatocellular, genotype, ribavirin, seropositive

Procedia PDF Downloads 125
954 Evaluating Multiple Diagnostic Tests: An Application to Cervical Intraepithelial Neoplasia

Authors: Areti Angeliki Veroniki, Sofia Tsokani, Evangelos Paraskevaidis, Dimitris Mavridis

Abstract:

The plethora of diagnostic test accuracy (DTA) studies has led to the increased use of systematic reviews and meta-analysis of DTA studies. Clinicians and healthcare professionals often consult DTA meta-analyses to make informed decisions regarding the optimum test to choose and use for a given setting. For example, the human papilloma virus (HPV) DNA, mRNA, and cytology can be used for the cervical intraepithelial neoplasia grade 2+ (CIN2+) diagnosis. But which test is the most accurate? Studies directly comparing test accuracy are not always available, and comparisons between multiple tests create a network of DTA studies that can be synthesized through a network meta-analysis of diagnostic tests (DTA-NMA). The aim is to summarize the DTA-NMA methods for at least three index tests presented in the methodological literature. We illustrate the application of the methods using a real data set for the comparative accuracy of HPV DNA, HPV mRNA, and cytology tests for cervical cancer. A search was conducted in PubMed, Web of Science, and Scopus from inception until the end of July 2019 to identify full-text research articles that describe a DTA-NMA method for three or more index tests. Since the joint classification of the results from one index against the results of another index test amongst those with the target condition and amongst those without the target condition are rarely reported in DTA studies, only methods requiring the 2x2 tables of the results of each index test against the reference standard were included. Studies of any design published in English were eligible for inclusion. Relevant unpublished material was also included. Ten relevant studies were finally included to evaluate their methodology. DTA-NMA methods that have been presented in the literature together with their advantages and disadvantages are described. 
In addition, using 37 studies for cervical cancer obtained from a published Cochrane review as a case study, an application of the identified DTA-NMA methods to determine the most promising test (in terms of sensitivity and specificity) for use as the best screening test to detect CIN2+ is presented. In conclusion, different approaches for the comparative DTA meta-analysis of multiple tests may lead to different results and hence may influence decision-making. Acknowledgment: This research is co-financed by Greece and the European Union (European Social Fund - ESF) through the Operational Programme «Human Resources Development, Education and Lifelong Learning 2014-2020» in the context of the project “Extension of Network Meta-Analysis for the Comparison of Diagnostic Tests” (MIS 5047640).
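Since the included methods all start from each index test's 2x2 table against the reference standard, the per-study accuracy measures feeding the DTA-NMA can be computed as follows (a minimal sketch with invented counts, not the Cochrane case-study data):

```python
# Per-study sensitivity and specificity from a 2x2 table of an index test
# against the reference standard (tp/fp/fn/tn counts are illustrative).

def accuracy(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # P(test positive | target condition present)
    specificity = tn / (tn + fp)   # P(test negative | target condition absent)
    return sensitivity, specificity

# e.g. a hypothetical HPV DNA study against a CIN2+ reference standard
sens, spec = accuracy(tp=90, fp=20, fn=10, tn=80)
print(round(sens, 2), round(spec, 2))  # 0.9 0.8
```

A DTA-NMA then pools such pairs across studies and tests; the point of the 2x2-table requirement noted above is that the joint cross-classification of two index tests is not needed for this computation.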

Keywords: colposcopy, diagnostic test, HPV, network meta-analysis

Procedia PDF Downloads 135
953 Robust Heart Rate Estimation from Multiple Cardiovascular and Non-Cardiovascular Physiological Signals Using Signal Quality Indices and Kalman Filter

Authors: Shalini Rankawat, Mansi Rankawat, Rahul Dubey, Mazad Zaveri

Abstract:

Physiological signals such as the electrocardiogram (ECG) and arterial blood pressure (ABP) in the intensive care unit (ICU) are often seriously corrupted by noise, artifacts, and missing data, which lead to errors in the estimation of heart rate (HR) and incidences of false alarms from ICU monitors. Clinical support in the ICU requires the most reliable heart rate estimation possible. Cardiac activity, because of its relatively high electrical energy, may introduce artifacts into electroencephalogram (EEG), electrooculogram (EOG), and electromyogram (EMG) recordings. This paper presents a robust heart rate estimation method based on the detection of R-peaks of ECG artifacts in EEG, EMG, and EOG signals, using an energy-based function and a novel signal quality index (SQI) assessment technique. SQIs of the physiological signals (EEG, EMG, and EOG) were obtained by correlating the nonlinear energy operator (Teager energy) of these signals with either the ECG or ABP signal. HR is estimated from the ECG, ABP, EEG, EMG, and EOG signals by separate Kalman filters based on the individual SQIs. Data fusion of the HR estimates was then performed by weighting each estimate by the Kalman filters’ SQI-modified innovations. The fused HR estimate is more accurate and robust than any of the individual HR estimates. The method was evaluated on the MIMIC II database of PhysioNet, drawn from bedside monitors of ICU patients, and provides an accurate HR estimate even in the presence of noise and artifacts.
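The fusion step can be illustrated with a simplified sketch that weights each signal's HR estimate by its SQI (the full method uses per-signal Kalman filters and SQI-modified innovations, which are omitted here; all numbers are invented):

```python
import numpy as np

# Simplified sketch of SQI-weighted data fusion: each physiological signal
# yields its own heart rate estimate, and a low SQI (e.g. a noise-corrupted
# ECG) downweights that signal's contribution to the fused estimate.

def fuse_hr(estimates, sqis):
    """SQI-weighted average of per-signal heart rate estimates (bpm)."""
    w = np.asarray(sqis, dtype=float)
    return float(np.dot(estimates, w) / w.sum())

# HR estimates from ECG, ABP, EEG, EMG, EOG; the ECG channel is corrupted
# (spurious 130 bpm reading) and correctly receives a very low SQI.
hr = fuse_hr([130.0, 72.0, 71.0, 74.0, 73.0], [0.05, 0.9, 0.6, 0.5, 0.55])
print(round(hr, 1))  # 73.5 -- the corrupted ECG barely shifts the result
```

A plain (unweighted) mean of the same five estimates would be pulled to 84 bpm by the corrupted channel, which is exactly the failure mode the SQI weighting suppresses.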

Keywords: ECG, ABP, EEG, EMG, EOG, ECG artifacts, Teager-Kaiser energy, heart rate, signal quality index, Kalman filter, data fusion

Procedia PDF Downloads 691
952 The Confiscation of Ill-Gotten Gains in Pollution: The Taiwan Experience and the Interaction between Economic Analysis of Law and Environmental Economics Perspectives

Authors: Chiang-Lead Woo

Abstract:

In response to serious environmental problems, the Taiwan government has recently adjusted several provisions to meet the needs of environmental protection, such as the amendment to article 190-1 of the Taiwan Criminal Code. The legislative change is an improvement in that it removed the limitation of ‘endangering public safety’. At the same time, article 190-1 moves from a cumulative concrete offense to an abstract crime of endangerment. The public therefore looks forward to seeing whether environmental crime, with the imposition of fines or penalties, works efficiently against pollution through its deterrent effects. However, under the addition of article 38-2 of the Taiwan Criminal Code, the confiscation system seems a controversial piece of legislation for restraining ill-gotten gains. Most prior studies focused on comparisons between the Administrative Penalty Law and the Criminal Code on environmental issues in Taiwan; recently, more and more studies emphasize the calculation of ill-gotten gains. Hence, this paper tries to examine the deterrent effect in environmental crime from the perspectives of economic analysis of law and environmental economics. The analysis shows that the deterrent effect works only if there is an extremely high probability (equal to 100 percent) of an environmental crime case being prosecuted criminally by the Taiwan Environmental Protection Agency. Therefore, this paper suggests supplementing the confiscation system with the System of Environmental and Economic Accounting, reasonable deterrent fines, input management, a real-time system for the detection of pollution, a whistleblower system, environmental education, and modernization of the law.

Keywords: confiscation, ecosystem services, environmental crime, ill-gotten gains, the deterrent effect, the system of environmental and economic accounting

Procedia PDF Downloads 163
951 Comparison of Several Diagnostic Methods for Detecting Bovine Viral Diarrhea Virus Infection in Cattle

Authors: Azizollah Khodakaram-Tafti, Ali Mohammadi, Ghasem Farjanikish

Abstract:

Bovine viral diarrhea virus (BVDV) is one of the most important viral pathogens of cattle worldwide, belonging to the genus Pestivirus, family Flaviviridae. The aim of the present study was to compare several diagnostic methods and determine the prevalence of BVDV infection for the first time in dairy herds of Fars province, Iran. For initial screening, a total of 400 blood samples were randomly collected from 12 industrial dairy herds and analyzed using reverse transcription (RT)-PCR on the buffy coat. In the second step, blood samples and ear notch biopsies were collected from 100 cattle of the infected farms and tested by antigen capture ELISA (ACE), RT-PCR, and immunohistochemistry (IHC). In the initial screening, nested RT-PCR (outer primers 0I100/1400R and inner primers BD1/BD2) detected acute infection in 16 of the 400 buffy coat samples (4%). Also, 8 of the 100 samples (2%) were positive for persistent infection (PI) by all of the diagnostic tests alike, including RT-PCR, ACE, and IHC on buffy coat, serum, and skin samples, respectively. Immunoreactivity for BVDV antigen, appearing as brown, coarsely to finely granular staining, was observed within the cytoplasm of the epithelial cells of the epidermis and hair follicles and also in subcutaneous stromal cells. These findings confirm the importance of monitoring BVDV infection in the cattle of this region and suggest the detection and elimination of PI calves for the control and eradication of this disease.

Keywords: antigen capture ELISA, bovine viral diarrhea virus, immunohistochemistry, RT-PCR, cattle

Procedia PDF Downloads 357
950 A Framework of Virtualized Software Controller for Smart Manufacturing

Authors: Pin Xiu Chen, Shang Liang Chen

Abstract:

A virtualized software controller is developed in this research to replace traditional hardware control units. This virtualized software controller transfers motion interpolation calculations from the motion control units of end devices to edge computing platforms, thereby reducing the end devices' computational load and hardware requirements and making maintenance and updates easier. The study also applies the concept of microservices, dividing the control system into several small functional modules which are then deployed to a cloud data server. This reduces the interdependency among modules and enhances the overall system's flexibility and scalability. Finally, with containerization technology, the system can be deployed and started in a matter of seconds, which is more efficient than traditional virtual machine deployment methods. Furthermore, this virtualized software controller communicates with end control devices via wireless networks, making the placement of production equipment or the redesign of processes more flexible and no longer limited by physical wiring. To handle the large data flow and maintain low-latency transmission, this study integrates 5G technology, fully utilizing its high speed, wide bandwidth, and low latency to achieve rapid and stable remote machine control. An experimental setup is designed to verify the feasibility and test the performance of this framework. This study designs a smart manufacturing site with a 5G communication architecture, serving as a field for experimental data collection and performance testing. The smart manufacturing site includes one robotic arm, three Computer Numerical Control machine tools, several Input/Output ports, and an edge computing architecture. All machinery information is uploaded to edge computing servers and cloud servers via 5G communication and the Internet of Things framework.
After analysis and computation, this information is converted into motion control commands, which are transmitted back to the relevant machinery for motion control through 5G communication. The communication time intervals at each stage are measured using the C++ chrono library, which records the time difference for each command transmission. The relevant test results will be organized and presented in the full text.
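The per-stage latency measurement the authors perform with the C++ chrono library can be sketched in Python as follows (illustrative only; the stage functions below are stand-ins, not the actual upload/compute/command pipeline):

```python
import time

# Sketch of per-stage latency measurement: timestamp each stage of the
# command round trip and record the elapsed wall-clock time, analogous to
# taking std::chrono::steady_clock differences around each transmission.

def timed(stage_fn):
    """Run one pipeline stage and return (result, elapsed_seconds)."""
    t0 = time.perf_counter()
    result = stage_fn()
    return result, time.perf_counter() - t0

# stand-ins for the upload -> compute -> command-dispatch stages
_, t_upload  = timed(lambda: sum(range(10_000)))
_, t_compute = timed(lambda: sorted(range(10_000), reverse=True))

print(f"upload: {t_upload:.6f}s, compute: {t_compute:.6f}s")
```

Summing such per-stage intervals over many command transmissions gives the end-to-end latency distribution that the 5G link is meant to keep low.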

Keywords: 5G, MEC, microservices, virtualized software controller, smart manufacturing

Procedia PDF Downloads 75
949 The Problem of Suffering: Job, The Servant and Prophet of God

Authors: Barbara Pemberton

Abstract:

Now that people of all faiths are experiencing suffering due to many global issues, shared narratives may provide common ground in which true understanding of each other may take root. This paper will consider the all too common problem of suffering and address how adherents of the three great monotheistic religions seek understanding, and the appropriate believer’s response, from the same story found within their respective sacred texts. Most scholars from each of these three traditions, Judaism, Christianity, and Islam, consider the writings of the Tanakh/Old Testament to at least contain divine revelation. While they may not agree on the extent of the revelation or the method of its delivery, they do share stories as well as a common desire to glean God’s message for God’s people from the pages of the text. One such shared story is that of Job, the servant of Yahweh, called Ayyub, the prophet of Allah, in the Qur’an. Job is described as a pious, righteous man who loses everything, including family, possessions, and health, when his faith is tested. Three friends come to console him. Through it all, Job remains faithful to his God, who rewards him by restoring all that was lost. All three hermeneutic communities consider Job to be an archetype of human response to suffering, regarding Job’s response to his situation as exemplary. The story of Job addresses more than the problem of evil. At stake in the story is Job’s very relationship to his God. Some exegetes believe that Job was adapted into the Jewish milieu by a gifted redactor who used the original ancient tale as the “frame” for the biblical account (chapters 1, 2, and 42:7-17) and then enlarged the story with the poetic dialogues of the center section, creating a complex work with numerous possible interpretations. Within the poetic center, Job goes so far as to question God, a response to which Jews relate, finding strength in dialogue, even in wrestling with God.
Muslims embrace only the Job of the biblical narrative frame, as further identified through the Qur’an and the prophetic traditions, considering the center section an errant human addition not representative of a true prophet of Islam. The Qur’anic injunction against questioning God also renders the center theologically suspect. Christians also draw various responses from the story of Job. While many believers may agree with the Islamic perspective of God’s ultimate sovereignty, others would join their Jewish neighbors in questioning God, anticipating not answers but rather an awareness of his presence: peace and hope becoming a reality experienced through the indwelling presence of God’s Holy Spirit. Related questions are as endless as the possible responses. This paper will consider a few of the many Jewish, Christian, and Islamic insights from the ancient story, in the hope that adherents within each tradition will use it to better understand the other faiths’ approach to suffering.

Keywords: suffering, Job, Qur'an, Tanakh

Procedia PDF Downloads 179
948 Contribution of Automated Early Warning Score Usage to Patient Safety

Authors: Phang Moon Leng

Abstract:

The Automated Early Warning Score is a newly developed clinical decision tool that streamlines and improves the process of obtaining a patient’s vital signs so that a clinical decision can be made at an earlier stage to prevent the patient from deteriorating further. The technology provides an immediate update on the score and the clinical decision to be taken based on the outcome. This paper aims to study whether an automated early warning score system has assisted the hospital in the early detection and escalation of clinical conditions and improved patient outcomes. The hospital adopted the Modified Early Warning Score (MEWS) scoring system and the MEWS clinical response in Philips IntelliVue Guardian automated early warning score equipment, and studied whether the process had been leaned, whether the use of the technology improved the usage and experience of the nurses, and whether the technology improved patient care and outcomes. It was found that the steps required to obtain vital signs were significantly reduced and that vital signs are now obtained more frequently. The number of deaths and the length of stay have decreased significantly, as clinical decisions can be made and escalated more quickly with the automated EWS. The automated early warning score equipment has helped improve work efficiency by removing the need to document vital signs manually in the patient’s EMR. The technology streamlines clinical decision-making, allows faster care and intervention, and improves overall patient outcomes, which translates to better care for patients.
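As a sketch of how such a score is computed, one widely cited MEWS variant (Subbe et al., 2001) can be written as follows; the hospital's exact scoring bands and clinical responses may differ, so the thresholds below are illustrative rather than authoritative:

```python
# Illustrative MEWS calculator (Subbe et al., 2001 bands; thresholds are
# an assumption here, not the hospital's configured scoring table).

def mews(rr, hr, sbp, temp, avpu):
    """Sum the sub-scores for respiratory rate (breaths/min), heart rate
    (bpm), systolic BP (mmHg), temperature (deg C), and AVPU level."""
    score = 0
    score += 2 if rr < 9 else 0 if rr <= 14 else 1 if rr <= 20 else 2 if rr <= 29 else 3
    score += 2 if hr < 40 else 1 if hr <= 50 else 0 if hr <= 100 else 1 if hr <= 110 else 2 if hr <= 129 else 3
    score += 3 if sbp <= 70 else 2 if sbp <= 80 else 1 if sbp <= 100 else 0 if sbp <= 199 else 2
    score += 2 if temp < 35 else 0 if temp < 38.5 else 2
    score += {"alert": 0, "voice": 1, "pain": 2, "unresponsive": 3}[avpu]
    return score

print(mews(rr=12, hr=80, sbp=120, temp=37.0, avpu="alert"))  # 0 (stable)
print(mews(rr=24, hr=115, sbp=95, temp=38.6, avpu="voice"))  # 8 (escalate)
```

The automation described in the abstract replaces exactly this manual arithmetic: the monitor computes the sub-scores from captured vitals and maps the total directly to the configured clinical response.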

Keywords: automated early warning score, clinical quality and safety, patient safety, medical technology

Procedia PDF Downloads 175
947 Comics as an Intermediary for Media Literacy Education

Authors: Ryan C. Zlomek

Abstract:

The value of using comics in the literacy classroom has been explored since the 1930s. At that point, researchers had begun to implement comics into daily lesson plans and, in some instances, had started developing comics-supported curricula. In the mid-1950s, this line of research was cut short by the work of psychiatrist Frederic Wertham, whose research claimed to show a correlation between comic readership and juvenile delinquency. Since Wertham’s allegations, the comics medium has had a hard time finding its way back into education. Now, over fifty years later, the definition of literacy is in mid-transition, as the world has become more visually oriented and students need to interpret images as often as words. Through this transition, comics have found a place in literacy education research as the focus shifts from traditional print to multimodal and media literacies. Comics are now believed to be an effective resource for bridging the gap between these different types of literacy. This paper seeks to better understand what students learn from the process of reading comics and how those skills line up with the core principles of media literacy education in the United States. In the first section, comics are defined to establish the exact medium being examined, and the different conventions that the medium utilizes are discussed. In the second section, the comics reading process is explored through a dissection of the ways a reader interacts with the page, panel, gutter, and the different comic conventions found within a traditional graphic narrative. The concepts of intersubjective acts and visualization are attributed to the comics reading process, as readers draw on real-world knowledge to decode meaning. In the next section, the learning processes that comics encourage are explored in parallel with the core principles of media literacy education.
Each principle is explained, and the extent to which comics can act as an intermediary for this type of education is theorized. In the final section, the author examines the use of comics in his computer science and technology classroom. He lays out the theories he draws from Scott McCloud’s text Understanding Comics and how he uses them to break down media literacy strategies with his students. The article concludes with examples of how comics have positively impacted classrooms around the United States. It is stated that integrating comics into the classroom will not solve all issues related to literacy education but, rather, that comics can be a powerful multimodal resource for educators looking for new mediums to explore with their students.

Keywords: comics, graphic novels, mass communication, media literacy, metacognition

Procedia PDF Downloads 295
946 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores

Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay

Abstract:

Automated product recognition in retail stores is an important real-world application in the domain of Computer Vision and Pattern Recognition. In this paper, we consider the problem of automatically identifying the classes of products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon existing approaches in terms of effectiveness and memory requirement by developing a two-stage object detection and recognition pipeline comprising a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction, and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function following the strategy of online hard-negative mining for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product placed in a rack image. Extensive experiments using the Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.
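The triplet loss used to train the encoder can be sketched as follows (a minimal NumPy illustration with toy 2-D embeddings; the actual model uses ResNet-18 features and online hard-negative mining, both omitted here):

```python
import numpy as np

# Sketch of the triplet loss: pull the anchor embedding toward a positive
# (same product class) and push it away from a negative (different class),
# with zero loss once the margin is satisfied.

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Hinge on squared-distance difference: max(0, d(a,p) - d(a,n) + margin)."""
    d_pos = np.sum((anchor - positive) ** 2)
    d_neg = np.sum((anchor - negative) ** 2)
    return max(0.0, d_pos - d_neg + margin)

a = np.array([1.0, 0.0])   # anchor embedding
p = np.array([0.9, 0.1])   # same product class (close to anchor)
n = np.array([0.0, 1.0])   # different product class (far from anchor)

print(triplet_loss(a, p, n))  # 0.0 -- margin already satisfied
print(triplet_loss(a, n, p))  # positive loss when classes are confused
```

Online hard-negative mining would, at each step, pick as `n` the different-class embedding closest to the anchor, which is exactly the case where this loss is largest and the gradient most informative.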

Keywords: retail stores, faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition

Procedia PDF Downloads 149
945 The Economic Burden of Mental Disorders: A Systematic Review

Authors: Maria Klitgaard Christensen, Carmen Lim, Sukanta Saha, Danielle Cannon, Finley Prentis, Oleguer Plana-Ripoll, Natalie Momen, Kim Moesgaard Iburg, John J. McGrath

Abstract:

Introduction: About a third of the world’s population will develop a mental disorder over their lifetime. Having a mental disorder imposes a huge burden of health loss and cost on the individual, but also on society through treatment costs, production losses, and caregivers’ costs. The objective of this study is to synthesize the international published literature on the economic burden of mental disorders. Methods: Systematic literature searches were conducted in the databases PubMed, Embase, Web of Science, EconLit, NHS York Database, and PsycINFO using key terms for cost and mental disorders. Searches were restricted to 1980 until May 2019. The inclusion criteria were: (1) cost-of-illness studies or cost analyses, (2) diagnosis of at least one mental disorder, (3) samples based on the general population, and (4) outcome in monetary units. 13,640 publications were screened by title/abstract, and 439 articles were full-text screened by at least two independent reviewers. 112 articles were included from the systematic searches and 31 articles from snowball searching, giving a total of 143 included articles. Results: Information about diagnosis, diagnostic criteria, sample size, age, sex, data sources, study perspective, study period, costing approach, cost categories, discount rate, production loss method, and cost unit was extracted. The vast majority of the included studies were from Western countries, and only a few were from Africa and South America. The disorder group most often investigated was mood disorders, followed by schizophrenia and neurotic disorders. The disorder group least examined was intellectual disabilities, followed by eating disorders. The preliminary results show substantial variation in the perspectives, methodologies, cost components, and outcomes used in the included studies.
An online tool is under development that will enable the reader to explore the published information on costs by type of mental disorder, subgroup, country, methodology, and study quality. Discussion: This is the first systematic review to synthesize the economic cost of mental disorders worldwide. The paper will provide an important and comprehensive overview of the economic burden of mental disorders, and the output from this review will inform policymaking.

Keywords: cost-of-illness, health economics, mental disorders, systematic review

Procedia PDF Downloads 127
944 Early Diagnosis and Treatment of Cancer Using Synthetic Cationic Peptide

Authors: D. J. Kalita

Abstract:

Cancer is one of the prime causes of early death worldwide. Mutations in genes involved in DNA repair and damage response, such as the BRCA2 (breast cancer gene 2) gene, can be detected efficiently by PCR-RFLP for early breast cancer diagnosis, allowing a suitable method of treatment to be adopted. Host defense peptides can be used as blueprints for the design and synthesis of novel anticancer drugs that avoid the side effects of conventional chemotherapy and chemoresistance. In the cancer samples of dog mammary tumour, an a-to-c change at nucleotide position 392 of the BRCA2 gene (exon 7) led to the creation of a new restriction site for the SsiI restriction enzyme. This SNP may be a marker for the detection of canine mammary tumour. A support vector machine (SVM) algorithm was used to design and predict the anticancer peptide from the mature functional peptide. An MTT assay of the MCF-7 cell line 48 hours post treatment showed an increase in the number of rounded cells compared with untreated control cells. The ability of the synthesized peptide to induce apoptosis in MCF-7 cells was further investigated by staining the cells with the fluorescent dye Hoechst stain solution, which allows evaluation of the nuclear morphology. Numerous cells with dense, pyknotic nuclei (the brighter fluorescence) were observed in treated but not in control MCF-7 cells when viewed using an inverted phase-contrast microscope. Thus, PCR-RFLP is an attractive approach for early diagnosis, and synthetic cationic peptides can be used for the treatment of canine mammary tumour.
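The RFLP idea, a SNP creating a restriction site that is present in the mutant but absent in the wild type, can be sketched as follows (the recognition sequence and the short sequences below are placeholders for illustration, not the documented SsiI site or the actual BRCA2 exon 7 locus):

```python
# Sketch of RFLP-based SNP detection: an a -> c substitution creates a new
# restriction site, so the mutant allele is cut where the wild type is not,
# producing a different fragment pattern on the gel.

SITE = "CCGC"  # placeholder recognition sequence, not the verified SsiI site

def cut_positions(seq, site=SITE):
    """Return the start index of every occurrence of the recognition site."""
    return [i for i in range(len(seq) - len(site) + 1) if seq[i:i + len(site)] == site]

wild = "TTCAGCAA"                    # placeholder wild-type fragment: no site
mutant = wild[:3] + "C" + wild[4:]   # the a -> c substitution -> "TTCCGCAA"

print(cut_positions(wild))    # []  -- enzyme cannot cut the wild type
print(cut_positions(mutant))  # [2] -- new site created by the SNP
```

In the assay this difference shows up as extra bands after digestion of the PCR amplicon, which is what makes the SNP usable as a marker without sequencing.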

Keywords: cancer, cationic peptide, host defense peptides, breast cancer genes

Procedia PDF Downloads 85