Search results for: hate speech detection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4201


1201 Graph Codes - 2D Projections of Multimedia Feature Graphs for Fast and Effective Retrieval

Authors: Stefan Wagenpfeil, Felix Engel, Paul McKevitt, Matthias Hemmje

Abstract:

Multimedia indexing and retrieval is generally designed and implemented by employing feature graphs. These graphs typically contain a significant number of nodes and edges to reflect the level of detail in feature detection. A higher level of detail increases the effectiveness of the results but also leads to more complex graph structures. However, graph-traversal-based similarity algorithms are quite inefficient and computationally intensive, especially for large data structures. To deliver fast and effective retrieval, an efficient similarity algorithm, particularly for large graphs, is mandatory. Hence, in this paper, we define a graph projection into a 2D space (Graph Code) as well as the corresponding algorithms for indexing and retrieval. We show that calculations in this space can be performed more efficiently than graph traversals due to a simpler processing model and a high level of parallelization. In consequence, we prove that the effectiveness of retrieval also increases substantially, as Graph Codes facilitate more levels of detail in feature fusion. Thus, Graph Codes provide a significant increase in efficiency and effectiveness (especially for multimedia indexing and retrieval) and can be applied to images, videos, audio, and text information.
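
To make the idea of a 2D graph projection concrete, here is a minimal, illustrative Python sketch: a small feature graph is encoded as a matrix over a fixed feature vocabulary, and two such codes are compared with a simple overlap ratio. The vocabulary, the encoding, and the similarity measure are assumptions for illustration, not the metrics defined in the paper.

```python
import numpy as np

def graph_code(features, edges, vocabulary):
    """Project a feature graph onto a 2D matrix (a toy 'Graph Code').

    Rows/columns follow a fixed feature vocabulary; diagonal cells mark the
    presence of a feature node, off-diagonal cells mark a relationship.
    """
    index = {f: i for i, f in enumerate(vocabulary)}
    m = np.zeros((len(vocabulary), len(vocabulary)), dtype=np.uint8)
    for f in features:
        m[index[f], index[f]] = 1              # node present
    for a, b in edges:
        m[index[a], index[b]] = 1              # relationship between two features
    return m

def similarity(code_a, code_b):
    """Illustrative similarity: Jaccard overlap of non-zero cells."""
    overlap = np.logical_and(code_a, code_b).sum()
    union = np.logical_or(code_a, code_b).sum()
    return overlap / union if union else 0.0

vocab = ["person", "car", "tree", "outdoor"]
query = graph_code(["person", "car"], [("person", "car")], vocab)
doc = graph_code(["person", "car", "tree"], [("person", "car"), ("car", "tree")], vocab)
print(similarity(query, doc))
```

Because the comparison reduces to element-wise matrix operations, it parallelizes trivially across a large collection of codes, which is the efficiency argument made above.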

Keywords: indexing, retrieval, multimedia, graph algorithm, graph code

Procedia PDF Downloads 157
1200 Imp_hist-Si: Improved Hybrid Image Segmentation Technique for Satellite Imagery to Decrease the Segmentation Error Rate

Authors: Neetu Manocha

Abstract:

Image segmentation is a technique in which an image is partitioned into distinct regions with similar features that belong to the same objects. Various segmentation strategies have been proposed in recent years by prominent researchers, but thorough analysis shows that, in general, the older methods do not decrease the segmentation error rate. The HIST-SI technique was therefore developed to decrease segmentation error rates; in this technique, cluster-based and threshold-based segmentation are merged together. To improve the results of HIST-SI, the authors then added filtering and linking steps, yielding a technique named Imp_HIST-SI that further decreases segmentation error rates. The goal of this research is to find a new technique that decreases segmentation error rates and produces much better results than the HIST-SI technique. For testing the proposed technique, a dataset from Bhuvan, a national geoportal developed and hosted by ISRO (Indian Space Research Organisation), is used. Experiments are conducted using the Scikit-image and OpenCV tools of Python, and performance is evaluated and compared against various existing image segmentation techniques for several metrics, i.e., Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR).
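
For the evaluation metrics mentioned above, the following minimal sketch (using scikit-image, as in the paper's toolchain) compares a segmentation result against a ground-truth mask with MSE and PSNR. The file names and the single Otsu threshold standing in for the hybrid Imp_HIST-SI pipeline are placeholders.

```python
from skimage import io, filters
from skimage.metrics import mean_squared_error, peak_signal_noise_ratio

# Hypothetical file names; the Bhuvan imagery used in the paper is not bundled here.
image = io.imread("satellite_tile.png", as_gray=True)
reference = io.imread("ground_truth_mask.png", as_gray=True) > 0.5

# A single Otsu threshold stands in for the hybrid cluster/threshold pipeline.
segmented = image > filters.threshold_otsu(image)

mse = mean_squared_error(reference.astype(float), segmented.astype(float))
psnr = peak_signal_noise_ratio(reference.astype(float), segmented.astype(float), data_range=1.0)
print(f"MSE:  {mse:.4f}")
print(f"PSNR: {psnr:.2f} dB")
```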

Keywords: satellite image, image segmentation, edge detection, error rate, MSE, PSNR, HIST-SI, linking, filtering, imp_HIST-SI

Procedia PDF Downloads 137
1199 Plasma Electrolytes and Gamma Glutamyl Transpeptidase (GGT) Status in Dementia Subjects in Southern Nigeria

Authors: Salaam Mujeeb, Adeola Segun, Abdullahi Olasunkanmi

Abstract:

Dementia is becoming a major concern as the world population increases and elderly populations are being neglected. Liver and kidney diseases have been implicated as risk factors in the etiology of dementia. This study therefore evaluates plasma gamma-glutamyl transferase (GGT) activity and plasma electrolytes in order to find an association between these biomarkers and dementia. The subjects (38) were age- and sex-matched with their corresponding controls, and structured questionnaires were used to obtain medical information. Using spectrophotometric and ion-selective electrode techniques, respectively, we found an elevated GGT activity in the dementia subjects. Remarkably, no association was found between plasma electrolyte levels and dementia. It was also observed that the severity of dementia worsens with age. Moreover, the condition of the dementia subjects worsens with decreasing weight. Furthermore, the presence of comorbidities (e.g., hypertension, obesity, diabetes) and habits such as smoking, drug use, and alcohol consumption interferes with electrolyte balance. Weight-loss monitoring and BMI checks are advised in elderly individuals, particularly females, as these may be indicative of early or future cognitive impairments and might therefore be useful as an early detection tool. Government and society should invest more in the geriatric population by establishing old people's homes and providing social care services.

Keywords: clinical characteristics, dementia, electrolytes, gamma glutamyl transpeptidase, GGT

Procedia PDF Downloads 323
1198 A Self-Heating Gas Sensor of SnO2-Based Nanoparticles Electrophoretic Deposited

Authors: Glauco M. M. M. Lustosa, João Paulo C. Costa, Sonia M. Zanetti, Mario Cilense, Leinig Antônio Perazolli, Maria Aparecida Zaghete

Abstract:

The contamination of the environment has been one of the biggest problems of our time, mostly due to the development of many industries. SnO2 is an n-type semiconductor with a band gap of about 3.5 eV, and its electrical conductivity depends on the type and amount of modifier agents added to the ceramic matrix during the synthesis process, allowing applications as a sensor of gaseous pollutants in the environment. The chemical synthesis by the polymeric precursor method consists of a complexation reaction between tin ions and citric acid at 90 °C for 2 hours, followed by the addition of ethylene glycol for polymerization at 130 °C for 2 hours. Polymeric resins of zinc, cobalt, and niobium ions were also prepared. Stoichiometric amounts of the solutions were mixed to obtain the (Zn,Nb)-SnO2 and (Co,Nb)-SnO2 systems. The metal immobilization reduces segregation during calcination, resulting in a crystalline oxide with high chemical homogeneity. The resin was pre-calcined at 300 °C for 1 hour, milled in an attritor mill at 500 rpm for 1 hour, and then calcined at 600 °C for 2 hours. X-ray diffraction (XRD) indicated the formation of the SnO2 rutile phase (JCPDS card no. 41-1445). Characterization by high-resolution scanning electron microscopy showed nanostructured spherical ceramic powder with 10-20 nm diameter. 20 mg of SnO2-based powder was kept in 20 ml of isopropyl alcohol and then taken to an electrophoretic deposition (EPD) system. The EPD method allows controlling the film thickness through the voltage or current applied in the electrophoretic cell and the time used for deposition of the ceramic particles. This procedure obtains films in a short time at low cost, bringing prospects for a new generation of smaller devices with easy technology integration. In this research, films were obtained on an alumina substrate with interdigital electrodes after applying 2 kV for 5 and 10 minutes in cells containing alcoholic suspensions of (Zn,Nb)-SnO2 and (Co,Nb)-SnO2 powders, forming a sensing layer. The substrate has integrated micro hotplates that provide instantaneous and precise temperature control when a voltage is applied. The films were sintered at 900 and 1000 °C in a 770 W microwave oven adapted by the research group with a temperature controller. This sintering is a fast process with a homogeneous heating rate, which promotes controlled grain growth and the diffusion of modifier agents, inducing the creation of intrinsic defects that change the electrical characteristics of the SnO2-based material. This study successfully demonstrated a microfabricated system with an integrated micro-hotplate for the detection of CO and NO2 gas at different concentrations and temperatures, with self-heating SnO2-based nanoparticle films, suitable both for industrial process monitoring and for detecting low concentrations in buildings/residences in order to safeguard human health. The results indicate the possibility of developing gas sensor devices with low power consumption for integration in portable electronic equipment with fast analysis. Acknowledgments: The authors thank LMA-IQ for providing the FEG-SEM images and acknowledge the financial support of this project by the Brazilian research funding agencies CNPq, FAPESP 2014/11314-9, and CEPID/CDMF-FAPESP 2013/07296-2.

Keywords: chemical synthesis, electrophoretic deposition, self-heating, gas sensor

Procedia PDF Downloads 274
1197 Biochemical and Molecular Analysis of Staphylococcus aureus Various Isolates from Different Places

Authors: Kiran Fatima, Kashif Ali

Abstract:

Staphylococcus aureus is an opportunistic human and animal pathogen that causes a variety of diseases. A total of 70 staphylococcal isolates were obtained from soil, water, yogurt, and clinical samples. The presumptive staphylococcal clinical isolates were identified phenotypically by different biochemical tests. Molecular identification was done by PCR using species-specific 16S rRNA primer pairs, and finally, 50 isolates were found to be positive as Staphylococcus aureus, S. sciuri, S. xylosus, and S. cohnii. Screened isolates were further analyzed by several microbiological diagnostic tests, including Gram staining, coagulase, capsule, hemolysis, and fermentation of glucose, lactose, maltose, and sucrose, as well as enzymatic reaction tests. It was found that 78%, 81%, and 51% of isolates were positive for gelatin hydrolysis, protease, and lipase activities, respectively. Antibiogram analysis of the isolated Staphylococcus aureus strains against different antimicrobial agents revealed resistance patterns ranging from 57 to 96%. Our study also shows 70% of strains to be MRSA, 54.3% VRSA, and 54.3% both MRSA and VRSA. All the identified isolates were subjected to detection of the mecA, nuc, and hlb genes, and 70%, 84%, and 40% were found to harbour the mecA, nuc, and hlb genes, respectively. The current investigation is highly important and informative regarding high-level multidrug-resistant Staphylococcus aureus infections, including resistance to methicillin and vancomycin.

Keywords: MRSA, VRSA, mecA, MSSA

Procedia PDF Downloads 128
1196 Tagging a corpus of Media Interviews with Diplomats: Challenges and Solutions

Authors: Roberta Facchinetti, Sara Corrizzato, Silvia Cavalieri

Abstract:

Increasing interconnection between data digitalization and linguistic investigation has given rise to unprecedented potentialities and challenges for corpus linguists, who need to master IT tools for data analysis and text processing, as well as to develop techniques for efficient and reliable annotation in specific mark-up languages that encode documents in a format that is both human- and machine-readable. In the present paper, the challenges emerging from the compilation of a linguistic corpus will be taken into consideration, focusing on the English language in particular. To do so, the case study of the InterDiplo corpus will be illustrated. The corpus, currently under development at the University of Verona (Italy), represents a novelty in terms both of the data included and of the tag set used for its annotation. The corpus covers media interviews and debates with diplomats and international operators conversing in English with journalists who do not share the same lingua-cultural background as their interviewees. To date, this appears to be the first tagged corpus of international institutional spoken discourse and will be an important database not only for linguists interested in corpus analysis but also for experts operating in international relations. In the present paper, special attention will be dedicated to the structural mark-up, parts-of-speech annotation, and tagging of discursive traits, which are the innovative parts of the project, resulting from a thorough study to find the best solution to suit the analytical needs of the data. Several aspects will be addressed, with special attention to the tagging of the speakers' identity, the communicative events, and anthropophagic. Prominence will be given to the annotation of question/answer exchanges to investigate the interlocutors' choices and how such choices impact communication. Indeed, the automated identification of questions, in relation to the expected answers, is functional to understanding how interviewers elicit information as well as how interviewees provide their answers to fulfill their respective communicative aims. A detailed description of the aforementioned elements will be given using the InterDiplo-Covid19 pilot corpus. Our preliminary analysis will highlight the viable solutions found in the construction of the corpus in terms of XML conversion, metadata definition, tagging system, and discursive-pragmatic annotation to be included via Oxygen.

Keywords: spoken corpus, diplomats’ interviews, tagging system, discursive-pragmatic annotation, english linguistics

Procedia PDF Downloads 184
1195 Reconfigurable Consensus Achievement of Multi Agent Systems Subject to Actuator Faults in a Leaderless Architecture

Authors: F. Amirarfaei, K. Khorasani

Abstract:

In this paper, reconfigurable consensus achievement of a team of agents with marginally stable linear dynamics and a single input channel is considered. The control algorithm is based on a first-order linear protocol. After the occurrence of a loss-of-effectiveness (LOE) fault in one of the actuators, using the imperfect information on actuator effectiveness provided by the fault detection and identification module, the control gain is redesigned so that consensus is still reached. The idea is based on modeling the change in effectiveness as a change in the Laplacian matrix. Then, as special cases of this class of systems, teams of single integrators as well as double integrators are considered, and their behavior subject to an LOE fault is analyzed. The well-known relative-measurement consensus protocol is applied to a leaderless team of single-integrator as well as double-integrator systems, and the Gershgorin disk theorem is employed to determine whether the fault occurrence affects system stability and team consensus achievement. The analyses show that a loss-of-effectiveness fault in the actuator(s) of integrator systems affects neither system stability nor consensus achievement.
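
A minimal numerical sketch of this setting is given below: single-integrator agents run the relative-measurement protocol x_dot = -L x on an undirected ring, an LOE fault is modeled as a diagonal effectiveness matrix scaling one agent's input, and the spectrum of the faulty matrix is inspected in the spirit of a Gershgorin-type argument. The graph, fault level, initial states, and step size are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Undirected ring of 4 agents: adjacency matrix A and Laplacian L = D - A.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

# Loss-of-effectiveness fault: agent 0's actuator works at only 40% effectiveness.
E = np.diag([0.4, 1.0, 1.0, 1.0])
L_faulty = E @ L                      # closed loop: x_dot = -E L x

x = np.array([1.0, -2.0, 0.5, 3.0])   # initial agent states
dt = 0.01
for _ in range(5000):                 # simple Euler integration of x_dot = -E L x
    x = x - dt * (L_faulty @ x)
print("states after simulation:", np.round(x, 4))   # agents converge to a common value

# Spectrum check: apart from the single zero eigenvalue, E L keeps eigenvalues with
# positive real part (E L is similar to a symmetric positive semidefinite matrix),
# so the LOE fault destroys neither stability nor consensus in this example.
print("eigenvalues of E L:", np.round(np.sort(np.linalg.eigvals(L_faulty).real), 4))
```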

Keywords: multi-agent system, actuator fault, stability analysis, consensus achievement

Procedia PDF Downloads 332
1194 Dynamic Cardiac Mitochondrial Proteome Alterations after Ischemic Preconditioning

Authors: Abdelbary Prince, Said Moussa, Hyungkyu Kim, Eman Gouda, Jin Han

Abstract:

We compared the dynamic alterations of the mitochondrial proteome of control, ischemia-reperfusion (IR), and ischemic preconditioned (IPC) rabbit hearts. Using 2-DE, we identified 29 mitochondrial proteins that were differentially expressed in the IR heart compared with the control and IPC hearts. For two of the spots, the expression patterns were confirmed by Western blotting analysis. These proteins included succinate dehydrogenase complex, acyl-CoA dehydrogenase, carnitine acetyltransferase, dihydrolipoamide dehydrogenase, ATPase, ATP synthase, dihydrolipoamide succinyltransferase, ubiquinol-cytochrome c reductase, translation elongation factor, acyl-CoA dehydrogenase, actin alpha, succinyl-CoA ligase, dihydrolipoamide S-succinyltransferase, citrate synthase, acetyl-coenzyme A dehydrogenase, creatine kinase, isocitrate dehydrogenase, pyruvate dehydrogenase, prohibitin, NADH dehydrogenase (ubiquinone) Fe-S protein, enoyl-coenzyme A hydratase, superoxide dismutase [Mn], and the 24-kDa subunit of complex I. Interestingly, most of these proteins are associated with the mitochondrial respiratory chain, antioxidant enzyme system, and energy metabolism. The results provide clues as to the cardioprotective mechanism of ischemic preconditioning at the protein level and may serve as potential biomarkers for the detection of ischemia-induced cardiac injury.

Keywords: ischemic preconditioning, mitochondria, proteome, cardioprotection

Procedia PDF Downloads 348
1197 Capnography for Detection of Return of Spontaneous Circulation in Pseudo-PEA

Authors: Yiyuan David Hu, Alex Lindqwister, Samuel B. Klein, Karen Moodie, Norman A. Paradis

Abstract:

Introduction: Pseudo-pulseless electrical activity (p-PEA) is a lifeless form of profound cardiac shock characterized by measurable cardiac mechanical activity without clinically detectable pulses. Patients in pseudo-PEA carry different prognoses than those in true PEA and may require different therapies. End-tidal carbon dioxide (ET-CO2) is a reliable indicator of the return of spontaneous circulation (ROSC) in ventricular fibrillation and true PEA but has not been studied in p-PEA. Hypothesis: ET-CO2 can be used as an independent indicator of ROSC in p-PEA resuscitation. Methods: 30 kg female swine (N = 14) under intravenous anesthesia were instrumented with aortic and right atrial micromanometer pressure catheters. ECG and ET-CO2 were measured continuously. p-PEA was induced by ventilation with 6% oxygen in 94% nitrogen and was defined as a systolic aortic pressure less than 40 mmHg. The statistical relationships between ET-CO2 and ROSC are reported. Results: ET-CO2 during resuscitation strongly correlated with ROSC (Figure 1). Mean ET-CO2 during p-PEA was 28.4 ± 8.4, while mean ET-CO2 in ROSC for the 100% O2 cohort was 42.2 ± 12.6 (p < 0.0001), and mean ET-CO2 in ROSC for 100% O2 + CPR was 33.0 ± 15.4 (p < 0.0001). Analysis of slope was limited to one minute of resuscitation data to capture local linearity; assessment began 10 seconds after resuscitation started to allow the ventilator to mix 100% O2. Pigs that would recover with 100% O2 had a slope of 0.023 ± 0.001, oxygen + CPR had a slope of 0.018 ± 0.002, and oxygen + CPR + epinephrine had a slope of 0.0050 ± 0.0009. Conclusions: During resuscitation from porcine hypoxic p-PEA, a rise in ET-CO2 is indicative of ROSC.

Keywords: ET-CO2, resuscitation, capnography, pseudo-PEA

Procedia PDF Downloads 186
1192 Design of Speed Bump Recognition System Integrated with Adjustable Shock Absorber Control

Authors: Ming-Yen Chang, Sheng-Hung Ke

Abstract:

This research focuses on the development of a speed bump identification system for real-time control of adjustable shock absorbers in vehicular suspension systems. The study initially involved the collection of images of various speed bumps and rubber speed bump profiles found on roadways. These images were utilized for training and recognition through the deep learning object detection algorithm YOLOv5. Subsequently, the trained speed bump identification program was integrated with an in-vehicle camera system for live image capture during driving. These images were instantly transmitted to a computer for processing. Using the principles of monocular vision ranging, the distance between the vehicle and an approaching speed bump was determined. The appropriate control distance was established through both practical vehicle measurements and theoretical calculations. In conjunction with the electronically adjustable shock absorbers equipped on the vehicle, a shock absorber control system was devised to dynamically adapt the damping force just prior to encountering a speed bump. This system effectively mitigates passenger discomfort and enhances ride quality.
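
A minimal sketch of the monocular ranging step is given below, using the pinhole-camera relation distance = real width × focal length / pixel width. The bump width, focal length, and control distance are illustrative values; in the actual system the bounding box would come from the YOLOv5 detector and the constants from camera calibration and vehicle tests.

```python
def monocular_distance(bbox_width_px: float,
                       real_width_m: float = 0.35,
                       focal_length_px: float = 1400.0) -> float:
    """Estimate the distance to a speed bump from its bounding-box width.

    Pinhole-camera model: distance = real width * focal length / pixel width.
    real_width_m and focal_length_px are illustrative; in practice they come
    from the bump specification and from camera calibration.
    """
    return real_width_m * focal_length_px / bbox_width_px

# Example: a YOLOv5 detection whose bounding box is 90 px wide.
distance = monocular_distance(bbox_width_px=90.0)
print(f"estimated distance: {distance:.2f} m")

# Adapt the damping force once the bump comes within the control distance.
CONTROL_DISTANCE_M = 6.0          # assumed threshold from vehicle tests
if distance <= CONTROL_DISTANCE_M:
    print("soften damping before reaching the bump")
```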

Keywords: adjustable shock absorbers, image recognition, monocular vision ranging, ride quality

Procedia PDF Downloads 65
1191 An Integrative Computational Pipeline for Detection of Tumor Epitopes in Cancer Patients

Authors: Tanushree Jaitly, Shailendra Gupta, Leila Taher, Gerold Schuler, Julio Vera

Abstract:

Genomics-based personalized medicine is a promising approach to fight aggressive tumors based on a patient's specific tumor mutation and expression profiles. A remarkable case is dendritic cell-based immunotherapy, in which tumor epitopes targeting a patient's specific mutations are used to design a vaccine that helps stimulate cytotoxic T-cell-mediated anticancer immunity. Here we present a computational pipeline for epitope-based personalized cancer vaccines using patient-specific haplotype and cancer mutation profiles. In the proposed workflow, we analyze whole-exome sequencing and RNA sequencing patient data to detect patient-specific mutations and their expression levels. Epitopes containing the tumor mutations are computationally predicted using the patient's haplotype and filtered based on their expression level, binding affinity, and immunogenicity. We calculate the binding energy for each filtered major histocompatibility complex (MHC)-peptide complex using docking studies and use this feature to further select good epitope candidates.
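
The filtering step of such a pipeline can be sketched as follows; the candidate peptides, field names, and thresholds are hypothetical, and real values would come from the WES/RNA-seq analysis and the MHC binding and immunogenicity predictors.

```python
from dataclasses import dataclass

@dataclass
class Epitope:
    peptide: str
    gene: str
    tpm: float             # expression of the mutated transcript (RNA-seq)
    ic50_nm: float          # predicted MHC binding affinity (lower = stronger)
    immunogenicity: float   # predicted immunogenicity score

# Hypothetical candidates; in the pipeline these come from WES/RNA-seq plus MHC predictors.
candidates = [
    Epitope("KLYDEVKFL", "TP53", 35.0,  48.0, 0.62),
    Epitope("SLWGQPAEA", "KRAS",  2.1, 310.0, 0.55),
    Epitope("GILGFVFTL", "EGFR", 18.4, 520.0, 0.21),
]

def select(epitopes, min_tpm=5.0, max_ic50=500.0, min_immunogenicity=0.3):
    """Keep candidates that pass expression, binding-affinity, and immunogenicity cutoffs."""
    return [e for e in epitopes
            if e.tpm >= min_tpm
            and e.ic50_nm <= max_ic50
            and e.immunogenicity >= min_immunogenicity]

for e in select(candidates):
    print(e.peptide, e.gene)    # candidates passed on to docking-based ranking
```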

Keywords: cancer immunotherapy, epitope prediction, NGS data, personalized medicine

Procedia PDF Downloads 249
1190 Reading against the Grain: Transcodifying Stimulus Meaning

Authors: Aba-Carina Pârlog

Abstract:

On translating, reading against the grain results in a wrong effect in the TL. Quine’s ocular irradiation plays an important part in the process of understanding and translating a text. The various types of textual radiation must be rendered by the translator by paying close attention to the types of field that produce it. The literary work must be seen as an indirect cause of an expressive effect in the TL that is supposed to be similar to the effect it has in the SL. If the adaptive transformative codes are so flexible that they encourage the translator to repeatedly leave out parts of the original work, then a subversive pattern emerges which changes the entire book. In this case, the translator is a writer per se who decides what goes in and out of the book, how the style is to be ciphered and what elements of ideology are to be highlighted. Figurative language must not be flattened for the sake of clarity or naturalness. The missing figurative elements make the translated text less interesting, less challenging and less vivid which reflects poorly on the writer. There is a close connection between style and the writer’s person. If the writer’s style is very much changed in a translation, the translation is useless as the original writer and his / her imaginative world can no longer be discovered. Then, a different writer appears and his / her creation surfaces. Changing meaning considered as a “negative shift” in translation defines one of the faulty transformative codes used by some translators. It is a dangerous tool which leads to adaptations that sometimes reflect the original less than the reader would wish to. It contradicts the very essence of the process of translation which is that of making a work available in a foreign language. Employing speculative aesthetics at the level of a text indicates the wish to create manipulative or subversive effects in the translated work. This is generally achieved by adding new words or connotations, creating new figures of speech or using explicitations. The irradiation patterns of the original work are neglected and the translator creates new meanings, implications, emphases and contexts. Again s/he turns into a new author who enjoys the freedom of expressing his / her ideas without the constraints of the original text. The stimulus meaning of a text is very important for a translator which is why reading against the grain is unadvisable during the process of translation. By paying attention to the waves of the SL input, a faithful literary work is produced which does not contradict general knowledge about foreign cultures and civilizations. Following personal common sense is essential in the field of translation as well as everywhere else.

Keywords: stimulus meaning, substance of expression, transformative code, translation

Procedia PDF Downloads 445
1189 A Sensitive Uric Acid Electrochemical Sensing in Biofluids Based on Ni/Zn Hydroxide Nanocatalyst

Authors: Nathalia Florencia Barros Azeredo, Josué Martins Gonçalves, Pamela De Oliveira Rossini, Koiti Araki, Lucio Angnes

Abstract:

This work demonstrates the electroanalysis of uric acid (UA) at very low working potential (0 V vs Ag/AgCl) directly in body fluids such as saliva and sweat using electrodes modified with mixed Ni0.75Zn0.25(OH)2 nanoparticles exhibiting stable electrocatalytic responses from alkaline down to weakly acidic media (pH 14 to 3 range). These materials were prepared for the first time and fully characterized by TEM, XRD, and spectroscopic techniques. The electrochemical properties of the modified electrodes were evaluated in a fast and simple procedure for uric acid analysis based on cyclic voltammetry and chronoamperometry, pushing down the detection and quantification limits (2.3 × 10⁻⁸ and 7.6 × 10⁻⁸ mol L⁻¹, respectively) with good repeatability (RSD = 3.2% for 30 successive analyses at pH 14). Finally, the possibility of real application was demonstrated by the realization of unexpectedly robust and sensitive modified FTO (fluorine-doped tin oxide) glass and screen-printed sensors for the measurement of uric acid directly in real saliva and sweat samples, with no significant interference from the usual concentrations of ascorbic acid, acetaminophen, lactate, and glucose present in those body fluids (Fig. 1).

Keywords: nickel hydroxide, mixed catalyst, uric acid sensors, biofluids

Procedia PDF Downloads 126
1188 Measurement System for Human Arm Muscle Magnetic Field and Grip Strength

Authors: Shuai Yuan, Minxia Shi, Xu Zhang, Jianzhi Yang, Kangqi Tian, Yuzheng Ma

Abstract:

The precise measurement of muscle activity is essential for understanding the function of various body movements. This work aims to develop a muscle magnetic field signal detection system based on mathematical analysis. Medical research has underscored that early detection of muscle atrophy, coupled with lifestyle adjustments such as dietary control and increased exercise, can significantly improve outcomes in muscle-related diseases. Currently, surface electromyography (sEMG) is widely employed in research as an early predictor of muscle atrophy. Nonetheless, the primary limitation of using sEMG to forecast muscle strength is its inability to directly measure the signals generated by muscles. Challenges arise from potential skin-electrode contact issues due to perspiration, leading to inaccurate signals or even signal loss. Additionally, resistance and phase are significantly impacted by adipose layers. The recent emergence of optically pumped magnetometers introduces a fresh avenue for bio-magnetic field measurement techniques. These magnetometers possess high sensitivity and, unlike superconducting quantum interference devices (SQUIDs), obviate the need for a cryogenic environment. They detect muscle magnetic field signals in the range of tens to thousands of femtoteslas (fT). The use of magnetometers to capture muscle magnetic field signals remains unaffected by perspiration and adipose layers. Since their introduction, optically pumped atomic magnetometers have found extensive application in exploring organ magnetic fields such as cardiac and brain magnetism. The optimal operation of these magnetometers necessitates an environment with an ultra-weak magnetic field. To achieve such an environment, researchers usually combine active magnetic compensation technology with passive magnetic shielding technology. Passive magnetic shielding uses a shielding device built with high-permeability materials to attenuate the external magnetic field to a few nT. Compared with adding more shielding layers, coils that generate a reverse magnetic field to precisely compensate for the residual field are cheaper and more flexible. To attain even lower magnetic fields, compensation coils designed using the Biot-Savart law are employed to generate a counteractive magnetic field that eliminates residual fields. By solving the magnetic field expression at discrete points in the target region, the parameters that determine the current density distribution on the plane can be obtained through the conventional target field method. The current density is obtained from the partial derivatives of the stream function, which can be represented by a combination of trigonometric functions. Mathematical optimization algorithms are introduced into the coil design to obtain the optimal current density distribution. A one-dimensional linear regression analysis was performed on the collected data, obtaining a coefficient of determination R² of 0.9349 with a p-value of approximately 0. This statistical result indicates a stable relationship between the peak-to-peak value (PPV) of the muscle magnetic field signal and the magnitude of grip strength. This system is expected to be a widely used tool for healthcare professionals to gain deeper insights into the muscle health of their patients.
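
The final regression step can be illustrated with a short sketch using scipy; the paired grip-strength and PPV values below are made up for demonstration, and real inputs would come from the magnetometer system described above.

```python
import numpy as np
from scipy import stats

# Hypothetical paired measurements: grip strength (kg) and the peak-to-peak
# value (PPV, in pT) of the muscle magnetic field signal.
grip_kg = np.array([10, 15, 20, 25, 30, 35, 40, 45])
ppv_pt = np.array([1.1, 1.6, 2.2, 2.6, 3.3, 3.7, 4.4, 4.8])

res = stats.linregress(grip_kg, ppv_pt)
print(f"slope = {res.slope:.3f} pT/kg, intercept = {res.intercept:.3f} pT")
print(f"R^2 = {res.rvalue**2:.4f}, p-value = {res.pvalue:.2e}")
```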

Keywords: muscle magnetic signal, magnetic shielding, compensation coils, trigonometric functions

Procedia PDF Downloads 55
1187 Development of Trigger Tool to Identify Adverse Drug Events From Warfarin Administered to Patients Admitted in Medical Wards of Chumphae Hospital

Authors: Puntarikorn Rungrattanakasin

Abstract:

Objectives: To develop a trigger tool to warn of the risk of bleeding as an adverse event of warfarin usage during admission to the medical wards of Chumphae Hospital. Methods: A retrospective study was performed by reviewing the medical records of patients admitted between June 1st, 2020 and May 31st, 2021. Adverse drug events (ADEs) were evaluated by Naranjo's algorithm. The international normalized ratio (INR) and bleeding events during admission were collected. Statistical analyses, including the Chi-square test and a Receiver Operating Characteristic (ROC) curve for the optimal INR threshold, were used in the study. Results: Among the 139 admissions, the INR was found to vary between 0.86 and 14.91; there was a total of 15 bleeding events, of which 9 were mild and 6 were severe. Bleeding occurred whenever the INR was greater than 2.5, and this association reached statistical significance (p < 0.05), in concordance with the ROC curve, which yielded 100% sensitivity and 60% specificity in the detection of a bleeding event. In this regard, an INR greater than 2.5 was considered the optimal threshold to alert promptly for bleeding tendency. Conclusions: An INR value greater than 2.5 would be an appropriate trigger tool to warn of the risk of bleeding for patients taking warfarin in Chumphae Hospital.
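
A minimal sketch of how such a threshold can be read off an ROC curve (here with scikit-learn) is shown below; the admission data are hypothetical, and the rule of keeping 100% sensitivity mirrors the requirement for a warning trigger.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical admissions: maximum INR during admission and whether bleeding occurred.
inr = np.array([1.2, 1.8, 2.1, 2.4, 2.6, 2.9, 3.4, 4.0, 5.5, 8.0])
bleeding = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1])

fpr, tpr, thresholds = roc_curve(bleeding, inr)

# Among all cut-offs that still flag every bleeding event (sensitivity = 1),
# take the highest one, i.e., the one with the fewest false alarms.
candidates = thresholds[tpr == 1.0]
trigger_inr = candidates.max()
print(f"warn when INR >= {trigger_inr:.1f}")
```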

Keywords: trigger tool, warfarin, risk of bleeding, medical wards

Procedia PDF Downloads 145
1186 Computer Aided Analysis of Breast Based Diagnostic Problems from Mammograms Using Image Processing and Deep Learning Methods

Authors: Ali Berkan Ural

Abstract:

This paper presents the analysis, evaluation, and pre-diagnosis of early-stage breast-based diagnostic problems (breast cancer, nodules, or lumps) by a Computer-Aided Diagnosis (CAD) system from mammogram radiological images. According to the statistics, the time factor is crucial for discovering the disease in the patient (especially in women) as early and as quickly as possible. In the study, a new algorithm is developed using advanced image processing and deep learning methods to detect and classify the problem at an early stage with higher accuracy. The system first works with image processing methods (image acquisition, noise removal, region growing segmentation, morphological operations, breast border extraction, advanced segmentation, obtaining regions of interest (ROIs), etc.) to segment the area of interest of the breast, and then analyzes these partially obtained areas for cancer/lump detection in order to diagnose the disease. After segmentation, using the spectrogram images, five different deep learning based methods (the Convolutional Neural Network (CNN) based AlexNet, ResNet50, VGG16, DenseNet, and Xception) are applied to classify the breast-based problems.
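
As an illustration of the classification stage, the following PyTorch sketch fine-tunes a pretrained ResNet50 on two-class ROI images; the folder layout, class names, and hyperparameters are placeholders rather than the paper's exact configuration.

```python
import torch
import torch.nn as nn
from torchvision import models, transforms, datasets

# Segmented ROI images are assumed to live in class subfolders
# (e.g., rois/train/benign, rois/train/malignant); the path is a placeholder.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.Grayscale(num_output_channels=3),   # mammogram ROIs are single-channel
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("rois/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=16, shuffle=True)

model = models.resnet50(weights="IMAGENET1K_V1")   # pretrained backbone
model.fc = nn.Linear(model.fc.in_features, 2)      # two classes: benign / malignant

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

The same loop applies unchanged to the other backbones listed above (AlexNet, VGG16, DenseNet, Xception) by swapping the model constructor and its final classification layer.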

Keywords: computer aided diagnosis, breast cancer, region growing, segmentation, deep learning

Procedia PDF Downloads 92
1185 Detection of Paenibacillus larvae (American Foulbrood Disease) by the PCR and Culture in the Remains of the Hive Collected at the Bottom of the Colony

Authors: N. Adjlane, N. Haddad

Abstract:

American foulbrood is one of the most serious diseases that may affect brood at the larval and pupal stages. The causative organism is a Gram-positive bacterium, Paenibacillus larvae. Apiaries infected with American foulbrood suffer severe economic losses, resulting from significant decreases in honeybee populations and honey production. The aim of this study was to detect Paenibacillus larvae in the remains collected at the bottom of suspected hives by direct PCR and culture. A total of 56 suspected beehive wax debris samples were collected in 40 different apiaries located in the central region of Algeria. The MYPGP culture medium was used for all identifications of the bacterium. After positive results on samples, biochemical confirmation tests (catalase test, casein hydrolysis) and microscopy (Gram stain) were used to verify the accuracy of the initial results. The QIAamp DNA Mini Kit was used to extract the DNA of Paenibacillus larvae. Paenibacillus larvae was identified in 14 out of 16 samples by PCR. A suspected culture-negative sample was found positive through evaluation with PCR. Detection of the bacterium Paenibacillus larvae in the debris of the colony is thus an effective method for the diagnosis of American foulbrood.

Keywords: Paenibacillus larvae, honeybee, PCR, microbiological method

Procedia PDF Downloads 409
1184 Analysis of Vibratory Signals Based on Local Mean Decomposition (LMD) for Rolling Bearing Fault Diagnosis

Authors: Toufik Bensana, Medkour Mihoub, Slimane Mekhilef

Abstract:

The use of vibration analysis has been established as the most common and reliable method of analysis in the field of condition monitoring and diagnostics of rotating machinery. Rolling bearings are used in a broad range of rotary machines and play a crucial role in the modern manufacturing industry. Unfortunately, the vibration signals collected from a faulty bearing are generally nonstationary, nonlinear, and contaminated by strong noise, so it is essential to extract the fault features correctly. In this paper, a novel numerical analysis method based on local mean decomposition (LMD) is proposed. LMD decomposes the signal into a series of product functions (PFs), each of which is the product of an envelope signal and a purely frequency-modulated (FM) signal. The envelope of a PF is the instantaneous amplitude (IA), and the derivative of the unwrapped phase of the purely frequency-modulated signal is the instantaneous frequency (IF). The fault characteristic frequency of the rolling bearing can then be extracted by performing spectrum analysis on the instantaneous amplitude of the PF component containing the dominant fault information. The results show the effectiveness of the proposed technique in fault detection and diagnosis of rolling element bearings.
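
A minimal sketch of the final envelope-spectrum step is shown below on a simulated fault signal; the Hilbert transform stands in for the instantaneous amplitude of a PF, and the sampling rate and fault frequency are assumed values (the LMD sifting itself is omitted).

```python
import numpy as np
from scipy.signal import hilbert

fs = 12_000                       # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
bpfo = 107.0                      # assumed outer-race fault characteristic frequency (Hz)

# Simulated PF-like component: a 3 kHz resonance amplitude-modulated at the fault frequency.
carrier = np.sin(2 * np.pi * 3000 * t)
modulation = 1.0 + 0.8 * np.sign(np.sin(2 * np.pi * bpfo * t))
signal = modulation * carrier + 0.2 * np.random.randn(t.size)

# Instantaneous amplitude (envelope) via the analytic signal, then its spectrum.
envelope = np.abs(hilbert(signal))
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, d=1 / fs)

peak = freqs[np.argmax(spectrum[freqs < 500])]     # search below 500 Hz
print(f"dominant envelope frequency: {peak:.1f} Hz (fault frequency ~{bpfo} Hz)")
```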

Keywords: fault diagnosis, rolling element bearing, local mean decomposition, condition monitoring

Procedia PDF Downloads 388
1183 Giftedness Cloud Model: A Psychological and Ecological Vision of Giftedness Concept

Authors: Rimeyah H. S. Almutairi, Alaa Eldin A. Ayoub

Abstract:

The aim of this study was to identify empirical and theoretical studies that explore giftedness theories and identification, in order to assess and synthesize the mechanisms, outcomes, and impacts of gifted identification models. We thus sought to provide an evidence-informed answer to how current giftedness theories work and how effective they are, with a view to developing a model that incorporates the advantages of existing models and avoids their disadvantages as much as possible. We conducted a systematic literature review (SLR). The disciplined analysis resulted in a final sample of 30 appropriate studies. The results indicated that: (a) there is no uniform and consistent definition of giftedness; (b) researchers use several inconsistent criteria to identify the gifted; and (c) the detection of talent is largely limited to early ages, with obvious neglect of adults. This study contributes to the development of the Giftedness Cloud Model (GCM), defined as a model that attempts to interpret giftedness within an interactive psychological and ecological framework. GCM aims to help the talented reach the core of giftedness and manifest their talent in creative productivity or invention. Besides that, GCM suggests classifying giftedness into four levels: mastery, excellence, creative productivity, and manifestation. In addition, GCM presents an idea for distinguishing between talent and giftedness.

Keywords: giftedness cloud model, talent, systematic literature review, giftedness concept

Procedia PDF Downloads 165
1182 Vapochromism of 3,3’,5,5’-Tetramethylbenzidine-Tetrasilisicfluormica Intercalation Compounds with High Selectivity for Water and Acetonitrile

Authors: Reira Kinoshita, Shin'ichi Ishimaru

Abstract:

Vapochromism is a type of chromism in which the color of a substance changes when it is exposed to the vapor of volatile materials; it has been investigated for application in chemical sensors for volatile organic compounds that cause sick building syndrome and health hazards in workspaces. We synthesized intercalation compounds of 3,3',5,5'-tetramethylbenzidine (TMB) and tetrasilisicfluormica (TSFM) by the commonly used cation-exchange method with cation ratios TMB²⁺/CEC of TSFM = 1.0, 2.0, 2.7, and 5.4 to investigate the vapochromism of these materials. The obtained samples were characterized by powder XRD, XRF, TG-DTA, N₂ adsorption, and SEM. Vapochromism was measured for each sample under a controlled atmosphere with a handy reflectance spectrometer directly from the outside of the glass sample tubes. The color was yellow for all specimens vacuum-dried at 50 °C, but it turned green under H₂O vapor exposure for the samples with TMB²⁺/CEC = 2.0, 2.7, and 5.4, and blue under acetonitrile vapor for all cation ratios. In particular, the sample with TMB²⁺/CEC = 2.0 showed clear chromism for both water and acetonitrile. On the other hand, no clear color change was observed for vapors of alcohols, acetone, and non-polar solvents. From these results, this material can be expected to be applicable for easy detection of humidity and acetonitrile vapor in the environment.

Keywords: chemical sensor, intercalation compound, tetramethylbenzidine, tetrasilisicfluormica, vapochromism, volatile organic compounds

Procedia PDF Downloads 116
1181 Evidence-Based Practices in Education: A General Review of the Literature on Elementary Classroom Setting

Authors: Carolina S. Correia, Thalita V. Thomé, Andersen Boniolo, Dhayana I. Veiga

Abstract:

Evidence-based practices (EBP) in education refers to a set of principles and practices used to inform educational policy; it involves the integration of professional expertise in education with the best empirical evidence in making decisions about how to deliver instruction. The purpose of this presentation is to describe and characterize studies about EBP in education in the elementary classroom setting. The data presented here are part of an ongoing systematic review. Articles were searched and selected from four academic databases: ProQuest, SciELO, ScienceDirect, and CAPES. The search terms were evidence-based practices or program effectiveness, and education or teaching or teaching practices or teaching methods. Articles were included according to the following criteria: the studies were explicitly described as evidence-based or discussed the most effective practices in education, and they discussed teaching practices in the classroom context at the elementary school level. Document excerpts were extracted and recorded in Excel, organized by reference, descriptors, abstract, purpose, setting, participants, type of teaching practice, study design, and main results. The total number of articles retrieved was 1,185: 569 from the ProQuest Research Library, 216 from CAPES, 251 from ScienceDirect, and 149 from the SciELO Library. The potentially relevant references numbered 178, from which duplicates were removed. The final number of articles analyzed was 140. Of the 140 articles, 47 are theoretical studies and 93 are empirical. The following research design methods were identified: longitudinal intervention study, cluster-randomized trial, meta-analysis, and pretest-posttest studies. Of the 140 articles, 103 studies were about regular school teaching and 37 were on special education teaching practices. Several studies used the following teaching methods: active learning, content acquisition podcasts (CAP), precision teaching (PT), mediated reading practice, speech therapy programs, and peer-assisted learning strategies (PALS). The countries of origin of the studies were the United States of America, United Kingdom, Panama, Sweden, Scotland, South Korea, Argentina, Chile, New Zealand, and Brunei. The present study is an ongoing project, so some representative findings will be discussed, providing further insight into the best teaching practices in the elementary classroom setting.

Keywords: best practices, children, evidence-based education, elementary school, teaching methods

Procedia PDF Downloads 331
1180 Sea-Land Segmentation Method Based on the Transformer with Enhanced Edge Supervision

Authors: Lianzhong Zhang, Chao Huang

Abstract:

Sea-land segmentation is a basic step in many tasks such as sea surface monitoring and ship detection. Existing sea-land segmentation algorithms have poor segmentation accuracy, and their parameter adjustments are cumbersome and difficult to meet actual needs. Moreover, current sea-land segmentation methods adopt traditional deep learning models based on Convolutional Neural Networks (CNNs). At present, the transformer architecture has achieved great success in the field of natural images, but its application to radar images has been less studied. Therefore, this paper proposes a sea-land segmentation method based on the transformer architecture with strengthened edge supervision. It uses a self-attention mechanism with a gating strategy to better learn the relative position bias. Meanwhile, an additional edge supervision branch is introduced. The decoder stage allows the feature information of the two branches to interact, thereby improving the edge precision of the sea-land segmentation. Based on the Gaofen-3 satellite image dataset, the experimental results show that the proposed method can effectively improve the accuracy of sea-land segmentation, especially at sea-land edges. The mean IoU (Intersection over Union), edge precision, overall precision, and F1 score reach 96.36%, 84.54%, 99.74%, and 98.05%, respectively, which are superior to those of mainstream segmentation models and offer high practical application value.
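
For reference, a minimal sketch of the two reported metrics, mean IoU and a tolerance-based edge precision, for binary sea-land masks is given below; the edge definition and tolerance are illustrative choices, not necessarily those used in the paper.

```python
import numpy as np
from scipy.ndimage import binary_erosion, binary_dilation

def mean_iou(pred: np.ndarray, truth: np.ndarray) -> float:
    """Mean IoU over the two classes (sea = 0, land = 1) of binary masks."""
    ious = []
    for cls in (0, 1):
        p, t = pred == cls, truth == cls
        inter, union = np.logical_and(p, t).sum(), np.logical_or(p, t).sum()
        ious.append(inter / union if union else 1.0)
    return float(np.mean(ious))

def edge_precision(pred: np.ndarray, truth: np.ndarray, tol: int = 2) -> float:
    """Fraction of predicted edge pixels lying within `tol` pixels of a true edge."""
    pred_edge = pred ^ binary_erosion(pred)
    true_edge = truth ^ binary_erosion(truth)
    true_band = binary_dilation(true_edge, iterations=tol)
    return float(np.logical_and(pred_edge, true_band).sum() / max(pred_edge.sum(), 1))

truth = np.zeros((64, 64), dtype=bool)
truth[:, 32:] = True                 # toy ground truth: right half is land
pred = np.zeros_like(truth)
pred[:, 30:] = True                  # slightly shifted prediction
print(f"mIoU: {mean_iou(pred, truth):.3f}, edge precision: {edge_precision(pred, truth):.3f}")
```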

Keywords: SAR, sea-land segmentation, deep learning, transformer

Procedia PDF Downloads 176
1179 Identification System for Grading Banana in Food Processing Industry

Authors: Ebenezer O. Olaniyi, Oyebade K. Oyedotun, Khashman Adnan

Abstract:

In the food industry, high-quality production is required within a limited time to meet societal demand. In this research work, we have developed a model that can replace the human operator, whose output is low and whose decisions are slow and subject to individual differences in deciding whether a banana is defective or healthy. This model can perform the visual judgment of human operators in deciding whether a banana is defective or healthy for food production. The research work is divided into two phases. The first phase is image processing, where several techniques such as colour conversion, edge detection, thresholding, and morphological operations were employed to extract features for training and testing the network in the second phase. The features extracted in the first phase were used in the second, classification phase, where a multilayer perceptron trained with the backpropagation algorithm was employed. After the network had learned and converged, it was tested in feedforward mode to determine its performance. From this experiment, a recognition rate of 97% was obtained within a limited time, which makes the system suitable for use in the food industry.
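
A minimal sketch of the two-phase flow (hand-crafted image features, then an MLP trained by backpropagation) is shown below using scikit-image and scikit-learn; the features, random stand-in images, and network size are illustrative only.

```python
import numpy as np
from skimage import color, feature, filters
from sklearn.neural_network import MLPClassifier

def extract_features(rgb_image: np.ndarray) -> np.ndarray:
    """Toy feature vector: mean RGB, dark-spot ratio, and edge density."""
    gray = color.rgb2gray(rgb_image)
    edges = feature.canny(gray)
    dark = gray < filters.threshold_otsu(gray)
    return np.array([*rgb_image.reshape(-1, 3).mean(axis=0),
                     dark.mean(), edges.mean()])

# Hypothetical training data: random images standing in for banana photos,
# with label 1 = defective and 0 = healthy.
rng = np.random.default_rng(0)
images = [rng.random((64, 64, 3)) for _ in range(20)]
labels = rng.integers(0, 2, size=20)

X = np.vstack([extract_features(img) for img in images])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, labels)                       # backpropagation training phase
print("predicted:", clf.predict(X[:5]))  # feedforward pass at test time
```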

Keywords: banana, food processing, identification system, neural network

Procedia PDF Downloads 466
1178 Design and Characterization of a Smart Composite Fabric for Knee Brace

Authors: Rohith J. K., Amir Nazemi, Abbas S. Milani

Abstract:

In Paralympic sports, athletes often depend on some form of equipment to enable competitive sporting, where most of this equipment allows only passive physiological support and discrete physiological measurements. Active-feedback physiological support and continuous detection of performance indicators, without time or space constraints, would be beneficial for more effective training and performance measurement of Paralympic athletes. Moreover, the athletes occasionally suffer from fatigue and muscular strains due to improper monitoring systems. These challenges can be overcome by using smart composite technology when manufacturing, e.g., knee braces and other sports wearables, where the sensors can be fused into the fabric and an assistive system can actively support the athlete. This paper shows how different sensing functionalities may be created by intrinsic and extrinsic modifications of different types of composite fabrics, depending on the level of integration and the employed functional elements. Results demonstrate that fabric sensors can be well tailored to measure muscular strain and be used in the fabrication of a smart knee brace as a sample potential application. Materials, connectors, fabric circuits, interconnects, encapsulation, and fabrication methods associated with such smart fabric technologies prove to be customizable and versatile.

Keywords: smart composites, sensors, smart fabrics, knee brace

Procedia PDF Downloads 176
1177 Oral Examination: An Important Adjunct to the Diagnosis of Dermatological Disorders

Authors: Sanjay Saraf

Abstract:

The oral cavity can be the site for early manifestations of mucocutaneous disorders (MD) or the only site for the occurrence of these disorders. It can also exhibit oral lesions with simultaneous associated skin lesions. MD involving the oral mucosa commonly presents with signs such as ulcers, vesicles, and bullae. The unique environment of the oral cavity may modify these signs of the disease, thereby making the clinical diagnosis an arduous task. In addition, the overlapping signs of various mucocutaneous disorders also make the clinical diagnosis more intricate. The aim of this review is to present the oral signs of dermatological disorders with common oral involvement and to emphasize their importance in the early detection of systemic disorders. The aim is also to highlight the necessity of an oral examination by the dermatologist when examining skin lesions. Prior to the oral examination, it is imperative for dermatologists and dental clinicians to have knowledge of oral anatomy. It is also important to know the impact of various diseases on the oral mucosa and the characteristic features of various oral mucocutaneous lesions. An initial clinical oral examination may help in the early diagnosis of MD. Failure to identify the oral manifestations may reduce the likelihood of early treatment and lead to more serious problems. This paper reviews the oral manifestations of immune-mediated dermatological disorders with common oral manifestations.

Keywords: dermatological investigations, genodermatosis, histological features, oral examination

Procedia PDF Downloads 354
1176 The Influence of Cognitive Load in the Acquisition of Words through Sentence or Essay Writing

Authors: Breno Barrreto Silva, Agnieszka Otwinowska, Katarzyna Kutylowska

Abstract:

Research comparing lexical learning following the writing of sentences and longer texts with keywords is limited and contradictory. One possibility is that the recursivity of writing may enhance processing and increase lexical learning; another possibility is that the higher cognitive load of complex-text writing (e.g., essays), at least when timed, may hinder the learning of words. In our study, we selected two sets of 10 academic keywords matched for part of speech, length (number of characters), frequency (SUBTLEXus), and concreteness, and we asked 90 L1-Polish advanced-level English majors to use the keywords when writing sentences, timed essays (60 minutes), or untimed essays. First, all participants wrote a timed Control essay (60 minutes) without keywords. Then different groups produced Timed essays (60 minutes; n=33), Untimed essays (n=24), or Sentences (n=33) using the two sets of glossed keywords (counterbalanced). The comparability of the participants in the three groups was ensured by matching them for proficiency in English (LexTALE) and for a few measures derived from the Control essay: VocD (assessing productive lexical diversity), normed errors (assessing productive accuracy), words per minute (assessing productive written fluency), and holistic scores (assessing overall quality of production). We measured lexical learning (depth and breadth) via an adapted Vocabulary Knowledge Scale (VKS) and a free association test. Cognitive load was measured in the three essays (Control, Timed, Untimed) using the normed number of errors and holistic scores (TOEFL criteria). The number of errors and essay scores were obtained from two raters (interrater reliability Pearson's r = .78-.91). Generalized linear mixed models showed no difference in the breadth and depth of keyword knowledge after writing Sentences, Timed essays, and Untimed essays. The task-based measurements found that Control and Timed essays had similar holistic scores, but that Untimed essays had better quality than Timed essays. Also, Untimed essays were the most accurate, and Timed essays the most error-prone. In conclusion, using keywords in Timed, but not Untimed, essays increased cognitive load, leading to more errors and lower quality. Still, writing sentences and essays yielded similar lexical learning, and differences in cognitive load between Timed and Untimed essays did not affect lexical acquisition.

Keywords: learning academic words, writing essays, cognitive load, english as an L2

Procedia PDF Downloads 71
1175 Parameter Estimation for Contact Tracing in Graph-Based Models

Authors: Augustine Okolie, Johannes Müller, Mirjam Kretzchmar

Abstract:

We adopt a maximum-likelihood framework to estimate parameters of a stochastic susceptible-infected-recovered (SIR) model with contact tracing on a rooted random tree. Given the number of detectees per index case, our estimator allows us to determine the degree distribution of the random tree as well as the tracing probability. Since we do not discover all infectees via contact tracing, this estimation is non-trivial. To keep things simple and stable, we develop an approximation suited for realistic situations (the contact tracing probability is small, or the probability of detecting index cases is small). In this approximation, the only epidemiological parameter entering the estimator is the basic reproduction number R0. The estimator is tested in a simulation study and applied to COVID-19 contact tracing data from India. The simulation study underlines the efficiency of the method. For the empirical COVID-19 data, we are able to compare different degree distributions and perform a sensitivity analysis. We find that a power-law and a negative binomial degree distribution in particular fit the data well and that the tracing probability is rather large. The sensitivity analysis shows no strong dependency on the reproduction number.
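
To illustrate the maximum-likelihood mechanics (not the authors' tree-based estimator), the toy sketch below collapses the model to independent index cases: secondary cases follow a negative binomial offspring distribution with mean R0, each secondary case is traced with probability p, and the dispersion and tracing probability are then recovered by maximizing the likelihood of the observed detectee counts.

```python
import numpy as np
from scipy import stats, optimize

R0 = 2.5                      # basic reproduction number, assumed known
rng = np.random.default_rng(1)

# Toy data: secondary cases per index case ~ NegBin(mean R0, dispersion k),
# each secondary case independently traced with probability p.
true_k, true_p = 0.4, 0.6
n_index = 500
secondary = rng.negative_binomial(true_k, true_k / (true_k + R0), size=n_index)
detected = rng.binomial(secondary, true_p)

def neg_log_lik(params):
    k, p = params
    if k <= 0 or not 0 < p < 1:
        return np.inf
    mean = p * R0             # binomial thinning keeps the dispersion, scales the mean
    q = k / (k + mean)
    return -stats.nbinom.logpmf(detected, k, q).sum()

res = optimize.minimize(neg_log_lik, x0=[1.0, 0.5], method="Nelder-Mead")
print("estimated dispersion k and tracing probability p:", np.round(res.x, 3))
```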

Keywords: stochastic SIR model on graph, contact tracing, branching process, parameter inference

Procedia PDF Downloads 75
1174 Electroencephalogram Based Approach for Mental Stress Detection during Gameplay with Level Prediction

Authors: Priyadarsini Samal, Rajesh Singla

Abstract:

Many mobile games provide entertainment but also introduce stress to the human brain. In recognizing this mental stress, the brain-computer interface (BCI) plays an important role. It offers various neuroimaging approaches that help in analyzing brain signals. Electroencephalography (EEG) is the most commonly used method among them, as it is non-invasive, portable, and economical. This paper investigates the patterns in brain signals when mental stress is introduced. Two healthy volunteers played a game whose aim was to find hidden words in a grid, with levels chosen randomly. The EEG signals during gameplay were recorded to investigate the impact of stress as the levels changed from easy to medium to hard. A total of 16 EEG features were analyzed for this experiment, including power band features with relative powers and event-related desynchronization, along with statistical features. A support vector machine was used as the classifier, which resulted in an accuracy of 93.9% for three-level stress analysis; for two levels, accuracies of 92% and 98% were achieved. In addition, another game of a similar nature was played by the volunteers. A suitable regression model was designed for prediction, where the feature sets of the first and second games were used for testing and training purposes, respectively, and an accuracy of 73% was found.
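
A minimal sketch of the feature-extraction and classification chain (relative band powers from a Welch PSD, then an SVM) is given below; the sampling rate, epoch length, band limits, and random stand-in epochs are assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 256                                   # assumed EEG sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(epoch: np.ndarray) -> np.ndarray:
    """Relative band powers of a single-channel EEG epoch (one value per band)."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS * 2)
    total = psd.sum()
    return np.array([psd[(freqs >= lo) & (freqs < hi)].sum() / total
                     for lo, hi in BANDS.values()])

# Hypothetical epochs: rows are 4-second single-channel recordings, labels are
# stress levels (0 = easy, 1 = medium, 2 = hard) from the word-search game.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((60, FS * 4))
labels = rng.integers(0, 3, size=60)

X = np.vstack([band_power_features(e) for e in epochs])
clf = SVC(kernel="rbf", C=1.0)
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```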

Keywords: brain computer interface, electroencephalogram, regression model, stress, word search

Procedia PDF Downloads 185
1173 A Comparison Study of Different Methods Used in the Detection of Giardia lamblia on Fecal Specimen of Children

Authors: Muhammad Farooq Baig

Abstract:

Objective: The purpose of this study was to compare the results obtained using a single fecal specimen for O&P examination, direct immunofluorescence assay (DFA), and two conventional staining methods. Design: One hundred and fifty fecal specimens from children were collected and examined by each method. The O&P and the DFA were used as the reference methods. Setting: The study was performed at the laboratory of the Basic Medical Science Institute, JPMC, Karachi. Patients or Other Participants: The fecal specimens were collected from children with a suspected Giardia lamblia infection. Main Outcome Measures: The amount of agreement and disagreement between methods; 1) the presence of giardiasis in our population; 2) the sensitivity and specificity of each method. Results: There were 45 (30%) positive and 105 (70%) negative specimens by DFA, 41 (27.4%) positive and 109 (72.6%) negative by the iodine method, and 34 (22.6%) positive and 116 (77.4%) negative by the saline method. The sensitivity and specificity of DFA in comparison to the iodine method were 92.2% and 92.7%, respectively. The sensitivity and specificity of DFA in comparison to the saline method were 91.2% and 87.9%, respectively. The sensitivities of the iodine and saline methods in comparison to DFA were 82.2% and 68.8%, respectively. There is a marked difference in sensitivity between DFA and the conventional methods. Conclusion: The study supported the findings of other investigators who concluded that the DFA method has greater sensitivity. The immunologic methods were more efficient and quicker than the conventional O&P method.

Keywords: direct immunofluorescence assay (DFA), ova and parasite (O&P), Giardia lamblia, children, medical science

Procedia PDF Downloads 418
1172 Assisted Prediction of Hypertension Based on Heart Rate Variability and Improved Residual Networks

Authors: Yong Zhao, Jian He, Cheng Zhang

Abstract:

Cardiovascular diseases caused by hypertension are extremely threatening to human health, and early diagnosis of hypertension can save a large number of lives. Traditional hypertension detection methods require special equipment and have difficulty detecting continuous blood pressure changes. In this regard, this paper first analyzes the principle of heart rate variability (HRV) and introduces sliding windows and power spectral density (PSD) to analyze the time-domain and frequency-domain features of HRV. Secondly, it designs an HRV-based hypertension prediction network by combining ResNet, an attention mechanism, and a multilayer perceptron: frequency-domain features are extracted through a modified ResNet18, fused with time-domain features through the attention mechanism, and the auxiliary prediction of hypertension is performed by a multilayer perceptron. Finally, the network was trained and tested using the publicly available SHAREE dataset on PhysioNet, and the test results showed that the network achieved 92.06% prediction accuracy for hypertension and outperformed K-Nearest Neighbor (KNN), Bayes, Logistic, and traditional Convolutional Neural Network (CNN) models in prediction performance.
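
A minimal sketch of the HRV feature extraction described above (time-domain SDNN/RMSSD and Welch-based LF/HF after resampling the RR series) is given below; the resampling rate and band limits are standard assumptions, the RR series is synthetic, and the prediction network itself is omitted.

```python
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def hrv_features(rr_ms: np.ndarray) -> dict:
    """Time- and frequency-domain HRV features from an RR-interval series (ms)."""
    diff = np.diff(rr_ms)
    sdnn = rr_ms.std(ddof=1)                        # time domain
    rmssd = np.sqrt(np.mean(diff ** 2))

    # Resample the irregular RR series to 4 Hz before PSD estimation (Welch).
    t = np.cumsum(rr_ms) / 1000.0
    fs = 4.0
    grid = np.arange(t[0], t[-1], 1 / fs)
    rr_even = interp1d(t, rr_ms, kind="cubic")(grid)
    freqs, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, grid.size))

    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()   # frequency domain
    hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return {"SDNN": sdnn, "RMSSD": rmssd, "LF/HF": lf / hf}

# Hypothetical RR intervals (ms); real inputs come from the SHAREE ECG recordings.
rng = np.random.default_rng(0)
rr = 800 + 50 * np.sin(np.linspace(0, 20, 300)) + rng.normal(0, 20, 300)
print(hrv_features(rr))
```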

Keywords: feature extraction, heart rate variability, hypertension, residual networks

Procedia PDF Downloads 104