Search results for: signal processing for transmission carrier frequency offset
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10566

6756 Performance Evaluation of Distributed and Co-Located MIMO LTE Physical Layer Using Wireless Open-Access Research Platform

Authors: Ishak Suleiman, Ahmad Kamsani Samingan, Yeoh Chun Yeow, Abdul Aziz Bin Abdul Rahman

Abstract:

In this paper, we evaluate the benefits of a distributed 4x4 MIMO LTE downlink system compared to those of a co-located 4x4 MIMO LTE downlink system. The performance evaluation was carried out experimentally using the Wireless Open-Access Research Platform (WARP), on which the 4x4 MIMO LTE downlink transmission system was compared in its distributed and co-located configurations. The measured Error Vector Magnitude (EVM) results showed that the distributed technique achieved better system performance than the co-located arrangement.
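
For context, EVM is commonly reported as the root-mean-square error between the received constellation points and their ideal references, normalized to the reference power; the normalization used in the paper is not stated, so the following is the standard definition rather than the authors' exact formula:

\mathrm{EVM}_{\mathrm{rms}} = \sqrt{\frac{\frac{1}{N}\sum_{k=1}^{N} \lvert r_k - s_k \rvert^2}{\frac{1}{N}\sum_{k=1}^{N} \lvert s_k \rvert^2}} \times 100\%

where r_k is the k-th received (equalized) symbol and s_k the corresponding ideal constellation point.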

Keywords: multiple-input-multiple-output (MIMO), distributed MIMO, co-located MIMO, LTE

Procedia PDF Downloads 417
6755 Clinical Validation of an Automated Natural Language Processing Algorithm for Finding COVID-19 Symptoms and Complications in Patient Notes

Authors: Karolina Wieczorek, Sophie Wiliams

Abstract:

Introduction: Patient data is often collected in Electronic Health Record (EHR) systems for purposes such as providing care as well as reporting data. This information can be re-used to validate data models in clinical trials or in epidemiological studies. Manual validation of automated tools is vital to pick up errors in processing and to provide confidence in the output. Mentioning a disease in a discharge letter does not necessarily mean that a patient suffers from this disease: many notes discuss a diagnostic process or different tests, or discuss whether a patient has a certain disease. The COVID-19 dataset in this study was built with natural language processing (NLP), an automated algorithm which extracts information related to COVID-19 symptoms, complications, and medications prescribed within the hospital. Free-text clinical patient notes are rich sources of information which contain patient data not captured in a structured form, hence the use of named entity recognition (NER) to capture additional information. Methods: Patient data (discharge summary letters) were exported and screened by the algorithm to pick up relevant terms related to COVID-19. A list of 124 Systematized Nomenclature of Medicine (SNOMED) Clinical Terms was provided in Excel with corresponding IDs. Two independent medical student researchers were provided with a dictionary of the SNOMED list of terms to refer to when screening the notes. They worked on two separate datasets called "A" and "B", respectively. Notes were screened to check whether the correct term had been picked up by the algorithm and to ensure that negated terms were not picked up. Results: The algorithm's implementation in the hospital began on March 31, 2020, and the first EHR-derived extract was generated for use in an audit study on June 04, 2020. The dataset has contributed to large, priority clinical trials (including the International Severe Acute Respiratory and Emerging Infection Consortium (ISARIC), by bulk upload to REDCap research databases) and to local research and audit studies. Successful sharing of EHR-extracted datasets requires communicating the provenance and quality, including completeness and accuracy, of this data. The validation of the algorithm yielded the following results: precision (0.907), recall (0.416), and F-score (0.570). The percentage enhancement with NLP-extracted terms compared to regular data extraction alone was low (0.3%) for relatively well-documented data such as previous medical history but higher (16.6%, 29.53%, 30.3%, and 45.1%) for complications, presenting illness, chronic procedures, and acute procedures, respectively. Conclusions: This automated NLP algorithm is shown to be useful in facilitating patient data analysis and has the potential to be used in larger-scale clinical trials to assess potential study exclusion criteria for participants in the development of vaccines.
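
As a quick consistency check of the reported metrics, the F-score follows from the harmonic mean of the reported precision and recall (assuming the standard F1 definition, which the abstract does not state explicitly):

F_1 = \frac{2PR}{P+R} = \frac{2 \times 0.907 \times 0.416}{0.907 + 0.416} \approx 0.570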

Keywords: automated, algorithm, NLP, COVID-19

Procedia PDF Downloads 96
6754 Secure Transfer of Medical Images Using Hybrid Encryption: Authentication, Confidentiality, Integrity

Authors: Boukhatem Mohammed Belkaid, Lahdir Mourad

Abstract:

In this paper, we propose a new encryption system for securing medical images. The hybrid encryption scheme is based on the AES and RSA algorithms to provide three security services: authentication, integrity, and confidentiality. Confidentiality is ensured by AES, while authenticity is ensured by the RSA algorithm. Integrity is ensured by a basic function of the correlation between adjacent pixels. Our system generates a unique password for every new encryption session, which is used to encrypt each frame of the medical image in order to strengthen and ensure its safety. Several metrics have been used for the various tests of our analysis. For the integrity test, we observed the efficiency of our system and how the cryptographic imprint changes at reception if the image is altered in the transmission channel.
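
The adjacent-pixel correlation mentioned as the integrity metric can be illustrated with a minimal sketch; this is not the authors' implementation, and it assumes a grayscale image array and only horizontal neighbours:

import numpy as np

def adjacent_pixel_correlation(img):
    """Correlation coefficient between each pixel and its right-hand neighbour.
    Natural images give values close to 1; a large drop after transmission would
    flag tampering or corruption of the received image."""
    x = img[:, :-1].ravel().astype(float)
    y = img[:, 1:].ravel().astype(float)
    return float(np.corrcoef(x, y)[0, 1])

# Toy usage with a synthetic, horizontally correlated 'image'.
rng = np.random.default_rng(0)
smooth = np.cumsum(rng.normal(size=(64, 64)), axis=1)
print(round(adjacent_pixel_correlation(smooth), 3))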

Keywords: AES, RSA, integrity, confidentiality, authentication, medical images, encryption, decryption, key, correlation

Procedia PDF Downloads 536
6753 Formulation and in Vitro Evaluation of Cubosomes Containing CeO₂ Nanoparticles Loaded with Glatiramer Acetate Drug

Authors: Akbar Esmaeili, Zahra Salarieh

Abstract:

Cerium oxide nanoparticles (nanoceria) are used as catalysts in industrial applications due to their free radical scavenging properties. Given that free radicals play an essential role in the pathology of many neurological diseases, we investigated the use of nanoceria as a potential therapeutic agent against oxidative damage. This project synthesized nanoceria via a new and environmentally friendly bio-pathway, in which cerium nitrate added to a culture medium inoculated with a Lactobacillus acidophilus strain yields nanoceria upon incubation. Nanoceria loaded with glatiramer acetate (GA) were formed by coating with carboxymethylcellulose (CMC) and CeO₂. FE-SEM analysis showed nanoceria in the 9-11 nm range, with spherical shape and uniform particle size distribution. Cubosomal nanoparticles containing the anti-multiple sclerosis (anti-MS) drug GA were prepared. Glycerol monostearate (GMS) was used as the fat base, and evening primrose extract was used as an anti-inflammatory agent in the cubosomes. Design-Expert® software was used to study the effects of different formulation factors on the properties of the GA-loaded cubic dispersions. Thirty GA-loaded cubic dispersions were prepared with GA-loaded carboxymethylcellulose and evaluated in vitro. The results showed an average particle size of 89.02 nm and a zeta potential of -49.9 mV. Cubosomes containing GA-CMC/CeO₂ showed a stable release profile for 180 min. The results indicate that cubosomes containing GA-CMC/CeO₂ could be a promising drug carrier with normal release behavior.

Keywords: biochemistry, biotechnology, molecular biology

Procedia PDF Downloads 46
6752 CuO Thin Films Deposition by Spray Pyrolysis: Influence of Precursor Solution Properties

Authors: M. Lamri Zeggar, F. Bourfaa, A. Adjimi, F. Boutebakh, M. S. Aida, N. Attaf

Abstract:

CuO thin films were deposited by ultrasonic spray pyrolysis with different precursor solutions. Two starting solution salts were used, namely copper acetate and copper chloride. The influence of these solutions on the properties of the CuO thin films is investigated. X-ray diffraction (XRD) analysis indicated that the films deposited with copper acetate are amorphous, whereas the films prepared with copper chloride have a monoclinic structure. UV-Visible transmission spectra showed a strong absorbance of the deposited CuO thin films in the visible region. Electrical characterization showed that the CuO thin films prepared with copper acetate have a higher electrical conductivity.

Keywords: thin films, cuprous oxide, spray pyrolysis, precursor solution

Procedia PDF Downloads 306
6751 Combined Synchrotron Radiography and Diffraction for in Situ Study of Reactive Infiltration of Aluminum into Iron Porous Preform

Authors: S. Djaziri, F. Sket, A. Hynowska, S. Milenkovic

Abstract:

The use of Fe-Al based intermetallics as an alternative to Cr/Ni based stainless steels is very promising for industrial applications that use parts made of critical raw materials under extreme conditions. However, the development of advanced Fe-Al based intermetallics with appropriate mechanical properties presents several challenges that involve appropriate processing and microstructure control. A processing strategy is being developed which aims at producing a net-shape porous Fe-based preform that is infiltrated with molten Al or Al-alloy. In the present work, porous Fe-based preforms produced by two different methods (selective laser melting (SLM) and the Kochanek process (KE)) are studied during infiltration with molten aluminum. With the objective of elucidating the mechanisms underlying the formation of Fe-Al intermetallic phases during infiltration, an in-house furnace has been designed for in situ observation of infiltration at synchrotron facilities, combining x-ray radiography (XR) and x-ray diffraction (XRD) techniques. The feasibility of this approach has been demonstrated, and information about the melt flow front propagation has been obtained. In addition, reactive infiltration has been achieved, where a bi-phased intermetallic layer has been identified to form between the solid Fe and liquid Al. In particular, a tongue-like Fe₂Al₅ phase adhering to the Fe and a needle-like Fe₄Al₁₃ phase adhering to the Al were observed. The growth of the intermetallic compound was found to depend on the temperature gradient present along the preform as well as on the reaction time, which will be discussed in view of the different results obtained.

Keywords: combined synchrotron radiography and diffraction, Fe-Al intermetallic compounds, in-situ molten Al infiltration, porous solid Fe preforms

Procedia PDF Downloads 220
6750 Reverse Logistics Network Optimization for E-Commerce

Authors: Albert W. K. Tan

Abstract:

This research consolidates a comprehensive array of publications from peer-reviewed journals, case studies, and seminar reports focused on reverse logistics and network design. By synthesizing this secondary knowledge, our objective is to identify and articulate key decision factors crucial to reverse logistics network design for e-commerce. Through this exploration, we aim to present a refined mathematical model that offers valuable insights for companies seeking to optimize their reverse logistics operations. The primary goal of this research endeavor is to develop a comprehensive framework tailored to advising organizations and companies on crafting effective networks for their reverse logistics operations, thereby facilitating the achievement of their organizational goals. This involves a thorough examination of various network configurations, weighing their advantages and disadvantages to ensure alignment with specific business objectives. The key objectives of this research include: (i) identifying pivotal factors pertinent to network design decisions within the realm of reverse logistics across diverse supply chains; (ii) formulating a structured framework designed to offer informed recommendations for sound network design decisions applicable to relevant industries and scenarios; (iii) proposing a mathematical model to optimize the reverse logistics network. A conceptual framework for designing a reverse logistics network has been developed through a combination of insights from the literature review and information gathered from company websites. This framework encompasses four key stages in the selection of reverse logistics operation modes: (1) collection, (2) sorting and testing, (3) processing, and (4) storage. Key factors to consider in reverse logistics network design: I) Centralized vs. decentralized processing: Centralized processing, a long-standing practice in reverse logistics, has recently gained greater attention from manufacturing companies. In this system, all products within the reverse logistics pipeline are brought to a central facility for sorting, processing, and subsequent shipment to their next destinations. Centralization offers the advantage of efficiently managing the reverse logistics flow, potentially leading to increased revenues from returned items. Moreover, it aids in determining the most appropriate reverse channel for handling returns. On the contrary, a decentralized system is more suitable when products are returned directly from consumers to retailers. In this scenario, individual sales outlets serve as gatekeepers for processing returns. Considerations encompass the product lifecycle, product value and cost, return volume, and the geographic distribution of returns. II) In-house vs. third-party logistics providers: The decision between insourcing and outsourcing in reverse logistics network design is pivotal. In insourcing, a company handles the entire reverse logistics process, including material reuse. In contrast, outsourcing involves third-party providers taking on various aspects of reverse logistics. Companies may choose outsourcing due to resource constraints or lack of expertise, with the extent of outsourcing varying based on factors such as personnel skills and cost considerations. Based on the conceptual framework, the authors have constructed a mathematical model that optimizes reverse logistics network design decisions. The model considers key factors identified in the framework, such as transportation costs, facility capacities, and lead times. The authors have employed mixed-integer linear programming to find optimal solutions that minimize costs while meeting organizational objectives.
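
A minimal sketch of such a model is given below, assuming a toy network of return zones and candidate processing facilities; the facility names, demand figures, and cost values are illustrative and not taken from the paper, and the open-source PuLP modeller is used only as one possible tool:

from pulp import LpProblem, LpVariable, LpMinimize, LpBinary, lpSum

# Hypothetical data: three return-collection zones, two candidate processing sites.
zones = ["Z1", "Z2", "Z3"]
sites = ["S1", "S2"]
returns = {"Z1": 120, "Z2": 80, "Z3": 60}          # returned units per period
capacity = {"S1": 200, "S2": 150}                  # processing capacity if opened
fixed_cost = {"S1": 5000, "S2": 3500}              # cost of opening a facility
ship_cost = {("Z1", "S1"): 2.0, ("Z1", "S2"): 3.5,
             ("Z2", "S1"): 2.8, ("Z2", "S2"): 1.9,
             ("Z3", "S1"): 4.0, ("Z3", "S2"): 2.2}  # cost per unit shipped

prob = LpProblem("reverse_logistics_network", LpMinimize)
open_site = {s: LpVariable(f"open_{s}", cat=LpBinary) for s in sites}
flow = {(z, s): LpVariable(f"flow_{z}_{s}", lowBound=0) for z in zones for s in sites}

# Objective: fixed facility costs plus transportation costs.
prob += (lpSum(fixed_cost[s] * open_site[s] for s in sites)
         + lpSum(ship_cost[z, s] * flow[z, s] for z in zones for s in sites))

# Every returned unit must be routed to some facility.
for z in zones:
    prob += lpSum(flow[z, s] for s in sites) == returns[z]

# Capacity is only available at facilities that are opened.
for s in sites:
    prob += lpSum(flow[z, s] for z in zones) <= capacity[s] * open_site[s]

prob.solve()
for s in sites:
    print(s, "open" if open_site[s].value() == 1 else "closed")

A full model along the lines described in the abstract would add sorting/testing and storage stages, lead-time constraints, and the centralized-versus-decentralized and insourcing-versus-outsourcing choices as additional binary decisions.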

Keywords: reverse logistics, supply chain management, optimization, e-commerce

Procedia PDF Downloads 33
6749 Context Detection in Spreadsheets Based on Automatically Inferred Table Schema

Authors: Alexander Wachtel, Michael T. Franzen, Walter F. Tichy

Abstract:

Programming requires years of training. With natural language and end-user development methods, programming could become available to everyone. It enables end users to program their own devices and extend the functionality of the existing system without any knowledge of programming languages. In this paper, we describe an Interactive Spreadsheet Processing Module (ISPM), a natural language interface to spreadsheets that allows users to address ranges within the spreadsheet based on an inferred table schema. Using the ISPM, end users are able to search for values in the schema of the table and to address the data in spreadsheets implicitly. Furthermore, it enables them to select and sort the spreadsheet data by using natural language. ISPM uses a machine learning technique to automatically infer areas within a spreadsheet, including different kinds of headers and data ranges. Since ranges can be identified from natural language queries, end users can query the data using natural language. During the evaluation, 12 undergraduate students were asked to perform operations (sum, sort, group, and select) using the system and also using Excel without the ISPM interface, and the time taken for task completion was compared across the two systems. Only for the selection task did users take less time in Excel (since they directly selected the cells using the mouse) than in ISPM. The results support using natural language for end-user software engineering to overcome the present bottleneck of professional developers.
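
To illustrate the general idea of mapping a natural language query onto an inferred table schema, here is a deliberately tiny sketch; it is not the ISPM implementation, and the column names and query patterns are invented for the example:

import re
import pandas as pd

# Toy table standing in for a spreadsheet whose header row has been inferred.
df = pd.DataFrame({"Region": ["North", "South", "East"], "Sales": [120, 95, 143]})

def answer(query, table):
    """Map a 'sum of <column>' or 'sort by <column>' query onto the inferred
    schema and execute it with pandas (invented patterns, not the ISPM grammar)."""
    m = re.match(r"sum of (\w+)", query, re.IGNORECASE)
    if m and m.group(1) in table.columns:
        return table[m.group(1)].sum()
    m = re.match(r"sort by (\w+)", query, re.IGNORECASE)
    if m and m.group(1) in table.columns:
        return table.sort_values(m.group(1))
    return "query not understood"

print(answer("sum of Sales", df))    # 120 + 95 + 143 = 358
print(answer("sort by Sales", df))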

Keywords: natural language processing, natural language interfaces, human computer interaction, end user development, dialog systems, data recognition, spreadsheet

Procedia PDF Downloads 305
6748 Radiofrequency and Near-Infrared Responsive Core-Shell Multifunctional Nanostructures Using Lipid Templates for Cancer Theranostics

Authors: Animesh Pan, Geoffrey D. Bothun

Abstract:

With the development of nanotechnology, research in multifunctional delivery systems has gained a new pace and dimension. An incipient challenge is to design an all-in-one delivery system that can be used for multiple purposes, including tumor-targeting therapy, radio-frequency (RF-), near-infrared (NIR-), light-, or pH-induced controlled release, photothermal therapy (PTT), photodynamic therapy (PDT), and medical diagnosis. In this regard, various inorganic nanoparticles (NPs) are known to show great potential as the 'functional components' because of their fascinating and tunable physicochemical properties and the possibility of multiple theranostic modalities from individual NPs. Magnetic, luminescent, and plasmonic properties are the three most extensively studied and, more importantly, biomedically exploitable properties of inorganic NPs. Although successful attempts at combining any two of the above-mentioned functionalities have been made, integrating them in one system has remained a challenge. With this in mind, the controlled design of complex colloidal nanoparticle systems is one of the most significant challenges in nanoscience and nanotechnology. Therefore, systematic and planned studies providing better insight are needed. We report a multifunctional liposome-based delivery platform loaded with drug and iron-oxide magnetic nanoparticles (MNPs) and carrying a gold shell on the liposome surface, synthesized using a lipid-with-polyelectrolyte (layersome) templating technique. MNPs and the anti-cancer drug doxorubicin (DOX) were co-encapsulated inside liposomes composed of zwitterionic phosphatidylcholine and anionic phosphatidylglycerol using the reverse-phase evaporation (REV) method. The liposomes were coated with a positively charged polyelectrolyte (poly-L-lysine) to enrich the interface with gold anions, exposed to a reducing agent to form a gold nanoshell, and then capped with thiol-terminated polyethylene glycol (SH-PEG2000). The core-shell nanostructures were characterized by different techniques, including UV-Vis/NIR scanning spectrophotometry, dynamic light scattering (DLS), and transmission electron microscopy (TEM). This multifunctional system achieves a variety of functions, such as radiofrequency (RF)-triggered release, chemo-hyperthermia, and NIR laser-triggered photothermal therapy. Herein, we highlight some of the remaining major design challenges in combination with preliminary studies assessing therapeutic objectives. We demonstrate an efficient loading and delivery system producing significant cell death of human cancer cells (A549) with therapeutic capabilities. Coupling RF and NIR excitation to the doxorubicin-loaded core-shell nanostructure helped in securing targeted and controlled drug release to the cancer cells. The present core-shell multifunctional system, with its multimodal imaging and therapeutic capabilities, would be an excellent candidate for cancer theranostics.

Keywords: cancer theranostics, multifunctional nanostructure, photothermal therapy, radiofrequency targeting

Procedia PDF Downloads 122
6747 The Influence of Cognitive Load in the Acquisition of Words through Sentence or Essay Writing

Authors: Breno Barrreto Silva, Agnieszka Otwinowska, Katarzyna Kutylowska

Abstract:

Research comparing lexical learning following the writing of sentences and longer texts with keywords is limited and contradictory. One possibility is that the recursivity of writing may enhance processing and increase lexical learning; another possibility is that the higher cognitive load of complex-text writing (e.g., essays), at least when timed, may hinder the learning of words. In our study, we selected two sets of 10 academic keywords matched for part of speech, length (number of characters), frequency (SUBTLEXus), and concreteness, and we asked 90 L1-Polish advanced-level English majors to use the keywords when writing sentences, timed essays (60 minutes), or untimed essays. First, all participants wrote a timed Control essay (60 minutes) without keywords. Then different groups produced Timed essays (60 minutes; n=33), Untimed essays (n=24), or Sentences (n=33) using the two sets of glossed keywords (counterbalanced). The comparability of the participants in the three groups was ensured by matching them for proficiency in English (LexTALE) and for a few measures derived from the Control essay: VocD (assessing productive lexical diversity), normed errors (assessing productive accuracy), words per minute (assessing productive written fluency), and holistic scores (assessing overall quality of production). We measured lexical learning (depth and breadth) via an adapted Vocabulary Knowledge Scale (VKS) and a free association test. Cognitive load was measured in the three essays (Control, Timed, Untimed) using the normed number of errors and holistic scores (TOEFL criteria). The number of errors and essay scores were obtained from two raters (interrater reliability Pearson's r=.78-.91). Generalized linear mixed models showed no difference in the breadth and depth of keyword knowledge after writing Sentences, Timed essays, and Untimed essays. The task-based measurements found that Control and Timed essays had similar holistic scores, but that Untimed essays had better quality than Timed essays. Also, Untimed essays were the most accurate, and Timed essays the most error-prone. In conclusion, using keywords in Timed, but not Untimed, essays increased cognitive load, leading to more errors and lower quality. Still, writing sentences and essays yielded similar lexical learning, and differences in cognitive load between Timed and Untimed essays did not affect lexical acquisition.

Keywords: learning academic words, writing essays, cognitive load, english as an L2

Procedia PDF Downloads 65
6746 Studying the Effect of Reducing Thermal Processing over the Bioactive Composition of Non-Centrifugal Cane Sugar: Towards Natural Products with High Therapeutic Value

Authors: Laura Rueda-Gensini, Jader Rodríguez, Juan C. Cruz, Carolina Munoz-Camargo

Abstract:

There is an emerging interest in botanicals and plant extracts for medicinal practices due to their widely reported health benefits. A large variety of phytochemicals found in plants have been correlated with antioxidant, immunomodulatory, and analgesic properties, which makes plant-derived products promising candidates for modulating the progression and treatment of numerous diseases. Non-centrifugal cane sugar (NCS), in particular, has been known for its high antioxidant and nutritional value, but composition-wise variability due to changing environmental and processing conditions has considerably limited its use in the nutraceutical and biomedical fields. This work is therefore aimed at assessing the effect of thermal exposure during NCS production on its bioactive composition and, in turn, its therapeutic value. Accordingly, two modified dehydration methods are proposed that employ: (i) vacuum-aided evaporation, which reduces the temperatures necessary to dehydrate the sample, and (ii) window refractance evaporation, which reduces thermal exposure time. The biochemical composition of NCS produced under these two methods was compared to that of traditionally produced NCS by estimating their total polyphenolic and protein content with Folin-Ciocalteu and Bradford assays, as well as identifying the major phenolic compounds in each sample via HPLC-coupled mass spectrometry. Their antioxidant activities were also compared, as measured by their scavenging potential of ABTS and DPPH radicals. Results show that the two modified production methods enhance polyphenolic and protein yield in the resulting NCS samples when compared to traditional production methods. In particular, reducing the employed temperatures with vacuum-aided evaporation proved superior at preserving polyphenolic compounds, as evidenced in both the total and the individual polyphenol concentrations. However, antioxidant activities were not significantly different between the two methods. Although additional studies should be performed to determine whether the observed compositional differences affect other therapeutic activities (e.g., anti-inflammatory, analgesic, and immunoprotective), these results suggest that reducing thermal exposure holds great promise for the production of natural products with enhanced nutritional value.

Keywords: non-centrifugal cane sugar, polyphenolic compounds, thermal processing, antioxidant activity

Procedia PDF Downloads 87
6745 Dynamic Economic Load Dispatch Using Quadratic Programming: Application to Algerian Electrical Network

Authors: A. Graa, I. Ziane, F. Benhamida, S. Souag

Abstract:

This paper presents a comparative analysis of an efficient and reliable quadratic programming (QP) approach to solving the economic load dispatch (ELD) problem while considering transmission losses in a power system. The proposed QP method takes care of different unit and system constraints to find the optimal solution. To validate the effectiveness of the proposed QP solution, simulations have been performed using an Algerian test system. Results obtained with the QP method have been compared with other existing relevant approaches available in the literature. Experimental results show the proficiency of the QP method over other existing techniques in terms of robustness and optimal search.
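
A minimal sketch of the quadratic ELD formulation is shown below, assuming a hypothetical three-unit system with invented cost coefficients and a fixed demand, and neglecting the transmission-loss term that the paper models (it is omitted here for brevity):

import numpy as np
from scipy.optimize import minimize

# Hypothetical 3-unit system: cost_i(P_i) = a_i + b_i*P_i + c_i*P_i^2  ($/h, P in MW).
a = np.array([500.0, 400.0, 200.0])
b = np.array([5.3, 5.5, 5.8])
c = np.array([0.004, 0.006, 0.009])
Pmin = np.array([200.0, 150.0, 100.0])
Pmax = np.array([450.0, 350.0, 225.0])
demand = 800.0   # MW; the transmission-loss term is neglected in this sketch

def total_cost(P):
    return float(np.sum(a + b * P + c * P ** 2))

power_balance = {"type": "eq", "fun": lambda P: np.sum(P) - demand}
bounds = list(zip(Pmin, Pmax))

res = minimize(total_cost, x0=(Pmin + Pmax) / 2, bounds=bounds,
               constraints=[power_balance], method="SLSQP")
print("dispatch (MW):", np.round(res.x, 1), " cost ($/h):", round(total_cost(res.x), 2))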

Keywords: economic dispatch, quadratic programming, Algerian network, dynamic load

Procedia PDF Downloads 559
6744 Epidemiology, Clinical, Immune, and Molecular Profiles of Microsporidiosis and Cryptosporidiosis among HIV/AIDS patients

Authors: Roger WUMBA

Abstract:

The objective of this study was to determine the prevalence of intestinal parasites, with special emphasis on microsporidia and Cryptosporidium, as well as their association with human immunodeficiency virus (HIV) symptoms, risk factors, and other digestive parasites. We also wished to determine, by molecular biology, the species and genotypes of microsporidia and Cryptosporidium in HIV patients. In this cross-sectional study, carried out in Kinshasa, Democratic Republic of the Congo, stool samples were collected from 242 HIV patients (87 men and 155 women) with referred symptoms and risk factors for opportunistic intestinal parasites. The analysis of feces specimens was performed using Ziehl-Neelsen staining, real-time polymerase chain reaction (PCR), indirect immunofluorescence with monoclonal antibody, nested PCR-restriction fragment length polymorphism, and PCR amplification and sequencing. Odds ratios (OR) and 95% confidence intervals were used to quantify the risk. Of the 242 HIV patients, 7.8%, 0.4%, 5.4%, 0.4%, 2%, 10.6%, and 2.8% had Enterocytozoon bieneusi, Encephalitozoon intestinalis, Cryptosporidium spp., Isospora belli, pathogenic intestinal protozoa, nonpathogenic intestinal protozoa, and helminths, respectively. We found five genotypes of E. bieneusi: two previously described, NIA1 and D, and three new, KIN1, KIN2, and KIN3. Only 0.4% and 1.6% had Cryptosporidium parvum and Cryptosporidium hominis, respectively. Of the patients, 36.4%, 34.3%, 31%, and 39% had asthenia, diarrhea, a CD4 count of <100 cells/mm³, and no antiretroviral therapy (ART), respectively. The majority of those with opportunistic intestinal parasites and C. hominis, and all with C. parvum and new E. bieneusi genotypes, had diarrhea, low CD4+ counts of <100 cells/mm³, and no ART. There was a significant association between Entamoeba coli, Kaposi sarcoma, herpes zoster, chronic diarrhea, and asthenia, and the presence of the 28 cases with opportunistic intestinal parasites. Rural areas, public toilets, and exposure to farm pigs were the univariate risk factors present in the 28 cases with opportunistic intestinal parasites. In logistic regression analysis, a CD4 count of <100 cells/mm³ (OR = 4.60; 95% CI 1.70-12.20; P = 0.002), no ART (OR = 5.00; 95% CI 1.90-13.20; P < 0.001), and exposure to surface water (OR = 2.90; 95% CI 1.01-8.40; P = 0.048) were identified as significant and independent determinants of the presence of opportunistic intestinal parasites. E. bieneusi and Cryptosporidium are becoming more prevalent in Kinshasa, Congo. Based on the findings, we recommend epidemiological surveillance, prevention by means of hygiene, the use of sensitive PCR methods, and treatment of opportunistic intestinal parasites that may be acquired through fecal-oral transmission, surface water, normal immunity, rural area-based person-to-person and animal-to-human infection, and transmission of HIV. Therapy, including ART and treatment with fumagillin, is needed.

Keywords: diarrhea, enterocytozoon bieneusi, cryptosporidium hominis, cryptosporidium parvum, risk factors, africans

Procedia PDF Downloads 120
6743 Dynamic Pricing With Demand Response Management in Smart Grid: Stackelberg Game Approach

Authors: Hasibe Berfu Demir, Şakir Esnaf

Abstract:

In the past decade, extensive improvements have been made in electrical grid infrastructures. It is very important to make plans for supply, demand, transmission, distribution, and pricing for the development of the electricity sector. Based on this perspective, in this study, a Stackelberg game approach is proposed for demand response management (DRM), which has become an important component of the smart grid to effectively reduce power generation costs and user bills. The purpose of this study is to examine electricity consumption from a dynamic pricing perspective. The results obtained were compared with the current situation and interpreted.
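
The leader-follower structure can be illustrated with a deliberately simple single-retailer, single-consumer sketch; the quadratic utility, its parameters, and the grid search are assumptions made for the example, not the game formulated in the paper:

import numpy as np

# Hypothetical single-retailer / single-consumer Stackelberg sketch.
# Follower (consumer): given price p, chooses demand d to maximize
# u(d) - p*d with u(d) = alpha*d - 0.5*beta*d**2, giving d*(p) = (alpha - p)/beta.
# Leader (retailer): anticipates d*(p) and picks p to maximize (p - cost)*d*(p).
alpha, beta, cost = 10.0, 0.5, 2.0

def follower_best_response(p):
    return max((alpha - p) / beta, 0.0)

prices = np.linspace(cost, alpha, 1001)
profits = [(p - cost) * follower_best_response(p) for p in prices]
p_star = prices[int(np.argmax(profits))]
print(f"leader price: {p_star:.2f}, follower demand: {follower_best_response(p_star):.2f}")
# Analytically, the Stackelberg equilibrium of this toy game is p* = (alpha + cost) / 2 = 6.0.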

Keywords: electricity, Stackelberg, smart grid, demand response management, dynamic pricing

Procedia PDF Downloads 91
6742 Thermodynamic and Immunochemical Studies of Antibody Biofunctionalized Gold Nanoparticles Mediated Photothermal Ablation in Human Liver Cancer Cells

Authors: Lucian Mocan, Flaviu Tabaran, Teodora Mocan, Cristian Matea, Cornel Iancu

Abstract:

We present a method of gold nanoparticle-enhanced laser thermal ablation of HepG2 cells (a human hepatocellular liver carcinoma cell line), based on a simple gold nanoparticle carrier system, serum albumin (BSA), and demonstrate its selective therapeutic efficacy. Hyperspectral, phase contrast, and confocal microscopy combined with immunochemical staining were used to demonstrate the selective internalization of HSA-GNPs via Gp60 receptors and caveolin-mediated endocytosis inside HepG2 cells. We examined the ability of laser-activated carbon nanotubes to induce Hsp70 expression using confocal microscopy. HepG2 cells heat-shocked to 42°C with laser-activated BSA-GNPs demonstrated an up-regulation of Hsp70 compared with control cells (BSA-GNP-treated cells without laser), which showed no detectable constitutive expression of Hsp70. We observed a time-dependent induction of Hsp70 expression in HepG2 cells treated with BSA-GNPs and laser irradiated. The post-irradiation apoptotic rate of HepG2 cells treated with HSA-GNPs was 88.24% (for 50 mg/L) at 60 seconds, while at 30 minutes the rate increased to 92.34% (50 mg/L). These unique results may represent a major step in liver cancer treatment using nanolocalized thermal ablation by laser heating.

Keywords: gold nanoparticles, liver cancer, albumin, laser irradiation

Procedia PDF Downloads 295
6741 Effectiveness of Cold Calling on Students’ Behavior and Participation during Class Discussions: Punishment or Opportunity to Shine

Authors: Maimuna Akram, Khadija Zia, Sohaib Naseer

Abstract:

Pedagogical objectives and the nature of the course content may lead instructors to take varied approaches to selecting a student for the cold call, specifically in a studio setup where students work on different projects independently and present work in progress from time to time at scheduled critiques. Cold-calling often proves to be an effective tool in eliciting a response without enforcing judgment onto the recipients. While there is a mixed range of behavior exhibited by students who are cold-called, and responses may range from anxiety-provoking to inspiring, there is a need for a greater understanding of how to utilize these exchanges to bring about fruitful and engaging studio discussions. This study aims to unravel the dimensions of utilizing the cold-call approach in a didactic exchange within studio pedagogy. A questionnaire survey was conducted in an undergraduate class at an arts and design school. The impact of cold calling on students' participation was determined through various parameters, including course choice, participation frequency, students' comfort, and teaching methodology. After the surveys were analyzed, selected classroom teachers were interviewed to provide a qualitative faculty perspective. It was concluded that cold-calling increases students' participation frequency and also increases preparation for class. Around 67% of students responded that teaching methods play an important role in learning activities and students' participation during class discussions, and 84% of participants agreed that cold calling is an effective way of learning. According to the research, cold-calling can be used frequently without making students uncomfortable. As a result, the findings of this study support the use of this instructional method to encourage more students to participate in class discussions.

Keywords: active learning, class discussion, class participation, cold calling, pedagogical methods, student engagement

Procedia PDF Downloads 30
6740 Polyphenol-Rich Aronia Melanocarpa Juice Consumption and LINE-1 DNA Methylation in a Cohort at Cardiovascular Risk

Authors: Ljiljana Stojković, Manja Zec, Maja Zivkovic, Maja Bundalo, Marija Glibetić, Dragan Alavantić, Aleksandra Stankovic

Abstract:

Cardiovascular disease (CVD) is associated with alterations in DNA methylation, the latter modulated by dietary polyphenols. The present pilot study (part of the original clinical study registered as NCT02800967 at www.clinicaltrials.gov) aimed to investigate the impact of 4-week daily consumption of polyphenol-rich Aronia melanocarpa juice on Long Interspersed Nucleotide Element-1 (LINE-1) methylation in peripheral blood leukocytes, in subjects (n=34, age of 41.1±6.6 years) at moderate CVD risk, including an increased body mass index, central obesity, high normal blood pressure and/or dyslipidemia. The goal was also to examine whether factors known to affect DNA methylation, such as folate intake levels, MTHFR C677T gene variant, as well as the anthropometric and metabolic parameters, modulated the LINE-1 methylation levels upon consumption of polyphenol-rich Aronia juice. The experimental analysis of LINE-1 methylation was done by the MethyLight method. MTHFR C677T genotypes were determined by the polymerase chain reaction-restriction fragment length polymorphism method. Folate intake was assessed by processing the data from the food frequency questionnaire and repeated 24-hour dietary recalls. Serum lipid profile was determined by using Roche Diagnostics kits. The statistical analyses were performed using the Statistica software package. In women, after vs. before the treatment period, a significant decrease in LINE-1 methylation levels was observed (97.54±1.50% vs. 98.39±0.86%, respectively; P=0.01). The change (after vs. before treatment) in LINE-1 methylation correlated directly with MTHFR 677T allele presence, average daily folate intake and the change in serum low-density lipoprotein cholesterol, while inversely with the change in serum triacylglycerols (R=0.72, R2=0.52, adjusted R2=0.36, P=0.03). The current results imply potential cardioprotective effects of habitual polyphenol-rich Aronia juice consumption achieved through the modifications of DNA methylation pattern in subjects at CVD risk, which should be further confirmed. Hence, the precision nutrition-driven modulations of DNA methylation may become targets for new approaches in the prevention and treatment of CVD.

Keywords: Aronia melanocarpa, cardiovascular risk, LINE-1, methylation, peripheral blood leukocytes, polyphenol

Procedia PDF Downloads 190
6739 Qualitative and Quantitative Methods in Multidisciplinary Fields Collection Development

Authors: Hui Wang

Abstract:

Traditional collection-building approaches are limited in breadth and scope and are not necessarily suitable for the development of multidisciplinary fields in the institutes of the Chinese Academy of Sciences. The increase in multidisciplinary research requires a viable approach to collection development in these libraries. This study uses qualitative and quantitative analysis to assess collections. The quantitative analysis consists of three levels of evaluation: realistic demand, potential demand, and trend demand analysis. For a given institute, three samples were separately selected: from the institute itself, from more than one top international institute in closely related research fields, and from future research hotspots. Each sample contains an appropriate number of papers published in the last five years. Several keywords and the organization names were reasonably combined to search commercial databases and the institutional repositories. The publishing information and the citations in the bibliographies of these papers were selected to build the dataset. A weighted evaluation model and citation analysis were used to calculate the demand intensity index of every journal and book. Principal investigator selection and database traffic provide qualitative evidence to describe the demand frequency. The demand intensity, demand frequency, and academic committee recommendations were comprehensively considered to recommend collection development. The collection gaps or weaknesses were ascertained by comparing the current collection with the recommended collection. This approach was applied in more than 80 institute libraries of the Chinese Academy of Sciences over the past three years. The evaluation results provided important evidence for collection building in the second year. The latest user survey results showed that the updated collections' capacity to support research in a multidisciplinary subject area has increased significantly.

Keywords: citation analysis, collection assessment, collection development, quantitative analysis

Procedia PDF Downloads 212
6738 Empirical Analysis of the Global Impact of Cybercrime Laws on Cyber Attacks and Malware Types

Authors: Essang Anwana Onuntuei, Chinyere Blessing Azunwoke

Abstract:

The study focused on probing the effectiveness of online consumer privacy and protection laws, electronic transaction laws, privacy and data protection laws, and cybercrime legislation amid frequent cyber-attacks and malware types worldwide. An empirical analysis was employed to uncover ties and causation between the stringency and implementation of these legal structures and the prevalence of cyber threats. A deliberate sample of seventy-eight countries (thirteen countries each from six continents) was chosen to study the challenges linked with current regulations and possible avenues for improving cybersecurity through refined legal approaches. The findings establish whether the frequency of cyber-attacks and malware types varies significantly. The results also proved that various cybercrime laws differ statistically and that electronic transaction law does not statistically impact the frequency of cyber-attacks. The results further revealed that online consumer privacy and protection law does not influence the total number of cyber-attacks. In addition, the results implied that privacy and data protection laws do not statistically impact the total number of cyber-attacks worldwide. The calculated values also showed that cybercrime law does not statistically impact the total number of cyber-attacks. Finally, the computed values indicate that combining multiple cyber laws does not significantly impact the total number of cyber-attacks worldwide. Suggestions were produced based on the findings of the study, contributing to the ongoing debate on the validity of legal approaches in battling cybercrime and shielding consumers in the digital age.

Keywords: cybercrime legislation, cyber attacks, consumer privacy and protection law, detection, electronic transaction law, prevention, privacy and data protection law, prohibition, prosecution

Procedia PDF Downloads 31
6737 Analysis of Fuel Efficiency in Heavy Construction Compaction Machine and Factors Affecting Fuel Efficiency

Authors: Amey Kulkarni, Paavan Shetty, Amol Patil, B. Rajiv

Abstract:

Fuel efficiency plays a very important role in the overall performance of an automobile. In this paper, a study of the fuel efficiency of a heavy construction compaction machine is presented. Fuel consumption trials were performed in order to obtain the fuel consumed by the compactor in performing a certain set of actions. Usually, heavy construction machines are put to work in locations where refilling the fuel tank is not an easy task, and the fuel is consumed at a greater rate than in a passenger automobile, so it becomes important to have a fuel-efficient machine for long working hours. Fuel efficiency is the most important point in determining the future scope of the product. A heavy construction compaction machine operates in five major roles: traveling, static working, high-frequency low-amplitude compaction, low-frequency high-amplitude compaction, and low idle. Fuel consumption readings at engine speeds of 1950 rpm, 2000 rpm, and 2350 rpm were taken using a differential fuel flow meter and analyzed, and the optimum RPM setting that fulfills both the fuel efficiency and the engine performance criteria was selected. Other factors, such as rear-end gears, intake and exhaust restriction of the engine, vehicle operating techniques, air drag, tribological aspects, and tires, were also considered for increasing the fuel efficiency of the compactor. The fuel efficiency of the compactor can be precisely calculated by using a differential fuel flow meter. By testing the compactor at different combinations of engine RPM and also considering the other factors listed above, the optimum solution was obtained, which led to a significant improvement in the fuel efficiency of the compactor.

Keywords: differential fuel flow meter, engine RPM, fuel efficiency, heavy construction compaction machine

Procedia PDF Downloads 283
6736 Acoustic Emission for Investigation of Processes Occurring at Hydrogenation of Metallic Titanium

Authors: Anatoly A. Kuznetsov, Pavel G. Berezhko, Sergey M. Kunavin, Eugeny V. Zhilkin, Maxim V. Tsarev, Vyacheslav V. Yaroshenko, Valery V. Mokrushin, Olga Y. Yunchina, Sergey A. Mityashin

Abstract:

Acoustic emission is the short-time propagation of elastic waves generated as a result of a rapid energy release from sources localized inside a material. In particular, the acoustic emission phenomenon lies in the generation of acoustic waves resulting from the rearrangement of the material's internal structure. This phenomenon is observed at various physicochemical transformations, in particular those accompanying the hydrogenation of metals or intermetallic compounds, which makes it possible to study the parameters of these transformations by recording and analyzing the acoustic signals. It is known that during the interaction of metals or intermetallics with hydrogen, the most intense acoustic signals are generated by cracking or crumbling of an initially compact powder sample as a result of the change of the material's crystal structure under hydrogenation. This work is dedicated to the study of changes occurring in metallic titanium samples during their interaction with hydrogen, as followed by acoustic emission signals. In this work the subjects of investigation were specimens of metallic titanium in two initial forms: titanium sponge and fine titanium powder made from this sponge. The kinetics of the interaction of these materials with hydrogen, the acoustic emission signals accompanying the hydrogenation processes, and the structure of the materials before and after hydrogenation were investigated. It was determined that in both cases the interaction of metallic titanium with hydrogen is accompanied by acoustic emission signals of high amplitude, generated on reaching a certain value of the atomic ratio [H]/[Ti] in the solid phase because of metal cracking at the macrolevel. The typical sizes of the cracks are comparable with the particle sizes of the hydrogenated specimens. The reason for cracking is internal stresses initiated in a sample due to the increasing volume of the solid phase as a result of changes in the material's crystal lattice under hydrogenation. When titanium powder is used, the atomic ratio [H]/[Ti] in the solid phase corresponding to the maximum amplitude of an acoustic emission signal is, as a rule, higher than when titanium sponge is used.

Keywords: acoustic emission signal, cracking, hydrogenation, titanium specimen

Procedia PDF Downloads 380
6735 Designing a Cyclic Redundancy Checker-8 for 32 Bit Input Using VHDL

Authors: Ankit Shai

Abstract:

CRC, or Cyclic Redundancy Check, is one of the most common and most powerful error-detecting codes implemented on modern computers. Most modern communication protocols use error detection algorithms in digital networks and storage devices to detect accidental changes to raw data between transmission and reception, and CRC is the most popular among these error detection codes. CRC properties are defined by the generator polynomial length and coefficients. The aim of this project is to implement an efficient FPGA-based CRC-8 that accepts a 32-bit input, taking into consideration optimal chip area and high performance, using VHDL. The proposed architecture is implemented on the Xilinx ISE simulator. It is designed while keeping in mind the hardware design, complexity, and cost factors.
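
For reference, a bit-serial software model of a CRC-8 over a 32-bit word is sketched below; the generator polynomial 0x07 (x⁸ + x² + x + 1) and the all-zero initial register are assumptions for the example, since the abstract does not state which generator is used. Such a model can serve to generate expected values when verifying a hardware design in simulation:

def crc8(data_word: int, poly: int = 0x07, width: int = 32) -> int:
    """Bit-serial CRC-8 of a 32-bit word, processed MSB first (software reference model).
    Generator polynomial 0x07 and zero initial value are assumed, not taken from the paper."""
    crc = 0x00
    for i in range(width - 1, -1, -1):
        in_bit = (data_word >> i) & 1
        feedback = ((crc >> 7) & 1) ^ in_bit   # XOR of register MSB and incoming bit
        crc = (crc << 1) & 0xFF
        if feedback:
            crc ^= poly
    return crc

print(hex(crc8(0xDEADBEEF)))   # example expected value for a simulation testbench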

Keywords: cyclic redundancy checker, CRC-8, 32-bit input, FPGA, VHDL, ModelSim, Xilinx

Procedia PDF Downloads 290
6734 Water Droplet Impact on Vibrating Rigid Superhydrophobic Surfaces

Authors: Jingcheng Ma, Patricia B. Weisensee, Young H. Shin, Yujin Chang, Junjiao Tian, William P. King, Nenad Miljkovic

Abstract:

Water droplet impact on surfaces is a ubiquitous phenomenon in both nature and industry. The transfer of mass, momentum and energy can be influenced by the time of contact between droplet and surface. In order to reduce the contact time, we study the influence of substrate motion prior to impact on the dynamics of droplet recoil. Using optical high speed imaging, we investigated the impact dynamics of macroscopic water droplets (~ 2mm) on rigid nanostructured superhydrophobic surfaces vibrating at 60 – 300 Hz and amplitudes of 0 – 3 mm. In addition, we studied the influence of the phase of the substrate at the moment of impact on total contact time. We demonstrate that substrate vibration can alter droplet dynamics, and decrease total contact time by as much as 50% compared to impact on stationary rigid superhydrophobic surfaces. Impact analysis revealed that the vibration frequency mainly affected the maximum contact time, while the amplitude of vibration had little direct effect on the contact time. Through mathematical modeling, we show that the oscillation amplitude influences the possibility density function of droplet impact at a given phase, and thus indirectly influences the average contact time. We also observed more vigorous droplet splashing and breakup during impact at larger amplitudes. Through semi-empirical mathematical modeling, we describe the relationship between contact time and vibration frequency, phase, and amplitude of the substrate. We also show that the maximum acceleration during the impact process is better suited as a threshold parameter for the onset of splashing than a Weber-number criterion. This study not only provides new insights into droplet impact physics on vibrating surfaces, but develops guidelines for the rational design of surfaces to achieve controllable droplet wetting in applications utilizing vibration.
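
For reference, the Weber-number criterion mentioned above compares inertial and capillary effects at impact; in its standard form (the exact definition adopted by the authors is not restated here):

We = \frac{\rho D v^{2}}{\sigma}

where ρ is the liquid density, D the droplet diameter, v the impact velocity, and σ the surface tension.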

Keywords: contact time, impact dynamics, oscillation, pear-shaped droplet

Procedia PDF Downloads 450
6733 The Relation between Cognitive Fluency and Utterance Fluency in Second Language Spoken Fluency: Studying Fluency through a Psycholinguistic Lens

Authors: Tannistha Dasgupta

Abstract:

This study explores the aspects of second language (L2) spoken fluency that are related to L2 linguistic knowledge and processing skill. It draws on Levelt's 'blueprint' of the L2 speaker, which discusses the cognitive issues underlying the act of speaking. However, L2 speaking assessments have largely neglected the underlying mechanisms involved in language production; emphasis is given to the relationship between subjective ratings of L2 speech samples and objectively measured aspects of fluency. Hence, in this study, the relation between L2 linguistic knowledge and processing skill, i.e., Cognitive Fluency (CF), and objectively measurable aspects of L2 spoken fluency, i.e., Utterance Fluency (UF), is examined. The participants of the study are L2 learners of English studying at high school level in Hyderabad, India. 50 participants with an intermediate level of proficiency in English performed several lexical retrieval tasks and attention-shifting tasks to measure CF, and 8 oral tasks to measure UF. Each aspect of UF (speed, pause, and repair) was measured against the scores of CF to find out which aspects of UF are reliable indicators of CF. Quantitative analysis of the data shows that among the three aspects of UF, speed is the best predictor of CF, and pause is only weakly related to CF. The study suggests that including the speed aspect of UF could make L2 fluency assessment more reliable, valid, and objective. Thus, incorporating the assessment of psycholinguistic mechanisms into L2 spoken fluency testing could result in fairer evaluation.

Keywords: attention-shifting, cognitive fluency, lexical retrieval, utterance fluency

Procedia PDF Downloads 707
6732 Digitalisation of the Railway Industry: Recent Advances in the Field of Dialogue Systems: Systematic Review

Authors: Andrei Nosov

Abstract:

This paper discusses the development directions of dialogue systems within the digitalisation of the railway industry, where technologies based on conversational AI are already being applied or could potentially be applied. Conversational AI is one of the most popular natural language processing (NLP) tasks, as it has great prospects for real-world applications today. At the same time, it is a challenging task, as it involves many areas of NLP based on complex computations and deep insights from linguistics and psychology. In this review, we focus on dialogue systems and their implementation in the railway domain. We comprehensively review the state-of-the-art research results on dialogue systems and analyse them from three perspectives: the type of problem to be solved, the type of model, and the type of system. In particular, from the perspective of the type of tasks to be solved, we discuss their characteristics and applications. This will help to understand how to prioritise tasks. In terms of the types of models, we give an overview that will allow researchers to become familiar with how to apply them in dialogue systems. By analysing the types of dialogue systems, we propose an unconventional approach, in contrast to colleagues who traditionally contrast goal-oriented dialogue systems with open-domain systems; our view focuses on considering retrieval-based and generative approaches. Furthermore, the work comprehensively presents evaluation methods and datasets for dialogue systems in the railway domain to pave the way for future research. Finally, some possible directions for future research are identified based on recent research results.

Keywords: digitalisation, railway, dialogue systems, conversational AI, natural language processing, natural language understanding, natural language generation

Procedia PDF Downloads 59
6731 Magnetic Properties of Sr-Ferrite Nano-Powder Synthesized by Sol-Gel Auto-Combustion Method

Authors: M. Ghobeiti-Hasab, Z. Shariati

Abstract:

In this paper, strontium ferrite (SrO·6Fe₂O₃) was synthesized by the sol-gel auto-combustion process. The thermal behavior of the powder obtained from self-propagating combustion of the initial gel was evaluated by simultaneous differential thermal analysis (DTA) and thermogravimetry (TG), from room temperature to 1200°C. The as-burnt powder was calcined at various temperatures from 700 to 900°C to achieve single-phase Sr-ferrite. Phase composition, morphology, and magnetic properties were investigated using X-ray diffraction (XRD), transmission electron microscopy (TEM), and vibrating sample magnetometry (VSM) techniques. Results showed that single-phase, nano-sized hexagonal strontium ferrite particles were formed at a calcination temperature of 800°C, with a crystallite size of 27 nm and a coercivity of 6238 Oe.

Keywords: hard magnet, Sr-ferrite, sol-gel auto-combustion, nano-powder

Procedia PDF Downloads 360
6730 F-IVT Actuation System to Power Artificial Knee Joint

Authors: Alò Roberta, Bottiglione Francesco, Mantriota Giacomo

Abstract:

The efficiency of the actuation system of lower limb exoskeletons and of active orthoses is a significant aspect of the design of such devices because it affects their efficacy. F-IVT is an innovative actuation system to power an artificial knee joint with energy recovery capabilities. Its key, non-conventional elements are a flywheel, which acts as a mechanical energy storage system, and an Infinitely Variable Transmission (IVT). The design of the F-IVT can be optimized for a certain walking condition, resulting in a substantial reduction of both the electric energy consumption and the electric peak power. In this work, by means of simulations of level-ground walking at different speeds, it is demonstrated that the F-IVT is still an advantageous actuator even when it does not work in nominal conditions.

Keywords: active orthoses, actuators, lower extremity exoskeletons, knee joint

Procedia PDF Downloads 595
6729 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier

Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh

Abstract:

This study investigates the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancements over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic approximations.
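
The text-features-plus-Random-Forest part of such a pipeline can be sketched as follows; the report snippets, label scheme, and hyperparameters are invented stand-ins, and the study's image-derived features and LLM-based extraction are omitted here:

from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical stand-ins for report narratives and labels (0 = normal, 1 = abnormal).
reports = ["no acute cardiopulmonary abnormality",
           "right lower lobe opacity concerning for pneumonia",
           "heart size normal, lungs are clear",
           "large left pleural effusion with compressive atelectasis"]
labels = [0, 1, 0, 1]

# TF-IDF features from the report text; in the full pipeline described above,
# image-derived features and LLM-extracted findings would be concatenated here.
X = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(reports)
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.5,
                                                    random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))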

Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, indiana university dataset, machine learning in healthcare, predictive modeling, clinical decision support systems

Procedia PDF Downloads 36
6728 Phenotypic Characterization of Listeria Spp Isolated from Chicken Carcasses Marketed in Northeast of Iran

Authors: Abdollah Jamshidi, Tayebeh Zeinali, Mehrnaz Rad, Jamshid Razmyar

Abstract:

Listeria infections occur worldwide in a variety of animals and in humans. Listeriae are widely distributed in nature; the organism has been isolated from the feces of humans and several animals, different soils, plants, aquatic environments, and food of animal and vegetable origin. Listeria monocytogenes is recognized as an important food-borne pathogen due to its high mortality rate. This organism is able to grow at refrigeration temperatures and under high osmotic pressure. Poultry can become contaminated environmentally or through healthy carrier birds. In recent decades, prophylactic use of antimicrobial agents may have led to the emergence of antibiotic-resistant organisms, which can be transmitted to humans through consumption of contaminated foods. In this study, of 200 fresh chicken carcass samples collected randomly from different supermarkets and butcheries, 80 samples were found to be contaminated with Listeria spp., and 19% of the isolates were identified as Listeria monocytogenes using a multiplex PCR assay. Conventional methods were used to differentiate the other species of the Listeria genus. The results showed the most prevalent isolate to be L. monocytogenes (48.75%); other isolates were identified as Listeria innocua (28.75%), Listeria murrayi (20%), Listeria grayi (3.75%), and Listeria welshimeri (2.5%). The majority of the isolates had multidrug resistance to commonly used antibiotics: most were resistant to erythromycin (50%), followed by tetracycline (44.44%), clindamycin (41.66%), and trimethoprim (25%), and some showed resistance to chloramphenicol (17.65%). The results indicate resistance of the isolates to antimicrobials commonly used to treat human listeriosis, which could be a potential health hazard for consumers.

Keywords: listeria species, L. monocytogenes, antibiotic resistance, chicken carcass

Procedia PDF Downloads 375
6727 Sentiment Analysis of Chinese Microblog Comments: Comparison between Support Vector Machine and Long Short-Term Memory

Authors: Xu Jiaqiao

Abstract:

Text sentiment analysis is an important branch of natural language processing. This technology is widely used in public opinion analysis and web browsing recommendations. At present, mainstream sentiment analysis methods fall into three categories: sentiment analysis based on a sentiment dictionary, on traditional machine learning, and on deep learning. This paper mainly analyzes and compares the advantages and disadvantages of the SVM method of traditional machine learning and the Long Short-Term Memory (LSTM) method of deep learning in the field of Chinese sentiment analysis, using Chinese comments on Sina Microblog as the dataset. Firstly, this paper classifies and adds labels to the original comment dataset obtained by the web crawler, and then uses Jieba word segmentation to segment the original dataset and remove stop words. After that, this paper extracts text feature vectors and builds document word vectors to facilitate the training of the models. Finally, SVM and LSTM models are trained respectively. After accuracy calculation, the accuracy of the LSTM model is 85.80%, while the accuracy of the SVM is 91.07%; at the same time, LSTM inference needs only 2.57 seconds, while the SVM model needs 6.06 seconds. Therefore, this paper concludes that, compared with the SVM model, the LSTM model is worse in accuracy but faster in processing speed.
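
A minimal version of the segmentation-plus-SVM baseline can be sketched as below; the example comments, labels, and the choice of TF-IDF features with a linear SVM are assumptions for illustration rather than the exact configuration used in the paper:

import jieba
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Hypothetical labelled Weibo-style comments (1 = positive, 0 = negative).
comments = ["这部电影真好看", "服务太差了，再也不来了", "质量不错，很满意", "太失望了"]
labels = [1, 0, 1, 0]

# Jieba segmentation feeds a TF-IDF vectorizer, followed by a linear SVM.
model = make_pipeline(
    TfidfVectorizer(tokenizer=jieba.lcut, token_pattern=None),
    LinearSVC())
model.fit(comments, labels)
print(model.predict(["很满意", "太差了"]))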

Keywords: sentiment analysis, support vector machine, long short-term memory, Chinese microblog comments

Procedia PDF Downloads 86