Search results for: diagnostic accuracy
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4596

666 Modeling of the Dynamic Characteristics of a Spindle with Experimental Validation

Authors: Jhe-Hao Huang, Kun-Da Wu, Wei-Cheng Shih, Jui-Pin Hung

Abstract:

This study presents an investigation of the dynamic characteristics of a spindle tool system using experimental and finite element modeling approaches. As is well known, machining stability is largely determined by the dynamic characteristics of the spindle tool system. Understanding the factors affecting the dynamic behavior of a spindle tooling system is therefore a prerequisite for controlling the final machining performance of a machine tool system. To this purpose, a physical spindle unit was employed to assess the dynamic characteristics by vibration tests. Then, a three-dimensional finite element model of a high-speed spindle system integrated with a tool holder was created to simulate the dynamic behaviors. To model the angular contact bearings, a series of spring elements were introduced between the inner and outer rings; the spring constant can be represented by the contact stiffness of the rolling bearing based on Hertz theory. The interface characteristics between the spindle nose and the tool holder taper can be quantified by comparing the measurements and predictions. According to the results obtained from experiments and finite element predictions, the vibration behavior of the spindle is dominated by the bending deformation of the spindle shaft in different modes, which is in turn determined by the stiffness of the bearings in the spindle housing. Also, the spindle unit with the tool holder shows a different dynamic behavior from that of the spindle without the tool holder, indicating that the interface property between the tool holder and spindle nose has a dominant influence on the dynamic characteristics of the spindle tool system. Overall, the dynamic behaviors of the spindle with and without the tool holder can be successfully investigated through the finite element model proposed in this study. The prediction accuracy is determined by the modeling of the rolling interface of the ball bearings in the spindle and of the interface characteristics between the tool holder and spindle nose. Moreover, identification of the interface characteristics of the ball bearings and the spindle tool holder is important for refining the spindle tooling system to achieve optimum machining performance.
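
As a minimal illustration of the bearing model described above, the following sketch linearizes a single Hertzian ball-race contact into an equivalent spring constant; the load-deflection law Q = K·δ^1.5 is standard Hertz theory, but the constants K and Q below are hypothetical placeholders, not values reported in the study.

```python
# Minimal sketch: linearized Hertz contact stiffness for one ball-race contact.
# Assumes the Hertz point-contact law Q = K * delta**1.5, where K (N/m^1.5)
# lumps curvature and material constants. K and Q are illustrative only.

def hertz_contact_stiffness(K: float, Q: float) -> float:
    """Tangent stiffness dQ/d(delta) (N/m) at ball load Q (N)."""
    delta = (Q / K) ** (2.0 / 3.0)   # contact deflection (m)
    return 1.5 * K * delta ** 0.5    # k = 1.5 * K * sqrt(delta)

K = 8.0e9   # hypothetical Hertz constant, N/m^1.5
Q = 120.0   # hypothetical ball load from bearing preload, N
print(f"equivalent spring constant per ball: {hertz_contact_stiffness(K, Q):.3e} N/m")
```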

Keywords: contact stiffness, dynamic characteristics, spindle, tool holder interface

Procedia PDF Downloads 297
665 Empirical Analysis of Forensic Accounting Practices for Tackling Persistent Fraud and Financial Irregularities in the Nigerian Public Sector

Authors: Sani AbdulRahman Bala

Abstract:

This empirical study delves into the realm of forensic accounting practices within the Nigerian Public Sector, seeking to quantitatively analyze their efficacy in addressing the persistent challenges of fraud and financial irregularities. With a focus on empirical data, this research employs a robust methodology to assess the current state of fraud in the Nigerian Public Sector and evaluate the performance of existing forensic accounting measures. Through quantitative analyses, including statistical models and data-driven insights, the study aims to identify patterns, trends, and correlations associated with fraudulent activities. The research objectives include scrutinizing documented fraud cases, examining the effectiveness of established forensic accounting practices, and proposing data-driven strategies for enhancing fraud detection and prevention. Leveraging quantitative methodologies, the study seeks to measure the impact of technological advancements on forensic accounting accuracy and efficiency. Additionally, the research explores collaborative mechanisms among government agencies, regulatory bodies, and the private sector by quantifying the effects of information sharing on fraud prevention. The empirical findings from this study are expected to provide a nuanced understanding of the challenges and opportunities in combating fraud within the Nigerian Public Sector. The quantitative insights derived from real-world data will contribute to the refinement of forensic accounting strategies, ensuring their effectiveness in addressing the unique complexities of financial irregularities in the public sector. The study's outcomes aim to inform policymakers, practitioners, and stakeholders, fostering evidence-based decision-making and proactive measures for a more resilient and fraud-resistant financial governance system in Nigeria.

Keywords: fraud, financial irregularities, Nigerian public sector, quantitative investigation

Procedia PDF Downloads 57
664 Landslide and Liquefaction Vulnerability Analysis Using Risk Assessment Analysis and Analytic Hierarchy Process Implication: Suitability of the New Capital of the Republic of Indonesia on Borneo Island

Authors: Rifaldy, Misbahudin, Khalid Rizky, Ricky Aryanto, M. Alfiyan Bagus, Fahri Septianto, Firman Najib Wibisana, Excobar Arman

Abstract:

Indonesia has a high level of disaster risk because it lies on the Ring of Fire, in one of the few regions of the world where three major tectonic plates meet. Disaster analysis must therefore be carried out continuously to assess potential hazards, in this research specifically landslides and liquefaction. This research was conducted to analyze areas that are vulnerable to landslide and liquefaction hazards, in connection with the assessment of moving the new capital of the Republic of Indonesia to the island of Kalimantan, a total area of 612,267.22 km². The analysis uses the Analytic Hierarchy Process, with consistency ratio testing, to decompose a complex and unstructured problem into several weighted parameters. The parameters used in this analysis are slope, land cover, lithology distribution, wetness index, earthquake data, and peak ground acceleration. A weighted overlay of all these parameters was carried out using the percentage weights obtained from the Analytic Hierarchy Process, with accuracy confirmed by the consistency ratio, yielding the share of the study area in each vulnerability class. The results give vulnerability classifications from high to very low: high vulnerability covers 918.40 km² (0.15%), medium 127,045.45 km² (20.75%), low 346,175.89 km² (56.54%), and very low 138,127.48 km² (22.56%). This research is expected to map landslide and liquefaction hazards on the island of Kalimantan and to inform the suitability assessment for developing the new capital of the Republic of Indonesia. The approach can also be applied to other regions analyzing landslide and liquefaction vulnerability or the suitability of regional development.
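
A minimal sketch of the weighting step, assuming a standard AHP formulation (principal-eigenvector weights, Saaty's consistency ratio with CR < 0.1 acceptable); the 4x4 pairwise comparison matrix below is illustrative, not the matrix used in the study.

```python
import numpy as np

# AHP sketch: derive parameter weights from a pairwise comparison matrix
# and check them with the consistency ratio (CR). RI is Saaty's random index.
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_weights(A: np.ndarray):
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)              # principal eigenvalue index
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                             # normalized weights
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)     # consistency index
    return w, ci / RI[n]                     # weights, consistency ratio

# illustrative comparisons for slope, land cover, lithology, PGA
A = np.array([[1.0, 3.0, 5.0, 2.0],
              [1/3, 1.0, 3.0, 1/2],
              [1/5, 1/3, 1.0, 1/4],
              [1/2, 2.0, 4.0, 1.0]])
w, cr = ahp_weights(A)
print("weights:", np.round(w, 3), "CR:", round(cr, 3))  # CR < 0.1 -> consistent
```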

Keywords: analytic hierarchy process, Borneo Island, landslide and liquefaction, vulnerability analysis

Procedia PDF Downloads 166
663 An Advanced Approach to Detect and Enumerate Soil-Transmitted Helminth Ova from Wastewater

Authors: Vivek B. Ravindran, Aravind Surapaneni, Rebecca Traub, Sarvesh K. Soni, Andrew S. Ball

Abstract:

Parasitic diseases have a devastating, long-term impact on human health and welfare. More than two billion people are infected with soil-transmitted helminths (STHs), including the roundworms (Ascaris), hookworms (Necator and Ancylostoma) and whipworm (Trichuris), with the majority occurring in the tropical and subtropical regions of the world. Despite their low prevalence in developed countries, the removal of STHs from wastewater remains crucial to allow the safe use of sludge or recycled water in agriculture. Conventional methods such as incubation and optical microscopy are cumbersome; consequently, the results vary drastically from person to person when the ova (eggs) are observed under the microscope. Although PCR-based methods are an alternative to conventional techniques, they lack the ability to distinguish between viable and non-viable helminth ova. As a result, the wastewater treatment industry is in major need of radically new and innovative tools to detect and quantify STH eggs with precision, accuracy and cost-effectiveness. In our study, we focus on the following novel and innovative techniques: -Recombinase polymerase amplification and surface-enhanced Raman spectroscopy (RPA-SERS) based detection of helminth ova. -Use of metal nanoparticles and their relative nanozyme activity. -Colorimetric detection, differentiation and enumeration of genera of helminth ova using hydrolytic enzymes (chitinase and lipase). -Propidium monoazide (PMA)-qPCR to detect viable helminth ova. -A modified assay to recover and enumerate helminth eggs from fresh raw sewage. -Transcriptome analysis of Ascaris ova in fresh raw sewage. The aforementioned techniques have the potential to replace current conventional and molecular methods, thereby producing a standard protocol for the determination and enumeration of helminth ova in sewage sludge.

Keywords: colorimetry, helminth, PMA-qPCR, nanoparticles, RPA, viable

Procedia PDF Downloads 297
662 Technology Futures in Global Militaries: A Forecasting Method Using Abstraction Hierarchies

Authors: Mark Andrew

Abstract:

Geopolitical tensions are at a thirty-year high, and the pace of technological innovation is driving asymmetry in force capabilities between nation states and between non-state actors. Technology futures are a vital component of defence capability growth, and investments in technology futures need to be informed by accurate and reliable forecasts of the options for 'systems of systems' innovation, development, and deployment. This paper describes a method for forecasting technology futures developed through an analysis of four key systems' development stages, namely: technology domain categorisation, scanning results examining novel systems' signals and signs, potential system-of-systems' implications in warfare theatres, and political ramifications in terms of funding and development priorities. The method has been applied to several technology domains, including physical systems (e.g., nano weapons, loitering munitions, inflight charging, and hypersonic missiles), biological systems (e.g., molecular virus weaponry, genetic engineering, brain-computer interfaces, and trans-human augmentation), and information systems (e.g., sensor technologies supporting situation awareness, cyber-driven social attacks, and goal-specification challenges to proliferation and alliance testing). Although the current application of the method has been team-centred, using paper-based rapid prototyping and iteration, the application of autonomous language models (such as GPT-3) is anticipated as a next-stage operating platform. Forecasting accuracy and reliability are considered vital in guiding technology development to afford stronger contingencies, as ideological changes are forecast to expand threats to ecology and Earth systems, possibly eclipsing the traditional vulnerabilities of nation states. The early results from the method will be subjected to ground truthing using longitudinal investigation.

Keywords: forecasting, technology futures, uncertainty, complexity

Procedia PDF Downloads 113
661 The Relationship between Spindle Sound and Tool Performance in Turning

Authors: N. Seemuang, T. McLeay, T. Slatter

Abstract:

Worn tools have a direct effect on surface finish and part accuracy. Tool condition monitoring systems have been developed over a long period and are used to avoid the loss of productivity that results from using a worn tool. However, the majority of tool monitoring research has applied expensive sensing systems not suitable for production. In this work, the cutting sound in a turning machine was studied using a microphone. Machining trials using seven cutting conditions were conducted until the observable flank wear width (FWW) on the main cutting edge exceeded 0.4 mm. The cutting inserts were removed from the tool holder and the flank wear width was measured optically. A microphone with a built-in preamplifier was used to record the machining sound of EN24 steel being face turned by a CNC lathe in a wet cutting condition using constant surface speed control. The sound was sampled at 50 kS/s, and all sound signals recorded from the microphone were transformed into the frequency domain by FFT in order to establish the frequency content in the audio signature that could then be used for tool condition monitoring. The extracted feature from the audio signal was compared to the flank wear progression on the cutting inserts. The spectrogram reveals a promising feature, termed 'spindle noise', which is emitted from the main spindle motor of the turning machine. The spindle noise frequency was detected at 5.86 kHz regardless of the cutting conditions used on this particular CNC lathe. Varying the cutting speed and feed rate influences the magnitude of the power spectrum of the spindle noise, and this magnitude alters in conjunction with the tool wear progression. The magnitude increases significantly in the transition state between steady-state wear and severe wear. This could be used as a warning signal to prepare for tool replacement or to adapt cutting parameters to extend tool life.
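
A minimal sketch of the feature extraction described above (FFT of the microphone signal, then the power in a narrow band around the 5.86 kHz spindle-noise frequency); the synthetic signal stands in for a real recording, and the 50 kS/s rate follows the paper.

```python
import numpy as np

FS = 50_000          # sampling rate, S/s (as in the paper)
F_SPINDLE = 5860.0   # spindle-noise frequency, Hz (as in the paper)

def spindle_noise_power(x: np.ndarray, half_band: float = 50.0) -> float:
    """Power-spectrum energy in a band around the spindle-noise frequency."""
    spec = np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / FS)
    band = (freqs > F_SPINDLE - half_band) & (freqs < F_SPINDLE + half_band)
    return float(spec[band].sum())   # tracked over cuts as a wear feature

t = np.arange(FS) / FS               # one second of synthetic audio
x = 0.2 * np.sin(2 * np.pi * F_SPINDLE * t) + 0.05 * np.random.randn(FS)
print(f"spindle-noise band power: {spindle_noise_power(x):.1f}")
```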

Keywords: tool wear, flank wear, condition monitoring, spindle noise

Procedia PDF Downloads 333
660 Semantic Search Engine Based on Query Expansion with Google Ranking and Similarity Measures

Authors: Ahmad Shahin, Fadi Chakik, Walid Moudani

Abstract:

Our study elaborates a potential solution for a search engine that involves semantic technology to retrieve information and display it meaningfully. Semantic search engines are not used widely over the web, as the majority are still in beta stage or under construction. Current applications of semantic search face many problems; the major one is analyzing and calculating the meaning of a query in order to retrieve relevant information. Another is the ontology-based index and its updates. Ranking results according to concept meaning and its relation to the query is a further challenge. In this paper, we offer a light meta-engine (QESM) which uses Google search, and therefore Google's index, adapting the returned results by adding multi-query expansion. The mission was to find a reliable ranking algorithm that involves semantics and uses concepts and meanings to rank results. First, the engine finds synonyms of each query term entered by the user based on a lexical database. Then, query expansion is applied to generate different semantically analogous sentences, produced randomly by combining the found synonyms and the original query terms. Our model suggests the use of semantic similarity measures between two sentences. Practically, we used this method to calculate the semantic similarity between each query and the description of each page's content generated by Google. The generated sentences are sent to the Google engine one by one, and the results are then re-ranked together with the adapted ranking method (QESM). Finally, our system places Google pages with higher similarities at the top of the results. We conducted experiments with 6 different queries and observed that the order of most results ranked with QESM differed from that of Google's originally generated pages. With our experimental queries, QESM frequently achieves better accuracy than Google; in the worst cases, it behaves like Google.
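
A minimal sketch of the expansion-and-rerank idea, under simplifying assumptions: a tiny hand-coded synonym table stands in for the lexical database, page snippets stand in for Google's descriptions, and a bag-of-words cosine stands in for the semantic similarity measure.

```python
import itertools, math
from collections import Counter

SYNONYMS = {"cheap": ["inexpensive", "affordable"], "laptop": ["notebook"]}

def expansions(query: str):
    """All semantically analogous queries from synonym substitutions."""
    options = [[w] + SYNONYMS.get(w, []) for w in query.split()]
    return [" ".join(combo) for combo in itertools.product(*options)]

def cosine(a: str, b: str) -> float:
    va, vb = Counter(a.split()), Counter(b.split())
    dot = sum(va[w] * vb[w] for w in va)
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

# placeholder page descriptions; a real run would use Google's snippets
pages = {"p1": "affordable notebook deals and reviews",
         "p2": "history of laptop computers"}
query = "cheap laptop"
scores = {p: max(cosine(q, text) for q in expansions(query))
          for p, text in pages.items()}
print(sorted(scores.items(), key=lambda kv: -kv[1]))   # re-ranked pages
```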

Keywords: semantic search engine, Google indexing, query expansion, similarity measures

Procedia PDF Downloads 423
659 Describing Cognitive Decline in Alzheimer's Disease via a Picture Description Writing Task

Authors: Marielle Leijten, Catherine Meulemans, Sven De Maeyer, Luuk Van Waes

Abstract:

For the diagnosis of Alzheimer's disease (AD), a large variety of neuropsychological tests are available. In some of these tests, linguistic processing - both oral and written - is an important factor. Language disturbances might serve as a strong indicator for an underlying neurodegenerative disorder like AD. However, current diagnostic instruments for language assessment mainly focus on product measures, such as text length or number of errors, ignoring the importance of the process that leads to written or spoken language production. In this study, our aim is to describe and test differences between cognitively healthy and impaired elderly on the basis of a selection of writing process variables (inter- and intrapersonal characteristics). These process variables mainly relate to pause times, because the number, length, and location of pauses have proven to be an important indicator of the cognitive complexity of a process. Method: Participants enrolled in our research were chosen on the basis of a number of basic criteria necessary to collect reliable writing process data. Furthermore, we opted to match the thirteen cognitively impaired patients (8 MCI and 5 AD) with thirteen cognitively healthy elderly. At the start of the experiment, participants were each given a number of tests, such as the Mini-Mental State Examination (MMSE), the Geriatric Depression Scale (GDS), the forward and backward digit span, and the Edinburgh Handedness Inventory (EHI). Also, a questionnaire was used to collect socio-demographic information (age, gender, education) of the subjects as well as more details on their level of computer literacy. The tests and questionnaire were followed by two typing tasks and two picture description tasks. For the typing tasks, participants had to copy (type) characters, words and sentences from a screen, whereas the picture description tasks each consisted of an image they had to describe in a few sentences. Both the typing and the picture description tasks were logged with Inputlog, a keystroke logging tool that allows us to log and time-stamp keystroke activity to reconstruct and describe text production processes. The main rationale behind keystroke logging is that writing fluency and flow reveal traces of the underlying cognitive processes. This explains the analytical focus on pause (length, number, distribution, location, etc.) and revision (number, type, operation, embeddedness, location, etc.) characteristics. As in speech, pause times are seen as indexical of cognitive effort. Results: Preliminary analysis already showed some promising results concerning pause times before, within and after words. For all variables, mixed effects models were used that included participants as a random effect and MMSE scores, GDS scores and word categories (such as determiners and nouns) as fixed effects. For pause times before and after words, cognitively impaired patients paused longer than healthy elderly. These variables did not show an interaction effect between the group the participants belonged to (cognitively impaired or healthy elderly) and word categories. However, pause times within words did show an interaction effect, which indicates that pause times within certain word categories differ significantly between patients and healthy elderly.
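
A minimal sketch of the analysis described above, assuming the statsmodels mixed-model interface; the column names and input file are hypothetical stand-ins for the Inputlog pause data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Mixed-effects sketch: participants as a random effect, group (impaired vs.
# healthy) crossed with word category as fixed effects. Columns assumed:
# participant, group, word_category, pause_ms.
df = pd.read_csv("pause_times.csv")   # hypothetical file
model = smf.mixedlm("pause_ms ~ group * word_category",
                    data=df, groups=df["participant"])
result = model.fit()
print(result.summary())   # the interaction term tests whether group
                          # differences vary across word categories
```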

Keywords: Alzheimer's disease, keystroke logging, matching, writing process

Procedia PDF Downloads 362
658 Determination of Mechanical Properties of Adhesives via Digital Image Correlation (DIC) Method

Authors: Murat Demir Aydin, Elanur Celebi

Abstract:

Adhesively bonded joints are used as an alternative to traditional joining methods due to the important advantages they provide. The most important consideration in the use of adhesively bonded joints is that they meet appropriate safety requirements. To ensure this, damage analysis of adhesively bonded joints should be performed by determining the mechanical properties of the adhesives. In the literature, the mechanical properties of adhesives are generally determined by traditional measurement methods. In this study, the Digital Image Correlation (DIC) method, which can be an alternative to traditional measurement methods, has been used to determine the mechanical properties of adhesives. The DIC method is a relatively new optical measurement method used to determine displacement and strain fields appropriately and accurately. In this study, tensile tests were performed on Thick Adherend Shear Test (TAST) samples formed from DP410 liquid structural adhesive and steel adherends, and on bulk tensile specimens formed from DP410 liquid structural adhesive. The displacement and strain values of the samples were determined by the DIC method, and the shear stress-strain curves of the adhesive for the TAST specimens and the tensile stress-strain curves of the bulk adhesive specimens were obtained. Conventional measurement methods (strain gauges, mechanical extensometers, etc.) are not sufficient for determining the strain and displacement values of very thin adhesive layers such as those in TAST samples, so additional approaches such as numerical methods are otherwise required. The DIC method removes these requirements and easily achieves displacement measurements with sufficient accuracy.
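
To illustrate the core DIC step, a minimal sketch follows: it locates a reference subset in the deformed image by maximizing zero-normalized cross-correlation (ZNCC), giving the integer-pixel displacement of that subset. Real DIC adds sub-pixel interpolation and subset shape functions; this is illustration only.

```python
import numpy as np

def zncc(a: np.ndarray, b: np.ndarray) -> float:
    """Zero-normalized cross-correlation between two equal-size subsets."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def match_subset(ref, deformed, top, left, size=21, search=10):
    """Best (correlation, dy, dx) for the subset at (top, left) in ref."""
    subset = ref[top:top + size, left:left + size]
    best = (-1.0, 0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if 0 <= y and 0 <= x and y + size <= deformed.shape[0] \
                    and x + size <= deformed.shape[1]:
                c = zncc(subset, deformed[y:y + size, x:x + size])
                if c > best[0]:
                    best = (c, dy, dx)
    return best   # dy, dx are the v and u displacement components
```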

Keywords: structural adhesive, adhesively bonded joints, digital image correlation, thick adherend shear test (TAST)

Procedia PDF Downloads 313
657 Study of Evaluation Model Based on Information System Success Model and Flow Theory Using Web-scale Discovery System

Authors: June-Jei Kuo, Yi-Chuan Hsieh

Abstract:

Because of the rapid growth of information technology, more and more libraries are introducing new information retrieval systems to enhance the user experience, improve retrieval efficiency, and increase the applicability of library resources. Nevertheless, few studies have discussed usability from the users' perspective. The aims of this study are to understand the scenario of information retrieval system utilization and to learn why users are willing to continuously use the web-scale discovery system, in order to improve the system and promote the use of university libraries. In addition to questionnaires, observations and interviews, this study employs both the Information System Success Model introduced by DeLone and McLean in 2003 and flow theory to evaluate the system quality, information quality, service quality, use, user satisfaction, flow, and continued use of the web-scale discovery system by students of National Chung Hsing University. The results are analyzed through descriptive statistics and structural equation modeling using AMOS. The results reveal that users' evaluations of system quality and information quality are positively related to both use and satisfaction, whereas service quality only affects user satisfaction. User satisfaction and flow show a significant impact on continued use, and user satisfaction also has a significant impact on flow. Based on these results, academic libraries are advised to maintain the stability of the information retrieval system, improve the quality of the information content, and strengthen the relationship between subject librarians and students. Meanwhile, the system provider should improve the system user interface, reduce system-level layers, strengthen data accuracy and relevance, refine the sorting criteria of the data, and support an auto-correct function. Finally, establishing better communication with librarians is recommended for all users.
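
A minimal sketch of the structural model implied by these findings, assuming the semopy SEM package as a stand-in for AMOS; the variable names and input file are hypothetical survey score columns.

```python
import pandas as pd
from semopy import Model   # assumed package; the study itself used AMOS

# Paths per the reported results: system/information quality -> use and
# satisfaction; service quality -> satisfaction only; satisfaction -> flow;
# satisfaction and flow -> continued use.
desc = """
use ~ system_quality + information_quality
satisfaction ~ system_quality + information_quality + service_quality
flow ~ satisfaction
continuance ~ satisfaction + flow
"""

df = pd.read_csv("survey_scores.csv")   # hypothetical file
model = Model(desc)
model.fit(df)
print(model.inspect())                  # path estimates and p-values
```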

Keywords: web-scale discovery system, discovery system, information system success model, flow theory, academic library

Procedia PDF Downloads 98
656 A Multidisciplinary Team Approach for Limb Salvage in a Rare Case of Pyoderma Gangrenosum in a Significant Circumferential Lower Extremity Wound Complicated by Diabetes and End-stage Renal Disease

Authors: Jenee Gooden, Kevin Vasquez-monterroso, Lady Paula Dejesus, Sandra Wainwright, Daniel Kim, Mackenzie Walker

Abstract:

Introduction: Pyoderma gangrenosum (PG) is a rare, rapidly progressive neutrophilic ulcerative condition with an incidence of 3 to 10 cases per million per year ¹ ². Due to the similar appearance, PG is often misdiagnosed as a diabetic ulcer in diabetic patients. Though the two may appear clinically similar, their treatment protocols and diagnostic criteria differ. Also, end-stage renal disease (ESRD) is often seen in diabetic patients and can have a significant impact on wound healing due to a wide range of uremic toxins³. This case study demonstrates a multidisciplinary team and multimodal treatment approach by podiatric surgery, general surgery, rheumatology, infectious disease, interventional cardiology, wound care and hyperbaric medicine for an uncontrolled diabetic with pyoderma gangrenosum of a significant circumferential wound covering almost the entire right lower extremity. Methods: A 56-year-old male presented with multiple PG ulcerations, including the chest, right posterior lower extremity and sacrum, all previously managed by the same wound care specialist. His chief complaint was worsening PG ulcerations accompanied by a fever of 103 °F. This case study focuses on the wound of his right lower extremity (RLE). Past medical history was significant for diabetes mellitus type 2 with a hemoglobin A1c of 10% and end-stage renal disease (ESRD) on hemodialysis. A multidisciplinary team approach by podiatric surgery, general surgery, rheumatology, infectious disease, interventional cardiology, wound care and hyperbaric medicine was successfully used to perform right lower extremity limb salvage. The patient was managed by rheumatology for the continuation of prior medication, with the mutual agreement of wound care for the addition of dapsone. A coronary CT angiogram was performed by interventional cardiology, but no significant disease was noted, and no further vascular workup was necessary. Multiple surgical sharp wide excisional debridements with application of allografts and split-thickness skin grafts for the circumferential ulceration that encompassed almost the entire right lower extremity were performed by both podiatric surgery and general surgery. Wound cultures and soft tissue biopsies were performed, and infectious disease managed antibiotic therapy. Hyperbaric oxygen therapy and wound vac therapy by wound care were also completed as adjunct management. Results: Leg amputation was prevented by limb salvage of the RLE accomplished through a multidisciplinary team approach, with the wound size decreasing over a total of 29 weeks from 600 cm² to 12.0 x 3.5 x 0.2 cm. Our multidisciplinary team included podiatric surgery, general surgery, rheumatology, infectious disease, interventional cardiology, wound care and hyperbaric medicine. Discussion: Wound healing in general can have its challenges, and those challenges are only magnified when accompanied by multiple systemic illnesses. Though the negative impact of diabetes on wound healing is well known, the compound impact of being a diabetic with ESRD and having pyoderma gangrenosum is not. This case demonstrates the necessity of a multidisciplinary team approach with a wide array of treatment modalities to optimize wound healing and perform limb salvage, preventing lower extremity amputation.

Keywords: diabetes, podiatry, pyoderma gangrenosum, end stage renal disease

Procedia PDF Downloads 70
655 Ethical Considerations of Disagreements Between Clinicians and Artificial Intelligence Recommendations: A Scoping Review

Authors: Adiba Matin, Daniel Cabrera, Javiera Bellolio, Jasmine Stewart, Dana Gerberi (librarian), Nathan Cummins, Fernanda Bellolio

Abstract:

OBJECTIVES: Artificial intelligence (AI) tools are becoming more prevalent in healthcare settings, particularly for diagnostic and therapeutic recommendations, with a surge expected in the coming years. The bedside use of this technology opens the possibility of disagreements between the recommendations of AI algorithms and clinicians' judgment. There is a paucity of literature analyzing the nature and possible outcomes of these potential conflicts, particularly with regard to ethical considerations. The goal of this scoping review is to identify, analyze and classify current themes and potential strategies addressing ethical conflicts originating from disagreement between AI and human recommendations. METHODS: A protocol was written prior to the initiation of the study. Relevant literature was searched by a medical librarian for the terms artificial intelligence, healthcare, and liability, ethics, or conflict. The search was run in 2021 in Ovid Cochrane Central Register of Controlled Trials, Embase, Medline, IEEE Xplore, Scopus, and Web of Science Core Collection. Articles describing the role of AI in healthcare that mentioned conflict between humans and AI were included in the primary search. Two investigators working independently and in duplicate screened titles and abstracts and reviewed the full text of potentially eligible studies. Data were abstracted into tables and reported by themes. We followed the methodological guidelines of the Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR). RESULTS: Of 6846 titles and abstracts, 225 full texts were selected, and 48 articles were included in this review: 23 original research and review papers, and 25 editorials and commentaries with similar themes. There was a lack of consensus in the included articles on who would be held liable for mistakes incurred by following AI recommendations. There appears to be a dichotomy in the perceived ethical consequences depending on whether the negative outcome results from a human-versus-AI conflict or from a deviation from the standard of care. Themes identified included transparency versus opacity of recommendations, data bias, liability for outcomes, regulatory frameworks, and the overall scope of artificial intelligence in healthcare. A relevant issue identified was clinicians' concern about the "black box" nature of these recommendations and their ability to judge the appropriateness of AI guidance. CONCLUSION: AI clinical tools are being rapidly developed and adopted, and the use of this technology will create conflicts between AI algorithms and healthcare workers, with various outcomes. In turn, these conflicts may have legal and ethical implications. There is limited consensus about ethical and legal liability for outcomes originating from disagreements. This scoping review identified the importance of framing the problem in terms of whether the conflict involves a deviation from the standard of care, informed by the themes of transparency/opacity, data bias, legal liability, absent regulatory frameworks, and understanding of the technology. Finally, only limited recommendations to mitigate ethical conflicts between AI and humans have been identified. Further work is necessary in this field.

Keywords: ethics, artificial intelligence, emergency medicine, review

Procedia PDF Downloads 91
654 Numerical Investigation of Gas Leakage in RCSW-Soil Combinations

Authors: Mahmoud Y. M. Ahmed, Ahmed Konsowa, Mostafa Sami, Ayman Mosallam

Abstract:

The Fukushima nuclear accident (Japan, 2011) drew attention to the issue of gas leakage from hazardous facilities through building boundaries. Rapidly increasing investments in nuclear stations have made the ability to predict, and prevent, gas leakage a rather crucial issue both environmentally and economically. Leakage monitoring for underground facilities is complicated by the combination of Reinforced Concrete Shear Wall (RCSW) and soil. In the framework of a recent research project conducted by the authors, the gas insulation capabilities of the RCSW-soil combination were investigated via lab-scale experimental work. Despite their accuracy, experimental investigations are expensive, time-consuming, hazardous, and lack flexibility. Numerically simulating gas leakage as a fluid flow problem based on the Computational Fluid Dynamics (CFD) modeling approach can provide a potential alternative; this novel implementation of the CFD approach is the topic of the present paper. The paper discusses the aspects of modeling gas flow through porous media that resemble the RCSW, both isolated and combined with normal soil. A commercial CFD package is utilized to simulate this fluid flow problem. A fixed RCSW layer thickness is proposed, air is taken as the leaking gas, and the soil layer is represented as clean sand with variable properties, including sand layer thickness, fine fraction ratio, and moisture content. The CFD simulation results largely reproduce the experimental findings: a soil layer attached to a cracked reinforced concrete section plays a significant role in reducing gas leakage from that cracked section, and this role is strongly dependent on the soil specifications.
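
As a back-of-the-envelope complement to the CFD model, a minimal sketch of steady one-dimensional Darcy flow through the cracked wall and the sand layer in series shows why the soil layer adds flow resistance; all thicknesses, permeabilities and the pressure drop are illustrative placeholders, not values from the study.

```python
MU_AIR = 1.8e-5   # dynamic viscosity of air, Pa*s

def darcy_series_flux(dp: float, layers) -> float:
    """Superficial flux q (m/s); layers are (thickness m, permeability m^2)."""
    resistance = sum(L / k for L, k in layers)   # series flow resistance
    return dp / (MU_AIR * resistance)

rcsw = (0.30, 5.0e-13)   # cracked concrete wall: hypothetical values
sand = (0.50, 1.0e-11)   # clean sand layer: hypothetical values
print("wall alone :", darcy_series_flux(2000.0, [rcsw]), "m/s")
print("wall + sand:", darcy_series_flux(2000.0, [rcsw, sand]), "m/s")
```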

Keywords: RCSW, gas leakage, Pressure Decay Method, hazardous underground facilities, CFD

Procedia PDF Downloads 414
653 Prospectivity Mapping of Orogenic Lode Gold Deposits Using Fuzzy Models: A Case Study of Saqqez Area, Northwestern Iran

Authors: Fanous Mohammadi, Majid H. Tangestani, Mohammad H. Tayebi

Abstract:

This research aims to evaluate and compare Geographical Information Systems (GIS)-based fuzzy models for producing orogenic gold prospectivity maps in the Saqqez area, NW of Iran. Gold occurrences are hosted in sericite schist and mafic to felsic meta-volcanic rocks in this area and are associated with hydrothermal alterations that extend over ductile to brittle shear zones. The predictor maps, which represent the Pre-(Source/Trigger/Pathway), syn-(deposition/physical/chemical traps) and post-mineralization (preservation/distribution of indicator minerals) subsystems for gold mineralization, were generated using empirical understandings of the specifications of known orogenic gold deposits and gold mineral systems and were then pre-processed and integrated to produce mineral prospectivity maps. Five fuzzy logic operators, including AND, OR, Fuzzy Algebraic Product (FAP), Fuzzy Algebraic Sum (FAS), and GAMMA, were applied to the predictor maps in order to find the most efficient prediction model. Prediction-Area (P-A) plots and field observations were used to assess and evaluate the accuracy of prediction models. Mineral prospectivity maps generated by AND, OR, FAP, and FAS operators were inaccurate and, therefore, unable to pinpoint the exact location of discovered gold occurrences. The GAMMA operator, on the other hand, produced acceptable results and identified potentially economic target sites. The P-A plot revealed that 68 percent of known orogenic gold deposits are found in high and very high potential regions. The GAMMA operator was shown to be useful in predicting and defining cost-effective target sites for orogenic gold deposits, as well as optimizing mineral deposit exploitation.
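
A minimal sketch of the five fuzzy overlay operators named above, applied to a stack of fuzzified predictor maps with values in [0, 1]; gamma = 0.9 is a common choice in prospectivity mapping, not necessarily the value used in the study.

```python
import numpy as np

def fuzzy_overlay(layers: np.ndarray, gamma: float = 0.9) -> dict:
    """layers: shape (n_maps, rows, cols) with memberships in [0, 1]."""
    fap = layers.prod(axis=0)                 # fuzzy algebraic product
    fas = 1.0 - (1.0 - layers).prod(axis=0)   # fuzzy algebraic sum
    return {"AND": layers.min(axis=0),        # fuzzy AND (minimum)
            "OR": layers.max(axis=0),         # fuzzy OR (maximum)
            "FAP": fap,
            "FAS": fas,
            "GAMMA": fas ** gamma * fap ** (1.0 - gamma)}

# three tiny 2x2 maps standing in for fuzzified predictor rasters
maps = np.array([[[0.8, 0.2], [0.6, 0.1]],
                 [[0.7, 0.3], [0.9, 0.2]],
                 [[0.9, 0.1], [0.5, 0.4]]])
print(fuzzy_overlay(maps)["GAMMA"].round(3))
```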

Keywords: mineral prospectivity mapping, fuzzy logic, GIS, orogenic gold deposit, Saqqez, Iran

Procedia PDF Downloads 118
652 Plasmodium knowlesi Zoonotic Malaria: An Emerging Challenge of Health Problems in Thailand

Authors: Surachart Koyadun

Abstract:

Currently, Plasmodium knowlesi malaria has spread to almost all countries in Southeast Asia. This research aimed to 1) describe the epidemiology of Plasmodium knowlesi malaria, 2) examine the clinical symptoms of P. knowlesi malaria patients, 3) analyze the ecology, animal reservoirs and entomology of P. knowlesi malaria, and 4) summarize the diagnosis, blood parasites, and treatment of P. knowlesi malaria. The study design was a case report combined with retrospective descriptive survey research. A total of 34 study subjects were patients with a confirmed diagnosis of P. knowlesi malaria who received treatment at hospitals and vector-borne disease control units in Songkhla Province during 2021-2022. The epidemiological results revealed that the majority of the subjects were male, had a history of staying overnight in the forest before becoming sick, acquired the infection in the forest, and fell sick mostly during summer. The average time from the onset of illness until receiving a blood test was 3.8 days, and the average length of hospital stay was 4 days. Patients were treated with chloroquine phosphate, primaquine, artesunate, quinine, and dihydroartemisinin-piperaquine (40 mg DHA-320 mg PPQ). One death occurred among the 34 P. knowlesi malaria patients; all remaining patients recovered and responded to treatment, all symptoms improved after drug administration, and no treatment failures were found. Analyses of ecological, zoonotic and entomological data revealed an association between infected patients and forested areas hosting monkey reservoirs and mosquito vectors. The recommendation from this study is that the Polymerase Chain Reaction (PCR) method should be used in conjunction with thick/thin film testing and blood parasite examination (parasitaemia) for specific identification of the infection and accurate diagnosis, leading to timely treatment and effective disease control.

Keywords: human malaria, Plasmodium knowlesi, zoonotic disease, diagnosis and treatment, epidemiology, ecology

Procedia PDF Downloads 11
651 Determinants of Utilization of Information and Communication Technology by Lecturers at Kenya Medical Training College, Nairobi

Authors: Agnes Anyango Andollo, Jane Achieng Achola

Abstract:

The use of Information and Communication Technologies (ICTs) has become one of the driving forces in the facilitation of learning in most colleges, although the ability to effectively harness the technology varies from college to college. The study objective was to determine the lecturers' personal attributes, institutional attributes and policies that influence lecturers' utilization of ICT. A cross-sectional survey design was employed to empirically investigate the extent to which these factors influence the utilization of ICT to facilitate learning. The target population of the study was 295 lecturers who facilitate learning at KMTC-Nairobi. A structured self-administered questionnaire was given to the lecturers. Quantitative data were scrutinized for completeness, accuracy and uniformity, then coded and analyzed in frequencies and percentages using the Statistical Package for Social Sciences (SPSS) version 19. A total of 155 completed questionnaires were obtained from the respondents and subjected to analysis. The study found that 93 (60%) of the respondents were male while 62 (40%) were female. An individual's educational level, age, gender and educational experience had the greatest impact on use of ICT. Lecturers' own beliefs, values, ideas and thinking had a moderate impact on use of ICT, and institutional support through the provision of resources for ICT-related training, such as internet, computers, laptops and projectors, had a moderate impact (p = 0.049 at the 5% significance level) on use of ICT. The study concluded that institutional attributes and ICT policy, including a mandatory policy on the use of ICT by lecturers to facilitate learning, were key to the utilization of ICT by lecturers at KMTC Nairobi. It recommended that policies be put in place for technical support to lecturers who encounter problems while using ICT, and that a mechanism be established to make the use of ICT in teaching and learning mandatory.

Keywords: policy, computers education, medical training institutions, ICTs

Procedia PDF Downloads 355
650 Understanding Different Facets of Chromosome Abnormalities: A 17-year Cytogenetic Study and Indian Perspectives

Authors: Lakshmi Rao Kandukuri, Mamata Deenadayal, Suma Prasad, Bipin Sethi, Srinadh Buragadda, Lalji Singh

Abstract:

Worldwide, at least 7.6 million children are born annually with severe genetic or congenital malformations, and 90% of these are born in mid- and low-income countries. Precise prevalence data are difficult to collect, especially in developing countries, owing to the great diversity of conditions and also because many cases remain undiagnosed. Genetic and congenital disorders are the second most common cause of infant and childhood mortality and occur with a prevalence of 25-60 per 1000 births. The higher prevalence of genetic diseases in a particular community may, however, be due to social or cultural factors. Such factors include the tradition of consanguineous marriage, which results in a higher rate of autosomal recessive conditions, including congenital malformations, stillbirths, or mental retardation. Genetic diseases vary in severity, from being fatal before birth to requiring continuous management; their onset covers all life stages from infancy to old age. Those presenting at birth are particularly burdensome and may cause early death or life-long chronic morbidity. Genetic testing identifies changes in chromosomes, genes, or proteins. The results of a genetic test can confirm or rule out a suspected genetic condition or help determine a person's chance of developing or passing on a genetic disorder. Several hundred genetic tests are currently in use, and more are being developed. Chromosomal abnormalities are a major cause of human suffering and are implicated in mental retardation, congenital malformations, dysmorphic features, primary and secondary amenorrhea, reproductive wastage, infertility, and neoplastic diseases. Cytogenetic evaluation of patients is helpful in the counselling and management of affected individuals and families. We present here chromosomal abnormalities, which form a major part of the genetic disease burden in India. Different programmes on chromosome research and human reproductive genetics primarily relate to infertility, since this is a major public health problem in our country, affecting 10-15 percent of couples. Prenatal diagnosis of chromosomal abnormalities in high-risk pregnancies helps in detecting chromosomally abnormal foetuses, and such couples are counselled regarding the continuation of pregnancy. In addition to basic research, the team provides chromosome diagnostic services that include conventional and advanced techniques for identifying various genetic defects. Beyond routine chromosome diagnosis for infertility, these services also cover patients with short stature, hypogonadism, undescended testis, microcephaly, delayed developmental milestones, familial and isolated mental retardation, and cerebral palsy. Thus, chromosome diagnostics has found its applicability not only in disease prevention and management but also in guiding clinicians in certain aspects of treatment. It would be appropriate to affirm that chromosomes are the images of life, and they unequivocally mirror the states of human health. The importance of genetic counselling is increasing with advancements in the field of genetics, as counselling can help families cope with the emotional, psychological, and medical consequences of genetic diseases.

Keywords: India, chromosome abnormalities, genetic disorders, cytogenetic study

Procedia PDF Downloads 313
649 Optimizing The Residential Design Process Using Automated Technologies

Authors: Martin Georgiev, Milena Nanova, Damyan Damov

Abstract:

Architects, engineers, and developers need to analyse and implement a wide spectrum of data in different formats, if they want to produce viable residential developments. Usually, this data comes from a number of different sources and is not well structured. The main objective of this research project is to provide parametric tools working with real geodesic data that can generate residential solutions. Various codes, regulations and design constraints are described by variables and prioritized. In this way, we establish a common workflow for architects, geodesists, and other professionals involved in the building and investment process. This collaborative medium ensures that the generated design variants conform to various requirements, contributing to a more streamlined and informed decision-making process. The quantification of distinctive characteristics inherent to typical residential structures allows a systematic evaluation of the generated variants, focusing on factors crucial to designers, such as daylight simulation, circulation analysis, space utilization, view orientation, etc. Integrating real geodesic data offers a holistic view of the built environment, enhancing the accuracy and relevance of the design solutions. The use of generative algorithms and parametric models offers high productivity and flexibility of the design variants. It can be implemented in more conventional CAD and BIM workflow. Experts from different specialties can join their efforts, sharing a common digital workspace. In conclusion, our research demonstrates that a generative parametric approach based on real geodesic data and collaborative decision-making could be introduced in the early phases of the design process. This gives the designers powerful tools to explore diverse design possibilities, significantly improving the qualities of the building investment during its entire lifecycle.
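
A minimal sketch of the generate-filter-rank loop described above; the parameter ranges, the height constraint and the scoring weights are illustrative stand-ins for the coded regulations and prioritized criteria.

```python
import itertools

FLOOR_HEIGHT = 3.0   # m, assumed storey height
MAX_HEIGHT = 25.0    # m, assumed zoning cap
PLOT_AREA = 1200.0   # m^2, assumed parcel size

variants = []
for floors, coverage in itertools.product(range(3, 10), (0.3, 0.4, 0.5)):
    if floors * FLOOR_HEIGHT > MAX_HEIGHT:       # hard zoning constraint
        continue
    gfa = floors * coverage * PLOT_AREA          # gross floor area
    daylight = 1.0 - coverage                    # crude daylight proxy
    score = 0.6 * gfa / 5000.0 + 0.4 * daylight  # prioritized criteria
    variants.append((round(score, 3), floors, coverage, round(gfa)))

for v in sorted(variants, reverse=True)[:3]:
    print(v)   # top-ranked design variants for the decision-makers
```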

Keywords: architectural design, residential buildings, urban development, geodesic data, generative design, parametric models, workflow optimization

Procedia PDF Downloads 46
648 Dosimetric Dependence on the Collimator Angle in Prostate Volumetric Modulated Arc Therapy

Authors: Muhammad Isa Khan, Jalil Ur Rehman, Muhammad Afzal Khan Rao, James Chow

Abstract:

Purpose: This study investigates the dose-volume variations in the planning target volume (PTV) and organs-at-risk (OARs) using different collimator angles for smart arc prostate volumetric modulated arc therapy (VMAT). Awareness of the effect of the collimator angle on PTV coverage and OAR sparing is essential for the planner, because the optimization involves numerous treatment constraints, making the search for an optimal plan within a reasonable time a complex, unstable and computationally challenging problem. Materials and Methods: Single-arc VMAT plans with the collimator angle varied systematically (0°-90°) were produced on a Harold phantom, and a new treatment plan was optimized for each collimator angle. We analyzed the conformity index (CI), homogeneity index (HI), gradient index (GI), monitor units (MUs), dose-volume histogram, and mean and maximum doses to the PTV. We also explored OAR (bladder, rectum and femoral heads) dose-volume criteria in the treatment plan (e.g. D30%, D50%, V30Gy and V38Gy of bladder and rectum; D5%, V14Gy and V22Gy of femoral heads), dose-volume histograms, and mean and maximum doses for smart arc VMAT at different collimator angles. Results: No significant difference was found in VMAT optimization at any of the studied collimator angles. However, if 0.5% accuracy is of concern, then a collimator angle of 45° provides a higher CI and lower HI, and a collimator angle of 15° also provides lower HI values, like 45°. A collimator angle of 75° was found to be good for rectum and right femoral head sparing, while angles of 90° and 30° were found to be good for rectum and left femoral head sparing, respectively. The PTV dose coverage statistics for each plan are comparatively independent of the collimator angle. Conclusion: This study gives the planner the freedom to choose any collimator angle from 0° to 90° for PTV coverage and to select a suitable collimator angle to spare OARs.
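
For reference, a minimal sketch of the three plan-quality indices under common definitions (the abstract does not state which definitions were used): the Paddick conformity index, the ICRU 83 homogeneity index, and the Paddick gradient index, with illustrative volumes (cm³) and doses (Gy).

```python
def conformity_index(tv: float, piv: float, tv_piv: float) -> float:
    """Paddick CI = TV_PIV^2 / (TV * PIV); 1.0 is ideal."""
    return tv_piv ** 2 / (tv * piv)

def homogeneity_index(d2: float, d98: float, d50: float) -> float:
    """ICRU 83 HI = (D2% - D98%) / D50%; 0.0 is ideal."""
    return (d2 - d98) / d50

def gradient_index(piv_half: float, piv: float) -> float:
    """Paddick GI = V(50% isodose) / V(100% isodose); lower is steeper."""
    return piv_half / piv

print(conformity_index(tv=120.0, piv=130.0, tv_piv=115.0))   # ~0.85
print(homogeneity_index(d2=80.5, d98=76.2, d50=78.0))        # ~0.06
print(gradient_index(piv_half=420.0, piv=130.0))             # ~3.2
```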

Keywords: VMAT, dose-volume histogram, collimator angle, organs-at-risk

Procedia PDF Downloads 508
647 Dynamic Control Theory: A Behavioral Modeling Approach to Demand Forecasting amongst Office Workers Engaged in a Competition on Energy Shifting

Authors: Akaash Tawade, Manan Khattar, Lucas Spangher, Costas J. Spanos

Abstract:

Many grids are increasing the share of renewable energy in their generation mix, which makes energy generation less controllable. Buildings, which consume nearly 33% of all energy, are a key target for demand response, i.e., mechanisms for demand to meet supply. Understanding the behavior of office workers is a start towards developing demand response for one sector of building technology. The literature notes that dynamic computational modeling can be predictive of individual action, especially given that occupant behavior is traditionally abstracted away from demand forecasting. Recent work founded on Social Cognitive Theory (SCT) has provided a promising conceptual basis for modeling behavior, personal states, and environment using control-theoretic principles. Here, an adapted linear dynamical system of latent states and exogenous inputs is proposed to simulate energy demand among office workers engaged in a social energy-shifting game. The energy-shifting competition is implemented in an office in Singapore that is connected to a minigrid of buildings with a consistent 'price signal'. This signal is translated into a 'points signal' by a reinforcement learning (RL) algorithm to influence participant energy use. The dynamic model functions at the intersection of the points signal, baseline energy consumption trends, and SCT behavioral inputs to simulate future outcomes. This study endeavors to analyze how the dynamic model trains an RL agent and, subsequently, the degree of accuracy with which load deferability can be simulated. The results offer a generalizable behavioral model for energy competitions that provides a framework for further research on transfer learning for RL and, more broadly, transactive control.
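
A minimal sketch of the proposed model class: a linear dynamical system whose latent states (e.g., SCT constructs such as habit strength and motivation) evolve under exogenous inputs (the points signal and baseline consumption), with observed demand as the output. The matrices below are illustrative, not fitted values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0.9, 0.1],     # latent-state dynamics: x_{t+1} = A x_t + B u_t
              [0.0, 0.8]])
B = np.array([[-0.05, 0.2],   # effect of inputs u_t = (points, baseline)
              [0.10, 0.0]])
C = np.array([[1.0, 0.5]])    # demand read-out: y_t = C x_t (+ noise)

def simulate(x0: np.ndarray, inputs: np.ndarray) -> list:
    x, demand = x0, []
    for u in inputs:
        x = A @ x + B @ u
        demand.append((C @ x).item() + 0.01 * rng.standard_normal())
    return demand

hours = 24
inputs = np.column_stack([np.sin(np.linspace(0, np.pi, hours)),  # points signal
                          np.ones(hours)])                       # baseline
print(np.round(simulate(np.zeros(2), inputs), 3))
```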

Keywords: energy demand forecasting, social cognitive behavioral modeling, social game, transfer learning

Procedia PDF Downloads 103
646 Chemical Life Cycle Alternative Assessment as a Green Chemical Substitution Framework: A Feasibility Study

Authors: Sami Ayad, Mengshan Lee

Abstract:

The Sustainable Development Goals (SDGs) were designed as a blueprint to achieve peace, prosperity, and, overall, a better and more sustainable future for the Earth and all its people, and such a blueprint is needed more than ever. The SDGs face many hurdles that could prevent them from becoming a reality; one such hurdle, arguably, is the chemical pollution and unintended chemical impacts generated through the production of the various goods and resources we consume. Chemical Alternatives Assessment has proven to be a viable solution for chemical pollution management in terms of filtering out hazardous chemicals for greener alternatives. However, current substitution practice lacks crucial quantitative datasets (exposures and life cycle impacts) to ensure no unintended trade-offs occur in the substitution process. A Chemical Life Cycle Alternative Assessment (CLiCAA) framework is proposed as a reliable and replicable alternative to Life Cycle Based Alternative Assessment (LCAA), as it integrates chemical molecular structure analysis and the Chemical Life Cycle Collaborative (CLiCC) web-based tool to fill in the data gaps that the former frameworks suffer from. The CLiCAA framework consists of four filtering layers, the first two being mandatory and the final two being optional assessment and data extrapolation steps. Each layer covers relevant impact categories for each chemical, ranging from human to environmental impacts, which are assessed and aggregated into unique scores for overall comparable results, with little to no data required. A feasibility study will demonstrate the efficiency and accuracy of CLiCAA while bridging both cancer potency and exposure limit data, hoping to provide the necessary categorical impact information for every firm possible, especially those disadvantaged in terms of research and resource management.

Keywords: chemical alternative assessment, LCA, LCAA, CLiCC, CLiCAA, chemical substitution framework, cancer potency data, chemical molecular structure analysis

Procedia PDF Downloads 87
645 Optimizing Detection Methods for THz Bio-imaging Applications

Authors: C. Bolakis, I. S. Karanasiou, D. Grbovic, G. Karunasiri, N. Uzunoglu

Abstract:

A new approach for efficient detection of THz radiation in biomedical imaging applications is proposed. A double-layered absorber consisting of a 32 nm thick aluminum (Al) metallic layer on a glass medium (SiO2) of 1 mm thickness was fabricated and used to design a fine-tuned absorber through theoretical and finite element modeling. The results indicate that the proposed low-cost, double-layered absorber can be tuned through the metal layer sheet resistance and the thickness of various glass media, taking advantage of the diversity of the absorption of metal films in the desired THz domain (6 to 10 THz). It was found that the composite absorber could absorb up to 86% of the incident THz power (exceeding the 50% previously shown to be the highest achievable with a single thin metal layer) while reflecting less than 1%. This approach will enable monitoring of the transmission coefficient (the THz transmission 'fingerprint') of a biosample with high accuracy, while also making the proposed double-layered absorber a good candidate for a microbolometer pixel's active element. Based on these promising results, a more sophisticated and effective double-layered absorber is under development, in which the glass medium is substituted by diluted poly-Si. The results were twofold: an absorption factor of 96% was reached, and high TCR properties were acquired. In addition, these results and properties were generalized over the active frequency spectrum: through the development of a theoretical equation taking as input any arbitrary frequency in the IR spectrum (0.3 to 405.4 THz) and giving as output the appropriate thickness of the poly-Si medium, the double-layered absorber retains the ability to absorb 96% and reflect less than 1% of the incident power. As a result, through this post-optimization process and spread-spectrum frequency adjustment, the microbolometer detector efficiency could be further improved.
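
To see where the single-film 50% ceiling mentioned above comes from, a minimal sheet-impedance sketch computes the normal-incidence absorptance of a thin conducting film of sheet resistance Rs; exceeding 50%, as the composite absorber does, requires interference with the glass (or poly-Si) layer, which needs a full multilayer or FEM treatment not shown here.

```python
Z0 = 376.73   # impedance of free space, ohms

def sheet_absorptance(rs: float, n1: float = 1.0, n2: float = 1.0) -> float:
    """A = 1 - R - T for a conducting sheet between media n1 (in) and n2 (out)."""
    g = Z0 / rs                              # normalized sheet conductance
    r = (n1 - n2 - g) / (n1 + n2 + g)        # field reflection coefficient
    t = 2.0 * n1 / (n1 + n2 + g)             # field transmission coefficient
    return 1.0 - r ** 2 - (n2 / n1) * t ** 2

for rs in (100.0, Z0 / 2, 400.0):
    print(f"Rs = {rs:6.1f} ohm/sq -> A = {sheet_absorptance(rs):.3f}")
# a freestanding film peaks at A = 0.5 when Rs = Z0/2 ~ 188 ohm/sq
```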

Keywords: bio-imaging, fine-tuned absorber, fingerprint, microbolometer

Procedia PDF Downloads 343
644 Quality Control of Distinct Cements by IR Spectroscopy: First, insights into Perspectives and Opportunities

Authors: Tobias Bader, Joerg Rickert

Abstract:

One key factor in achieving net-zero emissions along the cement and concrete value chain in Europe by 2050 is the use of distinct constituents to produce improved and advanced cements. These cements will contain constituents such as calcined clays and recycled concrete fines that are chemically similar as well as X-ray amorphous, and therefore difficult to distinguish. This leads to enhanced requirements on the analytical methods for quality control regarding accuracy and reproducibility, due to the more complex cement composition. With the methods currently provided for in the European standards, ensuring reliable analyses of the composition of these cements will be a challenge. In an ongoing research project, infrared (IR) spectroscopy in combination with mathematical tools (chemometrics) is being evaluated as an additional analytical method, fast and with little preparation effort, for the characterization of silicate-based cement constituents. The resulting comprehensive database should facilitate determination of the composition of new cements. First results confirmed the applicability of near-infrared (NIR) spectroscopy for the characterization of traditional silicate-based cement constituents (e.g. clinker, granulated blast furnace slag) and modern X-ray amorphous constituents (e.g. calcined clay, recycled concrete fines), as well as different sulfate species (e.g. gypsum, hemihydrate, anhydrite). A multivariate calibration model based on numerous calibration mixtures is in preparation. The final analytical concept to be developed will form the basis for establishing IR spectroscopy as a rapid analytical method for characterizing material flows of known and unknown inorganic substances according to their material properties, online and offline. The underlying project was funded by the Federal Institute for Research on Building, Urban Affairs and Spatial Development on behalf of the Federal Ministry of Housing, Urban Development and Building with funds from the 'Zukunft Bau' research programme.
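
A minimal chemometrics sketch of the kind of multivariate calibration described above: a partial least squares (PLS) model mapping spectra of mixtures to constituent mass fractions. The spectra here are synthetic linear mixtures; the project's calibration is built on measured mixtures instead.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_samples, n_channels = 60, 300
fractions = rng.dirichlet(np.ones(3), size=n_samples)   # 3 constituents
end_members = rng.random((3, n_channels))               # stand-in pure spectra
X = fractions @ end_members \
    + 0.01 * rng.standard_normal((n_samples, n_channels))  # mixture spectra

pls = PLSRegression(n_components=5).fit(X[:45], fractions[:45])
print("held-out R^2:", round(pls.score(X[45:], fractions[45:]), 3))
```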

Keywords: cement, infrared spectroscopy, quality control, X-ray amorphous

Procedia PDF Downloads 33
643 Effect of Modulation Factors on Tomotherapy Plans and Their Quality Analysis

Authors: Asawari Alok Pawaskar

Abstract:

This study investigated the discrepancies observed during quality assurance (QA) of helical tomotherapy plans performed with the IBA MatriXX detector array. A selection of tomotherapy plans that initially failed the MatriXX QA process was chosen for this investigation. These plans failed the fluence analysis as assessed using gamma criteria (3%, 3 mm). Each of these plans was modified (keeping the planning constraints the same), and the beamlets were rebatched and reoptimized. The fluence in a circumferential plane, measured with a diode array, was assessed while increasing and decreasing the modulation factor. A subset of these plans was also investigated using varied pitch values. The factors examined for each plan were point doses, fluences, leaf opening times, planned leaf sinograms, and uniformity indices. To ensure that the treatment constraints remained the same, the dose-volume histograms (DVHs) of all modulated plans were compared to the original plan. It was observed that a large increase in the modulation factor did not significantly improve DVH uniformity but reduced the gamma analysis pass rate. It also increased the treatment delivery time by slowing the gantry rotation, which in turn increased the ratio of maximum to mean non-zero leaf open time. Increasing or decreasing the pitch value did not substantially change treatment time, but delivery accuracy was adversely affected; this may be due to other factors, such as the complexity of the treatment plan and the treatment site. Patient sites included in this study were head and neck, breast, and abdomen. The impact of leaf timing inaccuracies was greater for plans with higher modulation factors, while point-dose measurements were less susceptible to changes in pitch and modulation factor. Choosing the initial modulation factor in the optimizer such that the TPS-generated 'actual' modulation factor fell within the range of 1.4 to 2.5 resulted in an improved deliverable plan.
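
For reference, the (3%, 3 mm) criterion above is a gamma analysis: a measured point passes if some reference point lies within a combined dose-difference / distance-to-agreement ellipsoid. A minimal 1-D version is sketched below; the profiles and grid are placeholders, and clinical QA software evaluates this in 2-D or 3-D with interpolation.

```python
import numpy as np

def gamma_pass_rate(x_mm, dose_ref, dose_eval, dose_tol=0.03, dta_mm=3.0):
    """1-D global gamma analysis with a (3%, 3 mm) default criterion.

    dose_ref and dose_eval are profiles sampled on the same grid x_mm;
    the dose criterion is taken relative to the reference maximum.
    """
    dd = dose_tol * dose_ref.max()
    gamma = np.empty_like(np.asarray(dose_eval, dtype=float))
    for i, (xi, de) in enumerate(zip(x_mm, dose_eval)):
        dist_term = ((x_mm - xi) / dta_mm) ** 2
        dose_term = ((dose_ref - de) / dd) ** 2
        gamma[i] = np.sqrt((dist_term + dose_term).min())
    return 100.0 * np.mean(gamma <= 1.0)

x = np.arange(-50.0, 50.5, 0.5)                # mm, synthetic grid
ref = 100.0 * np.exp(-(x / 30.0) ** 4)         # synthetic reference profile
ev = 1.02 * ref + 0.5                          # perturbed "measured" profile
print(f"gamma pass rate: {gamma_pass_rate(x, ref, ev):.1f}%")
```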

Keywords: dose volume histogram, modulation factor, IBA matrix, tomotherapy

Procedia PDF Downloads 172
642 Synthesis of Double Dye-Doped Silica Nanoparticles and Its Application in Paper-Based Chromatography

Authors: Ka Ho Yau, Jan Frederick Engels, Kwok Kei Lai, Reinhard Renneberg

Abstract:

Lateral flow tests are a prevalent technology in various sectors such as food, pharmacology, and biomedical sciences. Colloidal gold (CG) is widely used as the signalling molecule because of its ease of synthesis, its biomolecular conjugation, and its red colour due to the intrinsic surface plasmon resonance effect. However, the production of colloidal gold is costly and requires vigorous conditions, and its stability is easily affected by environmental factors such as pH and high salt content. Silica nanoparticles, by contrast, are well known for their ease of production and stability over a wide range of solvents. Using a reverse (water-in-oil) microemulsion, silica nanoparticles of different sizes can be produced precisely by controlling the amount of water, and by incorporating different water-soluble dyes, a rainbow of silica nanoparticle colours can be obtained. Conjugation with biomolecules such as antibodies can be achieved after surface modification of the silica nanoparticles with an organosilane. The optimum amount of antibody for labelling was determined by Bradford assay. In this work, we demonstrated the ability of dye-doped silica nanoparticles to act as the signalling molecule in a lateral flow test, providing a semi-quantitative measurement of the analyte. Image analysis yielded a limit of detection (LOD) of 10 ng of analyte. The working range and the linear range of the test were 0 to 2.15 μg/mL and 0 to 1.07 μg/mL (R² = 0.988), respectively. The performance of the tests was comparable to those using colloidal gold, with the advantages of lower cost, enhanced stability, and a wide spectrum of colours. The positive lines can be read by the naked eye or imaged with a mobile phone camera for better quantification. Further research has been carried out on the simultaneous multicolour detection of different biomarkers; preliminary results are promising, with little cross-reactivity observed in an optimized system. This approach provides a platform for multicolour detection of a set of biomarkers, enhancing the accuracy of disease diagnostics.
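
A linear range with a quoted R² is typically obtained by an ordinary least-squares fit of line intensity against analyte concentration. The sketch below shows that calculation on invented example readings; the concentrations and intensities are placeholders, not the paper's data.

```python
import numpy as np

# hypothetical line-intensity readings from phone-camera images;
# concentrations span the reported linear range (not the paper's data)
conc = np.array([0.00, 0.13, 0.27, 0.54, 0.80, 1.07])      # ug/mL
signal = np.array([0.0, 11.8, 24.1, 49.5, 71.6, 96.3])     # arbitrary units

slope, intercept = np.polyfit(conc, signal, 1)              # least squares
pred = slope * conc + intercept
r2 = 1.0 - np.sum((signal - pred) ** 2) / np.sum((signal - signal.mean()) ** 2)
print(f"signal = {slope:.1f} * c + {intercept:.2f},  R^2 = {r2:.3f}")
```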

Keywords: colorimetric detection, immunosensor, paper-based biosensor, silica

Procedia PDF Downloads 377
641 Investigating the Effectiveness of Multilingual NLP Models for Sentiment Analysis

Authors: Othmane Touri, Sanaa El Filali, El Habib Benlahmar

Abstract:

Natural language processing (NLP) has gained significant attention lately and has proved its ability to analyze and extract insights from unstructured text data in various languages. One of its most popular applications is sentiment analysis, which aims to identify the sentiment expressed in a piece of text, such as positive, negative, or neutral, in multiple languages. While several multilingual NLP models are available for sentiment analysis, their effectiveness in different contexts and applications still needs to be investigated. In this study, we investigate the effectiveness of different multilingual NLP models for sentiment analysis on a dataset of online product reviews in multiple languages. Several models, including Google Cloud Natural Language API, Microsoft Azure Cognitive Services, Amazon Comprehend, Stanford CoreNLP, spaCy, and Hugging Face Transformers, are compared. The models are evaluated on several metrics, including accuracy, precision, recall, and F1 score, and their performance is compared across different categories of product reviews. To run the study, the dataset was preprocessed by cleaning and tokenizing the text data in multiple languages. Each model was then trained and tested using a cross-validation approach, randomly dividing the dataset into training and testing sets and repeating the process multiple times. A grid search was applied to optimize the hyperparameters of each model and to select the best-performing model for each category of product reviews and each language. The findings of this study provide insights into the effectiveness of different multilingual NLP models for sentiment analysis and their suitability for different languages and applications. The strengths and limitations of each model are identified, and recommendations are provided for selecting the best-performing model based on the specific requirements of a project. This study contributes to the advancement of research methods in multilingual NLP and provides a practical guide for researchers and practitioners in the field.
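
A model-agnostic way to run such a comparison is to wrap each system behind a common prediction function and score it fold by fold. The sketch below assumes nothing about the specific APIs; `predict_fn` is a hypothetical callable you would implement per model (a cloud API call, a spaCy pipeline, a Transformers pipeline, and so on).

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from sklearn.model_selection import KFold

def evaluate_model(predict_fn, texts, labels, n_splits=5, seed=42):
    """Score one sentiment model fold by fold.

    predict_fn is a hypothetical callable mapping a list of texts to
    predicted labels -- wrap a cloud API, spaCy pipeline, or
    Transformers pipeline behind it as needed.
    """
    texts = np.asarray(texts, dtype=object)
    labels = np.asarray(labels)
    scores = []
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    for _, test_idx in kf.split(texts):
        y_true = labels[test_idx]
        y_pred = predict_fn(list(texts[test_idx]))
        prec, rec, f1, _ = precision_recall_fscore_support(
            y_true, y_pred, average="macro", zero_division=0)
        scores.append((accuracy_score(y_true, y_pred), prec, rec, f1))
    return np.mean(scores, axis=0)    # accuracy, precision, recall, F1

# e.g. a trivial baseline: predict_fn = lambda batch: ["positive"] * len(batch)
```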

Keywords: NLP, multilingual, sentiment analysis, texts

Procedia PDF Downloads 95
640 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photon Beams in a Varian Linear Accelerator

Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain

Abstract:

Objective: The main purpose of this study is to compare the percent depth doses (PDDs) and in-plane and cross-plane profiles of the Varian golden beam data with measured data for 6 and 18 MV photons, for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles, and dose-rate tables for open and wedged fields into the treatment planning system, enabling calculation of monitor units (MUs) and dose distributions. Varian offers a generic set of beam data as reference data but does not recommend it for clinical use. In this study, we compared the generic beam data with measured beam data to evaluate whether the generic data are reliable enough for clinical purposes. Methods and Material: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured following Varian's algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a Semiflex ionization chamber and MEPHYSTO software. The publicly available Varian golden beam data were compared with the measured data to evaluate their accuracy for commissioning the Eclipse treatment planning system. Results: The deviation between the measured and golden beam data was at most about 2%. For PDDs, the deviation increased at greater depths; similarly, the profiles showed increasing deviation at large field sizes and greater depths. Conclusion: The study shows that the percentage deviation between measured and golden beam data is within the acceptable tolerance, and the golden beam data can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.
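
The depth-wise comparison above amounts to interpolating the golden-beam PDD onto the measured depth grid and taking the percent difference. A minimal sketch follows; the function name and the 2% tolerance in the usage comment are chosen for illustration.

```python
import numpy as np

def pdd_deviation(depth_meas, pdd_meas, depth_gold, pdd_gold):
    """Percent deviation of measured vs. golden-beam PDD, with the golden
    curve interpolated onto the measured depth grid."""
    gold = np.interp(depth_meas, depth_gold, pdd_gold)
    return 100.0 * (pdd_meas - gold) / gold

# e.g. flag depths outside a 2% tolerance:
# dev = pdd_deviation(d, m, d_g, g); print(d[np.abs(dev) > 2.0])
```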

Keywords: percent depth dose, flatness, symmetry, golden beam data

Procedia PDF Downloads 484
639 UEFA Super Cup: Economic Effects on Georgian Economy

Authors: Giorgi Bregadze

Abstract:

Tourism is the most viable and sustainable economic development option for Georgia and one of the main sources of foreign exchange earnings. Events are considered one of the most effective ways to attract foreign visitors to the country, and the government of Georgia has recently begun investing in this sector very actively. This article stresses the necessity of research-based economic policy in the tourism sector. In this regard, it is of paramount importance to measure the economic effects of events that are subsidized with taxpayers' money. The economic effect of events can be analyzed from two perspectives: the financial perspective of the government and the perspective of the tourism administration. The article emphasizes the more realistic and all-inclusive focus of the tourism administration's analysis, as it concentrates on the income of residents and local businesses, part of which generates tax revenue for the government. The public would like to know what the economic returns to investment are, and the methodology used here to describe the economic effects of the UEFA Super Cup held in Tbilisi helps to answer this question. The methodology is based on three main principles and covers three stages. Using the suggested methodology, the article estimates the direct economic effect of the UEFA Super Cup on the Georgian economy. Although this attempt at an economic effect analysis was successful, some obstacles and insufficiencies were identified during the survey, and the article offers several recommendations that will help to refine the methodology and improve the accuracy of the data. Furthermore, it is very important to establish a correct standard for measuring events in Georgia; in that case, the unethical measurement practices widely used by some research companies would not push others into reporting overestimated effects. To the author's best knowledge, this is the first attempt to measure the economic effect of an event held in Georgia.
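
The core of a direct-effect estimate of this kind is visitor spending attributable to the event, with "time switchers" (who would have visited anyway at another time) and "casuals" (who were already in town), as named in the keywords below, excluded. The sketch uses entirely hypothetical numbers, not the survey's figures.

```python
def direct_economic_effect(visitors, avg_daily_spend, avg_nights,
                           share_time_switchers=0.0, share_casuals=0.0):
    """Direct effect: spending by visitors who came because of the event.

    Time switchers (would have come anyway, at another time) and casuals
    (were already in town) are excluded, following standard practice in
    event impact analysis. All parameter values here are placeholders.
    """
    attributable = visitors * (1.0 - share_time_switchers - share_casuals)
    return attributable * avg_daily_spend * avg_nights

# entirely hypothetical inputs, not survey results:
print(direct_economic_effect(15_000, 120.0, 2.5,
                             share_time_switchers=0.10, share_casuals=0.15))
```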

Keywords: biased economic effect analysis, expenditure of local citizens, time switchers and casuals, UEFA super cup

Procedia PDF Downloads 151
638 Improving Fingerprinting-Based Localization System Using Generative Artificial Intelligence

Authors: Getaneh Berie Tarekegn

Abstract:

A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities, including traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method of providing continuous positioning services in outdoor environments is a global navigation satellite system (GNSS). However, owing to non-line-of-sight conditions, multipath, and weather, GNSS does not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. We present a novel semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. We also employ a reliable signal fingerprint feature extraction method based on t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the site-surveying workload required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 39 cm, and more than 90% of the errors are less than 82 cm. That is, the numerical results proved that, in comparison with traditional methods, the proposed method can significantly improve positioning performance and reduce radio map construction costs.
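
As a toy illustration of the t-SNE step, the sketch below embeds synthetic received-signal-strength fingerprints into two dimensions. The matrix shape and signal statistics are invented; note also that standard t-SNE has no out-of-sample transform, so a production pipeline would pair it with a learned mapping for new samples.

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
# hypothetical hybrid WLAN + LTE fingerprints: 500 samples x 40 RSS sources
rss = rng.normal(loc=-80.0, scale=8.0, size=(500, 40))

# embed the noisy high-dimensional fingerprints into a 2-D feature space
features = TSNE(n_components=2, perplexity=30.0,
                random_state=0).fit_transform(rss)
print(features.shape)    # (500, 2)
```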

Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine

Procedia PDF Downloads 65
637 Downscaling GRACE Gravity Models Using Spectral Combination Techniques for Terrestrial Water Storage and Groundwater Storage Estimation

Authors: Farzam Fatolazadeh, Kalifa Goita, Mehdi Eshagh, Shusen Wang

Abstract:

The Gravity Recovery and Climate Experiment (GRACE) is a satellite mission with twin satellites for the precise determination of spatial and temporal variations in the Earth's gravity field. The products of this mission are monthly global gravity models containing the spherical harmonic coefficients and their errors. These GRACE models can be used to estimate terrestrial water storage (TWS) variations across the globe at large scales, thereby offering an opportunity for surface water and groundwater storage (GWS) assessments. Yet the ability of GRACE to monitor changes at smaller scales is too limited for local water management authorities, largely because of the low spatial and temporal resolutions of its models (~200,000 km² and one month, respectively). High-resolution GRACE data products would substantially enrich the information needed by local-scale decision-makers while providing data for regions that lack adequate in situ monitoring networks, including northern parts of Canada. Such products could eventually be obtained through downscaling. In this study, we extended the spectral combination theory to downscale GRACE simultaneously in space, from its coarse 3° resolution to 0.25°, and in time, from monthly to daily resolution. The method combines the monthly gravity field solutions of GRACE with daily hydrological model products, in the form of both low- and high-frequency signals, to produce TWSA and GWSA products of high spatiotemporal resolution. The main contribution and originality of this study is to consider GRACE and the hydrological variables, together with their uncertainties, comprehensively and simultaneously when forming the estimator in the spectral domain. It is therefore expected that the downscaled products will reach an acceptable accuracy.
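
The essence of a spectral combination of this kind is a minimum-variance blend of the two data sources, weighted by their error variances per spectral component. The sketch below is a much simplified stand-in for the full estimator (no spherical-harmonic synthesis, no degree-dependent filtering); the function and its inputs are illustrative.

```python
import numpy as np

def spectral_combine(coeff_grace, var_grace, coeff_hydro, var_hydro):
    """Minimum-variance blend of two coefficient sets.

    Each source is weighted by the inverse of its error variance,
    element-wise over the spectral coefficients -- a simplified
    stand-in for the full spectral-combination estimator.
    """
    cg, vg = np.asarray(coeff_grace), np.asarray(var_grace)
    ch, vh = np.asarray(coeff_hydro), np.asarray(var_hydro)
    w = vh / (vg + vh)             # weight on GRACE per coefficient
    return w * cg + (1.0 - w) * ch
```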

Keywords: GRACE satellite, groundwater storage, spectral combination, terrestrial water storage

Procedia PDF Downloads 80