Search results for: digital imaging and communications in medicine (DICOM)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5922

2262 Analysis of Cell Cycle Status in Radiation Non-Targeted Hepatoma Cells Using Flow Cytometry: Evidence of Dose Dependent Response

Authors: Sharmi Mukherjee, Anindita Chakraborty

Abstract:

Cellular irradiation incites complex responses, including arrest of cell cycle progression. This article accentuates the effects of radiation on the cell cycle status of radiation non-targeted cells. Human hepatoma HepG2 cells were exposed to increasing doses of γ radiation (1, 2, 4, 6 Gy), and their cell culture media were transferred to non-targeted HepG2 cells cultured in other Petri plates. These non-targeted cells cultured in the ICCM (irradiated cell conditioned media) were the bystander cells on which cell cycle analysis was performed using flow cytometry. An apparent decrease in the distribution of bystander cells at the G0/G1 phase was observed with increased radiation doses up to 4 Gy, representing a linear relationship. This was accompanied by a gradual increase in cellular distribution at the G2/M phase. Interestingly, the numbers of cells in the G2/M phase at 1 and 2 Gy irradiation were not significantly different from each other. However, the percentage of G2 phase cells at the 4 and 6 Gy doses was significantly higher than at the 2 Gy dose, indicating the IC50 dose to be between 2 and 4 Gy. Cell cycle arrest is an indirect indicator of genotoxic damage in cells. In this study, bystander stress signals transmitted through the cell culture media of irradiated cells disseminated radiation-induced DNA damage to the non-targeted cells, which resulted in arrest of cell cycle progression at the G2/M phase checkpoint. This implies that actual radiation biological effects represent a penumbra, with effects encompassing a larger area than the actual beam. This article highlights the existence of genotoxic damage as a bystander effect of γ rays in human hepatoma cells by cell cycle analysis and opens up avenues for the appraisal of bystander stress communications between tumor cells. Once the underlying signaling mechanisms are understood, they can be manipulated to maximize the damaging effects of radiation with a minimum dose, and thus have therapeutic applications.

Keywords: bystander effect, cell cycle, genotoxic damage, hepatoma

Procedia PDF Downloads 184
2261 Detection of Autistic Children's Voice Based on Artificial Neural Network

Authors: Royan Dawud Aldian, Endah Purwanti, Soegianto Soelistiono

Abstract:

In this research, we developed an automatic system to classify children's voices as normal or autistic using modern computational technology based on an artificial neural network. The strength of this technology is its capability in processing and storing data. Digital voice features were obtained from linear predictive coding coefficients computed with the autocorrelation method and transformed into the frequency domain using the fast Fourier transform; these features served as input to an artificial neural network trained by the back-propagation method, so that normal and autistic children's voices could be distinguished automatically. The back-propagation classifier achieved a success rate of 100% on the normal children's voice test data and 100% on the autistic children's voice test data, giving a success rate of 100% for the entire test set.
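The feature-extraction step described above — LPC coefficients computed with the autocorrelation method — can be sketched with the standard Levinson-Durbin recursion. This is a minimal, illustrative implementation; the model order, window length, and the FFT and network stages actually used by the authors are not specified here, and the AR(1) test signal is an assumption for demonstration only.

```python
import random

def autocorrelation(signal, max_lag):
    """Biased autocorrelation r[0..max_lag] of a 1-D signal."""
    n = len(signal)
    return [sum(signal[i] * signal[i + lag] for i in range(n - lag))
            for lag in range(max_lag + 1)]

def lpc(signal, order):
    """LPC coefficients via the Levinson-Durbin recursion.

    Returns (a, e): a[0] == 1.0 and a[1..order] are the prediction-error
    filter coefficients; e is the residual prediction-error power.
    """
    r = autocorrelation(signal, order)
    a = [1.0] + [0.0] * order
    e = r[0]
    for i in range(1, order + 1):
        # Reflection coefficient for this step of the recursion.
        k = -(r[i] + sum(a[j] * r[i - j] for j in range(1, i))) / e
        new_a = a[:]
        new_a[i] = k
        for j in range(1, i):
            new_a[j] = a[j] + k * a[i - j]
        a = new_a
        e *= 1.0 - k * k
    return a, e

# Illustrative check: an AR(1) process x[n] = 0.9*x[n-1] + noise
# should yield a first LPC coefficient close to -0.9.
random.seed(0)
x = [0.0]
for _ in range(5000):
    x.append(0.9 * x[-1] + random.gauss(0.0, 1.0))
coeffs, err = lpc(x, 2)
```

In a full pipeline, such coefficients (or their FFT) would form the input vector fed to the back-propagation network.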

Keywords: autism, artificial neural network, backpropagation, linear predictive coding, fast Fourier transform

Procedia PDF Downloads 461
2260 The Role of Named Entity Recognition for Information Extraction

Authors: Girma Yohannis Bade, Olga Kolesnikova, Grigori Sidorov

Abstract:

Named entity recognition (NER) is a building block for information extraction. Though the information extraction process has been automated using a variety of techniques to find and extract relevant information from unstructured documents, the discovery of targeted knowledge still poses a number of research difficulties because of the variability and lack of structure in Web data. NER, a subtask of information extraction (IE), emerged to ease this difficulty. It deals with finding proper names (named entities), such as names of persons, countries, locations, organizations, dates, and events in a document, and categorizing them under predetermined labels, which is an initial step in IE tasks. This survey paper presents the roles and importance of NER to IE from the perspective of different algorithms and application domains. It summarizes how researchers have applied NER in particular application areas such as finance, medicine, defense, business, food science, and archeology. It also outlines the three families of sequence labeling algorithms for NER: rule-based, feature-based, and neural network-based. Finally, the state of the art and evaluation metrics of NER are presented.
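As a concrete illustration of the rule-based family mentioned above, a toy NER tagger can combine a gazetteer lookup with a date pattern. Every name, label, and pattern below is an illustrative assumption, not part of any surveyed system; real rule-based NER uses much larger curated lists and richer patterns.

```python
import re

# Toy gazetteer mapping known surface forms to entity labels.
GAZETTEER = {
    "Ethiopia": "COUNTRY",
    "Addis Ababa": "LOCATION",
    "World Health Organization": "ORGANIZATION",
}

# Matches dates written like "5 May 2021".
DATE_RE = re.compile(
    r"\b\d{1,2} (?:January|February|March|April|May|June|July|"
    r"August|September|October|November|December) \d{4}\b"
)

def tag_entities(text):
    """Return (surface form, label) pairs found in the text."""
    entities = []
    for name, label in GAZETTEER.items():
        for match in re.finditer(re.escape(name), text):
            entities.append((match.group(), label))
    for match in DATE_RE.finditer(text):
        entities.append((match.group(), "DATE"))
    return entities

sentence = ("The World Health Organization opened an office in "
            "Addis Ababa, Ethiopia on 5 May 2021.")
found = tag_entities(sentence)
```

Feature-based and neural approaches replace these hand-written rules with learned sequence labelers over the same token stream.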

Keywords: the role of NER, named entity recognition, information extraction, sequence labeling algorithms, named entity application area

Procedia PDF Downloads 82
2259 ChatGPT 4.0 Demonstrates Strong Performance in Standardised Medical Licensing Examinations: Insights and Implications for Medical Educators

Authors: K. O'Malley

Abstract:

Background: The emergence and rapid evolution of large language models (LLMs), i.e., models of generative artificial intelligence (AI), has been unprecedented. ChatGPT is one of the most widely used LLM platforms. Using natural language processing technology, it generates customized responses to user prompts, enabling it to mimic human conversation. Responses are generated using predictive modeling of vast swathes of internet text and data and are further refined and reinforced through user feedback. The popularity of LLMs is increasing, with a growing number of students utilizing these platforms for study and revision purposes. Notwithstanding its many novel applications, LLM technology is inherently susceptible to bias and error. This poses a significant challenge in the educational setting, where academic integrity may be undermined. This study aims to evaluate the performance of the latest iteration of ChatGPT (ChatGPT 4.0) in standardized state medical licensing examinations. Methods: A considered search strategy was used to interrogate the PubMed electronic database. The keywords ‘ChatGPT’ AND ‘medical education’ OR ‘medical school’ OR ‘medical licensing exam’ were used to identify relevant literature. The search included all peer-reviewed literature published in the past five years and was limited to publications in the English language. Eligibility was ascertained based on the study title and abstract and confirmed by consulting the full-text document. Data were extracted into a Microsoft Excel document for analysis. Results: The search yielded 345 publications that were screened. 225 original articles were identified, of which 11 met the pre-determined criteria for inclusion in a narrative synthesis. These studies included performance assessments in national medical licensing examinations from the United States, United Kingdom, Saudi Arabia, Poland, Taiwan, Japan and Germany. ChatGPT 4.0 achieved scores ranging from 67.1 to 88.6 percent. The mean score across all studies was 82.49 percent (SD = 5.95). In all studies, ChatGPT exceeded the threshold for a passing grade in the corresponding exam. Conclusion: The capabilities of ChatGPT in standardized academic assessment in medicine are robust. While this technology can potentially revolutionize higher education, it also presents several challenges with which educators have not had to contend before. The overall strong performance of ChatGPT, as outlined above, may lend itself to unfair use (such as the plagiarism of deliverable coursework) and pose unforeseen ethical challenges (arising from algorithmic bias). Conversely, it highlights potential pitfalls if users assume LLM-generated content to be entirely accurate. In the aforementioned studies, ChatGPT exhibits a margin of error between 11.4 and 32.9 percent, which resonates strongly with concerns regarding the quality and veracity of LLM-generated content. It is imperative to highlight these limitations, particularly to students in the early stages of their education, who are less likely to possess the requisite insight or knowledge to recognize errors, inaccuracies or false information. Educators must inform themselves of these emerging challenges to effectively address them and mitigate potential disruption in academic fora.

Keywords: artificial intelligence, ChatGPT, generative AI, large language models, licensing exam, medical education, medicine, university

Procedia PDF Downloads 35
2258 Investigation of Different Conditions to Detect Cycles in Linearly Implicit Quantized State Systems

Authors: Elmongi Elbellili, Ben Lauwens, Daan Huybrechs

Abstract:

The increasing complexity of modern engineering systems presents a challenge to their digital simulation; such systems can usually be represented by differential equations. The Linearly Implicit Quantized State System (LIQSS) method offers an alternative to traditional numerical integration techniques for solving ordinary differential equations (ODEs) and has proved effective for handling discontinuous and large stiff systems. However, the inherent discrete nature of LIQSS may introduce oscillations that result in unnecessary computational steps. The current oscillation detection mechanism relies on a condition that checks the significance of the derivatives, but it could be further improved. This paper describes a different cycle detection mechanism and presents the outcomes of using first-order LIQSS to simulate the advection-diffusion problem. The efficiency of the new cycle detection mechanism is verified by comparing the performance of the current solver against the new version, as well as against a reference solution computed with a Runge-Kutta method of order 14.
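To give a feel for quantized-state integration, here is a minimal sketch of the explicit first-order scheme (QSS1) for a scalar ODE. LIQSS itself differs by choosing the quantized value linearly implicitly to cope with stiffness, and the cycle-detection logic discussed above is not shown — this is an illustrative sketch under simplifying assumptions, not the authors' solver.

```python
import math

def qss1(f, x0, quantum, t_end):
    """Explicit first-order quantized-state integration of x' = f(x).

    The state advances by exactly one quantum per event; the step size
    is the time needed for x to change by that quantum at the current slope.
    """
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_end:
        dx = f(x)          # slope evaluated at the quantized state
        if dx == 0.0:
            break          # equilibrium: no further events
        dt = quantum / abs(dx)
        t += dt
        x += math.copysign(quantum, dx)
        times.append(t)
        states.append(x)
    return times, states

# Example: x' = -x with x(0) = 1 tracks exp(-t) to within roughly one quantum.
ts, xs = qss1(lambda x: -x, 1.0, 0.1, 1.0)
```

Note the step size adapts automatically: as the slope flattens, events become sparser, which is what makes quantized-state methods attractive for systems with discontinuities.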

Keywords: numerical integration, quantized state systems, ordinary differential equations, stiffness, cycle detection, simulation

Procedia PDF Downloads 62
2257 Performance Evaluation and Kinetics of Artocarpus heterophyllus Seed for the Purification of Paint Industrial Wastewater by Coagulation-Flocculation Process

Authors: Ifeoma Maryjane Iloamaeke, Kelvin Obazie, Mmesoma Offornze, Chiamaka Marysilvia Ifeaghalu, Cecilia Aduaka, Ugomma Chibuzo Onyeije, Claudine Ifunanaya Ogu, Ngozi Anastesia Okonkwo

Abstract:

This work investigated the effects of pH, settling time, and coagulant dosage on the removal of color, turbidity, and heavy metals from paint industrial wastewater using the seed of Artocarpus heterophyllus (AH) by the coagulation-flocculation process. The paint effluent was physicochemically characterized, while the AH coagulant was instrumentally characterized by scanning electron microscopy (SEM), Fourier transform infrared spectroscopy (FTIR), and X-ray diffraction (XRD). A jar test experiment was used for the coagulation-flocculation process. The results showed that the paint effluent was polluted with color, turbidity (36000 NTU), mercury (1.392 mg/L), lead (0.252 mg/L), arsenic (1.236 mg/L), TSS (63.40 mg/L), and COD (121.70 mg/L). The maximum color removal efficiency was 94.33% at a dosage of 0.2 g/L and pH 2 at a constant time of 50 mins, and 74.67% at constant pH 2, a coagulant dosage of 0.2 g/L, and 50 mins. The highest turbidity removal efficiency was 99.94% at 0.2 g/L and 50 mins at constant pH 2, and 96.66% at pH 2 and 0.2 g/L at a constant time of 50 mins. A mercury removal efficiency of 99.29% was achieved at the optimal condition of 0.8 g/L coagulant dosage, pH 8, and a constant time of 50 mins, and 99.57% at a coagulant dosage of 0.8 g/L, a time of 50 mins, and constant pH 8. The highest lead removal efficiency was 99.76% at a coagulant dosage of 10 g/L and a time of 40 mins at constant pH 10, and 96.53% at pH 10 and a coagulant dosage of 10 g/L at a constant time of 40 mins. For arsenic, the removal efficiency was 75.24% at 0.8 g/L coagulant dosage, a time of 40 mins, and a constant pH of 8. XRD imaging showed that the Artocarpus heterophyllus coagulant was crystalline before treatment and changed to amorphous after treatment. The SEM and FTIR results for the AH coagulant and sludge suggested changes in surface morphology and functional groups before and after treatment. The reaction kinetics were best described by a second-order model.
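The removal-efficiency percentages and the second-order kinetic fit reported above follow standard formulas: efficiency = (C0 − C)/C0 × 100, and for second-order kinetics 1/C = 1/C0 + kt, so k is the slope of 1/C versus t. A minimal sketch, using made-up concentrations rather than the paper's data:

```python
def removal_efficiency(c0, c):
    """Percent of the pollutant removed, from initial and final concentrations."""
    return (c0 - c) / c0 * 100.0

def second_order_rate_constant(times, concs):
    """Least-squares slope of 1/C versus t (second-order kinetics:
    1/C = 1/C0 + k*t), returning the rate constant k."""
    ys = [1.0 / c for c in concs]
    n = len(times)
    mean_t = sum(times) / n
    mean_y = sum(ys) / n
    num = sum((t - mean_t) * (y - mean_y) for t, y in zip(times, ys))
    den = sum((t - mean_t) ** 2 for t in times)
    return num / den

# Synthetic second-order data: C0 = 100 mg/L, k = 0.002 L/(mg*min).
times = [0.0, 10.0, 20.0, 30.0, 40.0, 50.0]
concs = [1.0 / (1.0 / 100.0 + 0.002 * t) for t in times]
k_est = second_order_rate_constant(times, concs)
eff = removal_efficiency(100.0, concs[-1])
```

Fitting the linearized form (1/C against t) and checking the fit quality is the usual way such jar-test data are assigned a reaction order.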

Keywords: Artocarpus heterophyllus, coagulation-flocculation, coagulant dosages, settling time, paint effluent

Procedia PDF Downloads 101
2256 Automatic Near-Infrared Image Colorization Using Synthetic Images

Authors: Yoganathan Karthik, Guhanathan Poravi

Abstract:

Colorizing near-infrared (NIR) images poses unique challenges due to the absence of color information and the nuances in light absorption. In this paper, we present an approach to NIR image colorization utilizing a synthetic dataset generated from visible light images. Our method addresses two major challenges encountered in NIR image colorization: accurately colorizing objects with color variations and avoiding over/under saturation in dimly lit scenes. To tackle these challenges, we propose a Generative Adversarial Network (GAN)-based framework that learns to map NIR images to their corresponding colorized versions. The synthetic dataset ensures diverse color representations, enabling the model to effectively handle objects with varying hues and shades. Furthermore, the GAN architecture facilitates the generation of realistic colorizations while preserving the integrity of dimly lit scenes, thus mitigating issues related to over/under saturation. Experimental results on benchmark NIR image datasets demonstrate the efficacy of our approach in producing high-quality colorizations with improved color accuracy and naturalness. Quantitative evaluations and comparative studies validate the superiority of our method over existing techniques, showcasing its robustness and generalization capability across diverse NIR image scenarios. Our research not only contributes to advancing NIR image colorization but also underscores the importance of synthetic datasets and GANs in addressing domain-specific challenges in image processing tasks. The proposed framework holds promise for various applications in remote sensing, medical imaging, and surveillance where accurate color representation of NIR imagery is crucial for analysis and interpretation.

Keywords: computer vision, near-infrared images, automatic image colorization, generative adversarial networks, synthetic data

Procedia PDF Downloads 46
2255 Regional Treatment Trends in Canada Derived from Pharmacy Records

Authors: John Chau, Tzvi Aviv

Abstract:

Cardiometabolic conditions (hypertension, diabetes, and hyperlipidemia) are major public health concerns. Analysis of all prescription records from about 10 million patients at the largest network of pharmacies in Canada reveals small year-over-year increases in the treatment prevalence of cardiometabolic diseases prior to the COVID-19 pandemic. Cardiometabolic treatment rates increase with age and are higher in males than females. Hypertension treatment rates were 24% in males and 19% in females in 2021. Diabetes treatment rates were 10% in males and 7% in females in 2021. Geospatial analysis using patient addresses reveals interesting differences among provinces and neighborhoods in Canada. Using digital surveys distributed among 8,504 Canadian adults, an increase in hypertension awareness with age and female gender was observed. However, 7% of seniors and 6% of middle-aged Canadians reported uncontrolled blood pressure (>140/90 mmHg). In addition, elevated blood pressure (130-139/80-89 mmHg) was reported by 20% of seniors and 14% of middle-aged Canadians.

Keywords: cardiometabolic conditions, diabetes, hypertension, precision public health

Procedia PDF Downloads 118
2254 Bottling the Darkness of Inner Life: Considering the Origins of Model Psychosis

Authors: Matthew Perkins-McVey

Abstract:

The pharmacological arm of mental health treatment is in a state of crisis. The promises of the Prozac century have fallen short; the number of therapeutically significant medications that successfully complete development shrinks with every passing year, and the demand for better treatments only grows. Answering these hardships is a renewed optimism concerning the efficacy of controlled psychedelic therapy, a renaissance that has seen the return of a familiar concept: intoxication as a model psychosis. First appearing in the mid-19th century and featuring in an array of 20th-century efforts in psychedelic research, model psychosis has once more come to the foreground of psychedelic research. And yet, little has been made of where this peculiar, perhaps even intoxicatingly mad, idea originates. This paper seeks to uncover the conceptual foundations underlying the early emergence of model psychosis, considering the similarities and differences among its independent early formulations. In the course of this examination, it becomes apparent that the definition of endogenous psychosis, which formed in the mid-19th century, is the direct product of emerging understandings of exogenous, or model, psychosis. Ultimately, the goal is not merely to understand how and why model psychosis became thinkable, but to examine how seemingly secondary conceptual changes can engender new ways of being a psychiatric subject.

Keywords: history of psychiatry, model psychosis, history of medicine, history of science

Procedia PDF Downloads 91
2253 Challenges for IoT Adoption in India: A Study Based on Foresight Analysis for 2025

Authors: Shruti Chopra, Vikas Rao Vadi

Abstract:

In the era of the digital world, the Internet of Things (IoT) has been receiving significant attention. Its ubiquitous connectivity between humans, machine to machine (M2M), and machines to humans gives it the potential to transform society and establish an ecosystem that serves new dimensions of the country's economy. This study has therefore attempted to identify, through a literature survey, the challenges that seem prevalent in IoT adoption in India. Further, data were collected from the opinions of experts in order to conduct a foresight analysis, and were analyzed with the help of the scenario planning process: Micmac, Mactor, Multipol, and Smic-Prob. Methodologically, the study identified the relationships between variables through variable analysis using Micmac and actor analysis using Mactor; it then attempted to generate the entire field of possibilities in terms of hypotheses and to construct various scenarios through Multipol. Lastly, the findings include the final scenarios, selected using Smic-Prob by assigning a probability (including conditional probability) to each scenario. This study may help practitioners and policymakers remove the obstacles to successfully implementing IoT in India.

Keywords: Internet of Thing (IoT), foresight analysis, scenario planning, challenges, policymaking

Procedia PDF Downloads 149
2252 The Trend of Epidemics in Population and Body Regulation in Iran (1850-1920)

Authors: Seyedfateh Moradi

Abstract:

Medical issues mark the beginning of a new form of epistemology in nineteenth-century Iran. The emergence of epidemic diseases led to the formation of a medical discourse and a conflict over the body, which displayed itself in the concepts of health progress and development. The discourse attributed to this development in the health system defines the general structure of the given period and manifested itself in the conflict between traditional and modern medicine. The regulation and classification of body and population reveal the nature of this period, during which the government attempted to adapt itself to the modern and progressive discourse. This paper seeks to reveal part of this rupture and adaptation around epidemics and modern medical discourse. The acceptance of part of the traditional discourse in the new era, or the adaptation and integration of parts of it, indicates a delegation of part of the power of traditional authorities. This delegation of power arose in the context of the discursive hegemony of Western modernism, from which there was no escape, and provided the ground for the acceptance of government and the emergence of other discourses. Finally, during the reign of Reza Shah (1925-1941), body and population planning became key issues of government, creating serious tensions in society.

Keywords: epidemic, population, body, cholera, plague

Procedia PDF Downloads 74
2251 The Correlation between Exercise Frequency and Postural Orthostatic Tachycardia Syndrome (POTS)

Authors: Dian Ariyawati, Romi Sukoco, Sinung Agung Joko

Abstract:

Background: Postural Orthostatic Tachycardia Syndrome (POTS) is a form of orthostatic intolerance caused by autonomic dysfunction and predominantly occurs in young women. Regular exercise has been proven to improve the functions of organ systems, including the autonomic system. The aim of this research was to determine the correlation between exercise frequency and POTS in young women. Method: 510 young women (16-23 years of age) were screened by interview and physical examination. The diagnosis of POTS was made with the Active Stand Test (AST) and heart rate measurement using a pulsemeter; 29 young women were found to suffer from POTS. Exercise frequency was obtained by interview. Data were statistically analyzed using the Spearman correlation test. Result: The subjects who tested positive for POTS did not perform regular exercise. The Spearman correlation test showed a moderate negative correlation between exercise frequency and POTS in young women (r = -0.487, p < 0.001). Conclusion: There is a moderate inverse correlation between exercise frequency and POTS in young women. Further studies are suggested to develop an exercise program for young women who suffer from POTS.
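The Spearman test used above is simply a Pearson correlation computed on ranks (with average ranks for ties). A minimal pure-Python sketch, with illustrative numbers rather than the study's measurements:

```python
def average_ranks(values):
    """Ranks starting at 1, averaging over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0   # mean of positions i..j, 1-based
        for idx in order[i:j + 1]:
            ranks[idx] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman correlation: Pearson correlation of the rank vectors."""
    rx, ry = average_ranks(x), average_ranks(y)
    mx = sum(rx) / len(rx)
    my = sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Illustrative data: more weekly exercise, lower symptom score (perfect inverse).
exercise_per_week = [0, 1, 2, 3, 4, 5]
symptom_score = [12, 10, 8, 6, 4, 2]
rho = spearman_rho(exercise_per_week, symptom_score)
```

A perfectly monotone decreasing relationship yields rho = -1; the study's r = -0.487 indicates a moderate, noisier inverse association.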

Keywords: POTS, autonomic dysfunction, exercise frequency, young woman

Procedia PDF Downloads 557
2250 Can We Develop a Practical and Applicable Ethic in Veterinary Health Care with a Universal Application and without Dogma?

Authors: Theodorus Holtzhausen

Abstract:

With a growing number of healthcare professionals moving freely between countries, and with a more mobile global workforce in general, awareness of cultural differences has become more urgent for health care workers seeking to provide proper care. Globalisation is slowly producing a trend in health care that may create a more uniform cultural base for administering it, but this trend is still very vulnerable to being hijacked and misdirected by major commercial interests. Veterinary and medical clinics that promote alternative remedies lacking evidence-based support while simultaneously practicing medicine as a science have become more common. Such ‘holistic’ clinics see these remedies as a belief system causing no harm and having minimal impact, but with added financial benefit to the facility. With the inarguable acceptance and realisation of the interconnection between evolutionary aspects of cognition, knowledge, and culture as a global but vulnerable cognition-gaining process affecting us all, we can see the enormous responsibility we carry. This responsibility for creating global well-being calls for a universally applicable ethic, one with the potential to have a significant impact on our cognition-gaining process.

Keywords: veterinary health care, ethics, wellbeing, veterinary clinics

Procedia PDF Downloads 642
2249 Osteoarticular Ultrasound for Diagnostic Purposes in the Practice of the Rheumatologist

Authors: A. Ibovi Mouondayi, S. Zaher, K. Nassar, S. Janani

Abstract:

Introduction: Osteoarticular ultrasound has become an essential tool for the investigation and monitoring of osteoarticular pathologies in rheumatology. It can be performed in the clinic and is cheaper to access than other imaging techniques. Important anatomical sites of inflammation in inflammatory diseases, such as the synovium, tendon sheaths, and entheses, are easily identifiable on ultrasound. Objective: The objective of this study was to evaluate the importance of ultrasound to rheumatologists in establishing diagnoses of inflammatory rheumatism in cases with an uncertain clinical presentation. Material and Methods: This is a retrospective study conducted in our department over a period of 30 months, from January 2020 to June 2022. We included all patients with inflammatory arthralgia without clinical arthritis. Patients' data were collected through the patient records system. Results: A total of 35 patients were identified, comprising 4 men and 31 women (sex ratio M/F of 0.12). The average age of the patients was 48.8 years, with extremes ranging from 17 to 83 years. All patients had had inflammatory polyarthralgia for an average of 9.3 years. Only two patients had suspected synovitis on clinical examination. 91.43% of patients had a positive inflammatory workup, with an average CRP of 22.2 mg/L. Rheumatoid factor (RF) was present in 45.7% of patients and anti-CCP antibodies in 48.57%, with respective averages of 294.43 and 314.63 international units/mL. Radiographic lesions were found in 54% of patients. Osteoarticular ultrasound was performed in all patients. Subclinical synovitis was found in 60% of patients, including 23% with a positive Doppler signal. Tenosynovitis was found in 11% of patients and enthesitis in 3%. Rheumatoid arthritis (RA) was diagnosed in 40% of patients, psoriatic arthritis in 6%, and hydroxyapatite arthritis and osteoarthritis in 3% each. Conclusion: Osteoarticular ultrasound has become an essential tool in the practice of rheumatology in recent years. It serves diagnostic purposes in chronic inflammatory rheumatism, degenerative rheumatism, and crystal-induced arthropathies, and is also essential in the follow-up of rheumatology patients.

Keywords: ultrasound, skeletal, rheumatoid arthritis, arthralgia

Procedia PDF Downloads 121
2248 Pre-Administration of Thunbergia laurifolia Linn. Prevents the Increase of Dopamine in the Nucleus Accumbens in Ethanol-Addicted Rats

Authors: Watchareewan Thongsaard, Ratirat Sangpayap, Maneekarn Namsa-Aid

Abstract:

Thunbergia laurifolia Linn. (TL) is a herbal medicine that has been used as an antidote for several poisonous agents, including insecticides, and as a component of a mixture of crude extracts used to treat drug-addicted patients. The aim of this study was to examine the level of dopamine in the nucleus accumbens after chronic pre-administration of TL in ethanol-addicted rats. Male Wistar rats weighing 200-250 g received TL methanol extract (200 mg/kg, orally) 60 minutes before 20% ethanol (1 g/kg, i.p.) for 30 days. The nucleus accumbens was removed and assayed for dopamine by HPLC-ECD. The level of dopamine was significantly increased by chronic ethanol administration, whereas chronic administration of the TL extract alone produced no difference in dopamine level compared to control. Moreover, pre-treatment with the TL extract before ethanol significantly reduced the dopamine level in the nucleus accumbens to the normal level when compared with chronic ethanol administration alone. These results suggest that the increase in dopamine level in the nucleus accumbens caused by chronic ethanol administration underlies ethanol addiction, and that this effect is prevented by chronic TL pre-administration. Furthermore, since chronic administration of the TL extract alone did not change the dopamine level in the nucleus accumbens, TL itself does not appear to cause addiction.

Keywords: Thunbergia laurifolia Linn., alcohol addiction, dopamine, nucleus accumbens

Procedia PDF Downloads 143
2247 Marker-Controlled Level-Set for Segmenting Breast Tumor from Thermal Images

Authors: Swathi Gopakumar, Sruthi Krishna, Shivasubramani Krishnamoorthy

Abstract:

Contactless, painless, and radiation-free, thermal imaging is one of the preferred screening modalities for the detection of breast cancer. However, a poor signal-to-noise ratio and the inexorable need to preserve the edges separating cancer cells from normal cells make the segmentation process difficult and hence unsuitable for computer-aided diagnosis of breast cancer. This paper presents key findings from research appraising two promising techniques for the detection of breast cancer: (i) marker-controlled level-set segmentation of an anisotropic-diffusion-filtered image versus (ii) marker-controlled level-set segmentation of a Gaussian-filtered image. Gaussian filtering processes the image uniformly, whereas anisotropic filtering acts only in specific areas of a thermographic image. The pre-processed (Gaussian-filtered and anisotropic-filtered) images of breast samples were then segmented. The segmentation of the breast starts with an initial level-set function; in this study, a marker refers to the position in the image at which the initial level-set function is applied. The markers are generally placed on the left and right sides of the breast and may vary with breast size. The proposed method was applied to images from an online database with samples collected from women of varying breast characteristics. It was observed that the breast could be segmented from the background by adjustment of the markers. The results showed that, as a pre-processing technique, anisotropic filtering with level-set segmentation preserved the edges more effectively than Gaussian filtering. The image segmented after anisotropic filtering was found to be more suitable for feature extraction, enabling automated computer-aided diagnosis of breast cancer.
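The edge-preserving behaviour that favours anisotropic filtering above can be illustrated with the classic Perona-Malik scheme: diffusion is modulated by a conduction function that vanishes across large intensity jumps. This pure-Python toy on a nested-list image is a sketch only; the grid, parameters, and conduction function are illustrative assumptions, not the study's settings.

```python
import math

def anisotropic_diffusion(img, iterations=10, kappa=20.0, lam=0.2):
    """Perona-Malik diffusion: smooth within regions, preserve strong edges.

    The conduction g(d) = exp(-(d/kappa)^2) shuts diffusion down across
    large intensity jumps (edges) while letting small ones (noise) blur.
    lam <= 0.25 keeps the explicit 4-neighbour scheme stable.
    """
    h, w = len(img), len(img[0])
    u = [row[:] for row in img]
    for _ in range(iterations):
        nxt = [row[:] for row in u]
        for i in range(h):
            for j in range(w):
                flux = 0.0
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w:
                        d = u[ni][nj] - u[i][j]
                        flux += math.exp(-(d / kappa) ** 2) * d
                nxt[i][j] = u[i][j] + lam * flux
        u = nxt
    return u

# A step edge (columns 0-2 at 0, columns 3-4 at 100) with one noisy pixel:
# the noise is flattened while the edge contrast survives.
image = [[0.0, 0.0, 0.0, 100.0, 100.0] for _ in range(5)]
image[2][1] = 10.0
smoothed = anisotropic_diffusion(image)
```

A Gaussian filter, by contrast, would blur the 0-to-100 edge along with the noise, which is exactly the trade-off the study compares.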

Keywords: anisotropic diffusion, breast, Gaussian, level-set, thermograms

Procedia PDF Downloads 380
2246 Demographic Characteristics and Factors Affecting Mortality in Pediatric Trauma Patients Who Are Admitted to Emergency Service

Authors: Latif Duran, Erdem Aydin, Ahmet Baydin, Ali Kemal Erenler, Iskender Aksoy

Abstract:

Aim: In this retrospective study, we aim to contribute to the literature by presenting the demographic characteristics of pediatric trauma patients, the factors that may cause mortality, and proposals for measures to reduce it. Material and Method: This study was performed by retrospectively investigating the data obtained from the patient files and the hospital automation registration system for pediatric trauma patients who applied to the Adult Emergency Department of the Ondokuz Mayıs University Medical Faculty between January 1, 2016, and December 31, 2016. Results: 289 of the 415 patients included in our study were male. The median age was 11.3 years. The most common trauma mechanism was a fall from height. A statistically significant association was found between trauma mechanism and gender. An increase in the number of trauma cases was found especially in the summer months. The study showed that thoracic and abdominal trauma was associated with increased mortality. Computerized tomography was the most common diagnostic imaging modality. The presence of subarachnoid hemorrhage increased the risk of mortality 62.3-fold. Eight of the patients (1.9%) died. Scoring systems were statistically significant predictors of mortality. Conclusion: Children are vulnerable to trauma because of their unique anatomical and physiological differences compared to adult patient groups. Mortality rates and the post-traumatic healing process can be improved by rapid prehospital triage to the most appropriate trauma centers, management of critical patients with scoring systems, and adherence to standard treatment protocols.

Keywords: emergency service, pediatric patients, scoring systems, trauma, age groups

Procedia PDF Downloads 199
2245 Corporate Social Responsibility for Multinational Enterprises to Gain Incomparable Advantage on the Long Run without Competition

Authors: Fatima Homor

Abstract:

A new era in business has started: according to the findings of my research paper, corporate social responsibility leads organizations to a phase of incomparable advantage, where competition is secondary and financial growth is a result. Those who join later lose their active advantage and cause passive disadvantage for their organizations. The main purpose of this presentation is to state the obvious and shed light on the advantages, for multinational enterprises, of doing good while doing well: extremely low fluctuation (preventing one of the highest costs), a significantly lower marketing budget, enhanced reputation leading to customer and supplier loyalty, and employee commitment resulting in higher motivation and better quality at each stage. Corporate Social Responsibility brings a Unique Selling Proposition incomparable to others. The paper is based on a large research project conducted for the University of Liverpool Master of Business Administration program, entitled 'Corporate Social Responsibility for Multinational Enterprises to Gain Incomparable Advantage'. The research draws on recent secondary data and, most importantly, on 25 interviews with Chief Executive Officers and/or Human Resources or corporate communications directors at Multinational Enterprises. The direct gains of Corporate Social Responsibility are analyzed when it is embedded into the core of the business. It is evident that project-based Corporate Social Responsibility is effective neither from the point of view of the supported cause and Non-governmental Organizations nor from the point of view of the organization's long-term sustainability. Surveys were conducted, data compared, and conclusions drawn. Corporate Social Responsibility must start inside the business in order to strengthen it: commit employees first; it must come from the Chief Executive Officer; it must be related to the business profile; and it has to be long-term. Committed employees will, in turn, commit customers. B-corps are coming (e.g., Unilever); the phenomenon of social enterprises has become a leading one.

Keywords: B-corps, embedded into core business, first inside, unique advantage

Procedia PDF Downloads 207
2244 Approach-Avoidance and Intrinsic-Extrinsic Motivation of Adolescent Computer Games Players

Authors: Monika Paleczna, Barbara Szmigielska

Abstract:

The period of adolescence is a time when young people become increasingly active and conscious users of the digital world. One of the activities they undertake most frequently is playing computer games. Young players can choose from a wide range of games, including action, adventure, strategy, and logic games. The main aim of this study is to answer the question of what motivates teenage players: what motivates young players to play computer games in general, and what motivates them to play a particular game. Fifty adolescents aged 15-17 participated in the study. They completed a questionnaire in which they indicated what motivates them to play, how often they play computer games, and what type of computer games they play most often. It was found that entertainment and learning English are among the most important motives. The most important game-specific factors are familiarity with a game's previous parts and the ability to play for free. The motives chosen by the players are described in relation to the concepts of intrinsic and extrinsic as well as approach and avoidance motivation. An additional purpose of this study is to present data concerning preferences regarding the types of games played and the amount of time spent playing.

Keywords: computer games, motivation, game preferences, adolescence

Procedia PDF Downloads 186
2243 Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison

Authors: Ramón Aparicio-García, Gustavo Juárez Gracia, Jesús Álvarez Cedillo

Abstract:

This work belongs to a research line of computer science involving the study of Human-Computer Interaction (HCI), which seeks to recognize and interpret user intent through the storage and subsequent analysis of the electrical signals of the brain, in order to use them in the control of electronic devices. In parallel, affective computing research applies human emotions to the HCI process, helping to reduce user frustration. This paper shows the results obtained during the hardware and software development of a Brain-Computer Interface (BCI) capable of recognizing human emotions through the association of brain electrical activity patterns. The hardware involves the sensing stage and analog-to-digital conversion. The interface software involves algorithms for pre-processing of the signal in time- and frequency-domain analysis and for the classification of patterns associated with electrical brain activity. The methods used for the analysis and classification of the signal were tested separately using a publicly accessible database, in addition to a comparison among classifiers to determine the best-performing one.
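As an illustration of the kind of pipeline the abstract describes (frequency-domain feature extraction followed by a comparison of classifiers), the following is a minimal, self-contained sketch. The synthetic 10 Hz/20 Hz "emotional state" signals, the alpha/beta band limits, and the two toy classifiers are assumptions for demonstration only, not the authors' actual implementation.

```python
import math
import random

def band_power(x, fs, f_lo, f_hi):
    """Naive DFT power of signal x in the band [f_lo, f_hi] Hz."""
    n = len(x)
    p = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            p += (re * re + im * im) / (n * n)
    return p

def make_epoch(rng, dominant_hz, fs=128, seconds=1.0):
    """Synthetic one-channel EEG epoch: a dominant rhythm plus noise."""
    n = int(fs * seconds)
    return [math.sin(2 * math.pi * dominant_hz * t / fs) + rng.gauss(0, 0.3)
            for t in range(n)]

def features(epoch, fs=128):
    # alpha (8-13 Hz) and beta (14-30 Hz) band power, a common feature pair
    return (band_power(epoch, fs, 8, 13), band_power(epoch, fs, 14, 30))

def nearest_centroid(train, labels, x):
    """Classify x by the closest per-class mean feature vector."""
    cents = {}
    for lab in set(labels):
        pts = [f for f, l in zip(train, labels) if l == lab]
        cents[lab] = tuple(sum(c) / len(pts) for c in zip(*pts))
    return min(cents, key=lambda lab: math.dist(cents[lab], x))

def one_nn(train, labels, x):
    """Classify x by its single nearest training neighbour."""
    i = min(range(len(train)), key=lambda j: math.dist(train[j], x))
    return labels[i]

rng = random.Random(0)
# class 0: alpha-dominant ("calm"), class 1: beta-dominant ("aroused")
data = [(features(make_epoch(rng, 10)), 0) for _ in range(20)] + \
       [(features(make_epoch(rng, 20)), 1) for _ in range(20)]
rng.shuffle(data)
train_set, test_set = data[:30], data[30:]
tx, ty = [f for f, _ in train_set], [l for _, l in train_set]

accuracies = {}
for name, clf in [("nearest-centroid", nearest_centroid), ("1-NN", one_nn)]:
    acc = sum(clf(tx, ty, f) == l for f, l in test_set) / len(test_set)
    accuracies[name] = acc
    print(name, acc)
```

In a real BCI, the hand-rolled DFT and toy classifiers would be replaced by an optimized FFT and the classifier families the authors compared, but the structure (epoch, features, classify, score) is the same.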

Keywords: affective computing, interface, brain, intelligent interaction

Procedia PDF Downloads 390
2242 Cities Simulation and Representation in Locative Games from the Perspective of Cultural Studies

Authors: B. A. A. Paixão, J. V. B. Gomide

Abstract:

This work aims to analyze the locative structure used by the locative games of the company Niantic. To fulfill this objective, a literature review on the representation and simulation of cities was carried out, interviews with Ingress players were conducted, and the game Ingress itself was played. Relating these data, it was possible to deepen the relationship between the virtual and the real in the simulation of cities and their cultural objects in locative games. The representation of cities associates geolocation, provided by the Global Positioning System (GPS), with augmented reality and digital images, and provides a new paradigm for interacting with the city, its parts, and real- and virtual-world elements, homeomorphic to the real world. A bibliographic review of papers related to the study of representation and simulation and their application in locative games was carried out and is presented in this paper. The concepts of city representation and simulation in locative games, and how this setting enables flow and immersion in urban space, are analyzed. Some examples of games are discussed for the development of this new setting, which is a mix of real and virtual worlds. Finally, a locative structure for electronic games is proposed, using the concepts of heterotrophic and isotropic representations conjoined with immediacy and hypermediacy.

Keywords: cities representation, cities simulation, games simulation, immersion, locative games

Procedia PDF Downloads 212
2241 The Applications of Wire Print in Composite Material Research and Fabrication Process

Authors: Hsu Yi-Chia, Hoy June-Hao

Abstract:

FDM (Fused Deposition Modeling) is a rapid prototyping method that requires no mold; however, high material and time costs have always been a major disadvantage. Wire printing is a next-generation technology that is more flexible, is easier to apply on a 3D printer or robotic arm, and enables new construction methods. The research is divided into three parts: first, the method of parameterizing the generated paths and converting them into g-code for wire printing; second, material experiments and the application of effects; third, the improvement of the operation of the mechanical equipment and the design of the robotic tool head. The purpose of this study is to develop a new wire-print method that can efficiently generate line segments and paths in three-dimensional space. Parametric modeling software transforms the digital model into g-code for a 3D printer or robotic arm; this article uses thermoplastic, clay, and composite materials for testing. The combination of materials and the wire-print process gives architects and designers the ability to research and develop works and construction methods in the future.
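The path-to-g-code conversion step the abstract mentions can be sketched as follows. This is a minimal illustration only: the feed rate, extrusion factor, and G-code flavor are assumed defaults, not the toolchain the authors built, and real values depend on the material (thermoplastic, clay, composite) and nozzle.

```python
import math

def path_to_gcode(points, feed=600, extrude_per_mm=0.05):
    """Convert a 3D polyline (in mm) into simple FDM-style G-code lines.

    feed (mm/min) and extrude_per_mm are illustrative placeholders.
    """
    lines = ["G21 ; units in millimetres", "G90 ; absolute positioning"]
    e = 0.0
    x0, y0, z0 = points[0]
    lines.append(f"G0 X{x0:.2f} Y{y0:.2f} Z{z0:.2f} ; travel to start")
    for x, y, z in points[1:]:
        d = math.dist((x0, y0, z0), (x, y, z))
        e += d * extrude_per_mm          # extrusion grows with path length
        lines.append(f"G1 X{x:.2f} Y{y:.2f} Z{z:.2f} E{e:.3f} F{feed}")
        x0, y0, z0 = x, y, z
    return lines

# example: a parametric helix, the kind of free-space path a wire print draws
helix = [(20 * math.cos(t / 10), 20 * math.sin(t / 10), 0.5 * t / 10)
         for t in range(0, 63)]
gcode = path_to_gcode(helix)
print("\n".join(gcode[:5]))
```

For a robotic arm, the same polyline would instead be emitted as the arm controller's motion commands; the parameterized path is the shared intermediate representation.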

Keywords: parametric software, wire print, robotic arms fabrication, composite filament additive manufacturing

Procedia PDF Downloads 132
2240 Systematic Review of Misconceptions: Tools for Diagnostics and Remediation Models for Misconceptions in Physics

Authors: Muhammad Iqbal, Edi Istiyono

Abstract:

Misconceptions are one of the problems in physics learning, occurring when students' understanding is not in line with scientific theory. The aim of this research is to identify diagnostic tools for detecting misconceptions and methods for remediating physics misconceptions. The articles reviewed come from the Scopus database and relate to physics misconceptions from 2013-2023. The articles obtained from the Scopus database were selected according to the PRISMA model, yielding 29 articles that focus on physics misconceptions, especially diagnostic tools and remediation methods. Currently, the most widely used diagnostic tool is the four-tier test, which is able to measure students' misconceptions in depth by establishing whether students are guessing; there is also a trend toward five-tier diagnostic tests, which additionally capture the source of students' information, so that the origin of students' misconceptions is known. Eleven ways of remediating student misconceptions were identified; one of the methods used is the digital practicum, which allows abstract concepts to be visualized concretely. This research is limited to identifying the tools used to diagnose misconceptions and the methods used to remediate them, so the size of the effect of remediation methods on misconceptions is not yet known. The researchers recommend that further research be carried out to determine the most appropriate method for remediating student misconceptions.

Keywords: misconception, remediation, systematic review, tools

Procedia PDF Downloads 40
2239 The Role of E-Learning in Science, Technology, Engineering, and Math Education

Authors: Annette McArthur

Abstract:

The traditional model of teaching and learning, where ICT sits as a separate entity, is not a model for a 21st-century school. It is imperative that teaching and learning embrace technological advancements. The challenge in schools lies in shifting the mindset of teachers so they see ICT as integral to their teaching, learning, and curriculum rather than as a separate e-learning curriculum stream. This research project investigates how the effective, planned, intentional integration of ICT into a STEM curriculum can enable this shift in teacher mindset. The project incorporated: • developing a professional coaching relationship with key STEM teachers; • facilitating staff professional development on student-centered, project-based learning pedagogy in the context of a STEM curriculum; • facilitating staff professional development on digital literacy; • establishing a professional community in which collaboration, sharing, and reflection were part of the culture of the STEM community; • facilitating classroom support for the effective delivery of an innovative STEM curriculum; • developing STEM learning spaces where technologies were used to empower and engage learners to participate in student-centered, project-based learning.

Keywords: e-learning, ICT, project based learning, STEM

Procedia PDF Downloads 302
2238 Value in Exchange: The Importance of Users Interaction as the Center of User Experiences

Authors: Ramlan Jantan, Norfadilah Kamaruddin, Shahriman Zainal Abidin

Abstract:

In this era of technology, the co-creation method has become a new development trend. In this light, most design businesses have now transformed their development strategy from goods-dominant to service-dominant, where more attention is given to end-users and their roles in the development process. As a result, the conventional development process has been replaced with a more cooperative one. Consequently, numerous studies have been conducted to explore the extension of the co-creation method in the design development process, and most have focused on issues found during the production process. In contrast, this study aims to investigate the potential values established during the pre-production process, also known as 'circumstances value creation'. User involvement at the entry level of the pre-production process is questioned and critically debated, as it is there that value-in-exchange arises in joint spheres and user experiences take place. This paper therefore proposes a potential framework of the co-creation method for Malaysian interactive product development. The framework is formulated from both parties involved: the users and the designers. It clearly explains the value of the co-creation method and could assist relevant design industries and companies in developing a blueprint for the design process. The paper further contributes to the literature on the co-creation of value and digital ecosystems.

Keywords: co-creation method, co-creation framework, co-creation, co-production

Procedia PDF Downloads 179
2237 A Simple and Efficient Method for Accurate Measurement and Control of Power Frequency Deviation

Authors: S. J. Arif

Abstract:

This paper presents a simple method for accurate measurement and control of power frequency deviation. The sinusoidal signal whose frequency deviation is to be measured is transformed to a low voltage level and passed through a zero-crossing detector to convert it into a pulse train. Another stable square-wave signal of 10 kHz is obtained using a crystal oscillator and decade dividing assemblies (DDA). These signals are combined digitally and then passed through decade counters to give a unique combination of pulses or levels, which are further encoded to make them equally suitable for both control applications and display units. The developed circuit, built from discrete components, has a resolution of 0.5 Hz and completes a measurement within 20 ms. The circuit is also simulated and synthesized using Verilog HDL and subsequently implemented on an FPGA. The measurement results on the FPGA, observed on a very high-resolution logic analyzer, accurately match both the simulation results and the results of the same circuit implemented with discrete components. The proposed system is suitable for accurate measurement and control of power frequency deviation.
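A software analogue of the measurement principle (a zero-crossing detector gated against a stable reference) can be sketched as below. This sampled-signal version with sub-sample interpolation is an illustration only; it is not the discrete-component/FPGA circuit the paper describes, and the 50 Hz nominal, 10 kHz sampling rate, and 1 s window are assumptions.

```python
import math

def rising_crossing_times(samples, fs):
    """Times of negative-to-positive zero crossings, the digital counterpart
    of a zero-crossing detector, with linear sub-sample interpolation."""
    times = []
    for i in range(1, len(samples)):
        a, b = samples[i - 1], samples[i]
        if a < 0 <= b:
            frac = -a / (b - a)          # interpolate the exact crossing instant
            times.append((i - 1 + frac) / fs)
    return times

def frequency_deviation(signal, fs, nominal=50.0):
    """Measured frequency minus nominal, from whole cycles between crossings."""
    t = rising_crossing_times(signal, fs)
    cycles = len(t) - 1
    return cycles / (t[-1] - t[0]) - nominal

# a 50.5 Hz test tone sampled at 10 kHz for 1 s (0.5 Hz above nominal)
fs = 10_000
sig = [math.sin(2 * math.pi * 50.5 * n / fs + 0.1) for n in range(fs)]
print(round(frequency_deviation(sig, fs), 3))  # ≈ 0.5
```

The hardware version achieves its 20 ms measurement time by counting against the 10 kHz reference directly rather than averaging over a long window; the sketch trades speed for simplicity.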

Keywords: digital encoder for frequency measurement, frequency deviation measurement, measurement and control systems, power systems

Procedia PDF Downloads 378
2236 Impact of Lack of Testing on Patient Recovery in the Early Phase of COVID-19: Narratively Collected Perspectives from a Remote Monitoring Program

Authors: Nicki Mohammadi, Emma Reford, Natalia Romano Spica, Laura Tabacof, Jenna Tosto-Mancuso, David Putrino, Christopher P. Kellner

Abstract:

Introduction: The onset of the COVID-19 pandemic created an unprecedented need for the rapid development, distribution, and application of infection testing. However, despite the impressive mobilization of resources, individuals had extremely limited access to tests, particularly during the initial months of the pandemic (March-April 2020) in New York City (NYC). Access to COVID-19 testing is crucial to understanding patients' illness experiences and integral to the development of COVID-19 standard-of-care protocols, especially in the context of overall access to healthcare resources. Methods: 18 patients in a COVID-19 remote patient monitoring program (Precision Recovery, within the Mount Sinai Health System) were interviewed regarding their experience with COVID-19 during the first wave (March-May 2020) of the COVID-19 pandemic in New York City. Patients were asked about their experiences navigating COVID-19 diagnoses, the health care system, and their recovery process. Transcribed interviews were analyzed for thematic codes, using grounded theory to guide the identification of emergent themes and codebook development through an iterative process. Data coding was performed using NVivo 12. References for the domain 'testing' were then extracted and analyzed for themes and statistical patterns. Findings: 100% of participants (18/18) referenced COVID-19 testing in their interviews, with a total of 79 references across the 18 transcripts (average: 4.4 references/interview; 2.7% interview coverage). 89% of participants (16/18) discussed the difficulty of access to testing, including denial of testing without high symptom severity, geographical distance to the testing site, and lack of testing resources at healthcare centers. Participants shared varying perspectives on how the lack of certainty regarding their COVID-19 status affected their course of recovery. One participant shared that, because she never tested positive, she was shielded from anxiety and fear, given the death toll in NYC. Another group of participants shared that not having a concrete status to share with family, friends, and professionals affected how seriously onlookers took their symptoms. Furthermore, the absence of a positive test barred some individuals from access to treatment programs and employment support. Conclusion: Lack of access to COVID-19 testing in the first wave of the pandemic in NYC was a prominent element of patients' illness experience, particularly during their recovery phase. While for some the lack of concrete results was protective, most emphasized the invalidating effect it had on the perception of their illness by both self and others. COVID-19 testing is now widely accessible; however, those who are unable to demonstrate a positive test result but who are still presumed to have had COVID-19 in the first wave must continue to live with the effects of this gap in knowledge and care on their recovery. Future efforts are required to ensure that patients do not face barriers to care due to the lack of testing and are reassured regarding their access to healthcare. Affiliations: 1. Department of Neurosurgery, Icahn School of Medicine at Mount Sinai, New York, NY; 2. Abilities Research Center, Department of Rehabilitation and Human Performance, Icahn School of Medicine at Mount Sinai, New York, NY.

Keywords: accessibility, COVID-19, recovery, testing

Procedia PDF Downloads 196
2235 A Discourse Study of Multimodal Intertextuality in Egyptian Social Media Memes

Authors: Ola Hafez

Abstract:

This study examines the way selected Egyptian digitally mediated memes utilize intertextuality as a means of expression. It is motivated by the emerging digital socio-political humorous practice of using various forms of political commentary in Egyptian social media. One of these forms involves memes incorporating (often doctored) video frames taken from Egyptian plays, films, and songs and relocated in a different socio-political context, often with a caption that re-appropriates the frame for the purpose of critical commentary, thus juxtaposing the socio-political phenomenon being addressed with the Egyptian artistic and cultural heritage. The paper presents a discourse study of a convenience sample from a recent social media campaign and carries out two levels of analysis. At the micro level, the study pinpoints the various modes of intertextuality employed, both verbal and visual, in light of Kress and van Leeuwen's work on social semiotics. At the macro level, the paper sheds light on the socio-political implications of such practice from the perspective of Political Discourse Analysis.

Keywords: digitally mediated discourse, discourse analysis, Egyptian Arabic, intertextuality, memes, multimodality, political discourse analysis

Procedia PDF Downloads 218
2234 Motivation and Criteria as Determinant Factors in Accepting New Talents on User-Generated Content (UGC): Youtube as a Platform

Authors: Shereen Nadira Binti Jasney, Mohd Syuhaidi Bin Abu Bakar, Hafizah Binti Rosli

Abstract:

This quantitative study explored the factors that motivate the public to use YouTube and the criteria the public looks for when accepting new talents on User-Generated Content (UGC). There is a mass of input on the net, but the public remains very selective in accepting new talents; it is therefore important to identify the determinant factors that contribute to the acceptance of new talents on UGC. A total of 236 respondents participated in this study, selected through simple random sampling, and their responses were analyzed with descriptive analysis. The findings advocate that the tremendous expansion and diversification of the music YouTube offers are the main factors that motivate public viewers to use YouTube when accepting new talents. It was also found that being relatable while providing interesting content, including the artist's name and song title in the video title, and the number of views and likes of the video are some of the criteria the public looks for in accepting new talents on UGC. This paper introduces YouTube as a means of discovering new talents in the music industry for the public, especially the younger generations, who are actively engaged with the current digital landscape.

Keywords: motivation, criteria, new talents, UGC, YouTube

Procedia PDF Downloads 289
2233 Track and Evaluate Cortical Responses Evoked by Electrical Stimulation

Authors: Kyosuke Kamada, Christoph Kapeller, Michael Jordan, Mostafa Mohammadpour, Christy Li, Christoph Guger

Abstract:

Cortico-cortical evoked potentials (CCEPs) are responses generated by cortical electrical stimulation and recorded at distant brain sites. These responses provide insights into the functional networks associated with language or motor functions, and, in the context of epilepsy, they can reveal pathological networks. Locating the origin and spread of seizures within the cortex is crucial for pre-surgical planning. This process can be enhanced by employing cortical stimulation at the seizure onset zone (SOZ), leading to the generation of CCEPs in remote brain regions that may be targeted for disconnection. In the case of a 24-year-old male patient suffering from intractable epilepsy, a corpus callosotomy was performed as part of the treatment. DTI imaging, conducted using a 3T MRI scanner for fiber tracking, was used along with CCEPs as part of the assessment for surgical planning. Stimulation of the SOZ with alternating monophasic pulses of 300 µs duration and 15 mA current intensity resulted in CCEPs on the contralateral frontal cortex, reaching a peak amplitude of 206 µV with a latency of 31 ms, specifically in the left pars triangularis. The related fiber tracts were identified with a two-tensor unscented Kalman filter (UKF) technique, showing transversal fibers through the corpus callosum. The CCEPs were monitored throughout the surgery. Notably, the SOZ-associated CCEPs were reduced following the resection of the anterior portion of the corpus callosum, which reached the identified connecting fibers. This intervention demonstrated a potential strategy for mitigating the impact of intractable epilepsy through targeted disconnection of identified cortical regions.
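Extracting a peak amplitude and latency of the kind reported above from an averaged CCEP epoch can be sketched as follows. The sampling rate, the artifact-skip window, and the synthetic Gaussian response shape are assumptions for illustration; this is not the authors' analysis code.

```python
import math

def ccep_peak(avg_epoch_uv, fs, skip_ms=10.0):
    """Peak absolute amplitude (µV) and its latency (ms) in an averaged epoch,
    skipping an initial window that contains the stimulation artifact."""
    start = int(skip_ms / 1000 * fs)
    idx = max(range(start, len(avg_epoch_uv)), key=lambda i: abs(avg_epoch_uv[i]))
    return abs(avg_epoch_uv[idx]), idx / fs * 1000

# synthetic averaged response: a 206 µV Gaussian deflection peaking at 31 ms,
# mimicking the values reported for the left pars triangularis
fs = 2000                                   # Hz, assumed sampling rate
epoch = [206 * math.exp(-(((n / fs) - 0.031) / 0.005) ** 2)
         for n in range(int(0.2 * fs))]
amp, lat = ccep_peak(epoch, fs)
print(f"{amp:.0f} µV at {lat:.0f} ms")      # 206 µV at 31 ms
```

On real recordings, the epoch would first be averaged over many stimulation trials to suppress background EEG before the peak search is applied.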

Keywords: CCEP, SOZ, Corpus callosotomy, DTI

Procedia PDF Downloads 71