Search results for: medical data visualization
27433 Can 3D Virtual Prototyping Conquer the Apparel Industry?
Authors: Evridiki Papachristou, Nikolaos Bilalis
Abstract:
Imagine an apparel industry where fashion design does not begin with a paper-and-pen drawing that is then translated into a pattern and later into a 3D model in which the designer tries out different fabrics, colours and contrasts. Instead, imagine a fashion designer of the future who produces that initial fashion drawing in a three-dimensional space and does not leave that environment until the product is done, communicating his/her ideas with the entire development team in true-to-life 3D. Three-dimensional (3D) technology, while well established in many other industrial sectors like automotive, aerospace, architecture and industrial design, has only just started to open up a whole range of new opportunities for apparel designers. The paper will discuss the process of 3D simulation technology enhanced by high-quality visualization of data and its capability to ensure strong competitiveness in the market. Secondly, it will underline the most frequent problems and challenges that occur in the process chain when various partners in the production of textiles and apparel work together. Finally, it will offer a perspective on how virtual prototyping technology will change the global textile and apparel industry to a level where designs are visualized on a computer and various scenarios modeled without even having to produce a physical prototype. This state-of-the-art 3D technology has been described as transformative and "disruptive" compared to the way apparel companies develop their fashion products today. It provides the benefit of virtual sampling, not only for quick testing of design ideas but also for reducing process steps and gaining more visibility: a so-called "digital asset" that can be used for other purposes such as merchandising or marketing.
Keywords: 3D visualization, apparel, virtual prototyping, prototyping technology
Procedia PDF Downloads 592

27432 GeneNet: Temporal Graph Data Visualization for Gene Nomenclature and Relationships
Authors: Jake Gonzalez, Tommy Dang
Abstract:
This paper proposes a temporal graph approach to visualize and analyze the evolution of gene relationships and nomenclature over time. An interactive web-based tool implements this temporal graph, enabling researchers to traverse a timeline and observe coupled dynamics in network topology and naming conventions. Analysis of a real human genomic dataset reveals the emergence of densely interconnected functional modules over time, representing groups of genes involved in key biological processes. For example, the antimicrobial peptide DEFA1A3 shows increased connections to related alpha-defensins involved in infection response. Tracking degree and betweenness centrality shifts over timeline iterations also quantitatively highlights the reprioritization of certain genes' topological importance as knowledge advances. Examination of the CNR1 gene encoding the cannabinoid receptor CB1 demonstrates changing synonymous relationships and consolidating naming patterns over time, reflecting the discovery of its unique functional role. The integrated framework interconnecting these topological and nomenclature dynamics provides richer contextual insights than isolated analysis methods. Overall, this temporal graph approach enables a more holistic study of knowledge evolution to elucidate complex biology.
Keywords: temporal graph, gene relationships, nomenclature evolution, interactive visualization, biological insights
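The centrality tracking described above can be sketched in a few lines of Python with NetworkX; the yearly snapshots and edges below are hypothetical stand-ins, not data from the study:

```python
import networkx as nx

# Hypothetical yearly snapshots of a gene-relationship network:
# each edge links two gene symbols known to be related at that time.
snapshots = {
    2010: [("DEFA1A3", "DEFA4"), ("CNR1", "CNR2")],
    2015: [("DEFA1A3", "DEFA4"), ("DEFA1A3", "DEFA5"), ("CNR1", "CNR2")],
    2020: [("DEFA1A3", "DEFA4"), ("DEFA1A3", "DEFA5"),
           ("DEFA1A3", "DEFA6"), ("CNR1", "CNR2"), ("CNR1", "FAAH")],
}

for year, edges in sorted(snapshots.items()):
    g = nx.Graph(edges)
    deg = nx.degree_centrality(g)
    btw = nx.betweenness_centrality(g)
    # Track how a gene's topological importance shifts over the timeline.
    print(year, f"DEFA1A3 degree={deg.get('DEFA1A3', 0):.2f}",
          f"betweenness={btw.get('DEFA1A3', 0):.2f}")
```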
Procedia PDF Downloads 62

27431 Design and Development of a Computerized Medical Record System for Hospitals in Remote Areas
Authors: Grace Omowunmi Soyebi
Abstract:
A computerized medical record system is a collection of medical information about a person that is stored on a computer. One principal problem of most hospitals in rural areas is the use of a file management system for keeping records. A lot of time is wasted when a patient visits the hospital, probably in an emergency, and the nurse or attendant has to search through voluminous files before the patient's file can be retrieved; this delay may cause something unexpected to happen to the patient. The application is to be designed using a structured system analysis and design method, which will support a well-articulated analysis of the existing file management system, a feasibility study, and proper documentation of the design and implementation of a computerized medical record system. This computerized system will replace the file management system and help to quickly retrieve a patient's record with increased data security, provide access to clinical records for decision-making, and reduce the time it takes for a patient to be attended to.
Keywords: programming, data, software development, innovation
Procedia PDF Downloads 88

27430 Still Pictures for Learning Foreign Language Sounds
Authors: Kaoru Tomita
Abstract:
This study explores how visual information helps us learn foreign language pronunciation. Visual assistance and its effect on foreign language learning have been discussed widely. For example, simplified illustrations in textbooks are used to tell learners which parts of the articulatory organs are used for pronouncing sounds. Vowels are put into a chart that depicts a vowel space. Consonants are put into a table with two axes, place and manner of articulation. When comparing a still picture and a moving picture for visualizing learners' pronunciation, it becomes clear that the former works better than the latter. The visualization of vowels was applied to class activities in which native and non-native speakers' English was compared and the learners' feedback was collected: the positions of six vowels did not scatter as much as they were expected to. Specifically, two vowels were not discriminated and were arranged very close together in the vowel space. It was surprising for the author to find that learners liked analyzing their own pronunciation by linking the first and second formants (F1 and F2) on a sheet of paper with a pencil. Even a simple method works well if it leads learners to think about their pronunciation analytically.
Keywords: feedback, pronunciation, visualization, vowel
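The vowel-space visualization described here is straightforward to reproduce; a minimal Python sketch follows, with illustrative placeholder formant values rather than measurements from the study:

```python
import matplotlib.pyplot as plt

# Illustrative F1/F2 values (Hz) for a learner's vowels; real values
# would come from acoustic analysis of recorded speech.
vowels = {"i": (300, 2300), "e": (450, 2100), "a": (750, 1300),
          "o": (500, 900), "u": (350, 800), "ae": (650, 1700)}

for label, (f1, f2) in vowels.items():
    plt.scatter(f2, f1)
    plt.annotate(label, (f2, f1))

# Conventional vowel-space orientation: F2 decreases left to right and
# F1 increases top to bottom, so the chart mirrors articulation.
plt.gca().invert_xaxis()
plt.gca().invert_yaxis()
plt.xlabel("F2 (Hz)")
plt.ylabel("F1 (Hz)")
plt.title("Learner vowel space")
plt.show()
```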
Procedia PDF Downloads 253

27429 Mining User-Generated Contents to Detect Service Failures with Topic Model
Authors: Kyung Bae Park, Sung Ho Ha
Abstract:
Online user-generated content (UGC) significantly changes the way customers behave (e.g., shop, travel), and handling the overwhelming amount of varied UGC is one of the paramount issues for management. However, current approaches (e.g., sentiment analysis) are often ineffective at leveraging textual information to detect the problems or issues from which a given business suffers. In this paper, we apply text mining with Latent Dirichlet Allocation (LDA) to a popular online review site dedicated to user complaints. We find that LDA efficiently detects customer complaints, and further inspection with visualization techniques is effective for categorizing the problems or issues. As such, management can identify the issues at stake and prioritize them in a timely manner given a limited amount of resources. The findings provide managerial insights into how analytics on social media can help maintain and improve reputation management. Our interdisciplinary approach also highlights several insights gained by applying machine learning techniques in the marketing research domain. On a broader technical note, this paper illustrates the details of how to implement LDA in R from beginning (data collection in R) to end (LDA analysis in R), since the procedure is still largely undocumented. In this regard, it will help lower the barrier for interdisciplinary researchers to conduct related research.
Keywords: latent Dirichlet allocation, R program, text mining, topic model, user-generated contents, visualization
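The paper documents an R workflow; as a rough equivalent, a minimal Python/scikit-learn sketch of the LDA step looks like this (the complaint texts are hypothetical):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical complaint texts standing in for scraped reviews.
docs = ["delivery was late and support never answered",
        "refund still not processed after two weeks",
        "package arrived damaged, courier was rude",
        "billing charged me twice, no response from support"]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

# Print the top words per topic to label the complaint categories.
terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {k}: {', '.join(top)}")
```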
Procedia PDF Downloads 187

27428 A Web-Based Systems Immunology Toolkit Allowing the Visualization and Comparative Analysis of Publicly Available Collective Data to Decipher Immune Regulation in Early Life
Authors: Mahbuba Rahman, Sabri Boughorbel, Scott Presnell, Charlie Quinn, Darawan Rinchai, Damien Chaussabel, Nico Marr
Abstract:
Collections of large-scale datasets made available in public repositories can be used to identify and fill gaps in biomedical knowledge. But first, these data need to be made readily accessible to researchers for analysis and interpretation. Here, a collection of transcriptome datasets was made available to investigate the functional programming of human hematopoietic cells in early life. Thirty-two datasets were retrieved from the NCBI Gene Expression Omnibus (GEO) and loaded into a custom, interactive web application called the Gene Expression Browser (GXB), designed for visualization and query of integrated large-scale data. Multiple sample groupings and gene rank lists were created based on the study design and variables in each dataset. Web links to customized graphical views can be generated by users and subsequently used to graphically present data in manuscripts for publication. The GXB tool also enables browsing of a single gene across datasets, which can provide information on the role of a given molecule across biological systems. The dataset collection is available online. As a proof of principle, one of the datasets (GSE25087) was re-analyzed to identify genes that are differentially expressed by regulatory T cells in early life. Re-analysis of this dataset and a cross-study comparison using multiple other datasets in the above-mentioned collection revealed that PMCH, a gene encoding a precursor of melanin-concentrating hormone (MCH), a cyclic neuropeptide, is highly expressed in a variety of other hematopoietic cell types, including neonatal erythroid cells as well as plasmacytoid dendritic cells upon viral infection. Our findings suggest an as yet unrecognized role of MCH in immune regulation, thereby highlighting the unique potential of the curated dataset collection and systems biology approach to generate new hypotheses that can be tested in future mechanistic studies.
Keywords: early life, GEO datasets, PMCH, interactive query, systems biology
Procedia PDF Downloads 297

27427 Applying Hybrid Graph Drawing and Clustering Methods on Stock Investment Analysis
Authors: Mouataz Zreika, Maria Estela Varua
Abstract:
Stock investment decisions are often made based on current events in the global economy and the analysis of historical data. Visual representation can assist investors in gaining a deeper understanding of, and better insight into, stock market trends more efficiently. The trend analysis is based on long-term data collection. The study adopts a hybrid method that combines a clustering algorithm and a force-directed algorithm to overcome the scalability problem when visualizing large data. This method reveals the potential relationships between stocks, as well as the degree of strength and connectivity of those relationships, providing investors with another view of stock relationships for reference. Information derived from the visualization will also help them make informed decisions. The results of the experiments show that the proposed method is able to produce aesthetically pleasing visualizations with clearer views of connectivity and edge weights.
Keywords: clustering, force-directed, graph drawing, stock investment analysis
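A minimal sketch of the hybrid idea, building a correlation graph over hypothetical returns and drawing it with a force-directed (spring) layout; the tickers and data are invented, and the paper's own clustering step is not reproduced here:

```python
import numpy as np
import networkx as nx
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Hypothetical daily returns for six stocks (rows = days).
returns = rng.normal(size=(250, 6))
tickers = ["AAA", "BBB", "CCC", "DDD", "EEE", "FFF"]

corr = np.corrcoef(returns.T)
g = nx.Graph()
for i in range(len(tickers)):
    for j in range(i + 1, len(tickers)):
        if abs(corr[i, j]) > 0.05:          # keep only stronger links
            g.add_edge(tickers[i], tickers[j], weight=abs(corr[i, j]))

# Force-directed layout: strongly correlated stocks cluster together.
pos = nx.spring_layout(g, weight="weight", seed=0)
widths = [5 * g[u][v]["weight"] for u, v in g.edges]
nx.draw(g, pos, with_labels=True, width=widths)
plt.show()
```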
Procedia PDF Downloads 302

27426 A Small Graphic Lie: The Photographic Quality of Pierre Bourdieu's Correspondence Analysis
Authors: Lene Granzau Juel-Jacobsen
Abstract:
The problem of beautification is an obvious concern of photography, which claims reference to reality, but it also lies at the very heart of social theory. As we become accustomed to sophisticated visualizations of statistical data in pace with the development of software programs, we should not only be inclined to ask new types of research questions; we also need to confront social theories based on such visualization techniques with new types of questions. Correspondence analysis, GIS analysis, social network analysis, and perceptual maps are current examples of visualization techniques popular within the social sciences and neighboring disciplines. This article discusses correspondence analysis, arguing that the graphic plot of correspondence analysis is to be interpreted much like a photograph. It refers no more evidently or univocally to reality than a photograph does, representing social life no more truthfully than a photograph documents it. Pierre Bourdieu's theoretical corpus, especially his theory of fields, relies heavily on correspondence analysis. While much attention has been directed towards critiquing the somewhat vague conceptualization of habitus, limited focus has been placed on the equally problematic concepts of social space and field. Based on a re-reading of Distinction, the article argues that these concepts rely on 'a small graphic lie' very similar to a photograph. Like any other piece of art, as Bourdieu himself recognized, the graphic display is a politically and morally loaded representation technique. However, correspondence analysis does not necessarily serve the purpose he intended. In fact, it tends towards the pitfalls he strove to overcome.
Keywords: data visualization, correspondence analysis, Bourdieu, field, visual representation
Procedia PDF Downloads 68

27425 Computer-Aided Diagnosis of Eyelid Skin Tumors Using Machine Learning
Authors: Ofira Zloto, Ofir Fogel, Eyal Klang
Abstract:
Purpose: The aim is to develop an automated framework based on machine learning to diagnose malignant eyelid skin tumors. Methods: This study utilized eyelid lesion images from Sheba Medical Center, a large tertiary center in Israel. Before model training, we pre-trained our models on the ISIC 2019 dataset consisting of 25,332 images. The proprietary eyelid dataset was then used for fine-tuning. The dataset contained multiple images per patient, with the aim of classifying malignant lesions against their benign counterparts. Results: The analyzed dataset consisted of images representing both benign and malignant eyelid lesions. For the benign category, a total of 373 images were sourced; the malignant category had 186 images. Based on the accuracy values, the model trained for 3 epochs with a learning rate of 0.0001 exhibited the best performance, achieving an accuracy of 0.748 with a standard deviation of 0.034. At a sensitivity of 69%, the model has a corresponding specificity of 82%. To further understand the decision-making process of our model, we employed heatmap visualization techniques, specifically Gradient-weighted Class Activation Mapping. Discussion: This study introduces a dependable model-aided diagnostic technology for assessing eyelid skin lesions. The model demonstrated accuracy comparable to human evaluation, effectively determining whether a lesion raises a high suspicion of malignancy or is benign. Such a model has the potential to alleviate the burden on the healthcare system, particularly benefiting rural areas and enhancing the efficiency of clinicians and overall healthcare.
Keywords: machine learning, eyelid skin tumors, decision-making process, heatmap visualization techniques
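The abstract does not name the network architecture; the sketch below illustrates the general fine-tuning recipe with a stand-in ImageNet-pretrained ResNet-18 and the 3-epoch/0.0001 learning-rate setting the abstract reports (the data loader is assumed):

```python
import torch
import torch.nn as nn
from torchvision import models

# Sketch of the two-stage transfer: a backbone assumed to have been
# pre-trained on ISIC images is fine-tuned on the eyelid dataset.
model = models.resnet18(weights="IMAGENET1K_V1")  # stand-in backbone
model.fc = nn.Linear(model.fc.in_features, 2)     # benign vs. malignant

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # lr from the abstract
criterion = nn.CrossEntropyLoss()

def train_epoch(loader):
    model.train()
    for images, labels in loader:   # loader yields (B, 3, H, W) image batches
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

# The abstract reports best results with 3 epochs at lr = 0.0001:
# for epoch in range(3): train_epoch(eyelid_loader)
```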
Procedia PDF Downloads 4

27424 An Overview of College English Writing Teaching Studies in China Between 2002 and 2022: Visualization Analysis Based on CiteSpace
Authors: Yang Yiting
Abstract:
This paper employs CiteSpace to conduct a visualization analysis of the literature on college English writing teaching published in core journals from the CNKI database and CSSCI journals between 2002 and 2022. It aims to explore the characteristics of, and future directions for, research on college English writing teaching. The present study yielded the following major findings: the field primarily focuses on innovative writing teaching models and methods, the integration of traditional classroom teaching and information technology, and instructional strategies to enhance students' writing skills. Future research is anticipated to involve a hybrid writing teaching approach combining online and offline teaching methods, leveraging the 'Internet+' digital platform, with the aim of elevating students' writing proficiency. This paper also presents a prospective outlook for college English writing teaching research in China.
Keywords: CiteSpace, college English, writing teaching, visualization analysis
Procedia PDF Downloads 72

27423 Assessing the Incapacity of Indonesian Aviators' Medical Conditions in 2016-2017
Authors: Ferdi Afian, Inne Yuliawati
Abstract:
Background: The shift in causes of death from infectious diseases to non-communicable diseases has also occurred in the aviation community in Indonesia. Non-communicable diseases are influenced by several internal risk factors, such as age, lifestyle changes, and the presence of other diseases. These risk factors increase the incidence of heart disease, resulting in the incapacity of Indonesian aviators, which disrupts flight safety. Method: The study was conducted by collecting secondary data, retrieved from medical records at the Indonesian Aviation Health Center in 2016-2017. The subjects in this study were all cases of incapacity due to medical conditions among Indonesian aviators. Results: In this study, there were 15 cases of aviators in Indonesia who experienced incapacity due to medical conditions related to heart and lung diseases in 2016-2017. Based on the secondary data contained in the flight medical records at the Aviation Health Center, several factors were found to be related to aviator incapacity, causing the inability to carry out flight duties. Conclusion: Incapacity due to medical conditions among Indonesian aviators is most strongly associated with a high Body Mass Index (86%) and less strongly with high blood uric acid (26%) and hyperglycemia (26%).
Keywords: incapacity, aviators, flight, Indonesia
Procedia PDF Downloads 135

27422 Legal Judgment Prediction through Indictments via Data Visualization in Chinese
Authors: Kuo-Chun Chien, Chia-Hui Chang, Ren-Der Sun
Abstract:
Legal Judgment Prediction (LJP) is a subtask of legal AI. Its main purpose is to use the facts of a case to predict the judgment result. In Taiwan's criminal procedure, when prosecutors complete the investigation of a case, they decide whether to prosecute the suspect and which article of criminal law should be applied, based on the facts and evidence of the case. In this study, we collected 305,240 indictments from the public inquiry system of the procuratorate of the Ministry of Justice, which included 169 charges and 317 articles from 21 laws. We take the crime facts in the indictments as the main input to jointly learn a prediction model for law source, article, and charge simultaneously, based on the pre-trained BERT model. For single-article cases where the frequencies of the charge and article are greater than 50, the prediction performance for law sources, articles, and charges reaches 97.66, 92.22, and 60.52 macro-F1, respectively. To understand the big performance gap between articles and charges, we used a bipartite graph to visualize the relationship between articles and charges and found that the poor prediction performance was actually due to wording precision. Some charges use the simplest words, while others may include the perpetrator or the result to make the charge more specific. For example, Article 284 of the Criminal Law may be indicted as "negligent injury", "negligent death", "business injury", "driving business injury", or "non-driving business injury". As another example, Article 10 of the Drug Hazard Control Regulations can be charged as "Drug Control Regulations" or "Drug Hazard Control Regulations". In order to solve the above problems and predict the article and charge more accurately, we plan to include the article content or charge names in the input and to use the sentence-pair classification method for question-answer problems in the BERT model to improve performance. We will also consider a sequence-to-sequence approach to charge prediction.
Keywords: legal judgment prediction, deep learning, natural language processing, BERT, data visualization
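The planned sentence-pair formulation can be sketched with the Hugging Face transformers library; the checkpoint, example texts, and label semantics below are illustrative assumptions, not the paper's released model:

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# A generic Chinese BERT checkpoint as a plausible stand-in.
name = "bert-base-chinese"
tokenizer = BertTokenizer.from_pretrained(name)
model = BertForSequenceClassification.from_pretrained(name, num_labels=2)

# Sentence-pair classification: does this charge name fit the crime facts?
facts = "被告人因过失驾驶致人受伤"   # crime facts (illustrative)
charge = "过失伤害"                 # candidate charge name ("negligent injury")
inputs = tokenizer(facts, charge, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits
# Assumed label meaning: P(charge does not fit), P(charge fits).
print(logits.softmax(dim=-1))
```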
Procedia PDF Downloads 122

27421 Design and Development of Data Mining Application for Medical Centers in Remote Areas
Authors: Grace Omowunmi Soyebi
Abstract:
Data mining is the extraction of information from a large database, which helps in predicting a trend or behavior, thereby helping management make knowledge-driven decisions. One principal problem of most hospitals in rural areas is the use of a file management system for keeping records. A lot of time is wasted when a patient visits the hospital, probably in an emergency, and the nurse or attendant has to search through voluminous files before the patient's file can be retrieved; this delay may cause something unexpected to happen to the patient. This data mining application is to be designed using a structured system analysis and design method, which will support a well-articulated analysis of the existing file management system, a feasibility study, and proper documentation of the design and implementation of a computerized medical record system. This computerized system will replace the file management system and help to easily retrieve a patient's record with increased data security, provide access to clinical records for decision-making, and reduce the time it takes for a patient to be attended to.
Keywords: data mining, medical record system, systems programming, computing
Procedia PDF Downloads 210

27420 Geovisualisation for Defense Based on a Deep Learning Monocular Depth Reconstruction Approach
Authors: Daniel R. dos Santos, Mateus S. Maldonado, Estevão J. R. Batista
Abstract:
Military commanders are increasingly dependent on spatial awareness: knowing where the enemy is, understanding how battle scenarios change over time, and visualizing these trends in ways that offer insights for decision-making. Thanks to advancements in geospatial technologies and artificial intelligence algorithms, commanders are now able to modernize military operations on a universal scale. Thus, geovisualisation has become an essential asset in the defense sector. It has become indispensable for better decision-making in dynamic/temporal scenarios, operation planning and management in the field, situational awareness, effective planning, monitoring, and more. For example, a 3D visualization of battlefield data contributes to intelligence analysis, evaluation of post-mission outcomes, and the creation of predictive models to enhance decision-making and strategic planning capabilities. However, old-school visualization methods are slow, expensive, and unscalable. Despite modern technologies for generating 3D point clouds, such as LiDAR and stereo sensors, monocular depth estimation based on deep learning can offer a faster and more detailed view of the environment, transforming single images into visual information for valuable insights. We propose a dedicated monocular depth reconstruction approach via deep learning techniques for 3D geovisualisation of satellite images. It introduces scalability in terrain reconstruction and data visualization. First, a dataset with more than 7,000 satellite images and associated digital elevation models (DEM) is created. It is based on high-resolution optical and radar imagery collected from Planet and Copernicus, with which we fuse high-resolution topographic data obtained using technologies such as LiDAR, along with the associated geographic coordinates. Second, we developed an imagery-DEM fusion strategy that combines feature maps from two encoder-decoder networks: one network is trained with radar and optical bands, while the other is trained with DEM features, to compute dense 3D depth. Finally, we constructed a benchmark with sparse depth annotations to facilitate future research. To demonstrate the proposed method's versatility, we evaluated its performance on non-annotated satellite images and implemented an enclosed environment useful for geovisualisation applications. The algorithms were developed in Python 3, employing open-source computing libraries, i.e., Open3D, TensorFlow, and PyTorch3D. The proposed method supports fast and accurate decision-making with GIS for localization of troops, position of the enemy, and terrain and climate conditions. This analysis enhances situational awareness, enabling commanders to fine-tune strategies and distribute resources proficiently.
Keywords: depth, deep learning, geovisualisation, satellite images
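The two-branch fusion strategy can be sketched in PyTorch; the layer sizes and channel counts below are invented for illustration and are far smaller than a real imagery-DEM network:

```python
import torch
import torch.nn as nn

class EncoderDecoder(nn.Module):
    """Tiny encoder-decoder producing a feature map at input resolution."""
    def __init__(self, in_ch):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(in_ch, 32, 3, stride=2, padding=1), nn.ReLU())
        self.dec = nn.Sequential(nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU())
    def forward(self, x):
        return self.dec(self.enc(x))

class FusionDepth(nn.Module):
    """Fuses features from an imagery branch and a DEM branch into dense depth."""
    def __init__(self):
        super().__init__()
        self.img_branch = EncoderDecoder(in_ch=4)   # e.g., optical + radar bands
        self.dem_branch = EncoderDecoder(in_ch=1)   # elevation channel
        self.head = nn.Conv2d(32, 1, 1)             # 16 + 16 fused channels -> depth
    def forward(self, img, dem):
        fused = torch.cat([self.img_branch(img), self.dem_branch(dem)], dim=1)
        return self.head(fused)

model = FusionDepth()
depth = model(torch.randn(1, 4, 128, 128), torch.randn(1, 1, 128, 128))
print(depth.shape)  # torch.Size([1, 1, 128, 128])
```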
Procedia PDF Downloads 13

27419 Design and Testing of Electrical Capacitance Tomography Sensors for Oil Pipeline Monitoring
Authors: Sidi M. A. Ghaly, Mohammad O. Khan, Mohammed Shalaby, Khaled A. Al-Snaie
Abstract:
Electrical capacitance tomography (ECT) is a valuable, non-invasive technique used to monitor multiphase flow processes, especially within industrial pipelines. This study focuses on the design, testing, and performance comparison of ECT sensors configured with 8, 12, and 16 electrodes, aiming to evaluate their effectiveness in imaging accuracy, resolution, and sensitivity. Each sensor configuration was designed to capture the spatial permittivity distribution within a pipeline cross-section, enabling visualization of phase distribution and flow characteristics such as oil and water interactions. The sensor designs were implemented and tested in closed pipes to assess their response to varying flow regimes. Capacitance data collected from each electrode configuration were reconstructed into cross-sectional images, enabling a comparison of image resolution, noise levels, and computational demands. Results indicate that the 16-electrode configuration yields higher image resolution and sensitivity to phase boundaries compared to the 8- and 12-electrode setups, making it more suitable for complex flow visualization. However, the 8- and 12-electrode sensors demonstrated advantages in processing speed and lower computational requirements. This comparative analysis provides critical insights into optimizing ECT sensor design based on specific industrial requirements, from high-resolution imaging to real-time monitoring needs.
Keywords: capacitance tomography, modeling, simulation, electrode, permittivity, fluid dynamics, imaging sensitivity measurement
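The abstract does not state the reconstruction algorithm; linear back-projection (LBP) is the classic ECT baseline, sketched below with a random stand-in sensitivity matrix (a real one would come from an electrostatic simulation of the sensor):

```python
import numpy as np

rng = np.random.default_rng(1)
n_elec = 12
n_pairs = n_elec * (n_elec - 1) // 2   # 66 independent electrode pairs
n_pixels = 32 * 32                     # cross-section image grid

# Sensitivity matrix S (pairs x pixels): normally obtained from an
# electrostatic field simulation of the sensor; random here for illustration.
S = np.abs(rng.normal(size=(n_pairs, n_pixels)))

# Capacitances normalized between empty-pipe and full-pipe calibrations.
c_measured = rng.uniform(0, 1, size=n_pairs)

# Linear back-projection: weight each pixel by the sensitivities of the
# measurements, then normalize -- a fast, low-resolution baseline image.
g = S.T @ c_measured
g /= S.T @ np.ones(n_pairs)
image = g.reshape(32, 32)   # grey level ~ local permittivity (phase) estimate
print(image.min(), image.max())
```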
Procedia PDF Downloads 14

27418 Analyzing Medical Workflows Using Market Basket Analysis
Authors: Mohit Kumar, Mayur Betharia
Abstract:
The healthcare domain, with the emergence of the Electronic Medical Record (EMR), collects a lot of data, which has been attracting the interest of data mining experts. In the past, doctors have relied on their intuition while making critical clinical decisions. This paper presents a means to analyze medical workflows and derive business insights from large medical databases. Market Basket Analysis (MBA), a data mining technique, has been widely used in the marketing and e-commerce fields to discover associations between products bought together by customers. It helps businesses increase their sales by analyzing the purchasing behavior of customers and pitching the right product to the right customer. This paper is an attempt to demonstrate Market Basket Analysis applications in healthcare. In particular, it discusses applications of the Market Basket Analysis algorithm 'Apriori' within healthcare in major areas such as analyzing the workflow of diagnostic procedures, up-selling and cross-selling of healthcare systems, and making healthcare systems more user-friendly. In the paper, we demonstrate the MBA applications using angiography systems, but they can be extrapolated to other modalities as well.
Keywords: data mining, market basket analysis, healthcare applications, knowledge discovery in healthcare databases, customer relationship management, healthcare systems
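A hand-rolled sketch of the first two Apriori levels on hypothetical procedure "baskets" (in practice a library implementation of Apriori would be used; the visits and thresholds are invented):

```python
from itertools import combinations
from collections import Counter

# Hypothetical "baskets" of procedures recorded per patient visit.
visits = [{"angiography", "contrast", "ecg"},
          {"angiography", "contrast"},
          {"ecg", "blood_test"},
          {"angiography", "contrast", "blood_test"}]
n = len(visits)

# Support counts for single items and pairs (the first two Apriori levels).
item_count = Counter(i for v in visits for i in v)
pair_count = Counter(frozenset(p) for v in visits
                     for p in combinations(sorted(v), 2))

# Rules A -> B with support and confidence above the chosen thresholds.
min_support, min_conf = 0.5, 0.8
for pair, c in pair_count.items():
    if c / n < min_support:
        continue
    a, b = tuple(pair)
    for x, y in ((a, b), (b, a)):
        conf = c / item_count[x]
        if conf >= min_conf:
            print(f"{{{x}}} -> {{{y}}}  support={c/n:.2f} confidence={conf:.2f}")
```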
Procedia PDF Downloads 174

27417 Development of a Secured Telemedical System Using Biometric Feature
Authors: O. Iyare, A. H. Afolayan, O. T. Oluwadare, B. K. Alese
Abstract:
Access to advanced medical services has been one of the medical challenges faced by our present society, especially in distant geographical locations that may be inaccessible. From this arises the need for telemedicine, through which live video of a doctor can be streamed to a patient located anywhere in the world at any time. Patients' medical records contain very sensitive information which should not be made accessible to unauthorized people, in order to protect privacy, integrity, and confidentiality. This research work focuses on a more robust security measure, biometrics (fingerprint), as a form of access control to patients' data by the medical specialist/practitioner.
Keywords: biometrics, telemedicine, privacy, patient information
Procedia PDF Downloads 290

27416 Diabetes Diagnosis Model Using Rough Set and K-Nearest Neighbor Classifier
Authors: Usiobaifo Agharese Rosemary, Osaseri Roseline Oghogho
Abstract:
Diabetes is a complex group of diseases with a variety of causes; it is a disorder of the body's metabolism in the digestion of carbohydrate foods. The application of machine learning in the field of medical diagnosis has been the focus of many researchers, and the use of recognition and classification models as decision support tools has helped medical experts in the diagnosis of diseases. Considering the large volume of medical data, which requires special techniques, experience, and high diagnostic skill, the application of an artificial intelligence system to assist medical personnel, in order to enhance their efficiency and accuracy in diagnosis, will be an invaluable tool. This study proposes a diabetes diagnosis model using rough sets and the K-nearest neighbor classifier algorithm. The system consists of two modules, the feature extraction module and the predictor module: rough sets are used to preprocess the attributes, while the K-nearest neighbor classifier is used to classify the given data. The dataset used for this model was taken from the University of Benin Teaching Hospital (UBTH) database. Half of the data was used for training while the other half was used for testing the system. The proposed model was able to achieve over 80% accuracy.
Keywords: classifier algorithm, diabetes, diagnostic model, machine learning
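A minimal sketch of the two-module pipeline: the rough-set reduct is stood in for by a fixed attribute subset (a real reduct would come from discernibility analysis), and the data are synthetic rather than UBTH records:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
# Hypothetical patient records: 8 numeric attributes, binary diagnosis.
X = rng.normal(size=(400, 8))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=400) > 0).astype(int)

# Stand-in for the rough-set preprocessing step: keep a reduct-like
# subset of attributes (a real reduct would come from discernibility analysis).
reduct = [0, 3, 5]
X = X[:, reduct]

# Half for training, half for testing, as in the study.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print(f"accuracy: {knn.score(X_te, y_te):.2f}")
```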
Procedia PDF Downloads 336

27415 Meta-Review of Scholarly Publications on Biosensors: A Bibliometric Study
Authors: Nasrine Olson
Abstract:
With over 70,000 scholarly publications on the topic of biosensors, an overview of the field has become a challenge. To facilitate such an overview, there are currently over 700 expert reviews of publications on biosensors and related topics. This study focuses on these review papers in order to provide a meta-review of the area, offering a statistical analysis and overview of biosensor-related review papers. Comprehensive searches were conducted in the Web of Science and PubMed databases, and the resulting empirical material was analyzed using bibliometric methods and tools. The study finds that the biosensor-related review papers can be categorized into five related subgroups, broadly denoted by (i) properties of materials and particles, (ii) analysis and indicators, (iii) diagnostics, (iv) pollutants and analytical devices, and (v) treatment/application. For easy and clear access to the findings, visualizations of clusters and networks of connections are presented. The study includes a temporal dimension and identifies trends over the years, with an emphasis on the most recent developments. This paper provides useful insights for those who wish to form a better understanding of the research trends in the area of biosensors.
Keywords: bibliometrics, biosensors, meta-review, statistical analysis, trends visualization
Procedia PDF Downloads 219

27414 Work Life Balance Strategies and Retention of Medical Professionals
Authors: Naseem M. Twaissi
Abstract:
Medical professionals play an important role in society, and in general, they care more about their patients than about their personal well-being. They need to take a professional approach to maintaining a work-life balance. Through a collection of primary data from 1,020 medical professionals and the application of relevant statistical tools, this paper explores the pressures on medical professionals with reference to their work-life balance. This study highlights how hospital management, beyond economic reasons, needs to identify variables that enhance the work-life balance of medical professionals so that quality healthcare facilities may be provided to the citizens of Jordan. Results indicate that the formulation and implementation of policies for enhancing work-life balance, together with career and retention plans for medical professionals, would enhance the performance of hospitals and the quality of health care in Jordan, leading to greater societal well-being.
Keywords: work-life balance, job environment, job satisfaction, employee well-being, stress, hospital industry
Procedia PDF Downloads 142

27413 Sequential Pattern Mining from Data of Medical Record with Sequential Pattern Discovery Using Equivalent Classes (SPADE) Algorithm (A Case Study: Bolo Primary Health Care, Bima)
Authors: Rezky Rifaini, Raden Bagus Fajriya Hakim
Abstract:
This research was conducted at the Bolo Primary Health Care in Bima Regency. The purpose of the research is to find the association patterns formed in the medical record database of Bolo Primary Health Care's patients. The data used are secondary data from the PHC's medical records database. Sequential pattern mining is the method used for the analysis. Transaction data were generated from Patient_ID, Check_Date, and diagnosis. Sequential Pattern Discovery using Equivalence classes (SPADE) is one of the algorithms in sequential pattern mining; this algorithm finds frequent sequences in transaction data using a vertical database layout and a sequence join process. The results of the SPADE algorithm are frequent sequences that are then used to form rules. The technique is used to find association patterns between item combinations. Based on sequential association rule analysis with the SPADE algorithm, for a minimum support of 0.03 and a minimum confidence of 0.75, three sequential association patterns were obtained based on the sequence of Patient_ID, Check_Date, and diagnosis data in the Bolo PHC.
Keywords: diagnosis, primary health care, medical record, data mining, sequential pattern mining, SPADE algorithm
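The vertical layout and temporal join at the heart of SPADE can be sketched as follows; the patient records are invented, and a full SPADE implementation would add equivalence-class decomposition and longer sequences:

```python
from collections import defaultdict

# Hypothetical patient visit records: (patient_id, check_date, diagnosis).
records = [(1, "2021-01", "flu"), (1, "2021-03", "cough"),
           (2, "2021-02", "flu"), (2, "2021-04", "cough"),
           (3, "2021-01", "fever"), (3, "2021-02", "flu")]

# Vertical database layout, as in SPADE: item -> list of (sid, eid).
vertical = defaultdict(list)
for sid, eid, item in records:
    vertical[item].append((sid, eid))

def support(idlist):
    return len({sid for sid, _ in idlist})

# Temporal join of two id-lists: y follows x within the same sequence.
def seq_join(a, b):
    return [(sid, eb) for sid, ea in a for sid2, eb in b
            if sid == sid2 and eb > ea]

min_sup = 2
for x in vertical:
    for y in vertical:
        joined = seq_join(vertical[x], vertical[y])
        if support(joined) >= min_sup:
            print(f"{x} -> {y}  (support {support(joined)})")
```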
Procedia PDF Downloads 402

27412 Interactive Glare Visualization Model for an Architectural Space
Authors: Florina Dutt, Subhajit Das, Matthew Swartz
Abstract:
Lighting design and its impact on indoor comfort conditions are an integral part of good interior design. The impact of lighting in an interior space is manifold, involving many sub-components such as glare, color, tone, luminance, control, energy efficiency, and flexibility. While the other components have been researched and discussed many times, this paper discusses research done to understand the glare component from an artificial lighting source in an indoor space. It then presents a parametric model to convey real-time glare levels in an interior space to the designer/architect. Our end users are architects, and for them it is of utmost importance to know what impression the proposed lighting arrangement and proposed furniture layout will make on indoor comfort quality. This especially involves those furniture elements (or surfaces) that strongly reflect light around the space. Essentially, the designer needs to know the ramifications of discomfort glare at an early stage of the design cycle, when changes to the proposed design are still affordable and different solution routes can be considered for the client. Unfortunately, most of the lighting analysis tools currently available perform rigorous computation and analysis on the back end, making it challenging for the designer to analyze and understand glare from interior lighting quickly. Moreover, many of them do not focus on the glare aspect of artificial light. That is why, in this paper, we explain a novel approach to approximating interior glare data. In addition, we visualize this data in a color-coded format, expressing the implications of the proposed interior design layout. We focus on making this analysis process computationally fluid and fast, enabling complete user interaction with the capability to vary different ranges of user inputs, adding more degrees of freedom for the user. We test our proposed parametric model on a case study, a computer lab space in our college facility.
Keywords: computational geometry, glare impact in interior space, info visualization, parametric lighting analysis
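The abstract does not name its glare metric; as one concrete possibility, the CIE Unified Glare Rating (UGR), a standard discomfort-glare index, can be computed like this (the luminaire values are illustrative, and the paper's own approximation may differ):

```python
import math

def ugr(background_luminance, sources):
    """CIE Unified Glare Rating, a standard discomfort-glare index.

    sources: list of (luminance L in cd/m^2, solid angle omega in sr,
    Guth position index p) for each luminaire seen by the observer.
    """
    total = sum(L ** 2 * omega / p ** 2 for L, omega, p in sources)
    return 8 * math.log10(0.25 / background_luminance * total)

# Illustrative values: two luminaires visible from a desk position.
print(ugr(background_luminance=40,
          sources=[(2000, 0.01, 1.2), (1500, 0.008, 2.0)]))
```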
Procedia PDF Downloads 350

27411 Flow Visualization around a Rotationally Oscillating Cylinder
Authors: Cemre Polat, Mustafa Soyler, Bulent Yaniktepe, Coskun Ozalp
Abstract:
This study aimed to control the flow actively by giving an oscillating rotational motion to a vertically placed cylinder, and the flow characteristics were determined. First, the flow structure around the fixed cylinder was investigated with dye experiments, and then the flow structure around cylinders with different oscillation angles (θ = 60°, θ = 120°, and θ = 180°) and different rotation speeds (15 rpm and 30 rpm) was examined. Thus, the effectiveness of oscillation angle and rotation speed in flow control was investigated. In the dye experiments, a dye/water mixture obtained by mixing Rhodamine 6G powder with water, which shines under laser light and allows detailed observation of the flow structure, was used. During the experiments, the dye was injected into the flow with the help of a thin needle in front of the cylinder, at a distance that would not disturb the flow. In the dye experiments, images were taken at 100 frames per second with a Canon EOS M50 (24 MP) digital mirrorless camera at a resolution of 1280 x 720 pixels. The images were then analyzed, and pictures representing the flow structure for each experiment were obtained. As a result of the study, it was observed that no separation points formed at a 180° oscillation angle at 15 rpm, or at 120° and 180° oscillation angles at 30 rpm, and the flow was controlled relative to the fixed cylinder.
Keywords: active flow control, cylinder, flow visualization, rotationally oscillating
Procedia PDF Downloads 177

27410 Machine Learning-Based Workflow for the Analysis of Project Portfolio
Authors: Jean Marie Tshimula, Atsushi Togashi
Abstract:
We develop a data science approach that provides interactive visualization and predictive models to find insights in projects' historical data, so that stakeholders can understand unseen opportunities in the African market that might otherwise escape them in the online project portfolio of the African Development Bank. This machine learning-based web application identifies the market trends of the fastest-growing economies across the continent, as well as rapidly growing sectors that have a significant impact on the future of business in Africa. Accordingly, the approach is tailored to predict where investment is most needed. We create a corpus that includes the descriptions of more than 1,200 projects covering approximately 14 sectors across 53 African countries. We then sift through this large amount of semi-structured data to extract details that may point to directions worth following. In light of the foregoing, we apply a combination of Latent Dirichlet Allocation and Random Forests in the analysis module of our methodology to highlight the most relevant topics that investors may focus on when investing in Africa.
Keywords: machine learning, topic modeling, natural language processing, big data
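A minimal sketch of the LDA-plus-Random-Forest analysis module; the project descriptions and sector labels are invented stand-ins for the bank's corpus:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.ensemble import RandomForestClassifier

# Hypothetical project descriptions and their sector labels.
docs = ["rural road rehabilitation and transport corridor upgrade",
        "solar power plant financing for regional energy access",
        "irrigation scheme to raise smallholder crop yields",
        "grid extension and renewable energy integration study"]
sectors = ["transport", "energy", "agriculture", "energy"]

X = CountVectorizer(stop_words="english").fit_transform(docs)
# LDA turns each description into a topic-proportion vector...
topics = LatentDirichletAllocation(n_components=3, random_state=0).fit_transform(X)
# ...which then feeds a Random Forest to link topics to sectors.
clf = RandomForestClassifier(random_state=0).fit(topics, sectors)
print(clf.predict(topics))
```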
Procedia PDF Downloads 168

27409 Data Quality as a Pillar of Data-Driven Organizations: Exploring the Benefits of Data Mesh
Authors: Marc Bachelet, Abhijit Kumar Chatterjee, José Manuel Avila
Abstract:
Data quality is a key component of any data-driven organization. Without data quality, organizations cannot effectively make data-driven decisions, which often leads to poor business performance. Therefore, it is important for an organization to ensure that the data they use is of high quality. This is where the concept of data mesh comes in. Data mesh is a decentralized organizational and architectural approach to data management that can help organizations improve the quality of data. The concept of data mesh was first introduced in 2020. Its purpose is to decentralize data ownership, making it easier for domain experts to manage the data. This can help organizations improve data quality by reducing reliance on centralized data teams and allowing domain experts to take charge of their data. This paper discusses how a set of elements, including data mesh, are tools capable of increasing data quality. One of the key benefits of data mesh is improved metadata management. In a traditional data architecture, metadata management is typically centralized, which can lead to data silos and poor data quality. With data mesh, metadata is managed in a decentralized manner, ensuring accurate and up-to-date metadata, thereby improving data quality. Another benefit of data mesh is the clarification of roles and responsibilities. In a traditional data architecture, data teams are responsible for managing all aspects of data, which can lead to confusion and ambiguity in responsibilities. With data mesh, domain experts are responsible for managing their own data, which can help provide clarity in roles and responsibilities and improve data quality. Additionally, data mesh can contribute to a new form of organization that is more agile and adaptable. By decentralizing data ownership, organizations can respond more quickly to changes in their business environment, which in turn can help improve overall performance by allowing better insights into the business as a result of better reports and visualization tools. Monitoring and analytics are also important aspects of data quality. With data mesh, monitoring and analytics are decentralized, allowing domain experts to monitor and analyze their own data. This helps in quickly identifying and addressing data quality problems, leading to improved data quality. Data culture is another major aspect of data quality. With data mesh, domain experts are encouraged to take ownership of their data, which can help create a data-driven culture within the organization. This can lead to improved data quality and better business outcomes. Finally, the paper explores the contribution of AI in the coming years. AI can help enhance data quality by automating many data-related tasks, like data cleaning and data validation. By integrating AI into data mesh, organizations can further enhance the quality of their data. The concepts mentioned above are illustrated by feedback from AEKIDEN's experience. AEKIDEN is an international data-driven consultancy that has successfully implemented a data mesh approach. By sharing its experience, AEKIDEN can help other organizations understand the benefits and challenges of implementing data mesh and improving data quality.
Keywords: data culture, data-driven organization, data mesh, data quality for business success
Procedia PDF Downloads 137

27408 Radiology Information System's Mechanisms: HL7-MHS & HL7/DICOM Translation
Authors: Kulwinder Singh Mann
Abstract:
The innovative features of an information system for electronic medical records, known as the Radiology Information System (RIS), have shown a positive impact in the hospital. The objective is to make work easier, for example for a physician accessing a patient's data or a patient checking their bill transparently. The interoperability of the RIS with the other intra-hospital information systems it interacts with, addressing compatibility and open-architecture issues, is accomplished by two novel mechanisms. The first is a dedicated message handling system applied for the exchange of information according to the Health Level Seven (HL7) protocol's specifications; it serves the transfer of medical and administrative data among the RIS applications and the data store unit. The second implements the translation of information between the formats that the HL7 and Digital Imaging and Communications in Medicine (DICOM) protocols specify, providing communication between the RIS and the Picture Archiving and Communication System (PACS), which supports the increasing incorporation of modern medical imaging equipment.
Keywords: RIS, PACS, HIS, HL7, DICOM, messaging service, interoperability, digital images
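The HL7 side of the message handling can be illustrated in a few lines; the v2 message below is a made-up order message, and the PID-5/OBR-4 routing comment is an assumption about one plausible HL7-to-DICOM mapping:

```python
# A minimal HL7 v2 message: segments separated by carriage returns,
# fields by "|". The message content here is illustrative.
message = "\r".join([
    "MSH|^~\\&|RIS|HOSP|PACS|HOSP|202401150930||ORM^O01|12345|P|2.3",
    "PID|1||000123^^^HOSP||DOE^JANE||19800101|F",
    "OBR|1||54321|71020^CHEST XRAY",
])

# Parse each segment into its pipe-delimited fields.
segments = {}
for seg in message.split("\r"):
    fields = seg.split("|")
    segments[fields[0]] = fields

# Route patient identity and requested procedure toward a DICOM-side
# mapping (e.g., PID-5 -> DICOM PatientName, OBR-4 -> procedure code).
print("patient:", segments["PID"][5])    # DOE^JANE
print("procedure:", segments["OBR"][4])  # 71020^CHEST XRAY
```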
Procedia PDF Downloads 301

27407 The Role of Predictive Modeling and Optimization in Enhancing Smart Factory Efficiency
Authors: Slawomir Lasota, Tomasz Kajdanowicz
Abstract:
This research examines the application of predictive modeling and optimization algorithms to improve production efficiency in smart factories. Utilizing gradient boosting and neural networks, the study builds robust KPI estimators to predict production outcomes based on real-time data. Optimization methods, including Bayesian optimization and gradient-based algorithms, identify optimal process configurations that maximize availability, efficiency, and quality KPIs. The paper highlights the modular architecture of a recommender system that integrates predictive models, data visualization, and adaptive automation. Comparative analysis across multiple production processes reveals significant improvements in operational performance, laying the foundation for scalable, self-regulating manufacturing systems.
Keywords: predictive modeling, optimization, smart factory, efficiency
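A minimal sketch of the estimate-then-optimize loop: a gradient-boosting KPI estimator fitted on synthetic process data, followed by a simple random search standing in for the Bayesian/gradient-based optimizers named above:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
# Hypothetical process settings (speed, temperature, pressure) and an
# efficiency KPI measured on historical production runs.
X = rng.uniform(0, 1, size=(500, 3))
kpi = 1 - (X[:, 0] - 0.6) ** 2 - (X[:, 1] - 0.4) ** 2 + rng.normal(0, 0.02, 500)

# KPI estimator: gradient boosting trained on the historical data.
model = GradientBoostingRegressor(random_state=0).fit(X, kpi)

# Random search over candidate configurations (a stand-in for the
# Bayesian and gradient-based optimizers used in the paper).
candidates = rng.uniform(0, 1, size=(10_000, 3))
pred = model.predict(candidates)
best = candidates[pred.argmax()]
print("recommended settings:", best.round(3), "predicted KPI:", pred.max().round(3))
```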
Procedia PDF Downloads 6

27406 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events
Authors: Jaqueline Maria Ribeiro Vieira
Abstract:
Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be possible and convenient with small images and little data, it becomes difficult and error-prone when big databases of images must be treated, since the patterns may differ across the image area depending on many characteristics (drilling strategy, rock components, rock strength, etc.). Previously, we developed and proposed a novel strategy capable of detecting patterns in borehole images that may point to regions with tension and breakout characteristics, based on segmented images. In this work, we propose the inclusion of data mining classification strategies in order to create a knowledge database of the segmented curves. These classifiers allow that, after some time using the system and manually pointing out parts of borehole images that correspond to tension regions and breakout areas, the system will automatically indicate and suggest new candidate regions, with higher accuracy. We suggest the use of different classification methods in order to achieve different knowledge dataset configurations.
Keywords: image segmentation, oil well visualization, classifiers, data mining, visual computer
Procedia PDF Downloads 304

27405 Learning from Small Amount of Medical Data with Noisy Labels: A Meta-Learning Approach
Authors: Gorkem Algan, Ilkay Ulusoy, Saban Gonul, Banu Turgut, Berker Bakbak
Abstract:
Computer vision systems have recently made a big leap thanks to deep neural networks. However, these systems require correctly labeled large datasets in order to be trained properly, which is very difficult to obtain for medical applications. Two main reasons for label noise in medical applications are the high complexity of the data and conflicting opinions of experts. Moreover, medical imaging datasets are commonly tiny, which makes each sample very important in learning. As a result, if not handled properly, label noise significantly degrades performance. Therefore, a label-noise-robust learning algorithm that makes use of the meta-learning paradigm is proposed in this article. The proposed solution is tested on a retinopathy of prematurity (ROP) dataset with a very high label noise of 68%. Results show that the proposed algorithm significantly improves the classification algorithm's performance in the presence of noisy labels.
Keywords: deep learning, label noise, robust learning, meta-learning, retinopathy of prematurity
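The abstract does not detail the algorithm; the sketch below illustrates one well-known meta-learning recipe for noisy labels (per-example reweighting guided by a small trusted set, in the spirit of "learning to reweight examples") on a toy problem, not the paper's exact method:

```python
import torch
from torch.nn.functional import binary_cross_entropy_with_logits as bce

torch.manual_seed(0)
# Toy binary task: 256 noisy training points (30% flipped labels)
# plus a small trusted clean set, standing in for expert-verified images.
true_w = torch.randn(20, 1)
X = torch.randn(256, 20)
y = (X @ true_w > 0).float()
noisy_y = y.clone()
flip = torch.rand(256) < 0.3
noisy_y[flip] = 1 - noisy_y[flip]
X_val = torch.randn(32, 20)
y_val = (X_val @ true_w > 0).float()         # trusted, cleanly labeled

w = torch.zeros(20, 1, requires_grad=True)   # logistic-regression weights
opt = torch.optim.SGD([w], lr=0.5)

for step in range(200):
    # Per-example training losses, perturbed by weights eps (initially 0).
    losses = bce(X @ w, noisy_y, reduction="none").squeeze(1)
    eps = torch.zeros_like(losses, requires_grad=True)
    grad_w, = torch.autograd.grad((eps * losses).sum(), w, create_graph=True)
    w_hat = w - 0.5 * grad_w                  # one virtual SGD step
    val_loss = bce(X_val @ w_hat, y_val)      # loss on the clean set
    grad_eps, = torch.autograd.grad(val_loss, eps)
    # Upweight examples whose inclusion would reduce the clean loss.
    weights = torch.clamp(-grad_eps, min=0)
    weights = weights / (weights.sum() + 1e-8)
    opt.zero_grad()
    fresh = bce(X @ w, noisy_y, reduction="none").squeeze(1)
    (weights.detach() * fresh).sum().backward()
    opt.step()

acc = ((X_val @ w > 0).float() == y_val).float().mean()
print(f"clean validation accuracy: {acc:.2f}")
```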
Procedia PDF Downloads 162

27404 Counselling Needs of Psychiatric Patients as Perceived by Their Medical Personnel, in Federal Neuropsychiatric Hospital, Aro, Abeokuta
Authors: F. N. Bolu-Steve, T. A. Ajiboye
Abstract:
A study was carried out on the awareness of the counselling needs of psychiatric patients as perceived by medical personnel at the Federal Neuropsychiatric Hospital, Aro, Abeokuta, Nigeria. The respondents comprised medical personnel of the Neuropsychiatric Hospital in Aro. A purposive sampling technique was used to select the respondents. The target population of the study consisted of all medical doctors treating the psychiatric patients. A total of 200 respondents participated in the study, of whom 143 were male and 57 female. Regarding their years of experience as medical doctors, 49.5% had worked for 1-5 years and 30.5% for 6-10 years, while those with 16 or more years of experience made up 7.0%. The major counselling need of psychiatric patients, as expressed by the medical doctors, is the need for information about the right balanced diet. The data were analyzed using percentages, means, frequencies, Analysis of Variance (ANOVA), and the t-test. The instrument used for data collection was a structured questionnaire titled "Counselling Needs of Psychiatric Patients Questionnaire" (CNPPQ), drafted by the researchers through a review of related literature. The reliability of the instrument was established using the test-retest method, and a reliability index of 0.74 was obtained. Three of the hypotheses were rejected while two were accepted at the 0.05 alpha level of significance. Based on the findings of the study, it was recommended that broad-based counselling services be provided to psychiatric patients in order to assist them in developing a positive self-image and coping with their challenges.
Keywords: counselling, needs, psychiatric, medical personnel, patients
Procedia PDF Downloads 422