Search results for: healthcare data
24629 Analyzing Test Data Generation Techniques Using Evolutionary Algorithms
Authors: Arslan Ellahi, Syed Amjad Hussain
Abstract:
Software testing is a vital process in the software development life cycle; the quality of software can only be assured after it has passed through the testing phase. Automatic test data generation is a key research area of software testing, aimed at achieving test automation that can eventually decrease testing time. In this paper, we review some of the approaches presented in the literature that use evolutionary search-based algorithms, such as the Genetic Algorithm and Particle Swarm Optimization (PSO), to validate the test data generation process. We also examine how the quality of the generated test data increases or decreases the efficiency of testing. We propose test data generation techniques for model-based testing and work on tuning the parameters and fitness function of the PSO algorithm.
Keywords: search based, evolutionary algorithm, particle swarm optimization, genetic algorithm, test data generation
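As an editorial illustration, the sketch below shows the general shape of a PSO loop driving candidate test inputs toward a coverage goal; the branch-distance fitness function and all parameter values are hypothetical assumptions, not the tuned fitness proposed by the authors.

```python
# Minimal PSO sketch for test data generation (illustrative only).
# The fitness below is a hypothetical branch-distance measure for a target
# branch "if x * y > 1000 and x < y", not the authors' tuned fitness function.
import random

def fitness(candidate):
    x, y = candidate
    # Smaller branch distance means the input is closer to covering the branch.
    d = max(0, 1000 - x * y) + max(0, x - y)
    return -d  # the PSO loop below maximizes fitness

def pso(dim=2, swarm=20, iters=100, lo=-100, hi=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = max(pbest, key=fitness)
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            if fitness(pos[i]) > fitness(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = max(pbest, key=fitness)
    return gbest

print("candidate test input:", pso())
```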
Procedia PDF Downloads 190
24628 Comparative Analysis of the Third Generation of Research Data for Evaluation of Solar Energy Potential
Authors: Claudineia Brazil, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Rafael Haag
Abstract:
Renewable energy sources depend on climatic variability, so adequate energy planning requires observations of the meteorological variables, preferably as long-period series. Despite the scientific and technological advances that meteorological measurement systems have undergone in recent decades, there is still a considerable lack of meteorological observations forming long-period series. Reanalysis is a data assimilation system prepared using general atmospheric circulation models, based on the combination of data collected at surface stations, ocean buoys, satellites, and radiosondes, allowing the production of long-period data for a wide range of variables. The third generation of reanalysis data emerged in 2010; among these products is the Climate Forecast System Reanalysis (CFSR) developed by the National Centers for Environmental Prediction (NCEP), with a spatial resolution of 0.5° x 0.5°. To overcome these difficulties, this study evaluates the performance of solar radiation estimation from alternative databases, such as reanalysis data and meteorological satellite data, that satisfactorily compensate for the absence of solar radiation observations at the global and/or regional level. The analysis of the solar radiation data indicated that the reanalysis data of the CFSR model performed well in relation to the observed data, with a coefficient of determination around 0.90. Therefore, it is concluded that these data have the potential to be used as an alternative source at locations without stations or long series of solar radiation observations, which is important for the evaluation of solar energy potential.
Keywords: climate, reanalysis, renewable energy, solar radiation
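For reference, a minimal sketch of the coefficient of determination used to compare reanalysis estimates against station observations; the radiation values below are made up for illustration, not CFSR or station data.

```python
# Sketch: coefficient of determination (R^2) between reanalysis solar
# radiation and station observations. Arrays are hypothetical values.
import numpy as np

observed = np.array([18.2, 21.5, 24.9, 23.1, 19.8, 16.4])    # MJ/m2/day (made up)
reanalysis = np.array([17.6, 22.0, 24.1, 23.9, 19.1, 17.2])

ss_res = np.sum((observed - reanalysis) ** 2)
ss_tot = np.sum((observed - observed.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"R^2 = {r2:.2f}")   # values around 0.90 indicate good agreement
```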
Procedia PDF Downloads 209
24627 Data Mining Spatial: Unsupervised Classification of Geographic Data
Authors: Chahrazed Zouaoui
Abstract:
In recent years, the volume of geospatial information has been increasing due to the evolution of information and communication technologies; this information is often presented by geographic information systems (GIS) and stored in spatial databases. Classical data mining has revealed a weakness in knowledge extraction from these enormous amounts of data due to the particularity of spatial entities, which are characterized by the interdependence between them (the first law of geography). This gave rise to spatial data mining. Spatial data mining is a process of analyzing geographic data that allows the extraction of knowledge and spatial relationships from geospatial data; among the methods of this process, we distinguish the monothematic and the thematic. Geo-clustering is one of the main tasks of spatial data mining and belongs to the monothematic methods. It places similar geospatial entities in the same class and more dissimilar ones in different classes; in other words, it maximizes intra-class similarity and minimizes inter-class similarity while taking into account the particularity of geospatial data. Two approaches to geo-clustering exist: dynamic processing of the data, which applies algorithms designed for the direct treatment of spatial data, and the approach based on pre-processing of the spatial data, which applies classic clustering algorithms to pre-processed data (by integration of spatial relationships). This approach (based on pre-processing) is quite complex in several cases, so the search for approximate solutions involves the use of approximation algorithms, including the algorithms we are interested in: dedicated approaches (partitioning and density-based clustering methods) and the bees approach (a biomimetic approach). Our study proposes a design for this problem that uses different algorithms for automatically detecting the geospatial neighborhood in order to implement geo-clustering by pre-processing, and applies the bees algorithm to this problem for the first time in the geospatial field.
Keywords: mining, GIS, geo-clustering, neighborhood
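A minimal sketch of the pre-processing approach described above, assuming a simple radius-based neighbourhood and scikit-learn's k-means; the bees algorithm and the authors' neighbourhood-detection methods are not reproduced.

```python
# Pre-processing approach to geo-clustering: detect spatial neighbours (here,
# entities within a fixed radius), augment each entity's attributes with its
# neighbourhood mean, then run a classic clustering algorithm. The radius and
# data are illustrative.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(60, 2))      # x, y of geospatial entities
attrs = rng.normal(size=(60, 3))               # thematic attributes

radius = 1.5
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
neighbours = (dist < radius) & (dist > 0)

# Integrate spatial relationships: append the mean attributes of neighbours.
neigh_mean = np.array([attrs[n].mean(axis=0) if n.any() else attrs[i]
                       for i, n in enumerate(neighbours)])
features = np.hstack([attrs, neigh_mean])

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
print(labels[:10])
```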
Procedia PDF Downloads 375
24626 Multi-Dimensional (Quantitative and Qualitative) Longitudinal Research Methods for Biomedical Research of Post-COVID-19 (“Long Covid”) Symptoms
Authors: Steven G. Sclan
Abstract:
Background: Since December 2019, the world has been afflicted by the spread of the Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2), which is responsible for the condition referred to as Covid-19. The illness has had a cataclysmic impact on the political, social, economic, and overall well-being of the population of the entire globe. While Covid-19 has had a substantial fatality impact worldwide, it may have an even greater effect on socioeconomic conditions, medical well-being, and healthcare planning in the societies that remain. Significance: Many more persons survive the infection than die from it, and many of those patients have noted ongoing, persistent symptoms after successfully enduring the acute phase of the illness. Recognition and understanding of these symptoms are crucial for developing and arranging efficacious models of care for all patients (whether or not they have been hospitalized) who survive acute Covid illness and are plagued by post-acute symptoms. Furthermore, regarding Covid infection in children (< 18 y/o), although Covid “+” children may not be major vectors of infective transmission, it now appears that many more children than initially thought are carrying the virus without obvious accompanying symptomatic expression. It seems reasonable to wonder whether viral effects occur in children who are Covid “+” and now asymptomatic, and whether, over time, they might also experience similar symptoms. An even more significant question is whether Covid “+” asymptomatic children might manifest increased multiple health problems as they grow – i.e., developmental complications (e.g., physical/medical, metabolic, neurobehavioral, etc.) – in comparison to children who had been consistently Covid “ - ” during the pandemic. Topics Addressed and Theoretical Importance: This review is important because it describes both quantitative and qualitative methods for clinical and biomedical research. Topics reviewed will consider the importance of well-designed, comprehensive (i.e., quantitative and qualitative) longitudinal studies of post-Covid-19 symptoms in both adults and children. Also reviewed will be general characteristics of longitudinal studies and a model for a proposed study. Also discussed will be the benefit of longitudinal studies for the development of efficacious interventions and for the establishment of cogent, practical, and efficacious community healthcare service planning for post-acute Covid patients. Conclusion: Results of multi-dimensional, longitudinal studies will have important theoretical implications. These studies will help to improve our understanding of the pathophysiology of long COVID and will aid in the identification of potential targets for treatment. Such studies can also provide valuable insights into the long-term impact of COVID-19 on public health and socioeconomics.
Keywords: COVID-19, post-COVID-19, long COVID, longitudinal research, quantitative research, qualitative research
Procedia PDF Downloads 59
24625 Analysis and Prediction of Netflix Viewing History Using Netflixlatte as an Enriched Real Data Pool
Authors: Amir Mabhout, Toktam Ghafarian, Amirhossein Farzin, Zahra Makki, Sajjad Alizadeh, Amirhossein Ghavi
Abstract:
The high number of Netflix subscribers makes it attractive for data scientists to extract valuable knowledge from viewers' behavioural analyses. This paper presents a set of statistical insights into viewers' viewing history. A deep learning model is then used to predict the users' future watching behaviour based on their previous watching history within the Netflixlatte data pool. Netflixlatte is an aggregated and anonymized data pool of 320 Netflix viewers with a length of 250,000 data points recorded between 2008 and 2022. We observe insightful correlations between the distribution of viewing time and the COVID-19 pandemic outbreak. The presented deep learning model predicts future movie and TV series viewing habits with an average loss of 0.175.
Keywords: data analysis, deep learning, LSTM neural network, Netflix
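A minimal sketch of an LSTM regressor of the kind referenced in the keywords, assuming TensorFlow/Keras is available; the synthetic windows below stand in for the Netflixlatte viewing histories, whose exact format is not reproduced here.

```python
# Minimal LSTM sketch: predict the next day's viewing time from a sliding
# window of previous days. Data is synthetic, not the Netflixlatte pool.
import numpy as np
import tensorflow as tf

window, n_features = 14, 1
x = np.random.rand(500, window, n_features).astype("float32")
y = x[:, -1, 0] * 0.8 + 0.1            # toy target depending on the last value

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(window, n_features)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=5, batch_size=32, verbose=0)
print("training loss:", model.evaluate(x, y, verbose=0))
```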
Procedia PDF Downloads 251
24624 Analysis of User Data Usage Trends on Cellular and Wi-Fi Networks
Authors: Jayesh M. Patel, Bharat P. Modi
Abstract:
The availability of Wi-Fi on mobile devices has demonstrated that the total data demand from users is far higher than previously articulated by measurements based solely on a cellular-centric view of smartphone usage. The ratio of Wi-Fi to cellular traffic varies significantly between countries. This paper presents a comparison between users' cellular data usage and Wi-Fi data usage. This strategy helps operators to understand the growing importance and application of yield management strategies designed to squeeze maximum returns from their investments in the networks and devices that enable the mobile data ecosystem. The transition from unlimited data plans towards tiered pricing and, in the future, towards more value-centric pricing offers significant revenue upside potential for mobile operators; but, without a complete insight into all aspects of smartphone customer behavior, operators are unlikely to capture the maximum return from this billion-dollar market opportunity.
Keywords: cellular, Wi-Fi, mobile, smart phone
Procedia PDF Downloads 365
24623 Data Driven Infrastructure Planning for Offshore Wind Farms
Authors: Isha Saxena, Behzad Kazemtabrizi, Matthias C. M. Troffaes, Christopher Crabtree
Abstract:
The calculations done at the beginning of the life of a wind farm are rarely reliable, which makes it important to conduct research and study the failure and repair rates of the wind turbines under various conditions. This miscalculation happens because current models make the simplifying assumption that the failure/repair rate remains constant over time, which means that the reliability function is exponential in nature. This research aims to create a more accurate model using sensory data and a data-driven approach. The data cleaning and processing are done by comparing the power curve data of the wind turbines with SCADA data, which are then converted into times-to-repair and times-to-failure time-series data. Several different mathematical functions are fitted to the times-to-failure and times-to-repair data of the wind turbine components using maximum likelihood estimation and the posterior expectation method for Bayesian parameter estimation. Initial results indicate that the two-parameter Weibull function and the exponential function produce almost identical results. Further analysis is being done using complex system analysis, considering the failures of each electrical and mechanical component of the wind turbine. The aim of this project is to perform a more accurate reliability analysis that can help engineers schedule maintenance and repairs to decrease the downtime of the turbine.
Keywords: reliability, Bayesian parameter inference, maximum likelihood estimation, Weibull function, SCADA data
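A minimal sketch of the distribution-fitting step, assuming SciPy: two-parameter Weibull and exponential models fitted to synthetic times-to-failure by maximum likelihood and compared by log-likelihood (the Bayesian posterior-expectation step is not shown).

```python
# Fit two-parameter Weibull and exponential distributions to times-to-failure
# by maximum likelihood and compare log-likelihoods. Failure times are
# synthetic, not SCADA-derived.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
ttf = rng.weibull(1.05, size=200) * 900.0      # hours, shape close to 1

shape, loc, scale = stats.weibull_min.fit(ttf, floc=0)
ll_weibull = np.sum(stats.weibull_min.logpdf(ttf, shape, loc, scale))

loc_e, scale_e = stats.expon.fit(ttf, floc=0)
ll_expon = np.sum(stats.expon.logpdf(ttf, loc_e, scale_e))

print(f"Weibull shape={shape:.2f}, scale={scale:.0f}, logL={ll_weibull:.1f}")
print(f"Exponential scale={scale_e:.0f}, logL={ll_expon:.1f}")
# With a shape parameter near 1, the two fits are nearly indistinguishable,
# consistent with the initial results reported above.
```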
Procedia PDF Downloads 86
24622 Evaluating the Implementation of a Quality Management System in the COVID-19 Diagnostic Laboratory of a Tertiary Care Hospital in Delhi
Authors: Sukriti Sabharwal, Sonali Bhattar, Shikhar Saxena
Abstract:
Introduction: The COVID-19 molecular diagnostic laboratory is the cornerstone of COVID-19 diagnosis, as the patient’s treatment and management protocol depend on the molecular results. For this purpose, it is extremely important that the laboratory producing these results adheres to quality management processes to increase the accuracy and validity of the reports generated. We started our own molecular diagnostic setup at the onset of the pandemic and therefore conducted this study to generate our quality management data and help us improve on our weak points. Materials and Methods: A total of 14561 samples were evaluated by the retrospective observational method. The quality variables analysed were classified into pre-analytical, analytical, and post-analytical variables, and the results are presented as percentages. Results: Among the pre-analytical variables, sample leaking was the most common cause of rejection of samples (134/14561, 0.92%), followed by non-generation of SRF ID (76/14561, 0.52%) and non-compliance with triple packaging (44/14561, 0.3%). The other pre-analytical aspects assessed were incomplete patient identification (17/14561, 0.11%), insufficient quantity of sample (12/14561, 0.08%), missing forms/samples (7/14561, 0.04%), samples in the wrong vials/empty VTM tubes (5/14561, 0.03%), and LIMS entry not done (2/14561, 0.01%). We were unable to obtain internal quality control in 0.37% of samples (55/14561). We also experienced two incidents of cross-contamination among the samples, resulting in false-positive results. Among the post-analytical factors, a total of 0.07% of samples (11/14561) could not be dispatched within the stipulated time frame. Conclusion: Adherence to quality control processes is foremost for the smooth running of any diagnostic laboratory, especially those involved in critical reporting. Not only do the indicators help keep the laboratory parameters in check, but they also allow comparison with other laboratories.
Keywords: laboratory quality management, COVID-19, molecular diagnostics, healthcare
Procedia PDF Downloads 164
24621 Health Communication and the Diabetes Narratives of Key Social Media Influencers in the UK
Authors: Z. Sun
Abstract:
Health communication is essential in promoting healthy lifestyles, managing disease conditions, and eventually reducing health disparities. The key elements of successful health communication always include the development of communication strategies to engage people in thinking about their health, inform them about healthy choices, persuade them to adopt safe and healthy behaviours, and eventually achieve public health objectives. The use of ‘narrative’ is recognised as a health communication strategy to enhance personal and public health because of its potential persuasive effect in motivating and supporting individuals to change their beliefs and behaviours by inviting them into a narrative world, breaking down their cognitive and emotional resistance, and enhancing their acceptance of the ideas portrayed in narratives. Meanwhile, the popularity of social media has provided a novel means of communication for healthcare stakeholders, and a special group of active social media users (influencers) has started playing a pivotal role in providing health ‘solutions’. Such individuals are often referred to as ‘influencers’ because of their central position in the online communication system and the persuasive effect their actions may have on audiences. They may have established a positive rapport with their audience and earned trust and credibility in a specific area; thus, their audience considers the information they deliver to be authentic and influential. To the best of our knowledge, to date, there is no published research that examines the effect of diabetes narratives presented by social media influencers and their impact on health-related outcomes. The primary aim of this study is to investigate the diabetes narratives presented by social media influencers in the UK because of the new dimension they bring to health communication and the potential impact they may have on audiences' health outcomes. This study is situated within the interpretivist and narrative paradigms. A mixed methodology combining both quantitative and qualitative approaches has been adopted. Qualitative data have been derived to provide a better understanding of influencers’ personal experiences and how they construct meanings and make sense of their world, while quantitative data have been accumulated to identify key social media influencers in the UK and measure the impact of diabetes narratives on audiences. Twitter was chosen as the social media platform to initially identify key influencers. The two groups of participants are the top 10 key social media influencers in the UK and 100 audience members of each influencer, meaning a total of 1,000 audience members have been invited. This paper discusses, first, the background of the research in the context of health communication; second, the necessity and contribution of this research; then, the major research questions being explored; and finally, the methods used.
Keywords: diabetes, health communication, narratives, social media influencers
Procedia PDF Downloads 104
24620 Empirical Acceleration Functions and Fuzzy Information
Authors: Muhammad Shafiq
Abstract:
In accelerated life testing approaches, lifetime data are obtained under various conditions that are considered more severe than the usual condition. Classical techniques are based on precise measurements and are used to model variation among the observations. In fact, there are two types of uncertainty in data: variation among the observations and fuzziness. Analysis techniques that do not consider fuzziness and are based only on precise lifetime observations lead to pseudo results. This study examines the behavior of empirical acceleration functions using fuzzy lifetime data. The results show increased fuzziness in the transformed lifetimes compared to the input data.
Keywords: acceleration function, accelerated life testing, fuzzy number, non-precise data
Procedia PDF Downloads 298
24619 Intensive Care Unit Patient Self-Determination When Facing Cardiovascular Surgery for the First Time
Authors: Hsiao-Lin Fang
Abstract:
The Patient Self-Determination Act is based on the belief that each life is unique. The act regards each patient as an autonomous entity and explicitly protects the patient’s rights to know and to make decisions and choices, while ensuring that the patient’s wish for a peaceful end is granted. Even when the patient is unconscious and unable to express himself/herself, the patient’s self-determination and its exercise are still protected under the law. The act also ensures that healthcare professionals (HCPs) have a specific set of rules to follow and complete legal protection when their patients are unable to express themselves clearly. This report concerns a 55-year-old female patient who weighed 110 kg and was diagnosed with acute type A aortic dissection. The patient suddenly felt backache and nausea during sleep before daybreak and was therefore transferred to this hospital from the original one. After the doctor explained the patient’s condition, it was concluded that surgery was necessary. However, the patient’s family was immediately against the surgery after hearing its possible complications. Nevertheless, the patient was still willing to receive the surgery. Being at odds with her family, the patient decided to sign the surgery agreement herself and agreed to receive the two surgical procedures: (1) ascending aorta replacement and (2) innominate artery debranching. After the surgery, the patient did not regain consciousness and therefore received computed tomography scanning of the brain, which revealed a false lumen involving the proximal left common carotid artery, left subclavian artery, and innominate artery, and severe compression of the true lumen with total/subtotal occlusion of the left common carotid artery. On the following day, the doctor discussed two further surgical procedures with the family: (1) endografting of the descending aorta and (2) endografting of the left common carotid artery and subclavian artery. However, as the patient’s postoperative recovery of consciousness only reached the level of stupor and her family did not intend to pursue subsequent healthcare for her, the family made the joint decision three days later to have the endotracheal tube removed and let her die a natural death. Suggestion: An advance directive (AD) can be created beforehand. Once the patient is in a particular clinical state (e.g., terminal illness, permanent vegetative state, etc.), the AD can determine whether to sustain the patient’s life through ‘medical intervention’ or to respect the patient’s right to choose a peaceful end and receive palliative care. Through the expression of self-determination, it is possible to respect the patient’s autonomy in medical practice and protect the patient’s dignity and right to a peaceful end, thereby respecting and supporting the patient’s decision. This also allows all three sides (the patient, the family, and the medical team) to understand the patient’s true wish in the process of advance care planning (ACP) and thereby promotes harmony in the HCP-patient relationship.
Keywords: intensive care unit patient, cardiovascular surgery, self-determination, advance directive
Procedia PDF Downloads 176
24618 Evaluating Alternative Structures for Prefix Trees
Authors: Feras Hanandeh, Izzat Alsmadi, Muhammad M. Kwafha
Abstract:
Prefix trees, or tries, are data structures used to store data or indexes of data. The goal is to be able to store and retrieve data by executing queries in a quick and reliable manner. In principle, the structure of the trie depends on having letters in nodes at the different levels that point to the actual words in the leaves. However, the exact structure of the trie may vary based on several aspects. In this paper, we evaluated different structures for building tries. Using datasets of words of different sizes, we evaluated the different forms of trie structures. Results showed that some characteristics may significantly impact, positively or negatively, the size and performance of the trie. We investigated different forms and structures for the trie. Results showed that using an array of pointers in each level to represent the different alphabet letters is the best choice.
Keywords: data structures, indexing, tree structure, trie, information retrieval
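A minimal sketch of the structure the evaluation favours: trie nodes holding a fixed array of child pointers, one slot per lowercase letter (lowercase-only input is an assumption of the sketch, not a constraint from the paper).

```python
# Trie node with a fixed array of 26 child pointers (one per lowercase letter),
# i.e. the array-of-pointers structure discussed above.
class TrieNode:
    __slots__ = ("children", "is_word")

    def __init__(self):
        self.children = [None] * 26   # array of pointers indexed by letter
        self.is_word = False

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        node = self.root
        for ch in word:
            i = ord(ch) - ord("a")
            if node.children[i] is None:
                node.children[i] = TrieNode()
            node = node.children[i]
        node.is_word = True

    def contains(self, word):
        node = self.root
        for ch in word:
            node = node.children[ord(ch) - ord("a")]
            if node is None:
                return False
        return node.is_word

t = Trie()
for w in ("data", "database", "datum"):
    t.insert(w)
print(t.contains("data"), t.contains("dat"))   # True False
```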
Procedia PDF Downloads 452
24617 Data Management System for Environmental Remediation
Authors: Elizaveta Petelina, Anton Sizo
Abstract:
Environmental remediation projects deal with a wide spectrum of data, including data collected during site assessment, execution of remediation activities, and environmental monitoring. Therefore, appropriate data management is required as a key factor for well-grounded decision making. The Environmental Data Management System (EDMS) was developed to address all necessary data management aspects, including efficient data handling and data interoperability, access to historical and current data, spatial and temporal analysis, 2D and 3D data visualization, mapping, and data sharing. The system focuses on supporting well-grounded decision making in relation to required mitigation measures and assessment of remediation success. The EDMS is a combination of enterprise- and desktop-level data management and Geographic Information System (GIS) tools assembled to assist environmental remediation, project planning and evaluation, and environmental monitoring of mine sites. EDMS consists of seven main components: a Geodatabase component that contains a spatial database to store and query spatially distributed data; a GIS and Web GIS component that combines desktop and server-based GIS solutions; a Field Data Collection component that contains tools for field work; a Quality Assurance (QA)/Quality Control (QC) component that combines operational procedures for QA and measures for QC; a Data Import and Export component that includes tools and templates to support project data flow; a Lab Data component that provides a connection between the EDMS and laboratory information management systems; and a Reporting component that includes server-based services for real-time report generation. The EDMS has been successfully implemented for Project CLEANS (Clean-up of Abandoned Northern Mines), a multi-year, multimillion-dollar project aimed at assessing and reclaiming 37 uranium mine sites in northern Saskatchewan, Canada. The EDMS has effectively facilitated integrated decision-making for CLEANS project managers and transparency amongst stakeholders.
Keywords: data management, environmental remediation, geographic information system, GIS, decision making
Procedia PDF Downloads 161
24616 An Efficient Approach for Speeding up Non-Negative Matrix Factorization for High Dimensional Data
Authors: Bharat Singh, Om Prakash Vyas
Abstract:
Nowadays, applications dealing with high-dimensional data are tremendously used in popular areas. To tackle such data, various approaches have been developed by researchers in the last few decades. One of the problems with NMF approaches is that their randomized initial values cannot provide absolute (global) optimization within a limited number of iterations, only local optimization. Because of this, we have proposed a new approach that considers the initial values of the decomposition in order to tackle the issue of computational expense. We have devised an algorithm for initializing the values of the decomposed matrices based on PSO (Particle Swarm Optimization). Through the experimental results, we show that the proposed method converges very fast in comparison to other low-rank approximation techniques such as simple multiplicative NMF and ACLS.
Keywords: ALS, NMF, high dimensional data, RMSE
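A minimal sketch of multiplicative-update NMF that accepts externally supplied initial factors, which is where a PSO-derived initialization would be plugged in; here the initialization is simply seeded random values, not an actual PSO result.

```python
# Multiplicative-update NMF with caller-provided initial factors W0, H0.
# A PSO-chosen initialization could replace the seeded random stand-ins.
import numpy as np

def nmf(V, W, H, iters=200, eps=1e-9):
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

rng = np.random.default_rng(42)
V = np.abs(rng.normal(size=(100, 80)))          # non-negative data matrix
k = 10
W0 = np.abs(rng.normal(size=(100, k)))          # stand-in for a PSO-derived init
H0 = np.abs(rng.normal(size=(k, 80)))

W, H = nmf(V, W0.copy(), H0.copy())
rmse = np.sqrt(np.mean((V - W @ H) ** 2))
print(f"RMSE = {rmse:.4f}")
```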
Procedia PDF Downloads 342
24615 Detailed Feasibility and Design of a Grid-Tied PV and Building Integrated Photovoltaic System for a Commercial Healthcare Building
Authors: Muhammad Ali Tariq
Abstract:
Grid-connected PV systems have drawn tremendous attention from researchers in recent years. The report elucidates the development of efficient and stable solar PV energy conversion systems after thorough analysis of various facets such as PV module characteristics, module arrangement, power electronics and MPPT topologies, grid stability, and economic viability over the system's lifetime. This report and feasibility study bring together the optimization approaches and design calculations required for generating energy from BIPV and roof-mounted solar PV in a convenient, sustainable, and user-friendly way.
Keywords: building integrated photovoltaic system, grid integration, solar resource assessment, return on investment, multi MPPT-inverter, levelised cost of electricity
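For the "levelised cost of electricity" keyword, a minimal sketch of the standard LCOE calculation (discounted lifetime costs over discounted lifetime energy); all cost, yield, and discount figures are placeholders, not values from the feasibility study.

```python
# Standard levelised cost of electricity (LCOE) sketch: discounted lifetime
# costs divided by discounted lifetime energy yield. All inputs are placeholders.
capital = 250_000.0          # upfront cost of the PV/BIPV system
annual_om = 2_500.0          # yearly operation and maintenance
annual_energy = 180_000.0    # kWh generated per year
rate, years = 0.06, 25

disc_costs = capital + sum(annual_om / (1 + rate) ** t for t in range(1, years + 1))
disc_energy = sum(annual_energy / (1 + rate) ** t for t in range(1, years + 1))
print(f"LCOE = {disc_costs / disc_energy:.3f} per kWh")
```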
Procedia PDF Downloads 138
24614 Internet of Health Things as a Win-Win Solution for Mitigating the Paradigm Shift inside Senior Patient-Physician Shared Health Management
Authors: Marilena Ianculescu, Adriana Alexandru
Abstract:
Internet of Health Things (IoHT) has already proved to be a persuasive means to support a proper assessment of living conditions by collecting a huge variety of data. For the customized health management of a senior patient, IoHT provides the capacity to build a dynamic solution for sustaining the shift inside the patient-physician relationship by allowing real-time and continuous remote monitoring of the senior's health status, well-being, safety, and activities, especially in a non-clinical environment. Thus, a win-win solution is created in which both the patient and the physician enhance their involvement and shared decision-making, with significant outcomes. Health monitoring systems in smart environments are becoming a viable alternative to traditional healthcare solutions. The ongoing project “Non-invasive monitoring and health assessment of the elderly in a smart environment (RO-SmartAgeing)” aims to demonstrate that the existence of complete and accurate information is critical for assessing the health condition of seniors and improving well-being and quality of life in relation to health. The research performed within the project aims to highlight how managing the IoHT devices connected to the RO-SmartAgeing platform in a secure way, using a role-based access control system, can allow physicians to provide health services at a high level of efficiency and accessibility that was previously only available in hospitals. The project aims to identify deficient aspects in the provision of health services tailored to a senior patient’s specificity and to offer a more comprehensive perspective on proactive and preventive medical acts.
Keywords: health management, internet of health things, remote monitoring, senior patient
Procedia PDF Downloads 101
24613 Integrating Time-Series and High-Spatial Remote Sensing Data Based on Multilevel Decision Fusion
Authors: Xudong Guan, Ainong Li, Gaohuan Liu, Chong Huang, Wei Zhao
Abstract:
Due to the low spatial resolution of MODIS data, the accuracy of small-area plaque extraction in landscapes with a high degree of fragmentation is greatly limited. To this end, the study combines Landsat data, which have higher spatial resolution, with MODIS data, which have higher temporal resolution, for decision-level fusion. Considering the importance of the land heterogeneity factor in the fusion process, it is superimposed as a weighting factor that linearly weights the Landsat classification result and the MODIS classification result. Three levels were used to complete the data fusion process: the MODIS pixel level, the Landsat pixel level, and an object level that connects the two. The multilevel decision fusion scheme was tested at two sites in the lower Mekong basin. A comparison test showed that the classification accuracy was improved over the single-data-source classification results in terms of overall accuracy. The method was also compared with the two-level combination results and a weighted-sum decision rule-based approach. The decision fusion scheme is extensible to other multi-resolution data decision fusion applications.
Keywords: image classification, decision fusion, multi-temporal, remote sensing
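A minimal sketch of the linear weighting idea: per-pixel class scores from the two classifiers are blended with a weight driven by a land-heterogeneity factor. The scores, the weights, and their exact functional form are illustrative assumptions, not the paper's fusion rule.

```python
# Decision-level fusion sketch: linearly weight Landsat and MODIS class scores,
# with the weight driven by a land-heterogeneity factor (all arrays illustrative).
import numpy as np

n_pixels, n_classes = 5, 4
rng = np.random.default_rng(3)
p_landsat = rng.dirichlet(np.ones(n_classes), size=n_pixels)  # per-pixel class scores
p_modis = rng.dirichlet(np.ones(n_classes), size=n_pixels)

heterogeneity = rng.uniform(0, 1, size=n_pixels)   # 1 = highly fragmented landscape
w_landsat = 0.5 + 0.5 * heterogeneity              # trust Landsat more where fragmented
fused = w_landsat[:, None] * p_landsat + (1 - w_landsat)[:, None] * p_modis

print(fused.argmax(axis=1))    # fused class label per pixel
```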
Procedia PDF Downloads 124
24612 Comparison of Radiation Dosage and Image Quality: Digital Breast Tomosynthesis vs. Full-Field Digital Mammography
Authors: Okhee Woo
Abstract:
Purpose: With increasing concern about individual radiation exposure doses, studies analyzing radiation dosage in breast imaging modalities are required. The aim of this study is to compare radiation dosage and image quality between digital breast tomosynthesis (DBT) and full-field digital mammography (FFDM). Methods and Materials: 303 patients (mean age 52.1 years) who underwent both DBT and FFDM were retrospectively reviewed. Radiation dosage data were obtained with a radiation dosage scoring and monitoring program, Radimetrics (Bayer HealthCare, Whippany, NJ). Entrance dose and mean glandular dose for each breast were obtained for both imaging modalities. To compare the image quality of DBT with two-dimensional synthesized mammogram (2DSM) and FFDM, lesion clarity was assessed with 5-point scoring and the better of the two modalities was selected. Interobserver performance was compared with kappa values, and diagnostic accuracy was compared using the McNemar test. The parameters of radiation dosage (entrance dose, mean glandular dose) and image quality were compared between the two modalities using the paired t-test and the Wilcoxon rank sum test. Results: For entrance dose and mean glandular dose of each breast, DBT had lower values compared with FFDM (p-value < 0.0001). Diagnostic accuracy did not differ statistically, but the lesion clarity score was higher for DBT with 2DSM, and DBT was chosen as the better modality compared with FFDM. Conclusion: DBT showed lower radiation entrance dose and lower mean glandular doses to both breasts compared with FFDM. Also, DBT with 2DSM had better image quality than FFDM with similar diagnostic accuracy, suggesting that DBT has the potential to be performed as an alternative to FFDM.
Keywords: radiation dose, DBT, digital mammography, image quality
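A minimal sketch of the paired comparison named above (paired t-test and Wilcoxon rank sum test), assuming SciPy; the dose values are simulated, not the study's measurements.

```python
# Compare per-patient mean glandular dose under FFDM vs DBT with the two tests
# named in the abstract. Dose values are simulated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
dose_ffdm = rng.normal(1.8, 0.3, size=303)             # mGy, hypothetical
dose_dbt = dose_ffdm - rng.normal(0.2, 0.1, size=303)  # DBT slightly lower

t_stat, p_t = stats.ttest_rel(dose_ffdm, dose_dbt)     # paired t-test
z_stat, p_w = stats.ranksums(dose_ffdm, dose_dbt)      # Wilcoxon rank sum test
print(f"paired t-test p={p_t:.4f}, rank sum p={p_w:.4f}")
```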
Procedia PDF Downloads 349
24611 Analysis of Cooperative Learning Behavior Based on the Data of Students' Movement
Authors: Wang Lin, Li Zhiqiang
Abstract:
The purpose of this paper is to analyze cooperative learning behavior patterns based on data of students' movement. The study first reviews cooperative learning theory and its research status and briefly introduces the k-means clustering algorithm. Then, it uses the clustering algorithm and mathematical statistics theory to analyze the activity rhythm of individual students and groups in different functional areas, based on the movement data provided by 10 first-year graduate students. It also focuses on the analysis of students' behavior in the learning area and explores the regularities of cooperative learning behavior. The research results show that the cooperative learning behavior analysis method based on movement data proposed in this paper is feasible. From the results of the data analysis, the characteristics of students' behavior and their cooperative learning behavior patterns could be identified.
Keywords: behavior pattern, cooperative learning, data analysis, k-means clustering algorithm
Procedia PDF Downloads 187
24610 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis
Authors: Meng Su
Abstract:
High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis
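A minimal sketch of the Diffusion Map step described above: a Gaussian affinity matrix is row-normalised into a diffusion operator and its leading non-trivial eigenvectors give the low-dimensional coordinates. The dataset and bandwidth heuristic are illustrative choices; the DPM half of the hybrid approach is not reproduced.

```python
# Diffusion map sketch: Gaussian affinities -> Markov (diffusion) operator ->
# leading non-trivial eigenvectors as the embedding. Data is a noisy circle.
import numpy as np

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 300)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(300, 2))

d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
eps = np.median(d2)                       # kernel bandwidth heuristic
K = np.exp(-d2 / eps)
P = K / K.sum(axis=1, keepdims=True)      # row-stochastic diffusion operator

vals, vecs = np.linalg.eig(P)
order = np.argsort(-vals.real)
embedding = vecs[:, order[1:3]].real      # skip the trivial constant eigenvector
print(embedding.shape)                    # (300, 2) diffusion coordinates
```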
Procedia PDF Downloads 108
24609 A Security Cloud Storage Scheme Based on Accountable Key-Policy Attribute-Based Encryption without Key Escrow
Authors: Ming Lun Wang, Yan Wang, Ning Ruo Sun
Abstract:
With the development of cloud computing, more and more users are starting to utilize cloud storage services. However, several issues exist: 1) the cloud server steals the shared data, 2) sharers collude with the cloud server to steal the shared data, 3) the cloud server tampers with the shared data, 4) sharers and the key generation center (KGC) conspire to steal the shared data. In this paper, we use the advanced encryption standard (AES), hash algorithms, and accountable key-policy attribute-based encryption without key escrow (WOKE-AKP-ABE) to build a secure cloud storage scheme. Moreover, the data are encrypted to protect privacy. We use hash algorithms to prevent the cloud server from tampering with the data uploaded to the cloud. Analysis results show that this scheme can resist collusion attacks.
Keywords: cloud storage security, sharing storage, attributes, Hash algorithm
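A minimal sketch of two of the named building blocks, AES encryption plus a hash digest for tamper detection, assuming the third-party cryptography package is installed; the WOKE-AKP-ABE construction itself is not reproduced.

```python
# AES-encrypt the shared data and keep a SHA-256 digest separately so that
# tampering by the cloud server can be detected on download.
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

data = b"shared file contents"
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, data, None)

digest = hashlib.sha256(ciphertext).hexdigest()   # kept by the data owner

# Later: verify that what the cloud returns matches the stored digest.
returned = ciphertext                             # pretend download
assert hashlib.sha256(returned).hexdigest() == digest
plaintext = AESGCM(key).decrypt(nonce, returned, None)
print(plaintext == data)
```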
Procedia PDF Downloads 390
24608 Motherhood and Its Essence among Zimbabwean Migrant Women in Australia
Authors: Pranee Liamputtong
Abstract:
Childlessness in non-Western societies has wide-ranging social implications and profoundly affects the gender identity and well-being of women. The aspirations of women in these societies are shaped by various sociocultural expectations, encompassing social norms and their own social standing. Currently, there is limited knowledge regarding the perceptions and experiences of Zimbabwean migrant women living in Australia regarding childlessness and motherhood. This paper explores the cultural perspective on children in Zimbabwean society and investigates the personal and social consequences of infertility, as well as the cultural expectations of motherhood among Zimbabwean migrant women residing in Australia. The perceptions and experiences of this migrant community are of utmost importance in order to prevent misunderstandings about the core essence of motherhood among Zimbabwean women. Ultimately, this will lead to the provision of sensitive and culturally appropriate healthcare and social support for migrants in Australia's multicultural society. The study adopts a constructivist paradigm and employs qualitative methods, including in-depth interviews, drawings, and photo elicitation, involving 15 Zimbabwean women. Thematic analysis was employed to analyze the data. In Zimbabwean culture, the ability to bear a child holds significant meaning for women. Children not only ensure the continuity of society but also provide social security, as parents rely on their children for care in old age. Childlessness jeopardizes a woman's social status and carries social repercussions that have a profound impact on their gender identity and well-being. Cultural expectations of motherhood place the sole responsibility for the emotional and physical care of children on the mother. Despite residing in Australia, the procreative value has not diminished for Zimbabwean women. Raising awareness of the procreative needs of Zimbabwean women in a culturally sensitive manner would enhance the emotional well-being of these women.
Keywords: motherhood, culture, migrant women, Zimbabwe, Australia
Procedia PDF Downloads 87
24607 The Study on Life of Valves Evaluation Based on Tests Data
Authors: Binjuan Xu, Qian Zhao, Ping Jiang, Bo Guo, Zhijun Cheng, Xiaoyue Wu
Abstract:
Astronautical valves are key units in the engine systems of astronautical products; their reliability will influence the results of rocket or missile launches and may even lead to damage to staff and devices on the ground. Moreover, failure in the engine system may influence the hitting accuracy and flight range of missiles. Therefore, high reliability is essential to astronautical products. Quite a few studies estimate valves' reliability based on scarce failure test data; thus, this paper proposes a new method to estimate valves' reliability. According to the corresponding tests for different failure modes, this paper takes advantage of test data acquired from temperature, vibration, and action tests to estimate the reliability for every failure mode, then regards these three kinds of tests as three stages in the product's process and integrates their results to obtain the valves' reliability. Through the comparison of results obtained from test data and simulated data, the paper illustrates how to obtain valves' reliability based on the few failure data with failure modes and proves that the results are effective and rational.
Keywords: censored data, temperature tests, valves, vibration tests
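A minimal sketch of the integration step, assuming each test stage yields a Weibull reliability model and the stages combine as a series system; the shape and scale parameters are placeholders, not values estimated from the valve tests.

```python
# Combine per-stage reliabilities (temperature, vibration, action tests) as a
# series system: R(t) = R1(t) * R2(t) * R3(t). Parameters are placeholders.
import numpy as np

def weibull_reliability(t, beta, eta):
    return np.exp(-(t / eta) ** beta)

t = 500.0   # operating hours of interest
stages = {"temperature": (1.3, 4000.0),
          "vibration":   (0.9, 6000.0),
          "action":      (1.1, 5000.0)}

r_total = 1.0
for name, (beta, eta) in stages.items():
    r = weibull_reliability(t, beta, eta)
    r_total *= r
    print(f"{name:12s} R({t:.0f}) = {r:.4f}")
print(f"valve        R({t:.0f}) = {r_total:.4f}")
```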
Procedia PDF Downloads 346
24606 Development of Energy Benchmarks Using Mandatory Energy and Emissions Reporting Data: Ontario Post-Secondary Residences
Authors: C. Xavier Mendieta, J. J McArthur
Abstract:
Governments are playing an increasingly active role in reducing carbon emissions, and a key strategy has been the introduction of mandatory energy disclosure policies. These policies have resulted in a significant amount of publicly available data, providing researchers with a unique opportunity to develop location-specific energy and carbon emission benchmarks from this data set, which can then be used to develop building archetypes and inform urban energy models. This study presents the development of such a benchmark using the public reporting data. The data from Ontario’s Ministry of Energy for Post-Secondary Educational Institutions are being used to develop a series of building-archetype dynamic building loads and energy benchmarks to fill a gap in the currently available building database. This paper presents the development of a benchmark for college and university residences within ASHRAE climate zone 6 areas in Ontario using the mandatory disclosure energy and greenhouse gas emissions data. The methodology presented includes data cleaning, statistical analysis, and benchmark development, and lessons learned from this investigation are presented and discussed to inform the development of future energy benchmarks from this larger data set. The key findings from this initial benchmarking study are: (1) the importance of careful data screening and outlier identification to develop a valid dataset; (2) the key features used to develop a model of the data are building age, size, and occupancy schedules, and these can be used to estimate energy consumption; and (3) policy changes affecting primary energy generation significantly affected greenhouse gas emissions, and consideration of these factors was critical to evaluate the validity of the reported data.
Keywords: building archetypes, data analysis, energy benchmarks, GHG emissions
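A minimal sketch in the spirit of findings (1) and (2): interquartile-range screening of reported energy-use intensity followed by a simple least-squares model on building age and size. The data and the model form are illustrative assumptions, not the study's benchmark.

```python
# Screen outliers in reported energy-use intensity (IQR rule), then fit a
# simple benchmark model on building age and floor area. All values synthetic.
import numpy as np

rng = np.random.default_rng(5)
age = rng.uniform(5, 60, 120)                 # years
area = rng.uniform(2_000, 30_000, 120)        # m2
eui = 250 - 0.5 * age + 0.002 * area + rng.normal(0, 15, 120)  # ekWh/m2
eui[::25] += 300                              # inject a few reporting errors

q1, q3 = np.percentile(eui, [25, 75])
keep = (eui > q1 - 1.5 * (q3 - q1)) & (eui < q3 + 1.5 * (q3 - q1))

X = np.c_[np.ones(keep.sum()), age[keep], area[keep]]
coef, *_ = np.linalg.lstsq(X, eui[keep], rcond=None)
print("screened out:", (~keep).sum(), "records")
print("benchmark model coefficients:", np.round(coef, 3))
```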
Procedia PDF Downloads 306
24605 Causes of Jaundice and Skin Rashes Amongst Children in Selected Rural Communities in the Gambia
Authors: Alhage Drammeh
Abstract:
The research examines the occurrence of certain diseases among children in rural and far-flung parts of The Gambia and the extent to which they are caused by lack of access to clean water. A baseline survey was used to discover, describe, and explain the actual processes. The paper explains the purpose of the research, which is primarily to improve the health condition of children, especially those living in rural communities. The paper also gives a brief overview of the socio-economic situation of The Gambia, emphasizing its status as a Least Developed Country (LDC), with the majority of its population living below the poverty line and women and children hardest hit. The research used two rural communities in The Gambia as case studies: Basse Dampha Kunda Village and Foni Besse. Data were collected through oral interviews and medical tests conducted among people in both villages, with an emphasis on children. The demographic details of those tested are tabulated for clearer understanding. The results were compared, revealing that skin rashes, hepatitis, and certain other diseases are more prevalent in communities lacking access to safe drinking water. These results were also presented in tabular form. The study established how some policy failures and neglect on the part of the Government of The Gambia are imperiling the health of many rural dwellers in the country, the most glaring being that the research team was unable to test water samples collected from the two communities, as there are no laboratory reagents for testing water anywhere in The Gambia. Many rural communities lack basic amenities, especially clean and potable water, as well as health facilities. The study findings also highlighted the need for healthcare providers and medical NGOs to voice the plight of rural dwellers and collaborate with the government to set up health facilities in rural areas of The Gambia.
Keywords: jaundice, skin rashes, children, rural communities, the Gambia, causes
Procedia PDF Downloads 66
24604 Collision Detection Algorithm Based on Data Parallelism
Authors: Zhen Peng, Baifeng Wu
Abstract:
Modern computing technology has entered the era of parallel computing, with a trend toward sustainable and scalable parallelism. Single Instruction Multiple Data (SIMD) is an important way to follow this trend: it can gather more and more computing ability by increasing the number of processor cores without the need to modify the program. Meanwhile, in the fields of scientific computing and engineering design, many computation-intensive applications are facing the challenge of increasingly large amounts of data, and data-parallel computing will be an important way to further improve the performance of these applications. In this paper, we take accurate collision detection in building information modeling as an example and demonstrate a model for constructing a data-parallel algorithm. According to the model, a complex object is decomposed into sets of simple objects, and collision detection among complex objects is converted into collision detection among simple objects. The resulting algorithm is a typical SIMD algorithm, and its advantages in parallelism and scalability are unparalleled compared with traditional algorithms.
Keywords: data parallelism, collision detection, single instruction multiple data, building information modeling, continuous scalability
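A minimal sketch of the data-parallel idea, assuming the simple objects are axis-aligned bounding boxes (an assumption of the sketch): the pairwise overlap test is expressed as one vectorised array operation rather than nested per-object loops.

```python
# Data-parallel collision sketch: decompose objects into simple axis-aligned
# boxes and test every box pair with a single vectorised (SIMD-style) check.
import numpy as np

rng = np.random.default_rng(2)
mins = rng.uniform(0, 100, size=(1000, 3))          # box minimum corners
maxs = mins + rng.uniform(0.5, 3.0, size=(1000, 3)) # box maximum corners

# A and B overlap iff A.min <= B.max and B.min <= A.max on every axis.
overlap = np.all(
    (mins[:, None, :] <= maxs[None, :, :]) & (mins[None, :, :] <= maxs[:, None, :]),
    axis=2,
)
np.fill_diagonal(overlap, False)
pairs = np.argwhere(overlap)
print("colliding box pairs:", len(pairs) // 2)   # each pair is counted twice
```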
Procedia PDF Downloads 290
24603 Changing Arbitrary Data Transmission Period by Using Bluetooth Module on Gas Sensor Node of Arduino Board
Authors: Hiesik Kim, Yong-Beom Kim, Jaheon Gu
Abstract:
Internet of Things (IoT) applications are widely deployed and spreading worldwide, so local wireless data transmission techniques must be developed to keep pace. Bluetooth is a wireless data communication technique defined by the Bluetooth Special Interest Group (SIG); it uses the 2.4 GHz frequency range and exploits frequency hopping to avoid collisions with other devices. For the experiment, equipment for transmitting measured data was built using an Arduino board as open-source hardware, a gas sensor, and a Bluetooth module, and an algorithm controlling the transmission rate is demonstrated. The experiment on controlling the transmission rate also proceeds by developing an Android application that receives the measured data, and the experimental results show that this rate can be controlled. In the future, improvement of the communication algorithm will be needed because a few errors occur when data are transmitted or received.
Keywords: Arduino, Bluetooth, gas sensor, IoT, transmission
Procedia PDF Downloads 278
24602 Real-Time Sensor Fusion for Mobile Robot Localization in an Oil and Gas Refinery
Authors: Adewole A. Ayoade, Marshall R. Sweatt, John P. H. Steele, Qi Han, Khaled Al-Wahedi, Hamad Karki, William A. Yearsley
Abstract:
Understanding the behavioral characteristics of sensors is a crucial step in fusing data from several sensors of different types. This paper introduces a practical, real-time approach to integrating heterogeneous sensor data to achieve higher accuracy in localizing a mobile robot than would be possible from any one individual sensor. We use this approach in both indoor and outdoor environments, and it is especially appropriate for environments like oil and gas refineries due to their sparse and featureless nature. We have studied the individual contribution of each sensor's data to the overall combined accuracy achieved from the fusion process. A Sequential Update Extended Kalman Filter (EKF) using validation gates was used to integrate GPS data, compass data, WiFi data, Inertial Measurement Unit (IMU) data, vehicle velocity, and pose estimates from a fiducial marker system. Results show that the approach can enable a mobile robot to navigate autonomously in any environment using a priori information.
Keywords: inspection mobile robot, navigation, sensor fusion, sequential update extended Kalman filter
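A minimal sketch of the sequential update with validation gating; a linear 2-D position model is used for brevity (so each update reduces to a plain Kalman step), and the sensors, covariances, and gate threshold are illustrative assumptions.

```python
# Sequential-update filter sketch with a chi-square validation gate: each
# sensor measurement is applied one at a time and rejected if its Mahalanobis
# distance exceeds the gate. Linear 2-D position model for brevity.
import numpy as np

x = np.array([0.0, 0.0])          # state: position estimate
P = np.eye(2) * 10.0              # state covariance
H = np.eye(2)                     # each sensor measures position directly here
gate = 9.21                       # chi-square 99% threshold, 2 degrees of freedom

measurements = [                  # (value, measurement covariance)
    (np.array([1.1, 0.9]), np.eye(2) * 1.0),     # e.g. GPS fix
    (np.array([0.8, 1.2]), np.eye(2) * 2.0),     # e.g. WiFi position estimate
    (np.array([25.0, -30.0]), np.eye(2) * 1.0),  # faulty reading, should be gated out
]

for z, R in measurements:
    y = z - H @ x                         # innovation
    S = H @ P @ H.T + R                   # innovation covariance
    if y @ np.linalg.solve(S, y) > gate:  # validation gate
        continue                          # reject outlier measurement
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

print("fused position estimate:", np.round(x, 3))
```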
Procedia PDF Downloads 472
24601 Energy Efficient Massive Data Dissemination Through Vehicle Mobility in Smart Cities
Authors: Salman Naseer
Abstract:
One of the main challenges of operating a smart city (SC) is collecting the massive data generated from multiple data sources (DS) and transmitting them to the control units (CU) for further data processing and analysis. These ever-increasing data demands not only require more and more capacity in the transmission channels but also result in resource over-provisioning to meet resilience requirements, and thus unavoidable waste due to data fluctuations throughout the day. In addition, the high energy consumption (EC) and carbon discharges from these data transmissions pose serious issues for the environment we live in. Therefore, to overcome the intensive EC and carbon emissions (CE) of massive data dissemination in smart cities, we propose an energy-efficient and carbon-reducing approach that utilizes the daily mobility of existing vehicles as an alternative communication channel to accommodate the data dissemination in smart cities. To illustrate the effectiveness and efficiency of our approach, we take Auckland City in New Zealand as an example, assuming massive data generated by various sources geographically scattered throughout the Auckland region must reach the control centres located in the city centre. The numerical results show that our proposed approach can provide up to 5 times lower delay when transferring large volumes of data by utilizing existing daily vehicle mobility than the conventional transmission network. Moreover, our proposed approach offers about 30% less EC and CE than the conventional network transmission approach.
Keywords: smart city, delay tolerant network, infrastructure offloading, opportunistic network, vehicular mobility, energy consumption, carbon emission
Procedia PDF Downloads 142
24600 Integrating Geographic Information into Diabetes Disease Management
Authors: Tsu-Yun Chiu, Tsung-Hsueh Lu, Tain-Junn Cheng
Abstract:
Background: Traditional chronic disease management did not pay attention to the effects of geographic factors on compliance with the treatment regime, which resulted in geographic inequality in the outcomes of chronic disease management. This study aims to examine the geographic distribution and clustering of quality indicators of diabetes care. Method: We first extracted the address, demographic information, and quality of care indicators (number of visits, complications, prescription and laboratory records) of patients with diabetes for 2014 from the medical information system of a medical center in Tainan City, Taiwan, and the patients’ addresses were transformed into district- and village-level data. We then compared the differences in geographic distribution and clustering of quality of care indicators between districts and villages. In addition to the descriptive results, rate ratios and 95% confidence intervals (CI) were estimated for the indices of care in order to compare the quality of diabetes care among different areas. Results: A total of 23,588 patients with diabetes were extracted from the hospital data system, of whom 12,716 patients’ information and medical records were included in the following analysis. More than half of the subjects in this study were male and between 60-79 years old. Furthermore, the quality of diabetes care did indeed vary by geographic level. At the smaller level, we could point out clustered areas more specifically. Fuguo Village (of Yongkang District) and Zhiyi Village (of Sinhua District) were found to be “hotspots” for nephropathy and cerebrovascular disease, while Wangliau Village and Erwang Village (of Yongkang District) were “coldspots” with the lowest proportion of ≥80% compliance with blood lipid examinations. On the other hand, Yuping Village (in Anping District) was the area with the lowest proportion of ≥80% compliance with all laboratory examinations. Conclusion: Beyond examining the geographic distribution, calculating rate ratios and their 95% CI is also a useful and consistent method to test the association. This information is useful for health planners, diabetes case managers, and other affiliated practitioners in organizing care resources for the areas most in need.
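A minimal sketch of a rate ratio with a Wald 95% confidence interval on the log scale, using the common large-sample approximation; the counts are hypothetical, not the study's indicator data.

```python
# Rate ratio with a Wald 95% CI on the log scale, comparing one
# quality-of-care indicator between two areas. Counts are hypothetical.
import math

events_a, patients_a = 42, 310     # e.g. nephropathy cases in village A
events_b, patients_b = 18, 295     # reference village B

rr = (events_a / patients_a) / (events_b / patients_b)
se_log_rr = math.sqrt(1 / events_a + 1 / events_b)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"rate ratio = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```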
Keywords: catchment area of healthcare, chronic disease management, Geographic information system, quality of diabetes care
Procedia PDF Downloads 283