Search results for: data envelopment analysis (DEA)
41676 Systematic Review and Meta-Analysis of Mid-Term Survival and Recurrent Mitral Regurgitation for Robotic-Assisted Mitral Valve Repair
Authors: Ramanen Sugunesegran, Michael L. Williams
Abstract:
Over the past two decades, surgical approaches for mitral valve (MV) disease have evolved with the advent of minimally invasive techniques. The safety and efficacy of robotic mitral valve repair (RMVr) have been well documented; however, mid- to long-term data are limited. The aim of this review was to provide a comprehensive analysis of the available mid- to long-term data for RMVr. Electronic searches of five databases were performed to identify all relevant studies reporting minimum 5-year data on RMVr. Pre-defined primary outcomes of interest were overall survival, freedom from MV reoperation, and freedom from moderate or worse mitral regurgitation (MR) at 5 years or more post-RMVr. A meta-analysis of proportions or means was performed, utilizing a random effects model, to present the data. Kaplan-Meier curves were aggregated using reconstructed individual patient data. Nine studies totaling 3,300 patients undergoing RMVr were identified. Rates of overall survival at 1, 5, and 10 years were 99.2%, 97.4%, and 92.3%, respectively. Freedom from MV reoperation at 8 years post-RMVr was 95.0%. Freedom from moderate or worse MR at 7 years was 86.0%. Rates of early post-operative complications were low, with only 0.2% all-cause mortality and 1.0% cerebrovascular accident. Reoperation for bleeding was low at 2.2%, and the rate of successful RMVr was 99.8%. Mean intensive care unit and hospital stays were 22.4 hours and 5.2 days, respectively. RMVr is a safe procedure with low rates of early mortality and other complications. It can be performed with low complication rates in high-volume, experienced centers. Evaluation of available mid-term data post-RMVr suggests favorable rates of overall survival, freedom from MV reoperation, and freedom from moderate or worse MR recurrence.
Keywords: mitral valve disease, mitral valve repair, robotic cardiac surgery, robotic mitral valve repair
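A minimal sketch, in Python, of the random-effects pooling of proportions the abstract names. A logit transform with a DerSimonian-Laird between-study variance is one common choice; the review does not state its exact estimator, and the study counts below are hypothetical placeholders, not the reviewed data.

```python
import numpy as np

def pool_proportions_random_effects(events, totals):
    """Random-effects pooling of proportions (logit transform,
    DerSimonian-Laird tau^2), one standard recipe for a
    'meta-analysis of proportions'."""
    events = np.asarray(events, float)
    totals = np.asarray(totals, float)
    p = events / totals
    y = np.log(p / (1 - p))                 # logit of each study proportion
    v = 1 / events + 1 / (totals - events)  # within-study variance on the logit scale
    w = 1 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)      # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)           # DerSimonian-Laird between-study variance
    w_star = 1 / (v + tau2)
    y_re = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    back = lambda x: 1 / (1 + np.exp(-x))   # back-transform to a proportion
    return back(y_re), back(y_re - 1.96 * se), back(y_re + 1.96 * se)

# Hypothetical per-study 5-year survivors / patients, illustrative only
est, lo, hi = pool_proportions_random_effects([95, 190, 480], [98, 195, 492])
print(f"pooled survival {est:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```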
Procedia PDF Downloads 82
41675 Determination of Optimum Torque of an Internal Combustion Engine by Exergy Analysis
Authors: Veena Chaudhary, Rakesh P. Gakkhar
Abstract:
In this study, energy and exergy analyses are applied to the experimental data of an internal combustion engine operating on the conventional diesel cycle. The experimental data are collected using an engine unit which enables accurate measurements of fuel flow rate, combustion air flow rate, engine load, engine speed, and all relevant temperatures. First and second law efficiencies are calculated for different engine speeds and compared. Results indicate that the first law (energy) efficiency is maximum at 1700 rpm, whereas exergy efficiency is maximum and exergy destruction is minimum at 1900 rpm.
Keywords: diesel engine, exergy destruction, exergy efficiency, second law of thermodynamics
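The first- and second-law efficiencies compared in the abstract follow from simple ratios; a minimal sketch with hypothetical measurements (the chemical-exergy factor phi = 1.064 is a textbook value for diesel fuel, not a value from the paper):

```python
# Hypothetical measurements at one operating point, not the paper's data.
brake_power_kW = 45.0      # measured brake power
fuel_flow_kg_s = 0.0030    # measured fuel mass flow rate
LHV_kJ_kg = 42_500.0       # lower heating value of diesel
phi = 1.064                # fuel chemical exergy / LHV ratio (textbook approximation)

energy_in_kW = fuel_flow_kg_s * LHV_kJ_kg       # first-law input rate
exergy_in_kW = phi * energy_in_kW               # second-law input rate
eta_I = brake_power_kW / energy_in_kW           # first-law (energy) efficiency
eta_II = brake_power_kW / exergy_in_kW          # second-law (exergy) efficiency

# Lumping exhaust, coolant, and irreversibility losses together gives an
# upper bound on the exergy destruction rate:
exergy_destroyed_kW = exergy_in_kW - brake_power_kW
print(f"eta_I={eta_I:.3f}, eta_II={eta_II:.3f}, Ex_dest<={exergy_destroyed_kW:.1f} kW")
```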
Procedia PDF Downloads 331
41674 Combustion Analysis of Suspended Sodium Droplet
Authors: T. Watanabe
Abstract:
Combustion analysis of a suspended sodium droplet is performed by numerically solving the Navier-Stokes equations and the energy conservation equations. The combustion model consists of pre-ignition and post-ignition models. The reaction rate for the pre-ignition model is based on chemical kinetics, while that for the post-ignition model is based on the mass transfer rate of oxygen. The calculated droplet temperature is shown to be in good agreement with existing experimental data. The temperature field in and around the droplet is obtained, as well as the droplet shape variation, and the present numerical model is confirmed to be effective for combustion analysis.
Keywords: analysis, combustion, droplet, sodium
Procedia PDF Downloads 211
41673 Road Accidents Bigdata Mining and Visualization Using Support Vector Machines
Authors: Usha Lokala, Srinivas Nowduri, Prabhakar K. Sharma
Abstract:
Useful information has been extracted from road accident data in the United Kingdom (UK), using data analytics methods, for avoiding possible accidents in rural and urban areas. This analysis makes use of several methodologies such as data integration, support vector machines (SVM), correlation machines, and multinomial goodness. The entire datasets have been imported from the traffic department of the UK with due permission. The information extracted from these huge datasets forms a basis for several predictions, which in turn avoid unnecessary memory lapses. Since data is expected to grow continuously over a period of time, this work primarily proposes a new framework model which can be trained, adapt itself to new data, and make accurate predictions. This work also throws some light on the use of the SVM methodology for text classifiers on the obtained traffic data. Finally, it emphasizes the uniqueness and adaptability of the SVM methodology appropriate for this kind of research work.
Keywords: support vector mechanism (SVM), machine learning (ML), support vector machines (SVM), department for transport (DfT)
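A minimal sketch of the kind of SVM classifier described, using scikit-learn on stand-in features; the UK dataset's actual fields and labels are not specified here, so the synthetic data and feature names are assumptions.

```python
# Hypothetical accident-severity classification with an RBF-kernel SVM.
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))   # e.g. encoded speed limit, time of day, weather, road type
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # stand-in severity label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
```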
Procedia PDF Downloads 276
41672 Positioning a Southern Inclusive Framework Embedded in the Social Model of Disability Theory Contextualised for Guyana
Authors: Lidon Lashley
Abstract:
This paper presents how the social model of disability can be used to reshape inclusive education practices in Guyana. Inclusive education in Guyana is metamorphosing but still firmly held in the tenets of the Medical Model of Disability, which influences the experiences of children with Special Education Needs and/or Disabilities (SEN/D). An ethnographic approach to data gathering was employed in this study. Qualitative data was gathered from the voices of children with and without SEN/D, as well as their mainstream teachers, to present the interplay of discourses and subjectivities in the situation. The data was analyzed using Adele Clarke's postmodern approach to grounded theory analysis called situational analysis. The data suggest that it is possible, but will be challenging, to fully contextualize and adopt Loreman's synthesis and Booth and Ainscow's Index in the two mainstream schools studied. In addition, the data paved the way for the presentation of the social model framework specific to Guyana, called the 'Southern Inclusive Education Framework for Guyana', and its support tool, 'The Inclusive Checker', created for Southern mainstream primary classrooms.
Keywords: social model of disability, medical model of disability, subjectivities, metamorphosis, special education needs, postcolonial Guyana, inclusion, culture, mainstream primary schools, Loreman's synthesis, Booth and Ainscow's index
Procedia PDF Downloads 162
41671 Research on Strategies of Building a Child Friendly City in Wuhan
Authors: Tianyue Wan
Abstract:
Building a child-friendly city (CFC) contributes to improving the quality of urbanization. It also forms a local system committed to fulfilling children's rights and development. Yet work related to CFCs is still at the initial stage in China. Therefore, taking Wuhan, the most populous city in central China, as the pilot city offers some reference for other cities. Based on the analysis of theories and practice examples, this study puts forward the challenges of building a child-friendly city under the particularity of China's national conditions. To handle these challenges, this study uses four methods to collect status data: literature research, site observation, research inquiry, and semantic differential (SD). It adopts three data analysis methods: case analysis, geographic information system (GIS) analysis, and the analytic hierarchy process (AHP) method. Through data analysis, this study identifies the evaluation system and appraises the current situation of Wuhan. According to the status of Wuhan's child-friendly city, this study proposes three strategies: 1) construct the evaluation system; 2) establish a child-friendly space system integrating 'point-line-surface'; 3) build a digitalized service platform. At the same time, this study suggests building a long-term mechanism for children's participation and multi-subject supervision covering laws, medical treatment, education, safety protection, social welfare, and other aspects. Finally, some conclusions about CFC strategies are drawn to promote the highest quality of life for all citizens in Wuhan.
Keywords: action plan, child friendly city, construction strategy, urban space
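A minimal sketch of the AHP step named above: criteria weights from a pairwise comparison matrix via the principal eigenvector, with Saaty's consistency check. The three criteria and the judgments are hypothetical, not taken from the study.

```python
import numpy as np

# Assumed pairwise judgments: safety vs. play space vs. services
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                          # priority weights, summing to 1

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)  # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
print("weights:", np.round(w, 3), " consistency ratio:", round(CI / RI, 3))
```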
Procedia PDF Downloads 93
41670 The Revised Completion of Student Internship Report by Goal Mapping
Authors: Faizah Herman
Abstract:
This study aims to explore the attitudes and behavior of goal mapping performed by students in completing internship report revisions on time. The approach is phenomenological research with qualitative methods. Data sources include observation, interviews, questionnaires, and focus group discussions. The research subjects were five students who completed their internship report revisions in a timely manner. The analysis technique is the interactive model of Miles and Huberman. The results showed that the students have a goal mapping that includes the ultimate goal and formulates goals by identifying what things need to be done, what action should be taken, and what kind of support is needed from the environment.
Keywords: goal mapping, revision internship report, students, Brawijaya
Procedia PDF Downloads 396
41669 The Relationships between Market Orientation and Competitiveness of Companies in Banking Sector
Authors: Patrik Jangl, Milan Mikuláštík
Abstract:
The objective of the paper is to measure and compare the market orientation of Swiss and Czech banks, as well as to examine statistically the degree of influence it has on the competitiveness of the institutions. The analysis of market orientation is based on the collection, analysis, and correct interpretation of the data. Descriptive analysis of market orientation describes the current situation. Research on the relation of competitiveness and market orientation in the sector of big international banks is suggested, with the expectation that a strong relationship exists. Partially, the work served as reconfirmation of the suitability of classic methodologies for measuring banks' market orientation. Two types of data were gathered: firstly, by measuring the subjectively perceived market orientation of a company and, secondly, by quantifying its competitiveness. All data were collected from a sample of small, mid-sized, and large banks. We used numerical secondary data from the international statistical financial Bureau van Dijk BANKSCOPE database. Statistical analysis led to the following results. Firstly, assuming classical market orientation measures to be scientifically justified, Czech banks are statistically less market-oriented than Swiss banks. Secondly, among small Swiss banks, which are not broadly internationally active, only a weak relationship exists between market orientation measures and market share based competitiveness measures. Thirdly, among all Swiss banks, a strong relationship exists between market orientation measures and market share based competitiveness measures. The above results imply the existence of a strong relation of this measure in the sector of big international banks. A strong statistical relationship has also been proven to exist between market orientation measures and the equity/total assets ratio in Switzerland.
Keywords: market orientation, competitiveness, marketing strategy, measurement of market orientation, relation between market orientation and competitiveness, banking sector
Procedia PDF Downloads 476
41668 The Visualization of Hydrological and Hydraulic Models Based on the Platform of Autodesk Civil 3D
Authors: Xiyue Wang, Shaoning Yan
Abstract:
Cities in China today are faced with an increasingly serious river ecological crisis accompanying the development of urbanization: waterlogging on account of the fragmented urban natural hydrological system, and the limited ecological function of the hydrological system caused by the destruction of the water system and the waterfront ecological environment. Additionally, the eco-hydrological processes of rivers are affected by various environmental factors, which are more complex in the context of the urban environment. Therefore, efficient hydrological monitoring and analysis tools, together with accurate and visual hydrological and hydraulic models, are becoming a more important basis for decision-makers and an important way for landscape architects to solve urban hydrological problems and formulate sustainable, forward-looking schemes. The study mainly introduces the river and flood analysis model based on the platform of Autodesk Civil 3D. Taking the Luanhe River in Qian'an City of Hebei Province as an example, 3D models of the landform, river, embankment, shoal, pond, underground stream, and other land features were initially built, with which water transfer simulation analysis, river floodplain analysis, and river ecology analysis were carried out; ultimately, real-time visualized simulation and analysis of rivers in various hypothetical scenarios were realized. Through the establishment of a digital hydrological and hydraulic model, the hydraulic data can be accurately and intuitively simulated, which provides a basis for rational water system design and benign urban ecological system design. The hydrological and hydraulic model based on Autodesk Civil 3D nonetheless has its limitations: the interaction between the model and other data and software is unfavorable, and the huge amount of 3D data and the lack of basic data restrict the accuracy and application range. Still, the hydrological and hydraulic model based on the Autodesk Civil 3D platform provides more possibility of access to a convenient and intelligent tool for urban planning and monitoring, and a solid basis for further urban research and design.
Keywords: visualization, hydrological and hydraulic model, Autodesk Civil 3D, urban river
Procedia PDF Downloads 297
41667 Linguistic Features for Sentence Difficulty Prediction in Aspect-Based Sentiment Analysis
Authors: Adrian-Gabriel Chifu, Sebastien Fournier
Abstract:
One of the challenges of natural language understanding is to deal with the subjectivity of sentences, which may express opinions and emotions that add layers of complexity and nuance. Sentiment analysis is a field that aims to extract and analyze these subjective elements from text, and it can be applied at different levels of granularity, such as document, paragraph, sentence, or aspect. Aspect-based sentiment analysis is a well-studied topic with many available data sets and models. However, there is no clear definition of what makes a sentence difficult for aspect-based sentiment analysis. In this paper, we explore this question by conducting an experiment with three data sets: "Laptops", "Restaurants", and "MTSC" (Multi-Target-dependent Sentiment Classification), and a merged version of these three data sets. We study the impact of domain diversity and syntactic diversity on difficulty. We use a combination of classifiers to identify the most difficult sentences and analyze their characteristics. We employ two ways of defining sentence difficulty. The first one is binary and labels a sentence as difficult if the classifiers fail to correctly predict the sentiment polarity. The second one is a six-level scale based on how many of the top five best-performing classifiers can correctly predict the sentiment polarity. We also define 9 linguistic features that, combined, aim at estimating the difficulty at sentence level.
Keywords: sentiment analysis, difficulty, classification, machine learning
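A minimal sketch of the six-level difficulty scale described: count how many of the top five classifiers get each sentence right. The gold labels and classifier predictions below are placeholders, and the binary rule shown is one reading of the paper's first definition.

```python
import numpy as np

true_polarity = np.array([1, 0, 1, 1, 0])   # gold labels for 5 sentences
preds = np.array([                           # rows: the 5 best-performing classifiers
    [1, 0, 1, 0, 0],
    [1, 0, 0, 0, 0],
    [1, 1, 1, 0, 0],
    [1, 0, 1, 0, 1],
    [1, 0, 0, 0, 0],
])

n_correct = (preds == true_polarity).sum(axis=0)  # per sentence, 0..5
difficulty = 5 - n_correct                        # six levels: 0 = easy ... 5 = hard
binary_difficult = n_correct == 0                 # one reading of the binary definition
print("difficulty levels:", difficulty)
print("binary difficult:", binary_difficult)
```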
Procedia PDF Downloads 92
41666 A Model of Teacher Leadership in History Instruction
Authors: Poramatdha Chutimant
Abstract:
The objective of the research was to propose a model of teacher leadership in history instruction for utilization. Everett M. Rogers' Diffusion of Innovations Theory is applied as the theoretical framework. A qualitative method is used in the study, with an interview protocol as the instrument to collect primary data from best-practice teachers awarded by the Office of the National Education Commission (ONEC). Open-ended questions are used in the interview protocol in order to gather varied data. Information on the international context of history instruction serves as the secondary data supporting the summarizing process (content analysis). A dendrogram is the key to interpreting and synthesizing the primary data, with the secondary data serving as a supportive source for explanation and elaboration. In-depth interviews are used to collect information from seven experts in the educational field. The focal point, finally, is to validate a draft model in terms of future utilization.
Keywords: history study, nationalism, patriotism, responsible citizenship, teacher leadership
Procedia PDF Downloads 281
41665 Activity-Based Costing of Medical Intensive Care Unit 240
Authors: Suppawan Lertpongpakpoom, Anongnat Boonrat, Kunya Boontummo
Abstract:
This descriptive cost analysis aimed to analyze the unit cost of patients in a medical intensive care unit. Purposive sampling was used to select 20 nurses, 6 practical nurses, and 5 nurse aides, and 30 patients were selected as samples. Data were collected from both primary sources (activity and average time of nursing care) and secondary sources (bills of payment and patient records). Instruments were a cost recording form, an activity observation form, and a service recording form. Content validity of all instruments was evaluated by three experts (CVI = 0.87). Descriptive statistics were employed for data analysis. The results of the activity-based costing analysis showed that the total activity cost of the 4 service types for the patients was 14,776.92 Baht. The highest cost, nursing records, was 5,674.78 Baht, followed by direct nursing activity at 5,176.18 Baht and medical treatment at 1,976.60 Baht. The lowest cost was management activity at 1,003.64 Baht per visit. The results suggested that activity-based costing analysis could be applied to give a better understanding of cost structure, enabling better consideration of wasted expense and non-value-added activity, and improvement of effective utilization.
Keywords: activity-based costing, medical intensive care, nursing care, cost analysis
Procedia PDF Downloads 404
41664 Prediction of Live Birth in a Matched Cohort of Elective Single Embryo Transfers
Authors: Mohsen Bahrami, Banafsheh Nikmehr, Yueqiang Song, Anuradha Koduru, Ayse K. Vuruskan, Hongkun Lu, Tamer M. Yalcinkaya
Abstract:
In recent years, we have witnessed an explosion of studies aimed at using a combination of artificial intelligence (AI) and time-lapse imaging data on embryos to improve IVF outcomes. However, despite promising results, no study has used a matched cohort of transferred embryos which differ only in pregnancy outcome, i.e., embryos from a single clinic which are similar in parameters such as morphokinetic condition, patient age, and overall clinic and lab performance. Here, we used time-lapse data on embryos with known pregnancy outcomes to see if the rich spatiotemporal information embedded in these data would allow prediction of the pregnancy outcome regardless of such critical parameters. Methodology: We did a retrospective analysis of time-lapse data from our IVF clinic, which utilizes the Embryoscope 100% of the time for embryo culture to the blastocyst stage, with known clinical outcomes, including live birth vs. nonpregnant (embryos with spontaneous abortion outcomes were excluded). We used time-lapse data from 200 elective single transfer embryos randomly selected from January 2019 to June 2021. Our sample included 100 embryos in each group, with no significant difference in patient age (P=0.9550) or morphokinetic scores (P=0.4032). Data from all patients were combined to make a 4th-order tensor, and feature extraction was subsequently carried out by a tensor decomposition methodology. The features were then used in a machine learning classifier to classify the two groups. Major findings: The performance of the model was evaluated using 100 random subsampling cross-validations (train (80%) - test (20%)). The prediction accuracy, averaged across 100 permutations, exceeded 80%. We also did a random grouping analysis, in which labels (live birth, nonpregnant) were randomly assigned to embryos, which yielded 50% accuracy. Conclusion: The high accuracy in the main analysis and the low accuracy in the random grouping analysis suggest a consistent spatiotemporal pattern which is associated with pregnancy outcomes, regardless of patient age and embryo morphokinetic condition, and beyond already known parameters such as early cleavage or early blastulation. Despite the small sample size, this ongoing analysis is the first to show the potential of AI methods in capturing the complex morphokinetic changes embedded in embryo time-lapse data which contribute to successful pregnancy outcomes, regardless of already known parameters. Results on a larger sample size, with complementary analysis on the prediction of other key outcomes such as euploidy and aneuploidy of embryos, will be presented at the meeting.
Keywords: IVF, embryo, machine learning, time-lapse imaging data
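A minimal sketch of the pipeline outlined above (a tensor of per-embryo time-lapse features, decomposition-based feature extraction, then classification). The shapes, the unfolding-plus-SVD step, and the logistic classifier are assumptions, since the authors' exact decomposition and classifier are not given.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_embryos, n_frames, h, w = 200, 50, 16, 16
X = rng.normal(size=(n_embryos, n_frames, h, w))   # stand-in 4th-order tensor
y = rng.integers(0, 2, size=n_embryos)             # 100 live birth / 100 nonpregnant

# Unfold each embryo's (frames x pixels) matrix and project onto the top
# singular vectors of the population average: one simple tensor-to-feature map.
M = X.reshape(n_embryos, n_frames, h * w)
_, _, Vt = np.linalg.svd(M.mean(axis=0), full_matrices=False)
features = (M @ Vt[:10].T).reshape(n_embryos, -1)  # keep 10 spatial components

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, features, y, cv=5)   # the paper used 100 random 80/20 splits
print("mean CV accuracy:", scores.mean())          # ~0.5 here, since the data is random
```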
Procedia PDF Downloads 93
41663 Risk Analysis of Leaks from a Subsea Oil Facility Based on Fuzzy Logic Techniques
Authors: Belén Vinaixa Kinnear, Arturo Hidalgo López, Bernardo Elembo Wilasi, Pablo Fernández Pérez, Cecilia Hernández Fuentealba
Abstract:
The expanded use of risk assessment in legislative and corporate decision-making has increased the role of expert judgement in providing data for security-related decision-making. Expert judgements are required in most steps of risk assessment: hazard identification, risk estimation, risk evaluation, and examination of options. This paper presents a fault tree analysis (FTA), a probabilistic failure analysis, applied to leakage of oil in a subsea production system. In standard FTA, the failure probabilities of items of a system are treated as exact values when evaluating the failure probability of the top event. However, there is a persistent insufficiency of data for estimating the failure probabilities of components within the drilling industry. Therefore, fuzzy set theory can be used as a solution to this issue. The aim of this paper is to examine the leaks from the Zafiro West subsea oil facility by using fuzzy fault tree analysis (FFTA). As a result, the research has made theoretical and practical contributions to maritime safety and environmental protection. Fault tree analysis has also traditionally been an effective strategy for identifying hazards in nuclear installations and the power industry.
Keywords: expert judgment, probability assessment, fault tree analysis, risk analysis, oil pipelines, subsea production system, drilling, quantitative risk analysis, leakage failure, top event, off-shore industry
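A minimal fuzzy fault tree sketch under common simplifying assumptions: basic events as triangular fuzzy numbers (low, mode, high) combined component-wise through AND/OR gates, a standard approximation for positive fuzzy probabilities. The event values and the tree structure are hypothetical, not the Zafiro West model.

```python
import numpy as np

def fuzzy_and(events):
    """AND gate: all inputs must fail; component-wise product."""
    return np.prod(events, axis=0)

def fuzzy_or(events):
    """OR gate: any failure suffices; 1 - prod(1 - p), component-wise."""
    return 1 - np.prod(1 - np.asarray(events), axis=0)

# (low, mode, high) failure probabilities elicited from experts -- assumed values
valve_leak   = np.array([0.010, 0.020, 0.040])
seal_failure = np.array([0.005, 0.010, 0.030])
corrosion    = np.array([0.002, 0.008, 0.020])

# Hypothetical top event: leak if (valve AND seal fail) OR corrosion
intermediate = fuzzy_and([valve_leak, seal_failure])
top_event = fuzzy_or([intermediate, corrosion])
low, mode, high = top_event
print(f"P(leak): low={low:.5f}, mode={mode:.5f}, high={high:.5f}")
# Defuzzify, e.g. by the triangle centroid, to get a single risk figure:
print("centroid:", (low + mode + high) / 3)
```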
Procedia PDF Downloads 191
41662 Generation of Automated Alarms for Plantwide Process Monitoring
Authors: Hyun-Woo Cho
Abstract:
Early detection of incipient abnormal operations in plant-wide process management is necessary in order to improve product quality and process safety. Generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches, and multivariate statistical approaches. This work presents a nonlinear empirical monitoring methodology based on the real-time analysis of massive process data. Unfortunately, big data includes measurement noise and unwanted variations unrelated to true process behavior. Thus, the elimination of such unnecessary patterns from the data is executed in the data processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that detection speed and performance were improved significantly irrespective of the size and the location of abnormal events.
Keywords: detection, monitoring, process data, noise
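The abstract's nonlinear method is not detailed, so the sketch below shows the common linear baseline for multivariate statistical monitoring: PCA fitted on normal operation with a Hotelling-style T^2 alarm. The data, component count, and the empirical 99% control limit are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
normal_ops = rng.normal(size=(1000, 8))          # training data: normal operation
pca = PCA(n_components=3).fit(normal_ops)

def t2_statistic(samples):
    """Hotelling-like T^2 in the retained principal subspace."""
    scores = pca.transform(samples)
    return np.sum(scores**2 / pca.explained_variance_, axis=1)

threshold = np.quantile(t2_statistic(normal_ops), 0.99)  # empirical 99% control limit

new_batch = rng.normal(size=(5, 8))
new_batch[2] += 4.0                               # inject an abnormal event
alarms = t2_statistic(new_batch) > threshold
print("alarm flags:", alarms)                     # expect True at index 2
```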
Procedia PDF Downloads 253
41661 Exploring the Inter-firm Collaborating and Supply Chain Innovation in the Pharmaceutical Industry
Authors: Fatima Gouiferda
Abstract:
Uncertainty and competitiveness are making firms' environments more complicated. Competition is moving to the supply chain level, and firms need to collaborate and innovate to survive. In the current economy, common efforts between organizations and mutually developing new capacities are the key resources in gaining collaborative advantage and enhancing supply chain performance. The purpose of this paper is to explore the different practices of collaboration activities that exist in the pharmaceutical industry of Morocco, and to inquire how these practices affect supply chain performance. The exploration is based on the interpretivist research paradigm. Data were collected through semi-structured interviews with supply chain practitioners. Qualitative data were analyzed via the Iramuteq software to explore the different themes of the study. The findings include descriptive analysis as a result of data processing using Iramuteq, as well as content analysis of the themes extracted from the interviews.
Keywords: inter-firm relationships, collaboration, supply chain innovation, Morocco
Procedia PDF Downloads 64
41660 Experimental Modal Analysis of Kursuncular Minaret
Authors: Yunus Dere
Abstract:
Minarets are tower-like structures from which the Muslim call to prayer is performed. They have a symbolic meaning and a sacred place among Muslims. Being tall and slender, they are prone to damage under earthquakes and strong winds. The Kursuncular stone minaret was built around thirty years ago in Konya, Turkey. Its core and helical stairs are made of reinforced concrete. Its stone spire was damaged during a light earthquake and was later replaced with a light material covered with lead sheets. In this study, the natural frequencies and mode shapes of the Kursuncular minaret were obtained experimentally and analytically. First, an ambient vibration test was carried out using a data acquisition system with accelerometers located at four locations along the height of the minaret. The collected vibration data were evaluated by operational modal analysis techniques. For the analytical part of the study, the dimensions of the minaret were accurately measured, and a detailed 3D solid finite element model of the minaret was generated. The moduli of elasticity of the stone and concrete were approximated using the compressive strengths obtained by Windsor Pin tests. Finite element modal analysis of the minaret was carried out to get the modal parameters. Experimental and analytical results were then compared and found to be in good agreement.
Keywords: experimental modal analysis, stone minaret, finite element modal analysis, minarets
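A minimal sketch of one common operational modal analysis step: estimating natural frequencies from an ambient acceleration record by peak-picking an averaged power spectrum. The record below is synthetic (two assumed modes plus noise), not the measured minaret data, and the sampling rate is an assumption.

```python
import numpy as np
from scipy.signal import welch, find_peaks

fs = 200.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(2)
# Synthetic ambient record with modes near 1.8 Hz and 7.4 Hz buried in noise
accel = (np.sin(2 * np.pi * 1.8 * t) + 0.5 * np.sin(2 * np.pi * 7.4 * t)
         + 0.8 * rng.normal(size=t.size))

freqs, psd = welch(accel, fs=fs, nperseg=4096)   # averaged power spectral density
peaks, _ = find_peaks(psd, height=0.05 * psd.max())
print("candidate natural frequencies (Hz):", np.round(freqs[peaks], 2))
```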
Procedia PDF Downloads 328
41659 A Model Architecture Transformation with Approach by Modeling: From UML to Multidimensional Schemas of Data Warehouses
Authors: Ouzayr Rabhi, Ibtissam Arrassen
Abstract:
To provide a complete analysis of the organization and to help decision-making, leaders need to have relevant data; Data Warehouses (DW) are designed to meet such needs. However, designing a DW is not trivial, and there is no formal method to derive a multidimensional schema from heterogeneous databases. In this article, we present a model-driven approach to the design of data warehouses. We describe a multidimensional meta-model and also specify a set of transformations starting from a Unified Modeling Language (UML) metamodel. In this approach, the UML metamodel and the multidimensional one are both considered platform-independent models (PIM). The first meta-model is mapped into the second one through transformation rules carried out by the Query/View/Transformation (QVT) language. This proposal is validated through the application of our approach to generating a multidimensional schema of a Balanced Scorecard (BSC) DW. We are interested in the BSC perspectives, which are highly linked to the vision and the strategies of an organization.
Keywords: data warehouse, meta-model, model-driven architecture, transformation, UML
Procedia PDF Downloads 161
41658 lncRNA Gene Expression Profiling Analysis by TCGA RNA-Seq Data of Breast Cancer
Authors: Xiaoping Su, Gabriel G. Malouf
Abstract:
Introduction: Breast cancer is a heterogeneous disease that can be classified into 4 subgroups using transcriptional profiling. The role of lncRNA expression in human breast cancer biology, prognosis, and molecular classification remains unknown. Methods and results: Using an integrative comprehensive analysis of lncRNA, mRNA, and DNA methylation in 900 breast cancer patients from The Cancer Genome Atlas (TCGA) project, we unraveled the molecular portraits of 1,700 expressed lncRNAs. Some of those lncRNAs (e.g., HOTAIR) have been previously reported, and others are novel (e.g., HOTAIRM1, MAPT-AS1). The lncRNA classification correlated well with the PAM50 classification for the basal-like, Her-2 enriched, and luminal B subgroups, in contrast to the luminal A subgroup, which behaved differently. Importantly, estrogen receptor (ESR1) expression was associated with distinct lncRNA networks in lncRNA clusters III and IV. Gene set enrichment analysis for cis- and trans-acting lncRNAs showed enrichment for breast cancer signatures driven by breast cancer master regulators. Almost two-thirds of those lncRNAs were marked by enhancer chromatin modifications (i.e., H3K27ac), suggesting that lncRNA expression may result in increased activity of neighboring genes. Differential analysis of gene expression profiling data showed that lncRNA HOTAIRM1 was significantly down-regulated in the basal-like subtype, and DNA methylation profiling data showed that lncRNA HOTAIRM1 was highly methylated in the basal-like subtype. Thus, our integrative analysis of gene expression and DNA methylation strongly suggested that lncRNA HOTAIRM1 may be a tumor suppressor in the basal-like subtype. Conclusion and significance: Our study depicts the first lncRNA molecular portrait of breast cancer and shows that lncRNA HOTAIRM1 might be a novel tumor suppressor.
Keywords: lncRNA profiling, breast cancer, HOTAIRM1, tumor suppressor
Procedia PDF Downloads 106
41657 Secured Embedding of Patient's Confidential Data in Electrocardiogram Using Chaotic Maps
Authors: Butta Singh
Abstract:
This paper presents a chaotic map based approach for the secured embedding of a patient's confidential data in an electrocardiogram (ECG) signal. The chaotic map generates predefined locations through the use of selective control parameters. The sample value difference method effectively hides the confidential data in ECG sample pairs at these predefined locations. Evaluation of the proposed method on all 48 records of the MIT-BIH arrhythmia ECG database demonstrates that the embedding does not alter the diagnostic features of the cover ECG. The secret data imperceptibility in the stego-ECG is evident through various statistical and clinical performance measures. Statistical metrics comprise Percentage Root Mean Square Difference (PRD) and Peak Signal to Noise Ratio (PSNR). Further, a comparative analysis between the proposed method and existing approaches was performed. The results clearly demonstrated the superiority of the proposed method.
Keywords: chaotic maps, ECG steganography, data embedding, electrocardiogram
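A minimal sketch combining the two ingredients named above: a logistic (chaotic) map keyed by its control parameters to select embedding locations, and a simple sample-value-difference embedding in ECG sample pairs. The parity-based embedding rule is an illustrative variant, not necessarily the authors' exact scheme, and the ECG samples are synthetic.

```python
import numpy as np

def logistic_locations(n_bits, n_samples, x0=0.6137, r=3.99):
    """Logistic map x -> r*x*(1-x); (x0, r) act as the secret key."""
    x, locs = x0, []
    while len(locs) < n_bits:
        x = r * x * (1 - x)
        idx = int(x * (n_samples // 2 - 1)) * 2   # even index -> pair (idx, idx+1)
        if idx not in locs:
            locs.append(idx)
    return locs

def embed(ecg, bits, locs, step=1):
    """Hide one bit per sample pair by forcing the parity of the pair difference."""
    stego = ecg.astype(np.int64).copy()
    for bit, i in zip(bits, locs):
        if (stego[i] - stego[i + 1]) % 2 != bit:
            stego[i + 1] += step                  # minimal distortion to carry the bit
    return stego

ecg = np.random.default_rng(3).integers(900, 1100, size=1000)  # stand-in ECG samples
bits = [1, 0, 1, 1, 0, 0, 1, 0]
locs = logistic_locations(len(bits), ecg.size)
stego = embed(ecg, bits, locs)
recovered = [(stego[i] - stego[i + 1]) % 2 for i in locs]
print("recovered bits match:", recovered == bits)
```

Because the locations depend on the key (x0, r), a receiver without both parameters cannot regenerate the embedding positions, which is the security argument behind chaotic-map steganography.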
Procedia PDF Downloads 198
41656 The Influence of the Vocational Teachers Empowerment toward the Vocational High Schools' Performance Based on the Education National Standards of Indonesia
Authors: Abdul Haris Setiawan
Abstract:
Teacher empowerment is one of the important factors considered to contribute significantly to the achievement of the national education goals. This study was conducted to determine the influence of vocational teacher empowerment on the performance of vocational high schools based on the Education National Standards of Indonesia. The population of the study was all vocational teachers at the state vocational high schools in Surakarta, Central Java Province, Indonesia. The sampling used a proportional random sampling technique, and data were collected using questionnaires. The study used quantitative descriptive statistical analysis techniques. The collected data were first tested against the analysis requirements and then processed using regression analysis between the independent and dependent variables to determine the effect and the regression equation. The results of the study found that the level of vocational high schools' performance based on the Education National Standards of Indonesia was 74.29%, which is in the high category; the level of vocational teacher empowerment was 76.20%, also in the high category; and there was a positive influence of vocational teacher empowerment on the vocational high schools' performance based on the Education National Standards of Indonesia, with a correlation coefficient of 0.886 and a contribution of 78.50%, with the regression equation Y = 79.431 + 0.534X.
Keywords: vocational teachers, empowerment, vocational high school, the education national standards
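Note that the reported contribution is consistent with the correlation coefficient, since 0.886^2 is approximately 0.785 (78.50% is r squared). A minimal sketch of fitting such a simple regression, on hypothetical scores rather than the study's data:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(60, 90, size=80)                 # hypothetical empowerment scores
y = 79.431 + 0.534 * x + rng.normal(0, 2, 80)    # performance, with noise added

slope, intercept = np.polyfit(x, y, 1)           # ordinary least squares fit
r = np.corrcoef(x, y)[0, 1]
print(f"Y = {intercept:.3f} + {slope:.3f} X, r = {r:.3f}, r^2 = {r*r:.3f}")
```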
Procedia PDF Downloads 394
41655 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network
Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson
Abstract:
The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods of analysis of the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come at a cost. Little research has been undertaken on how to optimally specify what data to capture, transmit, process, and store at the various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the required attributes and classification needed to take manufacturing digital data from various sources and determine the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision-making on which datasets should be processed at the 'edge' and what should be sent to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0
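A minimal sketch of a fast-and-frugal tree for the edge-vs-cloud routing decision: a few cues checked in order, each able to decide immediately. The cues and cut-offs below are hypothetical illustrations, not the paper's framework.

```python
def route(sample_rate_hz: float, latency_critical: bool, size_mb: float) -> str:
    """Fast-and-frugal tree: first cue that fires makes the decision."""
    if latency_critical:          # cue 1: control loops must stay local
        return "edge"
    if sample_rate_hz > 1_000:    # cue 2: high-rate streams are too costly to ship raw
        return "edge"
    if size_mb > 100:             # cue 3: bulk archives suit cheap cloud storage
        return "cloud"
    return "cloud"                # default: historical analytics in the cloud

print(route(sample_rate_hz=20_000, latency_critical=False, size_mb=5))   # -> edge
print(route(sample_rate_hz=10, latency_critical=False, size_mb=500))     # -> cloud
```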
Procedia PDF Downloads 182
41654 Denoising Transient Electromagnetic Data
Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen
Abstract:
Transient electromagnetic (TEM) data play a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates in different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize the wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibit a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform
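A minimal sketch of the two processing steps described: signal averaging of repeated transients, followed by wavelet-threshold denoising with PyWavelets. The power-law decay model, the 'db4' wavelet, and the universal-threshold rule are assumptions, not the FASTSNAP processing chain.

```python
import numpy as np
import pywt

rng = np.random.default_rng(5)
t = np.linspace(1e-5, 1e-2, 1024)
clean = t ** -1.5 * 1e-6                       # idealized TEM power-law decay
stack = clean + rng.normal(0, 2e-3, size=(64, t.size))  # 64 noisy repeats

averaged = stack.mean(axis=0)                  # signal averaging: ~sqrt(64) SNR gain

coeffs = pywt.wavedec(averaged, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745    # noise estimate from finest scale
thr = sigma * np.sqrt(2 * np.log(averaged.size))  # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: t.size]

# The late-time window is where TEM noise dominates; compare residual scatter there.
print("late-time std before/after:",
      averaged[-200:].std().round(6), denoised[-200:].std().round(6))
```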
Procedia PDF Downloads 86
41653 Use of Machine Learning in Data Quality Assessment
Authors: Bruno Pinto Vieira, Marco Antonio Calijorne Soares, Armando Sérgio de Aguiar Filho
Abstract:
Nowadays, a massive amount of information is produced by different data sources, including mobile devices and transactional systems. In this scenario, concerns arise on how to maintain or establish data quality, which is now treated as a product to be defined, measured, analyzed, and improved to meet the needs of consumers, who use these data in decision-making and company strategies. Information that reaches low levels of quality can lead to issues that consume time and money, such as missed business opportunities, inadequate decisions, and bad risk management actions. The step of selecting, identifying, and evaluating data sources with significant quality according to the need has become a costly task for users, since the sources do not provide information about their quality. Traditional data quality control methods are based on user experience or business rules, limiting performance and slowing down the process with less than desirable accuracy. Using advanced machine learning algorithms, it is possible to take advantage of computational resources to overcome these challenges and add value to companies and users. In this study, machine learning is applied to data quality analysis on different datasets, seeking to compare the performance of the techniques according to the dimensions of quality assessment. As a result, we could create a ranking of the approaches used, as well as a system that is able to carry out data quality assessment automatically.
Keywords: machine learning, data quality, quality dimension, quality assessment
Procedia PDF Downloads 150
41652 Improving Security in Healthcare Applications Using Federated Learning System With Blockchain Technology
Authors: Aofan Liu, Qianqian Tan, Burra Venkata Durga Kumar
Abstract:
Data security is of the utmost importance in the healthcare area, as sensitive patient information is constantly sent around and analyzed by many different parties. The use of federated learning, which enables data to be evaluated locally on devices rather than being transferred to a central server, has emerged as a potential solution for protecting the privacy of user information. However, federated learning alone might not be adequate to protect against data breaches and unauthorized access. In this context, the application of blockchain technology could provide the system with extra protection. This study proposes a distributed federated learning system built on blockchain technology in order to enhance security in healthcare. This makes it possible for a wide variety of healthcare providers to work together on data analysis without raising concerns about the confidentiality of the data. The technical aspects of the system, including the design and implementation of distributed learning algorithms, consensus mechanisms, and smart contracts, are also investigated as part of this process. The proposed technique is a workable alternative that addresses concerns about the safety of healthcare data while also fostering collaborative research and the exchange of data.
Keywords: data privacy, distributed system, federated learning, machine learning
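A minimal federated averaging (FedAvg) sketch of the core idea: simulated hospitals train locally and share only model weights, never raw patient records. The linear model, client count, and learning schedule are assumptions, and the blockchain layer the paper adds (e.g., auditing the weight updates) is omitted here.

```python
import numpy as np

rng = np.random.default_rng(6)
true_w = np.array([0.8, -1.2, 0.3])   # ground-truth relationship, for the demo only

def local_update(w_global, n=200, lr=0.1, epochs=5):
    """One hospital: fit a linear model on private data, return only weights."""
    X = rng.normal(size=(n, 3))                 # private features never leave the site
    y = X @ true_w + rng.normal(0, 0.1, n)
    w = w_global.copy()
    for _ in range(epochs):                     # local gradient descent on MSE
        w -= lr * (2 / n) * X.T @ (X @ w - y)
    return w

w_global = np.zeros(3)
for _ in range(10):                             # each round: broadcast, train locally, average
    client_weights = [local_update(w_global) for _ in range(4)]  # 4 hospitals
    w_global = np.mean(client_weights, axis=0)  # FedAvg aggregation
print("federated estimate:", np.round(w_global, 3), " true:", true_w)
```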
Procedia PDF Downloads 135
41651 The Systems Biology Verification Endeavor: Harness the Power of the Crowd to Address Computational and Biological Challenges
Authors: Stephanie Boue, Nicolas Sierro, Julia Hoeng, Manuel C. Peitsch
Abstract:
Systems biology relies on large numbers of data points and sophisticated methods to extract biologically meaningful signal and mechanistic understanding. For example, analyses of transcriptomics and proteomics data enable insights to be gained into the molecular differences in tissues exposed to diverse stimuli or test items. Whereas the interpretation of endpoints specifically measuring a mechanism is relatively straightforward, the interpretation of big data is more complex and would benefit from comparing results obtained with diverse analysis methods. The sbv IMPROVER project was created to implement solutions to verify systems biology data, methods, and conclusions. Computational challenges leveraging the wisdom of the crowd allow the benchmarking of methods for specific tasks, such as signature extraction and/or sample classification. Four challenges have already been successfully conducted and confirmed that the aggregation of predictions often leads to better results than individual predictions and that methods perform best in specific contexts. Whenever the scientific question of interest does not have a gold standard, but may greatly benefit from the scientific community coming together to discuss their approaches and results, datathons are set up. The inaugural sbv IMPROVER datathon was held in Singapore on 23-24 September 2016. It allowed bioinformaticians and data scientists to consolidate their ideas and work on the most promising methods as teams, after having initially reflected on the problem on their own. The outcome is a set of visualization and analysis methods that will be shared with the scientific community via the Garuda platform, an open connectivity platform that provides a framework to navigate through different applications, databases, and services in biology and medicine. We will present the results we obtained when analyzing data with our network-based method, and introduce a datathon that will take place in Japan to encourage the analysis of the same datasets with other methods to allow for the consolidation of conclusions.
Keywords: big data interpretation, datathon, systems toxicology, verification
Procedia PDF Downloads 278
41650 Exploring Data Leakage in EEG Based Brain-Computer Interfaces: Overfitting Challenges
Authors: Khalida Douibi, Rodrigo Balp, Solène Le Bars
Abstract:
In the medical field, applications related to human experiments are frequently linked to reduced sample sizes, which makes the training of machine learning models quite sensitive and therefore neither very robust nor generalizable. This is notably the case in Brain-Computer Interface (BCI) studies, where the sample size rarely exceeds 20 subjects or a few trials. To address this problem, several resampling approaches are often used during the data preparation phase, which is a critical step in a data science analysis process. One of the naive approaches usually applied by data scientists consists in transforming the entire database before the resampling phase. However, this can cause a model's performance to be incorrectly estimated when making predictions on unseen data. In this paper, we explored the effect of data leakage observed during our BCI experiments for device control through the real-time classification of SSVEPs (Steady State Visually Evoked Potentials). We also studied potential ways to ensure optimal validation of the classifiers during the calibration phase to avoid overfitting. The results show that the scaling step is crucial for some algorithms and should be applied after the resampling phase to avoid data leakage and improve results.
Keywords: data leakage, data science, machine learning, SSVEP, BCI, overfitting
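A minimal sketch of the leakage pattern the abstract warns about, with synthetic stand-in features rather than real SSVEP data: fitting the scaler on the full dataset before splitting leaks test-set statistics into training, while the safe version fits it on the training fold only. On real, structured data, the leaky estimate is optimistically biased.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(7)
X = rng.normal(size=(60, 16))                  # small sample, as in BCI studies
y = rng.integers(0, 2, size=60)

# Leaky: scale first, split after (test statistics influence the scaler)
X_leaky = StandardScaler().fit_transform(X)
Xtr, Xte, ytr, yte = train_test_split(X_leaky, y, test_size=0.25, random_state=0)
leaky_acc = SVC().fit(Xtr, ytr).score(Xte, yte)

# Safe: split first, fit the scaler on the training fold only
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
scaler = StandardScaler().fit(Xtr)
safe_acc = SVC().fit(scaler.transform(Xtr), ytr).score(scaler.transform(Xte), yte)

print(f"leaky estimate: {leaky_acc:.2f}, safe estimate: {safe_acc:.2f}")
```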
Procedia PDF Downloads 153
41649 A Non-parametric Clustering Approach for Multivariate Geostatistical Data
Authors: Francky Fouedjio
Abstract:
Multivariate geostatistical data have become omnipresent in the geosciences and pose substantial analysis challenges. One of them is the grouping of data locations into spatially contiguous clusters so that data locations within the same cluster are more similar, while clusters are different from each other, in some sense. Spatially contiguous clusters can significantly improve interpretation by turning the resulting clusters into meaningful geographical subregions. In this paper, we develop an agglomerative hierarchical clustering approach that takes into account the spatial dependency between observations. It relies on a dissimilarity matrix built from a non-parametric kernel estimator of the spatial dependence structure of the data. It integrates existing methods to find the optimal cluster number and to evaluate the contribution of variables to the clustering. The capability of the proposed approach to provide spatially compact, connected, and meaningful clusters is assessed using a bivariate synthetic dataset and a multivariate geochemical dataset. The proposed clustering method gives satisfactory results compared to other similar geostatistical clustering methods.
Keywords: clustering, geostatistics, multivariate data, non-parametric
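A minimal sketch of agglomerative hierarchical clustering from a precomputed dissimilarity matrix. The Gaussian spatial weighting below is a simple stand-in for the paper's non-parametric kernel estimator of spatial dependence; the bandwidth and the mixing rule are assumptions.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(8)
coords = rng.uniform(0, 10, size=(100, 2))        # data locations
values = rng.normal(size=(100, 3))                # multivariate observations

d_attr = squareform(pdist(values))                # attribute dissimilarity
d_geo = squareform(pdist(coords))                 # geographic distance
bandwidth = 2.0                                   # assumed kernel bandwidth
weight = 1 - np.exp(-(d_geo / bandwidth) ** 2)    # 0 when co-located, -> 1 far apart
D = d_attr * (0.5 + 0.5 * weight)                 # spatially penalized dissimilarity

Z = linkage(squareform(D, checks=False), method="average")
labels = fcluster(Z, t=4, criterion="maxclust")   # cut into 4 spatially coherent clusters
print("cluster sizes:", np.bincount(labels)[1:])
```

Penalizing the attribute dissimilarity by geographic distance is what pushes the merges toward spatially contiguous groups, the property the abstract emphasizes.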
Procedia PDF Downloads 477
41648 Exploring the Capabilities of Sentinel-1A and Sentinel-2A Data for Landslide Mapping
Authors: Ismayanti Magfirah, Sartohadi Junun, Samodra Guruh
Abstract:
Landslides are one of the most frequent and devastating natural disasters in Indonesia. Many studies have been conducted regarding this phenomenon. However, there is a lack of attention to landslide inventory mapping. The natural conditions (dense forest areas) and the limited human and economic resources are some of the major problems in building landslide inventories in Indonesia. Considering the importance of landslide inventory data in susceptibility, hazard, and risk analysis, it is essential to generate landslide inventories based on available resources. In order to achieve this, the first thing we have to do is identify the landslides' locations. The availability of Sentinel-1A and Sentinel-2A data gives new insights into land monitoring investigation. The free access, high spatial resolution, and short revisit time make these data among the most popular open source data used in landslide mapping. Sentinel-1A and Sentinel-2A data have been used broadly for landslide detection and land use/land cover mapping. This study aims to generate a landslide map by integrating Sentinel-1A and Sentinel-2A data using a change detection method. The result will be validated by field investigation to make a preliminary landslide inventory of the study area.
Keywords: change detection method, landslide inventory mapping, Sentinel-1A, Sentinel-2A
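A minimal bi-temporal change detection sketch of the kind the study describes: difference a pre- and post-event index and flag pixels whose change exceeds a robust threshold. The arrays stand in for co-registered Sentinel-2A NDVI scenes, and the NDVI-drop thresholding rule is an assumption, not the authors' exact workflow.

```python
import numpy as np

rng = np.random.default_rng(9)
ndvi_before = rng.uniform(0.5, 0.9, size=(200, 200))    # vegetated slope
ndvi_after = ndvi_before + rng.normal(0, 0.02, (200, 200))
ndvi_after[80:120, 50:90] -= 0.5                         # simulated landslide scar: vegetation loss

diff = ndvi_after - ndvi_before
mu, sigma = diff.mean(), diff.std()
change_mask = diff < mu - 3 * sigma                      # strong NDVI drop -> candidate landslide
print("flagged pixels:", int(change_mask.sum()))         # ~40x40 scar expected
```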
Procedia PDF Downloads 172
41647 3D Frictionless Contact Case between the Structure of E-Bike and the Ground
Authors: Lele Zhang, Hui Leng Choo, Alexander Konyukhov, Shuguang Li
Abstract:
China is currently the world's largest producer and distributor of electric bicycles (e-bikes). The increasing number of e-bikes on the road is accompanied by rising injuries and even deaths of e-bike drivers. Therefore, there is a growing need to improve the safety structure of e-bikes. This 3D frictionless contact analysis is preliminary, but necessary, work for further structural design improvement of an e-bike. The contact analysis between the e-bike and the ground was carried out as follows: firstly, the penalty method was illustrated and derived from the simplest spring-mass system; this is one of the most common methods for enforcing the frictionless contact condition. Secondly, ANSYS static analysis was carried out to verify finite element (FE) models with a contact pair (without friction) between the e-bike and the ground. Finally, ANSYS transient analysis was used to obtain the data on the penetration p(u) of the e-bike with respect to the ground. Results obtained from the simulation agree with estimates from the theoretical method. In the future, a protective shell will be designed following the stability criteria and added to the frame of the e-bike. Simulation of side falling of the improved safety structure of the e-bike will be confirmed with experimental data.
Keywords: frictionless contact, penalty method, e-bike, finite element
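A minimal penalty-method sketch on the same simplest system the abstract starts from: a mass dropping onto rigid ground, where penetration (u < 0) activates a penalty spring force -k_pen*u that approximates the non-penetration constraint. All parameters are illustrative, not the e-bike model's values.

```python
import numpy as np

m, g = 5.0, 9.81        # mass [kg], gravity [m/s^2]
k_pen = 1e6             # penalty stiffness: larger -> smaller residual penetration
dt, T = 1e-5, 0.5       # explicit time step and total simulated time [s]
u, v = 0.5, 0.0         # height above ground [m], velocity [m/s]

max_pen = 0.0
for _ in range(int(T / dt)):                     # explicit time integration
    f_contact = -k_pen * u if u < 0.0 else 0.0   # spring acts only while penetrating
    v += (-g + f_contact / m) * dt
    u += v * dt
    max_pen = min(max_pen, u)

# Energy check: (1/2) k u^2 = (1/2) m v^2 gives u ~ v * sqrt(m / k_pen)
v_imp = np.sqrt(2 * g * 0.5)                     # impact speed from the 0.5 m drop
print(f"max penetration: {-max_pen:.2e} m, "
      f"energy estimate: {v_imp * np.sqrt(m / k_pen):.2e} m")
```

Increasing k_pen tightens the constraint but stiffens the equations, which is the classic penalty-method trade-off that a commercial FE contact solver manages automatically.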
Procedia PDF Downloads 279