Search results for: data acquisition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25644

23004 Comparison between Some of Robust Regression Methods with OLS Method with Application

Authors: Sizar Abed Mohammed, Zahraa Ghazi Sadeeq

Abstract:

The classical least squares (OLS) method estimates linear regression parameters and, when its assumptions hold, yields estimators with good properties such as unbiasedness, minimum variance, and consistency. When the data are contaminated with outliers, however, alternative robust (resistant) statistical techniques are needed. In this paper, three robust methods are studied: the maximum-likelihood-type estimator (M-estimator), the modified maximum-likelihood-type estimator (MM-estimator), and the least trimmed squares estimator (LTS-estimator); their results are compared with the OLS method. The methods are applied to real data from the Duhok company for furniture manufacturing, and the results are compared using three criteria: mean squared error (MSE), mean absolute percentage error (MAPE), and mean sum of absolute error (MSAE). The main conclusions are as follows. For the furniture line, the values estimated by the four methods are very close to one another, indicating that the distribution of the errors is close to normal. For the doors line, OLS detects fewer outliers than the robust methods, and the OLS parameter estimates differ greatly from the robust estimates, indicating that the error distribution departs markedly from normality. For the doors line, the LTS-estimator gives the best results under the MSE criterion, the M-estimator under the MAPE criterion, and the MM-estimator under the MSAE criterion. The programs S-Plus (version 8.0, Professional 2007), Minitab (version 13.2), and SPSS (version 17) were used to analyze the data.
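
The OLS-versus-robust comparison the abstract describes can be sketched numerically. The following is a minimal illustration, not the paper's actual procedure: it fits OLS and a Huber M-estimator (via iteratively reweighted least squares) to outlier-contaminated data; the dataset, tuning constant, and iteration count are all assumptions.

```python
import numpy as np

def ols(X, y):
    # Ordinary least squares via numpy's least-squares solver.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def huber_m(X, y, k=1.345, n_iter=50):
    # Huber M-estimation by iteratively reweighted least squares (IRLS):
    # residuals beyond k robust standard deviations are downweighted.
    beta = ols(X, y)
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12  # MAD scale
        u = np.abs(r / s)
        w = np.where(u <= k, 1.0, k / np.maximum(u, 1e-12))       # Huber weights
        Xw = X * w[:, None]
        beta = np.linalg.solve(Xw.T @ X, Xw.T @ y)                # weighted normal eqs.
    return beta

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
X = np.column_stack([np.ones(100), x])
y = 2.0 + 3.0 * x + rng.normal(0.0, 1.0, 100)
y[np.argsort(x)[-5:]] += 50.0          # contaminate the largest-x points
b_ols, b_rob = ols(X, y), huber_m(X, y)
```

On such contaminated data the robust slope stays near the true value 3 while the OLS slope is pulled toward the outliers, which mirrors the doors-line behaviour described above.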

Keywords: robust regression, LTS, M-estimator, MSE

Procedia PDF Downloads 232
23003 Hybrid Fuzzy Weighted K-Nearest Neighbor to Predict Hospital Readmission for Diabetic Patients

Authors: Soha A. Bahanshal, Byung G. Kim

Abstract:

Identification of patients at high risk for hospital readmission is of crucial importance for quality health care and cost reduction. Predicting hospital readmissions among diabetic patients has been of great interest to many researchers and health decision makers. We build a prediction model to predict hospital readmission for diabetic patients within 30 days of discharge. The core of the prediction model is a modified k-Nearest Neighbor called the Hybrid Fuzzy Weighted k-Nearest Neighbor algorithm. The prediction is performed on a patient dataset consisting of more than 70,000 patients with 50 attributes. We applied several data preprocessing techniques to handle class imbalance and to fuzzify the data for the prediction algorithm. The model has so far achieved a classification accuracy of 80%, outperforming models that use the standard k-Nearest Neighbor alone.
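
The distance-weighted fuzzy voting idea at the core of such a classifier can be sketched as follows. This is a generic fuzzy-kNN illustration (in the spirit of Keller et al.), not the authors' exact hybrid algorithm; the weighting exponent m, the value of k, and the toy data are assumptions.

```python
import numpy as np

def fuzzy_weighted_knn(X_train, y_train, x, k=5, m=2.0):
    # Each of the k nearest neighbours votes for its class with fuzzy
    # weight 1 / d^(2/(m-1)); returns the winning class and memberships.
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] ** (2.0 / (m - 1.0)) + 1e-12)
    classes = np.unique(y_train)
    scores = np.array([w[y_train[idx] == c].sum() for c in classes])
    scores /= scores.sum()
    return classes[np.argmax(scores)], scores

# Toy "readmitted vs. not readmitted" data: two clusters in a 2-D feature space.
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal([0, 0], 0.5, (50, 2)),
                     rng.normal([3, 3], 0.5, (50, 2))])
y_train = np.array([0] * 50 + [1] * 50)
label, memberships = fuzzy_weighted_knn(X_train, y_train, np.array([2.8, 3.1]))
```

The membership scores, rather than a hard vote, are what allows the "fuzzy" variant to express how confidently a patient resembles the readmitted class.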

Keywords: machine learning, prediction, classification, hybrid fuzzy weighted k-nearest neighbor, diabetic hospital readmission

Procedia PDF Downloads 186
23002 The Comparison of Joint Simulation and Estimation Methods for the Geometallurgical Modeling

Authors: Farzaneh Khorram

Abstract:

This paper endeavors to construct a block model to assess grinding energy consumption (CCE) and pinpoint blocks with the highest potential for energy usage during the grinding process within a specified region. Leveraging geostatistical techniques, namely joint estimation or simulation based on geometallurgical data from various mineral processing stages, our objective is to forecast CCE across the study area. The dataset encompasses variables obtained from 2754 drill samples and a block model comprising 4680 blocks. The initial analysis encompassed exploratory data examination, variography, multivariate analysis, and the delineation of geological and structural units. Subsequent analysis involved the assessment of contacts between these units and the estimation of CCE via cokriging, considering its correlation with SPI. The selection of blocks exhibiting maximum CCE holds paramount importance for cost estimation, production planning, and risk mitigation. The study conducted exploratory data analysis on lithology, rock type, and failure variables, revealing seamless boundaries between geometallurgical units. Simulation methods, such as Plurigaussian and turning bands, demonstrated more realistic outcomes compared to cokriging, owing to the inherent characteristics of geometallurgical data and the limitations of kriging methods.
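
For readers unfamiliar with the estimation side, ordinary kriging reduces to solving one small linear system per target block. A generic sketch follows, using an exponential variogram; the variogram parameters and the sample data are placeholders, not the study's fitted model, and the real workflow (cokriging with SPI, simulation) is more involved.

```python
import numpy as np

def gamma_exp(h, sill=1.0, a=10.0, nugget=0.0):
    # Exponential variogram model; a is the practical range.
    return nugget + sill * (1.0 - np.exp(-3.0 * h / a))

def ordinary_kriging(xy, z, x0):
    # Solve the ordinary-kriging system [Gamma 1; 1' 0][lam; mu] = [gamma0; 1]
    # so the weights lam sum to 1 (unbiasedness) and minimize the variance.
    n = len(z)
    H = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma_exp(H)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma_exp(np.linalg.norm(xy - x0, axis=1))
    lam = np.linalg.solve(A, b)[:n]
    return float(lam @ z), lam

xy = np.array([[0.0, 0.0], [4.0, 1.0], [1.0, 5.0], [6.0, 6.0], [3.0, 3.0]])
z = np.array([2.1, 2.9, 3.4, 4.0, 3.1])
est, lam = ordinary_kriging(xy, z, np.array([2.0, 2.0]))
```

Two properties worth checking in any implementation: the weights sum to one, and the estimator is exact (it returns the sample value when the target coincides with a data point).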

Keywords: geometallurgy, multivariate analysis, plurigaussian, turning band method, cokriging

Procedia PDF Downloads 70
23001 Computed Tomography Myocardial Perfusion on a Patient with Hypertrophic Cardiomyopathy

Authors: Jitendra Pratap, Daphne Prybyszcuk, Luke Elliott, Arnold Ng

Abstract:

Introduction: Coronary CT angiography is a non-invasive imaging technique for the assessment of coronary artery disease with high sensitivity and negative predictive value. However, the correlation between the degree of CT coronary stenosis and the significance of haemodynamic obstruction is poor. The assessment of myocardial perfusion has mostly been undertaken by nuclear medicine (SPECT), but it is now possible to perform stress myocardial CT perfusion (CTP) scans quickly and effectively using CT scanners with high temporal resolution. Myocardial CTP is in many ways similar to the neuro-perfusion imaging technique: radiopaque iodinated contrast is injected intravenously, transits the pulmonary and cardiac structures, and then perfuses through the coronary arteries into the myocardium. On the Siemens Force CT scanner, a myocardial perfusion scan is performed using a dynamic axial acquisition, where the scanner shuttles in and out every 1-3 seconds (heart-rate dependent) to cover the heart in the z plane, usually over 38 seconds. Report: A CT myocardial perfusion scan can be utilised to complement the findings of a CT coronary angiogram. Implementing a CT myocardial perfusion study as part of a routine CT coronary angiogram procedure provides a ‘One Stop Shop’ for diagnosis of coronary artery disease. This case study demonstrates that although the CT coronary angiogram was within normal limits, the perfusion scan provided additional, clinically significant information regarding the haemodynamics within the myocardium of a patient with hypertrophic obstructive cardiomyopathy (HOCM). This negated the need for further diagnostic studies such as cardiac echocardiography or nuclear medicine stress tests. Conclusion: CT coronary angiography with adenosine stress myocardial CTP was utilised in this case to exclude coronary artery disease while assessing perfusion within the hypertrophic myocardium. Adenosine stress myocardial CTP demonstrated reduced myocardial blood flow within the hypertrophic myocardium, while the coronary arteries showed no obstructive disease. A CT coronary angiogram protocol that incorporates myocardial perfusion can provide diagnostic information on the haemodynamic significance of any coronary artery stenosis and has the potential to be a “One Stop Shop” for cardiac imaging.

Keywords: CT, cardiac, myocardium, perfusion

Procedia PDF Downloads 132
23000 Deriving Generic Transformation Matrices for Multi-Axis Milling Machine

Authors: Alan C. Lin, Tzu-Kuan Lin, Tsong Der Lin

Abstract:

This paper proposes a new method to derive the transformation-matrix equations relating the rotation angles of the two rotary axes and the coordinates of the three linear axes of an orthogonal multi-axis milling machine. The approach gives the rotation angles of multi-axis machines intuitive physical meanings, which can be used to evaluate the accuracy of the conversion from CL (cutter location) data to NC data.
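
The core computation can be illustrated with elementary rotation matrices. The sketch below assumes a hypothetical A-C machine configuration (rotation about X, then Z); it is a generic kinematics example, not the paper's derivation: the tool-axis vector is (sin a sin c, -sin a cos c, cos a), which is directly invertible for the two rotary angles.

```python
import numpy as np

def Rx(a):
    # Rotation about the machine X axis (A axis) by angle a (radians).
    return np.array([[1, 0, 0],
                     [0, np.cos(a), -np.sin(a)],
                     [0, np.sin(a),  np.cos(a)]])

def Rz(c):
    # Rotation about the machine Z axis (C axis) by angle c (radians).
    return np.array([[np.cos(c), -np.sin(c), 0],
                     [np.sin(c),  np.cos(c), 0],
                     [0, 0, 1]])

def tool_axis(a, c):
    # Tool-axis direction for the assumed A-C configuration.
    return Rz(c) @ Rx(a) @ np.array([0.0, 0.0, 1.0])

def angles_from_axis(v):
    # Invert: recover (a, c) from a unit tool-axis vector (i, j, k).
    i, j, k = v
    a = np.arccos(np.clip(k, -1.0, 1.0))
    c = np.arctan2(i, -j)
    return a, c
```

Round-tripping a CL tool orientation through `tool_axis` and `angles_from_axis` is exactly the kind of consistency check the abstract proposes for evaluating CL-to-NC conversion accuracy.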

Keywords: CAM, multi-axis milling machining, transformation matrix, rotation angles

Procedia PDF Downloads 482
22999 A Stepwise Approach to Automate the Search for Optimal Parameters in Seasonal ARIMA Models

Authors: Manisha Mukherjee, Diptarka Saha

Abstract:

Reliable forecasts of univariate time series are often necessary in many contexts, and ARIMA models are popular among practitioners for this purpose. Choosing correct parameter values for ARIMA is therefore a challenging yet imperative task. This paper introduces a stepwise algorithm that provides automatic and robust estimates of the parameters (p, d, q)(P, D, Q) used in seasonal ARIMA models. The process focuses on improving the overall quality of the estimates, and it alleviates the problems induced by the unidimensional nature of currently used methods such as auto.arima. The fast, automated search of the parameter space also ensures reliable parameter estimates with several desirable qualities, resulting in higher test accuracy, especially on noisy data. In rigorous testing on both real and simulated data, the algorithm not only performs better than current state-of-the-art methods but also obviates the need for human intervention thanks to its automated nature.
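
The flavour of such an automated order search can be shown with a stripped-down, non-seasonal analogue: difference the series d times, fit AR(p) by least squares, and score each (p, d) pair by AIC. This is a toy grid search under assumed bounds, not the authors' stepwise algorithm; real use would cover the full (p, d, q)(P, D, Q) space, e.g. via statsmodels' SARIMAX.

```python
import numpy as np

def aic_ar(y, p):
    # Least-squares AR(p) fit with intercept; AIC = n*log(RSS/n) + 2k.
    n = len(y) - p
    cols = [np.ones(n)] + [y[p - 1 - i:len(y) - 1 - i] for i in range(p)]
    X = np.column_stack(cols)
    yt = y[p:]
    beta, *_ = np.linalg.lstsq(X, yt, rcond=None)
    rss = float(np.sum((yt - X @ beta) ** 2))
    return n * np.log(rss / n) + 2 * (p + 1)

def search_order(y, max_p=5, max_d=2):
    # Exhaustive AIC scan over (p, d); returns the best order and its AIC.
    best, best_aic = None, np.inf
    for d in range(max_d + 1):
        yd = np.diff(y, n=d) if d else y
        for p in range(max_p + 1):
            a = aic_ar(yd, p)
            if a < best_aic:
                best, best_aic = (p, d), a
    return best, best_aic

# Simulated AR(2) series: the search should recover d = 0 and p near 2.
rng = np.random.default_rng(0)
e = rng.normal(0, 1, 500)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + e[t]
order, _ = search_order(y)
```

A stepwise variant would walk this grid greedily from a starting point instead of scanning it exhaustively, which is what makes the paper's search fast on larger parameter spaces.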

Keywords: time series, ARIMA, auto.arima, ARIMA parameters, forecast, R function

Procedia PDF Downloads 165
22998 Decision-Making Strategies on Smart Dairy Farms: A Review

Authors: L. Krpalkova, N. O' Mahony, A. Carvalho, S. Campbell, G. Corkery, E. Broderick, J. Walsh

Abstract:

Farm management and operations will drastically change due to access to real-time data, real-time forecasting, and tracking of physical items, in combination with Internet of Things developments that further automate farm operations. Dairy farms have embraced technological innovations and accumulated vast, continuous data streams during the past decade; however, these data are not yet integrated to improve whole-farm management and decision-making. It is now imperative to develop a system that can collect, integrate, manage, and analyse on-farm and off-farm data in real-time for practical and relevant environmental and economic actions. The developed systems, based on machine learning and artificial intelligence, need to be connected for useful output, a better understanding of the farming operation as a whole, and its environmental impact. Evolutionary computing can be very effective in finding the optimal combination of sets of objects and, finally, in strategy determination. The system of the future should be able to manage the dairy farm as well as an experienced dairy farm manager with a team of the best agricultural advisors. All these changes should bring resilience and sustainability to dairy farming as well as improving and maintaining good animal welfare and the quality of dairy products. This review aims to provide an insight into the state-of-the-art of big data applications and evolutionary computing in relation to smart dairy farming and identify the most important research and development challenges to be addressed in the future. Smart dairy farming influences every area of management, and its uptake has become a continuing trend.

Keywords: big data, evolutionary computing, cloud, precision technologies

Procedia PDF Downloads 189
22997 Hydrothermal Energy Application Technology Using Dam Deep Water

Authors: Yooseo Pang, Jongwoong Choi, Yong Cho, Yongchae Jeong

Abstract:

The climate crisis, including environmental problems related to energy supply, has become a pressing issue, so the use of renewable energy, governed mainly by the Paris Agreement, the international treaty on climate change, is essential to solving these problems. The government of the Republic of Korea announced “carbon neutrality by 2050” as the key long-term goal of its low-carbon strategy. Attention is focused on internet data centers (IDCs), which manage large amounts of data, such as the artificial intelligence (AI) and big data workloads driven by the 4th industrial revolution. The cooling-system market for IDCs was worth about 9 billion US dollars in 2020, and 15.6% annual growth is expected in Korea. Because controlling the temperature in an IDC with an efficient air-conditioning system is important, hydrothermal energy is one of the best options for saving cooling energy. To save energy and optimize operating conditions, a dam deep-water air-conditioning system has been considered: deep water drawn from a specific level of a dam can supply a near-constant water temperature year-round. The amount of energy saved will be tested and analyzed with a pilot plant of 100 RT cooling capacity. The project also targets a PUE (Power Usage Effectiveness) of 1.2, the key parameter for checking the efficiency of the cooling system.
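
The 1.2 PUE target is simply total facility power divided by IT equipment power; a quick illustrative calculation (the load figures are hypothetical, not the project's):

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    # Power Usage Effectiveness = total facility power / IT equipment power.
    return total_facility_kw / it_load_kw

# A hypothetical 1,000 kW IT load: hitting PUE 1.2 means cooling plus all
# other overheads (power distribution, lighting) may draw at most 200 kW.
target = pue(1000.0 + 200.0, 1000.0)
```

A free-cooling scheme such as dam deep water improves PUE precisely by shrinking that overhead term.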

Keywords: hydrothermal energy, HVAC, internet data center, free-cooling

Procedia PDF Downloads 81
22996 A Comparative Assessment of Some Algorithms for Modeling and Forecasting Horizontal Displacement of Ialy Dam, Vietnam

Authors: Kien-Trinh Thi Bui, Cuong Manh Nguyen

Abstract:

In order to simulate and reproduce the operational characteristics of a dam visually, it is necessary to capture the displacement at different measurement points and analyze the observed movement data promptly to forecast dam safety. Forecast accuracy can be further improved by applying machine learning methods to the data analysis. In this study, horizontal displacement monitoring data of the Ialy hydroelectric dam (Vietnam) were analyzed with three machine learning algorithms: Gaussian processes (GP), multi-layer perceptron (MLP) neural networks, and the M5-Rules algorithm. The database was built from time series collected from 2006 to 2021 and divided into two parts: a training dataset and a validation dataset. The results show that all three algorithms perform well in both training and validation, with the MLP model performing best. Their usability was further investigated by comparison with a benchmark model built by multi-linear regression. The GP, MLP, and M5-Rules models all performed much better than the benchmark, so these three models should be used to analyze and predict the horizontal displacement of the dam.
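
The multi-linear-regression benchmark mentioned above can be sketched directly with least squares. The predictors below (reservoir level, temperature, time, as in common hydrostatic-season-time style formulations) and all coefficients are invented for illustration; they are not the Ialy monitoring data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Hypothetical predictors for a displacement model.
level = rng.uniform(60.0, 70.0, n)       # reservoir level, m
temp = rng.uniform(-5.0, 30.0, n)        # air temperature, deg C
t = np.linspace(0.0, 15.0, n)            # time, years
disp = 0.8 * level - 0.05 * temp + 0.1 * t + rng.normal(0.0, 0.5, n)

# Multi-linear regression benchmark: disp ~ 1 + level + temp + t.
X = np.column_stack([np.ones(n), level, temp, t])
beta, *_ = np.linalg.lstsq(X, disp, rcond=None)
pred = X @ beta
rmse = float(np.sqrt(np.mean((disp - pred) ** 2)))
```

The GP, MLP, and M5-Rules models in the study are judged "much better" relative to exactly this kind of baseline fit.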

Keywords: Gaussian processes, horizontal displacement, hydropower dam, Ialy dam, M5-Rules, multi-layer perception neural networks

Procedia PDF Downloads 210
22995 SPARK: An Open-Source Knowledge Discovery Platform That Leverages Non-Relational Databases and Massively Parallel Computational Power for Heterogeneous Genomic Datasets

Authors: Thilina Ranaweera, Enes Makalic, John L. Hopper, Adrian Bickerstaffe

Abstract:

Data are the primary asset of biomedical researchers, and the engine for both discovery and research translation. As the volume and complexity of research datasets increase, especially with new technologies such as large single nucleotide polymorphism (SNP) chips, so too does the requirement for software to manage, process and analyze the data. Researchers often need to execute complicated queries and conduct complex analyses of large-scale datasets. Existing tools to analyze such data, and other types of high-dimensional data, unfortunately suffer from one or more major problems. They typically require a high level of computing expertise, are too simplistic (i.e., do not fit realistic models that allow for complex interactions), are limited by computing power, do not exploit the computing power of large-scale parallel architectures (e.g., supercomputers, GPU clusters), or are limited in the types of analysis available, compounded by the fact that integrating new analysis methods is not straightforward. Solutions to these problems, such as those developed and implemented on parallel architectures, are currently available to only a relatively small portion of medical researchers with access and know-how. The past decade has seen a rapid expansion of data management systems for the medical domain. Much attention has been given to systems that manage phenotype datasets generated by medical studies. The introduction of heterogeneous genomic data for research subjects that reside in these systems has highlighted the need for substantial improvements in software architecture. To address this problem, we have developed SPARK, an enabling and translational system for medical research, leveraging existing high performance computing resources, and analysis techniques currently available or being developed. It builds these into The Ark, an open-source web-based system designed to manage medical data. 
SPARK provides a next-generation biomedical data management solution that is based upon a novel Micro-Service architecture and Big Data technologies. The system serves to demonstrate the applicability of Micro-Service architectures for the development of high performance computing applications. When applied to high-dimensional medical datasets such as genomic data, relational data management approaches with normalized data structures suffer from unfeasibly high execution times for basic operations such as insert (i.e. importing a GWAS dataset) and the queries that are typical of the genomics research domain. SPARK resolves these problems by incorporating non-relational NoSQL databases that have been driven by the emergence of Big Data. SPARK provides researchers across the world with user-friendly access to state-of-the-art data management and analysis tools while eliminating the need for high-level informatics and programming skills. The system will benefit health and medical research by eliminating the burden of large-scale data management, querying, cleaning, and analysis. SPARK represents a major advancement in genome research technologies, vastly reducing the burden of working with genomic datasets, and enabling cutting edge analysis approaches that have previously been out of reach for many medical researchers.

Keywords: biomedical research, genomics, information systems, software

Procedia PDF Downloads 270
22994 COVID_ICU_BERT: A Fine-Tuned Language Model for COVID-19 Intensive Care Unit Clinical Notes

Authors: Shahad Nagoor, Lucy Hederman, Kevin Koidl, Annalina Caputo

Abstract:

Doctors’ notes reflect their impressions, attitudes, clinical sense, and opinions about patients’ conditions and progress, and other information that is essential for doctors’ daily clinical decisions. Despite their value, clinical notes are insufficiently researched within the language processing community. Automatically extracting information from unstructured text data is known to be a difficult task, as opposed to dealing with structured information such as vital physiological signs, images, and laboratory results. The aim of this research is to investigate how Natural Language Processing (NLP) and machine learning techniques applied to clinician notes can assist doctors’ decision-making in the Intensive Care Unit (ICU) for coronavirus disease 2019 (COVID-19) patients. The hypothesis is that clinical outcomes like survival or mortality can be useful in influencing the judgement of clinical sentiment in ICU clinical notes. This paper introduces two contributions. First, we introduce COVID_ICU_BERT, a fine-tuned version of clinical transformer models that can reliably predict clinical sentiment for notes of COVID patients in the ICU. We train the model on clinical notes for COVID-19 patients, a type of note not previously seen by clinicalBERT or Bio_Discharge_Summary_BERT. The model, which is based on clinicalBERT, achieves higher predictive accuracy (Acc 93.33%, AUC 0.98, and precision 0.96). Second, we perform data augmentation using clinical contextual word embedding based on a pre-trained clinical model to balance the samples in each class of the data (survived vs. deceased patients). Data augmentation improves the accuracy of prediction slightly (Acc 96.67%, AUC 0.98, and precision 0.92).
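
The class-balancing step can be illustrated with a much simpler stand-in: random oversampling of the minority class. This is not the paper's embedding-based augmentation (which generates new text via contextual word embeddings); it only shows the balancing objective, and the toy data are assumptions.

```python
import random

def oversample_minority(texts, labels, seed=0):
    # Balance a labelled text dataset by duplicating randomly chosen
    # minority-class samples until every class has equal counts.
    rnd = random.Random(seed)
    by_class = {}
    for t, y in zip(texts, labels):
        by_class.setdefault(y, []).append(t)
    target = max(len(v) for v in by_class.values())
    out_t, out_y = [], []
    for y, items in by_class.items():
        padded = items + [rnd.choice(items) for _ in range(target - len(items))]
        out_t += padded
        out_y += [y] * target
    return out_t, out_y

# Toy imbalanced dataset: 10 "survived" notes vs. 3 "deceased" notes.
texts = ["note"] * 10 + ["note"] * 3
labels = [0] * 10 + [1] * 3
bal_texts, bal_labels = oversample_minority(texts, labels)
```

Embedding-based augmentation replaces the `rnd.choice` duplication with synthesized variants, which avoids exact repeats and is why it helps accuracy.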

Keywords: BERT fine-tuning, clinical sentiment, COVID-19, data augmentation

Procedia PDF Downloads 206
22993 The Sapir-Whorf Hypothesis and Multicultural Effects on Translators: A Case Study from Chinese Ethnic Minority Literature

Authors: Yuqiao Zhou

Abstract:

The Sapir-Whorf hypothesis (SWH) emphasizes the effect produced by language on people’s minds. According to linguistic relativity, language has evolved over the course of human life on earth, and, in turn, the acquisition of language shapes learners’ thoughts. Despite much attention drawn by SWH, few scholars have attempted to analyse people’s thoughts via their literary works. And yet, the linguistic choices that create a narrative can enable us to examine its writer’s thoughts. Still, less work has been done on the impact of language on the minds of bilingual people. Internationalization has resulted in an increasing number of bilingual and multilingual individuals. In China, where more than one hundred languages are used for communication, most people are bilingual in Mandarin Chinese (the official language of China) and their own dialect. Taking as its corpus the ethnic minority myth of Ge Sa-er Wang by Alai and its English translation by Goldblatt and Lin, this paper aims to analyse the effects of culture on bilingual people’s minds. It will first analyse Alai’s thoughts on using the original version of Ge Sa-er Wang; next, it will examine the thoughts of the two translators by looking at translation choices made in the English version; finally, it will compare the cultural influences evident in the thoughts of Alai, and Goldblatt and Lin. Whereas Alai can speak two Sino-Tibetan languages – Mandarin Chinese and Tibetan – Goldblatt and Lin can speak two languages from different families – Mandarin Chinese (a Sino-Tibetan language) and English (an Indo-European language). The results reveal two systems of thought existing in the translators’ minds; Alai’s text, on the other hand, does not reveal a significant influence from North China, where Mandarin Chinese originated. The findings reveal the inconsistency of a second language’s influence on people’s minds. 
Notably, they suggest that the more different the two languages are, the greater the influence produced by the second language culture on people’s thoughts. It is hoped that this research will expand the scope of SWH as well as shed light on future translation studies on ethnic minority literature.

Keywords: Sapir-Whorf hypothesis, cultural translation, cultural-specific items, Ge Sa-er Wang, ethnic minority literature, Tibet

Procedia PDF Downloads 115
22992 The Impact of AI on Higher Education

Authors: Georges Bou Ghantous

Abstract:

This literature review examines the transformative impact of Artificial Intelligence (AI) on higher education, highlighting both the potential benefits and challenges associated with its adoption. The review reveals that AI significantly enhances personalized learning by tailoring educational experiences to individual student needs, thereby boosting engagement and learning outcomes. Automated grading systems streamline assessment processes, allowing educators to focus on improving instructional quality and student interaction. AI's data-driven insights provide valuable analytics, helping educators identify trends in at-risk students and refine teaching strategies. Moreover, AI promotes enhanced instructional innovation through the adoption of advanced teaching methods and technologies, enriching the educational environment. Administrative efficiency is also improved as AI automates routine tasks, freeing up time for educators to engage in research and curriculum development. However, the review also addresses the challenges that accompany AI integration, such as data privacy concerns, algorithmic bias, dependency on technology, reduced human interaction, and ethical dilemmas. This balanced exploration underscores the need for careful consideration of both the advantages and potential hurdles in the implementation of AI in higher education.

Keywords: administrative efficiency, data-driven insights, data privacy, ethical dilemmas, higher education, personalized learning

Procedia PDF Downloads 26
22991 Development and Modeling of a Geographic Information System Solar Flux in Adrar, Algeria

Authors: D. Benatiallah, A. Benatiallah, K. Bouchouicha, A. Harouz

Abstract:

The development and operation of renewable energy have seen important growth worldwide, with significant further potential. Estimating the solar radiation at a given terrestrial location is extremely important, first to choose the appropriate site for solar systems (solar power plants for electricity generation, for example) and also for the design and performance analysis of any system using solar energy. In addition, solar radiation measurements are available for only a few areas in Algeria, so theoretical approaches must be used to assess the solar radiation at a given location. The Adrar region is one of the most favorable sites for solar energy use, with an average flux exceeding 7 kWh/m²/day and more than 3500 hours of sunshine per year. Our goal in this work is the creation of a data bank of solar energy data for the Adrar region, by year and by month, and the integration of these data into a geographic information system (GIS) to estimate the solar flux at any location on the map.
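
The theoretical starting point for such estimates is the daily extraterrestrial irradiation H0, which depends only on latitude and day of year. A sketch of the standard astronomy (as in Duffie and Beckman); the latitude used for Adrar (about 27.9 degrees N) is approximate, and real GIS models add atmospheric attenuation on top of this:

```python
import math

def daily_extraterrestrial_radiation(lat_deg, day_of_year):
    # Daily extraterrestrial irradiation H0 on a horizontal surface, in
    # kWh/m^2/day:
    #   H0 = (24*3600/pi) * Gsc * E0 *
    #        (cos(phi)cos(delta)sin(ws) + ws*sin(phi)sin(delta))
    Gsc = 1367.0                                    # solar constant, W/m^2
    phi = math.radians(lat_deg)
    # Solar declination (Cooper's formula) and eccentricity correction E0.
    delta = math.radians(23.45 * math.sin(2 * math.pi * (284 + day_of_year) / 365))
    E0 = 1 + 0.033 * math.cos(2 * math.pi * day_of_year / 365)
    # Sunset hour angle, clipped for polar edge cases.
    ws = math.acos(max(-1.0, min(1.0, -math.tan(phi) * math.tan(delta))))
    H0 = (24 * 3600 / math.pi) * Gsc * E0 * (
        math.cos(phi) * math.cos(delta) * math.sin(ws)
        + ws * math.sin(phi) * math.sin(delta))
    return H0 / 3.6e6                               # J/m^2 -> kWh/m^2/day

h_summer = daily_extraterrestrial_radiation(27.9, 172)   # mid-June
h_winter = daily_extraterrestrial_radiation(27.9, 355)   # late December
```

At Adrar's latitude this gives roughly 11 kWh/m²/day at the top of the atmosphere in June, consistent with the high ground-level flux quoted in the abstract.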

Keywords: Adrar, flow, GIS, deposit potential

Procedia PDF Downloads 374
22990 Mapping a Data Governance Framework to the Continuum of Care in the Active Assisted Living Context

Authors: Gaya Bin Noon, Thoko Hanjahanja-Phiri, Laura Xavier Fadrique, Plinio Pelegrini Morita, Hélène Vaillancourt, Jennifer Teague, Tania Donovska

Abstract:

Active Assisted Living (AAL) refers to systems designed to improve the quality of life, aid in independence, and create healthier lifestyles for care recipients. As the population ages, there is a pressing need for non-intrusive, continuous, adaptable, and reliable health monitoring tools to support aging in place. AAL has great potential to support these efforts with the wide variety of solutions currently available, but insufficient efforts have been made to address concerns arising from the integration of AAL into care. The purpose of this research was to (1) explore the integration of AAL technologies and data into the clinical pathway, and (2) map data access and governance for AAL technology in order to develop standards for use by policy-makers, technology manufacturers, and developers of smart communities for seniors. This was done through four successive research phases: (1) literature search to explore existing work in this area and identify lessons learned; (2) modeling of the continuum of care; (3) adapting a framework for data governance into the AAL context; and (4) interviews with stakeholders to explore the applicability of previous work. Opportunities for standards found in these research phases included a need for greater consistency in language and technology requirements, better role definition regarding who can access and who is responsible for taking action based on the gathered data, and understanding of the privacy-utility tradeoff inherent in using AAL technologies in care settings.

Keywords: active assisted living, aging in place, internet of things, standards

Procedia PDF Downloads 131
22989 Remotely Sensed Data Fusion to Extract Vegetation Cover in the Cultural Park of Tassili, South of Algeria

Authors: Y. Fekir, K. Mederbal, M. A. Hammadouche, D. Anteur

Abstract:

The cultural park of the Tassili, occupying a large area of Algeria, is characterized by a rich vegetative biodiversity to be preserved and managed in both time and space. Managing such a large and complex area requires large amounts of data, most of which are spatially localized (DEM, satellite images, socio-economic information, etc.), and conventional, traditional methods are quite difficult to apply. Remote sensing, owing to its efficiency in environmental applications, has become an indispensable solution for this kind of study. Multispectral imaging sensors have proved very useful over the last decade in many remote sensing applications; they can aid in several domains such as the detection and identification of diverse surface targets, topographical details, and geological features. In this work, we extract vegetated areas using fusion techniques applied to data acquired from the sensor on board the Earth Observing 1 (EO-1) satellite and the Landsat ETM+ and TM sensors, using images acquired over the Oasis of Djanet in the National Park of Tassili in the south of Algeria. Fusion techniques were applied to the obtained image to extract the vegetative fraction of the different land-use classes. We compare the results of vegetation end-member extraction with vegetation indices calculated from both Hyperion and the other multispectral sensors.
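
The vegetation indices used for comparison are simple band ratios. NDVI, the most common one, can be computed per pixel as follows; the 2x2 reflectance values and the 0.4 threshold are illustrative assumptions (thresholds are scene-dependent), not values from the study.

```python
import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index from NIR and red reflectances.
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-12)

# Toy 2x2 "image": vegetation reflects strongly in NIR, bare soil does not.
nir = np.array([[0.50, 0.45], [0.20, 0.15]])
red = np.array([[0.08, 0.10], [0.15, 0.14]])
v = ndvi(nir, red)
mask = v > 0.4          # crude vegetation mask
```

Fusion enters upstream of this step: the pan-sharpened or fused image supplies the band values from which such indices, or end-member fractions, are derived.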

Keywords: Landsat ETM+, EO1, data fusion, vegetation, Tassili, Algeria

Procedia PDF Downloads 433
22988 The Importance of Industrial Work Experience, Career Information, and Work Motivation to Increase Work Readiness

Authors: Nyaris Pambudiyatno, Asto Buditjahjanto, Eppy Yundra, Arie Wardhono, Eko Hariadi

Abstract:

Vocational education is part of the national education system and is designed to produce graduates who have the skills and knowledge required by the job market. It is a form of secondary education that prepares students to work in a particular field. The purpose of this study was to analyze the effect of industrial work experience and career information on work readiness through work motivation. The research is causal in design with a quantitative approach. The population in this study was 359 cadets of the Surabaya Aviation Polytechnic; the sample size, calculated using Slovin's formula, was 189 cadets. Quantitative data were collected from primary sources by distributing questionnaires, and the analysis was performed with LISREL. The findings prove that: (1) industrial work experience has a positive and significant effect on work motivation; (2) industrial work experience has a positive and significant effect on work readiness; (3) career information has a positive and significant effect on work readiness; (4) career information has a positive and significant effect on work motivation; and (5) work motivation has a positive and significant effect on work readiness.
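
The sample-size calculation mentioned above follows Slovin's formula, n = N / (1 + N e²); with the abstract's population of 359 and the conventional 5% margin of error (the margin is an assumption, as the abstract does not state it), this reproduces the reported 189:

```python
def slovin(N: int, e: float = 0.05) -> int:
    # Slovin's formula n = N / (1 + N * e^2), rounded to the nearest integer.
    return round(N / (1 + N * e * e))

n_sample = slovin(359)   # 359 / (1 + 359 * 0.0025) = 189.2 -> 189
```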

Keywords: career information, increase work readiness, industrial work experience, work motivation

Procedia PDF Downloads 137
22987 Regression of Hand Kinematics from Surface Electromyography Data Using an Long Short-Term Memory-Transformer Model

Authors: Anita Sadat Sadati Rostami, Reza Almasi Ghaleh

Abstract:

Surface electromyography (sEMG) offers important insights into muscle activation and has applications in fields including rehabilitation and human-computer interaction. The purpose of this work is to predict the degree of activation of two joints in the index finger using an LSTM-Transformer architecture trained on sEMG data from the Ninapro DB8 dataset. We apply advanced preprocessing techniques, such as multi-band filtering and customizable rectification methods, to enhance the encoding of sEMG data into features that are beneficial for regression tasks. The processed data is converted into spike patterns and simulated using Leaky Integrate-and-Fire (LIF) neuron models, allowing for neuromorphic-inspired processing. Our findings demonstrate that adjusting filtering parameters and neuron dynamics and employing the LSTM-Transformer model improves joint angle prediction performance. This study contributes to the ongoing development of deep learning frameworks for sEMG analysis, which could lead to improvements in motor control systems.
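
The spike-encoding step can be sketched with a textbook leaky integrate-and-fire neuron: the rectified sEMG envelope drives the membrane potential, and stronger muscle activity yields a higher spike rate. The constant input currents, time constant, and threshold below are illustrative assumptions, not the study's parameters.

```python
import numpy as np

def lif_spikes(current, dt=1e-3, tau=0.02, v_th=1.0, v_reset=0.0, r=1.0):
    # Leaky integrate-and-fire: tau * dV/dt = -V + R*I (forward Euler);
    # emit a spike and reset whenever V crosses the threshold.
    v = 0.0
    spikes = np.zeros(len(current), dtype=bool)
    for i, I in enumerate(current):
        v += dt / tau * (-v + r * I)
        if v >= v_th:
            spikes[i] = True
            v = v_reset
    return spikes

# Two constant drive levels standing in for weak vs. strong muscle activity.
weak = lif_spikes(np.full(1000, 1.2))
strong = lif_spikes(np.full(1000, 3.0))
```

The resulting spike trains are what a neuromorphic-inspired pipeline would feed downstream in place of the raw filtered signal.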

Keywords: surface electromyography, LSTM-transformer, spiking neural networks, hand kinematics, leaky integrate-and-fire neuron, band-pass filtering, muscle activity decoding

Procedia PDF Downloads 7
22986 Ex-Post Export Data for Differentiated Products Revealing the Existence of Product Cycles

Authors: Ranajoy Bhattcharyya

Abstract:

We estimate international product cycles as shifting product spaces by using 1976 to 2010 UN Comtrade data on all differentiated tradable products in all countries. We use a product space approach to identify the representative product baskets of high-, middle-, and low-income countries and then use these baskets to identify the patterns of change in countries' comparative advantage over time. We find evidence of a product cycle in two senses: first, high-, middle-, and low-income countries differ in comparative advantage, and high-income products migrate to the middle-income basket; a similar pattern is observed for middle- and low-income countries. Our estimation of the lag shows that middle-income countries tend to take up the products of high-income countries quickly, but low-income countries take longer to absorb these products. Thus, the gap between low- and middle-income countries is considerably larger than that between middle- and high-income nations.
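
Comparative advantage in such trade studies is commonly measured with the Balassa revealed comparative advantage (RCA) index; the abstract does not name its exact metric, so the sketch below (with hypothetical trade figures) only illustrates the standard index:

```python
def balassa_rca(exports, country, product):
    """Balassa revealed comparative advantage:
    RCA = (x_cp / X_c) / (x_wp / X_w), where x_cp is country c's export of
    product p, X_c its total exports, and the denominator is the world's
    share of product p in total world exports. RCA > 1 indicates revealed
    comparative advantage in that product.
    """
    x_cp = exports[country][product]
    x_c = sum(exports[country].values())
    x_wp = sum(c[product] for c in exports.values())
    x_w = sum(sum(c.values()) for c in exports.values())
    return (x_cp / x_c) / (x_wp / x_w)

# Toy two-country, two-product example (hypothetical figures).
exports = {
    "A": {"machinery": 80, "textiles": 20},
    "B": {"machinery": 30, "textiles": 70},
}
print(round(balassa_rca(exports, "A", "machinery"), 3))
```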

Keywords: product cycle, comparative advantage, representative product basket, ex-post data

Procedia PDF Downloads 420
22985 A Comparative Analysis of Innovation Maturity Models: Towards the Development of a Technology Management Maturity Model

Authors: Nikolett Deutsch, Éva Pintér, Péter Bagó, Miklós Hetényi

Abstract:

Strategic technology management has emerged and evolved in parallel with strategic management paradigms. It focuses on the opportunity for organizations, mainly in technology-intensive industries, to explore and exploit the technological capabilities upon which competitive advantage can be built. As strategic technology management spans multiple functions within an organization, requires broad and diversified knowledge, and must be developed and implemented in line with business objectives to enable a firm’s profitability and growth, excellence in it provides unique opportunities for organizations to build a successful future. Accordingly, a framework supporting the evaluation of the technological readiness level of management can contribute significantly to organizational competitiveness through a better understanding of strategic-level capabilities and operational deficiencies. In the last decade, several innovation maturity assessment models have appeared and become designated management tools that can serve as references for practical approaches used by corporate leaders, strategists, and technology managers to understand and manage technological capabilities and capacities. The aim of this paper is to provide a comprehensive review of state-of-the-art innovation maturity frameworks, to investigate the critical lessons learned from their application, to identify the similarities and differences among the models, and to identify the main aspects and elements valid for the field and critical functions of technology management. To this end, a systematic literature review was carried out covering the 27 most widely known innovation maturity models, based on the relevant papers and articles published in highly ranked international journals from four relevant digital sources.
Key findings suggest that despite the diversity of the given models, there is still room for improvement regarding the common understanding of innovation typologies, the full coverage of innovation capabilities, and the generalist approach to the validation and practical applicability of the structure and content of the models. Furthermore, the paper proposes an initial structure by considering the maturity assessment of the technological capacities and capabilities - i.e., technology identification, technology selection, technology acquisition, technology exploitation, and technology protection - covered by strategic technology management.

Keywords: innovation capabilities, innovation maturity models, technology audit, technology management, technology management maturity models

Procedia PDF Downloads 61
22984 Enabling Self-Care and Shared Decision Making for People Living with Dementia

Authors: Jonathan Turner, Julie Doyle, Laura O’Philbin, Dympna O’Sullivan

Abstract:

People living with dementia should be at the centre of decision-making regarding goals for daily living. These goals include basic activities (dressing, hygiene, and mobility), advanced activities (finances, transportation, and shopping), and meaningful activities that promote well-being (pastimes and intellectual pursuits). However, there is limited involvement of people living with dementia in the design of technology to support their goals. A project is described that is co-designing intelligent computer-based support for, and with, people affected by dementia and their carers. The technology will support self-management, empower participation in shared decision-making with carers and help people living with dementia remain healthy and independent in their homes for longer. It includes information from the patient’s care plan, which documents medications, contacts, and the patient's wishes on end-of-life care. Importantly for this work, the plan can outline activities that should be maintained or worked towards, such as exercise or social contact. The authors discuss how to integrate care goal information from such a care plan with data collected from passive sensors in the patient’s home in order to deliver individualized planning and interventions for persons with dementia. A number of scientific challenges are addressed: First, to co-design with dementia patients and their carers computerized support for shared decision-making about their care while allowing the patient to share the care plan. Second, to develop a new and open monitoring framework with which to configure sensor technologies to collect data about whether goals and actions specified for a person in their care plan are being achieved. 
This is developed top-down by associating care quality types and metrics elicited from the co-design activities with types of data that can be collected within the home, from passive and active sensors, and from the patient’s feedback collected through a simple co-designed interface. These activities and data will be mapped to appropriate sensors and technological infrastructure with which to collect the data. Third, the application of machine learning models to analyze data collected via the sensing devices in order to investigate whether and to what extent activities outlined via the care plan are being achieved. The models will capture longitudinal data to track disease progression over time; as the disease progresses and captured data show that activities outlined in the care plan are not being achieved, the care plan may recommend alternative activities. Disease progression may also require care changes, and a data-driven approach can capture changes in a condition more quickly and allow care plans to evolve and be updated.
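
As an illustration of the second challenge, the sketch below shows how care-plan goals might be checked against sensor-derived activity counts. The goal and activity names are hypothetical; the project's actual framework, sensors, and metrics are co-designed and not specified here:

```python
from dataclasses import dataclass

@dataclass
class CareGoal:
    """A care-plan goal expressed as a weekly target for a monitored activity."""
    activity: str          # e.g. "walk", "social_contact" (hypothetical names)
    weekly_target: int     # occurrences per week specified in the care plan

def goals_achieved(goals, sensor_counts):
    """Compare sensor-derived weekly activity counts against care-plan goals.

    Returns a dict mapping each activity to True when the observed count
    meets or exceeds its target, so unmet goals can trigger a plan review.
    """
    return {
        g.activity: sensor_counts.get(g.activity, 0) >= g.weekly_target
        for g in goals
    }

goals = [CareGoal("walk", 5), CareGoal("social_contact", 3)]
observed = {"walk": 6, "social_contact": 1}   # counts from passive sensors
print(goals_achieved(goals, observed))
```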

Keywords: care goals, decision-making, dementia, self-care, sensors

Procedia PDF Downloads 170
22983 Geographic Information System Cloud for Sustainable Digital Water Management: A Case Study

Authors: Mohamed H. Khalil

Abstract:

Water is one of the most crucial elements influencing human lives and development. Notably, over the last few years, GIS has played a significant role in optimizing water management systems, especially after the exponential development of this sector. In this context, the Egyptian government initiated an advanced ‘GIS-Web Based System’. This system is designed to tangibly assist and optimize the integration of data among the Call Center, Operation and Maintenance, and Laboratory departments. The core of this system is a unified ‘Data Model’ for all the spatial and tabular data of the corresponding departments. The system provides advanced functionalities such as interactive data collection, dynamic monitoring, multi-user editing, enhanced data retrieval, an integrated workflow, different access levels, and correlative information records/tracking. Notably, this cost-effective system contributes significantly not only to the completeness of the base map (93%) and of the water network (87%) in a highly detailed GIS format and to the performance of customer service, but also to reducing operating costs in day-to-day operations (by roughly 5-10%). In addition, the proposed system facilitates data exchange between the different departments (Call Center, Operation and Maintenance, and Laboratory), which allows a better understanding and analysis of complex situations. Furthermore, this system is tangibly reflected in: (i) dynamic environmental monitoring of water quality indicators (ammonia, turbidity, TDS, sulfate, iron, pH, etc.), (ii) the improved effectiveness of the different water departments, (iii) efficient deep advanced analysis, (iv) advanced web-reporting tools (daily, weekly, monthly, quarterly, and annual), (v) tangible planning synthesizing spatial and tabular data, and finally, (vi) a scalable decision support system.
It is worth highlighting that the proposed future plan (second phase) of this system encompasses scalability: the system will be extended to integrate with the Billing and SCADA departments. This scalability will add advanced functionalities to the existing ones, allowing further sustainable contributions.

Keywords: GIS Web-Based, base-map, water network, decision support system

Procedia PDF Downloads 96
22982 Advantages of Neural Network Based Air Data Estimation for Unmanned Aerial Vehicles

Authors: Angelo Lerro, Manuela Battipede, Piero Gili, Alberto Brandl

Abstract:

Redundancy requirements for UAVs (Unmanned Aerial Vehicles) are hard to meet because of the generally restricted space and allowable weight available for aircraft systems, which limits their exploitation. Essential equipment such as the Air Data, Attitude and Heading Reference System (ADAHRS) requires several external probes to measure significant data such as the angle of attack or the sideslip angle. Previous research focused on the analysis of a patented technology named Smart-ADAHRS (Smart Air Data, Attitude and Heading Reference System) as an alternative method for obtaining reliable and accurate estimates of the aerodynamic angles. This solution is based on an innovative sensor fusion algorithm implementing soft computing techniques, and it yields a simplified inertial and air data system with fewer external devices: only one external source of dynamic and static pressure is needed. This paper focuses on the benefits that would be gained by implementing this system in UAV applications. A simplification of the entire ADAHRS architecture reduces the overall cost while improving safety performance. Smart-ADAHRS has currently reached Technology Readiness Level (TRL) 6. Real flight tests took place on an ultralight aircraft equipped with suitable Flight Test Instrumentation (FTI). The output of the algorithm on the flight test measurements demonstrates the capability of this fusion algorithm to embed multiple physical and virtual sensors in a single device. Any source of dynamic and static pressure can be integrated with this system, giving a significant improvement in versatility.
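
The Smart-ADAHRS algorithm itself is patented and not detailed in the abstract. As a stand-in illustration of the virtual-sensor idea, the sketch below fits a simple least-squares model (in place of the neural network) that maps a pressure-derived quantity to an aerodynamic angle; the toy flight model and all figures are hypothetical:

```python
import numpy as np

# Synthetic calibration data (hypothetical): angle of attack alpha (deg)
# and the dynamic/static pressure ratio a real system would measure.
alpha = np.linspace(-5.0, 15.0, 41)
pressure_ratio = 0.02 * alpha + 0.001 * alpha**2 + 0.5  # toy, monotonic model

# Fit an inverse model, pressure ratio -> alpha. A cubic least-squares fit
# stands in here for the neural network used by Smart-ADAHRS.
coeffs = np.polyfit(pressure_ratio, alpha, deg=3)
virtual_sensor = np.poly1d(coeffs)

# The "virtual sensor" now estimates the aerodynamic angle from pressure alone.
est = virtual_sensor(pressure_ratio)
print(round(float(np.max(np.abs(est - alpha))), 4))
```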

Keywords: aerodynamic angles, air data system, flight test, neural network, unmanned aerial vehicle, virtual sensor

Procedia PDF Downloads 221
22981 From Madrassah to Elite Schools: The Political Economy of Pluralistic Educational Systems in Pakistan

Authors: Ahmad Zia

Abstract:

This study problematizes the notion that the pluralistic educational system in Pakistan fosters equality. Instead, it argues that this system not only reflects but also sustains existing class divisions, with implications for children's future economic and social mobility. The primary goal of this study is to explore unequal access to educational opportunities in Pakistan. By examining the intersection between education and socioeconomic status, it explores the implications of key disparities across the tiers of Pakistan's education system, such as madrassahs, public schools, and private schools, with an emphasis on how these institutions contribute to the maintenance of class hierarchies. This is a case study based on primary data gathered directly from the units of data collection (UDCs) using qualitative methods. Bourdieu's theory is used as the leading framework, and its application in the context of a country like Pakistan proves very productive. Thematic analysis was chosen to analyse the data; this process helped identify the relevant main themes and subthemes emerging from the data, which form the basis of the analysis. Findings reveal that the educational landscape in Pakistan is deeply divided, with far-reaching implications for social mobility and access to opportunities. The study found profound disparities among various educational institutions that widen socioeconomic divides. Each kind of educational institution operates in a distinct socio-cultural and economic environment. Therefore, access to quality education is highly stratified and remains a privilege for those who can afford it, widening the socioeconomic gap that already exists. The relationship between pluralistic education and class stratification has not been extensively investigated in the literature so far.
This study adds to a multifaceted understanding of educational disparities in Pakistan by analysing the intersections between socioeconomic divisions and educational access. It offers valuable theoretical and practical insights into the subject, providing theoretical concepts and empirical data that enhance scholars' understanding of socioeconomic inequality, specifically in relation to education systems.

Keywords: social inequality, pluralism, class divide, capitalism, globalisation, elitism, education

Procedia PDF Downloads 10
22980 TRACE/FRAPTRAN Analysis of Kuosheng Nuclear Power Plant Dry-Storage System

Authors: J. R. Wang, Y. Chiang, W. Y. Li, H. T. Lin, H. C. Chen, C. Shih, S. W. Chen

Abstract:

The dry-storage systems of nuclear power plants (NPPs) in Taiwan have become one of the major safety concerns. Two steps are considered in this study. The first step is the verification of TRACE using VSC-17 experimental data. The results of TRACE were similar to the VSC-17 data, indicating that TRACE has respectable accuracy in the simulation and analysis of dry-storage systems. The next step is the application of TRACE to the dry-storage system of Kuosheng NPP (BWR/6). Kuosheng NPP is the second BWR NPP of Taiwan Power Company. To address the storage of spent fuel, Taiwan Power Company developed a new dry-storage system for Kuosheng NPP. In this step, the dry-storage system model of Kuosheng NPP was established with TRACE. The steady-state simulation of this model was then performed, and the results of TRACE were compared with the Kuosheng NPP data. Finally, this model was used to perform the safety analysis of the Kuosheng NPP dry-storage system. In addition, FRAPTRAN was used to calculate the transient performance of the fuel rods.

Keywords: BWR, TRACE, FRAPTRAN, dry-storage

Procedia PDF Downloads 519
22979 Evaluating Learning Outcomes in the Implementation of Flipped Teaching Using Data Envelopment Analysis

Authors: Huie-Wen Lin

Abstract:

This study integrated various teaching factors, based on the idea of a flipped classroom, into a financial management course. The study's aim was to establish an effective teaching implementation strategy and an evaluation mechanism for learning outcomes, which can serve as a reference for the future modification of teaching methods. The study implemented a teaching method in five stages and estimated the learning efficiencies of 22 students over two semesters. Subsequently, data envelopment analysis (DEA) was used to compare each student's learning efficiency before and after participation in the flipped classroom (in the first and second semesters, respectively) and to identify the crucial external factors influencing learning efficiency. According to the results, the average overall learning efficiency increased from 0.901 in the first semester to 0.967 in the second, which demonstrates that the flipped classroom approach can improve teaching effectiveness and learning outcomes. The results also revealed a difference in learning efficiency between male and female students.
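
Efficiency scores of the kind reported here can be computed with the input-oriented CCR model of DEA. The sketch below solves its envelopment form as a linear program on toy single-input, single-output data; the study's actual inputs, outputs, and DEA variant are not specified, so everything below is illustrative:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, o):
    """Input-oriented CCR efficiency of decision-making unit (DMU) `o`.

    Envelopment form: minimize theta subject to
        sum_j lam_j * x_j <= theta * x_o   (for each input dimension)
        sum_j lam_j * y_j >= y_o           (for each output dimension)
        lam_j >= 0, theta >= 0.
    """
    x = np.asarray(inputs, dtype=float)    # shape (n_dmus, n_inputs)
    y = np.asarray(outputs, dtype=float)   # shape (n_dmus, n_outputs)
    n = x.shape[0]
    c = np.r_[1.0, np.zeros(n)]            # objective: minimize theta
    # Input rows:  X^T lam - theta * x_o <= 0
    a_in = np.hstack([-x[o].reshape(-1, 1), x.T])
    # Output rows: -Y^T lam <= -y_o
    a_out = np.hstack([np.zeros((y.shape[1], 1)), -y.T])
    res = linprog(c,
                  A_ub=np.vstack([a_in, a_out]),
                  b_ub=np.r_[np.zeros(x.shape[1]), -y[o]],
                  bounds=[(0, None)] * (n + 1))
    return float(res.fun)

# Toy data: one input (e.g. study hours) and one output (e.g. test score).
study_hours = [[2.0], [4.0], [8.0]]
test_scores = [[2.0], [3.0], [4.0]]
print([round(ccr_efficiency(study_hours, test_scores, j), 3) for j in range(3)])
```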

Keywords: data envelopment analysis, flipped classroom, learning outcome, teaching and learning

Procedia PDF Downloads 156
22978 Using RASCAL and ALOHA Codes to Establish an Analysis Methodology for Hydrogen Fluoride Evaluation

Authors: J. R. Wang, Y. Chiang, W. S. Hsu, H. C. Chen, S. H. Chen, J. H. Yang, S. W. Chen, C. Shih

Abstract:

In this study, the RASCAL and ALOHA codes are used to establish an analysis methodology for hydrogen fluoride (HF) evaluation. There are three main steps in this study. First, the UF6 data were collected. Second, one postulated case was analyzed using RASCAL and the UF6 data. This postulated case assumes that a fire occurs and UF6 is released from a building. Third, the HF mass results from RASCAL were used as the input data for ALOHA. Two postulated HF cases were analyzed using the ALOHA code and the results of RASCAL. These postulated cases assume that a fire occurs and HF is released under no-rain (Case 1) or rain (Case 2) conditions. According to the analysis results of ALOHA, the HF concentration of Case 2 is smaller than that of Case 1. The results can serve as a reference for preparing emergency plans for the release of HF.

Keywords: RASCAL, ALOHA, UF₆, hydrogen fluoride

Procedia PDF Downloads 750
22977 Heart Ailment Prediction Using Machine Learning Methods

Authors: Abhigyan Hedau, Priya Shelke, Riddhi Mirajkar, Shreyash Chaple, Mrunali Gadekar, Himanshu Akula

Abstract:

The heart is the coordinating centre of the circulatory system, and diagnosing cardiovascular disease is a difficult but critical task. By extracting knowledge and information about the disease from patient data, data mining is a practical technique to help doctors detect disorders. We use a variety of machine learning methods here, including logistic regression, support vector classifiers (SVC), k-nearest neighbours (KNN), decision tree, random forest, and gradient boosting classifiers. These algorithms are applied to patient data containing 13 different factors to build a system that predicts heart disease in less time and with greater accuracy.
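
A sketch of such a classifier comparison could look like the following; synthetic data stands in for the study's 13-factor patient dataset, which is not provided:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

# Synthetic stand-in for the 13-factor patient data (hypothetical, not clinical).
X, y = make_classification(n_samples=300, n_features=13, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# The six classifier families named in the abstract, with default settings.
models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "svc": SVC(),
    "knn": KNeighborsClassifier(),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "random_forest": RandomForestClassifier(random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {acc:.3f}")
```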

Keywords: logistic regression, support vector classifier, k-nearest neighbour, decision tree, random forest and gradient boosting

Procedia PDF Downloads 51
22976 Slosh Investigations on a Spacecraft Propellant Tank for Control Stability Studies

Authors: Sarath Chandran Nair S, Srinivas Kodati, Vasudevan R, Asraff A. K

Abstract:

Spacecraft generally employ liquid propulsion for attitude and orbital maneuvers or for raising the spacecraft from a geo-transfer orbit to a geosynchronous orbit. Liquid propulsion systems use either monopropellants or bipropellants to generate thrust. These propellants are generally stored in spherical tanks or in cylindrical tanks with spherical end domes. The propellant tanks are provided with a propellant acquisition system/propellant management device, along with vanes and their conical mounting structure, to ensure propellant availability at the outlet for thrust generation even under a low- or zero-gravity environment. Slosh is the free-surface oscillation in partially filled containers under external disturbances. In a spacecraft, these disturbances can arise from control forces and from varying acceleration. Knowledge of slosh, and of the effect of tank internals on it, is essential for understanding stability through control stability studies. Slosh is mathematically represented by a pendulum-mass model, which requires parameters such as slosh frequency, damping, slosh mass, and its location. This paper enumerates various numerical and experimental methods used for evaluating the slosh parameters required to represent slosh. Numerical methods, such as finite element methods based on linear velocity potential theory and computational fluid dynamics based on the Reynolds-averaged Navier-Stokes equations, are used for the detailed evaluation of slosh behavior in one of the spacecraft propellant tanks used in an Indian space mission. Experimental studies carried out on a scaled-down model are also discussed. Slosh parameters evaluated by the different methods matched very well, and their dispersion bands were finalized based on the experimental studies. It is observed that the presence of internals such as the propellant management device, including its conical support structure, alters the slosh parameters. These internals also offer damping one order of magnitude higher than viscous/smooth-wall damping, which is advantageous for slosh stability. These slosh parameters are provided for establishing slosh margins through control stability studies and for finalizing the spacecraft control system design.
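
For reference, linear potential-flow theory gives a closed-form first-mode lateral slosh frequency for an upright cylindrical tank, a standard textbook result. The sketch below evaluates it for hypothetical tank dimensions (the mission tank's geometry is not given in the abstract):

```python
import math

def cyl_slosh_frequency(radius, fill_height, g=9.81, root=1.8412):
    """First-mode lateral slosh natural frequency (Hz) for an upright
    cylindrical tank, from linear potential-flow theory:
        omega^2 = (g * xi / R) * tanh(xi * h / R),
    where xi = 1.8412 is the first root of J1'(x) = 0, R the tank radius,
    and h the liquid fill height.
    """
    omega_sq = (g * root / radius) * math.tanh(root * fill_height / radius)
    return math.sqrt(omega_sq) / (2 * math.pi)

# Example: 0.5 m radius tank filled to 0.6 m (hypothetical dimensions).
print(round(cyl_slosh_frequency(0.5, 0.6), 3))
```

This frequency, together with the slosh mass, its location, and damping, parameterizes the pendulum-mass model used in control stability studies.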

Keywords: control stability, propellant tanks, slosh, spacecraft, spacecraft slosh

Procedia PDF Downloads 245
22975 A Case Study of the Political Determinant of Health on the Public Health Crisis of Malaria in Nigeria

Authors: Bisola Olumegbon

Abstract:

Globally, there were about 229 million cases of malaria in 2022. The sub-Saharan African region accounted for 92% of the reported cases and 94% of deaths. Nigeria had the highest number of malaria cases and deaths, representing 27% of global cases. This scholarly project was a case study guided by the political determinants of health. Thematic analysis, a method used to identify patterns and themes in qualitative data, was applied to triangulated data to identify the political determinants of malaria in Nigeria and to understand how their interaction contributes to the persistence of the disease. The analysis combined deductive and inductive approaches based on the literature review and the evidence of political determinants found in the data. In-depth interviews were used to collect data from frontline personnel. The study findings revealed a correlation between the political determinants of health and malaria management efforts in Nigeria. Influencing factors included voting challenges, inadequate funding, a lack of health prioritization by the government, noncompliance among patients, and hurdles to effective communication. The findings suggest a need to deliberately increase dedication to the political agenda, provide sufficient financial resources, enhance communication, and foster active community involvement in order to address the persistent malaria endemic effectively. Further study is recommended to identify interventions that address the identified political determinants of health in order to reduce malaria in Nigeria. Such interventions must involve collaboration with diverse stakeholders such as policymakers, healthcare professionals, community leaders, and researchers.

Keywords: malaria, malaria management, health worker, stakeholders, political determinant of health

Procedia PDF Downloads 71