Search results for: enterprise data warehouse
22765 Remote Vital Signs Monitoring in Neonatal Intensive Care Unit Using a Digital Camera
Authors: Fatema-Tuz-Zohra Khanam, Ali Al-Naji, Asanka G. Perera, Kim Gibson, Javaan Chahl
Abstract:
Conventional contact-based vital signs monitoring sensors such as pulse oximeters or electrocardiogram (ECG) electrodes may cause discomfort, skin damage, and infections, particularly in neonates with fragile, sensitive skin. Therefore, remote monitoring of vital signs is desired in both clinical and non-clinical settings to overcome these issues. Camera-based vital signs monitoring is a recent technology for these applications with many positive attributes. However, there are still limited camera-based studies on neonates in a clinical setting. In this study, the heart rate (HR) and respiratory rate (RR) of eight infants at the Neonatal Intensive Care Unit (NICU) in Flinders Medical Centre were remotely monitored using a digital camera, applying color- and motion-based computational methods. The region of interest (ROI) was efficiently selected by incorporating an image decomposition method. Furthermore, spatial averaging, spectral analysis, band-pass filtering, and peak detection were used to extract both HR and RR. The experimental results were validated against ground truth data obtained from an ECG monitor and showed strong correlations, with Pearson correlation coefficients (PCC) of 0.9794 and 0.9412 for HR and RR, respectively. The RMSE between the camera-based data and the ECG data was 2.84 beats/min for HR and 2.91 breaths/min for RR. A Bland-Altman analysis also showed close agreement between the two data sets, with mean biases of 0.60 beats/min and 1 breath/min and limits of agreement of -4.9 to +6.1 beats/min and -4.4 to +6.4 breaths/min for HR and RR, respectively. Therefore, video camera imaging may replace conventional contact-based monitoring in the NICU and has potential applications in other contexts such as home health monitoring.
Keywords: neonates, NICU, digital camera, heart rate, respiratory rate, image decomposition
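To illustrate the signal-processing chain described in this abstract (spatial averaging of an ROI, band-pass filtering, spectral analysis, and peak detection), a minimal Python sketch of camera-based heart-rate extraction is given below. It is not the authors' implementation; the frame rate, filter band, and green-channel choice are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_hr(green_trace, fs=30.0, band=(0.7, 3.5)):
    """Estimate heart rate (beats/min) from a spatially averaged
    green-channel trace sampled at fs Hz.

    band: plausible HR range in Hz (0.7-3.5 Hz ~ 42-210 beats/min);
    a neonatal application would likely use a higher band (e.g. 1.5-4 Hz).
    """
    x = np.asarray(green_trace, dtype=float)
    x = x - x.mean()                            # remove the DC component
    b, a = butter(3, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    x = filtfilt(b, a, x)                       # zero-phase band-pass filter
    spectrum = np.abs(np.fft.rfft(x)) ** 2      # power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    peak_freq = freqs[mask][np.argmax(spectrum[mask])]  # dominant frequency
    return 60.0 * peak_freq                     # convert Hz to beats/min

# Usage idea: spatially average the ROI of each frame first, e.g.
# trace = [frame[roi].mean() for frame in green_frames]
# hr = estimate_hr(trace, fs=camera_frame_rate)
```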
Procedia PDF Downloads 104
22764 A Study on Sentiment Analysis Using Various ML/NLP Models on Historical Data of Indian Leaders
Authors: Sarthak Deshpande, Akshay Patil, Pradip Pandhare, Nikhil Wankhede, Rushali Deshmukh
Abstract:
Sentiment analysis is one of the most significant tasks for any language and a key area of NLP that has recently made impressive strides. Several models and datasets are available for this task in popular and widely used languages such as English, Russian, and Spanish. While sentiment analysis research has been performed extensively for such languages, it lags behind for low-resource regional languages such as Hindi and Marathi. Marathi is one of the languages included in the 8th Schedule of the Indian Constitution; it is the third most widely spoken language in the country and is primarily spoken in the Deccan region, which encompasses Maharashtra and Goa. There is insufficient study of sentiment analysis methods for Marathi text due to the lack of available resources and information. Therefore, this project proposes the use of different ML/NLP models for the analysis of Marathi data taken from comments below YouTube content, tweets, and Instagram posts. We aim to achieve a short and precise analysis and summary of the related data using our dataset (dates, names, root words) and lexicons to locate exact information.
Keywords: multilingual sentiment analysis, Marathi, natural language processing, text summarization, lexicon-based approaches
Procedia PDF Downloads 74
22763 A Machine Learning Approach for Performance Prediction Based on User Behavioral Factors in E-Learning Environments
Authors: Naduni Ranasinghe
Abstract:
E-learning environments have become more popular than ever due to the impact of COVID-19. Even though e-learning is one of the best solutions for the teaching-learning process, it is not without major challenges. Machine learning approaches are now used to analyze how behavioral factors lead to better adoption and how they relate to better student performance in e-learning environments. During the pandemic, the academic process in the e-learning approach showed a major issue, especially regarding student performance. Therefore, an approach that investigates student behaviors in e-learning environments using a data-intensive machine learning approach is valuable. A hybrid approach was used to understand how the variables described above are related to one another, and a more quantitative, literature-based approach was used to understand the weight of each factor for adoption and for performance. The dataset was collected from previous research to support the training and testing process. Special attention was paid to incorporating different dimensionalities of the data in order to understand the dependency level of each variable. Five of the twelve independent variables were chosen based on their impact on the dependent variable and on the descriptive statistics. Of the three models developed (Random Forest classifier, SVM, and Decision Tree classifier), the Random Forest classifier gave the highest accuracy (0.8542). Overall, this work met its goal of improving student performance by identifying at-risk and dropout students, emphasizing the necessity of using both static and dynamic data.
Keywords: academic performance prediction, e-learning, learning analytics, machine learning, predictive model
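A minimal sketch of the kind of model comparison described in this abstract (Random Forest vs. SVM vs. decision tree on behavioral features) is given below. The file name, feature names, and label are placeholders, not the study's actual dataset.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Hypothetical behavioral dataset: five selected predictors + performance label
df = pd.read_csv("elearning_behaviour.csv")            # placeholder file name
X = df[["logins", "time_on_platform", "forum_posts",
        "assignments_submitted", "video_views"]]        # assumed predictors
y = df["at_risk"]                                       # assumed binary label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

models = {
    "Random Forest": RandomForestClassifier(random_state=42),
    "SVM": SVC(),
    "Decision Tree": DecisionTreeClassifier(random_state=42),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, model.predict(X_te)))
```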
Procedia PDF Downloads 157
22762 Study of the Process of Climate Change According to Data Simulation Using LARS-WG Software during 2010-2030: Case Study of Semnan Province
Authors: Leila Rashidian
Abstract:
Temperature rise on Earth has had harmful effects on the Earth's surface and has led to changes in precipitation patterns all around the world. The present research aimed to study the process of climate change using simulated future data and to compare these parameters with the current situation at the studied stations in Semnan province, including Garmsar, Shahrood, and Semnan. For this purpose, the LARS-WG software, the HADCM3 model, and the A2 scenario were used for the 2010-2030 period. In this model, daily climatic parameters such as maximum and minimum temperature, precipitation, and radiation were used. The results indicated that there will be a 4.4% increase in precipitation in Semnan province compared with the observed data and, in general, a 1.9% increase in temperature. This temperature rise has a significant impact on precipitation patterns: most precipitation will fall as rain (torrential rain in some cases). According to the results, from west to east, the country will experience a greater temperature rise and will become warmer.
Keywords: climate change, Semnan province, LARS-WG model, climate parameters, HADCM₃ model
Procedia PDF Downloads 252
22761 Estimation of Geotechnical Parameters by Comparing Monitoring Data with Numerical Results: Case Study of Arash–Esfandiar-Niayesh Under-Passing Tunnel, Africa Tunnel, Tehran, Iran
Authors: Aliakbar Golshani, Seyyed Mehdi Poorhashemi, Mahsa Gharizadeh
Abstract:
Under-passing tunnels are strongly influenced by the surrounding soils. Specifying real soil behavior is complex, owing to the many uncertainties in soil properties and, additionally, to inappropriate soil constitutive models. These factors may cause settlements in the numerical analysis that are incompatible with the values obtained during actual construction. This paper reports a case study of a specific tunnel constructed by NATM. The tunnel has a depth of 11.4 m, a height of 12.2 m, and a width of 14.4 m with 2.5 lanes. The numerical modeling was based on a 2D finite element program, and the soil behavior was modeled with the hardening soil model. According to the field observations, the numerically estimated settlement at the ground surface after the entire installation of the initial lining was approximately four times larger than the measured one, indicating that some unknown factors affect the values. Consequently, the geotechnical parameters were revised by a numerical back-analysis using laboratory and field test data together with the obtained monitoring data. The results confirm that the soil parameters are typically, and conservatively, underestimated and, additionally, that the constitutive models cannot be applied properly to all soil conditions.
Keywords: NATM tunnel, initial lining, laboratory test data, numerical back-analysis
Procedia PDF Downloads 361
22760 Data Refinement Enhances The Accuracy of Short-Term Traffic Latency Prediction
Authors: Man Fung Ho, Lap So, Jiaqi Zhang, Yuheng Zhao, Huiyang Lu, Tat Shing Choi, K. Y. Michael Wong
Abstract:
Nowadays, a tremendous amount of data is available in the transportation system, enabling the development of various machine learning approaches to make short-term latency predictions. A natural question is then the choice of relevant information to enable accurate predictions. Using traffic data collected from the Taiwan Freeway System, we consider the prediction of short-term latency of a freeway segment with a length of 17 km covering 5 measurement points, each collecting vehicle-by-vehicle data through the electronic toll collection system. The processed data include the past latencies of the freeway segment with different time lags, the traffic conditions of the individual segments (the accumulations, the traffic fluxes, the entrance and exit rates), the total accumulations, and the weekday latency profiles obtained by Gaussian process regression of past data. We arrive at several important conclusions about how data should be refined to obtain accurate predictions, which have implications for future system-wide latency predictions. (1) We find that the prediction of median latency is much more accurate and meaningful than the prediction of average latency, as the latter is plagued by outliers. This is verified by machine-learning prediction using XGBoost that yields a 35% improvement in the mean square error of the 5-minute averaged latencies. (2) We find that the median latency of the segment 15 minutes ago is a very good baseline for performance comparison, and we have evidence that further improvement is achieved by machine learning approaches such as XGBoost and Long Short-Term Memory (LSTM). (3) By analyzing the feature importance score in XGBoost and calculating the mutual information between the inputs and the latencies to be predicted, we identify a sequence of inputs ranked in importance. It confirms that the past latencies are most informative of the predicted latencies, followed by the total accumulation, whereas inputs such as the entrance and exit rates are uninformative. It also confirms that the inputs are much less informative of the average latencies than the median latencies. (4) For predicting the latencies of segments composed of two or three sub-segments, summing up the predicted latencies of each sub-segment is more accurate than the one-step prediction of the whole segment, especially with the latency prediction of the downstream sub-segments trained to anticipate latencies several minutes ahead. The duration of the anticipation time is an increasing function of the traveling time of the upstream segment. The above findings have important implications for predicting the full set of latencies among the various locations in the freeway system.
Keywords: data refinement, machine learning, mutual information, short-term latency prediction
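The sketch below illustrates point (2) of this abstract under stated assumptions: an XGBoost regressor trained on lagged median latencies and total accumulation, compared against the 15-minutes-ago baseline. Column and file names are placeholders, not the authors' data.

```python
import pandas as pd
import xgboost as xgb
from sklearn.metrics import mean_squared_error

# df is assumed to hold 5-minute records with past median latencies at several
# lags, the total accumulation, and the median latency to be predicted.
df = pd.read_csv("freeway_segment.csv")                 # placeholder file name
features = ["lat_lag5", "lat_lag10", "lat_lag15", "total_accumulation"]
train, test = df.iloc[:-2000], df.iloc[-2000:]          # time-ordered split

model = xgb.XGBRegressor(n_estimators=300, max_depth=6, learning_rate=0.05)
model.fit(train[features], train["target"])
pred = model.predict(test[features])

# Baseline: the median latency observed 15 minutes earlier
baseline = test["lat_lag15"]
print("XGBoost MSE :", mean_squared_error(test["target"], pred))
print("Baseline MSE:", mean_squared_error(test["target"], baseline))
print("Feature importances:", dict(zip(features, model.feature_importances_)))
```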
Procedia PDF Downloads 169
22759 Influential Parameters in Estimating Soil Properties from Cone Penetrating Test: An Artificial Neural Network Study
Authors: Ahmed G. Mahgoub, Dahlia H. Hafez, Mostafa A. Abu Kiefa
Abstract:
The Cone Penetration Test (CPT) is a common in-situ test which generally investigates a much greater volume of soil more quickly than is possible from sampling and laboratory tests. Therefore, it has the potential to realize both cost savings and rapid, continuous assessment of soil properties. The principal objective of this paper is to demonstrate the feasibility and efficiency of using artificial neural networks (ANNs) to predict the soil angle of internal friction (Φ) and the soil modulus of elasticity (E) from CPT results, considering the uncertainties and non-linearities of the soil. In addition, ANNs are used to study the influence of different parameters and to recommend which parameters should be included as inputs to improve the prediction. Neural networks discover relationships in the input data sets through the iterative presentation of the data and the intrinsic mapping characteristics of neural topologies. The General Regression Neural Network (GRNN), one of the most powerful neural network architectures, is utilized in this study. A large amount of field and experimental data, including CPT results, plate load tests, direct shear box tests, grain size distributions, and calculated overburden pressures, was obtained from a large project in the United Arab Emirates. These data were used for the training and validation of the neural network. A comparison was made between the results obtained from the ANN approach and from some common traditional correlations that predict Φ and E from CPT results, with respect to the actual measured values. The results show that the ANN is a very powerful tool: very good agreement was obtained between the ANN estimates and the actual measurements in comparison with other correlations available in the literature. The study recommends some easily available parameters that should be included in the estimation of the soil properties to improve the prediction models. It is shown that the use of the friction ratio in the estimation of Φ and the use of the fines content in the estimation of E considerably improve the prediction models.
Keywords: angle of internal friction, cone penetrating test, general regression neural network, soil modulus of elasticity
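The General Regression Neural Network mentioned in this abstract is essentially a Nadaraya-Watson kernel regressor with a single smoothing parameter. The following minimal sketch (not the authors' implementation; the input features are assumed) shows the idea.

```python
import numpy as np

class GRNN:
    """Minimal General Regression Neural Network (Nadaraya-Watson form).
    sigma is the smoothing parameter of the Gaussian kernel."""

    def __init__(self, sigma=0.5):
        self.sigma = sigma

    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y, dtype=float)
        return self

    def predict(self, X_new):
        preds = []
        for x in np.asarray(X_new, dtype=float):
            d2 = np.sum((self.X - x) ** 2, axis=1)       # squared distances
            w = np.exp(-d2 / (2.0 * self.sigma ** 2))    # Gaussian weights
            preds.append(np.dot(w, self.y) / (w.sum() + 1e-12))
        return np.array(preds)

# Hypothetical usage: predict the friction angle from CPT-derived inputs,
# e.g. X columns = [cone resistance, sleeve friction, friction ratio, depth]
# grnn = GRNN(sigma=0.3).fit(X_train, phi_train)
# phi_pred = grnn.predict(X_test)
```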
Procedia PDF Downloads 415
22758 The Relationships between Second Language Proficiency (L2) and Interpersonal Relationships of Students and Teachers: Pilot Study in Wenzhou-Kean University
Authors: Hu Yinyao
Abstract:
Learning and using a second language have become more and more common in daily life. Understanding the complexity of second language proficiency can help students develop their interpersonal relationships with their friends and professors and even enhance intimacy. This paper examines Wenzhou-Kean University students' second language proficiency and interpersonal relationships. The purpose of the research was to explore the relationship between second language proficiency, the extent of intimacy, and the interpersonal relationships of 100 Wenzhou-Kean University students. A mixed methodology was utilized, and student respondents were chosen by random sampling. The data analysis used descriptive data presented in figures and thematic data presented in tables. The researcher found that Wenzhou-Kean University students show a lower-intermediate level of second language proficiency and that their intimacy is moderate when using a second language; in particular, when talking about sensitive topics, students tend not to use a second language due to low proficiency. This research project has strong implications for interpersonal relationships and second language proficiency. The outcome of the study would be greatly helpful for enhancing the interpersonal relationships and intimacy between students, and between students and professors, who use a second language.
Keywords: interpersonal relationship, second language proficiency, intimacy, education, university students
Procedia PDF Downloads 43
22757 Recreation and Environmental Quality of Tropical Wetlands: A Social Media Based Spatial Analysis
Authors: Michael Sinclair, Andrea Ghermandi, Sheela A. Moses, Joseph Sabu
Abstract:
Passively crowdsourced data, such as geotagged photographs from social media, represent an opportunistic source of location-based and time-specific behavioral data for ecosystem services analysis. Such data have innovative applications for environmental management and protection, which are replicable at wide spatial scales and in the context of both developed and developing countries. Here we test one such innovation, based on the analysis of the metadata of online geotagged photographs, to investigate the provision of recreational services by the entire network of wetland ecosystems in the state of Kerala, India. We estimate visitation to individual wetlands state-wide and extend, for the first time to a developing region, the emerging application of cultural ecosystem services modelling using data from social media. The impacts of restoration of wetland areal extension and water quality improvement are explored as a means to inform more sustainable management strategies. Findings show that improving water quality to a level suitable for the preservation of wildlife and fisheries could increase annual visits by 350,000, an increase of 13% in wetland visits state-wide, while restoring previously encroached wetland area could result in a 7% increase in annual visits, corresponding to 49,000 visitors, in the Ashtamudi and Vembanad lakes alone, two large coastal Ramsar wetlands in Kerala. We discuss how passive crowdsourcing of social media data has the potential to improve current ecosystem service analyses and environmental management practices also in the context of developing countries.
Keywords: coastal wetlands, cultural ecosystem services, India, passive crowdsourcing, social media, wetland restoration
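One common way to turn geotagged-photo metadata into a visitation proxy is to count photo-user-days (each user counted at most once per site per day). The sketch below is an illustration of that idea under assumed column names and an assumed calibration constant; it is not the authors' workflow.

```python
import pandas as pd

# photos: one row per geotagged photo, with columns assumed to be
# user_id, timestamp, wetland_id (the wetland polygon containing the photo)
photos = pd.read_csv("geotagged_photos.csv", parse_dates=["timestamp"])
photos["date"] = photos["timestamp"].dt.date

# Photo-user-days: count each user at most once per wetland per day
pud = (photos.drop_duplicates(["wetland_id", "user_id", "date"])
             .groupby("wetland_id")
             .size()
             .rename("photo_user_days"))

# A simple visitation estimate scales PUD by an empirically calibrated factor
VISITS_PER_PUD = 120              # placeholder calibration constant
visits = pud * VISITS_PER_PUD
print(visits.sort_values(ascending=False).head())
```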
Procedia PDF Downloads 155
22756 Landscape Classification in North of Jordan by Integrated Approach of Remote Sensing and Geographic Information Systems
Authors: Taleb Odeh, Nizar Abu-Jaber, Nour Khries
Abstract:
The southern part of the Wadi Al Yarmouk catchment area covers the north of Jordan. It is located within latitudes 32° 20’ to 32° 45’ N and longitudes 35° 42’ to 36° 23’ E and has an area of about 1426 km². It has high-relief topography, with elevations varying between 50 and 1100 meters above sea level. The variations in topography produce different landform units, climatic zones, land covers, and plant species; as a result, different landscape units exist in the region. Spatial planning is a major challenge in such a vital area for Jordan and cannot be achieved without determining the landscape units. An integrated approach of remote sensing and geographic information systems (GIS) is an optimal tool for investigating and mapping the landscape units of such a complicated area. Remote sensing has the capability to collect different land surface data for large landscape areas accurately and over different time periods, while GIS has the ability to store these land surface data, analyze them spatially, and present them in the form of professional maps. We generated geo-land-surface data that include land cover, rock units, soil units, plant species, and a digital elevation model using ASTER imagery and Google Earth, while the spatial analysis of the geo-data was done with ArcGIS 10.2 software. We found twenty-two different landscape units in the study area, which have to be considered in any spatial planning in order to avoid environmental problems.
Keywords: landscape, spatial planning, GIS, spatial analysis, remote sensing
Procedia PDF Downloads 528
22755 BingleSeq: A User-Friendly R Package for Single-Cell RNA-Seq Data Analysis
Authors: Quan Gu, Daniel Dimitrov
Abstract:
BingleSeq was developed as a Shiny-based, intuitive, and comprehensive application that enables the analysis of single-cell RNA-sequencing count data. This was achieved by incorporating three state-of-the-art software packages for each type of RNA sequencing analysis, alongside functional annotation analysis and a way to assess the overlap of differential expression method results. In its current state, the functionality implemented within BingleSeq is comparable to that of other applications also developed with the purpose of lowering the entry requirements to RNA sequencing analyses. BingleSeq is available on GitHub and will be submitted to R/Bioconductor.
Keywords: bioinformatics, functional annotation analysis, single-cell RNA-sequencing, transcriptomics
Procedia PDF Downloads 205
22754 Applications of Nonlinear Models to Measure and Predict Thermo Physical Properties of Binary Liquid Mixtures: 1, 4 Dioxane with Bromo Benzene at Various Temperatures
Authors: R. Ramesh, M. Y. M. Yunus, K. Ramesh
Abstract:
The viscosities, η, and densities, ρ, of 1,4-dioxane with bromobenzene were measured at different mole fractions and various temperatures under atmospheric pressure. From the experimental data, the excess molar volumes, VE, and the deviations in viscosity, Δη, of the mixtures, including values at infinite dilution, were obtained. The measured systems exhibited positive values of VE and negative values of Δη. The binary mixture 1,4-dioxane + bromobenzene shows positive VE and negative Δη with increasing temperature. These outcomes clearly indicate that weak interactions are present in the mixture, mainly because of the number and position of the methyl groups in these aromatic hydrocarbons. The measured data were fitted to nonlinear models to derive the binary coefficients, and the standard deviations between the fitted outcomes and the calculated data help to characterize the mixing behavior of the binary mixtures. It can be concluded that, in our case, the data correlate very well with the values obtained from the corresponding models. The molecular interactions existing between the components and a comparison of the liquid mixtures are also discussed.
Keywords: 1,4-dioxane, bromobenzene, density, excess molar volume
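For reference, the excess molar volume and viscosity deviation discussed in this abstract follow the standard definitions, and such data are commonly fitted with a Redlich-Kister-type polynomial; the abstract does not name its nonlinear model, so that choice below is an assumption. A minimal Python sketch:

```python
import numpy as np

def excess_molar_volume(x1, rho_mix, rho1, rho2, M1, M2):
    """V^E = (x1*M1 + x2*M2)/rho_mix - x1*M1/rho1 - x2*M2/rho2  (cm^3/mol)."""
    x2 = 1.0 - x1
    return (x1 * M1 + x2 * M2) / rho_mix - x1 * M1 / rho1 - x2 * M2 / rho2

def viscosity_deviation(x1, eta_mix, eta1, eta2):
    """Delta_eta = eta_mix - (x1*eta1 + x2*eta2)."""
    return eta_mix - (x1 * eta1 + (1.0 - x1) * eta2)

def redlich_kister_fit(x1, Y, order=3):
    """Fit Y = x1*x2 * sum_i A_i*(x1 - x2)^i; return coefficients and the
    standard deviation of the fit (assumed form of the 'nonlinear model')."""
    x2 = 1.0 - x1
    basis = np.column_stack([x1 * x2 * (x1 - x2) ** i for i in range(order + 1)])
    A, *_ = np.linalg.lstsq(basis, Y, rcond=None)
    residuals = Y - basis @ A
    sigma = np.sqrt(np.sum(residuals ** 2) / max(len(Y) - (order + 1), 1))
    return A, sigma
```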
Procedia PDF Downloads 412
22753 Benchmarking Service Quality among Quick-Service Restaurants towards Service Innovations
Authors: Scott Earthy Baldo, Anna Cred Patricia Barroma, Miguel Angelo Eñano, John Ares Hipolito, Orange Sundra Sison, Rixielle Gwendale Tumambing
Abstract:
Service innovation is the introduction of new ways of delivering service to customers with the intention of improving existing service quality and attracting more customers. This research paper aims to identify the various service practices being implemented at the different quick-service restaurants along Morayta Street, Manila, Philippines, and to compare each establishment with the best in the industry through benchmarking towards service innovations. To gather valuable data, a mixed-method approach was used: qualitative data were taken from the managers of each establishment, indicating the service practices being used, and quantitative data were collected from customers and employees regarding their perception of the present service quality of each selected quick-service restaurant, in line with the current service innovations being implemented. This research was conducted in order to discern which service practices are effective in attracting customers and boosting their satisfaction, for future reference by practitioners who are planning to manage a quick-service restaurant and by students studying in the field of hospitality, specifically service.
Keywords: benchmarking, quick-service restaurants, service innovations, service quality
Procedia PDF Downloads 373
22752 A Numerical Description of a Fibre Reinforced Concrete Using a Genetic Algorithm
Authors: Henrik L. Funke, Lars Ulke-Winter, Sandra Gelbrich, Lothar Kroll
Abstract:
This work reports on an approach for the automatic adaptation of concrete formulations based on genetic algorithms (GA) to optimize a wide range of different fit functions. To achieve this goal, a method was developed which provides a numerical description of a fibre reinforced concrete (FRC) mixture regarding the production technology and the property spectrum of the concrete. In a first step, the FRC mixture with seven fixed components was characterized by varying the amounts of the components; for that purpose, ten concrete mixtures were prepared and tested. The testing procedure comprised flow spread, compressive strength, and bending tensile strength. The analysis and approximation of the measured data were carried out by GAs. The aim was to obtain a closed mathematical expression which best describes the given seven-point cloud of FRC by applying a Gene Expression Programming with Free Coefficients (GEP-FC) strategy. The seven-parameter FRC-mixture model generated by this method correlated well with the measured data. The developed procedure can be used to find closed mathematical expressions for concrete mixtures based on measured data.
Keywords: concrete design, fibre reinforced concrete, genetic algorithms, GEP-FC
Procedia PDF Downloads 280
22751 Prediction of Thermodynamic Properties of N-Heptane in the Critical Region
Authors: Sabrina Ladjama, Aicha Rizi, Azzedine Abbaci
Abstract:
In this work, we use the crossover model to formulate a comprehensive fundamental equation of state for the thermodynamic properties of several n-alkanes in the critical region that extends to the classical region. This equation of state is constructed on the basis of a comparison of selected measurements of pressure-density-temperature data and isochoric and isobaric heat capacities. The model can be applied in a wide range of temperatures and densities around the critical point for n-heptane. It is found that the developed model represents most of the reliable experimental data accurately.
Keywords: crossover model, critical region, fundamental equation, n-heptane
Procedia PDF Downloads 475
22750 Is There a Group of "Digital Natives" at Secondary Schools?
Authors: L. Janská, J. Kubrický
Abstract:
The article describes research focused on the influence of information and communication technology (ICT) on pupils' learning. The investigation deals with the characteristics that distinguish the group of pupils influenced by ICT from the group of pupils not influenced by ICT. The group influenced by ICT should evince a different approach in a number of areas (in managing two or more activities at once, in quick orientation and searching for information on the Internet, in the ability to quickly and effectively assess data sources, in the assessment of the attitudes and opinions of other users of the network, in critical thinking, in the preference for working in teams, in the sharing of information and personal data via virtual social networking, in insisting on an immediate reaction to their every action, etc.).
Keywords: ICT influence, digital natives, pupil's learning
Procedia PDF Downloads 291
22749 Exploring Disruptive Innovation Capacity Effects on Firm Performance: An Investigation in Industries 4.0
Authors: Selma R. Oliveira, E. W. Cazarini
Abstract:
Recently, studies have referenced innovation as a key factor affecting the performance of firms. Companies make use of their innovative capacities to achieve sustainable competitive advantage. From this perspective, the objective of this paper is to contribute to innovation planning policies in Industry 4.0; thus, the paper examines the effect of disruptive innovation capacity on firm performance in Europe. The procedure comprised the following phases: Phase 1, determination of the conceptual model, and Phase 2, verification of the conceptual model. The research was initially based on the specialized literature, from which data regarding the constructs, structure, and content were extracted in order to build the model. The research also involved experts knowledgeable about the object studied, selected by technical-scientific criteria, and their assessments were extracted using an assessment matrix. To reduce subjectivity in the results, the following methods were used complementarily and in combination: multicriteria analysis, multivariate analysis, psychometric scaling, and neurofuzzy technology. The results were satisfactory, validating the modeling approach.
Keywords: disruptive innovation, capacity, performance, Industry 4.0
Procedia PDF Downloads 165
22748 Determining the Extent and Direction of Relief Transformations Caused by Ski Run Construction Using LIDAR Data
Authors: Joanna Fidelus-Orzechowska, Dominika Wronska-Walach, Jaroslaw Cebulski
Abstract:
Mountain areas are very often exposed to numerous transformations connected with the development of tourist infrastructure. In mountain areas in Poland, ski tourism is very popular, so agricultural areas are often transformed into tourist areas. The construction of new ski runs can change the direction and rate of slope development. The main aim of this research was to determine the geomorphological and hydrological changes within slopes caused by ski run construction. The study was conducted in the Remiaszów catchment in the Inner Polish Carpathians (southern Poland). The mean elevation of the catchment is 859 m a.s.l. and the maximum is 946 m a.s.l. The surface area of the catchment is 1.16 km², of which 16.8% is the area of the two studied ski runs. The studied ski runs were constructed in 2014 and 2015. In order to determine the relief transformations connected with the new ski run construction, high-resolution LIDAR data were analyzed. The general relief changes in the studied catchment were determined on the basis of ALS (Airborne Laser Scanning) data obtained before (2013) and after (2016) ski run construction. Based on the two sets of ALS data, a digital elevation model of difference (DoD) was created, which made it possible to determine the quantitative relief changes in the entire studied catchment. Additionally, cross and longitudinal profiles were calculated within slopes where the new ski runs were built. Detailed data on relief changes within selected test surfaces were obtained from TLS (Terrestrial Laser Scanning). Hydrological changes within the analyzed catchment were determined based on the convergence and divergence index. The study shows that the construction of the new ski runs caused significant geomorphological and hydrological changes in the entire studied catchment; however, the most important changes were identified within the ski slopes. After the construction of the ski runs, the entire catchment area was lowered by about 0.02 m. Hydrological changes in the studied catchment mainly led to the interruption of surface runoff pathways and to changes in runoff direction and geometry.
Keywords: hydrological changes, mountain areas, relief transformations, ski run construction
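The core of the ALS comparison described here is a DEM of Difference (DoD): subtracting the pre-construction surface from the post-construction one. The sketch below is a simplified illustration with an assumed vertical-accuracy threshold and cell size, not the authors' processing chain.

```python
import numpy as np

def dem_of_difference(dem_before, dem_after, threshold=0.05):
    """Compute a DEM of Difference (DoD) from two co-registered rasters.

    Positive cells indicate deposition / fill, negative cells erosion / cut.
    threshold (m) masks changes below the assumed vertical accuracy of ALS.
    """
    dod = np.asarray(dem_after, dtype=float) - np.asarray(dem_before, dtype=float)
    dod[np.abs(dod) < threshold] = 0.0
    return dod

def dod_summary(dod, cell_size=1.0):
    """Catchment-wide summary statistics (cell_size in metres)."""
    cell_area = cell_size ** 2
    return {
        "mean_elevation_change_m": float(dod.mean()),
        "cut_volume_m3": float(-dod[dod < 0].sum() * cell_area),
        "fill_volume_m3": float(dod[dod > 0].sum() * cell_area),
    }

# Usage idea: load the 2013 and 2016 ALS-derived DEMs as numpy arrays,
# then: summary = dod_summary(dem_of_difference(dem_2013, dem_2016))
```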
Procedia PDF Downloads 143
22747 Net Fee and Commission Income Determinants of European Cooperative Banks
Authors: Karolína Vozková, Matěj Kuc
Abstract:
Net fee and commission income is one of the key elements of a bank’s core income. In the current low-interest rate environment, this type of income is gaining importance relative to net interest income. This paper analyses the effects of bank- and country-specific determinants of net fee and commission income on a set of cooperative banks from European countries in the 2007-2014 period. In order to do that, dynamic panel data methods (system Generalized Method of Moments) were employed. Subsequently, alternative panel data methods were run as robustness checks of the analysis. A strong positive impact of bank concentration on the share of net fee and commission income was found, which proves that cooperative banks tend to display a higher share of fee income in less competitive markets. This is probably connected with the fact that they stick with their traditional deposit-taking and loan-providing model and fees on these services are driven down by the competitors. Moreover, compared to commercial banks, cooperatives do not expand heavily into non-traditional fee-bearing services under competition, and their overall fee income share is therefore decreasing with the increased competitiveness of the sector.
Keywords: cooperative banking, dynamic panel data models, net fee and commission income, system GMM
Procedia PDF Downloads 330
22746 Feature Weighting Comparison Based on Clustering Centers in the Detection of Diabetic Retinopathy
Authors: Kemal Polat
Abstract:
In this paper, three feature weighting methods have been used to improve the classification performance for diabetic retinopathy (DR). To classify diabetic retinopathy, features extracted from the output of several retinal image processing algorithms, such as image-level, lesion-specific, and anatomical components, have been used and fed into the classifier algorithms. The dataset used in this study was taken from the University of California, Irvine (UCI) machine learning repository. Feature weighting methods including fuzzy c-means clustering based feature weighting, subtractive clustering based feature weighting, and Gaussian mixture clustering based feature weighting have been used and compared with each other in the classification of DR. After feature weighting, five different classifier algorithms comprising multi-layer perceptron (MLP), k-nearest neighbor (k-NN), decision tree, support vector machine (SVM), and Naïve Bayes have been used. The hybrid method based on the combination of subtractive clustering based feature weighting and a decision tree classifier obtained a classification accuracy of 100% in the screening of DR. These results demonstrate that the proposed hybrid scheme is very promising for medical data set classification.
Keywords: machine learning, data weighting, classification, data mining
Procedia PDF Downloads 325
22745 Robust Heart Rate Estimation from Multiple Cardiovascular and Non-Cardiovascular Physiological Signals Using Signal Quality Indices and Kalman Filter
Authors: Shalini Rankawat, Mansi Rankawat, Rahul Dubey, Mazad Zaveri
Abstract:
Physiological signals such as electrocardiogram (ECG) and arterial blood pressure (ABP) in the intensive care unit (ICU) are often seriously corrupted by noise, artifacts, and missing data, which lead to errors in the estimation of heart rate (HR) and incidences of false alarms from ICU monitors. Clinical support in the ICU requires the most reliable heart rate estimation. Cardiac activity, because of its relatively high electrical energy, may introduce artifacts into electroencephalogram (EEG), electrooculogram (EOG), and electromyogram (EMG) recordings. This paper presents a robust heart rate estimation method based on the detection of R-peaks of ECG artifacts in EEG, EMG, and EOG signals, using an energy-based function and a novel signal quality index (SQI) assessment technique. The SQIs of the physiological signals (EEG, EMG, and EOG) were obtained by correlating the nonlinear energy operator (Teager energy) of these signals with either the ECG or the ABP signal. HR is estimated from the ECG, ABP, EEG, EMG, and EOG signals using separate Kalman filters based upon the individual SQIs. Data fusion of the HR estimates was then performed by weighting each estimate by the Kalman filters’ SQI-modified innovations. The fused HR estimate is more accurate and robust than any of the individual HR estimates. This method was evaluated on the MIMIC II database from PhysioNet, containing bedside monitor recordings of ICU patients. The method provides an accurate HR estimate even in the presence of noise and artifacts.
Keywords: ECG, ABP, EEG, EMG, EOG, ECG artifacts, Teager-Kaiser energy, heart rate, signal quality index, Kalman filter, data fusion
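A highly simplified sketch of SQI-aware Kalman filtering and fusion is given below for illustration only: each channel's measurement noise is inflated when its SQI is low, and the per-channel estimates are then combined by inverse-variance weighting. The exact innovation-based weighting used by the authors is not reproduced here, and all parameters are assumptions.

```python
import numpy as np

def kalman_hr(measurements, sqi, q=0.1, r0=4.0):
    """Track HR from one channel with a 1-D Kalman filter whose measurement
    noise is inflated when the signal quality index (SQI in [0, 1]) is low."""
    x, p = float(measurements[0]), 1.0
    estimates, variances = [], []
    for z, s in zip(measurements, sqi):
        p += q                                   # predict (random-walk model)
        r = r0 / max(s, 1e-3)                    # low SQI -> large noise
        k = p / (p + r)                          # Kalman gain
        x += k * (z - x)                         # update with the innovation
        p *= (1.0 - k)
        estimates.append(x)
        variances.append(p)
    return np.array(estimates), np.array(variances)

def fuse(estimates, variances):
    """Inverse-variance weighted fusion of per-channel HR estimates."""
    w = 1.0 / (np.asarray(variances) + 1e-9)
    return np.sum(w * estimates, axis=0) / np.sum(w, axis=0)

# hr_ecg, var_ecg = kalman_hr(hr_from_ecg, sqi_ecg)
# hr_abp, var_abp = kalman_hr(hr_from_abp, sqi_abp)
# hr_fused = fuse(np.vstack([hr_ecg, hr_abp]), np.vstack([var_ecg, var_abp]))
```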
Procedia PDF Downloads 696
22744 The On-Board Critical Message Transmission Design for Navigation Satellite Delay/Disruption Tolerant Network
Authors: Ji-yang Yu, Dan Huang, Guo-ping Feng, Xin Li, Lu-yuan Wang
Abstract:
The navigation satellite network, especially the Beidou MEO constellation, can relay data effectively with wide coverage and is widely applied in navigation, detection, and positioning. However, the constellation has not been completed, and the number of satellites in orbit is not enough to cover the Earth, so data relay is disrupted or delayed during the transmission process. The data-relay function needs to tolerate such delay or disruption to some extent, which makes the Beidou MEO constellation a delay/disruption-tolerant network (DTN). Traditional DTN designs mainly employ a relay table as the basis for computing the data path schedule. But in practical applications, especially in critical conditions such as wartime or when heavy losses are inflicted on the constellation, some nodes may become invalid, and the traditional DTN design could then be useless. Furthermore, when transmitting a critical message in the navigation system, the maximum-priority strategy is used, but the nodes still query the relay table to design the path, which introduces delays of more than minutes. Under these circumstances, a function is needed that can compute the optimum data path on board in real time according to the constellation states. An on-board critical message transmission design for the navigation satellite delay/disruption-tolerant network (DTN) is proposed, according to the characteristics of the navigation satellite network. With real-time computation of the parameters of the network links, the least-delay transmission path is deduced to retransmit the critical message in urgent conditions. First, the DTN model for the constellation is established based on a time-varying matrix (TVM) instead of a time-varying graph (TVG); then, the least-delay data path is deduced from the parameters of the current node; at last, the critical message transits to the next best node. With on-board real-time computing, the time delays and misjudgments of constellation states at ground stations are eliminated, and the residual information channel of each node can be used flexibly. Compared with the minutes of delay of a traditional DTN, the proposed design transmits the critical message in seconds, which improves the retransmission efficiency. The hardware is implemented in an FPGA based on the proposed model, and tests prove its validity.
Keywords: critical message, DTN, navigation satellite, on-board, real-time
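A least-delay relay path over the current link states can be computed on board with a shortest-path search over the delay matrix derived from the time-varying matrix (TVM). The sketch below is a generic Dijkstra illustration of that idea, not the proposed FPGA design.

```python
import heapq

def least_delay_path(delay, src, dst):
    """Dijkstra over the current link-delay matrix.

    delay[i][j] is the present relay delay (s) from node i to node j,
    or float('inf') if the link is unavailable (failed node, out of view).
    Returns (total_delay, path) for the least-delay relay route.
    """
    n = len(delay)
    dist = [float("inf")] * n
    prev = [None] * n
    dist[src] = 0.0
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist[u]:
            continue                              # stale heap entry
        for v in range(n):
            nd = d + delay[u][v]
            if nd < dist[v]:
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [], dst
    while node is not None:                       # reconstruct the route
        path.append(node)
        node = prev[node]
    return dist[dst], path[::-1]
```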
Procedia PDF Downloads 343
22743 Analysis of Diabetes Patients Using Pearson, Cost Optimization, Control Chart Methods
Authors: Devatha Kalyan Kumar, R. Poovarasan
Abstract:
In this paper, we consider certain important factors and health parameters of diabetes patients, especially children with congenital (pediatric) diabetes. Using three methods, we assess the importance of each attribute in the dataset and thereby determine the most responsible and correlated attributes causing diabetes among young patients. We use cost optimization, control chart, and Spearman methodologies for the real-time application of finding the data efficiency in this diabetes dataset. The Spearman methodology is a correlation methodology, also used in the software development process to identify the complexity between the various modules of the software; identifying the complexity is important because higher complexity implies a higher chance of risk occurring in the software. With the use of the control chart, the mean, variance, and standard deviation of the data are calculated. With the use of the cost optimization model, we optimize the variables. Hence, we choose the Spearman, control chart, and cost optimization methods to assess the data efficiency in diabetes datasets.
Keywords: correlation, congenital diabetics, linear relationship, monotonic function, ranking samples, pediatric
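As an illustration of the Spearman and control chart steps described here, the sketch below ranks attributes by rank correlation with the outcome and computes Shewhart-style control limits (mean ± 3σ) for one parameter. File and column names are assumptions, not the authors' dataset.

```python
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("pediatric_diabetes.csv")         # placeholder dataset

# Rank correlation of each attribute with the (assumed) outcome column
for col in df.columns.drop("diabetic"):
    rho, p = spearmanr(df[col], df["diabetic"])
    print(f"{col}: rho={rho:.3f}, p={p:.4f}")

# Simple Shewhart-style control limits for one health parameter
x = df["blood_glucose"]                            # assumed column name
mean, sd = x.mean(), x.std(ddof=1)
ucl, lcl = mean + 3 * sd, mean - 3 * sd
print(f"mean={mean:.1f}, variance={x.var(ddof=1):.1f}, UCL={ucl:.1f}, LCL={lcl:.1f}")
out_of_control = df[(x > ucl) | (x < lcl)]         # flagged observations
```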
Procedia PDF Downloads 256
22742 Electricity Load Modeling: An Application to Italian Market
Authors: Giovanni Masala, Stefania Marica
Abstract:
Forecasting electricity load plays a crucial role in decision making and planning for economic purposes. Moreover, in light of the recent privatization and deregulation of the power industry, forecasting future electricity load has turned out to be a very challenging problem. Empirical electricity load data highlight a clear seasonal behavior (higher load during the winter season), which is partly due to climatic effects. We also emphasize the presence of load periodicity on a weekly basis (electricity load is usually lower on weekends and holidays) and on a daily basis (electricity load is clearly influenced by the hour of the day). Finally, a long-term trend may depend on the general economic situation (for example, industrial production affects electricity load). All these features must be captured by the model. The purpose of this paper is therefore to build an hourly electricity load model. The deterministic component of the model requires non-linear regression and Fourier series, while we investigate the stochastic component through econometric tools. The calibration of the model parameters is performed using data from the Italian market over a six-year period (2007-2012). Then, we perform a Monte Carlo simulation in order to compare the simulated data with the real data (both in-sample and out-of-sample). The reliability of the model is confirmed by standard tests, which highlight a good fit of the simulated values.
Keywords: ARMA-GARCH process, electricity load, fitting tests, Fourier series, Monte Carlo simulation, non-linear regression
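The deterministic component described in this abstract (trend plus daily, weekly, and yearly periodicities) can be illustrated with a Fourier-plus-trend design matrix fitted by least squares; the residuals would then feed the ARMA-GARCH stochastic component. The sketch below uses a toy series and an assumed number of harmonics.

```python
import numpy as np

def deterministic_design(t):
    """Deterministic component of hourly load: linear trend plus Fourier
    harmonics for the daily, weekly and yearly cycles (t in hours)."""
    cols = [np.ones_like(t, dtype=float), t.astype(float)]   # intercept + trend
    for period in (24.0, 24.0 * 7, 24.0 * 365.25):           # day, week, year
        for k in (1, 2):                                      # two harmonics each
            w = 2.0 * np.pi * k / period
            cols += [np.sin(w * t), np.cos(w * t)]
    return np.column_stack(cols)

# Toy hourly series for illustration; real data would span 2007-2012
t = np.arange(8760)
load = 1000.0 + 50.0 * np.sin(2 * np.pi * t / 24)

X = deterministic_design(t)
beta, *_ = np.linalg.lstsq(X, load, rcond=None)    # least-squares calibration
residuals = load - X @ beta                        # input to the ARMA-GARCH part
```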
Procedia PDF Downloads 395
22741 Factors Affecting Profitability of Pharmaceutical Company During the COVID-19 Pandemic: An Indonesian Evidence
Authors: Septiany Trisnaningtyas
Abstract:
Purpose: This research aims to examine the factors affecting the profitability of pharmaceutical companies during the Covid-19 pandemic in Indonesia. A sharp decline in the number of patients coming to hospitals for treatment during the pandemic had an impact on the growth of the pharmaceutical sector and brought major changes in financial position and business performance, while pharmaceutical companies that provide products related to the Covid-19 pandemic could survive and continue to grow. This study investigates the factors affecting the profitability of pharmaceutical companies during the Covid-19 pandemic in Indonesia in association with the number of Covid-19 cases. Design/methodology/approach: This study uses panel-data regression models to evaluate the influence of the number of confirmed Covid-19 cases on the profitability of nine listed pharmaceutical companies in Indonesia. The research is based on four independent variables that were empirically examined for their relationship with profitability: liquidity (current ratio), growth rate (sales growth), firm size (total sales), and market power (the Lerner index). The number of Covid-19 cases is used as a moderating variable. Data for the nine pharmaceutical companies listed on the Indonesia Stock Exchange, covering the period 2018-2021, were extracted from the companies' quarterly reports. Findings: During the Covid-19 period, company growth (sales growth) and market power (Lerner index) have a positive and significant relationship with ROA and ROE. The total number of confirmed Covid-19 cases has a positive and significant relationship with ROA and is shown to moderate the relationships between company growth (sales growth) and ROA and ROE, and between market power (Lerner index) and ROA. Research limitations/implications: Due to data availability, this study only includes data from nine pharmaceutical companies listed on the Indonesia Stock Exchange and quarterly reports covering the period 2018-2021. Originality/value: This study focuses on pharmaceutical companies in Indonesia during the Covid-19 pandemic. A previous study analyzed data from pharmaceutical companies' annual reports since 2014 and focused on the implementation of universal health coverage (national health insurance) by the Indonesian government, using pooled ordinary least squares regression and fixed effects. In contrast, this study analyzes the data using fixed-effect panel-data regression models to evaluate the influence of confirmed Covid-19 cases on profitability and also investigates the moderating effect of confirmed Covid-19 cases on profitability in the context of the pandemic.
Keywords: profitability, Indonesia, pharmaceutical, Covid-19
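A sketch of the kind of moderation test described here, using a firm fixed-effects regression with Covid-case interaction terms, is shown below. Variable names, the data file, and the clustering choice are assumptions; the authors' exact specification may differ.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Quarterly panel: one row per firm-quarter with ROA, the four predictors,
# and the number of confirmed Covid-19 cases (all column names are assumed).
df = pd.read_csv("pharma_panel.csv")

# Firm fixed effects via C(firm); the covid interaction terms test moderation
model = smf.ols(
    "roa ~ current_ratio + sales_growth + firm_size + lerner_index"
    " + covid_cases + covid_cases:sales_growth + covid_cases:lerner_index"
    " + C(firm)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["firm"]})
print(model.summary())
```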
Procedia PDF Downloads 123
22740 A Systematic Review of the Methodological and Reporting Quality of Case Series in Surgery
Authors: Riaz A. Agha, Alexander J. Fowler, Seon-Young Lee, Buket Gundogan, Katharine Whitehurst, Harkiran K. Sagoo, Kyung Jin Lee Jeong, Douglas G. Altman, Dennis P. Orgill
Abstract:
Introduction: Case series are an important and common study type. Currently, no guideline exists for reporting case series, and there is evidence of key data being missed from such reports. We propose to develop a reporting guideline for case series using a methodologically robust technique. The first step in this process is a systematic review of the literature relevant to the reporting deficiencies of case series. Methods: A systematic review of methodological and reporting quality in surgical case series was performed. The electronic search strategy was developed by an information specialist and included MEDLINE, EMBASE, the Cochrane Methods Register, the Science Citation Index, and the Conference Proceedings Citation Index, from the start of indexing until 5th November 2014. Independent screening, eligibility assessment, and data extraction were performed. Included articles were analyzed for five areas of deficiency: failure to use standardized definitions; missing or selective data; transparency or incomplete reporting; whether alternate study designs were considered; and other issues. Results: The database searching identified 2,205 records. Through the process of screening and eligibility assessment, 92 articles met the inclusion criteria. The frequencies of the methodological and reporting issues identified were: failure to use standardized definitions (57%), missing or selective data (66%), transparency or incomplete reporting (70%), whether alternate study designs were considered (11%), and other issues (52%). Conclusion: The methodological and reporting quality of surgical case series needs improvement. Our data show that clear evidence-based guidelines for the conduct and reporting of case series may be useful to those planning or conducting them.
Keywords: case series, reporting quality, surgery, systematic review
Procedia PDF Downloads 359
22739 Model-Free Distributed Control of Dynamical Systems
Authors: Javad Khazaei, Rick Blum
Abstract:
Distributed control is an efficient and flexible approach for coordination of multi-agent systems. One of the main challenges in designing a distributed controller is identifying the governing dynamics of the dynamical systems. Data-driven system identification is currently undergoing a revolution. With the availability of high-fidelity measurements and historical data, model-free identification of dynamical systems can facilitate the control design without tedious modeling of high-dimensional and/or nonlinear systems. This paper develops a distributed control design using consensus theory for linear and nonlinear dynamical systems using sparse identification of system dynamics. Compared with existing consensus designs that heavily rely on knowing the detailed system dynamics, the proposed model-free design can accurately capture the dynamics of the system with available measurements and input data and provide guaranteed performance in consensus and tracking problems. Heterogeneous damped oscillators are chosen as examples of dynamical systems for validation purposes.
Keywords: consensus tracking, distributed control, model-free control, sparse identification of dynamical systems
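The sparse identification step referred to in this abstract is commonly implemented as sequential thresholded least squares over a library of candidate functions (the SINDy approach). The sketch below illustrates that step only, under assumed library and threshold choices; the consensus controller built on top of it is not shown.

```python
import numpy as np

def stlsq(theta, dxdt, threshold=0.1, n_iter=10):
    """Sequential thresholded least squares: sparse regression of the state
    derivatives dxdt (m x n) onto a library of candidate functions theta (m x p)."""
    xi = np.linalg.lstsq(theta, dxdt, rcond=None)[0]      # initial dense fit
    for _ in range(n_iter):
        small = np.abs(xi) < threshold                    # prune small terms
        xi[small] = 0.0
        for k in range(dxdt.shape[1]):                    # refit survivors
            big = ~small[:, k]
            if big.any():
                xi[big, k] = np.linalg.lstsq(theta[:, big], dxdt[:, k],
                                             rcond=None)[0]
    return xi

# Example library for a damped oscillator with state x = [position, velocity]:
# theta = np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
#                          x[:, 0]**2, x[:, 0]*x[:, 1], x[:, 1]**2])
# xi = stlsq(theta, dxdt)   # each column of xi gives one identified equation
```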
Procedia PDF Downloads 265
22738 Geospatial Assessment of Waste Disposal System in Akure, Ondo State, Nigeria
Authors: Babawale Akin Adeyemi, Esan Temitayo, Adeyemi Olabisi Omowumi
Abstract:
The paper analyzed the waste disposal system in Akure, Ondo State, using GIS techniques. Specifically, the study identified the spatial distribution of collection points and the existing dumpsite and evaluated the accessibility of waste collection points and their proximity to each other, with a view to enhancing the performance of the waste disposal system. Data for the study were obtained from both primary and secondary sources. Primary data were obtained through the administration of questionnaires. From the field survey, 35 collection points were identified in the study area, and 10 questionnaires were administered around each collection point, making a total of 350 questionnaires for the study. Also, the coordinates of each collection point were captured using a hand-held Global Positioning System (GPS) receiver and used to analyze the spatial distribution of collection points. Secondary data included an administrative map collected from the Akure South Local Government Secretariat. The collected data were analyzed using GIS analytical tools, specifically the neighborhood function. The results revealed that collection points were found in all parts of Akure, with the highest concentration around the central business district. The study also showed that 80% of the collection points enjoyed efficient waste service while the remaining 20% did not, and that most collection points in the core of the city were in close proximity to each other. In conclusion, the paper demonstrates the capability of Geographic Information Systems (GIS) as a technique for managing waste collection and disposal. The application of GIS in the evaluation of solid waste management in Akure is highly valuable for the state waste management board and could also be beneficial to other states in developing a modern solid waste management system. Further study on solid waste management is also recommended, especially for updating information on both spatial and non-spatial data.
Keywords: assessment, geospatial, system, waste disposal
Procedia PDF Downloads 239
22737 Students' Perspectives on Quality of Course Evaluation Practices and Feedbacks in Eritrea
Authors: Ermias Melake Tesfay
Abstract:
The importance of evaluation practice and feedback for student advancement and retention has gained attention in the literature over the past ten years. Many issues have been raised about the quality and types of evaluation carried out in higher education and about the quality and quantity of student feedback. The aim of this study was to explore students' perspectives on the quality of course evaluation practices and feedback in the College of Education and the College of Science. The study used both quantitative and qualitative methods to collect data, which were gathered from third-year and fourth-year students of 13 departments in the College of Education and the College of Science in Eritrea. A modified Service Performance (SERVPERF) questionnaire and focus group discussions were used to collect the data. The sample comprised 135 third-year and fourth-year students from both colleges. A questionnaire using a 5-point Likert scale was administered to all respondents, and two focus group discussions were conducted. Findings from the survey data and focus group discussions showed that the majority of students hold a positive perception of the quality of course evaluation practice but a negative perception of the methods of awarding grades and of the administrators' role in listening to students' complaints about courses. Furthermore, the questionnaire analysis showed no statistically significant differences between third-year and fourth-year students, between the College of Education and the College of Science, or between male and female students regarding the quality of course evaluation practice and feedback. The study recommends that the colleges improve the quality of fairness and feedback during course assessment.
Keywords: evaluation, feedback, quality, students' perception
Procedia PDF Downloads 157
22736 Artificial Intelligence in Melanoma Prognosis: A Narrative Review
Authors: Shohreh Ghasemi
Abstract:
Introduction: Melanoma is a complex disease with various clinical and histopathological features that impact prognosis and treatment decisions. Traditional methods of melanoma prognosis involve manual examination and interpretation of clinical and histopathological data by dermatologists and pathologists. However, the subjective nature of these assessments can lead to inter-observer variability and suboptimal prognostic accuracy. AI, with its ability to analyze vast amounts of data and identify patterns, has emerged as a promising tool for improving melanoma prognosis. Methods: A comprehensive literature search was conducted to identify studies that employed AI techniques for melanoma prognosis. The search included databases such as PubMed and Google Scholar, using keywords such as "artificial intelligence," "melanoma," and "prognosis." Studies published between 2010 and 2022 were considered. The selected articles were critically reviewed, and relevant information was extracted. Results: The review identified various AI methodologies utilized in melanoma prognosis, including machine learning algorithms, deep learning techniques, and computer vision. These techniques have been applied to diverse data sources, such as clinical images, dermoscopy images, histopathological slides, and genetic data. Studies have demonstrated the potential of AI in accurately predicting melanoma prognosis, including survival outcomes, recurrence risk, and response to therapy. AI-based prognostic models have shown comparable or even superior performance compared to traditional methods.
Keywords: artificial intelligence, melanoma, accuracy, prognosis prediction, image analysis, personalized medicine
Procedia PDF Downloads 81