Search results for: low data rate

28830 Spatial Distribution of Certified Mental Disabilities in China

Authors: Jiayue Yang

Abstract:

Based on an analysis of China's database of certified disabled persons in 2021, this study reveals several key findings. First, the proportion of certified mentally disabled persons among China's certified disabled population (certification rate 1) decreases from east to west and from south to north. Second, the spatial distribution of the number of mentally disabled persons per 1,000 certificate holders (certification rate 2) is relatively scattered, with significant variation between cities in the eastern region. On an overall scale, however, a south-north gradient can still be observed, with higher rates in the north and lower rates in the west, while the central region shows higher rates than the western region. The variation in certification rates for mental disability among regions is influenced not only by traditional culture and welfare levels but also shows a certain correlation with the level of economic development.

Keywords: certified disabled persons, mentally disabled persons, spatial distribution, China

Procedia PDF Downloads 71
28829 Data Management and Analytics for Intelligent Grid

Authors: G. Julius P. Roy, Prateek Saxena, Sanjeev Singh

Abstract:

Two decades ago, power distribution utilities would collect data from their customers no more than once a month. The advent of SmartGrid and AMI has since increased the sampling frequency, leading to a 1,000- to 10,000-fold increase in data volume. This increase is notable and led to the coining of the term Big Data in utilities. The power distribution industry is one of the largest handlers of huge and complex data, both for keeping history and for turning the data into insight. The majority of utilities around the globe are adopting SmartGrid technologies as mass implementations and are primarily focusing on the strategic interdependence and synergies of the big data coming from new information sources such as AMI and intelligent SCADA; there is therefore a rising need for new models of data management and a renewed focus on analytics to dissect data into descriptive, predictive and prescriptive subsets. The goal of this paper is to bring load disaggregation into the smart energy toolkit for commercial usage.

Keywords: data management, analytics, energy data analytics, smart grid, smart utilities

Procedia PDF Downloads 763
28828 Predisposition of Small Scale Businesses in Fagge, Kano State, Nigeria, Towards Profit and Loss Sharing Mode of Finance

Authors: Farida, M. Shehu, Shehu U. R. Aliyu

Abstract:

Access to finance has been recognized in the literature as one of the major impediments confronting small scale businesses (SSBs). This largely arises from high lending rates, religious inclinations, collateral requirements, etc. Islamic modes of finance operate under a Profit and Loss Sharing (PLS) arrangement between a borrower (business owner) and a lender (Islamic bank). This paper empirically assesses the determinants of the predisposition of small scale business operators in Fagge local government area, Kano State, Nigeria, towards PLS. Cross-sectional data from a sample of 291 small scale business operators were analyzed using logit and probit regression models. Empirical results reveal that while awareness and religious inclination positively drive interest in PLS, lending rate and collateral work against it. The paper, therefore, strongly recommends more advocacy campaigns and the setting up of more Islamic banks in the country to cater for the financing and religious needs of SSBs in the study area.
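As a rough illustration of the estimation strategy described above, the sketch below fits logit and probit models on simulated survey-style data with statsmodels; the variable names and the data-generating process are assumptions for illustration, not the study's data.

```python
# Hedged sketch: logit/probit estimation of predisposition towards PLS finance
# on synthetic data whose drivers mirror those named in the abstract.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
awareness = rng.integers(0, 2, n)
religious = rng.integers(0, 2, n)
lending_rate = rng.uniform(10, 30, n)       # perceived lending rate, %
collateral = rng.integers(0, 2, n)          # 1 = collateral demanded

# Assumed data-generating process: awareness/religion raise, rate/collateral lower interest.
linpred = -0.5 + 1.2 * awareness + 0.8 * religious - 0.08 * lending_rate - 1.0 * collateral
predisposed = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

X = sm.add_constant(np.column_stack([awareness, religious, lending_rate, collateral]))
logit_res = sm.Logit(predisposed, X).fit(disp=0)
probit_res = sm.Probit(predisposed, X).fit(disp=0)
print(logit_res.params)    # coefficient signs mirror the drivers described above
print(probit_res.params)
```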

Keywords: Islamic finance, logit and probit models, profit and loss sharing, small scale businesses, finance, commerce

Procedia PDF Downloads 355
28827 Corrosion Protection of Structural Steel by Surfactant Containing Reagents

Authors: D. Erdenechimeg, T. Bujinlkham, N. Erdenepurev

Abstract:

The anti-corrosion performance of fatty acid coated mild steel samples is studied. Samples of structural steel were coated with collector reagents deposited from a surfactant in ethanol solution and overcoated with an epoxy barrier paint. A quantitative corrosion rate was determined by the linear polarization resistance method using a biopotentiostat/galvanostat 400. Coating morphology was determined by scanning electron microscopy. The hydrophobicity imparted to the steel surface by the surfactant was also tested. The main component of the samples, iron, was determined by a chemical method, and the other metal contents were determined by Inductively Coupled Plasma-Optical Emission Spectrometry (ICP-OES). Prior to measuring the corrosion rate, mechanical and chemical treatments were performed to prepare the test specimens. By overcoating the metal samples with epoxy barrier paint after exposing them to the surfactant, the corrosion rate can be limited to 34-35 µm/year.
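For context, the sketch below shows how a corrosion rate in µm/year can be obtained from a linear polarization resistance measurement through the Stern-Geary relation (ASTM G102 form); the polarization resistance, Tafel slopes and material constants are assumed values chosen only to land in the reported range, not measurements from this work.

```python
# Hedged illustration: corrosion rate from an LPR measurement via Stern-Geary.
def corrosion_rate_um_per_year(rp_ohm_cm2, beta_a=0.12, beta_c=0.12,
                               equiv_weight=27.92, density=7.87):
    """rp_ohm_cm2: area-normalized polarization resistance (ohm*cm^2).
    beta_a, beta_c: anodic/cathodic Tafel slopes (V/decade), assumed.
    equiv_weight, density: values for iron/mild steel."""
    b = (beta_a * beta_c) / (2.303 * (beta_a + beta_c))            # Stern-Geary constant, V
    i_corr = b / rp_ohm_cm2 * 1e6                                  # corrosion current density, uA/cm^2
    rate_mm_per_year = 3.27e-3 * i_corr * equiv_weight / density   # ASTM G102 conversion
    return rate_mm_per_year * 1000                                 # mm/yr -> um/yr

print(round(corrosion_rate_um_per_year(rp_ohm_cm2=8500), 1))  # ~35.6 um/yr for this assumed Rp
```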

Keywords: corrosion, linear polarization resistance, coating, surfactant

Procedia PDF Downloads 85
28826 Privacy Preserving Data Publishing Based on Sensitivity in Context of Big Data Using Hive

Authors: P. Srinivasa Rao, K. Venkatesh Sharma, G. Sadhya Devi, V. Nagesh

Abstract:

Privacy preserving data publication is a major concern at present because the amount of data being published through the internet is increasing day by day. Because of its size, this huge amount of data has been named Big Data. This work addresses privacy preservation in the context of Big Data using a data warehousing solution called Hive. We implemented Nearest Similarity Based clustering (NSB) with bottom-up generalization to achieve (v,l)-anonymity. (v,l)-anonymity addresses sensitivity vulnerabilities and ensures individual privacy. We also calculate sensitivity levels by a simple comparison method using index values, classifying the different levels of sensitivity. The experiments were carried out in the Hive environment to verify the efficiency of the algorithms on Big Data. The framework also supports the execution of existing algorithms without any changes. The model in this paper outperforms existing models.

Keywords: sensitivity, sensitive level, clustering, Privacy Preserving Data Publication (PPDP), bottom-up generalization, Big Data

Procedia PDF Downloads 276
28825 Maternal Request: A Minor but Important Contributor to the Rising Rates of Caesarean Section: A Retrospective Observational Study

Authors: Katherine Russell

Abstract:

Background: Over recent decades, the number of caesarean sections performed in the UK has continued to rise. The cause of the rising caesarean section rate (CSR) is not well understood; however, one of the most heavily cited reasons is an increase in maternal request for caesarean section. Caesarean delivery on maternal request (CDMR) refers to a caesarean section performed at the mother's request with no medical indication. The true rate of CDMR in the UK and its contribution to the caesarean section rate is not known. Methods: To elucidate current understanding of the cause of the rising caesarean section rate and the role of CDMR, we conducted a systematic review of the literature. To determine the role of CDMR in the CSR at the PRH, we conducted a retrospective observational study of caesarean section and CDMR rates from 2009-2015. Results: We demonstrated a negative correlation between rates of elective sections and CDMR over the study period (-0.123). On average, more elective sections were performed after 2011 (15.10% of all deliveries) than before 2011 (12.41% of all deliveries); this difference was statistically significant (p < 0.001). There were more cases of CDMR after 2011 (1.39% of all deliveries) than before 2011 (0.85% of all deliveries); the difference in average rates of CDMR before and after 2011 was also statistically significant (p < 0.001). Conclusions: CDMR is only a minor contributor to the CSR at the PRH. However, it remains an important factor because it represents a target for reducing the CSR that is more manageable than other, more complex and ubiquitous causes of the rising CSR.
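The before/after comparison reported above can be illustrated with a two-proportion z-test; the raw delivery counts below are invented so that the rates match the quoted percentages, and the test itself is a stand-in for whatever procedure the authors used.

```python
# Hedged sketch: comparing elective section rates before and after 2011.
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts: elective sections / total deliveries in each period.
elective = [620, 906]          # before 2011, after 2011
deliveries = [4996, 6000]      # total deliveries in each period

stat, p_value = proportions_ztest(count=elective, nobs=deliveries)
print(f"elective rate before: {elective[0]/deliveries[0]:.2%}, "
      f"after: {elective[1]/deliveries[1]:.2%}, p = {p_value:.4f}")
```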

Keywords: cesarean section, maternal request for cesarean section, obstetrics, pre-natal health

Procedia PDF Downloads 90
28824 A Comparative Study between Ionic Wind and Conventional Fan

Authors: J. R. Lee, E. V. Lau

Abstract:

Ionic wind is generated when a high voltage is applied between an anode and a grounded cathode in a gaseous medium. This paper studies the ionic wind profile with different anode configurations and the relationship between the electrode gap and the supplied voltage, and finally compares, experimentally, the heat transfer coefficient over a horizontal flat plate under ionic wind against that under a conventional fan. It is observed that the increase in the distance between electrodes decays at a rate of 1-e^(-0.0206x) as the supply voltage is increased, up to a distance of 3.1536 cm. It is also observed that the wind produced by the ionic device is faster, 2.7 m/s at 2 W, compared to the conventional fan, 2.5 m/s at 2 W, but the ionic wind decays at a faster exponential rate and is more localized, whereas the fan wind decays at a slower exponential rate and is less localized. Next, it is found that the ionic wind profile is the same regardless of the position of the anode relative to the cathode. Lastly, it is discovered that ionic wind produces a heat transfer coefficient almost 1.6 times higher than that of a conventional fan, with the Nusselt number reaching 164 compared to 102 for the conventional fan.
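As a side note on the heat transfer figures, the sketch below converts the reported Nusselt numbers into average heat transfer coefficients via h = Nu·k/L; the air conductivity and plate length are assumed values, not taken from the experiment.

```python
# Worked example (values assumed): heat transfer coefficients from Nusselt numbers.
k_air = 0.026   # thermal conductivity of air, W/(m*K), near room temperature
L = 0.1         # assumed characteristic length of the flat plate, m

for label, nu in [("ionic wind", 164), ("conventional fan", 102)]:
    h = nu * k_air / L                      # average heat transfer coefficient, W/(m^2*K)
    print(f"{label}: Nu = {nu}, h = {h:.1f} W/(m^2*K)")

# The ratio of the two coefficients reproduces the ~1.6x improvement quoted above.
print(f"ratio = {164/102:.2f}")
```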

Keywords: conventional fan, heat transfer, ionic wind, wind profile

Procedia PDF Downloads 307
28823 The Effectiveness of Humanoid Diagram Teaching Strategy on Retention Rate of Novice Nurses in Taiwan

Authors: Yung-Hui Tang, Yan-Chiou Ku, Li-Chi Huang

Abstract:

Aim: The aim of this study is to explore the effect of the Humanoid Diagram Teaching (HDT) strategy on novice nurses' care ability and retention rate. Methods: This was a quasi-experimental study using two concurrent groups with repeated measurements; the sample consisted of 24 novice nurses (12 in each of the experimental and control groups) in a medical center in southern Taiwan. Both groups received the regular training program (nursing standard techniques and practices, concept map, mini-CEX, CbD, and clinical education and training), and the experimental group additionally received the HDT program. The HDT strategy consists of drawing and discussing the patient's body humanoid diagram for 30 minutes each time, three times a week, continuing for four weeks. The effectiveness of HDT was evaluated by mini-CEX, CbD, clinical assessment and retention rate at the 3rd and 6th months. Results: When the novice nurses' care ability was examined, only the CbD score in the control group improved in the 3rd month with a statistically significant difference (p = .003). The mini-CEX and CbD scores in the experimental group improved significantly in both the first and third months (p = .00). Although the mini-CEX and CbD scores in the experimental group were higher than those of the control group, the difference was not significant (p > .05). The retention rate of the experimental group in the third and sixth months was significantly higher than that of the control group (p < .05). Conclusions: The study reveals that the HDT strategy can help novice nurses learn, enhancing their knowledge, technical capability, analytical skills in case-based caring, and retention. The HDT strategy can serve as an effective strategy in novice training for better nurse retention.

Keywords: humanoid diagram teaching strategy, novice nurses retention, teaching strategy of nurse retention, visual learning mode

Procedia PDF Downloads 157
28822 A Fuzzy Kernel K-Medoids Algorithm for Clustering Uncertain Data Objects

Authors: Behnam Tavakkol

Abstract:

Uncertain data mining algorithms use different ways to consider uncertainty in data such as by representing a data object as a sample of points or a probability distribution. Fuzzy methods have long been used for clustering traditional (certain) data objects. They are used to produce non-crisp cluster labels. For uncertain data, however, besides some uncertain fuzzy k-medoids algorithms, not many other fuzzy clustering methods have been developed. In this work, we develop a fuzzy kernel k-medoids algorithm for clustering uncertain data objects. The developed fuzzy kernel k-medoids algorithm is superior to existing fuzzy k-medoids algorithms in clustering data sets with non-linearly separable clusters.
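A rough sketch of the underlying idea, for ordinary (certain) data points, is given below: fuzzy memberships combined with a kernel-induced distance and medoid updates. The RBF kernel, fuzzifier and update scheme are generic textbook choices, not necessarily the author's formulation for uncertain objects.

```python
# Hedged sketch of a fuzzy kernel k-medoids loop on certain data points.
import numpy as np

def rbf_kernel(X, gamma=1.0):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def fuzzy_kernel_kmedoids(X, k=2, m=2.0, gamma=1.0, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    K = rbf_kernel(X, gamma)
    n = len(X)
    diag = K.diagonal()
    all_d2 = np.maximum(diag[:, None] - 2 * K + diag[None, :], 1e-12)  # kernel distances
    medoids = rng.choice(n, size=k, replace=False)
    for _ in range(n_iter):
        d2 = all_d2[:, medoids]                          # distances to current medoids
        inv = d2 ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)         # fuzzy memberships
        new_medoids = medoids.copy()
        for j in range(k):
            cost = (U[:, j] ** m)[:, None] * all_d2      # weighted cost of every candidate
            new_medoids[j] = cost.sum(axis=0).argmin()   # best candidate becomes the medoid
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return medoids, U

# Toy data: two well-separated blobs; the kernel pays off for non-linearly separable clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(3, 0.3, (30, 2))])
medoids, U = fuzzy_kernel_kmedoids(X, k=2)
print(medoids, U.max(axis=1)[:5])   # medoid indices and a few membership degrees
```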

Keywords: clustering algorithm, fuzzy methods, kernel k-medoids, uncertain data

Procedia PDF Downloads 197
28821 A Comparison of Single Decision Tree, Decision Tree Forest and Group Method of Data Handling to Evaluate the Surface Roughness in Machining Process

Authors: S. Ghorbani, N. I. Polushin

Abstract:

The machinability of workpieces (AISI 1045 steel, AA2024 aluminum alloy, A48 class 30 gray cast iron) in turning operations was investigated using different types of cutting tool (conventional, cutting tool with holes in the toolholder, and cutting tool filled with composite material) under dry conditions on a turning machine at different levels of spindle speed (630-1000 rpm), feed rate (0.05-0.075 mm/rev), depth of cut (0.05-0.15 mm) and tool overhang (41-65 mm). Experimentation was performed as per Taguchi's orthogonal array. To evaluate the relative importance of the factors affecting surface roughness, a single decision tree (SDT), decision tree forest (DTF) and the group method of data handling (GMDH) were applied.
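The sketch below illustrates the kind of factor ranking described above, fitting a single decision tree and a tree ensemble to synthetic turning data and reading off feature importances; the response surface is invented and does not reproduce the study's measurements.

```python
# Hedged sketch: ranking turning parameters by influence on surface roughness.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(42)
n = 120
speed = rng.uniform(630, 1000, n)        # rpm
feed = rng.uniform(0.05, 0.075, n)       # mm/rev
depth = rng.uniform(0.05, 0.15, n)       # mm
overhang = rng.uniform(41, 65, n)        # mm
# Assumed response: feed and overhang dominate, plus measurement noise.
Ra = 20.0 * feed + 0.012 * overhang + 0.0005 * speed + 0.5 * depth + rng.normal(0, 0.02, n)

X = np.column_stack([speed, feed, depth, overhang])
names = ["spindle speed", "feed rate", "depth of cut", "tool overhang"]

sdt = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, Ra)
dtf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, Ra)

for name, imp_sdt, imp_dtf in zip(names, sdt.feature_importances_, dtf.feature_importances_):
    print(f"{name:14s}  SDT importance {imp_sdt:.2f}   forest importance {imp_dtf:.2f}")
```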

Keywords: decision tree forest, GMDH, surface roughness, Taguchi method, turning process

Procedia PDF Downloads 424
28820 Democracy Bytes: Interrogating the Exploitation of Data Democracy by Radical Terrorist Organizations

Authors: Nirmala Gopal, Sheetal Bhoola, Audecious Mugwagwa

Abstract:

This paper discusses the continued infringement and exploitation of data by non-state actors for destructive purposes, with an emphasis on radical terrorist organizations. It discusses how terrorist organizations access and use data to foster their nefarious agendas. It further examines how cybersecurity, designed as a tool to curb data exploitation, has been ineffective in addressing global citizens' concerns about how their data can be kept safe and used for its intended purpose. The study interrogates several policies and data protection instruments, such as the Data Protection Act, cyber security policies, Protection of Personal Information (PPI) and the General Data Protection Regulation (GDPR), to understand data use and storage in democratic states. The study outcomes point to the fact that international cybersecurity and cybercrime legislation, policies, and conventions have not curbed violations of data access and use by radical terrorist groups. The study recommends ways to enhance cybersecurity and reduce cyber risks using democratic principles.

Keywords: cybersecurity, data exploitation, terrorist organizations, data democracy

Procedia PDF Downloads 181
28819 Healthcare Data Mining Innovations

Authors: Eugenia Jilinguirian

Abstract:

In the healthcare industry, data mining is essential: it transforms the field by extracting useful information from large datasets. Data mining is the process of applying advanced analytical methods to large patient records and medical histories in order to identify patterns, correlations, and trends. Healthcare professionals can improve diagnostic accuracy, uncover hidden linkages, and predict disease outcomes by carefully examining these statistics. Additionally, data mining supports personalized medicine by tailoring treatment to the unique attributes of each patient. This proactive strategy helps allocate resources more efficiently, enhances patient care, and streamlines operations. However, to apply data mining effectively and ensure the responsible use of private healthcare information, issues like data privacy and security must be carefully considered. Data mining continues to be vital in the search for more effective, efficient, and individualized healthcare solutions as technology evolves.

Keywords: data mining, healthcare, big data, individualised healthcare, healthcare solutions, database

Procedia PDF Downloads 50
28818 Summarizing Data Sets for Data Mining by Using Statistical Methods in Coastal Engineering

Authors: Yunus Doğan, Ahmet Durap

Abstract:

Coastal regions are among the areas most heavily used by both the natural balance and the growing population. In coastal engineering, the most valuable data describe wave behavior, and the amount of such data becomes very large because observations take place over periods of hours, days and months. In this study, statistical methods such as wave spectrum analysis and standard statistical methods have been used. The goal of this study is to discover profiles of the different coastal areas using these statistical methods and thus to obtain an instance-based data set from the big data for analysis with data mining algorithms. In the experimental studies, six sample data sets on wave behavior, obtained from 20-minute observations in Mersin Bay in Turkey, were converted to an instance-based form, and different clustering techniques from data mining were used to discover similar coastal places. Moreover, this study discusses how this summarization approach can be used in other branches collecting big data, such as medicine.
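The summarization idea can be sketched as follows: each 20-minute wave record is collapsed into one instance of summary statistics, and the instances are then clustered to group similar coastal sites. The signal generator and the chosen features are assumptions for illustration.

```python
# Hedged sketch: summarizing raw wave records into instances and clustering them.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)

def summarize(record):
    """Collapse a raw wave-height time series into one instance (feature vector)."""
    return [record.mean(), record.std(), record.max(),
            np.percentile(record, 90)]        # e.g. a proxy for significant wave height

# Six synthetic 20-minute records sampled at 1 Hz (1200 points each).
records = [rng.normal(loc=h, scale=s, size=1200)
           for h, s in [(0.5, 0.10), (0.6, 0.12), (1.4, 0.30),
                        (1.5, 0.35), (0.55, 0.11), (1.45, 0.32)]]

instances = np.array([summarize(r) for r in records])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(instances)
print(labels)   # records from similar "coastal areas" fall in the same cluster
```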

Keywords: clustering algorithms, coastal engineering, data mining, data summarization, statistical methods

Procedia PDF Downloads 344
28817 Child Labor and Injury Occurrence in Nicaragua: A Gender Perspective Analysis

Authors: Cristina Domínguez, Steven N. Cuadra

Abstract:

Aims: The aims of this study are: 1) to describe the occurrence and estimate the risk of suffering injuries of any kind, especially work-related injuries, in rural children working in agricultural activities and in urban children working on the street; 2) to explore factors that might be associated with the occurrence of work-related injuries among child workers, such as gender, school attendance, and performance of household chores. Method: We performed a cross-sectional study in 2019 among children working in agricultural activities (120), children working on the street (108), and non-working referents (140). We investigated self-reported injuries during the last 12 months, with a focus on work-related injuries. Incidence rates, rate ratios, and 95% CIs were calculated by Poisson regression. Results: Agricultural workers have a higher incidence of work-related injuries (2.1 per 1000 person-days) than children working on the street (1.8 per 1000 person-days). However, when girls' unpaid work at home is considered, girls had a higher occurrence. Girls had a 30% increase in the risk of suffering work-related injuries compared to boys. Performing household chores and attending school were the major predictors of injury occurrence. Discussion: Our data suggest that if such partial and full-time housework by girls is taken into account, there would be little or no variation between the sexes with regard to injury occurrence, and the incidence rate of work-related injuries among girls could even exceed that of boys. A greater understanding of the interaction of factors related to how child workers spend their time, and its impact on children's health, is needed in order to identify feasible and appropriate strategies to reduce the negative effects of work on children when elimination of child labor is not reachable in the short term. Clearly, attention to gender aspects of child labor may allow for more effective targeting of prevention efforts.
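A hedged sketch of the incidence-rate analysis described above is given below, using Poisson regression with a person-time offset; the grouped counts and person-days are invented and the grouping is simplified to two factors.

```python
# Hedged sketch: incidence rate ratios via Poisson regression with an exposure offset.
import numpy as np
import statsmodels.api as sm

# One row per group: [girl indicator, agricultural-work indicator].
X = sm.add_constant(np.array([[0, 1], [1, 1], [0, 0], [1, 0]], dtype=float))
injuries = np.array([20, 18, 14, 15])              # work-related injuries per group (assumed)
person_days = np.array([9500, 6900, 7800, 6400])   # follow-up person-days per group (assumed)

model = sm.GLM(injuries, X, family=sm.families.Poisson(),
               exposure=person_days).fit()
print(np.exp(model.params))   # rate ratios: e.g. girls vs boys, agriculture vs street work
```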

Keywords: injuries, child labor, agricultural work, gender

Procedia PDF Downloads 112
28816 Use of Cobalt Graphene in Place of Platinum in Catalytic Converter

Authors: V. Srinivasan, S. M. Sriram Nandan

Abstract:

Today, the most important problem faced by mankind in the modern world is pollution increasing at a very high rate. It affects the ecosystem of the environment and also contributes to the greenhouse effect. Exhaust gases from automobiles are a major cause of pollution, and the number of automobiles has grown so large that it has raised the pollution of our world to an alarming level. There are two methods of controlling this pollution, namely the pre-pollution control method and the post-pollution control method. This paper is based on controlling emissions by the post-pollution control method. The ratio of the surface area of nanoparticles to their volume is inversely proportional to the radius of the nanoparticles, so decreasing the radius increases this ratio, resulting in an increased rate of reaction and thus a decreased concentration of pollutants. To achieve this objective, the use of a cobalt-graphene catalyst is proposed. The proposed method mainly aims to reduce cost, since platinum is expensive, and the catalyst has a longer life than platinum-based catalysts.
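The surface-area-to-volume argument can be made concrete with a one-line calculation: for a spherical particle, SA/V = 3/r, so halving the radius doubles the ratio. The radii below are illustrative only.

```python
# Tiny numeric illustration of SA/V = 3/r for spherical nanoparticles.
import math

for r_nm in [20.0, 10.0, 5.0]:
    r = r_nm * 1e-9                                               # radius in metres
    sa_over_v = (4 * math.pi * r**2) / ((4 / 3) * math.pi * r**3)  # equals 3/r
    print(f"r = {r_nm:4.1f} nm  ->  SA/V = {sa_over_v:.2e} 1/m")
```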

Keywords: automobile emissions, catalytic converter, cobalt-graphene, replacement of platinum

Procedia PDF Downloads 373
28815 3D Receiver Operator Characteristic Histogram

Authors: Xiaoli Zhang, Xiongfei Li, Yuncong Feng

Abstract:

ROC curves, a widely used evaluation tool in the machine learning field, capture the tradeoff between the true positive rate and the false positive rate. However, they are criticized for ignoring some vital information in the evaluation process, such as the amount of information about the target that each instance carries and the predicted score given by each classification model to each instance. Hence, in this paper, a new classification performance evaluation method is proposed by extending Receiver Operator Characteristic (ROC) curves to 3D space, denoted the 3D ROC Histogram.
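For background only, the sketch below shows the standard 2D ROC construction that the proposed histogram extends; the 3D extension itself is not reproduced here, and the scores are synthetic.

```python
# Background sketch: the standard ROC curve as a set of (FPR, TPR) tradeoffs.
import numpy as np
from sklearn.metrics import auc, roc_curve

rng = np.random.default_rng(3)
y_true = np.concatenate([np.ones(100), np.zeros(100)])
scores = np.concatenate([rng.normal(0.7, 0.15, 100),   # positives score higher on average
                         rng.normal(0.4, 0.15, 100)])

fpr, tpr, thresholds = roc_curve(y_true, scores)
print(f"AUC = {auc(fpr, tpr):.3f}")   # each (fpr, tpr) pair is one threshold's tradeoff
```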

Keywords: classification, performance evaluation, receiver operating characteristic histogram, hardness prediction

Procedia PDF Downloads 297
28814 Infusion Pump Historical Development, Measurement and Parts of Infusion Pump

Authors: Samuel Asrat

Abstract:

Infusion pumps have become indispensable tools in modern healthcare, allowing for precise and controlled delivery of fluids, medications, and nutrients to patients. This paper provides an overview of the historical development, measurement, and parts of infusion pumps. The historical development of infusion pumps can be traced back to the early 1960s when the first rudimentary models were introduced. These early pumps were large, cumbersome, and often unreliable. However, advancements in technology and engineering over the years have led to the development of smaller, more accurate, and user-friendly infusion pumps. Measurement of infusion pumps involves assessing various parameters such as flow rate, volume delivered, and infusion duration. Flow rate, typically measured in milliliters per hour (mL/hr), is a critical parameter that determines the rate at which fluids or medications are delivered to the patient. Accurate measurement of flow rate is essential to ensure the proper administration of therapy and prevent adverse effects. Infusion pumps consist of several key parts, including the pump mechanism, fluid reservoir, tubing, and control interface. The pump mechanism is responsible for generating the necessary pressure to push fluids through the tubing and into the patient's bloodstream. The fluid reservoir holds the medication or solution to be infused, while the tubing serves as the conduit through which the fluid travels from the reservoir to the patient. The control interface allows healthcare providers to program and adjust the infusion parameters, such as flow rate and volume. In conclusion, infusion pumps have evolved significantly since their inception, offering healthcare providers unprecedented control and precision in delivering fluids and medications to patients. Understanding the historical development, measurement, and parts of infusion pumps is essential for ensuring their safe and effective use in clinical practice.
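The flow-rate arithmetic described above is simple enough to show directly; the sketch below is an illustration of the relationship between rate, volume and duration, not device firmware.

```python
# Illustration of infusion flow-rate arithmetic in mL/hr.
def flow_rate_ml_per_hr(volume_ml, duration_min):
    return volume_ml / (duration_min / 60.0)

def volume_delivered_ml(rate_ml_per_hr, elapsed_min):
    return rate_ml_per_hr * elapsed_min / 60.0

rate = flow_rate_ml_per_hr(volume_ml=500, duration_min=240)   # 500 mL programmed over 4 h
print(rate)                                                   # 125.0 mL/hr
print(volume_delivered_ml(rate, elapsed_min=90))              # 187.5 mL after 90 minutes
```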

Keywords: dip, ip, sp, is

Procedia PDF Downloads 43
28813 An Extension of the Generalized Extreme Value Distribution

Authors: Serge Provost, Abdous Saboor

Abstract:

A q-analogue of the generalized extreme value distribution which includes the Gumbel distribution is introduced. The additional parameter q allows for increased modeling flexibility. The resulting distribution can have a finite, semi-infinite or infinite support. It can also produce several types of hazard rate functions. The model parameters are determined by making use of the method of maximum likelihood. It will be shown that it compares favourably to three related distributions in connection with the modeling of a certain hydrological data set.
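As a baseline for the proposed q-extension, the sketch below fits the ordinary generalized extreme value distribution by maximum likelihood with SciPy on synthetic annual maxima; the q-analogue itself is not available in SciPy and is not reproduced here.

```python
# Baseline sketch: maximum likelihood fit of the ordinary GEV distribution.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(11)
annual_maxima = genextreme.rvs(c=-0.1, loc=120.0, scale=25.0, size=80,
                               random_state=rng)     # synthetic hydrological maxima

shape, loc, scale = genextreme.fit(annual_maxima)    # maximum likelihood estimates
print(f"shape={shape:.3f}, loc={loc:.1f}, scale={scale:.1f}")
print("100-year return level:", genextreme.ppf(1 - 1 / 100, shape, loc, scale))
```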

Keywords: extreme value theory, generalized extreme value distribution, goodness-of-fit statistics, Gumbel distribution

Procedia PDF Downloads 327
28812 A Research and Application of Feature Selection Based on IWO and Tabu Search

Authors: Laicheng Cao, Xiangqian Su, Youxiao Wu

Abstract:

Feature selection is one of the important problems in network security, pattern recognition, data mining and other fields. In order to remove redundant features and effectively improve the detection speed of an intrusion detection system, this paper proposes a new feature selection method based on the invasive weed optimization (IWO) algorithm and the tabu search (TS) algorithm. IWO is used for global search and tabu search for local search, improving on the results of the IWO algorithm alone. The experimental results show that the method can effectively remove redundant features from network data, reduce selection time, guarantee an accurate detection rate, and effectively improve the speed of the detection system.
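A compact sketch of the local-search half of such a method is given below: a tabu search over feature subsets scored by a simple classifier. The IWO global stage is omitted, and the data set, classifier and tabu settings are assumptions rather than the paper's configuration.

```python
# Hedged sketch: tabu search over feature subsets, scored by cross-validated accuracy.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=300, n_features=15, n_informative=5,
                           n_redundant=6, random_state=0)

def score(mask):
    if not mask.any():
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

rng = np.random.default_rng(0)
current = rng.integers(0, 2, X.shape[1]).astype(bool)   # e.g. a solution handed over by IWO
best, best_score = current.copy(), score(current)
tabu = []                                               # recently flipped feature indices

for _ in range(30):
    # Evaluate all single-bit-flip neighbours whose flipped feature is not tabu.
    flips = []
    for j in (j for j in range(X.shape[1]) if j not in tabu):
        neigh = current.copy()
        neigh[j] = not neigh[j]
        flips.append((score(neigh), j, neigh))
    s, j, neigh = max(flips, key=lambda t: t[0])
    current = neigh
    tabu = (tabu + [j])[-5:]                            # short-term memory of recent moves
    if s > best_score:
        best, best_score = neigh.copy(), s

print("selected features:", np.flatnonzero(best), "cv accuracy:", round(best_score, 3))
```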

Keywords: intrusion detection, feature selection, IWO, tabu search

Procedia PDF Downloads 510
28811 Access to Health Data in Medical Records in Indonesia in Terms of Personal Data Protection Principles: The Limitation and Its Implication

Authors: Anny Retnowati, Elisabeth Sundari

Abstract:

This research aims to elaborate the meaning of personal data protection principles for patient access to health data in medical records in Indonesia, and its implications. The method uses normative legal research, examining Indonesian health law regarding the patient's right to access their health data in medical records. The data are analysed qualitatively using the interpretation method to elaborate on how the meaning of personal data protection principles limits patients' access to their data in medical records. The results show that patients only have the right to obtain copies of their health data in medical records; there is no right to inspect them directly at any time. Indonesian health law thus limits the principle of patients' right to broad access to their health data in medical records. This restriction has implications for the reduction of personal data protection as part of human rights. This research contributes to showing that a limitation of personal data protection may undermine human rights.

Keywords: access, health data, medical records, personal data, protection

Procedia PDF Downloads 66
28810 Conceptualizing the Knowledge to Manage and Utilize Data Assets in the Context of Digitization: Case Studies of Multinational Industrial Enterprises

Authors: Martin Böhmer, Agatha Dabrowski, Boris Otto

Abstract:

The trend of digitization significantly changes the role of data for enterprises. Data turn from an enabler to an intangible organizational asset that requires management and qualifies as a tradeable good. The idea of a networked economy has gained momentum in the data domain as collaborative approaches for data management emerge. Traditional organizational knowledge consequently needs to be extended by comprehensive knowledge about data. The knowledge about data is vital for organizations to ensure that data quality requirements are met and data can be effectively utilized and sovereignly governed. As this specific knowledge has been paid little attention to so far by academics, the aim of the research presented in this paper is to conceptualize it by proposing a “data knowledge model”. Relevant model entities have been identified based on a design science research (DSR) approach that iteratively integrates insights of various industry case studies and literature research.

Keywords: data management, digitization, industry 4.0, knowledge engineering, metamodel

Procedia PDF Downloads 337
28809 Biocontrol Effectiveness of Indigenous Trichoderma Species against Meloidogyne javanica and Fusarium oxysporum f. sp. radicis lycopersici on Tomato

Authors: Hajji Lobna, Chattaoui Mayssa, Regaieg Hajer, M'Hamdi-Boughalleb Naima, Rhouma Ali, Horrigue-Raouani Najet

Abstract:

In this study, three local isolates of Trichoderma (Tr1: T. viride, Tr2: T. harzianum and Tr3: T. asperellum) were isolated and evaluated for their biocontrol effectiveness under in vitro conditions and in the greenhouse. The in vitro bioassay revealed biocontrol potential against Fusarium oxysporum f. sp. radicis lycopersici and Meloidogyne javanica (RKN) separately. All species of Trichoderma exhibited biocontrol performance, and Trichoderma viride (Tr1) was the most efficient. In fact, growth inhibition of Fusarium oxysporum f. sp. radicis lycopersici (FORL) reached 75.5% with Tr1, and the parasitism rate of the root-knot nematode was 60% for juveniles and 75% for eggs with the same isolate. Pot experiment results showed that Tr1 and Tr2, compared to the chemical treatment, enhanced plant growth and exhibited better antagonism against the root-knot nematode and the root-rot fungus, whether separate or combined. All Trichoderma isolates revealed bioprotection potential against Fusarium oxysporum f. sp. radicis lycopersici. When the pathogenic fungus was inoculated alone, the Fusarium wilt index and vascular browning rate were reduced significantly with Tr1 (0.91, 2.38%) and Tr2 (1.5, 5.5%), respectively. In the case of combined infection with Fusarium and the nematode, the same Trichoderma isolates Tr1 and Tr2 decreased the Fusarium wilt index to 1.1 and 0.83 and reduced the vascular browning rate to 6.5% and 6%, respectively. Similarly, isolates Tr1 and Tr2 caused maximum inhibition of nematode multiplication: the multiplication rate declined to 4% with both isolates, whether tomato was infected by the nematode separately or concomitantly with Fusarium. The chemical treatment showed moderate activity against Meloidogyne javanica and Fusarium oxysporum f. sp. radicis lycopersici, alone or combined.

Keywords: trichoderma spp., meloidogyne javanica, Fusarium oxysporum f.sp.radicis lycopersici, biocontrol

Procedia PDF Downloads 263
28806 Design and Implementation of a Platform for Adaptive Online Learning Based on Fuzzy Logic

Authors: Budoor Al Abid

Abstract:

Educational systems are increasingly provided as open online services, providing guidance and support for individual learners. To adapt the learning systems, a proper evaluation must be made. This paper builds the evaluation model Fuzzy C Means Adaptive System (FCMAS), based on data mining techniques, to assess the difficulty of the questions. The following steps are implemented. First, a dataset from an international online learning system (slepemapy.cz) is used; the dataset contains over 1,300,000 records with 9 features covering student, question and answer information together with feedback evaluation. Next, a normalization process is applied as a preprocessing step. Then FCM clustering is used to adapt the difficulty of the questions; the result is data labelled with three clusters according to weight (easy, intermediate, difficult), with the FCM algorithm assigning a label to every question. A Random Forest (RF) classifier model is then constructed on the clustered dataset, using 70% of the dataset for training and 30% for testing; the model achieves a 99.9% accuracy rate. This approach improves the adaptive e-learning system because it depends on student behavior and gives more accurate results in the evaluation process than an evaluation system that depends on feedback only.
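A condensed sketch of the described pipeline follows: fuzzy c-means groups questions into three difficulty levels, and a random forest is then trained on the labelled set. The per-question features are synthetic stand-ins, not the slepemapy.cz fields.

```python
# Hedged sketch: fuzzy c-means labelling followed by a random forest classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical per-question features: error rate and normalized mean response time.
X = np.vstack([rng.normal([0.1, 0.2], 0.05, (200, 2)),
               rng.normal([0.5, 0.5], 0.05, (200, 2)),
               rng.normal([0.9, 0.8], 0.05, (200, 2))])

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        centers = (U.T ** m @ X) / (U.T ** m).sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)         # updated fuzzy memberships
    return centers, U

centers, U = fuzzy_c_means(X)
labels = U.argmax(axis=1)          # easy / intermediate / difficult by highest membership

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", rf.score(X_te, y_te))
```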

Keywords: machine learning, adaptive, fuzzy logic, data mining

Procedia PDF Downloads 173
28807 Morphological Characterization and Gas Permeation of Commercially Available Alumina Membrane

Authors: Ifeyinwa Orakwe, Ngozi Nwogu, Edward Gobina

Abstract:

This work presents experimental results relating to the structural characterization of a commercially available alumina membrane. A γ-alumina mesoporous tubular membrane has been used. Nitrogen adsorption-desorption, scanning electron microscopy and gas permeability tests have been carried out on the alumina membrane to characterize its structural features. Scanning electron microscopy (SEM) was used to determine the pore size distribution of the membrane; pore size, specific surface area and pore size distribution were also determined using the nitrogen adsorption-desorption instrument. Gas permeation tests were carried out on the membrane using a variety of single and mixed gases. Permeabilities at pressures between 0.05-1 bar and temperatures of 25-200°C were measured for the single and mixed gases: nitrogen (N2), helium (He), oxygen (O2), carbon dioxide (CO2), 14%CO₂/N₂, 60%CO₂/N₂, 30%CO₂/CH4 and 21%O₂/N₂. Plots of flow rate versus pressure were obtained. The results show the effect of temperature on the permeation rate of the various gases. At 0.5 bar, for example, the flow rate for N2 was relatively constant before decreasing with an increase in temperature, while for O2 it decreased continuously with increasing temperature. In the case of 30%CO₂/CH4 and 14%CO₂/N₂, the flow rate showed an increase and then a decrease with increasing temperature. The effect of temperature on the membrane performance for the various gases is presented, and the influence of the transmembrane pressure drop is discussed in this paper.

Keywords: alumina membrane, Nitrogen adsorption-desorption, scanning electron microscopy, gas permeation, temperature

Procedia PDF Downloads 310
28806 Dynamic Modeling of the Exchange Rate in Tunisia: Theoretical and Empirical Study

Authors: Chokri Slim

Abstract:

The relative failure of simultaneous equation models in the seventies led researchers to turn to other approaches that take into account the dynamics of economic and financial systems. In this paper, we use an approach based on the vector autoregressive model, which has been widely used in recent years. Its popularity is due to its flexible nature and ease of use in producing models with useful descriptive characteristics; it is also easy to use for testing economic hypotheses. Standard econometric techniques assume that the series studied are stable over time (the stationarity hypothesis). Most economic series do not satisfy this hypothesis, which requires, when one wishes to study the relationships that bind them, the use of specific techniques. One of these is cointegration, which characterizes non-stationary (integrated) series for which a linear combination is stationary; it will also be presented in this paper. Since the work of Johansen, this approach has generally been presented as part of a multivariate analysis that specifies stable long-term relationships while at the same time analyzing the short-term dynamics of the variables considered. In the empirical part, we apply these concepts to study the dynamics of the exchange rate in Tunisia, which is one of the most important economic policy variables for a country open to the outside world. According to the results of the empirical study using the cointegration method, there is a cointegration relationship between the exchange rate and its determinants; this relationship shows that the variables have a significant influence in determining the exchange rate in Tunisia.
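The cointegration workflow can be sketched briefly with statsmodels on simulated series; the two variables below are placeholders for the exchange rate and one of its determinants, not the Tunisian data.

```python
# Hedged sketch: Johansen cointegration test and a VECM on simulated I(1) series.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, coint_johansen

rng = np.random.default_rng(0)
n = 300
common_trend = np.cumsum(rng.normal(size=n))          # shared stochastic trend
exchange_rate = common_trend + rng.normal(scale=0.5, size=n)
fundamental = 0.8 * common_trend + rng.normal(scale=0.5, size=n)
data = pd.DataFrame({"exchange_rate": exchange_rate, "fundamental": fundamental})

johansen = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:", johansen.lr1)              # compare against johansen.cvt critical values

vecm_res = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="co").fit()
print("long-run (cointegrating) vector:", vecm_res.beta.ravel())
print("short-run adjustment coefficients:", vecm_res.alpha.ravel())
```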

Keywords: stationarity, cointegration, dynamic models, causality, VECM models

Procedia PDF Downloads 341
28805 Establishment of a Classifier Model for Early Prediction of Acute Delirium in Adult Intensive Care Unit Using Machine Learning

Authors: Pei Yi Lin

Abstract:

Objective: The objective of this study is to use machine learning methods to build an early prediction classifier model for acute delirium, in order to improve the quality of medical care for intensive care patients. Background: Delirium is a common acute and sudden disturbance of consciousness in critically ill patients. Once it occurs, it tends to prolong the length of hospital stay and increase medical costs and mortality. In 2021, the incidence of delirium in the internal medicine intensive care unit was as high as 59.78%, which indirectly prolonged the average length of hospital stay by 8.28 days, and the mortality rate has been about 2.22% over the past three years. Therefore, we aim to build a delirium prediction classifier through big data analysis and machine learning methods to detect delirium early. Method: This is a retrospective study using an artificial intelligence big data database to extract the characteristic factors related to delirium in intensive care unit patients for machine learning. The study included patients aged over 20 years who were admitted to the intensive care unit between May 1, 2022, and December 31, 2022, excluding those with a GCS assessment <4 points, an ICU stay of less than 24 hours, or no CAM-ICU evaluation. Each CAM-ICU delirium assessment, performed every 8 hours within 30 days of hospitalization, is regarded as an event, and the cumulative data from ICU admission to the prediction time point are used to predict the possibility of delirium occurring in the next 8 hours. A total of 63,754 case records were collected, and 12 features were selected to train the model, including age, sex, average ICU stay in hours, visual and auditory abnormalities, RASS assessment score, APACHE-II score, number of indwelling invasive catheters, restraint, and sedative and hypnotic drugs. After feature data cleaning, processing and imputation by the KNN interpolation method, a total of 54,595 case events were extracted for machine learning model analysis. Events from May 1 to November 30, 2022, were used as model training data, with 80% as the training set and 20% as the internal validation set, and events from December 1 to December 31, 2022, served as the external validation set. Finally, model inference and performance evaluation were performed, and the model was then retrained by adjusting the model parameters. Results: In this study, XGBoost, Random Forest, Logistic Regression and Decision Tree models were analyzed and compared. The average accuracy in internal validation was highest for Random Forest (AUC = 0.86); in external validation, Random Forest and XGBoost were highest, with an AUC of 0.86; and the average cross-validation accuracy was highest for Random Forest (ACC = 0.77). Conclusion: Clinically, medical staff usually conduct CAM-ICU assessments at the bedside of critically ill patients, but there is a lack of machine learning classification methods to assist with real-time assessment of ICU patients, and thus no objective and continuous monitoring data to help clinical staff more accurately identify and predict the occurrence of delirium. It is hoped that the development of predictive models through machine learning can predict delirium early and immediately, support clinical decisions at the best time, and, together with PADIS delirium care measures, provide individualized non-pharmacological interventions to maintain patient safety and improve the quality of care.
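A skeletal sketch of the modelling step is shown below on synthetic records; the twelve clinical features are stand-ins generated by scikit-learn and no real patient data are involved.

```python
# Hedged sketch: random forest classifier for 8-hour delirium risk, evaluated by AUC.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# ~12 features per 8-hour event window; binary outcome: delirium in the next 8 hours.
X, y = make_classification(n_samples=5000, n_features=12, n_informative=6,
                           weights=[0.7, 0.3], random_state=0)

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2,
                                                  stratify=y, random_state=0)
model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)
val_auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
print(f"internal validation AUC = {val_auc:.2f}")   # compare against the reported 0.86
```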

Keywords: critically ill patients, machine learning methods, delirium prediction, classifier model

Procedia PDF Downloads 45
28804 Sound Exposure Effects towards Ross Broilers Growth Rate

Authors: Rashidah Ghazali, Herlina Abdul Rahim, Mashitah Shikh Maidin, Shafishuhaza Sahlan, Noramli Abdul Razak

Abstract:

Sound exposure effects have been investigated by broadcasting the sound of Quran verses to one group of broilers (Group B), while the other group served as the control (Group C). Growth rate, in terms of weight, and raw meat texture, measured by shear force, were compared. Twenty-seven broilers were randomly selected from each group on Day 24, and weight measurement was carried out every week until the harvest day (Day 39). Group B showed a higher mean weight on Day 24 (1.441±0.013 kg) than Group C. A significant difference in weight on Day 39 existed for Group B compared to Group C (p < 0.05). However, there was no significant difference (p > 0.05) in shear force for the same muscles (breast and drumstick raw meat) between the two groups, but the shear force of the breast meat for both Group B and Group C broilers was lower (p < 0.05) than that of their drumstick meat. Thus, broadcasting the sound of Quran verses in the coop can be applied to improve the growth rate of broilers and produce better quality poultry.
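The weight comparison can be illustrated with an independent two-sample t-test; the weights below are simulated around plausible means and are not the experimental records.

```python
# Hedged sketch: independent two-sample t-test of harvest-day weights.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(5)
group_b = rng.normal(loc=2.05, scale=0.12, size=27)   # broilers exposed to the recitation
group_c = rng.normal(loc=1.95, scale=0.12, size=27)   # control broilers

t_stat, p_value = ttest_ind(group_b, group_c)
print(f"mean B = {group_b.mean():.3f} kg, mean C = {group_c.mean():.3f} kg, p = {p_value:.4f}")
```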

Keywords: broilers, sound, shear force, weight

Procedia PDF Downloads 401
28803 Experimental Analysis of the Influence of Water Mass Flow Rate on the Performance of a CO2 Direct-Expansion Solar Assisted Heat Pump

Authors: Sabrina N. Rabelo, Tiago de F. Paulino, Willian M. Duarte, Samer Sawalha, Luiz Machado

Abstract:

Energy use is one of the main indicators of the economic and social development of a country, reflecting directly on the quality of life of the population. The expansion of energy use, together with the depletion of fossil resources and the poor efficiency of energy systems, has led many countries in recent years to invest in renewable energy sources. In this context, the solar-assisted heat pump has become very important in the energy industry, since it can transfer heat energy from the sun to water or another absorbing source. The direct-expansion solar assisted heat pump (DX-SAHP) water heater operates by receiving solar energy incident on a solar collector, which serves as the evaporator in a refrigeration cycle, and the energy rejected by the condenser is used for water heating. In this paper, a DX-SAHP using carbon dioxide as refrigerant (R744) was assembled, and the influence of varying the water mass flow rate on the system was analyzed. Parameters such as the high pressure, water outlet temperature, gas cooler outlet temperature, evaporator temperature, and the coefficient of performance were studied. The main components used to assemble the heat pump were a reciprocating compressor, a gas cooler consisting of a countercurrent concentric tube heat exchanger, a needle valve, and an evaporator consisting of a copper bare flat plate solar collector designed to capture direct and diffuse radiation. Routines were developed in LabVIEW and in CoolProp through MATLAB, respectively, to collect data and calculate the thermodynamic properties. The measured coefficient of performance ranged from 3.2 to 5.34. It was noticed that, with a higher water mass flow rate, the water outlet temperature decreased and, consequently, the coefficient of performance of the system increased, since the heat transfer in the gas cooler is higher. In addition, the high pressure of the system and the CO2 gas cooler outlet temperature decreased. The heat pump using carbon dioxide as a refrigerant, especially operating with solar radiation, has proven to be an efficient renewable-based system for heating residential water compared to electrical heaters, reaching temperatures between 40 °C and 80 °C.
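The coefficient of performance calculation can be sketched with CoolProp's Python interface (the study used CoolProp through MATLAB); the state points below, including the evaporating temperature, gas-cooler pressure and outlet temperature, and the isentropic efficiency, are assumed values for illustration.

```python
# Hedged sketch: heating COP of a transcritical CO2 (R744) cycle from assumed state points.
from CoolProp.CoolProp import PropsSI

T_evap = 283.15          # 10 C evaporating temperature (assumed)
P_high = 9.0e6           # 90 bar gas-cooler pressure (assumed)
T_gc_out = 313.15        # 40 C gas-cooler outlet temperature (assumed)
eta_is = 0.7             # assumed compressor isentropic efficiency

h1 = PropsSI("H", "T", T_evap, "Q", 1, "CO2")          # saturated vapour leaving the evaporator
s1 = PropsSI("S", "T", T_evap, "Q", 1, "CO2")
h2s = PropsSI("H", "P", P_high, "S", s1, "CO2")        # isentropic compressor outlet
h2 = h1 + (h2s - h1) / eta_is                          # real compressor outlet enthalpy
h3 = PropsSI("H", "P", P_high, "T", T_gc_out, "CO2")   # gas-cooler (water heater) outlet

cop_heating = (h2 - h3) / (h2 - h1)                    # heat delivered per unit compressor work
print(f"heating COP ~ {cop_heating:.2f}")              # falls in the 3.2-5.34 range reported
```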

Keywords: water mass flow rate, R-744, heat pump, solar evaporator, water heater

Procedia PDF Downloads 158
28802 Analysis and Forecasting of Bitcoin Price Using Exogenous Data

Authors: J-C. Leneveu, A. Chereau, L. Mansart, T. Mesbah, M. Wyka

Abstract:

Extracting and interpreting information from Big Data will remain a major stake for years to come in several sectors such as finance. Currently, numerous methods (such as technical analysis) are used to try to understand and anticipate market behavior, with mixed results, because it still seems impossible to predict a financial trend exactly. The increase of available data on the Internet and their diversity represent a great opportunity for the financial world. Indeed, along with standard financial data, it is possible to focus on exogenous data to take more macroeconomic factors into account, and coupling the interpretation of these data with standard methods could allow more precise trend predictions. In this paper, in order to observe the influence of exogenous data on price independently of other usual effects occurring in classical markets, the behavior of Bitcoin users is introduced into a model reconstituting Bitcoin value, which is elaborated and tested for prediction purposes.
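One hedged way to fold exogenous behaviour signals into a price model is shown below using SARIMAX with an exogenous regressor; the series are simulated stand-ins for Bitcoin price and user activity, and this is not the model elaborated in the paper.

```python
# Hedged sketch: forecasting a price series with an exogenous user-activity regressor.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
n = 400
activity = np.cumsum(rng.normal(size=n))               # exogenous "user behaviour" index
price = 100 + 2.0 * activity + np.cumsum(rng.normal(scale=0.5, size=n))

endog = pd.Series(price[:-30])                         # hold out the last 30 steps
exog = pd.Series(activity[:-30])

res = SARIMAX(endog, exog=exog, order=(1, 1, 1)).fit(disp=False)
forecast = res.forecast(steps=30, exog=pd.Series(activity[-30:]))
print(forecast.head())
```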

Keywords: big data, bitcoin, data mining, social network, financial trends, exogenous data, global economy, behavioral finance

Procedia PDF Downloads 341
28801 Influence of Transportation Mode to the Deterioration Rate: Case Study of Food Transport by Ship

Authors: Danijela Tuljak-Suban, Valter Suban

Abstract:

Food, as perishable goods, represents a specific and sensitive part of supply chain theory, since changes in its physical or chemical characteristics considerably influence the approach to stock management. The most delicate phase of this process is transportation, where it becomes difficult to ensure the stable conditions that limit deterioration, since the value of the deterioration rate can easily be influenced by the transportation mode. A fuzzy definition of the variables allows these variations to be taken into account, and an appropriate choice of the defuzzification method permits the results to be adapted, as much as possible, to real conditions. In this article, these methods are applied to the relationship between the deterioration rate of perishable goods and transport by ship, with the aims of: (a) minimizing the total cost function, defined as the sum of the ordering cost, holding cost, disposal cost and transportation costs, and (b) improving supply chain sustainability by reducing the environmental impact and waste disposal costs.
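A toy sketch of the idea follows: the deterioration rate is represented as a triangular fuzzy number whose spread depends on the transport mode, defuzzified by the centroid method, and the crisp value feeds a simple total-cost expression. All numbers and the cost form are illustrative assumptions.

```python
# Hedged sketch: fuzzy deterioration rate, centroid defuzzification, and a total-cost function.
def centroid_defuzzify(tri):
    """Centroid of a triangular fuzzy number (a, b, c)."""
    a, b, c = tri
    return (a + b + c) / 3.0

def total_cost(order_qty, demand, deterioration_rate,
               ordering_cost=200.0, holding_cost=0.8,
               disposal_cost=3.0, transport_cost_per_unit=1.5):
    cycles_per_period = demand / order_qty
    avg_inventory = order_qty / 2.0
    deteriorated = deterioration_rate * avg_inventory
    return (ordering_cost * cycles_per_period      # ordering cost
            + holding_cost * avg_inventory         # holding cost
            + disposal_cost * deteriorated         # disposal of spoiled goods
            + transport_cost_per_unit * demand)    # transportation cost

# Fuzzy deterioration rates (per period) for two transport options of the same food cargo.
rate_ship = centroid_defuzzify((0.02, 0.05, 0.10))          # conventional ship, wider uncertainty
rate_reefer = centroid_defuzzify((0.01, 0.02, 0.04))        # refrigerated ship, narrower

for label, rate in [("conventional ship", rate_ship), ("refrigerated ship", rate_reefer)]:
    print(label, round(total_cost(order_qty=400, demand=2000, deterioration_rate=rate), 1))
```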

Keywords: perishable goods, fuzzy reasoning, transport by ship, supply chain sustainability

Procedia PDF Downloads 529