Search results for: data driven diagnosis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26866

21916 Cloud Monitoring and Performance Optimization Ensuring High Availability

Authors: Inayat Ur Rehman, Georgia Sakellari

Abstract:

Cloud computing has evolved into a vital technology for businesses, offering scalability, flexibility, and cost-effectiveness. However, maintaining high availability and optimal performance in the cloud is crucial for reliable services. This paper explores the significance of cloud monitoring and performance optimization in sustaining the high availability of cloud-based systems. It discusses diverse monitoring tools, techniques, and best practices for continually assessing the health and performance of cloud resources. The paper also delves into performance optimization strategies, including resource allocation, load balancing, and auto-scaling, to ensure efficient resource utilization and responsiveness. Addressing potential challenges in cloud monitoring and optimization, the paper offers insights into data security and privacy considerations. Through this thorough analysis, the paper aims to underscore the importance of cloud monitoring and performance optimization for ensuring a seamless and highly available cloud computing environment.
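
A minimal sketch of the kind of auto-scaling rule the abstract describes: a utilization metric is smoothed over a short window and the instance count is adjusted against thresholds. All thresholds, limits, and the metric source are illustrative assumptions, not any cloud provider's API.

```python
# Threshold-based auto-scaling sketch; parameters are illustrative assumptions.
def autoscale(cpu_history, instances, min_inst=2, max_inst=20,
              scale_out_at=0.75, scale_in_at=0.30, window=5):
    """Return the new instance count given recent CPU utilization samples."""
    if len(cpu_history) < window:
        return instances                       # not enough data yet
    avg = sum(cpu_history[-window:]) / window  # smooth over the window
    if avg > scale_out_at and instances < max_inst:
        return instances + 1                   # scale out under sustained load
    if avg < scale_in_at and instances > min_inst:
        return instances - 1                   # scale in when under-utilized
    return instances

# Example: sustained ~80% CPU triggers one scale-out step.
print(autoscale([0.80, 0.82, 0.78, 0.85, 0.81], instances=4))  # -> 5
```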

Keywords: cloud computing, cloud monitoring, performance optimization, high availability, scalability, resource allocation, load balancing, auto-scaling, data security, data privacy

Procedia PDF Downloads 50
21915 Design of a Fuzzy Expert System for the Impact of Diabetes Mellitus on Cardiac and Renal Impediments

Authors: E. Rama Devi Jothilingam

Abstract:

Diabetes mellitus is now one of the most common non-communicable diseases globally. India leads the world with the largest number of diabetic subjects, earning the title "diabetes capital of the world". In order to reduce the mortality rate, a fuzzy expert system is designed to predict the severity of cardiac and renal problems of diabetic patients using fuzzy logic. Since uncertainty is inherent in medicine, fuzzy logic is used in this research work to handle the inherent fuzziness of linguistic concepts and the uncertain status of diabetes mellitus, which is a prime cause of cardiac arrest and renal failure. In this work, the controllable risk factors (blood sugar, insulin, ketones, lipids, obesity, blood pressure, and protein/creatinine ratio) are considered as input parameters, and the "stages of cardiac" (SOC) and "stages of renal" (SORD) are considered as the output parameters. Triangular membership functions are used to model the input and output parameters. The rule base for the proposed expert system is constructed from the knowledge of medical experts. A Mamdani inference engine is used to infer information from the rule base and make the major decisions in diagnosis. Mean-of-maximum defuzzification is used to obtain a non-fuzzy control action that best represents the possibility distribution of an inferred fuzzy control action. The proposed system also classifies patients into high risk and low risk using fuzzy c-means clustering so that high-risk patients are treated immediately. The system is validated in MATLAB and is used as a tracking system with accuracy and robustness.
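
For illustration, a minimal Mamdani-style inference step with triangular membership functions and mean-of-maximum defuzzification, as named in the abstract. The membership parameters and the two toy rules below are invented for the sketch; the paper's actual rule base comes from medical experts.

```python
import numpy as np

# Minimal Mamdani sketch: triangular memberships, min implication, max
# aggregation, mean-of-maximum defuzzification. All parameters are invented.

def tri(x, a, b, c):
    """Triangular membership with feet at a and c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

severity = np.linspace(0, 10, 201)  # discretized output domain (stage score)

def infer(blood_sugar, blood_pressure):
    sugar_high = tri(blood_sugar, 140, 200, 260)   # fuzzify the inputs
    bp_high = tri(blood_pressure, 130, 160, 190)
    # Rule 1: IF sugar high AND blood pressure high THEN severity high.
    out1 = np.minimum(min(sugar_high, bp_high), tri(severity, 5, 8, 10))
    # Rule 2: IF sugar high THEN severity medium.
    out2 = np.minimum(sugar_high, tri(severity, 2, 5, 8))
    agg = np.maximum(out1, out2)                   # aggregate the fired rules
    # Mean of maximum: average the output points at peak membership.
    return severity[agg == agg.max()].mean()

print(infer(blood_sugar=210, blood_pressure=165))
```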

Keywords: diabetes mellitus, fuzzy expert system, Mamdani, MATLAB

Procedia PDF Downloads 286
21914 The Use of Artificial Intelligence to Curb Corruption in Brazil

Authors: Camila Penido Gomes

Abstract:

Over the past decade, an emerging body of research has pointed to artificial intelligence's great potential to improve the use of open data, increase transparency, and curb corruption in the public sector. Nonetheless, studies on this subject are scant and usually lack evidence to validate the effectiveness of AI-based technologies in addressing corruption, especially in developing countries. Aiming to fill this void in the literature, this paper examines how AI has been deployed by civil society to improve the use of open data and prevent congresspeople from misusing public resources in Brazil. Building on the current debates and carrying out a systematic literature review and extensive document analyses, this research reveals that AI should not be deployed as a silver bullet to fight corruption. Instead, this technology is more powerful when adopted by a multidisciplinary team as a civic tool in conjunction with other strategies. This study makes considerable contributions, bringing to the forefront a more accurate understanding of the factors that play a decisive role in the successful implementation of AI-based technologies in anti-corruption efforts.

Keywords: artificial intelligence, civil society organization, corruption, open data, transparency

Procedia PDF Downloads 198
21913 Performance Study of Classification Algorithms for Consumer Online Shopping Attitudes and Behavior Using Data Mining

Authors: Rana Alaa El-Deen Ahmed, M. Elemam Shehab, Shereen Morsy, Nermeen Mekawie

Abstract:

With the growing popularity and acceptance of e-commerce platforms, users face an ever-increasing burden in choosing the right product from the large number of online offers, so techniques for personalization and shopping guidance are needed. For a pleasant and successful shopping experience, users need to know easily which products to buy with high confidence. Since selling a wide variety of products has become easier due to the popularity of online stores, online retailers are able to sell more products than a physical store; the disadvantage is that customers might not find the products they need. Recommender systems, already used on some e-commerce websites, address this: a recommender system learns from information about customers and products and provides appropriate personalized recommendations that help customers find the needed product. In this paper, eleven classification algorithms are comparatively tested to find the classifier that best fits consumer online shopping attitudes and behavior in the experimental dataset. WEKA, an open-source data mining workbench, was used to compare the conventional classifiers. The results show that the decision table and filtered classifier give the highest accuracy, while classification via clustering and SimpleCart give the lowest.
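
The paper runs its comparison in WEKA; the sketch below shows the analogous cross-validated benchmarking workflow in Python with scikit-learn, on synthetic data and with a handful of stand-in classifiers, purely for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for the consumer shopping dataset used in the paper.
X, y = make_classification(n_samples=1000, n_features=12, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "naive Bayes": GaussianNB(),
    "k-NN": KNeighborsClassifier(),
    "logistic regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=10).mean()  # 10-fold CV accuracy
    print(f"{name:20s} {acc:.3f}")
```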

Keywords: classification, data mining, machine learning, online shopping, WEKA

Procedia PDF Downloads 343
21912 Learning from a Small Amount of Medical Data with Noisy Labels: A Meta-Learning Approach

Authors: Gorkem Algan, Ilkay Ulusoy, Saban Gonul, Banu Turgut, Berker Bakbak

Abstract:

Computer vision systems recently made a big leap thanks to deep neural networks. However, these systems require correctly labeled large datasets in order to be trained properly, which is very difficult to obtain for medical applications. The two main reasons for label noise in medical applications are the high complexity of the data and conflicting opinions of experts. Moreover, medical imaging datasets are commonly tiny, which makes each data point very important in learning. As a result, if not handled properly, label noise significantly degrades performance. Therefore, a label-noise-robust learning algorithm that makes use of the meta-learning paradigm is proposed in this article. The proposed solution is tested on a retinopathy of prematurity (ROP) dataset with a very high label noise of 68%. Results show that the proposed algorithm significantly improves the classifier's performance in the presence of noisy labels.
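
A compact sketch of one common meta-learning recipe for noisy labels (per-example reweighting through a virtual gradient step, in the spirit of Ren et al., 2018), shown on a toy linear classifier with synthetic data. The abstract does not give the authors' exact algorithm, so this is illustrative only.

```python
import torch

# Meta-learned example reweighting sketch; model, data, and hyperparameters
# are invented, not the paper's ROP pipeline.
torch.manual_seed(0)
d, n_noisy, n_clean, lr = 10, 256, 32, 0.1
Xn, yn = torch.randn(n_noisy, d), torch.randint(0, 2, (n_noisy,)).float()  # noisy train set
Xc, yc = torch.randn(n_clean, d), torch.randint(0, 2, (n_clean,)).float()  # small clean meta set

W = torch.zeros(d, requires_grad=True)
bce = torch.nn.functional.binary_cross_entropy_with_logits

for step in range(100):
    # 1) Weighted loss on the noisy batch; per-example weights eps start at 0.
    eps = torch.zeros(n_noisy, requires_grad=True)
    losses = bce(Xn @ W, yn, reduction="none")
    gW, = torch.autograd.grad((eps * losses).sum(), W, create_graph=True)
    W_hat = W - lr * gW                          # one virtual SGD step
    # 2) How does upweighting each example affect the clean-set loss?
    clean_loss = bce(Xc @ W_hat, yc)
    geps, = torch.autograd.grad(clean_loss, eps)
    # 3) Keep only examples whose upweighting reduces the clean loss.
    w = torch.clamp(-geps, min=0.0)
    w = w / w.sum() if w.sum() > 0 else torch.full_like(w, 1.0 / n_noisy)
    # 4) Real update using the learned per-example weights.
    final_loss = (w.detach() * bce(Xn @ W, yn, reduction="none")).sum()
    W.data -= lr * torch.autograd.grad(final_loss, W)[0]

print(f"clean-set loss after training: {bce(Xc @ W, yc).item():.3f}")
```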

Keywords: deep learning, label noise, robust learning, meta-learning, retinopathy of prematurity

Procedia PDF Downloads 151
21911 Relational Attention Shift on Images Using BU-TD Architecture and Sequential Structure Revealing

Authors: Alona Faktor

Abstract:

In this work, we present an NN-based computational model that can perform attention shifts according to high-level instruction. The instruction specifies the type of attentional shift using an explicit geometrical relation. The instruction can also be of a cognitive nature, specifying more complex human-human, human-object, or object-object interaction. Applying this approach sequentially allows obtaining a structural description of an image. A novel dataset of interacting humans and objects is constructed using a computer graphics engine. Using these data, we perform systematic research of relational segmentation shifts.

Keywords: cognitive science, attention, deep learning, generalization

Procedia PDF Downloads 191
21910 Emergence of Information Centric Networking and Web Content Mining: A Future Efficient Internet Architecture

Authors: Sajjad Akbar, Rabia Bashir

Abstract:

With the growth in the number of users, Internet usage has evolved, and due to its key design principle, there has been an incredible expansion in its size. This tremendous growth of the Internet has brought new applications (mobile video and cloud computing) as well as new user requirements, i.e., a content distribution environment, mobility, ubiquity, security, and trust. Users are more interested in contents than in their communicating peer nodes. The current Internet architecture is a host-centric networking approach, which is not well suited to these types of applications. With the growing use of multiple interactive applications, the host-centric approach is considered less efficient as it depends on physical location; for this reason, Information Centric Networking (ICN) is considered a potential future Internet architecture. It is an approach that introduces uniquely named data as a core Internet principle, uses a receiver-oriented rather than a sender-oriented approach, and introduces a naming-based information system at the network layer. Although ICN is considered a candidate future Internet architecture, it has drawn much criticism, mainly concerning how it will manage the most relevant content. Here, Web Content Mining (WCM) approaches can help with the appropriate data management of ICN. To address this issue, this paper contributes by (i) discussing multiple ICN approaches, (ii) analyzing different Web Content Mining approaches, and (iii) creating a new Internet architecture by merging ICN and WCM to solve the data management issues of ICN. From ICN, Content-Centric Networking (CCN) is selected for the new architecture, whereas the agent-based approach from Web Content Mining is selected to find the most appropriate data.
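
As a toy illustration of ICN's core idea of requesting uniquely named data rather than hosts, here is a CCN-style content store that answers an Interest by name, falling back to shorter name prefixes. The names and data are invented; a real CCN node would forward unsatisfied Interests upstream.

```python
# Toy CCN-style named-data retrieval: content is addressed by hierarchical
# name, not by host. Names and payloads below are illustrative.

class ContentStore:
    def __init__(self):
        self.store = {}                       # name -> data (in-network cache)

    def publish(self, name, data):
        self.store[name] = data

    def interest(self, name):
        """Serve an Interest packet: exact match first, then longest prefix."""
        if name in self.store:
            return self.store[name]
        parts = name.split("/")
        for i in range(len(parts) - 1, 0, -1):   # try shorter prefixes
            prefix = "/".join(parts[:i])
            if prefix in self.store:
                return self.store[prefix]
        return None                           # would be forwarded upstream

cs = ContentStore()
cs.publish("/video/lectures/icn", b"...segment bytes...")
print(cs.interest("/video/lectures/icn/seg1"))  # served from the cached prefix
```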

Keywords: agent based web content mining, content centric networking, information centric networking

Procedia PDF Downloads 467
21909 Location Uncertainty: A Probabilistic Solution for Automatic Train Control

Authors: Monish Sengupta, Benjamin Heydecker, Daniel Woodland

Abstract:

New train control systems rely mainly on Automatic Train Protection (ATP) and Automatic Train Operation (ATO) to dynamically control speed and hence performance. The ATP and ATO form the vital element within the CBTC (Communication Based Train Control) and ERTMS (European Rail Traffic Management System) system architectures. Reliable and accurate measurement of train location, speed, and acceleration is vital to the operation of train control systems. In the past, all CBTC and ERTMS systems have deployed a balise or equivalent to correct the uncertainty element of the train location. Typically, a CBTC train is allowed to miss only one balise on the track, after which the Automatic Train Protection (ATP) system applies the emergency brake to halt the service. This is because the location uncertainty, which grows within the train control system, cannot tolerate missing more than one balise. Balises contribute a significant amount towards wayside maintenance, and studies have shown that balises on the track also constrain future track layout changes and changes in the speed profile. This paper investigates the causes of the location uncertainty that is currently experienced and considers whether it is possible to identify an effective filter to ascertain, in conjunction with appropriate sensors, more accurate speed, distance, and location for a CBTC-driven train without the need for any external balises. An appropriate sensor fusion algorithm and intelligent sensor selection methodology will be deployed to ascertain railway location and speed measurement at the highest precision. Similar techniques are already in use in aviation, satellite, submarine, and other navigation systems. Developing a model for speed control and the use of a Kalman filter is a key element in this research. This paper summarizes the research undertaken and its significant findings, highlighting the potential for introducing alternative approaches to train positioning that would enable removal of all trackside location-correction balises, leading to a huge reduction in maintenance and more flexibility in future track design.
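
A minimal sketch of the kind of filter the paper investigates: a 1-D constant-velocity Kalman filter that fuses noisy position fixes into smoothed position and speed estimates. The motion model and noise levels are illustrative, not calibrated railway values.

```python
import numpy as np

# 1-D constant-velocity Kalman filter on synthetic position fixes.
dt = 0.1
F = np.array([[1, dt], [0, 1]])     # state transition: [position, speed]
H = np.array([[1.0, 0.0]])          # we observe position only
Q = np.diag([0.01, 0.1])            # process noise (model uncertainty, assumed)
R = np.array([[4.0]])               # measurement noise (sensor uncertainty, assumed)

x = np.array([[0.0], [20.0]])       # initial state: 0 m, 20 m/s
P = np.eye(2)

rng = np.random.default_rng(0)
for k in range(1, 101):
    z = np.array([[20.0 * k * dt + rng.normal(0, 2.0)]])  # noisy position fix
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

print(f"estimated position {x[0, 0]:.1f} m, speed {x[1, 0]:.2f} m/s")
```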

Keywords: ERTMS, CBTC, ATP, ATO

Procedia PDF Downloads 407
21908 A Simple Algorithm for Real-Time 3D Capturing of an Interior Scene Using a Linear Voxel Octree and a Floating Origin Camera

Authors: Vangelis Drosos, Dimitrios Tsoukalos, Dimitrios Tsolis

Abstract:

We present a simple algorithm for capturing a 3D scene (focused on the usage of mobile device cameras in the context of augmented/mixed reality) by using a floating origin camera solution and storing the resulting information in a linear voxel octree. Data are derived from point clouds captured by a mobile device camera. For the purposes of this paper, we assume a scene of fixed size (known to us or determined beforehand) and a fixed voxel resolution. The resulting data are stored in a linear voxel octree using a hash table. We commence by briefly discussing the logic behind floating origin approaches and the usage of linear voxel octrees for efficient storage. Following that, we present the algorithm for translating captured feature points into voxel data in the context of a fixed-origin world and storing them. Finally, we discuss potential applications and areas of future development and improvement to the efficiency of our solution.
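
A sketch of the storage scheme described: occupied voxels at a fixed resolution are keyed in a hash table by their interleaved (Morton) code, with points assumed to be already expressed in floating-origin (camera-relative) coordinates. The scene size and resolution values are illustrative.

```python
# Linear voxel octree backed by a hash table (dict), keyed by Morton code.

def morton3(x, y, z, bits=10):
    """Interleave the bits of three voxel coordinates into one integer key."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (3 * i)
        code |= ((y >> i) & 1) << (3 * i + 1)
        code |= ((z >> i) & 1) << (3 * i + 2)
    return code

class LinearVoxelOctree:
    def __init__(self, scene_size, resolution):
        self.voxel = scene_size / resolution   # edge length of one voxel
        self.cells = {}                        # Morton code -> payload

    def insert_point(self, px, py, pz, payload=True):
        # Quantize a camera-relative point to integer voxel indices.
        ix, iy, iz = (int(p // self.voxel) for p in (px, py, pz))
        self.cells[morton3(ix, iy, iz)] = payload

    def occupied(self, ix, iy, iz):
        return morton3(ix, iy, iz) in self.cells

tree = LinearVoxelOctree(scene_size=16.0, resolution=1024)  # ~1.6 cm voxels
tree.insert_point(1.23, 0.45, 2.72)
print(tree.occupied(78, 28, 174))   # True
```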

Keywords: voxel, octree, computer vision, XR, floating origin

Procedia PDF Downloads 127
21907 The Effect of Excel on Undergraduate Students’ Understanding of Statistics and the Normal Distribution

Authors: Masomeh Jamshid Nejad

Abstract:

Nowadays, statistical literacy is no longer just a desirable skill but an essential one, with broad applications across diverse fields, especially in operational decision areas such as business management, finance, and economics. As such, learning and a deep understanding of statistical concepts are essential in the context of business studies. One of the crucial topics in statistical theory and its application is the normal distribution, often called the bell-shaped curve. To interpret data and conduct hypothesis tests, comprehending the properties of the normal distribution (the mean and standard deviation) is essential for business students. This requires undergraduate students in economics and business management to visualize and work with normally distributed data. Since technology is now interconnected with education, it is important to teach statistics topics to undergraduate students in the context of Python, RStudio, and Microsoft Excel. This research endeavours to shed light on the effect of Excel-based instruction on learners’ knowledge of statistics, specifically the central concept of the normal distribution. Two groups of undergraduate students from the Business Management program were compared in this study: one group underwent Excel-based instruction, and the other relied only on traditional teaching methods. We analyzed experimental data and BBA participants’ responses to statistics-related questions focusing on the normal distribution, including its key attributes, such as the mean and standard deviation. The results of our study indicate that exposing students to Excel-based learning supports them in comprehending statistical concepts more effectively than the traditional teaching method. In addition, students taught with Excel-based instruction showed greater ability in visualizing and interpreting normally distributed data.

Keywords: statistics, Excel-based instruction, data visualization, pedagogy

Procedia PDF Downloads 46
21906 Novel Recommender Systems Using Hybrid CF and Social Network Information

Authors: Kyoung-Jae Kim

Abstract:

Collaborative Filtering (CF) is a popular technique for personalization in the e-commerce domain to reduce information overload. In general, CF recommends item lists based on the preferences of similar users drawn from the user-item matrix and uses them to predict the focal user’s preference for particular items. Many real-world recommender systems use CF techniques because of their excellent accuracy and robustness. However, CF has some limitations, including sparsity problems and the high dimensionality of the user-item matrix. In addition, traditional CF does not consider the emotional interaction between users. In this study, we propose recommender systems using social network information and singular value decomposition (SVD) to alleviate these limitations. The purpose of this study is to reduce the dimensionality of the dataset using SVD and to improve the performance of CF by using emotional information from the focal user’s social network data. We test the usability of the hybrid CF, SVD, and social network information model using real-world data. The experimental results show that the proposed model outperforms conventional CF models.
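
A minimal sketch of the SVD step: the user-item matrix is factorized, truncated to k latent factors, and the low-rank reconstruction is used to score unrated items. The ratings are toy values, and the paper's social-network emotional information, which would enter as an additional signal, is omitted here.

```python
import numpy as np

# SVD-based collaborative filtering on a tiny, invented user-item matrix.
R = np.array([[5, 4, 0, 1],
              [4, 5, 1, 0],
              [1, 0, 5, 4],
              [0, 1, 4, 5]], dtype=float)     # 0 = unrated

U, s, Vt = np.linalg.svd(R, full_matrices=False)
k = 2                                          # keep the top-k latent factors
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]  # low-rank reconstruction

user = 0
unrated = np.where(R[user] == 0)[0]
best = unrated[np.argmax(R_hat[user, unrated])]
print(f"recommend item {best} to user {user} (predicted {R_hat[user, best]:.2f})")
```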

Keywords: recommender systems, collaborative filtering, social network information, singular value decomposition

Procedia PDF Downloads 282
21905 The Cost of Non-Communicable Diseases in the European Union: A Projection towards the Future

Authors: Desiree Vandenberghe, Johan Albrecht

Abstract:

Non-communicable diseases (NCDs) are responsible for the vast majority of deaths in the European Union (EU) and represent a large share of total health care spending. A future increase in this health and financial burden is likely to be driven by population ageing, lifestyle changes, and technological advances in medicine. Without adequate prevention measures, this burden can severely threaten population health and economic development. To tackle this challenge, a correct assessment of the current burden of NCDs is required, as well as a projection of potential increases in this burden. The contribution of this paper is to offer perspective on the evolution of the NCD burden towards the future and to give an indication of the potential of prevention policy. A non-homogeneous semi-Markov model for the EU was constructed, which allowed a projection of the cost burden for the four main NCDs (cancer, cardiovascular disease, chronic respiratory disease, and diabetes mellitus) towards 2030 and 2050. This simulation is based on multiple baseline scenarios that vary in demand and supply factors such as health status, population structure, and technological advances. Finally, in order to assess the potential of preventive measures to curb the cost explosion of NCDs, a simulation is executed that includes increased efforts for preventive health care. According to the Markov model, by 2030 and 2050, total costs (direct and indirect) in the EU could increase by 30.1% and 44.1% respectively, compared to 2015 levels. An ambitious prevention policy framework for NCDs will be required if the EU wants to meet this challenge of rising costs. To conclude, significant cost increases due to non-communicable diseases are likely to occur due to demographic and lifestyle changes. Nevertheless, an ambitious prevention program throughout the EU can help make this cost burden manageable for future generations.
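
For intuition, a heavily simplified, time-homogeneous Markov cohort projection (the paper's model is non-homogeneous and semi-Markov): a population vector is advanced through assumed annual transition probabilities while per-state costs accumulate. All numbers below are invented, not the paper's calibrated EU parameters.

```python
import numpy as np

# Toy Markov cohort cost projection; states, probabilities, and costs invented.
states = ["healthy", "NCD", "dead"]
P = np.array([[0.96, 0.03, 0.01],   # annual transitions from healthy
              [0.00, 0.92, 0.08],   # from NCD (no remission assumed)
              [0.00, 0.00, 1.00]])  # dead is absorbing
cost = np.array([500.0, 8000.0, 0.0])   # annual cost per person per state

pop = np.array([400e6, 100e6, 0.0])     # initial cohort (persons)
total = 0.0
for year in range(2015, 2031):
    total += pop @ cost                 # cost incurred this year
    pop = pop @ P                       # advance the cohort one year
print(f"cumulative cost 2015-2030: {total / 1e12:.2f} trillion")
```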

Keywords: non-communicable diseases, preventive health care, health policy, Markov model, scenario analysis

Procedia PDF Downloads 131
21904 Comparison of Developed Statokinesigram and Marker Data Signals by Model Approach

Authors: Boris Barbolyas, Kristina Buckova, Tomas Volensky, Cyril Belavy, Ladislav Dedik

Abstract:

Background: Human balance control is often studied on the basis of the statokinesigram. In this study, the approach to human postural reaction analysis combines the stabilometry output signal with retroreflective marker data signal processing, analysis, and understanding. The study also shows another original application of the Method of Developed Statokinesigram Trajectory (MDST). Methods: Participants maintained quiet bipedal standing for 10 s on a stabilometry platform. Subsequently, bilateral vibration stimuli were applied to the Achilles tendons for a 20 s interval; the stimuli caused the human postural system to settle into a new pseudo-steady state. Vibration frequencies were 20, 60 and 80 Hz. Participants' body segments (head, shoulders, hips, knees, ankles and little fingers) were marked with 12 retroreflective markers, whose positions were scanned by the six-camera BTS SMART DX system. Registration of the postural reaction lasted 60 s, with a sampling frequency of 100 Hz. The measured data were processed with the Method of Developed Statokinesigram Trajectory. Regression analysis of developed statokinesigram trajectory (DST) data and retroreflective marker developed trajectory (DMT) data was used to find out which marker trajectories correlate most with the stabilometry platform output signals. Scaling coefficients (λ) between DST and DMT were also evaluated by linear regression analysis. Results: Scaling coefficients for marker trajectories were identified for all body segments. Head marker trajectories reached the maximal value and ankle marker trajectories the minimal value of the scaling coefficient. Hip, knee and ankle markers were approximately symmetrical in terms of the scaling coefficient, while notable differences in scaling coefficient were detected in head and shoulder marker trajectories, which were not symmetrical. The model of postural system behavior was identified by MDST. Conclusion: The value of the scaling factor identifies which body segment is predisposed to postural instability. Hypothetically, if the statokinesigram represents the overall human postural system response to vibration stimuli, then the marker data represent particular postural responses. It can be assumed that the cumulative sum of the particular marker postural responses is equal to the statokinesigram.
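
The scaling coefficient estimation reduces to a least-squares slope between the two developed trajectories; a sketch on synthetic signals, assuming a through-origin linear relation DMT ≈ λ·DST:

```python
import numpy as np

# Estimate the scaling coefficient lambda between a developed statokinesigram
# trajectory (DST) and one marker trajectory (DMT); signals are synthetic.
rng = np.random.default_rng(1)
t = np.linspace(0, 60, 6000)                     # 60 s at 100 Hz
dst = np.cumsum(rng.normal(0, 0.1, t.size))      # surrogate CoP trajectory
dmt = 1.8 * dst + rng.normal(0, 0.5, t.size)     # head marker ~ scaled DST

lam = (dmt @ dst) / (dst @ dst)                  # slope of DMT = lambda * DST
print(f"estimated scaling coefficient lambda = {lam:.2f}")   # ~1.8
```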

Keywords: center of pressure (CoP), method of developed statokinesigram trajectory (MDST), model of postural system behavior, retroreflective marker data

Procedia PDF Downloads 342
21903 Text Emotion Recognition by Multi-Head Attention-Based Bidirectional LSTM Utilizing Multi-Level Classification

Authors: Vishwanath Pethri Kamath, Jayantha Gowda Sarapanahalli, Vishal Mishra, Siddhesh Balwant Bandgar

Abstract:

Recognition of emotional information is essential in any form of communication. The growth of human-computer interaction (HCI) in recent times underscores the importance of understanding the emotions expressed, which is crucial for improving the system or the interaction itself. In this research work, textual data are used for emotion recognition. Text, being the least expressive among multimodal resources, poses various challenges, such as limited contextual information and the sequential nature of language construction. We propose a neural architecture to recognize eight emotions from textual data drawn from multiple datasets, using Google's pre-trained word2vec word embeddings and a multi-head attention-based bidirectional LSTM model with one-vs-all multi-level classification. The emotions targeted in this research are Anger, Disgust, Fear, Guilt, Joy, Sadness, Shame, and Surprise. Textual data from multiple sources, such as the ISEAR, GoEmotions, and Affect datasets, were used to create the emotion dataset, and overlapping or conflicting data samples were handled with careful preprocessing. Our results show a significant improvement with the proposed architecture, with as much as a 10-point improvement in recognizing some emotions.
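
A sketch of the model family named in the abstract: pre-trained embeddings feeding a bidirectional LSTM, multi-head self-attention over its outputs, and one sigmoid output per emotion for one-vs-all classification. All layer sizes here are illustrative, not the paper's configuration.

```python
import torch
import torch.nn as nn

# BiLSTM + multi-head self-attention + one-vs-all heads; sizes are assumed.
class EmotionNet(nn.Module):
    def __init__(self, vocab=20000, emb=300, hidden=128, heads=4, n_emotions=8):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)            # init from word2vec in practice
        self.lstm = nn.LSTM(emb, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        self.heads = nn.Linear(2 * hidden, n_emotions)   # one logit per emotion

    def forward(self, tokens):                           # tokens: (batch, seq)
        x = self.embed(tokens)
        h, _ = self.lstm(x)                              # (batch, seq, 2*hidden)
        a, _ = self.attn(h, h, h)                        # self-attention
        pooled = a.mean(dim=1)                           # average over time
        return self.heads(pooled)                        # one-vs-all logits

model = EmotionNet()
logits = model(torch.randint(0, 20000, (2, 16)))         # two dummy sentences
probs = torch.sigmoid(logits)                            # independent per-class scores
print(probs.shape)                                       # torch.Size([2, 8])
```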

Keywords: text emotion recognition, bidirectional LSTM, multi-head attention, multi-level classification, google word2vec word embeddings

Procedia PDF Downloads 171
21902 The Mass Attenuation Coefficients, Effective Atomic Cross Sections, Effective Atomic Numbers and Electron Densities of Some Halides

Authors: Shivalinge Gowda

Abstract:

The total mass attenuation coefficients μ/ρ of some halides, namely NaCl, KCl, CuCl, NaBr, KBr, RbCl, AgCl, NaI, KI, AgBr, CsI, HgCl2, CdI2 and HgI2, were determined at photon energies of 279.2, 320.07, 514.0, 661.6, 1115.5, 1173.2 and 1332.5 keV in a well-collimated narrow-beam good-geometry set-up using a high-resolution, hyper-pure germanium detector. The mass attenuation coefficients and the effective atomic cross sections are found to be in good agreement with the XCOM values. From these mass attenuation coefficients, the effective atomic cross sections σₐ of the compounds were determined. These σₐ data were then used to compute the effective atomic numbers Zeff. For this, the interpolation of total attenuation cross sections of photons of energy E in elements of atomic number Z was performed using logarithmic regression analysis of the data measured by the authors and reported earlier for the above energies, along with XCOM data for standard energies. The best-fit coefficients in the photon energy ranges 250-350 keV, 350-500 keV, 500-700 keV, 700-1000 keV and 1000-1500 keV, obtained by a piecewise interpolation method, were then used to find the Zeff of the compounds with respect to the effective atomic cross section σₐ. Using these Zeff values, the electron densities Nel of the halides were also determined. The present Zeff and Nel values of the halides are found to be in good agreement with values calculated from XCOM data and other available published values.
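
The Zeff step amounts to inverting an interpolated log(σ) vs log(Z) relation at each energy; a sketch with placeholder cross-section values (real work would use the measured and XCOM values):

```python
import numpy as np

# Given elemental cross sections sigma(Z) at one photon energy, find the
# effective atomic number whose interpolated cross section equals the measured
# effective atomic cross section sigma_a. Tabulated values are placeholders.
Z = np.array([11, 17, 19, 29, 35, 47, 53, 80])        # Na, Cl, K, Cu, Br, Ag, I, Hg
sigma = np.array([1.9, 3.0, 3.4, 5.3, 6.5, 9.1, 10.5, 17.0])  # dummy barns/atom

sigma_a = 7.2                                          # measured value (dummy)
# Piecewise-linear interpolation in log-log space, then invert for Z_eff.
Zeff = np.exp(np.interp(np.log(sigma_a), np.log(sigma), np.log(Z)))
print(f"Z_eff = {Zeff:.2f}")
```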

Keywords: mass attenuation coefficient, atomic cross-section, effective atomic number, electron density

Procedia PDF Downloads 375
21901 Developing a Culturally Acceptable End of Life Survey (the VOICES-ESRD/Thai Questionnaire) for Evaluating Health Services Provision for Older Persons with End-Stage Renal Disease (ESRD) in Thailand

Authors: W. Pungchompoo, A. Richardson, L. Brindle

Abstract:

Background: Developing a culturally acceptable end of life survey (the VOICES-ESRD/Thai questionnaire) provides an essential instrument for evaluating health services provision for older persons with ESRD in Thailand. The focus of the questionnaire is on symptoms, symptom control, and the health care needs of older people with ESRD who are managed without dialysis. Objective: The objective of this study was to develop and adapt VOICES to make it suitable for use in a population survey in Thailand. Methods: A mixed-methods exploratory sequential design was focused on modifying the instrument. Data collection: A cognitive interviewing technique was implemented, using two cycles of data collection with a sample of 10 bereaved carers and a prototype of the Thai VOICES questionnaire. A qualitative study was used to modify the questionnaire. Data analysis: The data were analysed using content analysis. Results: Revisions were made to the prototype questionnaire, and the results were used to adapt the VOICES questionnaire for use in a population-based survey of older ESRD patients in Thailand. Conclusions: A culturally specific questionnaire was generated during this second phase, and issues with questionnaire design were rectified.

Keywords: VOICES-ESRD/Thai questionnaire, cognitive interviewing, end of life survey, health services provision, older persons with ESRD

Procedia PDF Downloads 276
21900 Conservation Studies on Endangered and Potential Native Ornamentals and Their Domestication for Novelty in Floriculture Industry

Authors: Puja Sharma, S. R. Dhiman, Bhararti Kashyap, Y. C. Gupta, Shabnam Pangtu

Abstract:

The experiments were carried out for the mass multiplication and domestication of an endangered native tree species, an orchid, and an ornamental shrub of high medicinal value. The floriculture industry is novelty-driven; hence, the potential of these native ornamentals was assessed for their utilization as novelties in the industry. For the mass propagation of the endangered tree Oroxylum indicum, seed and vegetative propagation techniques were successfully utilized. The highest seed germination was recorded in a medium containing cocopeat and perlite (1:1 v/v). Semi-hardwood cuttings treated with IBA 2000 ppm, planted in cocopeat + sand + perlite medium and maintained at 80% RH, resulted in about 90% rooting. The low-growing tree was successfully domesticated and has potential to be utilized in the landscape industry. In the present study, cutting propagation and division of clumps were used as methods for multiplication of Aerides multiflora, a native orchid species. Softwood cuttings treated with IBA 500 ppm and planted in cocopeat medium proved the most suitable vegetative method, resulting in 90% rooting. It was domesticated as a pot plant and for making hanging baskets. Propagation through seeds and cuttings was carried out for Pyracantha crenulata, a native ornamental shrub with cardiovascular medicinal value. For vegetative propagation, treatment of the basal end of semi-hardwood cuttings of Pyracantha with IBA 3000 ppm (quick dip) and planting in cocopeat under a mist chamber maintained at a relative humidity of 70-80% resulted in about 90% rooting, the best among all treatments applied in the study. For seed propagation, treatment of seeds in boiling water for 20 minutes and planting in cocopeat resulted in 82.55% germination. The shrub was domesticated for use as a pot plant, a protective hedge, and for making bonsai.

Keywords: native, endangered, multiplication, domestication, oroxylum, aerides, pyracantha

Procedia PDF Downloads 70
21899 Cost Sensitive Feature Selection in Decision-Theoretic Rough Set Models for Customer Churn Prediction: The Case of Telecommunication Sector Customers

Authors: Emel Kızılkaya Aydogan, Mihrimah Ozmen, Yılmaz Delice

Abstract:

The telecommunications sector is undergoing continual change and development in the global market. In this sector, churn analysis techniques are commonly used for analysing why some customers terminate their service subscriptions prematurely. Customer churn is of utmost significance in this sector since it causes substantial business loss, and many companies conduct research in order to prevent losses while increasing customer loyalty. Although a large quantity of accumulated data is available in this sector, its usefulness is limited by data quality and relevance. In this paper, a cost-sensitive feature selection framework is developed that aims to obtain the feature reducts needed to predict customer churn. The framework is a cost-based optional pre-processing stage to remove redundant features for churn management. This cost-based feature selection algorithm is applied in a telecommunication company in Turkey, and the results obtained with it are reported.
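
The abstract does not give the algorithm's details; as a rough stand-in for cost-sensitive reduct search, here is a greedy wrapper that admits a feature only when its cross-validated accuracy gain outweighs an assumed acquisition cost. The decision-theoretic rough set machinery of the paper is not reproduced.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Greedy cost-sensitive feature selection sketch; data and costs are invented.
X, y = make_classification(n_samples=600, n_features=10, n_informative=4,
                           random_state=0)
cost = np.linspace(0.001, 0.01, 10)        # per-feature acquisition cost (assumed)
trade_off = 1.0                            # weight of cost against accuracy

selected, base = [], 0.0
for _ in range(X.shape[1]):
    gains = {}
    for f in set(range(X.shape[1])) - set(selected):
        acc = cross_val_score(LogisticRegression(max_iter=500),
                              X[:, selected + [f]], y, cv=5).mean()
        gains[f] = acc - base - trade_off * cost[f]    # net benefit of adding f
    f_best = max(gains, key=gains.get)
    if gains[f_best] <= 0:
        break                              # no remaining feature is worth its cost
    selected.append(f_best)
    base += gains[f_best] + trade_off * cost[f_best]   # new baseline accuracy
print("selected features:", selected)
```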

Keywords: churn prediction, data mining, decision-theoretic rough set, feature selection

Procedia PDF Downloads 440
21898 Applying Pre-Accident Observational Methods for Accident Assessment and Prediction at Intersections in Norrkoping City in Sweden

Authors: Ghazwan Al-Haji, Adeyemi Adedokun

Abstract:

Intersections account for a large share of traffic accidents, which occur randomly in time and space. It is necessary to judge whether an intersection is dangerous based on short-term observations, rather than waiting many years to assess historical accident data. There are active and proactive road infrastructure safety methods for assessing safety at intersections. This study investigates the use of quantitative and qualitative pre-accident observational methods as the best practice for accident prediction, future black-spot identification, and treatment. Historical accident data from STRADA (the Swedish Traffic Accident Data Acquisition) were used within Norrkoping city in Sweden. The ADT (Average Daily Traffic), capacity, and speed were used to predict accident rates. Locations with the highest accident records and predicted accident counts were identified and then audited qualitatively using a street audit. The results from these quantitative and qualitative methods were analyzed, validated, and compared. The paper provides recommendations on the methods used as well as on how to reduce accident occurrence at the chosen intersections.

Keywords: intersections, traffic conflict, traffic safety, street audit, accidents predictions

Procedia PDF Downloads 226
21897 Two-Stage Estimation of Tropical Cyclone Intensity Based on Fusion of Coarse and Fine-Grained Features from Satellite Microwave Data

Authors: Huinan Zhang, Wenjie Jiang

Abstract:

Accurate estimation of tropical cyclone intensity is of great importance for disaster prevention and mitigation. Existing techniques are largely based on satellite imagery data, and research on and utilization of the inner thermal core structure characteristics of tropical cyclones still pose challenges. This paper presents a two-stage tropical cyclone intensity estimation network based on the fusion of coarse and fine-grained features from microwave brightness temperature data. The data used in this network are obtained from the thermal core structure of tropical cyclones through Advanced Technology Microwave Sounder (ATMS) inversion. Firstly, the thermal core information in the pressure direction is comprehensively expressed through the maximum intensity projection (MIP) method, constructing coarse-grained thermal core images that represent the tropical cyclone. In the first stage, these images provide a coarse-grained-feature wind speed estimate. Then, based on this result, fine-grained features are extracted by combining thermal core information from multiple view profiles with a distributed network and fused with the coarse-grained features from the first stage to obtain the final two-stage wind speed estimate. Furthermore, to better capture the long-tail distribution characteristics of tropical cyclones, focal loss is used in the coarse-grained loss function of the first stage, and ordinal regression loss is adopted in the second stage to replace traditional single-value regression. The selected tropical cyclones span 2012 to 2021 and are distributed in the North Atlantic (NA) region. The training set covers 2012 to 2017, the validation set 2018 to 2019, and the test set 2020 to 2021. Based on the Saffir-Simpson Hurricane Wind Scale (SSHS), this paper categorizes tropical cyclone levels into three major categories, pre-hurricane, minor hurricane, and major hurricane, achieving a classification accuracy of 86.18% and an intensity estimation error of 4.01 m/s for the NA. The results indicate that thermal core data can effectively represent the level and intensity of tropical cyclones, warranting further exploration of tropical cyclone attributes in these data.
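
Focal loss, used in the first stage to handle the long-tailed class distribution, down-weights well-classified examples so training focuses on the rare, hard classes; a minimal sketch (γ and the toy batch are illustrative, not the paper's settings):

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0):
    """FL = -(1 - p_t)^gamma * log(p_t), averaged over the batch."""
    log_p = F.log_softmax(logits, dim=-1)
    log_pt = log_p.gather(1, targets.unsqueeze(1)).squeeze(1)  # log prob of true class
    pt = log_pt.exp()
    return (-(1.0 - pt) ** gamma * log_pt).mean()

logits = torch.randn(8, 3)                     # pre-hurricane / minor / major
targets = torch.randint(0, 3, (8,))
print(focal_loss(logits, targets))
```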

Keywords: artificial intelligence, deep learning, data mining, remote sensing

Procedia PDF Downloads 50
21896 An Approach for Coagulant Dosage Optimization Using Soft Jar Test: A Case Study of Bangkhen Water Treatment Plant

Authors: Ninlawat Phuangchoke, Waraporn Viyanon, Setta Sasananan

Abstract:

The most important process in a water treatment plant is coagulation using alum and poly aluminum chloride (PACL), on which about a hundred thousand baht is spent per day. Determining the dosages of alum and PACL is therefore the most important factor to be prescribed, so that water production is economical and valuable. This research applies an artificial neural network (ANN) using the Levenberg–Marquardt algorithm to create a mathematical model (Soft Jar Test) for predicting the coagulant doses of alum and PACL. The input data consist of the turbidity, pH, alkalinity, conductivity, and oxygen consumption (OC) of the Bangkhen water treatment plant (BKWTP) of the Metropolitan Waterworks Authority. The data collected from 1 January 2019 to 31 December 2019 cover the changing seasons of Thailand. The ANN input data are divided into three groups: a training set, a test set, and a validation set. The best model performance, measured by the coefficient of determination and the mean absolute error, is 0.73 and 3.18 for alum, and 0.59 and 3.21 for PACL, respectively.

Keywords: soft jar test, jar test, water treatment plant process, artificial neural network

Procedia PDF Downloads 158
21895 Drought Detection and Water Stress Impact on Vegetation Cover Sustainability Using Radar Data

Authors: E. Farg, M. M. El-Sharkawy, M. S. Mostafa, S. M. Arafat

Abstract:

Mapping water stress provides important baseline data for sustainable agriculture. Recent developments in Sentinel-1 allow the acquisition of high-resolution images with varied polarization capabilities. This study was conducted to detect and quantify vegetation water content from canopy backscatter, extracting spatial information to encourage drought mapping activities throughout the newly reclaimed sandy soils of the western Nile Delta, Egypt. The performance of radar imagery in agriculture depends strongly on the sensor's polarization capability. The dual-mode capabilities of Sentinel-1 improve the ability to detect water stress, and the backscatter from structural components improves the identification and separation of vegetation types with various canopy structures from other features. The fieldwork data allowed the identification of water stress zones based on land cover structure; those classes were used to produce a harmonized water stress map. The analysis techniques used and the results show the high capability of active sensor data for water stress mapping and monitoring, especially when integrated with multi-spectral medium-resolution images. Areas cropped under subsoil drip irrigation showed lower drought and water stress than those under center-pivot sprinkler irrigation, reflecting the high level of evaporation from the soil surface in the initial growth stages. The results show a strong relationship between vegetation indices such as the Normalized Difference Vegetation Index (NDVI) and the observed radar backscatter. In addition, observational evidence showed that radar backscatter is highly sensitive to vegetation water stress and has substantial potential for monitoring and detecting drought in vegetative cover.

Keywords: canopy backscatter, drought, polarization, NDVI

Procedia PDF Downloads 137
21894 New Technique of Estimation of Charge Carrier Density of Nanomaterials from Thermionic Emission Data

Authors: Dilip K. De, Olukunle C. Olawole, Emmanuel S. Joel, Moses Emetere

Abstract:

A good number of electronic properties, such as electrical and thermal conductivities, depend on the charge carrier densities of nanomaterials. By controlling the charge carrier densities during the fabrication (or growth) processes, the physical properties can be tuned. In this paper, we discuss a new technique for estimating the charge carrier densities of nanomaterials from thermionic emission data using the newly modified Richardson-Dushman equation. We find that the technique yields excellent results for graphene and carbon nanotubes.
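
For orientation, the classical Richardson-Dushman law on which the modification builds (the authors' modified, nanomaterial-specific form is not reproduced in the abstract):

```latex
J = A_0 T^{2} \exp\!\left(-\frac{W}{k_B T}\right),
\qquad
A_0 = \frac{4\pi m_e e k_B^{2}}{h^{3}} \approx 1.2 \times 10^{6}\ \mathrm{A\,m^{-2}\,K^{-2}}
```

Here J is the thermionic emission current density, W the work function, and T the absolute temperature; fitting measured J(T) data to such a law is what allows material parameters, and in the authors' modified form the charge carrier density, to be extracted.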

Keywords: charge carrier density, nanomaterials, new technique, thermionic emission

Procedia PDF Downloads 311
21893 Field Environment Sensing and Modeling for Pears towards Precision Agriculture

Authors: Tatsuya Yamazaki, Kazuya Miyakawa, Tomohiko Sugiyama, Toshitaka Iwatani

Abstract:

The introduction of sensor technologies into agriculture is a necessary step towards realizing Precision Agriculture. Although sensing methodologies themselves have become prevalent owing to miniaturization and reductions in sensor costs, there are some difficulties in analyzing and understanding the sensing data. Targeting pears ’Le Lectier’, which are particular to Niigata in Japan, cultivation environment data have been collected in pear fields by eight sorts of sensors: field temperature, field humidity, rain gauge, soil water potential, soil temperature, soil moisture, inner-bag temperature, and inner-bag humidity sensors. The inner-bag temperature and humidity sensors measure the environment inside the fruit bag used for pre-harvest bagging of pears. In this experiment, three kinds of fruit bags were used for the pre-harvest bagging. After over 100 days of continuous measurement, large volumes of sensing data were collected. Firstly, correlation analysis among the data measured by the respective sensors reveals that one sensor can replace another, so that more efficient and cost-saving sensing systems can be proposed to pear farmers. Secondly, differences in the characteristics and performance of the three kinds of fruit bags are clarified by the inner-bag environmental sensing; statistical analysis shows that they differ significantly from each other. Lastly, a relational model between the sensing data and pear outlook quality is established by use of a Structural Equation Model (SEM). Here, pear outlook quality relates to the existence of stains, blobs, scratches, and so on, caused by physiological damage or diseases. Conceptually, SEM is a combination of exploratory factor analysis and multiple regression; it constructs a model connecting independent and dependent variables. The proposed SEM model relates the measured sensing data to the pear outlook quality determined on the basis of farmer judgement. In particular, it is found that the inner-bag humidity variable has a comparatively strong effect on pear outlook quality. Therefore, inner-bag humidity sensing might help farmers control pear outlook quality. These results are supported by a large quantity of inner-bag humidity data measured over the years 2014, 2015, and 2016. The experimental and analytical results of this research contribute to spreading Precision Agriculture technologies among the farmers growing ’Le Lectier’.

Keywords: precision agriculture, pre-harvest bagging, sensor fusion, structural equation model

Procedia PDF Downloads 307
21892 Metagenomics-Based Molecular Epidemiology of Viral Diseases

Authors: Vyacheslav Furtak, Merja Roivainen, Olga Mirochnichenko, Majid Laassri, Bella Bidzhieva, Tatiana Zagorodnyaya, Vladimir Chizhikov, Konstantin Chumakov

Abstract:

Molecular epidemiology and environmental surveillance are parts of a rational strategy to control infectious diseases. They have been widely used in the worldwide campaign to eradicate poliomyelitis, which otherwise would be complicated by the inability to rapidly respond to outbreaks and determine sources of the infection. The conventional scheme involves isolation of viruses from patients and the environment, followed by their identification by nucleotide sequence analysis to determine phylogenetic relationships. This is a tedious and time-consuming process that yields definitive results only when it may be too late to implement countermeasures. Because of the difficulty of high-throughput full-genome sequencing, most such studies are conducted by sequencing only capsid genes or parts of them. Therefore, important information about the contribution of other parts of the genome, and of inter- and intra-species recombination, to viral evolution is not captured. Here we propose a new approach based on rapid concentration of sewage samples with tangential flow filtration, followed by deep sequencing and reconstruction of the nucleotide sequences of the viruses present in the samples. The entire nucleic acid content of each sample is sequenced, thus preserving in digital format the complete spectrum of viruses. A set of rapid algorithms was developed to separate deep sequence reads into discrete populations corresponding to each virus and assemble them into full-length consensus contigs, as well as to generate a complete profile of sequence heterogeneities in each of them. This provides an effective approach to studying the molecular epidemiology and evolution of natural viral populations.

Keywords: poliovirus, eradication, environmental surveillance, laboratory diagnosis

Procedia PDF Downloads 271
21891 “Post-Industrial” Journalism as a Creative Industry

Authors: Lynette Sheridan Burns, Benjamin J. Matthews

Abstract:

The context of post-industrial journalism is one in which the material circumstances of mechanical publication have been displaced by digital technologies, increasing the distance between the orthodoxy of the newsroom and the culture of journalistic writing. Content is, with growing frequency, created for delivery via the internet, publication on web-based ‘platforms’ and consumption on screen media. In this environment, the question is not ‘who is a journalist?’ but ‘what is journalism?’ today. The changes bring into sharp relief new distinctions between journalistic work and journalistic labor, providing a key insight into the current transition between the industrial journalism of the 20th century and the post-industrial journalism of the present. In the 20th century, the work of journalists and journalistic labor went hand-in-hand, as most journalists were employees of news organizations, whilst in the 21st century evidence of a decoupling of ‘acts of journalism’ (work) and journalistic employment (labor) is beginning to appear. This ‘decoupling’ of the work and labor that underpin journalism practice is far-reaching in its implications, not least for institutional structures. Under these conditions we are witnessing the emergence of expanded ‘entrepreneurial’ journalism, based on smaller, more independent and agile, if less stable, enterprise constructs that are a feature of creative industries. Entrepreneurial journalism is realized in a range of organizational forms, from social enterprise through profit-driven start-ups to hybrids of the two. In all instances, however, the primary motif of the organization is an ideological definition of journalism. An example is the Scoop Foundation for Public Interest Journalism in New Zealand, which owns and operates Scoop Publishing Limited, a not-for-profit company and social enterprise that publishes an independent news site that claims to have over 500,000 monthly users. Our paper demonstrates that this journalistic work meets the ideological definition of journalism; conducted within the creative industries using an innovative organizational structure that offers a new, viable post-industrial future for journalism.

Keywords: creative industries, digital communication, journalism, post-industrial

Procedia PDF Downloads 274
21890 Organic Tuber Production Fosters Food Security and Soil Health: A Decade of Evidence from India

Authors: G. Suja, J. Sreekumar, A. N. Jyothi, V. S. Santhosh Mithra

Abstract:

Worldwide concerns regarding food safety, environmental degradation, and threats to human health have generated interest in alternative systems like organic farming. Tropical tuber crops (cassava, sweet potato, yams, and aroids) are food-cum-nutritional-security, climate-resilient crops, forming the staple or subsidiary food of about 500 million people globally. Cassava, yams (white yam, greater yam, and lesser yam) and edible aroids (elephant foot yam, taro, and tannia) are high-energy tuberous vegetables with good taste and nutritive value. Seven on-station field experiments at the ICAR-Central Tuber Crops Research Institute, Thiruvananthapuram, India, and seventeen on-farm trials in three districts of Kerala were conducted over a decade (2004-2015) to compare varietal response, yield, quality, and soil properties under organic vs conventional systems in these crops and to develop a learning system based on the data generated. The industrial as well as domestic varieties of cassava, the elite and local varieties of elephant foot yam and taro, and the three species of Dioscorea (yams) were on a par under both systems. Organic management promoted yield by 8%, 20%, 9%, 11%, and 7% over conventional practice in cassava, elephant foot yam, white yam, greater yam, and lesser yam respectively. Elephant foot yam was the most responsive to organic management, followed by yams and cassava. In taro, a slight yield reduction (5%) was noticed under organic farming, with almost similar tuber quality. Tuber quality was improved, with higher dry matter, starch, crude protein, K, Ca, and Mg contents. The anti-nutritional factors, oxalate content in elephant foot yam and cyanogenic glucoside content in cassava, were lowered by 21% and 12.4% respectively. Organic plots had significantly higher water holding capacity, pH, available K, Fe, Mn, and Cu, higher soil organic matter, available N, P, exchangeable Ca and Mg, dehydrogenase enzyme activity, and microbial count. Organic farming scored a significantly higher soil quality index (1.93) than conventional practice (1.46). The soil quality index was driven by water holding capacity, pH, and available Zn, followed by soil organic matter. Organic management enhanced net profit by 20-40% over chemical farming. A case in point is the cost-benefit analysis in elephant foot yam, which indicated that net profit was 28% higher and an additional income of Rs. 47,716 ha⁻¹ was obtained due to organic farming. Cost-effective technologies were field-validated: the on-station technologies developed were validated and popularized through on-farm trials at 10 sites (5 ha) under a National Horticulture Mission-funded programme in elephant foot yam and at seven sites in yams and taro. The technologies are included in the Package of Practices Recommendations for crops of Kerala Agricultural University. A learning system developed using artificial neural networks (ANNs) predicted the performance of the elephant foot yam organic system. Use of organically produced seed materials, seed treatment in cow dung, neem cake, and bio-inoculant slurry, farmyard manure incubated with bio-inoculants, green manuring, and use of neem cake, bio-fertilizers, and ash formed the strategies for organic production. Organic farming is an eco-friendly management strategy that enables 10-20% higher yield, quality tubers, and the maintenance of soil health in tuber crops.

Keywords: eco-agriculture, quality, root crops, healthy soil, yield

Procedia PDF Downloads 328
21889 Use of Artificial Neural Networks to Estimate Evapotranspiration for Efficient Irrigation Management

Authors: Adriana Postal, Silvio C. Sampaio, Marcio A. Villas Boas, Josué P. Castro

Abstract:

This study deals with the estimation of reference evapotranspiration (ET₀) in an agricultural context, focusing on efficient irrigation management to meet the growing interest in the sustainable management of water resources. Given the importance of water in agriculture and its scarcity in many regions, efficient use of this resource is essential to ensure food security and environmental sustainability. The methodology involved the application of artificial intelligence techniques, specifically Multilayer Perceptron (MLP) Artificial Neural Networks (ANNs), to predict ET₀ in the state of Paraná, Brazil. The models were trained and validated with meteorological data from the Brazilian National Institute of Meteorology (INMET), together with data obtained from a producer's weather station in the western region of Paraná. Two optimizers (SGD and Adam) and different meteorological variables, such as temperature, humidity, solar radiation, and wind speed, were explored as inputs to the models. Nineteen configurations with different input variables were tested; among them, configuration 9, with 8 input variables, was identified as the most efficient of all, while configuration 10, with 4 input variables, was considered the most effective given its smaller number of variables. The main conclusions of this study show that MLP ANNs are capable of accurately estimating ET₀, providing a valuable tool for irrigation management in agriculture. Both configurations (9 and 10) showed promising performance in predicting ET₀. The validation of the models with producer data underlined the practical relevance of these tools and confirmed their ability to generalize to different field conditions. The results of the statistical metrics, including Mean Absolute Error (MAE), Mean Squared Error (MSE), Root Mean Squared Error (RMSE), and Coefficient of Determination (R²), showed excellent agreement between the model predictions and the observed data, with MAE as low as 0.01 mm/day and 0.03 mm/day, respectively. In addition, the models achieved an R² between 0.99 and 1, indicating a satisfactory fit to the real data. This agreement was also confirmed by the Kolmogorov-Smirnov test, which evaluates the agreement of the predictions with the statistical behavior of the real data and yields values between 0.02 and 0.04 for the producer data. In addition, the results of this study suggest that the developed technique can be applied to other locations by using site-specific data to further improve ET₀ predictions and thus contribute to sustainable irrigation management in different agricultural regions. The study has some limitations, such as the use of a single ANN architecture and only two optimizers, validation with data from only one producer, and the possible underestimation of the influence of seasonality and local climate variability. An irrigation management application using the most efficient models from this study is already under development. Future research can explore different ANN architectures and optimization techniques, validate models with data from multiple producers and regions, and investigate the model's response to different seasonal and climatic conditions.
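
A sketch of this workflow in scikit-learn, whose MLPRegressor exposes both the 'sgd' and 'adam' solvers compared in the study (the paper's own toolchain is not stated); the synthetic weather inputs and toy ET₀ target below are placeholders for the INMET data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.uniform(10, 35, n),    # air temperature (deg C)
    rng.uniform(30, 95, n),    # relative humidity (%)
    rng.uniform(5, 30, n),     # solar radiation (MJ/m2/day)
    rng.uniform(0, 6, n),      # wind speed (m/s)
])
# Toy target loosely shaped like ET0 (mm/day), for demonstration only.
y = (0.12 * X[:, 0] + 0.08 * X[:, 2] + 0.3 * X[:, 3]
     - 0.02 * X[:, 1] + rng.normal(0, 0.2, n))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), solver="adam",   # or solver="sgd"
                 max_iter=2000, random_state=0),
).fit(X_tr, y_tr)

pred = model.predict(X_te)
print("MAE :", mean_absolute_error(y_te, pred))
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)
print("R2  :", r2_score(y_te, pred))
```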

Keywords: agricultural technology, neural networks in agriculture, water efficiency, water use optimization

Procedia PDF Downloads 34
21888 Need of Trained Clinical Research Professionals Globally to Conduct Clinical Trials

Authors: Tambe Daniel Atem

Abstract:

Background: Clinical research is organized research on human beings intended to provide adequate information on the use of a drug as a therapeutic agent, including its safety and efficacy. The significance of the study is to educate global health and life science graduates in clinical research in depth so that they perform better, as the work involves testing drugs on human beings. Objectives: to provide an overall understanding of the scientific approach to the evaluation of new and existing medical interventions, and to apply ethical and regulatory principles appropriate to any individual research. Methodology: It is based on primary and secondary data analysis. Primary data analysis: the collection of data from journals, the internet, and other online sources. Secondary data analysis: a survey was conducted with a questionnaire to interview clinical research professionals and understand the training needed to perform clinical trials globally. The questionnaire consisted of details of the professionals working with the expertise. It also included the areas of clinical research that needed intense training before entering the hardcore clinical research domain. Results: The worldwide clinical trials market is worth over USD 26 billion, and the industry employs an estimated 210,000 people in the US and over 70,000 in the UK, forming one-third of the total research and development staff. There are more than 250,000 vacant positions globally, with salaries for a Clinical Research Coordinator varying by region. R&D spending on new drug development is estimated at US$ 70-85 billion, and the cost of conducting clinical trials for a new drug is US$ 200-250 million. Due to an increase in trained clinical research professionals, India has emerged as a global hub for clinical research: the clinical trial outsourcing opportunity in the Indian pharmaceutical industry increased to more than $2 billion in 2014 due to increased outsourcing from the US and Europe. Conclusion: Assessment of training needs is recommended for newer clinical research professionals and trial sites, especially prior to the conduct of larger confirmatory clinical trials.

Keywords: clinical research, clinical trials, clinical research professionals

Procedia PDF Downloads 446
21887 Smart Brain Wave Sensor for the Paralyzed: A Real-Time Implementation

Authors: U.B Mahadevswamy UBM, Siraj Ahmed Siraj

Abstract:

As the title indicates, this paper concerns brainwaves and their use, based on their frequencies and other parameters, in a real-time application: a smart brain wave sensor system for paralyzed patients. Brain wave sensing detects a person's mental status, and its purpose here is to enable exact treatment of paralyzed patients. The data, or signal, are obtained from the brainwave sensing band and converted into object files using Visual Basic. The processed data are then sent to an Arduino, which relates them to human behavioral aspects such as emotions, sensations, feelings, and desires. The proposed device can sense human brainwaves and detect the percentage of paralysis from which the person is suffering. The contribution of this paper is a real-time smart sensor device that provides the paralysis percentage of paralyzed patients for their exact treatment.

Keywords: brainwave sensor, BMI, brain scan, EEG, MCH

Procedia PDF Downloads 147