Search results for: maximal data sets
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25280

24230 Recommender System Based on Mining Graph Databases for Data-Intensive Applications

Authors: Mostafa Gamal, Hoda K. Mohamed, Islam El-Maddah, Ali Hamdi

Abstract:

In recent years, many digital documents have been created on the web due to the rapid growth of 'social application' communities and data-intensive applications. The evolution of online multimedia data poses new challenges in storing and querying large amounts of data for online recommender systems. Graph data models have been shown to be more efficient than relational data models for processing complex data. This paper explains the key differences between graph and relational databases, their strengths and weaknesses, and why graph databases are the best technology for building a real-time recommendation system. The paper also discusses several similarity metrics that can be used to compute a similarity score for pairs of nodes based on their neighbourhoods or their properties. Finally, the paper explores how NLP strategies can improve the accuracy and coverage of real-time recommendations by extracting information from stored unstructured knowledge, which makes up the bulk of the world's data, and enriching the graph database with it. As the size and number of data items are increasing rapidly, the proposed system should meet both current and future needs.
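
As an illustration of the neighbourhood-based similarity scoring mentioned above, the sketch below computes a Jaccard similarity score for pairs of nodes in a small, hypothetical user-item graph; it is a minimal example of the idea, not the system described in the paper.

```python
# Illustrative sketch (not the paper's implementation): neighbourhood-based
# similarity between nodes of a small graph stored as adjacency sets.
from itertools import combinations

# Hypothetical graph: users connected to the items they interacted with.
graph = {
    "user_a": {"item_1", "item_2", "item_3"},
    "user_b": {"item_2", "item_3", "item_4"},
    "user_c": {"item_5"},
}

def jaccard(n1, n2):
    """Jaccard similarity of two neighbourhood sets."""
    if not n1 and not n2:
        return 0.0
    return len(n1 & n2) / len(n1 | n2)

# Score every pair of users by the overlap of their neighbourhoods.
for u, v in combinations(graph, 2):
    print(u, v, round(jaccard(graph[u], graph[v]), 3))
```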

Keywords: graph databases, NLP, recommendation systems, similarity metrics

Procedia PDF Downloads 92
24229 Digital Revolution a Veritable Infrastructure for Technological Development

Authors: Osakwe Jude Odiakaosa

Abstract:

Today's digital society is characterized by e-education (e-learning), e-commerce, and so on, all of which have been propelled by the digital revolution. Digital technologies such as computing, the Global Positioning System (GPS), and Geographic Information Systems (GIS) have had a tremendous impact on the field of technology. This development has positively affected the scope, methods, and speed of data acquisition, data management, and the rate of delivery of the results (maps and other map products) of data processing. This paper addresses the impact of the revolution brought about by digital technology.

Keywords: digital revolution, internet, technology, data management

Procedia PDF Downloads 429
24228 Aging-Related Changes in Calf Muscle Function: Implications for Venous Hemodynamic and the Role of External Mechanical Activation

Authors: Bhavatharani S., Boopathy V., Kavin S., Naveethkumar R.

Abstract:

Context: Resistance training with blood flow restriction (BFR) has increased in clinical rehabilitation due to the substantial benefits observed in augmenting muscle mass and strength using low loads. However, there is a great variability of training pressures for clinical populations as well as methods to estimate it. The aim of this study was to estimate the percentage of maximal BFR that could result from applying different methodologies based on arbitrary or individual occlusion levels using a cuff width between 9 and 13 cm. Design: A secondary analysis was performed on the combined databases of 2 previous larger studies using BFR training. Methods: To estimate these percentages, the occlusion values needed to reach complete BFR (100% limb occlusion pressure [LOP]) were estimated by Doppler ultrasound. Seventy-five participants (age 24.32 [4.86] y; weight: 78.51 [14.74] kg; height: 1.77 [0.09] m) were enrolled in the laboratory study for measuring LOP in the thigh, arm, or calf. Results: When arbitrary values of restriction are applied, a supra-occlusive LOP between 120% and 190% LOP may result. Furthermore, the application of 130% resting brachial systolic blood pressure creates a similar occlusive stimulus as 100% LOP. Conclusions: Methods using 100 mm Hg and the resting brachial systolic blood pressure could represent the safest application prescriptions as they resulted in applied pressures between 60% and 80% LOP. One hundred thirty percent of the resting brachial systolic blood pressure could be used to indirectly estimate 100% LOP at cuff widths between 9 and 13 cm. Finally, methodologies that use standard values of 200 and 300 mm Hg far exceed LOP and may carry additional risk during BFR exercise.

Keywords: lower limb rehabilitation, ESP32, pneumatics for medical, programmed rehabilitation

Procedia PDF Downloads 68
24227 Scheduling in a Single-Stage, Multi-Item Compatible Process Using Multiple Arc Network Model

Authors: Bokkasam Sasidhar, Ibrahim Aljasser

Abstract:

We consider the problem of finding optimal schedules for each piece of equipment in a production process that consists of a single manufacturing stage and can handle different types of products, where changing over from one product type to another incurs certain costs. The machine capacity is determined by the upper limit on the quantity that can be processed for each product in a setup. Since changeover costs increase with the number of setups, the planning should process similar product types successively so that the total number of changeovers, and in turn the associated setup costs, is minimized. The problem of cost minimization is therefore equivalent to minimizing the number of setups or, equivalently, maximizing the capacity utilization between setups, i.e., maximizing the total capacity utilization. Further, production is usually planned against customers' orders, and different customers' orders are generally assigned one of two priorities: 'normal' or 'priority'. The problem of production planning in such a situation can be formulated as a Multiple Arc Network (MAN) model and solved sequentially using the algorithm for maximizing flow along a MAN and the algorithm for maximizing flow along a MAN with priority arcs. The model aims to provide an optimal production schedule with the objective of maximizing capacity utilization, so that the customer-wise delivery schedules are fulfilled while keeping the customer priorities in view. Algorithms are presented for solving the MAN formulation of production planning with customer priorities, and the application of the model is demonstrated through numerical examples.
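
The maximal-flow subproblem at the core of this formulation can be illustrated with a toy network; the sketch below uses networkx on hypothetical setups, orders, and capacities, and does not reproduce the authors' MAN model or its priority arcs.

```python
# Toy illustration (not the authors' MAN formulation): capacity utilisation as a
# maximum-flow problem, solved with networkx. Orders and capacities are hypothetical.
import networkx as nx

G = nx.DiGraph()
# Source -> setup arcs, limited by the quantity a setup can process.
G.add_edge("source", "setup_A", capacity=100)
G.add_edge("source", "setup_B", capacity=80)
# Setup -> order arcs, limited by each customer's order size.
G.add_edge("setup_A", "order_1", capacity=60)
G.add_edge("setup_A", "order_2", capacity=50)
G.add_edge("setup_B", "order_3", capacity=70)
# Order -> sink arcs.
for order, qty in [("order_1", 60), ("order_2", 50), ("order_3", 70)]:
    G.add_edge(order, "sink", capacity=qty)

flow_value, flow_dict = nx.maximum_flow(G, "source", "sink")
print("total capacity utilised:", flow_value)
print("quantity shipped to order_1:", flow_dict["setup_A"]["order_1"])
```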

Keywords: scheduling, maximal flow problem, multiple arc network model, optimization

Procedia PDF Downloads 390
24226 BigCrypt: A Probable Approach of Big Data Encryption to Protect Personal and Business Privacy

Authors: Abdullah Al Mamun, Talal Alkharobi

Abstract:

As data sizes grow, people have become more accustomed to storing large amounts of secret information in cloud storage, and companies routinely need to transfer massive business files from one end to another. We lose privacy if we transmit such data as it is, and repeating this scenario without securing the communication mechanism, i.e., proper encryption, compounds the risk. Although asymmetric key encryption solves the main problem of symmetric key encryption, it can only encrypt a limited amount of data, which makes it impractical for large data encryption. In this paper, we propose a probable approach, in the spirit of Pretty Good Privacy, for encrypting big data using both symmetric and asymmetric keys. Our goal is to encrypt huge collections of information and transmit them through a secure communication channel to preserve business and personal privacy. To justify our method, an experimental dataset from three different platforms is provided. We show that our approach works efficiently and reliably for massive volumes of diverse data.
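
The hybrid idea (bulk data under a symmetric key, with only that key protected asymmetrically, in the spirit of Pretty Good Privacy) can be sketched as follows; this is a minimal illustration with the Python cryptography package, not the BigCrypt implementation itself.

```python
# Hedged sketch of the hybrid idea (not the BigCrypt implementation): encrypt the
# bulk data with a symmetric key, then protect that key with an asymmetric key.
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Recipient's asymmetric key pair (generated here for the example).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

big_data = b"massive business file ..." * 1000   # stand-in for a large payload

# 1. Symmetric encryption of the payload (fast, no practical size limitation).
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, big_data, None)

# 2. Asymmetric encryption of the small session key only.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(session_key, oaep)

# Receiver side: unwrap the session key, then decrypt the payload.
recovered_key = private_key.decrypt(wrapped_key, oaep)
assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == big_data
```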

Keywords: big data, cloud computing, cryptography, hadoop, public key

Procedia PDF Downloads 307
24225 Implementation of Big Data Concepts Led by the Business Pressures

Authors: Snezana Savoska, Blagoj Ristevski, Violeta Manevska, Zlatko Savoski, Ilija Jolevski

Abstract:

Big data is widely accepted by pharmaceutical companies as a result of business demands created through legal pressure. Pharmaceutical companies face many legal and standards-related demands and have to adapt their procedures to the legislation. To cope with these demands, they have to standardize the usage of their current information technology and use the latest software tools. This paper highlights some important aspects of the experience with big data project implementation in a Macedonian pharmaceutical company. These projects improved the company's business processes with the help of new software tools selected to comply with legal and business demands. The company uses IT as a strategic tool to obtain competitive advantage in the market and to reengineer its processes towards the new Internet economy and quality demands. The company is required to manage vast amounts of structured as well as unstructured data. For these reasons, it implements projects for emerging and appropriate software tools that can deal with the big data concepts accepted in the company.

Keywords: big data, unstructured data, SAP ERP, documentum

Procedia PDF Downloads 253
24224 Saving Energy at a Wastewater Treatment Plant through Electrical and Production Data Analysis

Authors: Adriano Araujo Carvalho, Arturo Alatrista Corrales

Abstract:

This paper shows how analysis of electrical energy consumption and production data was used to find opportunities to save energy at the Taboada wastewater treatment plant in Callao, Peru. To access the data, independent data networks were used for both electrical and process instruments; the data were then analyzed under an ISO 50001 energy audit, which considered Energy Performance Indexes for each process, following the step-by-step guide presented in this text. Using this methodology and data mining techniques applied to information gathered through electronic multimeters (conveniently placed on substation switchboards connected to a cloud network), it was possible to thoroughly identify the performance of each process and thus reveal saving opportunities that had previously been hidden. The data analysis brought both cost and energy reductions, allowing the plant to save significant resources and to be certified under ISO 50001.

Keywords: energy and production data analysis, energy management, ISO 50001, wastewater treatment plant energy analysis

Procedia PDF Downloads 181
24223 Predicting Photovoltaic Energy Profile of Birzeit University Campus Based on Weather Forecast

Authors: Muhammad Abu-Khaizaran, Ahmad Faza’, Tariq Othman, Yahia Yousef

Abstract:

This paper presents a study to provide sufficient and reliable information for constructing a Photovoltaic energy profile of the Birzeit University campus (BZU) based on the weather forecast. The developed Photovoltaic energy profile helps to predict the energy yield of the Photovoltaic systems based on the weather forecast and hence helps in planning energy production and consumption. Two models are developed in this paper: a Clear Sky Irradiance model and a Cloud-Cover Radiation model, to predict the irradiance for a clear sky day and a cloudy day, respectively. The adopted procedure for developing such models takes into consideration two levels of abstraction. First, irradiance and weather data were acquired by a sensory (measurement) system installed on the rooftop of the Information Technology College building at the Birzeit University campus. Second, power readings of a fully operational 51 kW commercial Photovoltaic system installed at the University on the rooftop of the adjacent College of Pharmacy-Nursing and Health Professions building are used to validate the output of a simulation model and to help refine its structure. Based on a comparison between a mathematical model, which calculates Clear Sky Irradiance for the University location, and two sets of accumulated measured data, it is found that the simulation system closely matches the installed PV power station on clear-sky days. However, these comparisons show a divergence between the expected and actual energy yield in extreme weather conditions, including clouding and soiling effects. Therefore, a more accurate prediction model for irradiance that takes into consideration weather factors that affect irradiance, such as relative humidity and cloudiness, was developed: the Cloud-Cover Radiation Model (CRM). The equivalent mathematical formulas implement corrections to provide more accurate inputs to the simulation system. The results of the CRM show a very good match with the actual measured irradiance during a cloudy day. The developed Photovoltaic profile helps in predicting the output energy yield of the Photovoltaic system installed at the University campus based on the predicted weather conditions. The simulation and practical results for both models are in very good agreement.
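
A cloud-cover correction of this kind is often written as an empirical attenuation of the clear-sky irradiance; the sketch below uses the widely cited Kasten-Czeplak form G = G_clear(1 - 0.75*C^3.4) with hypothetical input values, as an illustration of the idea rather than the exact CRM developed for the BZU campus.

```python
# Illustrative cloud-cover correction of a Kasten-Czeplak form, not the exact CRM
# developed in the paper; inputs are hypothetical forecast values.
def cloudy_irradiance(clear_sky_ghi, cloud_cover):
    """Attenuate clear-sky global horizontal irradiance (W/m^2) by cloud cover in [0, 1]."""
    return clear_sky_ghi * (1.0 - 0.75 * cloud_cover ** 3.4)

def pv_power_kw(ghi, array_area_m2=300.0, efficiency=0.17):
    """Rough output estimate for a hypothetical PV array, in kW."""
    return ghi * array_area_m2 * efficiency / 1000.0

ghi_clear = 850.0                     # example clear-sky irradiance at noon, W/m^2
for cover in (0.0, 0.5, 1.0):         # forecast cloud-cover fractions
    ghi = cloudy_irradiance(ghi_clear, cover)
    print(f"cloud cover {cover:.1f}: GHI = {ghi:6.1f} W/m^2, P = {pv_power_kw(ghi):5.1f} kW")
```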

Keywords: clear-sky irradiance model, cloud-cover radiation model, photovoltaic, weather forecast

Procedia PDF Downloads 118
24222 Data Clustering in Wireless Sensor Network Implemented on Self-Organization Feature Map (SOFM) Neural Network

Authors: Krishan Kumar, Mohit Mittal, Pramod Kumar

Abstract:

A wireless sensor network is one of the most promising communication networks for monitoring remote environmental areas. In such a network, all the sensor nodes communicate with each other via radio signals. The sensor nodes have sensing, data storage, and processing capabilities, and they collect information and relay it through neighboring nodes to a particular node. Data collection and processing are done by data aggregation techniques. For data aggregation, a clustering technique is implemented in the sensor network using a self-organizing feature map (SOFM) neural network. Some of the sensor nodes are selected as cluster head nodes; information is aggregated at the cluster head nodes from the non-cluster-head nodes and then transferred to the base station (or sink nodes). The aim of this paper is to manage the huge amount of data with the help of the SOM neural network. Clustered data are selected for transfer to the base station instead of all the information aggregated at the cluster head nodes. This reduces the battery consumption involved in managing the huge volume of data, and the network lifetime is extended to a great extent.
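
For readers unfamiliar with the SOFM step, the sketch below trains a minimal self-organizing map on hypothetical sensor readings using plain NumPy; it illustrates the clustering idea only and is not the network protocol described in the paper.

```python
# Minimal self-organizing feature map sketch on hypothetical sensor readings.
import numpy as np

def train_som(data, grid=(5, 5), epochs=100, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small SOM: map readings onto a 2-D grid of units."""
    rng = np.random.default_rng(seed)
    n_units = grid[0] * grid[1]
    weights = rng.random((n_units, data.shape[1]))
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)
    for epoch in range(epochs):
        lr = lr0 * np.exp(-epoch / epochs)          # decaying learning rate
        sigma = sigma0 * np.exp(-epoch / epochs)    # shrinking neighbourhood radius
        for x in data:
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))     # best matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)        # grid distance to BMU
            h = np.exp(-d2 / (2 * sigma ** 2))                    # neighbourhood function
            weights += lr * h[:, None] * (x - weights)
    return weights

# Hypothetical readings from 200 nodes: [temperature, humidity], scaled to [0, 1].
readings = np.random.default_rng(1).random((200, 2))
som_weights = train_som(readings)
cluster_of_node = [int(np.argmin(((som_weights - r) ** 2).sum(axis=1))) for r in readings]
print("cluster index of first five nodes:", cluster_of_node[:5])
```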

Keywords: artificial neural network, data clustering, self organization feature map, wireless sensor network

Procedia PDF Downloads 500
24221 Review and Comparison of Associative Classification Data Mining Approaches

Authors: Suzan Wedyan

Abstract:

Data mining is one of the main phases of Knowledge Discovery in Databases (KDD), which is responsible for finding hidden and useful knowledge in databases. There are many different data mining tasks, including regression, pattern recognition, clustering, classification, and association rule mining. In recent years, a promising data mining approach called associative classification (AC) has been proposed; AC integrates classification and association rule discovery to build classification models (classifiers). This paper surveys and critically compares several AC algorithms with reference to the different procedures used in each algorithm, such as rule learning, rule sorting, rule pruning, classifier building, and class allocation for test cases.

Keywords: associative classification, classification, data mining, learning, rule ranking, rule pruning, prediction

Procedia PDF Downloads 523
24220 Modelling Flood Events in Botswana (Palapye) for Protecting Roads Structure against Floods

Authors: Thabo M. Bafitlhile, Adewole Oladele

Abstract:

Botswana has long been affected by floods and still experiences these tragic events. Flooding occurs mostly in the North-West, North-East, and parts of the Central district due to the heavy rainfall experienced in these areas. The torrential rains have destroyed homes and roads, flooded dams and fields, and destroyed livestock and livelihoods. Palapye is one area in the Central district that has been experiencing floods ever since 1995, when its greatest flood on record occurred. Heavy storms result in floods and inundation, and this has been exacerbated by poor or absent drainage structures. Since floods are a part of nature, they have existed and will continue to exist, and so will the destruction they cause. Furthermore, floods play a major role in the erosion and destruction of road structures. Already today, many culverts, trenches, and other drainage facilities lack the capacity to deal with current frequencies of extreme flows. Future changes in the pattern of hydroclimatic events will have implications for the design and maintenance costs of roads, and increases in rainfall and severe weather events can raise the demand for emergency responses. Flood forecasting and warning are therefore a prerequisite for successful mitigation of flood damage. In flood-prone areas like Palapye, preventive measures should be taken to reduce the possible adverse effects of floods on the environment, including road structures. This paper therefore attempts to estimate the return periods associated with storms of different magnitudes from recorded historical rainfall depths using statistical methods. The method of annual maxima was used to select data sets for the rainfall analysis. In the statistical analysis, the Type 1 extreme value (Gumbel), Log Normal, and Log Pearson Type III distributions were applied to the annual maximum series for the Palapye area to produce IDF curves. The Kolmogorov-Smirnov and Chi-squared tests were used to confirm the appropriateness of the fitted distributions for the location; the data do fit the distributions used to predict the expected frequencies. This will be a beneficial tool for urgent flood forecasting and water resource administration, as proper drainage designs will be based on the estimated flood events and will help to reclaim and protect road structures from the adverse impacts of floods.
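
The annual-maxima / Gumbel step of such an analysis can be sketched in a few lines; the rainfall depths below are hypothetical stand-ins, not the Palapye record, and SciPy's gumbel_r is used for the fit and the return-level calculation.

```python
# Sketch of the annual-maxima / Gumbel step (one of the distributions used in the
# paper); the rainfall depths below are hypothetical, not the Palapye record.
import numpy as np
from scipy import stats

annual_max_rainfall = np.array(
    [62.0, 85.5, 48.3, 120.7, 95.2, 70.1, 55.8, 101.4, 88.0, 67.3])  # mm, hypothetical

loc, scale = stats.gumbel_r.fit(annual_max_rainfall)

# Depth expected to be exceeded on average once every T years (return level).
for T in (2, 10, 50, 100):
    depth = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"{T:>3}-year storm: {depth:.1f} mm")

# Goodness of fit, as in the paper: Kolmogorov-Smirnov test against the fitted Gumbel.
ks = stats.kstest(annual_max_rainfall, "gumbel_r", args=(loc, scale))
print("KS statistic:", round(ks.statistic, 3), "p-value:", round(ks.pvalue, 3))
```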

Keywords: drainage, estimate, evaluation, floods, flood forecasting

Procedia PDF Downloads 355
24219 Hierarchical Checkpoint Protocol in Data Grids

Authors: Rahma Souli-Jbali, Minyar Sassi Hidri, Rahma Ben Ayed

Abstract:

A grid of computing nodes has emerged as a representative means of connecting distributed computers or resources scattered all over the world for the purpose of computing and distributed storage. Since fault tolerance becomes complex due to the variable availability of resources in a decentralized grid environment, it can be used in connection with replication in data grids. The objective of our work is to present fault tolerance in data grids with a data replication-driven model based on clustering. The performance of the protocol is evaluated with the Omnet++ simulator. The computational results show the efficiency of our protocol in terms of recovery time and the number of processes involved in rollbacks.

Keywords: data grids, fault tolerance, clustering, Chandy-Lamport

Procedia PDF Downloads 318
24218 An Observation of the Information Technology Research and Development Based on Article Data Mining: A Survey Study on Science Direct

Authors: Muhammet Dursun Kaya, Hasan Asil

Abstract:

One of the most important factors in research and development is a deep insight into the evolution of scientific development. State-of-the-art tools and instruments can considerably assist researchers, and many organizations around the world have become aware of the advantages of data mining for acquiring the knowledge hidden in unstructured data. This paper reviews the articles on information technology published in the past five years with the aid of data mining. A clustering approach was used to study these articles, and the results revealed that three topics, namely health, innovation, and information systems, have captured the special attention of researchers.

Keywords: information technology, data mining, scientific development, clustering

Procedia PDF Downloads 261
24217 Security in Resource Constraints: Network Energy Efficient Encryption

Authors: Mona Almansoori, Ahmed Mustafa, Ahmad Elshamy

Abstract:

Wireless nodes in a sensor network gather and process critical information that must be communicated across the network; the information flowing through such a network is critical for decision making and data processing, and preserving the integrity of this data without compromising the processing and transmission capability of the network is one of the most critical factors in wireless security. This paper presents a mechanism to securely transmit data over a chain of sensor nodes without compromising the throughput of the network, while working within the battery resources available at each sensor node.

Keywords: hybrid protocol, data integrity, lightweight encryption, neighbor based key sharing, sensor node data processing, Z-MAC

Procedia PDF Downloads 134
24216 Data Mining Techniques for Anti-Money Laundering

Authors: M. Sai Veerendra

Abstract:

Today, money laundering (ML) poses a serious threat not only to financial institutions but also to nations. This criminal activity is becoming more and more sophisticated and seems to have moved from the cliché of drug trafficking to financing terrorism, not to mention personal gain. Most financial institutions internationally have been implementing anti-money laundering (AML) solutions to fight investment fraud activities. However, traditional investigative techniques consume numerous man-hours. Recently, data mining approaches have been developed and are considered well-suited techniques for detecting ML activities. Within the scope of a collaboration project on developing a new data mining solution for the AML unit of an international investment bank in Ireland, we survey recent data mining approaches for AML. In this paper, we present not only these approaches but also give an overview of the important factors in building data mining solutions for AML.
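
One family of data mining techniques commonly applied in this setting is unsupervised outlier detection over transaction profiles; the sketch below uses scikit-learn's Isolation Forest on hypothetical per-account features, purely as an illustration and not as the solution developed in the project.

```python
# Illustrative outlier-detection sketch (not the bank project's solution): flag
# accounts with unusual transaction profiles using an Isolation Forest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Hypothetical per-account features: [monthly turnover, cash ratio, n_countries]
normal = np.column_stack([rng.normal(10_000, 2_000, 500),
                          rng.uniform(0.0, 0.3, 500),
                          rng.integers(1, 3, 500)])
suspicious = np.array([[250_000, 0.95, 9], [180_000, 0.90, 7]])
X = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = model.predict(X)            # -1 = outlier, 1 = inlier
print("accounts flagged for review:", np.where(flags == -1)[0])
```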

Keywords: data mining, clustering, money laundering, anti-money laundering solutions

Procedia PDF Downloads 524
24215 Mental Illness, Dargahs and Healing: A Qualitative Exploration in a North Indian City

Authors: Reetinder Kaur, R. K. Pathak

Abstract:

Mental health is recognised as an important global health concern. The World Health Organisation estimated in 2004 that neuropsychiatric illnesses in India account for 10.8 percent of the global burden. The prevalence of serious mental illnesses was estimated at 6.5 percent by the National Commission on Macroeconomics and Health in 2005. India spends only 0.06 percent of its health budget on mental health. One of the major problems in Indian mental health care is the treatment gap due to the scarcity of manpower, inadequate infrastructure and deficiencies in policy initiatives. As a result, traditional healing is a popular resource for mentally ill individuals and their families. Traditional healing resources include faith healers and healers at temples and Dargahs. Chandigarh is a Union Territory located in North India. It has surplus manpower and infrastructure available for mental health care. In spite of the availability of mental health care services, mentally ill individuals and their families seek help from traditional healers at various Dargahs within or outside Chandigarh. For the present study, the data were collected from four Dargahs. A total of thirty patients medically diagnosed with various mental illnesses, the family members who accompanied them, and the healers were part of this study. The aims of the study were to understand the interactions between healer, patient and family members during the course of treatment, to understand explanations of mental illnesses, and to analyse the healing practices in the context of culture. The interviews were conducted using an interview guide for the three sets of informants: healers, patients and family members. The interview guide for healers focussed on the healing process, the healer's understanding of patients' explanatory models, the healer's knowledge about mental illnesses and the types of illnesses cured by the healer. The interview guide for patients and family members focussed on their understanding of the symptoms, explanations for illness and help-seeking behaviour. The patients were observed over several weeks (every Thursday, the day of pir and healing) during their visits to the healer. Detailed discussions were held with the healers regarding the healing process and the benefits of healing. The data were analysed thematically, and the following themes emerged: the role of the sacred, holistic healing, the healer's understanding of patients' explanatory models of mental illness, the patients' and families' understanding of mental illnesses, the healer's knowledge about mental illnesses, the types of mental illnesses cured by the healer, and bad dreams and their interpretation. From the analysis of the data, it was found that the healers concentrate their interventions in the social arena, 'curing' distressed patients by bringing significant changes to their social environment. It is suggested that, in order to make mental health care services effective in India, collaboration between healers and psychiatrists is essential. However, certain specifications need to be made to make this kind of collaboration successful and beneficial for the stakeholders.

Keywords: Dargah, mental illness, traditional healing, policy

Procedia PDF Downloads 301
24214 Development of High-Efficiency Down-Conversion Fluoride Phosphors to Increase the Efficiency of Solar Panels

Authors: S. V. Kuznetsov, M. N. Mayakova, V. Yu. Proydakova, V. V. Pavlov, A. S. Nizamutdinov, O. A. Morozov, V. V. Voronov, P. P. Fedorov

Abstract:

Increasing the share of electricity obtained by converting solar energy reduces the industrial impact on the environment from the use of hydrocarbon energy sources. One way to increase this share is to improve the efficiency of solar energy conversion in silicon-based solar panels. Such an efficiency increase can be achieved by transferring energy from spectral regions where silicon solar panels are insensitive to the region of their photosensitivity. To achieve this goal, a transition to new luminescent materials with a high quantum yield of luminescence is necessary. Improvement in the quantum yield can be achieved by quantum cutting, which allows a down-conversion quantum yield of more than 150% to be obtained due to the splitting of high-energy photons of the UV spectral range into lower-energy photons of the visible and near-infrared spectral ranges. The goal of the present work is to test the approach of exciting the 4f-4f fluorescence of Yb3+ through sensitization by various rare-earth ions absorbing in the UV and visible spectral ranges. Fluorides are among the promising materials for quantum-cutting luminophores. In our investigation, we have developed the synthesis of nano- and submicron powders of calcium and strontium fluoride doped with rare-earth elements (Yb:Ce, Yb:Pr, Yb:Eu), with controlled size and shape, by co-precipitation from aqueous solution. We used Ca(NO3)2*4H2O, Sr(NO3)2, HF, and NH4F as precursors. After the initial nitrate solutions were prepared, they were mixed with the fluorine-containing solution in a dropwise manner. According to XRD data, the synthesis resulted in single-phase samples with the fluorite structure. By means of SEM measurements, we have confirmed the spherical morphology and determined the particle sizes (50-100 nm after synthesis and 150-300 nm after calcination). The calcination temperature was 600°C. We have investigated the spectral-kinetic characteristics of the above-mentioned compounds. Here, the diffuse reflection and laser-induced fluorescence spectra of Yb3+ ions excited at around the 4f-4f and 4f-5d transitions of Pr3+, Eu3+ and Ce3+ ions in the synthesized powders are reported. The investigation of the down-conversion luminescence capability of the synthesized compounds included measurements of fluorescence decays and of the quantum yield of the 2F5/2-2F7/2 fluorescence of Yb3+ ions as a function of Yb3+ and sensitizer content. An optimal chemical composition of CaF2-YbF3-LnF3 (Ln=Ce, Eu, Pr) and SrF2-YbF3-LnF3 (Ln=Ce, Eu, Pr) micro- and nanopowders according to the criterion of maximal IR fluorescence yield is proposed. We consider the investigated materials promising for applications in solar panel improvement. This work was supported by Russian Science Foundation grant #17-73-20352.

Keywords: solar cell, fluorides, down-conversion luminescence, maximum quantum yield

Procedia PDF Downloads 256
24213 Development of New Technology Evaluation Model by Using Patent Information and Customers' Review Data

Authors: Kisik Song, Kyuwoong Kim, Sungjoo Lee

Abstract:

Many global firms and corporations derive new technologies and opportunities by identifying vacant technologies through patent analysis. However, previous studies have failed to focus on technologies that promise continuous growth in industrial fields, and most studies that derive new technology opportunities do not test their practical effectiveness. Since previous studies depended on expert judgment, evaluating new technologies based on patent analysis has been costly and time-consuming. Therefore, this research suggests a quantitative and systematic approach to technology evaluation indicators that uses patent data together with data from customer communities. The first step involves collecting these two types of data, which are then used to construct evaluation indicators and to apply those indicators to the evaluation of new technologies. This type of data mining enables a new method of technology evaluation and a better predictor of how new technologies are adopted.

Keywords: data mining, evaluating new technology, technology opportunity, patent analysis

Procedia PDF Downloads 360
24212 Anomaly Detection Based on System Log Data

Authors: M. Kamel, A. Hoayek, M. Batton-Hubert

Abstract:

With the increase of network virtualization and the disparity of vendors, the continuous monitoring and detection of anomalies cannot rely on static rules. An advanced analytical methodology is needed to discriminate between ordinary events and unusual anomalies. In this paper, we focus on log data (textual data), which is a crucial source of information for network performance. We introduce an algorithm used as a pipeline to help with the pretreatment of such data, group it into patterns, and dynamically label each pattern as an anomaly or not. Such tools will provide users and experts with continuous, real-time log monitoring capability to detect anomalies and failures in the underlying system that can affect performance. An application to real-world data illustrates the algorithm.
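
A minimal sketch of this kind of pipeline (assumed and heavily simplified, not the authors' algorithm) masks the variable tokens of each log line to obtain a pattern, counts pattern frequencies, and flags rare patterns as anomalies:

```python
# Simplified log-pattern pipeline: mask variable tokens, group by pattern, flag rare ones.
import re
from collections import Counter

logs = [
    "2023-01-02 10:01:12 connection from 10.0.0.5 established",
    "2023-01-02 10:01:13 connection from 10.0.0.7 established",
    "2023-01-02 10:02:40 connection from 10.0.0.5 established",
    "2023-01-02 10:03:05 disk failure on node-17",
]

def to_pattern(line):
    """Replace timestamps, IPs, and numbers with placeholders to group similar events."""
    line = re.sub(r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}", "<TS>", line)
    line = re.sub(r"\d+\.\d+\.\d+\.\d+", "<IP>", line)
    return re.sub(r"\d+", "<NUM>", line)

patterns = [to_pattern(l) for l in logs]
counts = Counter(patterns)

# Label a pattern as anomalous when it is rare relative to the whole stream.
threshold = max(1, int(0.1 * len(logs)))
for line, pat in zip(logs, patterns):
    label = "ANOMALY" if counts[pat] <= threshold else "normal"
    print(f"{label:8} {line}")
```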

Keywords: logs, anomaly detection, ML, scoring, NLP

Procedia PDF Downloads 77
24211 Entomological Origin of Honey Discriminated by NMR Chloroform Extracts in Ecuadorian Honey

Authors: P. Vit, J. Uddin, V. Zuccato, F. Maza, E. Schievano

Abstract:

In Ecuador, honeys are produced by Apis mellifera and by stingless bees (Meliponini). We studied honey produced in beeswax combs by Apis mellifera and honey produced in pots by Geotrigona and Scaptotrigona bees. Chloroform extracts of honey were obtained for fast NMR spectra. The 1D spectra were acquired at 298 K with a 600 MHz Bruker NMR instrument, using a modified double pulsed field gradient spin echo (DPFGSE) sequence. Signals of the 1H NMR spectra were integrated and used as inputs for PCA and PLS-DA analysis; labelled sets of classes were successfully identified, enhancing the separation between the three groups of honey according to entomological origin: A. mellifera, Geotrigona and Scaptotrigona. This procedure is therefore recommended for authenticity testing of honey in Ecuador.
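
The chemometric step (integrated signal regions fed into PCA) can be illustrated with synthetic integrals; the sketch below is not the honey data set, only an example of how such a class separation is examined.

```python
# Sketch of the chemometric step only (synthetic integrals, not the honey data):
# integrated 1H NMR signal regions are used as inputs to PCA to separate honey groups.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Hypothetical matrix: 30 honey samples x 8 integrated spectral regions,
# 10 samples per entomological origin with slightly shifted means.
groups = ["A. mellifera", "Geotrigona", "Scaptotrigona"]
X = np.vstack([rng.normal(loc=mu, scale=0.3, size=(10, 8))
               for mu in (1.0, 2.0, 3.0)])

scores = PCA(n_components=2).fit_transform(X)
for i, g in enumerate(groups):
    centre = scores[i * 10:(i + 1) * 10].mean(axis=0)
    print(f"{g:15s} PC1/PC2 centroid: {centre.round(2)}")
```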

Keywords: Apis mellifera, honey, 1H NMR, entomological origin, meliponini

Procedia PDF Downloads 388
24210 Degradation of Endosulfan in Different Soils by Indigenous and Adapted Microorganisms

Authors: A. Özyer, N. G. Turan, Y. Ardalı

Abstract:

The environmental fate of organic contaminants in soils is influenced significantly by the pH, the texture of the soil, the water content, and also the presence of organic matter. In this study, the biodegradation of endosulfan isomers was studied in two different soils (Soil A and Soil B) that have contrasting properties in terms of their texture, pH, organic content, etc. Two Nocardia sp. strains, which were isolated from soil, were used for the degradation of endosulfan. The soils were contaminated with commercial endosulfan, and six sets were maintained from the two different soils, contaminated with different endosulfan concentrations, for the degradation experiments. Mineral media, inoculated with the Nocardia isolates or left uninoculated, were added to the soils and mixed. The soils were incubated at 30 °C for ten weeks. Residual endosulfan and its metabolites' concentrations were determined weekly during the incubation period, and the changes in the soil microorganisms were also investigated weekly.

Keywords: endosulfan, biodegradation, Nocardia sp., soil, organochlorine pesticide

Procedia PDF Downloads 362
24209 EnumTree: An Enumerative Biclustering Algorithm for DNA Microarray Data

Authors: Haifa Ben Saber, Mourad Elloumi

Abstract:

In a number of domains, such as DNA microarray data analysis, we need to cluster the rows (genes) and columns (conditions) of a data matrix simultaneously to identify groups of rows that are constant over a group of columns. This kind of clustering is called biclustering. Biclustering algorithms are extensively used in DNA microarray data analysis, and more effective biclustering algorithms are highly desirable. We introduce a new algorithm, Enumerative Tree (EnumTree), for the biclustering of binary microarray data. EnumTree adopts the approach of enumerating biclusters and extracts all biclusters of consistently good quality. Its main idea is the construction of a new tree structure to represent adequately the different biclusters discovered during the enumeration process, following a strategy of discovering all biclusters at a time. The performance of the proposed algorithm is assessed using both synthetic and real DNA microarray data; our algorithm outperforms other biclustering algorithms for binary microarray data on biclusters with different numbers of rows. Moreover, we test the biological significance using a gene annotation web tool to show that our proposed method is able to produce biologically relevant biclusters.

Keywords: DNA microarray, biclustering, gene expression data, tree, data mining

Procedia PDF Downloads 361
24208 The Impact of Financial Reporting on Sustainability

Authors: Lynn Ruggieri

Abstract:

The worldwide pandemic has only increased sustainability awareness. The public is demanding that businesses be held accountable for their impact on the environment. While financial data enjoys uniformity in reporting requirements, there are no uniform reporting requirements for non-financial data. Europe is leading the way with some standards being implemented for reporting non-financial sustainability data; however, there is no uniformity globally. And without uniformity, there is not a clear understanding of what information to include and how to disclose it. Sustainability reporting will provide important information to stakeholders and will enable businesses to understand their impact on the environment. Therefore, there is a crucial need for this data. This paper looks at the history of sustainability reporting in the countries of the European Union and throughout the world and makes a case for worldwide reporting requirements for sustainability.

Keywords: financial reporting, non-financial data, sustainability, global financial reporting

Procedia PDF Downloads 159
24207 Foot-and-Mouth Virus Detection in Asymptomatic Dairy Cows without Foot-and-Mouth Disease Outbreak

Authors: Duanghathai Saipinta, Tanittian Panyamongkol, Witaya Suriyasathaporn

Abstract:

Animal management aims to provide a suitable environment for animals, allowing maximal productivity. Prevention of disease is an important part of animal management. Foot-and-mouth disease (FMD) is a highly contagious viral disease in cattle and is an economically important animal disease worldwide. Monitoring the FMD virus on farms is a useful management practice for the prevention of FMD outbreaks. A recent publication indicated that samples collected by nasal swab can be used for monitoring FMD in symptomatic cows. Therefore, the objective of this study was to detect the FMD virus in asymptomatic dairy cattle using nasal swab samples in the absence of an FMD outbreak. The study was conducted from December 2020 to June 2021 using 185 dairy cattle with no clinical signs of FMD in Chiang Mai Province, Thailand. Cows were selected at random, nasal mucosal swabs were used to collect samples from the selected cows, and the samples were then evaluated for the presence of the FMD virus using a real-time RT-PCR assay. In total, the FMD virus was detected in 4.9% of the dairy cattle, including on 2 dairy farms in Mae-on (8 samples; 9.6%) and 1 farm in the Chai-Prakan district (1 sample; 1.2%). Interestingly, both farms in Mae-on had FMD outbreaks within 6 months after this detection. This indicates that the FMD virus present in asymptomatic cattle might be related to a subsequent outbreak of FMD; the outbreak demonstrates the presence of the virus in the environment. In conclusion, monitoring of FMD can be performed by nasal swab collection. Further investigation is needed to show whether the FMD virus present in asymptomatic cattle could be the cause of a subsequent FMD outbreak or not.

Keywords: cattle, foot-and-mouth disease, nasal swab, real-time rt-PCR assay

Procedia PDF Downloads 214
24206 Methods and Algorithms of Ensuring Data Privacy in AI-Based Healthcare Systems and Technologies

Authors: Omar Farshad Jeelani, Makaire Njie, Viktoriia M. Korzhuk

Abstract:

Recently, the application of AI-powered algorithms in healthcare has continued to flourish. In particular, access to healthcare information, including patient health history, diagnostic data, and PII (Personally Identifiable Information), is paramount in the delivery of efficient patient outcomes. However, as the exchange of healthcare information between patients and healthcare providers through AI-powered solutions increases, protecting a person's information and privacy has become even more important. Arguably, the increased adoption of healthcare AI has resulted in a significant focus on the risks to the security and privacy of healthcare data and on the corresponding protection measures, leading to escalated analyses and enforcement. Since these challenges arise from the use of AI-based healthcare solutions to manage healthcare data, AI-based data protection measures are used to resolve the underlying problems. Consequently, this project proposes AI-powered safeguards and policies/laws to protect the privacy of healthcare data. The project presents the best-in-class techniques used to preserve the data privacy of AI-powered healthcare applications. Popular privacy-protecting methods like federated learning, cryptographic techniques, differential privacy methods, and hybrid methods are discussed together with potential cyber threats, data security concerns, and prospects. The project also discusses some of the relevant data security acts/laws that govern the collection, storage, and processing of healthcare data to guarantee that owners' privacy is preserved. This inquiry discusses various gaps and uncertainties associated with healthcare AI data collection procedures and identifies potential correction/mitigation measures.
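
One of the privacy-protecting methods mentioned above, differential privacy, can be sketched in a few lines: the Laplace mechanism below releases a noisy patient count (sensitivity 1) from hypothetical records, with the privacy budget epsilon as an assumed parameter.

```python
# Minimal sketch of one technique discussed above (differential privacy): release a
# noisy patient count via the Laplace mechanism. Records and epsilon are hypothetical.
import numpy as np

def dp_count(records, predicate, epsilon=0.5, seed=None):
    """Differentially private count: the sensitivity of a counting query is 1."""
    rng = np.random.default_rng(seed)
    true_count = sum(1 for r in records if predicate(r))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

patients = [{"age": 64, "diagnosis": "diabetes"},
            {"age": 51, "diagnosis": "hypertension"},
            {"age": 70, "diagnosis": "diabetes"}]

noisy = dp_count(patients, lambda r: r["diagnosis"] == "diabetes", epsilon=0.5, seed=42)
print("noisy count of diabetes patients:", round(noisy, 2))
```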

Keywords: data privacy, artificial intelligence (AI), healthcare AI, data sharing, healthcare organizations (HCOs)

Procedia PDF Downloads 66
24205 Mapping Tunnelling Parameters for Global Optimization in Big Data via Dye Laser Simulation

Authors: Sahil Imtiyaz

Abstract:

One of the biggest challenges has emerged from the ever-expanding, dynamic, and instantaneously changing space of Big Data; finding a data point and extracting wisdom from this space is a hard task. In this paper, we reduce the space of big data to a Hamiltonian formalism that is in concordance with the Ising model. For this formulation, we simulate the system using a dye laser model in FORTRAN and analyse the dynamics of the data point in the energy well of the rhodium atom. After mapping the photon intensity and pulse width to energy and potential, we concluded that as the energy increases, the probability of tunnelling also increases up to a point, then decreases, and then shows randomizing behaviour. This is due to decoherence with the environment and hence a loss of 'quantumness'. This interprets the efficiency parameter and the extent of quantum evolution. The results are strongly encouraging in favour of the use of 'Topological Property' as a source of information instead of the qubit.

Keywords: big data, optimization, quantum evolution, hamiltonian, dye laser, fermionic computations

Procedia PDF Downloads 182
24204 Applying Different Steganography Techniques in Cloud Computing Technology to Improve Cloud Data Privacy and Security Issues

Authors: Muhammad Muhammad Suleiman

Abstract:

Cloud computing is a versatile concept that refers to a service that allows users to outsource their data without having to worry about local storage issues. However, the most pressing issue to be addressed is maintaining a secure and reliable data repository rather than relying on untrustworthy service providers. In this study, we look at how steganography approaches, in collaboration with digital watermarking, can greatly improve the system's effectiveness and data security when used for cloud computing. The main requirement of such frameworks, where data is transferred or exchanged between servers and users, is safe data management in cloud environments. Steganography is among the most effective methods for safe communication in the cloud. Steganography is a method of writing coded messages in such a way that only the sender and recipient can safely interpret and display the information hidden in the communication channel. This study presents a new text steganography method for hiding a secret English text file inside a cover English text file to ensure data protection in cloud computing. Data protection, data hiding capability, and time were all improved using the proposed technique.
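
As a generic illustration of text steganography (not the method proposed in this study), the sketch below hides a secret string as zero-width Unicode characters appended to a cover text, so that the stego text renders identically to the cover:

```python
# Illustrative text steganography only (not the method proposed in the paper):
# hide the secret's bits as zero-width Unicode characters appended to the cover text.
ZERO = "\u200b"   # zero-width space      -> bit 0
ONE  = "\u200c"   # zero-width non-joiner -> bit 1

def hide(cover: str, secret: str) -> str:
    bits = "".join(f"{b:08b}" for b in secret.encode("utf-8"))
    return cover + "".join(ONE if bit == "1" else ZERO for bit in bits)

def reveal(stego: str) -> str:
    bits = "".join("1" if ch == ONE else "0" for ch in stego if ch in (ZERO, ONE))
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

stego_text = hide("Quarterly report attached as requested.", "meet at dawn")
assert stego_text.startswith("Quarterly report")      # looks like the cover text
print(reveal(stego_text))                              # -> "meet at dawn"
```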

Keywords: cloud computing, steganography, information hiding, cloud storage, security

Procedia PDF Downloads 176
24203 Investigation on Performance of Change Point Algorithm in Time Series Dynamical Regimes and Effect of Data Characteristics

Authors: Farhad Asadi, Mohammad Javad Mollakazemi

Abstract:

In this paper, Bayesian online inference in models of data series is constructed by a change-point algorithm, which separates the observed time series into independent segments and studies the change and variation of the data regime along with the related statistical characteristics. Variation in the statistical characteristics of time series data often represents distinct phenomena in a dynamical system, such as a change in brain state reflected in EEG signal measurements or a change in an important data regime in many dynamical systems. In this paper, a prediction algorithm for studying change-point locations in time series data is simulated. It is verified that the pattern of the proposed data distribution is an important factor for simpler and smoother fluctuation of the hazard rate parameter and for better identification of change-point locations. Finally, the conditions under which the time series distribution affects the factors in this approach are explained and validated with different time series databases for several dynamical systems.
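
A minimal Bayesian online change-point sketch of the kind studied here (constant hazard rate, Gaussian observations with known variance, conjugate updates) is shown below on a synthetic series; it illustrates the algorithm family, not the authors' exact simulation.

```python
# Minimal Bayesian online change-point sketch (constant hazard, Gaussian data with
# known observation variance); illustrates the approach, not the authors' exact model.
import numpy as np

def online_changepoints(x, hazard=1/100, mu0=0.0, var0=10.0, var_x=1.0):
    T = len(x)
    R = np.zeros((T + 1, T + 1))          # R[r, t] = P(run length = r | x[:t])
    R[0, 0] = 1.0
    mu, var = np.array([mu0]), np.array([var0])   # posterior params per run length
    detected = []
    for t, xt in enumerate(x):
        pred_var = var + var_x            # predictive variance per run-length hypothesis
        pred = np.exp(-0.5 * (xt - mu) ** 2 / pred_var) / np.sqrt(2 * np.pi * pred_var)
        R[1:t + 2, t + 1] = R[:t + 1, t] * pred * (1 - hazard)      # run grows
        R[0, t + 1] = np.sum(R[:t + 1, t] * pred * hazard)          # change point
        R[:, t + 1] /= R[:, t + 1].sum()
        # conjugate Gaussian update, then prepend the prior for a fresh run
        post_var = 1.0 / (1.0 / var + 1.0 / var_x)
        post_mu = post_var * (mu / var + xt / var_x)
        mu = np.concatenate(([mu0], post_mu))
        var = np.concatenate(([var0], post_var))
        if t > 0 and np.argmax(R[:t + 2, t + 1]) == 0:
            detected.append(t)
    return detected

rng = np.random.default_rng(3)
signal = np.concatenate([rng.normal(0, 1, 100), rng.normal(4, 1, 100)])  # shift at t=100
print("detected change points near:", online_changepoints(signal))
```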

Keywords: time series, fluctuation in statistical characteristics, optimal learning, change-point algorithm

Procedia PDF Downloads 412
24202 Determination of the Risks of Heart Attack at the First Stage as Well as Their Control and Resource Planning with the Method of Data Mining

Authors: İbrahi̇m Kara, Seher Arslankaya

Abstract:

Data mining, frequently preferred in the field of engineering in particular, has now begun to be used in the field of health as well, since the data in the health sector have reached great dimensions. Data mining aims to reveal models from large amounts of raw data in agreement with a given purpose and to search for the rules and relationships that enable predictions about the future to be made from large data sets. It helps the decision-maker find the relationships among the data that emerge at the decision-making stage. In this study, the aim is to determine the risk of heart attack at the first stage, to control it, and to plan its resources with the method of data mining. Through the early and correct diagnosis of heart attacks, the aim is to reveal the factors that affect the disease, to protect health and choose the right treatment methods, to reduce health expenditures, and to shorten the duration of patients' stays in hospital. In this way, the diagnosis and treatment costs of a heart attack will be scrutinized, which will be useful for determining the risk of the disease at the first stage, controlling it, and planning its resources.
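
The abstract does not name a specific classifier; as one common data mining method for this kind of first-stage risk screening, the sketch below fits a shallow decision tree to purely hypothetical patient features and synthetic labels.

```python
# One common data-mining technique for first-stage risk screening is a classification
# tree; the features and records below are purely hypothetical, not study data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Columns: age, systolic blood pressure (mmHg), cholesterol (mg/dL), smoker (0/1)
X = np.array([[45, 120, 180, 0], [62, 150, 240, 1], [55, 135, 210, 0],
              [70, 160, 260, 1], [38, 118, 170, 0], [58, 145, 230, 1],
              [49, 128, 200, 0], [66, 155, 250, 1]])
y = np.array([0, 1, 0, 1, 0, 1, 0, 1])   # 1 = high heart-attack risk (synthetic labels)

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["age", "sbp", "chol", "smoker"]))

new_patient = np.array([[60, 148, 235, 1]])
print("predicted risk class:", int(tree.predict(new_patient)[0]))
```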

Keywords: data mining, decision support systems, heart attack, health sector

Procedia PDF Downloads 343
24201 Bayesian Borrowing Methods for Count Data: Analysis of Incontinence Episodes in Patients with Overactive Bladder

Authors: Akalu Banbeta, Emmanuel Lesaffre, Reynaldo Martina, Joost Van Rosmalen

Abstract:

Including data from previous studies (historical data) in the analysis of the current study may reduce the sample size requirement and/or increase the power of the analysis. The most common example is incorporating historical control data in the analysis of a current clinical trial. However, this only applies when the historical control data are similar enough to the current control data. Recently, several Bayesian approaches for incorporating historical data have been proposed, such as the meta-analytic-predictive (MAP) prior and the modified power prior (MPP), both for a single historical control arm and for multiple historical control arms. Here, we examine the performance of the MAP and the MPP approaches for the analysis of (over-dispersed) count data. To this end, we propose a computational method for the MPP approach for the Poisson and the negative binomial models. We conducted an extensive simulation study to assess the performance of the Bayesian approaches. Additionally, we illustrate our approaches on an overactive bladder data set. For similar data across the control arms, the MPP approach outperformed the MAP approach with respect to statistical power. When the means across the control arms are different, the MPP yielded a slightly inflated type I error (TIE) rate, whereas the MAP did not. In contrast, when the dispersion parameters are different, the MAP gave an inflated TIE rate, whereas the MPP did not. We conclude that the MPP approach is more promising than the MAP approach for incorporating historical count data.
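
The power-prior idea of discounting historical information by a weight a0 can be illustrated with a conjugate Gamma-Poisson sketch; this is a simplification of the MPP/MAP machinery discussed above, and the episode counts and a0 value are hypothetical.

```python
# Minimal conjugate illustration of the power-prior idea for a Poisson rate (historical
# counts discounted by a weight a0); a simplification of the MPP/MAP machinery in the
# paper, with hypothetical incontinence-episode counts.
import numpy as np
from scipy import stats

historical = np.array([5, 7, 4, 6, 8, 5])   # episode counts, historical control arm
current    = np.array([6, 9, 7, 8])         # episode counts, current control arm
a0 = 0.5                                     # borrowing weight in [0, 1]

# Vague Gamma(alpha, beta) prior on the Poisson rate, updated with the a0-weighted
# historical data and then with the current data (Gamma-Poisson conjugacy).
alpha, beta = 0.001, 0.001
alpha_post = alpha + a0 * historical.sum() + current.sum()
beta_post = beta + a0 * len(historical) + len(current)

posterior = stats.gamma(a=alpha_post, scale=1.0 / beta_post)
print("posterior mean rate:", round(posterior.mean(), 2))
print("95% credible interval:", np.round(posterior.ppf([0.025, 0.975]), 2))
```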

Keywords: count data, meta-analytic prior, negative binomial, poisson

Procedia PDF Downloads 106