Search results for: data source
26557 Meanings and Concepts of Standardization in Systems Medicine
Authors: Imme Petersen, Wiebke Sick, Regine Kollek
Abstract:
In systems medicine, high-throughput technologies produce large amounts of data on different biological and pathological processes, including (disturbed) gene expressions, metabolic pathways and signaling. The large volume of data of different types, stored in separate databases and often located at different geographical sites, has posed new challenges regarding data handling and processing. Tools based on bioinformatics have been developed to resolve the emerging problems of systematizing, standardizing and integrating the various data. However, the heterogeneity of data gathered at different levels of biological complexity is still a major challenge in data analysis. To build multilayer disease modules, large and heterogeneous data of disease-related information (e.g., genotype, phenotype, environmental factors) are correlated. Therefore, a great deal of attention in systems medicine has been put on data standardization, primarily to retrieve and combine large, heterogeneous datasets into standardized and incorporated forms and structures. However, this data-centred concept of standardization in systems medicine is contrary to the debate in science and technology studies (STS) on standardization, which rather emphasizes the dynamics, contexts and negotiations of standard operating procedures. Based on empirical work on research consortia that explore the molecular profile of diseases to establish systems medical approaches in the clinic in Germany, we trace how standardized data are processed and shaped by bioinformatics tools, how scientists using such data in research perceive such standard operating procedures, and which consequences for knowledge production (e.g., modeling) arise from it. Hence, different concepts and meanings of standardization are explored to gain a deeper insight into standard operating procedures, not only in systems medicine but also beyond. Keywords: data, science and technology studies (STS), standardization, systems medicine
Procedia PDF Downloads 341
26556 Multicenter Baseline Survey to Outline Antimicrobial Prescribing Practices at Six Public Sector Tertiary Care Hospitals in a Low Middle Income Country
Authors: N. Khursheed, M. Fatima, S. Jamal, A. Raza, S. Rattani, Q. Ahsan, A. Rasheed, M. Jawed
Abstract:
Introduction: Antibiotics are among the most commonly prescribed medicines to treat bacterial infections. Their misuse intensifies resistance, and overuse incurs heavy losses to the healthcare system in terms of increased treatment costs and enhanced disease burden. Studies show that 40% of empirically used antibiotics are irrationally utilized. The objective of this study was to evaluate the prescribing pattern of antibiotics at six public sector tertiary care hospitals across Pakistan. Methods: A multicenter cross-sectional point prevalence survey (PPS) was conducted in selected wards of six public sector tertiary care hospitals in Pakistan as part of the Clinical Engagement program by the Fleming Fund Country Grant Pakistan in collaboration with Indus Hospital & Health Network (IHHN) from February to March 2021. The participating hospitals were Jinnah Postgraduate Medical Center and Dr. Ruth K. M. Pfau Civil Hospital from Karachi, Sheikh Zayed Hospital Lahore, Nishtar Medical University Hospital Multan, Medical Teaching Institute Hayatabad Medical Complex Peshawar, and Provincial Headquarters Hospital Gilgit. The WHO PPS methodology was used for data collection (hospital-, ward-, and patient-level data were collected). Data were entered into the open-source Kobo Collect application and analyzed using SPSS (version 22.0). Findings: Medical records of 837 in-patients were surveyed, of which the prevalence of antibiotic use was 78.5%. The most commonly prescribed antimicrobial was Ceftriaxone (21.7%), which is categorized in the Watch group of the WHO AWaRe classification, followed by Metronidazole (17.3%), Cefoperazone/Sulbactam (8.4%), Co-Amoxiclav (6.3%) and Piperacillin/Tazobactam (5.9%). The antibiotics were prescribed largely for surgical prophylaxis (36.7%), followed by community-acquired infections (24.7%). One antibiotic was prescribed to 46.7% of patients, two to 39.9%, and three or more to 12.5%. Two of the six hospitals (30%) had functional drug and therapeutic committees, three (50%) had infection prevention and control committees, and one facility had an antibiotic formulary. Conclusion: The findings demonstrate high consumption of broad-spectrum antimicrobials and emphasize the importance of expanding the antimicrobial stewardship program. Mentoring clinical teams will help to rationalize antimicrobial use. Keywords: antimicrobial resistance, antimicrobial stewardship, point prevalence survey, antibiotics
Procedia PDF Downloads 104
26555 Integrated On-Board Diagnostic-II and Direct Controller Area Network Access for Vehicle Monitoring System
Authors: Kavian Khosravinia, Mohd Khair Hassan, Ribhan Zafira Abdul Rahman, Syed Abdul Rahman Al-Haddad
Abstract:
The CAN (controller area network) bus is introduced as a multi-master, message-broadcast system. The messages sent on the CAN are used to communicate state information, referred to as signals, between different ECUs, which provides data consistency in every node of the system. OBD-II dongles based on the request-and-response method are the widespread solution among researchers for extracting sensor data from cars. Unfortunately, most past research does not consider the resolution and quantity of the input data extracted through OBD-II technology. The maximum feasible scan rate is only 9 queries per second, which provides 8 data points per second when using the ELM327, a well-known OBD-II dongle. This study aims to develop and design a programmable, latency-sensitive vehicle data acquisition system that improves the modularity and flexibility needed to extract exact, trustworthy, and fresh car sensor data at higher frequency rates. Furthermore, the researcher must break apart, thoroughly inspect, and observe the internal network of the vehicle, which may cause severe damage to the expensive ECUs of the vehicle due to intrinsic vulnerabilities of the CAN bus during initial research. Desired sensor data were collected from various vehicles using a Raspberry Pi 3 as the computing and processing unit, with the OBD (request-response) and direct CAN methods applied at the same time. Two types of data were collected for this study. The first is CAN bus frame data, which represents the data collected for each line of hex data sent from an ECU; the second is the OBD data, which represents the limited data that can be requested from an ECU under standard conditions. The proposed system is a reconfigurable, human-readable and multi-task telematics device that can be fitted into any vehicle with minimum effort and minimum time lag in the data extraction process. A standard operational procedure for the experimental vehicle network test bench is developed and can be used for future vehicle network testing experiments. Keywords: CAN bus, OBD-II, vehicle data acquisition, connected cars, telemetry, Raspberry Pi3
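A minimal Python sketch of the dual acquisition path described above (the channel name, wiring, and polling loop are assumptions; it relies on the python-obd and python-can packages and presumes an ELM327 dongle plus a SocketCAN-capable CAN interface attached to the Raspberry Pi):

```python
import obd   # python-obd: ELM327 request-response path
import can   # python-can: direct CAN frame capture via SocketCAN

# OBD-II path: query a standard PID through the ELM327 dongle
obd_conn = obd.OBD()                       # auto-detects the serial/USB adapter
rpm = obd_conn.query(obd.commands.RPM)     # ~9 queries/s is the practical ceiling
print("OBD RPM:", rpm.value)

# Direct CAN path: read raw frames from the bus (channel name is an assumption)
bus = can.interface.Bus(channel="can0", bustype="socketcan")
for _ in range(10):
    frame = bus.recv(timeout=1.0)          # each frame is one line of hex data from an ECU
    if frame is not None:
        print(hex(frame.arbitration_id), frame.data.hex())
```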
Procedia PDF Downloads 204
26554 Big Data in Construction Project Management: The Colombian Northeast Case
Authors: Sergio Zabala-Vargas, Miguel Jiménez-Barrera, Luz Vargas-Sánchez
Abstract:
In recent years, information related to project management in organizations has been increasing exponentially. Performance data, management statistics, and indicator results have made the collection, analysis, traceability, and dissemination of information essential for project managers. In this sense, there are current trends toward facilitating efficient decision-making through emerging technologies such as Machine Learning, Data Analytics, Data Mining, and Big Data. The latter is the most interesting for this project. This research is part of the thematic line Construction Methods and Project Management. Many authors highlight the relevance that emerging technologies such as Big Data have gained in recent years in project management in the construction sector. The main focus is the optimization of time, scope, and budget and, in general, the mitigation of risks. This research was developed in the northeastern region of Colombia (South America). The first phase was aimed at diagnosing the use of emerging technologies (Big Data) in the construction sector. In Colombia, the construction sector represents more than 50% of the productive system, and more than 2 million people participate in this economic segment. A quantitative approach was used: a survey was applied to a sample of 91 companies in the construction sector. Preliminary results indicate that the use of Big Data and other emerging technologies is very low and also that there is interest in modernizing project management. There is evidence of a correlation between the interest in using new data management technologies and the incorporation of Building Information Modeling (BIM). The next phase of the research will allow the generation of guidelines and strategies for the incorporation of technological tools in the construction sector in Colombia. Keywords: big data, building information modeling, technology, project management
Procedia PDF Downloads 128
26553 Influenza Vaccination Acceptance and Refusal Reasons among Tunisian Elderly
Authors: Ghassen Kharroubi, Ines Cherif, Leila Bouabid, Adel Gharbi, Aicha Boukthir, Margaret McCarron, Nissaf Ben Alaya, Afif Ben Salah, Jihene Bettaieb
Abstract:
Influenza vaccination (IV) is recommended for elderly persons, especially those with underlying conditions. In countries where IV rates in the elderly remain unsatisfactory, exploring attitudes of older persons toward the flu vaccine could be useful to identify barriers and facilitators to IV. The aim of this study was to determine the reasons for IV acceptance or decline in the Tunisian elderly. A national cross-sectional study was conducted in 2019, among persons aged 60 years and over with chronic disease. Data were collected using a standard administered questionnaire. Of the 1191 older persons included, 19.4% received the influenza vaccine in the 2018-2019 flu season. The two main reasons that may lead to refusal of vaccination were concerns that the vaccine could cause side effects (71.5%) and a belief that the vaccine was ineffective (33.9%). The main reason that may lead to accepting vaccination was a doctor’s recommendation (41.1%). Doctors were by far the most trusted source for information regarding influenza vaccine (91.5%) followed by pharmacists (17.6%). Our results highlighted the important role that doctors could play in promoting IV among the Tunisian elderly. Physicians should correct misconceptions about adverse events and the efficiency of the vaccine. In fact, influenza vaccines are generally effective and safe among older persons.Keywords: attitudes, influenza vaccination, older persons, Tunisia
Procedia PDF Downloads 152
26552 Deep Learning Approach to Trademark Design Code Identification
Authors: Girish J. Showkatramani, Arthi M. Krishna, Sashi Nareddi, Naresh Nula, Aaron Pepe, Glen Brown, Greg Gabel, Chris Doninger
Abstract:
Trademark examination and approval is a complex process that involves analysis and review of the design components of the marks, such as the visual representation, as well as the textual data associated with marks, such as the marks' description. Currently, the process of identifying marks with similar visual representation is done manually in the United States Patent and Trademark Office (USPTO) and takes a considerable amount of time. Moreover, the accuracy of these searches depends heavily on the experts determining the trademark design codes used to catalog the visual designs in the mark. In this study, we explore several methods to automate trademark design code classification. Based on recent successes of convolutional neural networks in image classification, we have used several different convolutional neural networks such as Google's Inception v3, Inception-ResNet-v2, and Xception. The study also looks into other techniques to augment the results from CNNs, such as using the Open Source Computer Vision Library (OpenCV) to pre-process the images. This paper reports the results of the various models trained on years of annotated trademark images. Keywords: trademark design code, convolutional neural networks, trademark image classification, trademark image search, Inception-ResNet-v2
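As an illustration of the transfer-learning setup the abstract describes, here is a minimal Keras sketch; the number of design-code classes, the multi-label output head, and the dataset objects are assumptions rather than details from the paper:

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionV3

NUM_DESIGN_CODES = 1400  # hypothetical count of design-code classes

base = InceptionV3(weights="imagenet", include_top=False, input_shape=(299, 299, 3))
base.trainable = False   # reuse the pre-trained convolutional features

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(512, activation="relu"),
    layers.Dense(NUM_DESIGN_CODES, activation="sigmoid"),  # multi-label: a mark may carry several codes
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # train_ds/val_ds: annotated trademark images
```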
Procedia PDF Downloads 232
26551 Quality Analysis of Vegetables Through Image Processing
Authors: Abdul Khalique Baloch, Ali Okatan
Abstract:
The quality analysis of food and vegetables from images is a hot topic nowadays, as researchers improve on previous findings through different techniques and methods. In this research, we have reviewed the literature, identified gaps in it, suggested an improved approach, designed the algorithm, and developed software to measure quality from images, where the accuracy of the image analysis shows better results, and we compare the results with previous work done so far. The application uses an open-source dataset and the Python language with the TensorFlow Lite framework. In this research, we focus on sorting food and vegetables from images: the application can sort and grade them after processing the images, which could produce fewer errors than human-based manual grading. Digital picture datasets were created, and the collected images were arranged by class. The classification accuracy of the system was about 94%. As fruits and vegetables play a main role in day-to-day life, their quality is important in evaluating agricultural produce, and customers always want to buy good-quality fruits and vegetables. This document is about quality detection of fruits and vegetables using images. Many customers suffer because of unhealthy fruits and vegetables from suppliers, and there is no proper quality measurement level followed by hotel management. We have developed software to measure the quality of fruits and vegetables from images; it indicates whether the fruits and vegetables are fresh or rotten. Some of the methods reviewed in this thesis include digital images, ResNet, VGG16, CNN, and transfer learning for grading and feature extraction. The application uses an open-source dataset of images, the language used is Python, and a framework for the system is designed. Keywords: deep learning, computer vision, image processing, rotten fruit detection, fruits quality criteria, vegetables quality criteria
Procedia PDF Downloads 70
26550 Minimum Data of a Speech Signal as Special Indicators of Identification in Phonoscopy
Authors: Nazaket Gazieva
Abstract:
Voice biometric data associated with physiological, psychological and other factors are widely used in forensic phonoscopy. There are various methods for identifying and verifying a person by voice. This article explores the minimum speech signal data as individual parameters of a speech signal. Monozygotic twins are believed to be genetically identical. Using the minimum data of the speech signal, we came to the conclusion that the voice imprint of monozygotic twins is individual. Based on the results of the experiment, we conclude that the minimum indicators of the speech signal are more stable and reliable for phonoscopic examinations. Keywords: phonogram, speech signal, temporal characteristics, fundamental frequency, biometric fingerprints
Procedia PDF Downloads 144
26549 Improvement of Thermal Comfort Conditions in an Urban Space "Case Study: The Square of Independence, Setif, Algeria"
Authors: Ballout Amor, Yasmina Bouchahm, Lacheheb Dhia Eddine Zakaria
Abstract:
Several studies all around the world were conducted on the phenomenon of the urban heat island, and referring to the results obtained, one of the most important factors that influence this phenomenon is the mineralization of the cities which means the reducing of evaporative urban surfaces, replacing vegetation and wetlands with concrete and asphalt. The use of vegetation and water can change the urban environment and improve comfort, thus reduce the heat island. The trees act as a mask to the sun, wind, and sound, and also as a source of humidity which reduces air temperature and surrounding surfaces. Water also acts as a buffer to noise; it is also a source of moisture and regulates temperature not to mention the psychological effect on humans. Our main objective in this paper is to determine the impact of vegetation, ponds and fountains on the urban micro climate in general and on the thermal comfort of people along the Independence square in the Algerian city of Sétif, which is a semi-arid climate, in particularly. In order to reach this objective, a comparative study between different scenarios has been done; the use of the Envi-met program enabled us to model the urban environment of the Independence Square and to study the possibility of improving the conditions of comfort by adding an amount of vegetation and water ponds. After studying the results obtained (temperature, relative humidity, wind speed, PMV and PPD indicators), the efficiency of the additions we've made on the square was confirmed and this is what helped us to confirm our assumptions regarding the terms of comfort in the studied site, and in the end we are trying to develop recommendations and solutions which may contribute to improve the conditions for greater comfort in the Independence square.Keywords: comfort in outer space, urban environment, scenarisation, vegetation, water ponds, public square, simulation
Procedia PDF Downloads 454
26548 Internal Audit Function Contributions to the External Audit
Authors: Douglas F. Prawitt, Nathan Y. Sharp, David A. Wood
Abstract:
Consistent with prior experimental and survey studies, we find that IAFs that spend more time directly assisting the external auditor are associated with lower external audit fees. Interestingly, we do not find evidence that external auditors reduce fees based on work previously performed by the IAF. We also find that the time spent assisting the external auditor has a greater negative effect on external audit fees than the time spent performing tasks upon which the auditor may rely but that are not performed as direct assistance to the external audit. Our results also show that the proxies previously used to measure this relation are either not associated with or negatively associated with our direct measures of how the IAF can contribute to the external audit and are highly positively associated with the size and complexity of the organization. Thus, we conclude that the disparate experimental and archival results may be attributable to issues surrounding the construct validity of measures used in previous archival studies and that, when measures similar to those used in experimental studies are employed in archival tests, the archival results are consistent with experimental findings. Our research makes four primary contributions to the literature. First, we provide evidence that internal auditing contributes to a reduction in external audit fees. Second, we replicate and provide an explanation for why previous archival studies find that internal auditing has either no association with external audit fees or is associated with an increase in those fees: prior studies generally use proxies of internal audit contribution that do not adequately capture the intended construct. Third, our research expands on survey-based research (e.g., Oil Libya sh.co.) by separately examining the impact on the audit fee of the internal auditors' work indirectly assisting external auditors and of the internal auditors' prior work upon which external auditors can rely. Finally, we extend prior research by using a new, independent data source to validate and extend prior studies. This data set also allows us to examine the impact of internal auditing on the external audit fee and to use a more comprehensive external audit fee model that better controls for the determinants of the external audit fee. Keywords: internal audit, contribution, external audit, function
Procedia PDF Downloads 124
26547 Efficient Solid Oxide Electrolysers for Syn-Gas Generation Using Renewable Energy
Authors: G. Kaur, A. P. Kulkarni, S. Giddey
Abstract:
Production of fuels and chemicals using renewable energy is a promising way for large-scale energy storage and export. Solid oxide electrolysers (SOEs) integrated with renewable source of energy can produce 'Syngas' H₂/CO from H₂O/CO₂ in the desired ratio for further conversion to liquid fuels. As only a waste CO₂ from industrial and power generation processes is utilized in these processes, this approach is CO₂ neutral compared to using fossil fuel feedstock. In addition, the waste heat from industrial processes or heat from solar thermal concentrators can be effectively utilised in SOEs to further reduce the electrical requirements by up to 30% which boosts overall energy efficiency of the process. In this paper, the electrochemical performance of various novel steam/CO₂ reduction electrodes (cathode) would be presented. The efficiency and lifetime degradation data for single cells and a stack would be presented along with the response of cells to variable electrical load input mimicking the intermittent nature of the renewable energy sources. With such optimisation, newly developed electrodes have been tested for 500+ hrs with Faraday efficiency (electricity to fuel conversion efficiency) up to 95%, and thermal efficiency in excess of 70% based upon energy content of the syngas produced.Keywords: carbon dioxide, steam conversion, electrochemical system, energy storage, fuel production, renewable energy
Procedia PDF Downloads 237
26546 Degradation of Petroleum Hydrocarbons Using Pseudomonas Aeruginosa Isolated from Oil Contaminated Soil Incorporated into E. coli DH5α Host
Authors: C. S. Jeba Samuel
Abstract:
Soil, especially from oil fields, has posed a great hazard for terrestrial and marine ecosystems. The traditional treatment of oil-contaminated soil cannot degrade the crude oil completely. So far, biodegradation has proved to be an efficient method. During biodegradation, crude oil is used as the carbon source, and the addition of nitrogenous compounds increases microbial growth, resulting in the effective breakdown of crude oil components into low-molecular-weight components. The present study was carried out to evaluate the biodegradation of crude oil by the hydrocarbon-degrading microorganism Pseudomonas aeruginosa isolated from a natural environment such as oil-contaminated soil. Pseudomonas aeruginosa, an oil-degrading microorganism also called a hydrocarbon-utilizing microorganism (or "HUM" bug), can utilize crude oil as a sole carbon source. In this study, the biodegradation of crude oil was conducted with a modified mineral basal salt medium and nitrogen sources so as to increase the degradation. The plasmid from the isolated strain was incorporated into an E. coli DH5α host to speed up the degradation of oil, and its efficacy was assessed. The use of molecular techniques increased oil degradation, which was confirmed by the degradation of the aromatic and aliphatic rings of hydrocarbons and was inferred from the smaller number of peaks in Fourier Transform Infrared Spectroscopy (FTIR). The gas chromatogram again confirms better degradation by the transformed cells, shown by the smaller number of components obtained in the oil treated with transformed cells. This study demonstrated the technical feasibility of using direct inoculation of transformed cells onto the oil-contaminated region, thereby achieving better oil degradation in a shorter time than the degradation caused by the wild strain. Keywords: biodegradation, aromatic rings, plasmid, hydrocarbon, Fourier Transform Infrared Spectroscopy (FTIR)
Procedia PDF Downloads 372
26545 A Non-parametric Clustering Approach for Multivariate Geostatistical Data
Authors: Francky Fouedjio
Abstract:
Multivariate geostatistical data have become omnipresent in the geosciences and pose substantial analysis challenges. One of them is the grouping of data locations into spatially contiguous clusters so that data locations within the same cluster are more similar while clusters are different from each other, in some sense. Spatially contiguous clusters can significantly improve the interpretation that turns the resulting clusters into meaningful geographical subregions. In this paper, we develop an agglomerative hierarchical clustering approach that takes into account the spatial dependency between observations. It relies on a dissimilarity matrix built from a non-parametric kernel estimator of the spatial dependence structure of data. It integrates existing methods to find the optimal cluster number and to evaluate the contribution of variables to the clustering. The capability of the proposed approach to provide spatially compact, connected and meaningful clusters is assessed using bivariate synthetic dataset and multivariate geochemical dataset. The proposed clustering method gives satisfactory results compared to other similar geostatistical clustering methods.Keywords: clustering, geostatistics, multivariate data, non-parametric
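A minimal sketch of the general idea; the paper's non-parametric kernel estimator of spatial dependence is replaced here by a simple blend of attribute and spatial distances, so the mixing weight and synthetic data below are assumptions:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(200, 2))   # sampling locations (x, y)
values = rng.normal(size=(200, 3))            # three co-located variables

# Surrogate dissimilarity: blend attribute distance with spatial distance so that
# nearby, similar locations end up in the same cluster (mixing weight is assumed)
d_attr = squareform(pdist(values))
d_spat = squareform(pdist(coords))
alpha = 0.5
dissim = alpha * d_attr / d_attr.max() + (1 - alpha) * d_spat / d_spat.max()

Z = linkage(squareform(dissim, checks=False), method="average")  # agglomerative hierarchy
labels = fcluster(Z, t=5, criterion="maxclust")                  # cut into 5 spatially coherent clusters
print(np.bincount(labels))
```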
Procedia PDF Downloads 477
26544 A Data Mining Approach for Analysing and Predicting the Bank's Asset Liability Management Based on Basel III Norms
Authors: Nidhin Dani Abraham, T. K. Sri Shilpa
Abstract:
Asset liability management is an important aspect of the banking business. Moreover, today's banking is based on Basel III, which strictly regulates counterparty default. This paper focuses on the prediction and analysis of counterparty default risk, which is a type of risk that occurs when customers fail to repay the amount owed to the lender (a bank or any financial institution). This paper proposes an approach to reduce the counterparty risk occurring in financial institutions using an appropriate data mining technique and thus predicts the occurrence of NPAs. It also helps in asset building and in restructuring quality. Liability management is very important to carry out banking business. To know and analyze the depth of a bank's liability, a suitable technique is required. For that, a data mining technique is used to predict the dormant behaviour of various deposit bank customers. Various models are implemented, and the results for savings bank deposit customers are analyzed. All these data are cleaned using a data cleansing approach from the bank data warehouse. Keywords: data mining, asset liability management, BASEL III, banking
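The abstract does not name the specific data mining technique, so the sketch below uses a random forest classifier as a stand-in; the warehouse file name and feature columns are hypothetical:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Hypothetical cleaned extract from the bank data warehouse
df = pd.read_csv("loan_accounts.csv")   # assumed file and columns
features = ["exposure", "collateral_ratio", "days_past_due", "tenure_months", "credit_score"]
X, y = df[features], df["npa_flag"]     # npa_flag = 1 if the account turned non-performing

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=42)
clf = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```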
Procedia PDF Downloads 552
26543 Parallel Coordinates on a Spiral Surface for Visualizing High-Dimensional Data
Authors: Chris Suma, Yingcai Xiao
Abstract:
This paper presents Parallel Coordinates on a Spiral Surface (PCoSS), a parallel coordinate based interactive visualization method for high-dimensional data, and a test implementation of the method. Plots generated by the test system are compared with those generated by XDAT, a software implementing traditional parallel coordinates. Traditional parallel coordinate plots can be cluttered when the number of data points is large or when the dimensionality of the data is high. PCoSS plots display multivariate data on a 3D spiral surface and allow users to see the whole picture of high-dimensional data with less cluttering. Taking advantage of the 3D display environment in PCoSS, users can further reduce cluttering by zooming into an axis of interest for a closer view or by moving vantage points and by reorienting the viewing angle to obtain a desired view of the plots.Keywords: human computer interaction, parallel coordinates, spiral surface, visualization
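A toy sketch of the spiral-surface layout; the spiral winding, radius growth, and random data below are assumptions, and the real PCoSS system adds interaction such as zooming and viewpoint reorientation:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
data = rng.random((50, 8))                        # 50 records, 8 dimensions, already in [0, 1]
n, d = data.shape

# Place the d parallel axes along an Archimedean spiral in the x-y plane
theta = np.linspace(0, 3 * np.pi, d)              # assumed winding of the spiral
radius = 1 + 0.4 * theta
ax_x, ax_y = radius * np.cos(theta), radius * np.sin(theta)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
for j in range(d):                                # draw each axis as a vertical segment
    ax.plot([ax_x[j]] * 2, [ax_y[j]] * 2, [0, 1], color="black")
for row in data:                                  # polyline of each record across the spiral axes
    ax.plot(ax_x, ax_y, row, alpha=0.4)
plt.show()
```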
Procedia PDF Downloads 11
26542 Evaluation of the Boiling Liquid Expanding Vapor Explosion Thermal Effects in Hassi R'Mel Gas Processing Plant Using Fire Dynamics Simulator
Authors: Brady Manescau, Ilyas Sellami, Khaled Chetehouna, Charles De Izarra, Rachid Nait-Said, Fati Zidani
Abstract:
During a fire in an oil and gas refinery, several thermal accidents can occur and cause serious damage to people and environment. Among these accidents, the BLEVE (Boiling Liquid Expanding Vapor Explosion) is most observed and remains a major concern for risk decision-makers. It corresponds to a violent vaporization of explosive nature following the rupture of a vessel containing a liquid at a temperature significantly higher than its normal boiling point at atmospheric pressure. Their effects on the environment generally appear in three ways: blast overpressure, radiation from the fireball if the liquid involved is flammable and fragment hazards. In order to estimate the potential damage that would be caused by such an explosion, risk decision-makers often use quantitative risk analysis (QRA). This analysis is a rigorous and advanced approach that requires a reliable data in order to obtain a good estimate and control of risks. However, in most cases, the data used in QRA are obtained from the empirical correlations. These empirical correlations generally overestimate BLEVE effects because they are based on simplifications and do not take into account real parameters like the geometry effect. Considering that these risk analyses are based on an assessment of BLEVE effects on human life and plant equipment, more precise and reliable data should be provided. From this point of view, the CFD modeling of BLEVE effects appears as a solution to the empirical law limitations. In this context, the main objective is to develop a numerical tool in order to predict BLEVE thermal effects using the CFD code FDS version 6. Simulations are carried out with a mesh size of 1 m. The fireball source is modeled as a vertical release of hot fuel in a short time. The modeling of fireball dynamics is based on a single step combustion using an EDC model coupled with the default LES turbulence model. Fireball characteristics (diameter, height, heat flux and lifetime) issued from the large scale BAM experiment are used to demonstrate the ability of FDS to simulate the various steps of the BLEVE phenomenon from ignition up to total burnout. The influence of release parameters such as the injection rate and the radiative fraction on the fireball heat flux is also presented. Predictions are very encouraging and show good agreement in comparison with BAM experiment data. In addition, a numerical study is carried out on an operational propane accumulator in an Algerian gas processing plant of SONATRACH company located in the Hassi R’Mel Gas Field (the largest gas field in Algeria).Keywords: BLEVE effects, CFD, FDS, fireball, LES, QRA
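For context on the empirical correlations that the CFD predictions are benchmarked against, here is a rough Python sketch of a solid-flame fireball estimate; the coefficients are commonly quoted values that should be checked against the TNO/CCPS sources, and the emissive power, transmissivity, and centre height are assumptions:

```python
import math

def fireball_flux(mass_kg, distance_m, E_kw_m2=350.0, tau=0.7):
    """Point estimate of BLEVE fireball thermal flux from commonly quoted
    empirical correlations (coefficients assumed, to be verified), i.e. the
    kind of simplified model the CFD study aims to improve upon."""
    D = 5.8 * mass_kg ** (1 / 3)                     # maximum fireball diameter [m]
    t = 0.45 * mass_kg ** (1 / 3) if mass_kg < 3.0e4 else 2.6 * mass_kg ** (1 / 6)  # duration [s]
    H = 0.75 * D                                     # assumed fireball centre height [m]
    L = math.hypot(distance_m, H)                    # target-to-centre distance [m]
    F = (D / (2 * L)) ** 2                           # view factor of a sphere to a ground target
    q = E_kw_m2 * F * tau                            # received radiative flux [kW/m2]
    return D, t, q

print(fireball_flux(mass_kg=50_000, distance_m=300))
```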
Procedia PDF Downloads 186
26541 A Dynamic Ensemble Learning Approach for Online Anomaly Detection in Alibaba Datacenters
Authors: Wanyi Zhu, Xia Ming, Huafeng Wang, Junda Chen, Lu Liu, Jiangwei Jiang, Guohua Liu
Abstract:
Anomaly detection is a first and imperative step needed to respond to unexpected problems and to assure high performance and security in large data center management. This paper presents an online anomaly detection system through an innovative approach of ensemble machine learning and adaptive differentiation algorithms, and applies them to performance data collected from a continuous monitoring system for multi-tier web applications running in Alibaba data centers. We evaluate the effectiveness and efficiency of this algorithm with production traffic data and compare with the traditional anomaly detection approaches such as a static threshold and other deviation-based detection techniques. The experiment results show that our algorithm correctly identifies the unexpected performance variances of any running application, with an acceptable false positive rate. This proposed approach has already been deployed in real-time production environments to enhance the efficiency and stability in daily data center operations.Keywords: Alibaba data centers, anomaly detection, big data computation, dynamic ensemble learning
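A toy illustration of the dynamic-ensemble idea; the two member detectors, the feedback rule, and all thresholds are assumptions, whereas the production system combines more detectors and uses real feedback:

```python
import numpy as np

def static_threshold(x, limit=100.0):
    return x > limit                                  # flag values above a fixed cap

def zscore_detector(window, x, k=3.0):
    mu, sigma = np.mean(window), np.std(window) + 1e-9
    return abs(x - mu) > k * sigma                    # flag large deviations from recent history

def ensemble_detect(stream, window_size=60):
    """Toy dynamic ensemble: detector weights drift toward members that agree
    with the ensemble's own decision (a stand-in for operator feedback)."""
    weights = np.array([0.5, 0.5])
    history, alerts = [], []
    for x in stream:
        window = history[-window_size:] or [x]
        votes = np.array([static_threshold(x), zscore_detector(window, x)], dtype=float)
        decision = weights @ votes > 0.5
        alerts.append(bool(decision))
        weights = 0.9 * weights + 0.1 * (votes == decision)   # adaptive re-weighting
        weights /= weights.sum()
        history.append(x)
    return alerts

print(sum(ensemble_detect(np.r_[np.random.normal(50, 5, 300), [250, 260]])))
```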
Procedia PDF Downloads 201
26540 Deproteinization of Moroccan Sardine (Sardina pilchardus) Scales: A Pilot-Scale Study
Authors: F. Bellali, M. Kharroubi, Y. Rady, N. Bourhim
Abstract:
In Morocco, the fish processing industry is an important source of income and generates a large amount of by-products, including skins, bones, heads, guts, and scales. Those underutilized resources, particularly scales, contain a large amount of protein and calcium. Sardina pilchardus scales resulting from the transformation operations have the potential to be used as raw material for collagen production. Taking into account this strong expectation of the regional fish industry, the upgrading of sardine scales is well justified. In addition, political and societal demands for sustainability and environment-friendly industrial production systems, coupled with the depletion of fish resources, drive this trend forward. Therefore, fish scales used as a potential source from which to isolate collagen have a wide range of applications in the food, cosmetic, and biomedical industries. The main aim of this study is to isolate and characterize acid-soluble collagen from the scales of sardine, Sardina pilchardus. Experimental design methodology was adopted in the collagen processing for extraction optimization. The first stage of this work was to investigate the optimal conditions of sardine scale deproteinization using response surface methodology (RSM). The second part focuses on demineralization with HCl solution or EDTA, and the last one is to establish the optimum conditions for the isolation of collagen from fish scales by solvent extraction. The advancement from lab scale to pilot scale is a critical stage in technological development. In this study, the optimal conditions for deproteinization which were validated at laboratory scale were employed in the pilot-scale procedure. The deproteinization of fish scales was then demonstrated on a pilot scale (2 kg scales, 20 l NaOH), resulting in a protein content of 0.2 mg/ml and a hydroxyproline content of 2.11 mg/l. These results indicate that the pilot scale showed similar performance to that of the lab scale. Keywords: deproteinization, pilot scale, scale, sardine pilchardus
Procedia PDF Downloads 446
26539 The Role of Synthetic Data in Aerial Object Detection
Authors: Ava Dodd, Jonathan Adams
Abstract:
The purpose of this study is to explore the characteristics of developing a machine learning application using synthetic data. The study is structured to develop the application for the purpose of deploying the computer vision model. The findings discuss the realities of attempting to develop a computer vision model for practical purpose, and detail the processes, tools, and techniques that were used to meet accuracy requirements. The research reveals that synthetic data represents another variable that can be adjusted to improve the performance of a computer vision model. Further, a suite of tools and tuning recommendations are provided.Keywords: computer vision, machine learning, synthetic data, YOLOv4
Procedia PDF Downloads 225
26538 Perception-Oriented Model Driven Development for Designing Data Acquisition Process in Wireless Sensor Networks
Authors: K. Indra Gandhi
Abstract:
Wireless Sensor Networks (WSNs) have always been characterized for application-specific sensing, relaying and collection of information for further analysis. However, software development was not considered as a separate entity in this process of data collection which has posed severe limitations on the software development for WSN. Software development for WSN is a complex process since the components involved are data-driven, network-driven and application-driven in nature. This implies that there is a tremendous need for the separation of concern from the software development perspective. A layered approach for developing data acquisition design based on Model Driven Development (MDD) has been proposed as the sensed data collection process itself varies depending upon the application taken into consideration. This work focuses on the layered view of the data acquisition process so as to ease the software point of development. A metamodel has been proposed that enables reusability and realization of the software development as an adaptable component for WSN systems. Further, observing users perception indicates that proposed model helps in improving the programmer's productivity by realizing the collaborative system involved.Keywords: data acquisition, model-driven development, separation of concern, wireless sensor networks
Procedia PDF Downloads 434
26537 Comparative Analysis of Data Gathering Protocols with Multiple Mobile Elements for Wireless Sensor Network
Authors: Bhat Geetalaxmi Jairam, D. V. Ashoka
Abstract:
Wireless sensor networks are used in many applications to collect sensed data from different sources. Sensed data have to be delivered through the sensors' wireless interface using multi-hop communication towards the sink. Data collection in wireless sensor networks consumes energy, and energy consumption is the major constraint in WSNs. Reducing the energy consumption while increasing the amount of generated data is a great challenge. In this paper, we have implemented two data gathering protocols with multiple mobile sinks/elements to collect data from sensor nodes. The first is Energy-Efficient Data Gathering with Tour Length-Constrained Mobile Elements in Wireless Sensor Networks (EEDG), in which mobile sinks use a vehicle routing protocol to collect data. The second is An Intelligent Agent-based Routing Structure for Mobile Sinks in WSNs (IAR), in which mobile sinks use Prim's algorithm to collect data. The authors have implemented concepts that are common to both protocols, such as the deployment of mobile sinks, the generation of the visiting schedule, and the collection of data from the cluster members. The authors have compared the performance of both protocols by taking statistics based on performance parameters such as delay, packet drop, packet delivery ratio, available energy, and control overhead. The authors conclude this paper by showing that EEDG is more efficient than the IAR protocol, but with a few limitations, which include unaddressed issues like redundancy removal, idle listening, and the mobile sink's pause/wait state at the node. In future work, we plan to concentrate more on these limitations to arrive at a new energy-efficient protocol which will help in improving the lifetime of the WSN. Keywords: aggregation, consumption, data gathering, efficiency
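As a concrete illustration of the routing structure used by the IAR protocol, the sketch below builds a minimum spanning tree over randomly placed sensor nodes with Prim's algorithm; the node coordinates and the use of Euclidean link costs are assumptions:

```python
import heapq
import math
import random

random.seed(0)
nodes = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(20)]  # sensor coordinates

def dist(a, b):
    return math.dist(nodes[a], nodes[b])

def prim_mst(n):
    """Prim's algorithm: grows a minimum spanning tree that a mobile sink
    could use as its visiting/routing structure."""
    visited, edges, heap = {0}, [], [(dist(0, j), 0, j) for j in range(1, n)]
    heapq.heapify(heap)
    while heap and len(visited) < n:
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue
        visited.add(v)
        edges.append((u, v, round(w, 1)))
        for j in range(n):
            if j not in visited:
                heapq.heappush(heap, (dist(v, j), v, j))
    return edges

print(prim_mst(len(nodes)))
```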
Procedia PDF Downloads 497
26536 In vitro Assessment of Bioactive Properties and Dose-Dependent Antioxidant Activities of Commercial Grape Cultivars in Taiwan
Authors: Kandi Sridhar, Charles Albert Linton
Abstract:
Grapes are excellent sources of bioactive compounds, which have been suggested to be responsible for lowering the risk of chronic diseases. Fresh and freeze-dried extracts of Kyoho and Jubilee, commercial grape varieties available in Taiwan and attractive for their quality berries, were investigated for their total phenolic and total flavonoid contents and the related dose-dependent antioxidant properties using various in vitro assays. The efficiency of the extraction yield ranged from 7.10% to 25.53% (w/w), depending on the solvent used. Fresh samples of Kyoho and Jubilee exhibited total polyphenolic contents of 351.56 ± 23.08 and 328.67 ± 16.54 µg GAE/mL, respectively, whereas the Kyoho freeze-dried methanol:water extracts contained good levels of total flavonoids (4767.82 ± 22.20 µg QE/mL). Kyoho and Jubilee freeze-dried extracts exhibited the highest total flavonoid contents. There was a weak correlation between the total phenolic and flavonoid assays (r = -0.05, R² = 0.02, p > 0.05). Kyoho fresh and freeze-dried samples showed DPPH scavenging activity (11.51–77.82%), superoxide scavenging activity (33.61–81.95%), and total antioxidant inhibition (92.01–99.28%), respectively. Total flavonoids were statistically correlated with EC50 DPPH scavenging radicals (r = 0.91, p < 0.01), EC50 nitric oxide (r = 0.25, p > 0.05), and EC50 lipid peroxidation radicals (r = 0.38, p > 0.05). These results suggested that the two commercial grape cultivars in Taiwan could be used as a good source of natural antioxidants. Thus, consumption of grapes as a source of antioxidants might lower the risk of chronic diseases. Moreover, future studies will investigate and develop a phenolic acid profile for the cultivars in Taiwan. Keywords: antioxidants, EC50 radical scavenging activity, grape cultivars, total phenolics
Procedia PDF Downloads 178
26535 Photocatalytic Degradation of Naproxen in Water under Solar Irradiation over NiFe₂O₄ Nanoparticle System
Authors: H. Boucheloukh, S. Rouissa, N. Aoun, M. Beloucifa, T. Sehili, F. Parrino, V. Loddo
Abstract:
To optimize water purification and wastewater treatment by heterogeneous photocatalysis, we used NiFe₂O₄ as a catalyst and solar irradiation as a source of energy. In this context, an organic substance present in many industrial effluents was chosen: naproxen ((S)-6-methoxy-α-methyl-2-naphthaleneacetic acid, or 2-(6-methoxynaphthalenyl)propanoic acid), a non-steroidal anti-inflammatory drug. The main objective of this study is to degrade naproxen with an iron and nickel catalyst. The degradation of this organic pollutant by nickel ferrite has been studied in a heterogeneous aqueous medium, together with the various factors influencing photocatalysis, such as the concentration of the material and the acidity of the medium. The photocatalytic activity was followed by HPLC-UV and UV-Vis spectroscopy. A first-order kinetic model appropriately fitted the experimental data. The degradation of naproxen was also studied in the presence of H₂O₂ as well as in an aqueous solution. The new hetero-system NiFe₂O₄/oxalic acid is also discussed. The fastest naproxen degradation was obtained with NiFe₂O₄/H₂O₂. In the first place, we detailed the characteristics of the NiFe₂O₄ material, which was synthesized by the sol-gel method, using various analytical techniques: UV-visible spectrophotometry, X-ray diffraction, FTIR, cyclic voltammetry, and glow discharge optical emission spectroscopy. Keywords: naproxen, nickelate, photocatalysis, oxalic acid
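The first-order kinetic treatment mentioned above can be reproduced with a few lines of Python; the concentration-time values below are illustrative, not data from the study:

```python
import numpy as np

# Hypothetical naproxen concentrations (mg/L) sampled during solar irradiation
t = np.array([0, 15, 30, 45, 60, 90, 120])        # minutes
C = np.array([10.0, 7.8, 6.1, 4.9, 3.8, 2.4, 1.5])

# Pseudo-first-order model: ln(C0/C) = k_app * t  ->  slope of a linear fit gives k_app
k_app, _ = np.polyfit(t, np.log(C[0] / C), 1)
half_life = np.log(2) / k_app
print(f"k_app = {k_app:.4f} min^-1, t1/2 = {half_life:.1f} min")
```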
Procedia PDF Downloads 210
26534 Status and Results from EXO-200
Authors: Ryan Maclellan
Abstract:
EXO-200 has provided one of the most sensitive searches for neutrinoless double-beta decay utilizing 175 kg of enriched liquid xenon in an ultra-low background time projection chamber. This detector has demonstrated excellent energy resolution and background rejection capabilities. Using the first two years of data, EXO-200 has set a limit of 1.1x10^25 years at 90% C.L. on the neutrinoless double-beta decay half-life of Xe-136. The experiment has experienced a brief hiatus in data taking during a temporary shutdown of its host facility: the Waste Isolation Pilot Plant. EXO-200 expects to resume data taking in earnest this fall with upgraded detector electronics. Results from the analysis of EXO-200 data and an update on the current status of EXO-200 will be presented.Keywords: double-beta, Majorana, neutrino, neutrinoless
Procedia PDF Downloads 414
26533 Methodology for the Determination of Triterpenic Compounds in Apple Extracts
Authors: Mindaugas Liaudanskas, Darius Kviklys, Kristina Zymonė, Raimondas Raudonis, Jonas Viškelis, Norbertas Uselis, Pranas Viškelis, Valdimaras Janulis
Abstract:
Apples are among the most commonly consumed fruits in the world. Based on data from the year 2014, approximately 84.63 million tons of apples are grown per annum. Apples are widely used in food industry to produce various products and drinks (juice, wine, and cider); they are also used unprocessed. Apples in human diet are an important source of different groups of biological active compounds that can positively contribute to the prevention of various diseases. They are a source of various biologically active substances – especially vitamins, organic acids, micro- and macro-elements, pectins, and phenolic, triterpenic, and other compounds. Triterpenic compounds, which are characterized by versatile biological activity, are the biologically active compounds found in apples that are among the most promising and most significant for human health. A specific analytical procedure including sample preparation and High Performance Liquid Chromatography (HPLC) analysis was developed, optimized, and validated for the detection of triterpenic compounds in the samples of different apples, their peels, and flesh from widespread apple cultivars 'Aldas', 'Auksis', 'Connel Red', 'Ligol', 'Lodel', and 'Rajka' grown in Lithuanian climatic conditions. The conditions for triterpenic compound extraction were optimized: the solvent of the extraction was 100% (v/v) acetone, and the extraction was performed in an ultrasound bath for 10 min. Isocratic elution (the eluents ratio being 88% (solvent A) and 12% (solvent B)) for a rapid separation of triterpenic compounds was performed. The validation of the methodology was performed on the basis of the ICH recommendations. The following characteristics of validation were evaluated: the selectivity of the method (specificity), precision, the detection and quantitation limits of the analytes, and linearity. The obtained parameters values confirm suitability of methodology to perform analysis of triterpenic compounds. Using the optimised and validated HPLC technique, four triterpenic compounds were separated and identified, and their specificity was confirmed. These compounds were corosolic acid, betulinic acid, oleanolic acid, and ursolic acid. Ursolic acid was the dominant compound in all the tested apple samples. The detected amount of betulinic acid was the lowest of all the identified triterpenic compounds. The greatest amounts of triterpenic compounds were detected in whole apple and apple peel samples of the 'Lodel' cultivar, and thus apples and apple extracts of this cultivar are potentially valuable for use in medical practice, for the prevention of various diseases, for adjunct therapy, for the isolation of individual compounds with a specific biological effect, and for the development and production of dietary supplements and functional food enriched in biologically active compounds. Acknowledgements. This work was supported by a grant from the Research Council of Lithuania, project No. MIP-17-8.Keywords: apples, HPLC, triterpenic compounds, validation
Procedia PDF Downloads 173
26532 Remaining Useful Life (RUL) Assessment Using Progressive Bearing Degradation Data and ANN Model
Authors: Amit R. Bhende, G. K. Awari
Abstract:
Remaining useful life (RUL) prediction is one of key technologies to realize prognostics and health management that is being widely applied in many industrial systems to ensure high system availability over their life cycles. The present work proposes a data-driven method of RUL prediction based on multiple health state assessment for rolling element bearings. Bearing degradation data at three different conditions from run to failure is used. A RUL prediction model is separately built in each condition. Feed forward back propagation neural network models are developed for prediction modeling.Keywords: bearing degradation data, remaining useful life (RUL), back propagation, prognosis
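A minimal sketch of the feed-forward back-propagation regression step; the features and RUL targets below are synthetic stand-ins for the condition-specific run-to-failure bearing data used in the study:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic degradation features (e.g., RMS, kurtosis of vibration) and RUL targets
rng = np.random.default_rng(0)
X = rng.random((500, 4))
rul = 1000 * (1 - X[:, 0]) + rng.normal(0, 20, 500)   # assumed degradation trend

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),  # feed-forward net trained by back-propagation
)
model.fit(X[:400], rul[:400])
print(model.predict(X[400:405]).round(0))             # predicted remaining useful life
```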
Procedia PDF Downloads 436
26531 Spatio-Temporal Data Mining with Association Rules for Lake Van
Authors: Tolga Aydin, M. Fatih Alaeddinoğlu
Abstract:
People, throughout history, have made estimates and inferences about the future by using their past experiences. Developing information technologies and improvements in database management systems make it possible to extract useful information from the knowledge in hand for strategic decisions. Therefore, different methods have been developed. Data mining by association rule learning is one such method. The Apriori algorithm, one of the well-known association rule learning algorithms, is not commonly used on spatio-temporal data sets. However, it is possible to embed time and space features into the data sets and make the Apriori algorithm a suitable data mining technique for learning spatio-temporal association rules. Lake Van, the largest lake of Turkey, is a closed basin. This feature causes the volume of the lake to increase or decrease as a result of changes in the amount of water it holds. In this study, the evaporation, humidity, lake altitude, amount of rainfall and temperature parameters recorded in the Lake Van region throughout the years are used by the Apriori algorithm, and a spatio-temporal data mining application is developed to identify overflows and newly-formed soil regions (underflows) occurring in the coastal parts of Lake Van. Identifying the possible reasons for overflows and underflows may be used to alert the experts to take precautions and make the necessary investments. Keywords: apriori algorithm, association rules, data mining, spatio-temporal data
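A small sketch of how discretized time and space features can be embedded as items and mined with Apriori; the items, thresholds, and the use of the mlxtend implementation are illustrative assumptions:

```python
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Each row is one observation period with discretized spatio-temporal items
records = [
    ["rain_high", "evap_low", "temp_low", "spring", "overflow"],
    ["rain_high", "evap_low", "temp_low", "spring", "overflow"],
    ["rain_low", "evap_high", "temp_high", "summer", "underflow"],
    ["rain_low", "evap_high", "temp_high", "summer", "underflow"],
    ["rain_high", "evap_low", "temp_low", "winter", "overflow"],
]
# One-hot encode the transactions for the Apriori implementation
onehot = pd.get_dummies(pd.DataFrame(records).stack()).groupby(level=0).max().astype(bool)

itemsets = apriori(onehot, min_support=0.4, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.8)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```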
Procedia PDF Downloads 374
26530 Process Data-Driven Representation of Abnormalities for Efficient Process Control
Authors: Hyun-Woo Cho
Abstract:
Unexpected operational events or abnormalities of industrial processes have a serious impact on the quality of final product of interest. In terms of statistical process control, fault detection and diagnosis of processes is one of the essential tasks needed to run the process safely. In this work, nonlinear representation of process measurement data is presented and evaluated using a simulation process. The effect of using different representation methods on the diagnosis performance is tested in terms of computational efficiency and data handling. The results have shown that the nonlinear representation technique produced more reliable diagnosis results and outperforms linear methods. The use of data filtering step improved computational speed and diagnosis performance for test data sets. The presented scheme is different from existing ones in that it attempts to extract the fault pattern in the reduced space, not in the original process variable space. Thus this scheme helps to reduce the sensitivity of empirical models to noise.Keywords: fault diagnosis, nonlinear technique, process data, reduced spaces
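The paper does not name its nonlinear representation technique, so the sketch below uses kernel PCA as a representative nonlinear method contrasted with linear PCA; the simulated normal and faulty data are assumptions:

```python
import numpy as np
from sklearn.decomposition import KernelPCA, PCA

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, (300, 10))                          # normal operating data
faulty = rng.normal(0, 1, (50, 10)) + np.r_[3, np.zeros(9)]   # shifted fault pattern

# Nonlinear (kernel) representation vs. a linear one, both fitted on normal data only
kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.1).fit(normal)
pca = PCA(n_components=3).fit(normal)

print("kernel scores (fault mean):", kpca.transform(faulty).mean(axis=0).round(2))
print("linear scores (fault mean):", pca.transform(faulty).mean(axis=0).round(2))
```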
Procedia PDF Downloads 247
26529 Estimating Knowledge Flow Patterns of Business Method Patents with a Hidden Markov Model
Authors: Yoonjung An, Yongtae Park
Abstract:
Knowledge flows are a critical source of faster technological progress and stouter economic growth. Knowledge flows have been accelerated dramatically with the establishment of a patent system in which each patent is required by law to disclose sufficient technical information for the invention to be recreated. Patent analysis, thus, has been widely used to help investigate technological knowledge flows. However, the existing research is limited in terms of both subject and approach. Particularly, in most of the previous studies, business method (BM) patents were not covered although they are important drivers of knowledge flows as other patents. In addition, these studies usually focus on the static analysis of knowledge flows. Some use approaches that incorporate the time dimension, yet they still fail to trace a true dynamic process of knowledge flows. Therefore, we investigate dynamic patterns of knowledge flows driven by BM patents using a Hidden Markov Model (HMM). An HMM is a popular statistical tool for modeling a wide range of time series data, with no general theoretical limit in regard to statistical pattern classification. Accordingly, it enables characterizing knowledge patterns that may differ by patent, sector, country and so on. We run the model in sets of backward citations and forward citations to compare the patterns of knowledge utilization and knowledge dissemination.Keywords: business method patents, dynamic pattern, Hidden-Markov Model, knowledge flow
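A minimal sketch of fitting an HMM to one patent's citation series; the yearly counts, the three-state interpretation, and the Gaussian emission model are assumptions (the study runs the model separately on sets of backward and forward citations):

```python
import numpy as np
from hmmlearn import hmm

# Hypothetical yearly citation counts for one business method patent
citations = np.array([[0], [1], [1], [3], [6], [8], [7], [4], [2], [1]])

# Gaussian HMM with 3 latent "knowledge-flow" states (e.g., latent, diffusing, declining)
model = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=200, random_state=0)
model.fit(citations)
states = model.predict(citations)
print("decoded state sequence:", states)
print("state means:", model.means_.ravel().round(1))
```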
Procedia PDF Downloads 328
26528 Text-to-Speech in Azerbaijani Language via Transfer Learning in a Low Resource Environment
Authors: Dzhavidan Zeinalov, Bugra Sen, Firangiz Aslanova
Abstract:
Most text-to-speech models cannot operate well in low-resource languages and require a great amount of high-quality training data to be considered good enough. Yet, with the improvements made in ASR systems, it is now easier than ever to collect data for the design of custom text-to-speech models. This paper outlines our work on using an ASR model to collect data to build a viable text-to-speech system for one of the leading financial institutions of Azerbaijan. NVIDIA's implementation of the Tacotron 2 model was utilized along with the HiFiGAN vocoder. As for the training, the model was first trained with high-quality audio data collected from the Internet and then fine-tuned on the bank's single-speaker call center data. The results were then evaluated by 50 different listeners and obtained a mean opinion score of 4.17, showing that our method is indeed viable. With this, we have successfully designed the first text-to-speech model in Azerbaijani and publicly shared 12 hours of audiobook data for everyone to use. Keywords: Azerbaijani language, HiFiGAN, Tacotron 2, text-to-speech, transfer learning, whisper
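A minimal sketch of the ASR-driven data collection step; the file names are hypothetical, it assumes the openai-whisper package and ffmpeg are installed, and it writes LJSpeech-style path|transcript pairs that a Tacotron 2 fine-tuning recipe could consume:

```python
import csv
import whisper  # openai-whisper; assumes ffmpeg is available

model = whisper.load_model("large")                   # multilingual model covering Azerbaijani
clips = ["call_0001.wav", "call_0002.wav"]            # hypothetical single-speaker call-centre clips

with open("metadata.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f, delimiter="|")
    for clip in clips:
        result = model.transcribe(clip, language="az")
        writer.writerow([clip, result["text"].strip()])  # "path|transcript" rows for fine-tuning
```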
Procedia PDF Downloads 44