Search results for: continuous data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26706

25416 Investigations on Pyrolysis Model for Radiatively Dominant Diesel Pool Fire Using Fire Dynamic Simulator

Authors: Siva K. Bathina, Sudheer Siddapureddy

Abstract:

Pool fires form when a flammable liquid accidentally spills on the ground or water and ignites. A pool fire is a buoyancy-driven diffusion flame. There have been many pool fire accidents during the processing, handling, and storage of liquid fuels in the chemical and oil industries. Such accidents cause enormous damage to property as well as loss of lives. Pool fires are complex in nature due to the strong interaction among combustion, heat and mass transfer, and pyrolysis at the fuel surface. Moreover, the experimental study of such large, complex fires involves fire safety issues and practical difficulties. In the present work, large eddy simulations are performed to study such complex fire scenarios using the Fire Dynamics Simulator (FDS). A 1 m diesel pool fire is considered, diesel being the fuel most commonly involved in fire accidents. Fire simulations are performed with two different boundary conditions: in one, the fuel is in the liquid state and a pyrolysis model is invoked; in the other, the fuel is assumed to be initially in the vapor state and the mass loss rate is prescribed. A domain of size 11.2 m × 11.2 m × 7.28 m with a uniform structured grid is chosen for the numerical simulations. A grid sensitivity analysis is performed, and a non-dimensional grid size of 12, corresponding to an 8 cm grid, is adopted. Flame properties such as mass burning rate, irradiance, and the time-averaged axial flame temperature profile are predicted. The predicted steady-state mass burning rate is 40 g/s, within the uncertainty limits of previously reported experimental data (39.4 g/s). The profile of irradiance along the height at a distance from the fire is broadly in line with the experimental data, though the location of the maximum irradiance is shifted upward. This may be due to the lack of sophisticated models for species transport along with combustion and radiation in the continuous zone. Furthermore, the axial temperatures are not predicted well, for either boundary condition, in any of the zones. The present study shows that the existing models are not sufficient for modeling blended fuels like diesel. The predictions depend strongly on the experimental values of the soot yield. Future experiments are necessary for generalizing the soot yield for different fires.
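The non-dimensional grid size cited above is the ratio of the characteristic fire diameter D* to the cell size. As a rough sketch (the heat of combustion and ambient properties below are our own illustrative assumptions, not values from the paper):

```python
# Sketch: characteristic fire diameter D* and the non-dimensional grid
# resolution D*/dx commonly used to judge FDS mesh quality.
# Assumed values: diesel heat of combustion ~44.4 MJ/kg, standard ambient air.

def characteristic_fire_diameter(q_kw, rho=1.204, cp=1.005, t_inf=293.0, g=9.81):
    """D* = (Q / (rho * cp * T_inf * sqrt(g)))**(2/5), with Q in kW."""
    return (q_kw / (rho * cp * t_inf * g ** 0.5)) ** 0.4

q = 0.040 * 44400.0              # 40 g/s burning rate -> heat release rate, kW
d_star = characteristic_fire_diameter(q)
resolution = d_star / 0.08       # non-dimensional grid size for an 8 cm grid
print(round(d_star, 2), round(resolution, 1))
```

With these assumed inputs the ratio lands in the same range as the paper's value of 12; the exact number depends on the effective heat of combustion and radiative losses used.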

Keywords: burning rate, fire accidents, fire dynamic simulator, pyrolysis

Procedia PDF Downloads 196
25415 Congenital Diaphragmatic Hernia Outcomes in a Low-Volume Center

Authors: Michael Vieth, Aric Schadler, Hubert Ballard, J. A. Bauer, Pratibha Thakkar

Abstract:

Introduction: Congenital diaphragmatic hernia (CDH) is a condition characterized by the herniation of abdominal contents into the thoracic cavity, requiring postnatal surgical repair. Previous literature suggests improved CDH outcomes at high-volume regional referral centers compared to low-volume centers. The purpose of this study was to examine CDH outcomes at Kentucky Children’s Hospital (KCH), a low-volume center, compared to the Congenital Diaphragmatic Hernia Study Group (CDHSG). Methods: A retrospective chart review was performed at KCH from 2007-2019 for neonates with CDH, subdivided into two cohorts: those requiring ECMO therapy and those not. Basic demographic data and measures of mortality and morbidity, including ventilator days and length of stay, were compared to the CDHSG. For the ECMO cohort, measures of morbidity including duration of ECMO, clinical bleeding, intracranial hemorrhage, sepsis, need for continuous renal replacement therapy (CRRT), need for sildenafil at discharge, timing of surgical repair, and total ventilator days were collected. Statistical analysis was performed using IBM SPSS Statistics version 28; one-sample t-tests and one-sample Wilcoxon signed-rank tests were utilized as appropriate. Results: There were a total of 27 neonatal patients with CDH at KCH from 2007-2019; 9 of the 27 required ECMO therapy. Birth weight and gestational age were similar between KCH and the CDHSG (2.99 kg vs 2.92 kg, p = 0.655; 37.0 weeks vs 37.4 weeks, p = 0.51). About half of the patients were inborn in both cohorts (52% vs 56%, p = 0.676). The KCH cohort had significantly more Caucasian patients (96% vs 55%, p < 0.001). Unadjusted mortality was similar in both groups (KCH 70% vs CDHSG 72%, p = 0.857). Using ECMO utilization (KCH 78% vs CDHSG 52%, p = 0.118) and need for surgical repair (KCH 95% vs CDHSG 85%, p = 0.060) as proxies for severity, mortality was comparable in both groups. No significant difference was noted for pulmonary outcomes such as average ventilator days (KCH 43.2 vs. CDHSG 17.3, p = 0.078) and home oxygen dependency (KCH 44% vs. CDHSG 24%, p = 0.108). Average length of hospital stay at KCH was similar to the CDHSG (64.4 vs 49.2, p = 1.000). Conclusion: Our study demonstrates that outcomes in CDH patients are independent of a center’s case-volume status. Management of CDH with a standardized approach in a low-volume center can yield similar outcomes. These data support the treatment of patients with CDH at low-volume centers as opposed to transferring them to higher-volume centers.
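The one-sample tests used here compare a small single-center sample against a published reference mean. A minimal sketch with invented numbers (these are not the KCH or CDHSG data):

```python
# Sketch: one-sample t-test of a hypothetical center's ventilator days
# against a published reference mean, as done in this study's analysis.
from scipy import stats

center_ventilator_days = [12, 55, 31, 78, 40, 22, 65, 38, 48]  # hypothetical
reference_mean = 17.3                                           # published value

t_stat, p_value = stats.ttest_1samp(center_ventilator_days, reference_mean)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

For skewed samples, `scipy.stats.wilcoxon` on the differences from the reference value is the nonparametric counterpart mentioned in the abstract.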

Keywords: ECMO, case volume, congenital diaphragmatic hernia, congenital diaphragmatic hernia study group, neonate

Procedia PDF Downloads 96
25414 The Impact of Sedimentary Heterogeneity on Oil Recovery in Basin-plain Turbidite: An Outcrop Analogue Simulation Case Study

Authors: Bayonle Abiola Omoniyi

Abstract:

In turbidite reservoirs with volumetrically significant thin-bedded turbidites (TBTs), thin-pay intervals may be underestimated during calculation of reserve volume due to the poor vertical resolution of conventional well logs. This paper demonstrates the strong control of bed-scale sedimentary heterogeneity on oil recovery using six facies distribution scenarios generated from outcrop data from the Eocene Itzurun Formation, Basque Basin (northern Spain). The variable net sand volume in these scenarios serves as a primary source of sedimentary heterogeneity, impacting sandstone-mudstone ratio, sand and shale geometry and dimensions, lateral and vertical variations in bed thickness, and attribute indices. These attributes provided input parameters for modeling the scenarios. The models are 20 m (65.6 ft) thick. Simulation of the scenarios reveals that oil production is markedly enhanced where the degree of sedimentary heterogeneity and the resultant permeability contrast are low, as exemplified by Scenarios 1, 2, and 3. In these scenarios, bed architecture encourages better apparent vertical connectivity across intervals of laterally continuous beds. By contrast, the low net-to-gross Scenarios 4, 5, and 6 have rapidly declining oil production rates and higher water cut, with more oil effectively trapped in low-permeability layers. These scenarios may possess enough lateral connectivity to enable injected water to sweep oil to the production well; such sweep, however, is achieved at the cost of high water production. It is therefore imperative to consider not only the net-to-gross threshold but also the facies stacking pattern and related attribute indices to better understand how to manage water production effectively for optimum oil recovery from basin-plain reservoirs.

Keywords: architecture, connectivity, modeling, turbidites

Procedia PDF Downloads 24
25413 A Machine Learning Model for Dynamic Prediction of Chronic Kidney Disease Risk Using Laboratory Data, Non-Laboratory Data, and Metabolic Indices

Authors: Amadou Wurry Jallow, Adama N. S. Bah, Karamo Bah, Shih-Ye Wang, Kuo-Chung Chu, Chien-Yeh Hsu

Abstract:

Chronic kidney disease (CKD) is a major public health challenge with high prevalence, rising incidence, and serious adverse consequences. Developing effective risk prediction models is a cost-effective approach to predicting and preventing CKD complications. This study aimed to develop an accurate machine learning model that can dynamically identify individuals at risk of CKD using various kinds of diagnostic data, with or without laboratory data, at different follow-up points. Creatinine is a key component used to predict CKD; such models would enable affordable and effective screening for CKD even with incomplete patient data, such as the absence of creatinine testing. This retrospective cohort study included data on 19,429 adults provided by a private research institute and screening laboratory in Taiwan, gathered between 2001 and 2015. Univariate Cox proportional hazards regression analyses were performed to determine the variables with high prognostic value for predicting CKD. We then identified interacting variables and grouped them according to diagnostic data categories. Our models used three types of data gathered at three points in time: non-laboratory data, laboratory data, and metabolic indices. Next, we used subgroups of variables within each category to train two machine learning models (Random Forest and XGBoost). Our machine learning models can dynamically discriminate individuals at risk of developing CKD. All the models performed well using all three kinds of data, with or without laboratory data. Using only non-laboratory data (such as age, sex, body mass index (BMI), and waist circumference), both models predict CKD as accurately as models using laboratory and metabolic indices data. Our machine learning models demonstrate the use of different categories of diagnostic data for CKD prediction, with or without laboratory data. The models are simple to use and flexible because they work even with incomplete data and can be applied in any clinical setting, including settings where laboratory data are difficult to obtain.
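A minimal sketch of the non-laboratory model idea: a Random Forest trained only on age, sex, BMI, and waist circumference. Synthetic data stands in for the cohort; the feature generation and label rule below are our own assumptions for illustration.

```python
# Sketch: Random Forest on non-laboratory features only, as in the study's
# "no lab data" setting. Data are synthetic; the risk rule is invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.uniform(20, 80, n),     # age
    rng.integers(0, 2, n),      # sex
    rng.normal(25, 4, n),       # BMI
    rng.normal(85, 12, n),      # waist circumference (cm)
])
# Invented label: risk loosely increasing with age and BMI, plus noise.
risk = 0.03 * X[:, 0] + 0.05 * X[:, 2] + rng.normal(0, 1, n)
y = (risk > np.median(risk)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = model.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

The study's actual pipeline additionally used Cox regression for variable selection and XGBoost as a second learner.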

Keywords: chronic kidney disease, glomerular filtration rate, creatinine, novel metabolic indices, machine learning, risk prediction

Procedia PDF Downloads 105
25412 Road Accident Big Data Mining and Visualization Using Support Vector Machines

Authors: Usha Lokala, Srinivas Nowduri, Prabhakar K. Sharma

Abstract:

Useful information has been extracted from road accident data in the United Kingdom (UK) using data analytics methods, with the aim of avoiding possible accidents in rural and urban areas. The analysis makes use of several methodologies, such as data integration, support vector machines (SVM), correlation machines, and multinomial goodness measures. The entire datasets were imported from the UK traffic department with due permission. The information extracted from these huge datasets forms a basis for several predictions, which in turn help avoid unnecessary memory lapses. Since the data is expected to grow continuously over time, this work primarily proposes a new framework model which can be trained on, and adapt itself to, new data and make accurate predictions. This work also throws some light on the use of the SVM methodology for text classifiers built from the obtained traffic data. Finally, it emphasizes the uniqueness and adaptability of the SVM methodology for this kind of research work.
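A minimal sketch of an SVM classifier on accident records. The feature choice (speed limit, hour of day, urban/rural flag) and the synthetic severity rule are our own assumptions, not the UK dataset's actual fields:

```python
# Sketch: SVM severity classification on synthetic accident-like records.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 1000
speed_limit = rng.choice([30, 40, 60, 70], n)       # mph, assumed feature
hour = rng.integers(0, 24, n)
urban = (speed_limit <= 40).astype(int)
X = np.column_stack([speed_limit, hour, urban])
# Invented label: higher speed limits skew toward severe accidents.
y = (speed_limit + rng.normal(0, 15, n) > 50).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"accuracy: {acc:.2f}")
```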

Keywords: support vector machines (SVM), machine learning (ML), Department for Transport (DfT)

Procedia PDF Downloads 274
25411 Effect of Electromagnetic Fields on Protein Extraction from Shrimp By-Products for Electrospinning Process

Authors: Guido Trautmann-Sáez, Mario Pérez-Won, Vilbett Briones, María José Bugueño, Gipsy Tabilo-Munizaga, Luis Gonzáles-Cavieres

Abstract:

Shrimp by-products are a valuable source of protein; however, traditional protein extraction methods have limited efficiency. In this work, protein was extracted from shrimp (Pleuroncodes monodon) industrial by-products with the assistance of ohmic heating (OH), microwave (MW), and pulsed electric field (PEF) treatments. Extraction was performed by a chemical method (using 2 M NaOH and HCl) assisted with OH, MW, and PEF in a continuous flow system (5 ml/s), followed by protein determination, differential scanning calorimetry (DSC), and Fourier-transform infrared (FTIR) spectroscopy. Results indicate improvements in protein extraction efficiency of 19.25% (PEF), 3.65% (OH), and 28.19% (MW). The most efficient method was selected for the electrospinning process and fiber production.

Keywords: electrospinning process, emerging technology, protein extraction, shrimp by-products

Procedia PDF Downloads 90
25410 A Relational Data Base for Radiation Therapy

Authors: Raffaele Danilo Esposito, Domingo Planes Meseguer, Maria Del Pilar Dorado Rodriguez

Abstract:

As far as we know, no commercial solution is yet available that allows managing, in an open way configurable to user needs, the huge amount of data generated in a modern Radiation Oncology Department. Currently available information management systems are mainly focused on Record & Verify and clinical data, and only to a small extent on physical data; this results in a partial and limited use of the available information. In the present work we describe the implementation at our department of a centralized information management system based on a web server. Our system manages both information generated during patient planning and treatment, and information of general interest for the whole department (e.g., treatment protocols, quality assurance protocols, etc.). Our objective is to be able to analyze all the available data in a simple and efficient way and thus obtain quantitative evaluations of our treatments, allowing us to improve our workflow and protocols. To this end we have implemented a relational database that lets us use all the available information in a practical and efficient way. As always, we use only license-free software.
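A minimal sketch of the relational idea: patient and plan data in linked tables, queryable together for quantitative evaluation. The table and column names below are invented for illustration; the department's actual schema is not described here.

```python
# Sketch: linked patient/plan tables in SQLite, with a cross-table
# aggregate query of the kind a quantitative treatment evaluation needs.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE patient (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE plan (
    id INTEGER PRIMARY KEY,
    patient_id INTEGER REFERENCES patient(id),
    technique TEXT,
    prescribed_dose_gy REAL
);
""")
con.execute("INSERT INTO patient VALUES (1, 'DOE^JOHN')")
con.execute("INSERT INTO plan VALUES (1, 1, 'VMAT', 60.0)")

# Quantitative evaluation across tables: mean prescribed dose per technique.
rows = con.execute("""
    SELECT plan.technique, AVG(plan.prescribed_dose_gy)
    FROM plan JOIN patient ON plan.patient_id = patient.id
    GROUP BY plan.technique
""").fetchall()
print(rows)
```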

Keywords: information management system, radiation oncology, medical physics, free software

Procedia PDF Downloads 242
25409 A Study of Safety of Data Storage Devices of Graduate Students at Suan Sunandha Rajabhat University

Authors: Komol Phaisarn, Natcha Wattanaprapa

Abstract:

This survey research aimed to study the safety of the data storage devices used by graduate students of academic year 2013 at Suan Sunandha Rajabhat University. Data were collected by a questionnaire on the safety of data storage devices according to the CIA (confidentiality, integrity, availability) principle. A sample of 81 was drawn from the population by purposive sampling. The results show that most of the graduate students use handy drives (USB flash drives) to store their data, and that the safety level of the devices is good.

Keywords: security, safety, storage devices, graduate students

Procedia PDF Downloads 353
25408 Advanced Electric Motor Design Using Hollow Conductors for Maximizing Power Density and Degree of Efficiency

Authors: Michael Naderer, Manuel Hartong, Raad Al-Kinani

Abstract:

The use of hollow conductors is well known in electric generators at the large MW scale; their application in small-scale motors between 50 and 200 kW is new. The latest results from the practical application and setup of such machines show that power density can be raised significantly and that the common problem of motor derating is prevented. Furthermore, new design dimensions can be realised, as continuous current densities up to 75 A/mm² are achievable. This paper shows the results of applying hollow conductors to a motor design for automotive traction machines, comparing conventional cooling with hollow-conductor cooling.

Keywords: degree of efficiency, electric motor design, hollow conductors, power density

Procedia PDF Downloads 197
25407 Simulation of a Cost Model Response Requests for Replication in Data Grid Environment

Authors: Kaddi Mohammed, A. Benatiallah, D. Benatiallah

Abstract:

The data grid is a technology that faces a full range of new challenges, such as the heterogeneity and availability of various geographically distributed resources, fast data access, minimized latency, and fault tolerance. Researchers interested in this technology address problems of such systems as they relate to industry, including task scheduling, load balancing, and replication. The latter is an effective solution for achieving good performance in terms of data access and grid resource usage, and better data availability at lower cost. In a system with replication, a coherence protocol is used to impose some degree of synchronization between the various copies and some order on updates. In this project, we present an approach for placing replicas that minimizes the response cost of read and write requests, and we implement our model in a simulation environment. The placement techniques are based on a cost model which depends on several factors, such as bandwidth, data size, and storage nodes.
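The cost model can be sketched as follows. The weighting (link latency plus transfer time, filtered by free storage) is our own simplified assumption; the paper does not give a closed formula here.

```python
# Sketch: pick the replica node minimizing an assumed response-cost model
# based on bandwidth, data size, and available storage.

def response_cost(size_mb, bandwidth_mbps, latency_s=0.05):
    """Estimated time (s) to ship a replica of `size_mb` over a link."""
    return latency_s + (size_mb * 8) / bandwidth_mbps

def best_replica_node(size_mb, nodes):
    """Pick the (name, bandwidth_mbps, free_mb) node with minimal cost."""
    usable = [n for n in nodes if n[2] >= size_mb]   # enough free storage
    return min(usable, key=lambda n: response_cost(size_mb, n[1]))

nodes = [("site-A", 100, 500), ("site-B", 1000, 50), ("site-C", 1000, 500)]
print(best_replica_node(200, nodes)[0])
```

Here site-B is fast but lacks storage, so the model selects site-C over the slower site-A.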

Keywords: response time, query, consistency, bandwidth, storage capacity, CERN

Procedia PDF Downloads 271
25406 Prompt Design for Code Generation in Data Analysis Using Large Language Models

Authors: Lu Song Ma Li Zhi

Abstract:

With the rapid advancement of artificial intelligence technology, large language models (LLMs) have become a milestone in the field of natural language processing, demonstrating remarkable capabilities in semantic understanding, intelligent question answering, and text generation. These models are gradually penetrating various industries, particularly showcasing significant application potential in the data analysis domain. However, retraining or fine-tuning these models requires substantial computational resources and ample downstream task datasets, which poses a significant challenge for many enterprises and research institutions. Without modifying the internal parameters of the large models, prompt engineering techniques can rapidly adapt these models to new domains. This paper proposes a prompt design strategy aimed at leveraging the capabilities of large language models to automate the generation of data analysis code. By carefully designing prompts, data analysis requirements can be described in natural language, which the large language model can then understand and convert into executable data analysis code, thereby greatly enhancing the efficiency and convenience of data analysis. This strategy not only lowers the threshold for using large models but also significantly improves the accuracy and efficiency of data analysis. Our approach includes requirements for the precision of natural language descriptions, coverage of diverse data analysis needs, and mechanisms for immediate feedback and adjustment. Experimental results show that with this prompt design strategy, large language models perform exceptionally well in multiple data analysis tasks, generating high-quality code and significantly shortening the data analysis cycle. This method provides an efficient and convenient tool for the data analysis field and demonstrates the enormous potential of large language models in practical applications.
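A minimal sketch of the prompt-design idea: a template that turns a natural-language analysis request plus a schema description into a code-generation prompt. The template wording and fields below are our own illustration, not the paper's actual prompts.

```python
# Sketch: a reusable prompt template for LLM-driven data analysis
# code generation. Field names and requirements are illustrative.

PROMPT_TEMPLATE = """You are a data analyst. Write Python (pandas) code only.

Dataset columns: {columns}
Task: {task}

Requirements:
- Load the data from '{path}'.
- Store the final answer in a variable named `result`.
- Do not explain; output a single runnable code block.
"""

def build_prompt(task, columns, path):
    return PROMPT_TEMPLATE.format(task=task, columns=", ".join(columns), path=path)

prompt = build_prompt(
    task="Average monthly sales per region, sorted descending",
    columns=["region", "month", "sales"],
    path="sales.csv",
)
print(prompt)
```

The immediate-feedback mechanism the abstract mentions would wrap this in a loop: run the generated code, and on error append the traceback to the next prompt.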

Keywords: large language models, prompt design, data analysis, code generation

Procedia PDF Downloads 40
25405 Comparison of Different Methods to Produce Fuzzy Tolerance Relations for Rainfall Data Classification in the Region of Central Greece

Authors: N. Samarinas, C. Evangelides, C. Vrekos

Abstract:

The aim of this paper is the comparison of three different methods for producing fuzzy tolerance relations for rainfall data classification: the correlation coefficient, cosine amplitude, and max-min methods. The data were obtained from seven rainfall stations in the region of central Greece and refer to 20-year time series of monthly average rainfall height. The three methods were used to express these data as fuzzy relations. Each resulting fuzzy tolerance relation is transformed into an equivalence relation by max-min composition, for all three methods. From the equivalence relation, the rainfall stations were categorized and classified according to the degree of confidence. The classification shows the similarities among the rainfall stations; stations with high similarity can be used interchangeably in water resource management scenarios or to augment data from one to another. Due to the complexity of the calculations, it is important to find out which of the methods is computationally simpler and needs fewer compositions to give reliable results.
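One of the three methods can be sketched as follows: the cosine-amplitude relation between stations, followed by repeated max-min composition until the tolerance relation becomes an equivalence relation. The rainfall matrix is invented; the real inputs are the stations' 20-year monthly series.

```python
# Sketch: cosine-amplitude fuzzy tolerance relation and max-min
# composition to transitive closure (equivalence relation).
import numpy as np

def cosine_amplitude(X):
    """r_ij = |sum_k x_ik x_jk| / sqrt(sum_k x_ik^2 * sum_k x_jk^2)."""
    norms = np.sqrt((X ** 2).sum(axis=1))
    return np.abs(X @ X.T) / np.outer(norms, norms)

def max_min_composition(R):
    """(R o R)_ij = max_k min(R_ik, R_kj)."""
    return np.max(np.minimum(R[:, :, None], R[None, :, :]), axis=1)

X = np.array([[80.0, 60.0, 30.0],    # station 1: monthly rainfall features
              [78.0, 62.0, 28.0],    # station 2 (similar to station 1)
              [20.0, 90.0, 70.0]])   # station 3 (dissimilar)
R = cosine_amplitude(X)              # fuzzy tolerance relation
while True:                          # compose until transitive
    R2 = max_min_composition(R)
    if np.allclose(R2, R):
        break
    R = R2
print(np.round(R, 2))
```

Classification then follows by thresholding the equivalence relation at a chosen degree of confidence (a lambda-cut).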

Keywords: classification, fuzzy logic, tolerance relations, rainfall data

Procedia PDF Downloads 314
25404 Design and Fabrication of Pulse Detonation Engine Based on Numerical Simulation

Authors: Vishal Shetty, Pranjal Khasnis, Saptarshi Mandal

Abstract:

This work explores the design and fabrication of a fundamental pulse detonation engine (PDE) prototype on the basis of the pressure and temperature pulses obtained from numerical simulation. The PDE is an advanced propulsion system that utilizes detonation waves for thrust generation. PDEs use a fuel-air mixture ignited to create a supersonic detonation wave, resulting in rapid energy release, high pressures, and high temperatures. The operational cycle includes fuel injection, ignition, detonation, exhaust of combustion products, and purging of the chamber for the next cycle. This work presents the core operating principles of a PDE, highlighting its potential advantages over traditional jet engines that rely on continuous combustion. The design focuses on a straightforward, valve-controlled system for fuel and oxidizer injection into a detonation tube. The detonation was initiated using an electronically controlled spark plug or similar high-energy ignition source. Following the detonation, a purge valve was employed to expel the combusted gases and prepare the tube for the next cycle. Key design considerations include material selection for the detonation tube to withstand the high temperatures and pressures generated during detonation. Fabrication prioritized readily available machining methods to create a functional prototype. This work details the testing procedures for verifying the functionality of the PDE prototype, with emphasis on measuring thrust generation and capturing pressure data within the detonation tube. The numerical analysis presents a performance evaluation and potential areas for future design optimization.

Keywords: pulse detonation engine, ignition, detonation, combustion

Procedia PDF Downloads 20
25403 Advances of Image Processing in Precision Agriculture: Using Deep Learning Convolution Neural Network for Soil Nutrient Classification

Authors: Halimatu S. Abdullahi, Ray E. Sheriff, Fatima Mahieddine

Abstract:

Agriculture is essential to the continuous existence of human life, as humans depend directly on it for the production of food. The exponential rise in population calls for a rapid increase in food production, with the application of technology to reduce laborious work and maximize output. Technology can aid and improve agriculture in several ways, through pre-planning and post-harvest, by the use of computer vision and image processing: determining the soil nutrient composition; applying farm inputs such as fertilizers, herbicides, and water in the right amount, at the right time, and in the right place; weed detection; early detection of pests and diseases; etc. This is precision agriculture, which is thought to be the solution required to achieve our goals. There has been significant improvement in the areas of image processing and data processing, which had been a major challenge. A database of images is collected through remote sensing and analyzed, and a model is developed to determine the right treatment plans for different crop types and different regions. Features of images from vegetation need to be extracted, classified, segmented, and finally fed into the model. Different techniques have been applied to these processes, from neural networks, support vector machines, and fuzzy logic approaches to, most recently, the deep learning approach of convolutional neural networks, which generates excellent results for image classification. Here, a deep convolutional neural network is used to determine the soil nutrients required in a plantation for maximum production. The experimental results on the developed model yielded an average accuracy of 99.58%.
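The core operation behind such a network can be sketched in isolation: a single 2-D convolution of an image patch with a filter. The toy numbers below are ours; the actual network, training data, and the 99.58% result come from the paper's trained model, not this snippet.

```python
# Sketch: "valid" 2-D convolution (cross-correlation, as in most CNN
# libraries) of a toy image with a vertical-edge filter.
import numpy as np

def conv2d(image, kernel):
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)   # stand-in for a soil image
edge_filter = np.array([[1.0, 0.0, -1.0]] * 3)     # vertical-edge detector
features = conv2d(image, edge_filter)
print(features.shape)
```

In the full network, many such learned filters are stacked with nonlinearities and pooling before the final classification layer.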

Keywords: convolution, feature extraction, image analysis, validation, precision agriculture

Procedia PDF Downloads 316
25402 Customer Satisfaction and Effective HRM Policies: Customer and Employee Satisfaction

Authors: S. Anastasiou, C. Nathanailides

Abstract:

The purpose of this study is to examine the possible link between employee and customer satisfaction. The service provided by employees helps to build a good relationship with customers and can increase their loyalty. Published data on job satisfaction and indicators of customer service were gathered from relevant published works, which included data from five different countries. The reviewed data indicate a significant correlation between indicators of customer and employee satisfaction in the banking sector (Pearson correlation, R² = 0.52, p < 0.05), providing practical evidence that links these two parameters.

Keywords: job satisfaction, job performance, customer service, banks, human resources management

Procedia PDF Downloads 321
25401 Assessing the Knowledge, Awareness, and Factors Associated With Hypertension Among the Residents of Modeca District of Tiko, South West Region of Cameroon, in the Middle of Separatist Violence Since 2017

Authors: Franck Kem Acho

Abstract:

Disease trends have changed over the last few years, and the burden of non-communicable diseases is increasing day by day. Among all non-communicable diseases, hypertension is one of the leading causes of premature death and morbidity worldwide. A silent killer, it mostly affects people with no obvious symptoms. Beyond the heart, it also increases the risk of brain, kidney, and other diseases; nowadays it is a serious medical problem. Over a billion people, about 1 in 4 men and 1 in 5 women, have hypertension. In this case study, men and women aged 30-80 years with hypertension were identified in a remote community area; their health status was checked and monitored for one week, and health education was provided on the importance of regular health checkups alongside continuous medication adherence.

Keywords: hypertension, health status, health check up, health education

Procedia PDF Downloads 67
25400 Evaluation of Australian Open Banking Regulation: Balancing Customer Data Privacy and Innovation

Authors: Suman Podder

Abstract:

As Australian ‘Open Banking’ allows customers to share their financial data with accredited Third-Party Providers (‘TPPs’), it is necessary to evaluate whether the regulators have achieved the balance between protecting customer data privacy and promoting data-related innovation. Recognising the need to increase customers’ influence on their own data, and the benefits of data-related innovation, the Australian Government introduced ‘Consumer Data Right’ (‘CDR’) to the banking sector through Open Banking regulation. Under Open Banking, TPPs can access customers’ banking data that allows the TPPs to tailor their products and services to meet customer needs at a more competitive price. This facilitated access and use of customer data will promote innovation by providing opportunities for new products and business models to emerge and grow. However, the success of Open Banking depends on the willingness of the customers to share their data, so the regulators have augmented the protection of data by introducing new privacy safeguards to instill confidence and trust in the system. The dilemma in policymaking is that, on the one hand, lenient data privacy laws will help the flow of information, but at the risk of individuals’ loss of privacy, on the other hand, stringent laws that adequately protect privacy may dissuade innovation. Using theoretical and doctrinal methods, this paper examines whether the privacy safeguards under Open Banking will add to the compliance burden of the participating financial institutions, resulting in the undesirable effect of stifling other policy objectives such as innovation. The contribution of this research is three-fold. In the emerging field of customer data sharing, this research is one of the few academic studies on the objectives and impact of Open Banking in the Australian context. 
Additionally, Open Banking is still in the early stages of implementation, so this research traces the evolution of Open Banking through policy debates regarding the desirability of customer data-sharing. Finally, the research focuses not only on the customers’ data privacy and juxtaposes it with another important objective of promoting innovation, but it also highlights the critical issues facing the data-sharing regime. This paper argues that while it is challenging to develop a regulatory framework for protecting data privacy without impeding innovation and jeopardising yet unknown opportunities, data privacy and innovation promote different aspects of customer welfare. This paper concludes that if a regulation is appropriately designed and implemented, the benefits of data-sharing will outweigh the cost of compliance with the CDR.

Keywords: consumer data right, innovation, open banking, privacy safeguards

Procedia PDF Downloads 140
25399 Generation of Automated Alarms for Plantwide Process Monitoring

Authors: Hyun-Woo Cho

Abstract:

Early detection of incipient abnormal operations is quite necessary for plant-wide process management, in order to improve product quality and process safety. Generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches, and multivariate statistical approaches. This work presents a nonlinear empirical monitoring methodology based on the real-time analysis of massive process data. Unfortunately, such big data include measurement noise and unwanted variations unrelated to true process behavior; thus, the elimination of these unnecessary patterns is executed in the data processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that the detection speed and performance were improved significantly, irrespective of the size and location of abnormal events.
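The monitoring-and-alarm idea can be sketched in its simplest linear form: fit a PCA model on normal operating data, then flag samples whose reconstruction error (the SPE/Q statistic) exceeds an empirical control limit. The data and threshold below are illustrative assumptions, not the paper's nonlinear method.

```python
# Sketch: PCA-based alarm generation - model normal correlated sensor
# behavior, then alarm on a sample that breaks the correlation structure.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
t = rng.normal(size=(500, 1))                   # one true process driver
noise = 0.1 * rng.normal(size=(500, 3))         # measurement noise
normal = np.hstack([t, 2 * t, -t]) + noise      # three correlated sensors

pca = PCA(n_components=1).fit(normal)           # model of normal behavior

def spe(X):
    """Squared prediction error (Q statistic) against the PCA model."""
    recon = pca.inverse_transform(pca.transform(X))
    return ((X - recon) ** 2).sum(axis=1)

threshold = np.percentile(spe(normal), 99)      # empirical control limit
fault = np.array([[1.0, -2.0, 1.0]])            # breaks sensor correlation
alarm = spe(fault)[0] > threshold
print("alarm raised:", alarm)
```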

Keywords: detection, monitoring, process data, noise

Procedia PDF Downloads 252
25398 Meanings and Concepts of Standardization in Systems Medicine

Authors: Imme Petersen, Wiebke Sick, Regine Kollek

Abstract:

In systems medicine, high-throughput technologies produce large amounts of data on different biological and pathological processes, including (disturbed) gene expressions, metabolic pathways and signaling. The large volume of data of different types, stored in separate databases and often located at different geographical sites have posed new challenges regarding data handling and processing. Tools based on bioinformatics have been developed to resolve the upcoming problems of systematizing, standardizing and integrating the various data. However, the heterogeneity of data gathered at different levels of biological complexity is still a major challenge in data analysis. To build multilayer disease modules, large and heterogeneous data of disease-related information (e.g., genotype, phenotype, environmental factors) are correlated. Therefore, a great deal of attention in systems medicine has been put on data standardization, primarily to retrieve and combine large, heterogeneous datasets into standardized and incorporated forms and structures. However, this data-centred concept of standardization in systems medicine is contrary to the debate in science and technology studies (STS) on standardization that rather emphasizes the dynamics, contexts and negotiations of standard operating procedures. Based on empirical work on research consortia that explore the molecular profile of diseases to establish systems medical approaches in the clinic in Germany, we trace how standardized data are processed and shaped by bioinformatics tools, how scientists using such data in research perceive such standard operating procedures and which consequences for knowledge production (e.g. modeling) arise from it. Hence, different concepts and meanings of standardization are explored to get a deeper insight into standard operating procedures not only in systems medicine, but also beyond.

Keywords: data, science and technology studies (STS), standardization, systems medicine

Procedia PDF Downloads 341
25397 Design an Expert System to Assess the Hydraulic System in Thermal and Hydrodynamic Aspect

Authors: Ahmad Abdul-Razzak Aboudi Al-Issa

Abstract:

Thermal and hydrodynamic behaviour are basic aspects of any hydraulic system and must therefore be assessed before constructing the system. This assessment requires considerable expertise if an efficient hydraulic system is to be obtained. This study therefore aims to build an expert system, called Hydraulic System Calculations (HSC), to ensure smooth operation of the hydraulic system. The expert system (HSC) was designed and coded as a user-friendly interactive program in Microsoft Visual Basic 2010. The code provides the designer with a number of choices for resolving the problem of hydraulic oil overheating, which may arise during continuous operation of the hydraulic unit. As a result, the HSC can minimize human error as well as the effort, time and cost of hydraulic machine design.
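As a rough illustration of the kind of thermal check such a tool performs (this is not the HSC program itself, which the abstract says is written in Visual Basic 2010; the heat-balance relations are standard hydraulics rules of thumb and all numeric inputs are invented):

```python
# Heat generated by a hydraulic power unit is roughly the input power that is
# not converted to useful work; a cooler is needed when the reservoir alone
# cannot reject that heat by natural convection.

def required_cooling_kw(input_kw, overall_efficiency):
    """Heat to reject (kW) = power not converted to useful work."""
    return input_kw * (1.0 - overall_efficiency)

def reservoir_is_sufficient(heat_kw, tank_area_m2, dT, h_w_per_m2k=10.0):
    """Can natural convection from the tank alone reject the heat?
    h_w_per_m2k is an assumed convection coefficient for a steel reservoir."""
    return tank_area_m2 * h_w_per_m2k * dT / 1000.0 >= heat_kw

heat = required_cooling_kw(input_kw=15.0, overall_efficiency=0.8)  # about 3 kW
needs_cooler = not reservoir_is_sufficient(heat, tank_area_m2=2.0, dT=35.0)
```

With these example numbers the tank can shed only about 0.7 kW, so the sketch would advise adding an oil cooler.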

Keywords: fluid power, hydraulic system, thermal and hydrodynamic, expert system

Procedia PDF Downloads 446
25396 Integrated On-Board Diagnostic-II and Direct Controller Area Network Access for Vehicle Monitoring System

Authors: Kavian Khosravinia, Mohd Khair Hassan, Ribhan Zafira Abdul Rahman, Syed Abdul Rahman Al-Haddad

Abstract:

The CAN (controller area network) bus is a multi-master, message-broadcast system. Messages sent on the CAN bus communicate state information, referred to as signals, between different ECUs, providing data consistency at every node of the system. OBD-II dongles, which rely on a request-response method, are the most widespread means among researchers of extracting sensor data from cars. Unfortunately, most past research does not consider the resolution and quantity of the input data extracted through OBD-II technology. The maximum feasible scan rate is only 9 queries per second, which provides 8 data points per second with the well-known ELM327 OBD-II dongle. This study aims to develop and design a programmable, latency-sensitive vehicle data acquisition system that improves modularity and flexibility to extract exact, trustworthy, and fresh car sensor data at higher frequencies. To do so, the researcher must break apart, thoroughly inspect, and observe the internal network of the vehicle, which may cause severe damage to its expensive ECUs owing to the intrinsic vulnerabilities of the CAN bus during initial research. The desired sensor data were collected from various vehicles using a Raspberry Pi 3 as the computing and processing unit, applying the OBD (request-response) and direct CAN methods at the same time. Two types of data were collected for this study: first, CAN bus frame data, comprising each line of hex data sent by an ECU; and second, OBD data, representing the limited data that can be requested from an ECU under standard conditions. The proposed system is a reconfigurable, human-readable, multi-task telematics device that can be fitted into any vehicle with minimum effort and minimum time lag in the data extraction process.
A standard operating procedure for the experimental vehicle network test bench is also developed and can be used in future vehicle network testing experiments.
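To make the request-response path concrete, here is a minimal sketch (not the authors' code) of decoding an OBD-II engine-speed reply from a raw CAN frame, as one might do on the Raspberry Pi 3 described above. The frame layout follows the SAE J1979 convention: response ID 0x7E8, payload = [length, 0x41, PID, A, B, ...], with RPM = (256A + B) / 4 for PID 0x0C.

```python
def decode_rpm_frame(can_id, payload):
    """Return engine RPM from an OBD-II PID 0x0C response frame, or None."""
    if can_id != 0x7E8:                    # standard ECU response identifier
        return None
    length, mode, pid = payload[0], payload[1], payload[2]
    if mode != 0x41 or pid != 0x0C or length < 4:
        return None                        # not a mode-01 RPM reply
    a, b = payload[3], payload[4]
    return (256 * a + b) / 4.0             # J1979 scaling for PID 0x0C

frame = [0x04, 0x41, 0x0C, 0x1A, 0xF8]     # example reply: 0x1AF8 -> 1726 RPM
print(decode_rpm_frame(0x7E8, frame))      # -> 1726.0
```

Reading frames directly off the bus (the paper's "direct CAN" method) skips the query step entirely and only needs this kind of decoding, which is why it achieves much higher sample rates than the 9-queries-per-second OBD-II path.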

Keywords: CAN bus, OBD-II, vehicle data acquisition, connected cars, telemetry, Raspberry Pi3

Procedia PDF Downloads 204
25395 Big Data in Construction Project Management: The Colombian Northeast Case

Authors: Sergio Zabala-Vargas, Miguel Jiménez-Barrera, Luz Vargas-Sánchez

Abstract:

In recent years, information related to project management in organizations has been increasing exponentially. Performance data, management statistics, and indicator results have made collection, analysis, traceability, and dissemination essential tasks for project managers. In this context, there are current trends toward facilitating efficient decision-making through emerging technologies such as machine learning, data analytics, data mining, and Big Data; the latter is the focus of this project. This research is part of the thematic line Construction Methods and Project Management. Many authors note the relevance that emerging technologies such as Big Data have gained in recent years in project management in the construction sector, with the main focus on optimizing time, scope, and budget and, in general, mitigating risks. This research was developed in the northeastern region of Colombia, South America. The first phase was aimed at diagnosing the use of emerging technologies (Big Data) in the construction sector. In Colombia, the construction sector represents more than 50% of the productive system, and more than 2 million people participate in this economic segment. A quantitative approach was used: a survey was applied to a sample of 91 companies in the construction sector. Preliminary results indicate that the use of Big Data and other emerging technologies is very low, but also that there is interest in modernizing project management. There is evidence of a correlation between interest in using new data management technologies and the incorporation of Building Information Modeling (BIM). The next phase of the research will allow the generation of guidelines and strategies for incorporating technological tools in the construction sector in Colombia.

Keywords: big data, building information modeling, technology, project management

Procedia PDF Downloads 128
25394 Risk Management and Resiliency: Evaluating Walmart’s Global Supply Chain Leadership Using the Supply Chain Resilience Assessment and Management Framework

Authors: Meghan Biallas, Amanda Hoffman, Tamara Miller, Kimmy Schnibben, Janaina Siegler

Abstract:

This paper assesses Walmart’s supply chain resiliency amidst continuous supply chain disruptions. It aims to evaluate how Walmart can use supply chain resiliency theory to retain its status as a global supply chain leader. The Bloomberg terminal was used to organize Walmart’s 754 Tier-1 suppliers by the size of their relationship to Walmart; additional data from IBISWorld and Statista were also used in the analysis. This research focused on the top ten Tier-1 suppliers with the greatest percentage of their revenue attributed to Walmart. The paper then applied the firm’s information to the Supply Chain Resilience Assessment and Management (SCRAM) framework to evaluate the firm’s capabilities, vulnerabilities, and gaps. A rubric was created to quantify Walmart’s risks using four pillars: flexibility, velocity, visibility, and collaboration. Information and examples were drawn from Walmart’s 10-K filing; for each example, a rating of 1 indicated high resiliency, 0 indicated medium resiliency, and -1 indicated low resiliency. Findings from this study include the following: (1) Walmart has maintained its leadership through its ability to remain resilient with regard to visibility, efficiency, capacity, and collaboration. (2) Walmart is experiencing increases in supply chain costs due to internal factors affecting the company and external factors affecting its suppliers. (3) A number of emerging supply chain risks among Walmart’s suppliers could make it harder for Walmart to remain a supply chain leader in the future. Using the SCRAM framework, this paper assesses how Walmart measures up to supply chain resiliency theory, identifying areas of strength as well as areas where Walmart can improve in order to remain a global supply chain leader.
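The rubric described above reduces naturally to a small aggregation step; as a hypothetical sketch (the pillar names come from the abstract, but the ratings below are invented, not the paper's actual Walmart data):

```python
# Each 10-K example is rated 1 (high), 0 (medium), or -1 (low) resiliency;
# averaging per pillar gives a score between -1 and 1 for comparison.

def pillar_scores(ratings):
    """Aggregate per-example ratings into a mean score per pillar."""
    return {pillar: sum(vals) / len(vals) for pillar, vals in ratings.items()}

ratings = {                      # invented example ratings
    "flexibility":   [1, 0, 1],
    "velocity":      [0, -1, 1],
    "visibility":    [1, 1, 0],
    "collaboration": [1, 0, 0],
}
scores = pillar_scores(ratings)
```

Pillars with mean scores near 1 would mark strengths and those near -1 would mark the improvement areas the paper identifies.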

Keywords: supply chain resiliency, zone of balanced resilience, supply chain resilience assessment and management, supply chain theory

Procedia PDF Downloads 127
25393 Understanding the Excited State Dynamics of a Phase Transformable Photo-Active Metal-Organic Framework MIP 177 through Time-Resolved Infrared Spectroscopy

Authors: Aneek Kuila, Yaron Paz

Abstract:

MIP 177 LT and HT are two phase-transformable metal-organic frameworks consisting of a Ti12O15 oxocluster and a tetracarboxylate ligand, exhibiting robust chemical stability and improved photoactivity. The LT-to-HT transition changes only the dimensionality, from 0D to 1D, without any change in the overall chemical structure. In terms of chemical and photoactivity, MIP 177 LT is found to perform better than MIP 177 HT. Step-scan Fourier transform absorption-difference time-resolved spectroscopy was used to collect mid-IR time-resolved infrared spectra of the transient electronic excited states of the nanoporous metal-organic frameworks MIP 177 LT and HT with 2.5 ns time resolution. Analysis of the time-resolved vibrational data after 355 nm laser excitation reveals temporal changes in ν(O-Ti-O) of the Ti-O metal cluster and ν(-COO) of the ligand, indicating that these moieties are the ultimate acceptors of the excited charges, which are localized over those regions on the nanosecond timescale. A direct negative correlation in the differential absorbance (ΔAbsorbance) reveals a charge transfer relation between these two moieties. A longer-lived transient signal, up to 180 ns for MIP 177 LT compared to 100 ns for MIP 177 HT, shows the extended lifetime of the reactive charges at the surface that underlies their effectiveness. An ultrafast change from bidentate to monodentate bridging in the -COO-Ti-O ligand-metal coordination environment was observed after photoexcitation of MIP 177 LT, which persists for seconds after photoexcitation is halted. This phenomenon is unique to MIP 177 LT and is not observed for HT. Such an in-situ change in coordination denticity during photoexcitation had not been observed previously, and it can rationalize the ability of MIP 177 LT to accumulate electrons during continuous photoexcitation, leading to its superior photocatalytic activity.

Keywords: time resolved FTIR, metal organic framework, denticity, photocatalysis

Procedia PDF Downloads 59
25392 Minimum Data of a Speech Signal as Special Indicators of Identification in Phonoscopy

Authors: Nazaket Gazieva

Abstract:

Voice biometric data associated with physiological, psychological and other factors are widely used in forensic phonoscopy. There are various methods for identifying and verifying a person by voice. This article explores minimum speech signal data as individual parameters of a speech signal. Monozygotic twins are believed to be genetically identical; yet, using the minimum data of the speech signal, we found that the voice imprint of monozygotic twins is nevertheless individual. From the experimental results, we conclude that the minimum indicators of the speech signal are more stable and reliable for phonoscopic examinations.

Keywords: phonogram, speech signal, temporal characteristics, fundamental frequency, biometric fingerprints

Procedia PDF Downloads 144
25391 Impact of Primary Care on Sexual and Reproductive Health for Migrant Women in Medellín Colombia

Authors: Alexis Piedrahita, Ludi Valencia, Aura Gutierrez

Abstract:

The migration crisis currently being experienced in the world is a continuous phenomenon that has been met with solutions in form but not in substance, violating the rights under international humanitarian law of people in transit through countries foreign to their roots, especially women of reproductive age. This has caused different governments and organizations worldwide to come together around the problem to define concise actions to protect the rights of migrant women. This research compiles the stories of migrant women who arrive in Colombia seeking better opportunities, such as access to comprehensive, high-quality health services, including primary health care, which is the gateway to health promotion and disease prevention services.

Keywords: accessibility, primary health care, sexual and reproductive health, sustainable development goals, women migrant

Procedia PDF Downloads 77
25390 Separation of Water/Organic Mixtures Using Micro- and Nanostructured Membranes of Special Type of Wettability

Authors: F. R. Sultanov, Ch. Daulbayev, B. Bakbolat, Z. A. Mansurov, A. A. Zhurintaeva, R. I. Gadilshina, A. B. Dugali

Abstract:

Both hydrophilic-oleophobic and hydrophobic-oleophilic membranes were obtained by coating membrane substrates, namely stainless steel meshes with various opening dimensions, with a composition that imparts the desired type of surface wettability, applied by spray coating. The surface morphology of the resulting membranes was studied using SEM, the type of wettability was identified by measuring the contact angle between the membrane surface and a drop of the studied liquid (water or an organic liquid), and the efficiency of continuous separation of water and organic liquids was studied on a self-assembled setup.
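The two wettability classes named in the title follow from the standard 90-degree contact-angle convention; a minimal sketch of that criterion (the threshold is the usual convention, and the angles below are invented examples, not the paper's measurements):

```python
# Classify a membrane surface from its water and oil contact angles:
# above 90 degrees the surface repels the liquid, below it the liquid wets.

def wettability(water_angle_deg, oil_angle_deg):
    water = "hydrophobic" if water_angle_deg > 90 else "hydrophilic"
    oil = "oleophobic" if oil_angle_deg > 90 else "oleophilic"
    return f"{water}-{oil}"

print(wettability(150, 40))    # -> hydrophobic-oleophilic
print(wettability(30, 120))    # -> hydrophilic-oleophobic
```

A hydrophobic-oleophilic mesh passes oil and blocks water, while a hydrophilic-oleophobic mesh does the reverse, which is what makes both membrane types useful for the continuous separation described.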

Keywords: membrane, stainless steel mesh, oleophobicity, hydrophobicity, separation, water, organic liquids

Procedia PDF Downloads 167
25389 A Non-parametric Clustering Approach for Multivariate Geostatistical Data

Authors: Francky Fouedjio

Abstract:

Multivariate geostatistical data have become omnipresent in the geosciences and pose substantial analysis challenges. One of them is the grouping of data locations into spatially contiguous clusters, so that data locations within the same cluster are more similar to each other while clusters differ from each other, in some sense. Spatially contiguous clusters can significantly improve interpretation by turning the resulting clusters into meaningful geographical subregions. In this paper, we develop an agglomerative hierarchical clustering approach that takes into account the spatial dependency between observations. It relies on a dissimilarity matrix built from a non-parametric kernel estimator of the spatial dependence structure of the data. It integrates existing methods to find the optimal cluster number and to evaluate the contribution of variables to the clustering. The capability of the proposed approach to provide spatially compact, connected and meaningful clusters is assessed using a bivariate synthetic dataset and a multivariate geochemical dataset. The proposed clustering method gives satisfactory results compared to other similar geostatistical clustering methods.
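To illustrate the general idea (this is a deliberately simplified stand-in: the paper builds its dissimilarity matrix from a non-parametric kernel estimator of the spatial dependence structure, whereas the sketch below just down-weights attribute differences between spatially close points with a Gaussian kernel and runs naive single-linkage agglomeration):

```python
import math

def dissimilarity(p, q, bandwidth=1.0):
    """Spatially aware dissimilarity between two (x, y, value) points:
    nearby points compare by attribute, distant points are kept apart."""
    (x1, y1, z1), (x2, y2, z2) = p, q
    h = math.hypot(x1 - x2, y1 - y2)          # spatial separation
    w = math.exp(-(h / bandwidth) ** 2)       # Gaussian spatial kernel
    return abs(z1 - z2) * w + (1 - w)         # distant pairs -> dissimilar

def agglomerate(points, k, bandwidth=1.0):
    """Naive single-linkage agglomeration down to k clusters of indices."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(dissimilarity(points[i], points[j], bandwidth)
                        for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] += clusters.pop(b)        # merge the closest pair
    return clusters

points = [(0.0, 0.0, 1.0), (0.1, 0.0, 1.1), (5.0, 5.0, 1.0), (5.1, 5.0, 0.9)]
clusters = agglomerate(points, k=2)
```

On this toy input the two spatially separated groups end up in different clusters even though their attribute values overlap, which is the "spatially contiguous" behaviour the paper targets.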

Keywords: clustering, geostatistics, multivariate data, non-parametric

Procedia PDF Downloads 477
25388 A Data Mining Approach for Analysing and Predicting the Bank's Asset Liability Management Based on Basel III Norms

Authors: Nidhin Dani Abraham, T. K. Sri Shilpa

Abstract:

Asset liability management is an important aspect of the banking business. Moreover, today's banking is governed by Basel III, which strictly regulates counterparty default. This paper focuses on the prediction and analysis of counterparty default risk, a type of risk that occurs when customers fail to repay the amount owed to the lender (a bank or other financial institution). The paper proposes an approach to reducing the counterparty risk arising in financial institutions using an appropriate data mining technique, thereby predicting the occurrence of non-performing assets (NPAs). It also helps in asset building and in restructuring quality. Liability management is likewise essential to the banking business; knowing and analyzing the depth of a bank's liabilities requires a suitable technique. For this, a data mining technique is used to predict the dormancy behaviour of various deposit customers. Various models are implemented, and the results for savings bank deposit customers are analyzed. All data are cleaned using a data cleansing approach from the bank's data warehouse.
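The abstract does not name the specific models used, so as a purely hypothetical illustration of the dormancy-prediction step, a rule-based scorer over transaction recency and frequency might look like this (all thresholds are invented for the example):

```python
# Flag savings accounts by dormancy risk from two simple features:
# days since the last transaction and transactions per year.

def dormancy_risk(days_since_last_txn, txns_per_year):
    """Return 'dormant', 'at_risk', or 'active' (thresholds are invented)."""
    if days_since_last_txn > 730:                    # e.g. two years inactive
        return "dormant"
    if days_since_last_txn > 365 or txns_per_year < 2:
        return "at_risk"
    return "active"

accounts = [(30, 24), (400, 1), (800, 0)]            # invented sample accounts
labels = [dormancy_risk(d, t) for d, t in accounts]  # cleaned-warehouse input
```

A trained classifier would replace the hand-set thresholds, but the input features and the three-way labelling follow the same shape.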

Keywords: data mining, asset liability management, BASEL III, banking

Procedia PDF Downloads 552
25387 Parallel Coordinates on a Spiral Surface for Visualizing High-Dimensional Data

Authors: Chris Suma, Yingcai Xiao

Abstract:

This paper presents Parallel Coordinates on a Spiral Surface (PCoSS), a parallel coordinate based interactive visualization method for high-dimensional data, and a test implementation of the method. Plots generated by the test system are compared with those generated by XDAT, a software implementing traditional parallel coordinates. Traditional parallel coordinate plots can be cluttered when the number of data points is large or when the dimensionality of the data is high. PCoSS plots display multivariate data on a 3D spiral surface and allow users to see the whole picture of high-dimensional data with less cluttering. Taking advantage of the 3D display environment in PCoSS, users can further reduce cluttering by zooming into an axis of interest for a closer view or by moving vantage points and by reorienting the viewing angle to obtain a desired view of the plots.
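The abstract does not give the PCoSS geometry in detail; one plausible sketch (the Archimedean-spiral layout and every parameter below are assumptions for illustration, not the authors' implementation) places the i-th of n parallel axes on a spiral in the xy-plane and plots values along z, so each d-dimensional record becomes a 3D polyline:

```python
import math

def axis_base(i, n, turns=1.5, r0=1.0, growth=0.3):
    """(x, y) base position of axis i of n on an Archimedean spiral."""
    theta = turns * 2 * math.pi * i / max(n - 1, 1)
    r = r0 + growth * theta                  # radius grows along the spiral
    return (r * math.cos(theta), r * math.sin(theta))

def to_polyline(values, lo, hi, n_axes):
    """Map one data record to its 3D polyline vertices, one per axis."""
    verts = []
    for i, v in enumerate(values):
        x, y = axis_base(i, n_axes)
        z = (v - lo[i]) / (hi[i] - lo[i])    # normalize value to axis height
        verts.append((x, y, z))
    return verts

line = to_polyline([2.0, 5.0, 1.0], lo=[0, 0, 0], hi=[4, 10, 2], n_axes=3)
```

Spreading the axes around a spiral instead of along a line is what frees up screen space and lets the user orbit or zoom toward a single axis, as described above.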

Keywords: human computer interaction, parallel coordinates, spiral surface, visualization

Procedia PDF Downloads 11