Search results for: continuous speed profile data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30089

25709 The Experimental and Numerical Analysis of the Joining Processes for Air Conditioning Systems

Authors: M.St. Węglowski, D. Miara, S. Błacha, J. Dworak, J. Rykała, K. Kwieciński, J. Pikuła, G. Ziobro, A. Szafron, P. Zimierska-Nowak, M. Richert, P. Noga

Abstract:

In this paper, the results of welding of car air-conditioning elements are presented. These systems have been based mainly on environmentally unfriendly refrigerants; thus, car producers will have to stop using traditional refrigerants and change to carbon dioxide (R744), which is environmentally friendly. However, it should be noted that an air-conditioning system working with the R744 refrigerant operates at high temperature (up to 150 °C) and high pressure (up to 130 bar), both much higher than for other refrigerants. New materials, designs, and joining technologies are therefore strongly needed for these systems. AISI 304 and 316L steels as well as 5xxx-series aluminium alloys are ranked among the prospective materials. As joining processes, laser welding, plasma welding, electron beam welding, and high rotary friction welding can be applied. In this study, metallographic examination based on light microscopy and SEM was used to assess the quality of the welded joints, and the analysis of welding was supported by numerical modelling in Sysweld software. The results indicated that laser, plasma, and electron beam welding can produce welds of proper quality in stainless steel. Moreover, high rotary friction welding guarantees metallic continuity in the aluminium welded area. The metallographic examination revealed no grain growth in the heat-affected zone (HAZ) of the laser and electron beam welded joints, owing to the low heat input and short welding time. Grain growth and subgrains can be observed at room temperature when the solidification mode is austenitic, which causes only small microstructural changes during solidification. A columnar grain structure was found in the weld metal, while equiaxed grains were detected at the interface.
The numerical modelling of the laser welding process allowed the temperature profile in the welded joint to be estimated and the dimensions of the welds to be predicted. Good agreement between the FEM analysis and the experimental data was achieved.

Keywords: car's air-conditioning, microstructure, numerical modelling, welding

Procedia PDF Downloads 408
25708 Characterization of a Pure Diamond-Like Carbon Film Deposited by Nanosecond Pulsed Laser Deposition

Authors: Camilla G. Goncalves, Benedito Christ, Walter Miyakawa, Antonio J. Abdalla

Abstract:

This work investigates the properties and microstructure of a diamond-like carbon (DLC) film deposited on a steel substrate by pulsed laser deposition, ablating a graphite target in a vacuum chamber. The equipment was arranged to provide a single laser beam, and both the high-purity graphite target and the steel substrate were polished. The mechanical and tribological properties of the film were characterized using Raman spectroscopy, nanoindentation, scratch testing, roughness profilometry, a tribometer, optical microscopy, and SEM imaging. It was concluded that the pulsed laser deposition (PLD) technique, combined with a low-pressure chamber and a graphite target, provides a good fraction of sp3 bonding; that process variables such as surface polishing and laser parameters strongly influence the tribological properties and adherence test performance; and that optical microscopy images are sufficient to identify the metallurgical bond.

Keywords: characterization, DLC, mechanical properties, pulsed laser deposition

Procedia PDF Downloads 153
25707 Control Strategy for a Solar Vehicle Race

Authors: Francois Defay, Martim Calao, Jean Francois Dassieu, Laurent Salvetat

Abstract:

Electric vehicles are one solution for reducing pollution through the use of green energy. The Shell Eco-marathon provides rules intended to minimize battery use during the race. A solar panel combined with efficient motor control and race strategy makes it possible, in the best case, to drive a 60 kg vehicle with one pilot using only solar energy. This paper presents a complete model of a solar vehicle used for the Shell Eco-marathon. The project, called Helios, is a cooperation between undergraduate students, academic institutes, and industrial partners. The prototype is an ultra-energy-efficient vehicle based on a one square meter solar panel and a custom-built brushless motor controller that optimizes the electrical part. The vehicle is equipped with sensors and an embedded system that provide all data in real time in order to evaluate the best strategy for the course. A complete Matlab/Simulink model is used to test the optimal strategy for increasing overall endurance. Experimental results are presented to validate the different parts of the model: mechanical, aerodynamic, electrical, and solar panel. The major finding of this study is a set of solutions for identifying the model parameters (rolling resistance coefficient, drag coefficient, motor torque coefficient, etc.) from experimental results combined with identification techniques. Once the coefficients are validated, strategies to optimize consumption and average speed can be tested in simulation before being implemented for the race. The paper describes all the simulation and experimental work and provides results for optimizing the global efficiency of the vehicle. This work was started four years ago, has involved many students in its experimental and theoretical parts, and continues to increase knowledge of electrically self-sufficient vehicles.
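The parameter-identification step above can be sketched with a coast-down fit: under the model m·a = Crr·m·g + 0.5·ρ·CdA·v², measured decelerations are a linear function of v², so Crr and the drag area CdA fall out of a least-squares fit. The mass, air density, and sample values below are hypothetical illustrations, not the Helios team's data:

```python
# Coast-down model: m*a = Crr*m*g + 0.5*rho*CdA*v^2, i.e. a = c0 + c1*v^2.
# All numbers below are hypothetical, not measured Helios data.
mass, g, rho = 110.0, 9.81, 1.225       # vehicle + pilot mass (kg), gravity, air density
samples = [(4.0, 0.116), (6.0, 0.136),  # (speed m/s, measured deceleration m/s^2)
           (8.0, 0.164), (10.0, 0.200)]

# Linear least squares for c0, c1 via the normal equations (regressor x = v^2).
n = len(samples)
sx = sum(v ** 2 for v, _ in samples)
sxx = sum(v ** 4 for v, _ in samples)
sy = sum(a for _, a in samples)
sxy = sum(v ** 2 * a for v, a in samples)
det = n * sxx - sx * sx
c0 = (sy * sxx - sx * sxy) / det
c1 = (n * sxy - sx * sy) / det

crr = c0 / g                  # rolling resistance coefficient
cda = 2.0 * mass * c1 / rho   # drag area CdA in m^2
```

With real telemetry, the same two-coefficient fit would be run over many coast-down runs and cross-checked against the motor torque model.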

Keywords: electrical vehicle, endurance, optimization, shell eco-marathon

Procedia PDF Downloads 266
25706 Data Security and Privacy Challenges in Cloud Computing

Authors: Amir Rashid

Abstract:

Cloud computing frameworks let organizations cut expenses by outsourcing computation resources on demand. At present, customers of cloud service providers have no means of verifying the privacy and ownership of their information and data. To address this issue, we propose a Trusted Cloud Computing Platform (TCCP). TCCP enables Infrastructure as a Service (IaaS) providers, such as Amazon EC2, to offer a closed-box execution environment that guarantees confidential execution of guest virtual machines. It also allows clients to attest to the IaaS provider and determine whether the service is secure before they launch their virtual machines. This paper proposes the TCCP for guaranteeing the privacy and integrity of computed data outsourced to IaaS providers. The TCCP provides the abstraction of a closed-box execution environment for a customer's VM, ensuring that no privileged administrator at the cloud provider can inspect or tamper with its data. Furthermore, before launching a VM, the TCCP allows a customer to reliably and remotely verify that the backend provider is running a trusted TCCP. This capability extends attestation to the entire service and hence allows a customer to confirm that its data are operated on in a secure mode.
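The attest-before-launch idea can be illustrated with a toy check: the client compares the provider's reported platform measurement against a known-good hash before allowing a VM to start. This is only a sketch of the concept, not the TCCP protocol itself (which relies on trusted-hardware attestation); the measurement values are illustrative:

```python
import hashlib

# Toy attest-before-launch check in the spirit of TCCP (illustrative values):
# the client launches a VM only on a node whose reported platform
# measurement matches a known-good hash.
TRUSTED_MEASUREMENT = hashlib.sha256(b"tccp-node-image-v1").hexdigest()

def attest(reported_measurement: str) -> bool:
    """Return True if the node's reported measurement matches the trusted one."""
    return reported_measurement == TRUSTED_MEASUREMENT

def launch_vm(node_measurement: str) -> str:
    """Launch only after successful attestation of the backend node."""
    return "launched" if attest(node_measurement) else "refused"
```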

Keywords: cloud security, IaaS, cloud data privacy and integrity, hybrid cloud

Procedia PDF Downloads 299
25705 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record

Authors: Raghavi C. Janaswamy

Abstract:

In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). In most systems, data in the form of clear text and images are stored or processed in a relational format; however, the intrinsic structural restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions are predicted as a node classification task using an open-source graph-oriented EHR dataset, the Synthea database, stored in TigerGraph; this dataset is leveraged because it is voluminous and closely represents real-time data. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate node embeddings and eventually perform node classification using those embeddings. The model predicts patient conditions ranging from common to rare. The outcome is expected to open up opportunities for data querying toward better predictions and accuracy.
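The full pipeline uses pyTigerGraph, PyTorch, and PyG; as a minimal stand-in for one building block, a single GCN-style mean-aggregation step over a tiny hypothetical patient graph can be sketched in plain Python, with node classification by argmax over the smoothed features:

```python
# Plain-Python stand-in for one GCN-style message-passing step of the
# PyG pipeline. The tiny graph and feature values are hypothetical.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]        # linked EHR records
feats = {0: [1.0, 0.0], 1: [0.9, 0.1],                  # per-node feature vectors
         2: [0.2, 0.8], 3: [0.1, 0.9]}

neigh = {n: set() for n in feats}
for a, b in edges:
    neigh[a].add(b)
    neigh[b].add(a)

def propagate(features):
    """Mean aggregation: average each node's vector with its neighbours'."""
    out = {}
    for n, fv in features.items():
        group = [fv] + [features[m] for m in sorted(neigh[n])]
        out[n] = [sum(col) / len(group) for col in zip(*group)]
    return out

embedded = propagate(feats)
# Node classification: argmax over the smoothed embedding.
labels = {n: max(range(len(v)), key=v.__getitem__) for n, v in embedded.items()}
```

In the actual study this step would be a learned `GCNConv`-style layer trained with PyG, not a fixed average.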

Keywords: electronic health record, graph neural network, heterogeneous data, prediction

Procedia PDF Downloads 86
25704 A Proposal to Tackle Security Challenges of Distributed Systems in the Healthcare Sector

Authors: Ang Chia Hong, Julian Khoo Xubin, Burra Venkata Durga Kumar

Abstract:

Distributed systems offer many benefits to the healthcare industry. From big data analysis to business intelligence, the increased computational power and efficiency of distributed systems serve as an invaluable resource for the healthcare sector. However, as the use of these systems increases, many issues arise; the main focus of this paper is security, particularly information security. Data about people are especially sensitive in the healthcare industry: if important information is leaked (e.g., identity card number, credit card number, address), a person's identity, financial status, and safety may be compromised, and the responsible organization loses a lot of money compensating those affected and even more resources fixing the fault. Therefore, a framework for a blockchain-based healthcare data management system is proposed. In this framework, a blockchain network is used to store the encryption key of each patient's data, while the data themselves are encrypted and the resulting ciphertext is stored on a cloud storage platform. Due to the nature of blockchain technology, the data will be tamper-proof, and the read-only function can be accessed only by authorized users such as doctors and nurses, guaranteeing the confidentiality and immutability of the patient's data. Some issues remain to be emphasized and tackled in future improvements, such as proposing a multi-user scheme, addressing authentication, and migrating the backend processes into the blockchain network.
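The proposed split can be sketched as a toy: ciphertext goes into a simulated cloud store, and the per-record key is recorded in a hash-chained ledger standing in for the blockchain network. The XOR keystream here is only a placeholder for a real cipher such as AES, and all names are hypothetical:

```python
import hashlib, json, os

# Toy sketch of the framework: ciphertext in a simulated cloud store, the
# per-record key in a hash-chained ledger standing in for the blockchain.
# The XOR keystream is a placeholder for a real cipher such as AES.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    stream = (key * (len(data) // len(key) + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, stream))

cloud = {}    # stand-in for the cloud storage platform
ledger = []   # stand-in for the blockchain network

def store_record(patient_id: str, record: dict) -> None:
    key = os.urandom(16)
    cloud[patient_id] = xor_cipher(json.dumps(record).encode(), key)
    entry = {"patient": patient_id, "key": key.hex(),
             "prev": ledger[-1]["hash"] if ledger else "0" * 64}  # chain link
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    ledger.append(entry)

def read_record(patient_id: str) -> dict:
    """An authorized reader fetches the key from the ledger, then decrypts."""
    entry = next(e for e in ledger if e["patient"] == patient_id)
    return json.loads(xor_cipher(cloud[patient_id], bytes.fromhex(entry["key"])))

store_record("p001", {"diagnosis": "hypertension"})
```

A real deployment would add access control on the ledger reads, which is exactly the multi-user scheme the abstract lists as future work.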

Keywords: distributed, healthcare, efficiency, security, blockchain, confidentiality and immutability

Procedia PDF Downloads 184
25703 Toluene Methylation with Methanol Using Synthesized HZSM-5 Catalysts Modified by Silylation and Dealumination

Authors: Weerachit Pulsawas, Thirasak Rirksomboon

Abstract:

Due to its abundance from catalytic reforming and thermal cracking of naphtha, toluene can become a more valuable compound if it is converted into xylenes, particularly p-xylene, via toluene methylation. Attractively, toluene methylation with methanol is an alternative route to produce xylenes without other hydrocarbon by-products, provided an appropriate catalyst is used. In this study, HZSM-5 catalysts with a Si/Al molar ratio of 100 were synthesized via hydrothermal treatment and modified either by chemical liquid deposition using tetraethyl orthosilicate or by dealumination with steam. The modified catalysts were characterized by several techniques and tested for catalytic activity in a continuous down-flow fixed-bed reactor. Various operating conditions, including WHSVs of 5 to 20 h⁻¹, reaction temperatures of 400 to 500 °C, and toluene-to-methanol molar ratios (T/M) of 1 to 4, were investigated to attain the highest possible p-xylene selectivity. The parent HZSM-5 at a temperature of 400 °C, T/M of 4, and WHSV of 24 h⁻¹ showed 65.36% p-xylene selectivity and 11.90% toluene conversion over 4 h on stream.

Keywords: toluene methylation, HZSM-5, silylation, dealumination

Procedia PDF Downloads 195
25702 Nutritional Benefits of Soy: An Implication for Health Education

Authors: Mbadugha Esther Ifeoma

Abstract:

Soybeans, like other legumes, are rich in nutrients. However, the nutrient profile of soybeans differs in some important ways from most other legumes. Among other nutrients, soy is high in protein, carbohydrates, and fiber; is rich in vitamins, minerals, and unsaturated fatty acids; and is low in saturated fatty acids. Because of its high nutritional value, it has been rated as equivalent to meat, eggs, and milk. Soy has many health benefits, including prevention of coronary heart disease, prevention of cancer growth, improvement of cognitive function, promotion of bone health, prevention of obesity, prevention of type II diabetes, and promotion of the growth of normal flora in the colon. Soybean consumption is also associated with some side effects, including allergy, flatulence, and abdominal discomfort. Nurses and other health care providers should therefore educate clients on precautionary measures to take when preparing soy food products, in order to minimize the side effects, while encouraging them to include soy in their daily meals for optimal health and vitality.

Keywords: health benefit, health education, nutritional benefit, soybeans

Procedia PDF Downloads 490
25701 Effects of Acupuncture Treatment in Gait Parameters in Parkinson's Disease

Authors: Catarina Isabel Ramos Pereira, Jorge Machado, Begona Alonso Criado, Maria João Santos

Abstract:

Introduction: Gait disorders are among the symptoms with severe implications for quality of life in Parkinson's disease (PD). Currently, there is no therapy to reverse or treat this condition: none of the drugs used in conventional medical treatment is entirely efficient, and all have a high incidence of side effects. Acupuncture treatment is believed to improve motor ability, but there is still little scientific evidence in individuals with PD. Aim: To investigate the acute effect of acupuncture on gait parameters in Parkinson's disease. Methods: This is a randomized, controlled crossover study. Each patient was part of both the experimental (real acupuncture) and control (sham acupuncture) conditions, in randomized sequence. Gait parameters were measured before and after treatment, using four force platforms together with 3D marker positions captured by 11 cameras. Images were quantitatively analyzed using Qualisys Track Manager software, from which data on gait quality and balance were extracted. Seven patients with a diagnosis of Parkinson's disease were included in the study. Results: For the experimental group, statistically significant differences between the initial and final moments were found in gait speed (p = 0.016), gait cadence (p = 0.006), support base width (p = 0.0001), medio-lateral oscillation (p = 0.017), left-right step length (p = 0.0002), stride length right-right (p < 0.0001) and left-left (p = 0.0018), and time of the left support phase (p = 0.029), right support phase (p = 0.025), and double support phase (p = 0.015). Differences in right-left stride length were found for both groups. Conclusion: Our results show that acupuncture could enhance gait in Parkinson's disease patients. Deeper research involving a larger number of volunteers should be conducted to validate these encouraging findings.

Keywords: acupuncture, traditional Chinese medicine, Parkinson's disease, gait

Procedia PDF Downloads 171
25700 Empirical Investigation of Barriers to Industrial Energy Conservation Measures in the Manufacturing Small and Medium Enterprises (SME's) of Pakistan

Authors: Muhammad Tahir Hassan, Stas Burek, Muhammad Asif, Mohamed Emad

Abstract:

The industrial sector in Pakistan accounts for 25% of total energy consumption in the country, and its performance has been severely affected by the current energy crisis. Realizing the energy conservation potential of Pakistan's industrial sectors through energy management could save wasted energy, which would ultimately lead to economic and environmental benefits. However, the lack of financial incentives for energy efficiency and the absence of energy benchmarking within industrial sectors are among the main challenges to implementing energy management. In Pakistan, this area has not been adequately explored, and there is a lack of focus on the need for industrial energy efficiency and proper management. The main objective of this research is to evaluate the current energy management performance of the Pakistani industrial sector and to investigate empirically the barriers to industrial energy efficiency. Data were collected from respondents at 192 small and medium-sized enterprises (SMEs) of Pakistan, i.e., foundries, textile, plastic, light engineering, auto and spare parts, and ceramic manufacturers, and analysed using the Statistical Package for the Social Sciences (SPSS) software. The current energy management performance of manufacturing SMEs in Pakistan was evaluated using two significant indicators with a modified approach: the 'energy management matrix' and 'pay-off criteria'. Using the energy management matrix, energy management profiles of the overall industry and the individual sectors were drawn to assess energy management performance and to identify the weak and strong areas. The results reveal that energy management practices across the surveyed industries are at a very low level. The energy management profiles drawn for each sector suggest that the textile sector performs best among the surveyed manufacturing SMEs.
The empirically identified barriers to industrial energy efficiency have also been ranked according to the overall responses. The results further reveal that a significant relationship exists among industrial size, sector type, and the nature of barriers to industrial energy efficiency for manufacturing SMEs in Pakistan. The findings of this study may help industries and policy makers in Pakistan formulate a sustainable energy policy that supports industrial energy efficiency in view of the actual energy efficiency scenario in the industrial sector.

Keywords: barriers, energy conservation, energy management profile, environment, manufacturing SMEs of Pakistan

Procedia PDF Downloads 290
25699 Design and Implementation of a Geodatabase and WebGIS

Authors: Sajid Ali, Dietrich Schröder

Abstract:

The merging of the Internet and the Web has created many disciplines, and Web GIS is one of them, dealing with geospatial data in a proficient way. Web GIS technologies provide easy access to and sharing of geospatial data over the internet. However, the European Caribbean Association (Europäische Karibische Gesellschaft, EKG) lacks a single platform for easy, shared access to such data to assist its members and the wider research community. The approach presented in this paper covers the design of a geodatabase using PostgreSQL/PostGIS as an object-relational database management system (ORDBMS) for competent dissemination and management of spatial data, together with a Web GIS built with the OpenGeo Suite for fast sharing and distribution of the data over the internet. The characteristics required of the geodatabase design have been studied, and a specific methodology is given for designing the Web GIS. Finally, the Web-based geodatabase has been validated with two desktop GIS packages and a web map application, and it is shown that the contribution has all the desired modules to expedite further research in the area as per the requirements.

Keywords: desktop GIS software, European Caribbean Association, geodatabase, OpenGeo Suite, PostgreSQL/PostGIS, WebGIS, web map application

Procedia PDF Downloads 341
25698 Integration of “FAIR” Data Principles in Longitudinal Mental Health Research in Africa: Lessons from a Landscape Analysis

Authors: Bylhah Mugotitsa, Jim Todd, Agnes Kiragga, Jay Greenfield, Evans Omondi, Lukoye Atwoli, Reinpeter Momanyi

Abstract:

The INSPIRE network aims to build an open, ethical, sustainable, and FAIR (Findable, Accessible, Interoperable, Reusable) data science platform, particularly for longitudinal mental health (MH) data. While studies have been done at the clinical and population levels, limitations in data and research remain in LMICs, posing a risk of underrepresentation of mental disorders. It is vital to examine the existing longitudinal MH data, focusing on how FAIR the datasets are. This landscape analysis aimed to provide both an overall level of evidence of the availability of longitudinal datasets and the degree of consistency in the longitudinal studies conducted. Utilizing prompt-based tools proved instrumental in streamlining the analysis process, facilitating access, crafting code snippets, and categorizing and analyzing extensive data repositories related to depression, anxiety, and psychosis in Africa. Leveraging artificial intelligence (AI), we filtered over 18,000 scientific papers spanning 1970 to 2023. This AI-driven approach enabled the identification of 228 longitudinal research papers meeting the inclusion criteria. Quality assurance revealed 10% incorrectly identified articles and 2 duplicates, and underscored the prevalence of longitudinal MH research in South Africa, with a focus on depression. From the analysis, evaluating the adherence of data and metadata to FAIR principles remains crucial for enhancing the accessibility and quality of MH research in Africa. While AI has the potential to enhance research processes, challenges such as privacy concerns and data security risks must be addressed; ethical and equity considerations in data sharing and reuse are also vital. There is a need for collaborative efforts across disciplinary and national boundaries to improve the findability and accessibility of data, and current efforts should also focus on creating integrated data resources and tools to improve the interoperability and reusability of MH data.
Practical steps for researchers include careful study planning, data preservation, machine-actionable metadata, and promoting data reuse to advance science and improve equity. Metrics and recognition should be established to incentivize adherence to FAIR principles in MH research.

Keywords: longitudinal mental health research, data sharing, FAIR data principles, Africa, landscape analysis

Procedia PDF Downloads 89
25697 An Experimental Investigation of Microscopic and Macroscopic Displacement Behaviors of Branched-Preformed Particle Gel in High Temperature Reservoirs

Authors: Weiyao Zhu, Bingbing Li, Yajing Liu, Zhiyong Song

Abstract:

Branched-preformed particle gel (B-PPG) is a newly developed profile control and oil displacement agent for enhanced oil recovery in major oilfields. To provide a better understanding of the performance of B-PPG in high temperature reservoirs, a comprehensive experimental investigation was conducted using a glass micromodel and synthetic cores. The microscopic experiments show that B-PPG can selectively flow into and plug large pores. In terms of enhanced oil recovery, the decrease of residual oil in the margin regions (24.6%) was higher than that in the main stream (13.7%), indicating an enlarged sweep area. In addition, the effects of B-PPG injection concentration and injection rate on enhanced oil recovery were examined by core flooding. The macroscopic experiments indicate that recovery increased with injection concentration, whereas recovery as a function of injection rate exhibited a peak value. These results give significant insight into the behavior of B-PPG in reservoirs.

Keywords: branched-preformed particle gel, enhanced oil recovery, micromodel, core flooding

Procedia PDF Downloads 198
25696 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data, encompassing data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One key area of focus is data transfer optimization: the paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers, and evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments.
It acknowledges that big data processing requires distributed and parallel computing capabilities that span cloud boundaries, and investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments; the paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks, considering encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing; it examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management, and presents a comparative analysis of various multi-cloud management platforms and tools available in the market.

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 68
25695 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study

Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos

Abstract:

This article presents a method to analyze the use of indoor spaces based on analytics of data obtained from inbuilt digital devices. The study uses the data generated by in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights into space occupancy, user behaviour, and comfort. These devices, originally installed to facilitate remote operations, report data through the internet, which the research uses to analyze information on the real-time human use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable way to analyze building interior spaces without incorporating external data collection systems such as dedicated sensors. The methodology is applied to a real coliving case study: a residential building of 3,000 m², 7 floors, and 80 users in the centre of Madrid. The case study applies the method to classify the IoT devices and to assess, clean, and analyze the collected data within the analysis framework. The information is collected remotely through the devices' different platforms; the first step is to curate the data and understand what insights each device can provide according to the objectives of the study. This generates an analysis framework that can be scaled to future building assessments, even beyond the residential sector, with the parameters to be analyzed adjusted to the dataset available from each building's IoT. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.
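One concrete instance of such curation, assumed for illustration rather than taken from the study, is estimating hourly occupancy from Wi-Fi router association logs by counting distinct devices seen in each hour:

```python
from datetime import datetime

# Hypothetical Wi-Fi association log (timestamp, device identifier) of the
# kind the in-place routers report; hourly occupancy is approximated by the
# number of distinct devices seen in each hour.
log = [
    ("2021-03-01 08:12", "aa:01"), ("2021-03-01 08:40", "aa:02"),
    ("2021-03-01 08:55", "aa:01"), ("2021-03-01 21:05", "aa:03"),
    ("2021-03-01 21:30", "aa:04"), ("2021-03-01 21:42", "aa:05"),
]

def occupancy_by_hour(records):
    """Count distinct devices per clock hour as an occupancy proxy."""
    seen = {}
    for ts, device in records:
        hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").strftime("%Y-%m-%d %H:00")
        seen.setdefault(hour, set()).add(device)
    return {hour: len(devices) for hour, devices in seen.items()}

occ = occupancy_by_hour(log)
```

The same aggregation pattern applies to smart-lock events or electrical sensor readings once each device's records are mapped into (timestamp, identifier, value) form.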

Keywords: in-place devices, IoT, human-centred data-analytics, spatial design

Procedia PDF Downloads 197
25694 A Unique Multi-Class Support Vector Machine Algorithm Using MapReduce

Authors: Aditi Viswanathan, Shree Ranjani, Aruna Govada

Abstract:

With data sizes constantly expanding, and with the classical machine learning algorithms that analyze such data requiring ever larger amounts of computation time and storage space, the need to distribute computation and memory requirements among several computers has become apparent. Although substantial work has been done on distributed binary SVM algorithms and on multi-class SVM algorithms individually, the field of distributed multi-class SVMs remains largely unexplored. This research develops an algorithm that implements the Support Vector Machine over a multi-class data set efficiently in a distributed environment. To do so, we recursively choose the best binary split of a set of classes using a greedy technique, much like the divide-and-conquer approach. Our algorithm shows better computation time during the testing phase than the traditional sequential SVM methods (one-vs-one, one-vs-rest) and outperforms them as the size of the data set grows. This approach also classifies the data with higher accuracy than the traditional multi-class algorithms.
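The recursive class-splitting idea can be sketched as follows; a nearest-centroid rule stands in for the trained binary SVM at each internal node, and a deliberately cheap "largest gap along the first centroid coordinate" heuristic stands in for the greedy best-split search. The data and labels are hypothetical:

```python
# Sketch of the recursive "best binary split of classes" tree. A
# nearest-centroid rule stands in for a trained binary SVM at each node,
# and a largest-gap heuristic stands in for the greedy best-split search.
def centroid(points):
    return [sum(coord) / len(points) for coord in zip(*points)]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def build_tree(data):  # data: {class_label: [feature vectors]}
    labels = sorted(data)
    if len(labels) == 1:
        return labels[0]                       # leaf: a single class
    cents = {c: centroid(data[c]) for c in labels}
    order = sorted(labels, key=lambda c: cents[c][0])
    gaps = [cents[order[i + 1]][0] - cents[order[i]][0]
            for i in range(len(order) - 1)]
    cut = gaps.index(max(gaps)) + 1            # cut at the widest gap
    left, right = order[:cut], order[cut:]
    return {"left": build_tree({c: data[c] for c in left}),
            "right": build_tree({c: data[c] for c in right}),
            "lc": centroid([p for c in left for p in data[c]]),
            "rc": centroid([p for c in right for p in data[c]])}

def predict(node, x):
    while isinstance(node, dict):              # descend the binary tree
        close_left = dist2(x, node["lc"]) <= dist2(x, node["rc"])
        node = node["left"] if close_left else node["right"]
    return node

data = {"A": [[0.0, 0.0], [0.5, 0.2]], "B": [[3.0, 0.1], [3.4, 0.0]],
        "C": [[6.0, 0.2], [6.5, 0.1]]}         # hypothetical 3-class set
tree = build_tree(data)
```

In the paper's setting, each node's binary decision would be a distributed SVM trained via MapReduce, and classification needs only about log₂(k) binary evaluations instead of the k(k-1)/2 of one-vs-one.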

Keywords: distributed algorithm, MapReduce, multi-class, support vector machine

Procedia PDF Downloads 401
25693 Information Management Approach in the Prediction of Acute Appendicitis

Authors: Ahmad Shahin, Walid Moudani, Ali Bekraki

Abstract:

This research presents a predictive data mining model for accurate diagnosis of acute appendicitis, with the aims of maximizing health service quality, minimizing morbidity and mortality, and reducing cost. Acute appendicitis is a very common condition that requires timely, accurate diagnosis and surgical intervention. Although its treatment is simple and straightforward, diagnosis is still difficult because no single sign, symptom, laboratory test, or imaging examination accurately confirms acute appendicitis in all cases, which contributes to increased morbidity and negative appendectomy rates. In this study, the authors propose to generate an accurate model for predicting acute appendicitis based, firstly, on a segmentation technique associated with the ABC algorithm to segment the patients; secondly, on applying fuzzy logic to process the massive volume of heterogeneous and noisy data (age, sex, fever, white blood cell count, neutrophilia, CRP, urine, ultrasound, CT, appendectomy, etc.) in order to express knowledge and analyze the relationships among the data in a comprehensive manner; and thirdly, on applying a dynamic programming technique to reduce the number of data attributes. The proposed model is evaluated against a set of benchmark techniques, and also on a set of benchmark classification problems on osteoporosis, diabetes, and heart disease obtained from the UCI repository and other data sources.
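The fuzzy-logic step can be illustrated with triangular membership functions for a single attribute; the white-blood-cell fuzzy sets below are hypothetical, not the authors' calibrated values:

```python
# Triangular fuzzy membership: 0 outside (a, c), rising to 1 at b.
def tri(x, a, b, c):
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy sets for white blood cell count (10^9 cells/l);
# the breakpoints are illustrative, not the authors' calibrated values.
def wbc_membership(wbc):
    return {"normal":   tri(wbc, 3.0, 7.0, 11.0),
            "elevated": tri(wbc, 9.0, 13.0, 17.0),
            "high":     tri(wbc, 15.0, 20.0, 25.0)}

memberships = wbc_membership(12.0)   # a reading can be partly in several sets
```

Fuzzifying each noisy attribute this way lets downstream rules reason with partial degrees of membership rather than brittle cut-offs.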

Keywords: healthcare management, acute appendicitis, data mining, classification, decision tree

Procedia PDF Downloads 350
25692 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery

Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene

Abstract:

Data flows and the purposes of reporting data differ with business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form a dataset, constructed for each time point, that contains all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrated methodological approach must be developed to respond to the indicated problem. The proposed methodology contains several steps: (1) collecting context data sets and data validation; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study uses Grubbs' outlier analysis, particularly for data cleaning and for identifying the statistical significance of data-reporting events. Grubbs' test is often used because it tests one extreme value at a time against the boundaries of the standard normal distribution. In the study area, the test has not been widely applied by other authors, except where Grubbs' test was used to identify outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select forms of genetic algorithm construction that offer more possibilities of extracting the best solution. For freight delivery management, genetic algorithm schemes are used as a more effective technique; accordingly, an adaptable genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor.
The authors suggest a methodology for the multi-objective analysis, which evaluates the collected context data sets and uses this evaluation to determine a delivery corridor for the freight transfer service in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value in the management of multi-modal transportation processes.
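The single-outlier screening step can be sketched as below. The fuel-consumption readings are invented for illustration, and the 99% critical value is an assumed constant taken from published Grubbs tables, not a value computed here.

```python
import math

def grubbs_statistic(data):
    """Grubbs' test statistic: the largest absolute deviation from the
    sample mean, in units of the sample standard deviation, together
    with the index of the most extreme value."""
    n = len(data)
    mean = sum(data) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    g, idx = max((abs(x - mean) / s, i) for i, x in enumerate(data))
    return g, idx

# Hypothetical fuel-consumption readings (L/100 km); the last value is
# a suspected reporting error.
readings = [30.1, 29.8, 30.4, 30.0, 29.9, 30.2, 45.0]
g, idx = grubbs_statistic(readings)
# Approximate two-sided critical value for n = 7 at the 99% confidence
# level -- an assumed constant from published Grubbs' test tables.
G_CRIT_99 = 2.10
print(f"G = {g:.3f}, outlier suspected: {g > G_CRIT_99} (index {idx})")
```

In the methodology, a reading flagged this way would be removed (or re-verified) before the genetic-algorithm stage sees the data; the test is repeated until no value exceeds the critical threshold.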

Keywords: multi-objective, analysis, data flow, freight delivery, methodology

Procedia PDF Downloads 180
25691 Low-Complexity Multiplication Using Complement and Signed-Digit Recoding Methods

Authors: Te-Jen Chang, I-Hui Pan, Ping-Sheng Huang, Shan-Jen Cheng

Abstract:

In this paper, a fast multiplication method utilizing the complement representation and the canonical recoding technique is proposed. By applying complements and canonical recoding, the number of partial products can be reduced. Based on these techniques, we propose an algorithm that provides an efficient multiplication method. On average, our proposed algorithm reduces the number of k-bit additions from (0.25k + logk/k + 2.5) to (k/6 + logk/k + 2.5), where k is the bit length of the multiplicand A and the multiplier B. We can therefore efficiently speed up the overall performance of the multiplication. Moreover, if we use the proposed method to compute common-multiplicand multiplication, the computational complexity can be reduced from (0.5k + 2logk/k + 5) to (k/3 + 2logk/k + 5) k-bit additions.
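Canonical recoding (the non-adjacent form, NAF) is what cuts the partial-product count: every non-zero digit of the recoded multiplier costs one addition or subtraction of the multiplicand. A minimal sketch:

```python
def naf(n):
    """Canonical (non-adjacent form) recoding of a non-negative integer:
    digits in {-1, 0, 1}, least significant first, with no two adjacent
    non-zero digits. Fewer non-zero digits mean fewer partial products."""
    digits = []
    while n > 0:
        if n % 2 == 1:
            d = 2 - (n % 4)      # +1 if n = 1 (mod 4), -1 if n = 3 (mod 4)
            n -= d
        else:
            d = 0
        digits.append(d)
        n //= 2
    return digits

def from_digits(digits):
    """Reassemble the integer from its signed-digit expansion."""
    return sum(d * (1 << i) for i, d in enumerate(digits))

m = 0b11101101          # 237: six non-zero bits in plain binary
recoded = naf(m)
nonzero = sum(1 for d in recoded if d)
print(recoded, from_digits(recoded), nonzero)
```

Here 237 recodes from six non-zero binary digits down to four signed digits, so a shift-and-add multiplier performs four additions/subtractions instead of six; on average the NAF has k/3 non-zero digits versus k/2 for plain binary.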

Keywords: algorithm design, complexity analysis, canonical recoding, public key cryptography, common-multiplicand multiplication

Procedia PDF Downloads 435
25690 Minimization of Denial of Services Attacks in Vehicular Adhoc Networking by Applying Different Constraints

Authors: Amjad Khan

Abstract:

The security of vehicular ad hoc networking is of great importance, as it involves serious threats to life. Thus, to provide secure communication among vehicles on the road, the conventional security system is not enough. It is necessary to prevent the network resources from being wasted and to protect them against malicious nodes, so as to ensure data-bandwidth availability to the legitimate nodes of the network. This work provides a non-conventional security system by introducing constraints that minimize DoS (denial of service) attacks, especially on data and bandwidth. The data packets received by a node in the network pass through a number of tests, and if any of the tests fails, the node drops those data packets and does not forward them any further. Also, if a node claims to be the nearest node for forwarding emergency messages, the sender can effectively identify whether the claim is true or false by using these constraints. Consequently, the DoS attack is minimized through the instant availability of data without wasting the network resources.

Keywords: black hole attack, grey hole attack, intransient traffic tempering, networking

Procedia PDF Downloads 284
25689 Traffic Prediction with Raw Data Utilization and Context Building

Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao

Abstract:

Traffic prediction is essential in a multitude of ways in modern urban life. Earlier work in this domain carries out the investigation with two major focuses: (1) the accurate forecast of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of the raw data and the full context of the prediction timestamp. Concentrating on these two drawbacks of earlier work, we devise an approach that addresses the issues in a two-phase framework. First, we utilize the raw trajectories to a greater extent through building a VLA table and data compression. We obtain the intra-trajectory features with graph-based encoding and the inter-trajectory ones with a grid-based model and the technique of back projection, which restores their surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and to attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information around it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution, which combines the strength of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction.

Keywords: traffic prediction, raw data utilization, context building, data reduction

Procedia PDF Downloads 127
25688 Fuzzy Logic Driven PID Controller for PWM Based Buck Converter

Authors: Bandreddy Anand Babu, Mandadi Srinivasa Rao, Chintala Pradeep Reddy

Abstract:

The main theme of this paper is to design a fuzzy-logic-driven Proportional-Integral-Derivative (PID) controller for a Pulse Width Modulation (PWM) based DC-DC buck converter in continuous conduction mode of operation, comparing the results of the FPID and ANFIS. Simulation is done to fuzzify the given input variables with the membership functions of the input values, create the inference rules linking the input and output variables, and then defuzzify the output variables. Fuzzy logic is simple for nonlinear models like the buck converter. The fuzzy-logic-based PID controller technique is used to control nonlinear plants, like buck converters, through the switching variables of power electronics. The characteristics of the FPID are given in terms of rise time, settling time, and steady-state error for different inputs and load disturbances.
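A minimal single-input sketch of the fuzzify-infer-defuzzify loop used for gain scheduling; the membership breakpoints, rule outputs, and gains below are illustrative assumptions, not the paper's tuned controller.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_kp(error, kp_base=1.0):
    """Hypothetical two-rule base scaling the proportional gain:
    'small error -> gentle gain', 'large error -> aggressive gain',
    combined by weighted-average (centroid-style) defuzzification."""
    e = abs(error)
    mu_small = tri(e, -0.5, 0.0, 0.5)
    mu_large = tri(e, 0.25, 1.0, 2.0)
    w = mu_small + mu_large
    if w == 0.0:
        return kp_base * 2.0              # saturate for very large errors
    return kp_base * (mu_small * 0.5 + mu_large * 2.0) / w

def pid_step(err, integ, prev_err, dt, ki=0.1, kd=0.05):
    """One PID update with the fuzzy-scheduled proportional gain;
    returns the control output (e.g. a PWM duty adjustment) and the
    updated integral state."""
    kp = fuzzy_kp(err)
    integ += err * dt
    deriv = (err - prev_err) / dt
    return kp * err + ki * integ + kd * deriv, integ

print(fuzzy_kp(0.0), fuzzy_kp(1.0))
```

A real FPID for a buck converter would typically fuzzify both the error and its derivative and emit a duty-cycle increment, but the scheduling principle is the same: the rule base reshapes the gains as the operating point moves.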

Keywords: fuzzy logic, PID controller, DC-DC buck converter, pulse width modulation

Procedia PDF Downloads 1013
25687 Wheeled Robot Stable Braking Process under Asymmetric Traction Coefficients

Authors: Boguslaw Schreyer

Abstract:

During the wheeled robot’s braking process, extra dynamic vertical forces act on all wheels: left, right, front, and rear. Those forces are directed downward on the front wheels and upward on the rear wheels. In order to maximize the deceleration, and therefore minimize the braking time and braking distance, we need to calculate a correct torque distribution: the front braking torque should be increased and the rear torque decreased. At the same time, we need to provide better transversal stability. In the simple case of all adhesion coefficients being the same under all wheels, this torque distribution may secure the optimal (maximal) control of the robot’s braking process, securing the minimum braking distance and minimum braking time, while the transversal stability remains relatively good. At all times, we monitor the transversal acceleration; in the case of transversal movement, we stop the braking process and re-apply the braking torque after a defined period of time. If we correctly calculate the value of the torques, we may keep the traction coefficient under the front and rear wheels close to its maximum. Also, in order to provide optimum braking control, we need to calculate the timing of the braking torque application and the timing of its release. The braking torques should be released shortly after the wheels have passed the maximum traction coefficient (while the wheels’ slip increases) and applied again after the wheels pass the maximum traction coefficient (while the slip decreases). A correct braking torque distribution ensures that the front and rear wheels pass this maximum at the same time, guaranteeing optimum deceleration control and therefore minimum braking time. In order to calculate a correct torque distribution, the control unit should receive the input signals of the rear torque value (which changes independently), the robot’s deceleration, and the values of the vertical front and rear forces.
In order to calculate the timing of torque application and release, more signals are needed: the speed of the robot and the angular speed and angular deceleration of the wheels. In the case of different adhesion coefficients under the left and right wheels, but the same within each side (the same under both right wheels and the same under both left wheels), the Select-Low (SL) and Select-High (SH) methods are applied. The SL method is suggested if transversal stability is more important than braking efficiency. Often, for a robot, braking efficiency is more important; therefore, the SH method is applied with some control of the transversal stability. In the case that all adhesion coefficients differ under all wheels, the front-rear torque distribution is maintained as in all previous cases; however, the timing of the braking torque application and release is governed by the lowest adhesion coefficient under the rear wheels. The Lagrange equations have been used to describe the robot dynamics. Matlab has been used to simulate the wheeled robot braking process, and in conclusion the braking methods have been selected.
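The Select-Low/Select-High choice and the load-proportional front-rear split described above can be sketched as follows. This is a sketch of the selection logic only; the function names and the proportional split rule are illustrative assumptions, not the authors' controller.

```python
def axle_adhesion(mu_left, mu_right, policy="SH"):
    """Select-Low budgets the axle's braking torque against the lower of
    the two adhesion coefficients (favouring lateral stability);
    Select-High uses the higher one (favouring braking efficiency)."""
    if policy == "SL":
        return min(mu_left, mu_right)
    if policy == "SH":
        return max(mu_left, mu_right)
    raise ValueError("policy must be 'SL' or 'SH'")

def front_rear_split(total_torque, f_front, f_rear):
    """Distribute the braking torque in proportion to the vertical
    wheel loads, so that the front and rear wheels approach their peak
    traction coefficient at the same time."""
    w = f_front + f_rear
    return total_torque * f_front / w, total_torque * f_rear / w

# Illustrative numbers: split-mu surface, load transferred forward
# during braking.
mu = axle_adhesion(0.3, 0.8, policy="SH")
front, rear = front_rear_split(100.0, 600.0, 400.0)
print(mu, front, rear)
```

With SH the controller brakes up to the high-adhesion wheel's limit and relies on the transversal-acceleration monitor to interrupt braking if the resulting yaw disturbance grows, which matches the trade-off stated in the abstract.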

Keywords: wheeled robots, braking, traction coefficient, asymmetric

Procedia PDF Downloads 165
25686 Seismic Interpretation and Petrophysical Evaluation of SM Field, Libya

Authors: Abdalla Abdelnabi, Yousf Abushalah

Abstract:

The G Formation is a major gas-producing reservoir in the SM Field, eastern Libya. It is called the G limestone because it consists of shallow-marine limestone. Well data and 3D seismic data, in conjunction with the results of a previous study, were used to delineate the hydrocarbon reservoir of the Middle Eocene G Formation in the SM Field area. The data include three-dimensional seismic data acquired in 2009, covering approximately 75 mi² with more than 9 wells penetrating the reservoir. The seismic data are used to identify stratigraphic and structural features, such as channels and faults, which may play a significant role in hydrocarbon trapping. The well data are used for the petrophysical analysis of the SM Field. The average porosity of the Middle Eocene G Formation is very good, reaching 24%, especially around well W6. Average water saturation was calculated for each well from porosity and resistivity logs using Archie’s formula; the field-wide average is 25%. Structural mapping of the top and bottom of the Middle Eocene G Formation revealed that the highest area in the SM Field is at 4800 ft subsea, around wells W4, W5, W6, and W7, and the deepest point is at 4950 ft subsea. Correlation between wells, using well data and structural maps created from the seismic data, revealed that the net thickness of the G Formation ranges from 0 ft in the northern part of the field to 235 ft in the southwest and south. The gas-water contact is found at 4860 ft using the resistivity log. The net isopach map, using both the trapezoidal and pyramid rules, is used to calculate the total bulk volume. The original gas in place and the recoverable gas were calculated volumetrically as 890 and 630 Billion Standard Cubic Feet (BSCF), respectively.
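The two calculations named in the abstract, Archie water saturation and volumetric gas in place, follow standard petrophysical formulas. The sketch below uses illustrative inputs throughout; the resistivity, porosity, net-to-gross, and formation volume factor values are assumptions, not the SM Field data.

```python
def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Archie's formula for water saturation from resistivity logs:
    Sw = ((a * Rw) / (phi^m * Rt))^(1/n), with tortuosity factor a,
    cementation exponent m, and saturation exponent n."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

def ogip_scf(bulk_volume_acre_ft, ntg, phi, sw, bg_rcf_per_scf):
    """Volumetric original gas in place in standard cubic feet:
    43,560 converts acre-ft of net pore volume to cubic feet, and Bg
    converts reservoir volume to surface (standard) volume."""
    return (43560.0 * bulk_volume_acre_ft * ntg * phi
            * (1.0 - sw) / bg_rcf_per_scf)

# Illustrative log readings: Rw = 0.03 ohm-m, Rt = 20 ohm-m, and the
# abstract's average porosity of 24%.
sw = archie_sw(rw=0.03, rt=20.0, phi=0.24)
ogip = ogip_scf(1000.0, 1.0, 0.24, 0.25, 0.004)
print(round(sw, 3), ogip)
```

In the study itself, the bulk volume comes from the net isopach map (trapezoidal and pyramid rules) and Sw from the logs; plugging those field values into the same formula yields the reported 890 BSCF.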

Keywords: 3D seismic data, well logging, petrel, kingdom suite

Procedia PDF Downloads 150
25685 Analysis of Spatial and Temporal Data Using Remote Sensing Technology

Authors: Kapil Pandey, Vishnu Goyal

Abstract:

Spatial and temporal data analysis is well known in the field of satellite image processing. When spatial data are correlated with time-series analysis, significant results are obtained in change detection studies. In this paper, GIS and remote sensing techniques have been used to detect change using time-series satellite imagery of Uttarakhand state during the years 1990-2010. Natural vegetation, urban area, forest cover, etc. were chosen as the main landuse classes to study. Landuse/landcover classes for several years were prepared using satellite images. The maximum likelihood supervised classification technique was adopted in this work; finally, a landuse change index was generated, and graphical models were used to present the changes.
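The post-classification comparison described above reduces to a from-to change matrix and a change index computed from it. A minimal sketch; the two tiny rasters and class names are illustrative, not the Uttarakhand data.

```python
import numpy as np

def change_matrix(lc_t1, lc_t2, classes):
    """Cross-tabulate two classified landuse rasters into a from-to
    change matrix; the diagonal holds the unchanged pixels."""
    k = len(classes)
    idx = {c: i for i, c in enumerate(classes)}
    m = np.zeros((k, k), dtype=int)
    for a, b in zip(lc_t1.ravel(), lc_t2.ravel()):
        m[idx[a], idx[b]] += 1
    return m

def change_index(m):
    """Fraction of pixels whose class changed between the two dates."""
    return 1.0 - np.trace(m) / m.sum()

# Two illustrative 2x2 classified rasters, e.g. 1990 vs 2010.
t1990 = np.array([["forest", "forest"], ["urban", "veg"]])
t2010 = np.array([["forest", "urban"], ["urban", "urban"]])
m = change_matrix(t1990, t2010, ["forest", "urban", "veg"])
print(m)
print(round(change_index(m), 2))
```

Row sums give each class's 1990 extent, column sums its 2010 extent, and off-diagonal cells are the transitions (here, forest-to-urban and vegetation-to-urban), which is exactly the information a landuse change index summarizes.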

Keywords: GIS, landuse/landcover, spatial and temporal data, remote sensing

Procedia PDF Downloads 433
25684 Acceleration of Adsorption Kinetics by Coupling Alternating Current with Adsorption Process onto Several Adsorbents

Authors: A. Kesraoui, M. Seffen

Abstract:

Applications of adsorption onto activated carbon for water treatment are well known. The process has been demonstrated to be widely effective for removing dissolved organic substances from wastewaters, but this treatment has a major drawback: its high operating cost. The main goal of our research work is to improve the retention capacity of Tunisian biomass for the depollution of industrial wastewater and the retention of pollutants considered toxic. The biosorption process is based on the retention of molecules and ions onto a solid surface composed of biological materials. The evaluation of the potential use of these materials is important, as they can be proposed as an alternative to the generally expensive adsorption process used to remove organic compounds. Indeed, these materials are very abundant in nature and low in cost. Certainly, the biosorption process is effective at removing pollutants, but its kinetics are slow. Improving the biosorption rates is a challenge in making this process competitive with oxidation and with adsorption onto lignocellulosic fibers. In this context, alternating current appears as a new, original, and very interesting alternative for the acceleration of chemical reactions. Our main goal is to accelerate the retention of dyes (indigo carmine, methylene blue) and phenol by using this new alternative: alternating current. The adsorption experiments were performed in a batch reactor by adding some of the adsorbents to 150 mL of pollutant solution at the desired concentration and pH. The electrical part of the setup comprises a current source delivering an alternating voltage of 2 to 15 V, connected to a voltmeter that allows us to read the voltage. In a 150 mL cell, we immersed two zinc electrodes, 4 cm apart.
Thanks to the alternating current, we have succeeded in improving the performance of activated carbon by increasing the speed of the indigo carmine adsorption process and reducing the treatment time. On the other hand, we have studied the influence of the alternating current on the biosorption rate of methylene blue onto Luffa cylindrica fibers and onto the hybrid material (Luffa cylindrica-ZnO). The results showed that the alternating current accelerated the biosorption of methylene blue onto both the Luffa cylindrica and the Luffa cylindrica-ZnO hybrid material and increased the adsorbed amount of methylene blue on both adsorbents. In order to improve the removal of phenol, we coupled the alternating current with biosorption onto two adsorbents: Luffa cylindrica and the hybrid material (Luffa cylindrica-ZnO). In fact, the alternating current succeeded in improving the performance of the adsorbents by increasing the speed of the adsorption process and the adsorption capacity, and by reducing the processing time.
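The reported rate acceleration can be quantified by fitting a kinetic model to the uptake curves. The sketch below uses the Lagergren pseudo-first-order model with synthetic data as an assumed illustration; the abstract does not state which kinetic model (and which rate constants) the authors used, so the numbers here only demonstrate the fitting procedure.

```python
import math

def pfo_qt(qe, k1, t):
    """Lagergren pseudo-first-order uptake at time t (mg/g):
    qt = qe * (1 - exp(-k1 * t))."""
    return qe * (1.0 - math.exp(-k1 * t))

def fit_k1(times, qt, qe):
    """Estimate the pseudo-first-order rate constant from the
    linearised form ln(qe - qt) = ln(qe) - k1 * t: the least-squares
    slope of ln(qe - qt) against t is -k1."""
    ys = [math.log(qe - q) for q in qt]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(times, ys))
             / sum((x - mx) ** 2 for x in times))
    return -slope

# Synthetic uptake curves (mg/g): the AC-assisted run assumes a higher
# rate constant, i.e. the same equilibrium capacity reached faster.
times = [5, 10, 20, 40, 60]
no_ac = [pfo_qt(50.0, 0.03, t) for t in times]
with_ac = [pfo_qt(50.0, 0.08, t) for t in times]
print(round(fit_k1(times, no_ac, 50.0), 3),
      round(fit_k1(times, with_ac, 50.0), 3))
```

Comparing the fitted k1 with and without the applied field gives a single number for the acceleration the coupling achieves, independent of the equilibrium capacity.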

Keywords: adsorption, alternating current, dyes, modeling

Procedia PDF Downloads 160
25683 Kinematical Analysis of Tai Chi Chuan Players during Gait and Balance Test and Implication in Rehabilitation Exercise

Authors: Bijad Alqahtani, Graham Arnold, Weijie Wang

Abstract:

Background: Tai Chi Chuan (TCC) is a type of traditional Chinese martial art and is considered beneficial to physical fitness. Advanced motion analysis techniques are routinely used in clinical assessment; however, so far little research has been done on the biomechanical assessment of TCC players in terms of gait and balance using motion analysis. Objectives: The aim of this study was to investigate whether TCC improves lower limb condition and balance ability using state-of-the-art motion analysis technologies, i.e., a motion capture system, electromyography, and force platforms. Methods: Twenty TCC participants (9 male, 11 female), aged 42-77 years and weighing 56.2-119 kg, and eighteen non-TCC participants (7 male, 11 female), aged 43-78 years and weighing 50-110 kg, recruited as an age-matched control group, took part in this study. Their gait and balance data were collected using Vicon Nexus® to obtain the gait parameters and the kinematic parameters of the hip, knee, and ankle joints in three planes for both limbs. Participants stood on force platforms to perform a single-leg balance test. Then, they were asked to walk along a 10 m walkway at their comfortable speed. Participants performed 5 trials of single-leg balance for the dominant side, 3 trials of the four-square step balance test, and 10 trials of walking. From the recorded trials, three good ones were analyzed using the Vicon Plug-in-Gait model to obtain gait parameters, e.g., walking speed, cadence, and stride length, and joint parameters, e.g., joint angles, forces, and moments. Results: When the temporal-spatial variables of the TCC subjects were compared with those of the non-TCC subjects, a significant difference (p < 0.05) was found between the groups.
Moreover, it was observed that the TCC participants have significant differences in ankle, hip, and knee joint kinematics in the sagittal, coronal, and transverse planes, such as the ankle angle (19.90±19.54 deg for TCC vs. 15.34±6.50 deg for non-TCC) and the knee angle (14.96±6.40 deg for TCC vs. 17.63±5.79 deg for non-TCC) in the transverse plane. The results also showed a significant difference between the groups in the single-leg balance test: the TCC participants maintained a single-leg stance for longer (20.85±10.53 s) than the non-TCC group (13.39±8.78 s), while no significant difference was found between the groups in the four-square step balance test. Conclusion: Our results show significant differences between the Tai Chi Chuan and non-Tai Chi Chuan participants in various aspects of the gait analysis and balance tests. On the basis of these biomechanical parameters, such as joint kinematics, gait parameters, and the single-leg stance balance test, Tai Chi Chuan could improve lower limb condition and could reduce the risk of falls for the elderly with ageing.
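The stance-time comparison can be checked from the reported summary statistics alone. Assuming the reported group sizes (n = 20 and n = 18) apply to this comparison, Welch's t-test from means and SDs gives t ≈ 2.38 with roughly 36 degrees of freedom, consistent with significance at p < 0.05 (the two-sided critical t is about 2.03):

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic and (Welch-Satterthwaite) degrees of
    freedom from summary statistics: mean, SD, and n per group."""
    v1, v2 = s1 ** 2 / n1, s2 ** 2 / n2
    t = (m1 - m2) / math.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# Single-leg stance times from the abstract: TCC 20.85 +/- 10.53 s
# (n = 20) vs non-TCC 13.39 +/- 8.78 s (n = 18).
t, df = welch_t(20.85, 10.53, 20, 13.39, 8.78, 18)
print(round(t, 2), round(df, 1))
```

Welch's form is the safe choice here because the two SDs differ noticeably (10.53 vs 8.78), so the equal-variance pooled t-test assumption may not hold.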

Keywords: gait analysis, kinematics, single leg stance, Tai Chi Chuan

Procedia PDF Downloads 127
25682 Determining the Frequency of Pneumonia Emerging in COVID-19 Infection

Authors: Zoirov Amirdin Olimovich, Akbarov Elbek Elmurodovich

Abstract:

Introduction: Pneumonia occurring during COVID-19 infection is common among patients. This research was conducted to determine the frequency of symptoms occurring during this pneumonia. Objective and Task: The goal of our research is to develop clinical concepts of pneumonia occurring during COVID-19 infection. Our main task is to analyze the results of blood tests and understand the progression of the disease. Research Materials and Methods: The research was conducted among patients admitted for inpatient treatment with a diagnosis of COVID-19 to the infectious diseases department of the Tashkent Medical Academy multi-profile clinic. The analyzed patients had an average age of 46, with a total of 48 patients, 23 of whom were female and 25 male. Research Results: The results showed the development of pneumonia within three days of COVID-19 infection in 27 patients. During the observation period, 24 of these patients (88.8%) recovered completely; X-ray examination revealed no signs of pneumonia in those who fully recovered. The remaining three patients showed a persistent form of pneumonia. Conclusion: Pneumonia during COVID-19 infection develops in many patients, and 88.8% of those patients recover completely without any lingering symptoms.

Keywords: COVID-19, pneumonia, the X-ray, blood, TTA

Procedia PDF Downloads 63
25681 Influence of Inertial Forces of Large Bearings Utilized in Wind Energy Assemblies

Authors: S. Barabas, F. Sarbu, B. Barabas, A. Fota

Abstract:

The main objective of this paper is to establish a link between the inertial forces of the bearings used in the construction of wind power plants and the plants' behavior. Using bearings with lower inertial forces has the immediate effect of decreasing the rotor system inertia, with significant gains in energy efficiency due to decreased friction forces between the rollers and raceways. The FEM analysis shows the appearance of uniform contact stress at the ends of the rollers, demonstrating the necessity of producing low-mass bearings. Favorable results are expected in the economic field, by reducing material consumption and by increasing the durability of the bearings. Using low-mass bearings with hollow rollers instead of solid rollers lowers the working temperature, vibrations, and noise. Implementation of cylindrical tubular hollow rollers, instead of expensive rollers with a logarithmic profile, will bring a significant decrease in inertial forces, with large benefits for the behavior of the wind power plant.

Keywords: inertial forces, Von Mises stress, hollow rollers, wind turbine

Procedia PDF Downloads 354
25680 An Empirical Investigation of the Challenges of Secure Edge Computing Adoption in Organizations

Authors: Hailye Tekleselassie

Abstract:

Edge computing is a distributed computing paradigm that brings enterprise applications closer to data sources such as IoT devices or local edge servers, yet potential threats could hinder the adoption of these new technologies. This investigation was carried out to study the awareness of information technology and communications organization workers, and of computer users who rely on cloud services. Surveys were used to achieve these objectives. Questions about trust were also a key component. Problems such as data privacy, integrity, and availability are the factors affecting organizations' acceptance of cloud services.

Keywords: IoT, data, security, edge computing

Procedia PDF Downloads 83