Search results for: data comparison
26517 Utilization of Oat in Rabbit Feed for the Development of Healthier Rabbit Meat and Its Impact on Human Blood Lipid Profile
Authors: Muhammad Rizwan Tariq, Muhammad Issa Khan, Zulfiqar Ahmad, Muhammad Adnan Nasir, Muhammad Sameem Javed, Sheraz Ahmed
Abstract:
Functional foods may be a good tool that can be simply utilized in reducing community health expenses. Regular consumption of rabbit meat can offer consumers bioactive components, because manipulation of the rabbit feed is highly effective in raising the levels of conjugated linoleic acid, eicosapentaenoic acid, docosahexaenoic acid, polyunsaturated fatty acids, selenium, tocopherol, etc., and in reducing the ω-6/ω-3 ratio, which plays a major role in the management of cardiovascular and several other diseases. In comparison to the meats of other species, rabbit meat has higher amounts of protein with essential amino acids, especially in the muscles, and low cholesterol contents, and it also has elevated digestibility. The present study was carried out to develop functional rabbit meat by modifying the feed ingredients of the rabbit diet. Thirty-day-old rabbits were fed with feeds containing 2% and 4% oat. The feeding trial was carried out for eight weeks. Rabbits were divided into three different groups and reared for a period of two months. T0 rabbits were considered the control group, while T1 rabbits were reared on 4% oat and T2 on 2% oat in the feed. At the end of the 8 weeks, the rabbits were slaughtered. The results presented in this study show that 4% oat seed supplementation enhanced n-3 PUFA in the meat. It was observed that oat seed supplementation also reduced the fat percentage in the meat. Utilization of oat in the feed of rabbits significantly affected the pH, protein, fat, texture and concentration of polyunsaturated fatty acids. A study trial was conducted in order to examine the impact of the functional meat on the blood lipid profile of human subjects. They were given rabbit meat, with chicken meat as the comparison, for a period of one month. Cholesterol, triglycerides and low density lipoprotein were found to be lower in the blood serum of the human subject group treated with the 4% oat meat.
Keywords: functional food, functional rabbit meat, meat quality, rabbit
Procedia PDF Downloads 368
26516 Validation Study of Radial Aircraft Engine Model
Authors: Lukasz Grabowski, Tytus Tulwin, Michal Geca, P. Karpinski
Abstract:
This paper presents a radial aircraft engine model created in AVL Boost software. The model is a one-dimensional physical model of the engine, which enables us to investigate the impact of the ignition system design on engine performance (power, torque, fuel consumption). In addition, the model allows research under variable environmental conditions to reflect varied flight conditions (altitude, humidity, cruising speed). Before the simulation research, the model parameters were identified and the model was validated. In order to verify the take-off power of the gasoline radial aircraft engine model, a validation study was carried out. The first stage of the identification was completed with reference to the technical documentation provided by the engine manufacturer and to experiments on a test stand with the real engine. The second stage involved a comparison of the simulation results with the results of engine stand tests performed on a WSK 'PZL-Kalisz'. The engine was loaded by a propeller in a special test bench. Identification of the model parameters involved a comparison of the test results with the simulation in terms of: pressure behind the throttles, pressure in the inlet pipe, the time course of pressure in the first inlet pipe, power, and specific fuel consumption. Accordingly, the required coefficients and the error of the simulation calculations relative to the real-object experiments were determined. The obtained pressure time course and its values are consistent with the experimental results. Additionally, the engine power and specific fuel consumption are largely consistent with the bench tests. The mapping error does not exceed 1.5%, which positively verifies the combustion model and allows us to predict engine performance if the combustion process is modified. Subsequent tests verified the model completely.
The maximum mapping error for the pressure behind the throttles and the inlet pipe pressure is 4%, which proves the model of the inlet duct in the engine with the charging compressor to be correct.
Keywords: 1D-model, aircraft engine, performance, validation
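The 4% acceptance criterion above is a maximum relative mapping error between simulation and measurement; a minimal sketch of how such an error could be computed (the arrays are illustrative, not the paper's measurements):

```python
def mapping_error(simulated, measured):
    """Maximum relative deviation of simulation from measurement, in percent."""
    return max(abs(s - m) / abs(m) for s, m in zip(simulated, measured)) * 100

# Illustrative values only (not the paper's data): inlet pipe pressure in kPa
# at four operating points.
measured = [101.3, 148.0, 190.5, 232.0]
simulated = [100.9, 147.1, 193.2, 235.4]
err = mapping_error(simulated, measured)
print(f"mapping error: {err:.2f}%")  # compared against the 4% acceptance threshold
```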
Procedia PDF Downloads 336
26515 Multi-Objective Optimization for the Green Vehicle Routing Problem: Approach to Case Study of the Newspaper Distribution Problem
Authors: Julio C. Ferreira, Maria T. A. Steiner
Abstract:
The aim of this work is to present a solution procedure, referred to here as Multi-objective Optimization for the Green Vehicle Routing Problem (MOOGVRP), and to provide solutions for a case study. The proposed methodology consists of three stages to resolve Scenario A. Stage 1 consists of the “treatment” of the data; Stage 2 consists of applying mathematical models of the p-Median Capacitated Problem (with the objectives of minimizing distances and homogenizing demands between groups) and the Asymmetric Traveling Salesman Problem (with the objectives of minimizing distances and minimizing time). The weighting method was used as the multi-objective procedure. In Stage 3, an analysis of the results is conducted, taking into consideration the environmental aspects of the case study, more specifically fuel consumption and air pollutant emissions. The methodology was applied to a (partial) database covering newspaper distribution in the municipality of Curitiba, Paraná State, Brazil. The preliminary findings for Scenario A showed that it was possible to improve the distribution of the load and to reduce the mileage and greenhouse gas emissions by 17.32% and the journey time by 22.58% in comparison with the current scenario. The intention for future work is to use other multi-objective techniques and an expanded version of the database, and to explore the triple bottom line of sustainability.
Keywords: Asymmetric Traveling Salesman Problem, Green Vehicle Routing Problem, Multi-objective Optimization, p-Median Capacitated Problem
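The weighted multi-objective procedure mentioned above combines objectives into one scalar to minimize; a minimal weighted-sum sketch, assuming illustrative weights, normalization scales, and route values (none are from the study):

```python
def weighted_objective(distance_km, time_min, w_distance=0.5, w_time=0.5,
                       distance_scale=100.0, time_scale=60.0):
    """Scalarize two normalized objectives into one value to minimize."""
    return (w_distance * (distance_km / distance_scale)
            + w_time * (time_min / time_scale))

# Two hypothetical candidate routes: (total distance in km, total time in min).
routes = {"A": (82.0, 55.0), "B": (95.0, 41.0)}
best = min(routes, key=lambda r: weighted_objective(*routes[r]))
```

With equal weights, route B wins here because its time saving outweighs its extra distance after normalization; changing the weights traces out different compromise solutions.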
Procedia PDF Downloads 112
26514 To Be a Nurse in Turkey: A Comparison Based on International Labour Organization's Nursing Personnel Recommendation
Authors: Arzu K. Harmanci Seren, Feride Eskin Bacaksiz
Abstract:
The shortage of nursing personnel has been considered one of the most important labour force issues in the health sector of developed countries since the early 1970s. The International Labour Organization (ILO) developed standards for the working conditions of nurses in collaboration with the World Health Organization, with the aim of helping to solve the nursing shortage problem all over the world. As a result of this collaboration, the ILO Nursing Personnel Convention (C. 149) and the accompanying Recommendation (R. 157) were adopted in 1977. Turkey, a country with a serious nurse shortage problem, has been a member of the ILO since 1932 but has not signed this convention yet. This study was planned to compare some of the working standards in the Convention with the present working conditions of nurses in Turkey. The data were collected by an online survey between 19 January and 16 February 2015 for this cross-sectional study. Participants were reached through social network accounts in collaboration with nursing associations. In total, 828 nurses from 57 provinces of Turkey participated in the study. The survey consisted of 14 open-ended questions related to the working conditions of nurses and 34 Likert statements related to the nursing policies of the facilities in which they work. The data were analysed using the IBM SPSS 21.0 (licensed to Istanbul University) software. Descriptive and comparative statistics were performed. Most of the participants (81.5%) were staff nurses and 18.5% were manager nurses. Most of them had a baccalaureate (57.9%) or master's (27.4%) degree in nursing. 18.5% of the participants were working in private hospitals, 34.9% in university hospitals and 46.6% in Ministry of Health hospitals. It was found that monthly working schedules were mostly announced 7 days in advance (18%), that the working time of nurses was at least 8 hours (41.5%) and at most 24 hours (22.8%) in a day, and that nurses had 25.18 (SD=16.66) minutes for lunch or dinner and 21.02 (SD=29.25) minutes for resting.
On the other hand, it was determined that 316 (43.2%) nurses did not have time for lunch and 61 (7.9%) of them could not find time to eat anything. It was also found that they were working 15-96 hours in a week (mean=48.28, SD=8.89 hours) and 4-29 days in a month (mean=19.29, SD=5.03 days), and that 597 (72%) nurses had worked overtime ranging from 1 hour to 150 hours (mean=32.80, SD=23.42 hours) in the month before the surveys were filled in. Most of the participants did not take leave due to sickness (47.5%) even if they felt sick. Most of them also did not take leave for any personal excuse (67.2%) or for education (57.3%). This study is significant because nurses from many different provinces participated and it provides brief information about the working conditions of nurses nationwide. It was found that nurses in Turkey were working under worse conditions than the International Labour Organization's recommendations.
Keywords: nurse, international labour organization, recommendations for nurses, working conditions
Procedia PDF Downloads 254
26513 Data Security and Privacy Challenges in Cloud Computing
Authors: Amir Rashid
Abstract:
Cloud computing frameworks empower organizations to cut expenses by outsourcing computation resources on demand. At present, customers of cloud service providers have no means of verifying the privacy and ownership of their information and data. To address this issue we propose a Trusted Cloud Computing Platform (TCCP). The TCCP enables Infrastructure as a Service (IaaS) providers, for example Amazon EC2, to offer a closed-box execution environment that guarantees confidential execution of guest virtual machines. It also permits clients to attest to the IaaS provider and determine whether the service is secure before they launch their virtual machines. This paper proposes a TCCP for guaranteeing the privacy and integrity of computed data that are outsourced to IaaS service providers. The TCCP provides the abstraction of a closed-box execution environment for a client's VM, ensuring that no privileged administrator at the cloud provider can inspect or tamper with its data. Furthermore, before launching the VM, the TCCP permits a client to reliably and remotely verify that the provider at the backend is running a trusted TCCP. This capability extends the attestation to the whole service, and hence permits a client to verify that its data operations run in a secure mode.
Keywords: cloud security, IaaS, cloud data privacy and integrity, hybrid cloud
Procedia PDF Downloads 299
26512 The Comparison between Resistance and Aerobic Exercise Training on Metabolic Syndrome Components in Overweight Sedentary Female
Authors: Marzieh Sayyad, Mohsen Salesi
Abstract:
Metabolic syndrome (MetS), a collection of cardiometabolic risk factors, is linked to the development of cardiovascular disease (CVD) and diabetes. The prevalence of MetS is on the rise, with more women affected than men. The goal of this study was to compare the effects of resistance and aerobic exercise training on metabolic syndrome components in non-athlete, middle-aged women. 51 non-athlete overweight females participated voluntarily in this study. Participants were divided randomly into three groups: a resistance, an aerobic and a control group (17 per group). 24 hours before the beginning of the training program, a blood sample was taken in the fasting state. The two training groups participated in sport activities for eight weeks, three times a week, with sessions of 60-90 minutes. Two days after the end of the 8th week, all the measurements were repeated as in the pretest phase. The data were analyzed using one-way analysis of variance. The results showed that aerobic exercise training significantly decreased weight (p=.05), triglycerides (p<0.01) and systolic blood pressure (p<0.02), while HDL-c (p<0.01) was significantly increased. In the resistance exercise training group, TG also decreased significantly (p<0.01) and HDL-c (p<0.05) was significantly increased. This study demonstrated that a regular physical activity program improved several metabolic and physiological parameters in healthy, previously sedentary subjects with the metabolic syndrome. In conclusion, it seems that this type of training can be an efficient, safe and inexpensive way to reduce and prevent the metabolic syndrome.
Keywords: aerobic exercise, metabolic syndrome, overweight sedentary female, resistance exercise
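The one-way analysis of variance used here can be computed by hand; a sketch of the F statistic with invented group values (not the study's data):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of samples (lists of floats)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: spread of group means around the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: spread of observations around their group mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented triglyceride changes (mg/dL) for aerobic, resistance, control groups.
aerobic = [-30.0, -25.0, -28.0]
resistance = [-20.0, -22.0, -18.0]
control = [-2.0, 1.0, -1.0]
f_stat = one_way_anova_f([aerobic, resistance, control])
```

The F statistic would then be compared against the F distribution with (k-1, n-k) degrees of freedom to obtain a p-value, as SPSS does internally.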
Procedia PDF Downloads 305
26511 Risk of Fractures at Different Anatomic Sites in Patients with Irritable Bowel Syndrome: A Nationwide Population-Based Cohort Study
Authors: Herng-Sheng Lee, Chi-Yi Chen, Wan-Ting Huang, Li-Jen Chang, Solomon Chih-Cheng Chen, Hsin-Yi Yang
Abstract:
A variety of gastrointestinal disorders, such as Crohn’s disease, ulcerative colitis, and coeliac disease, are recognized as risk factors for osteoporosis and osteoporotic fractures. One recent study suggests that individuals with irritable bowel syndrome (IBS) might also be at increased risk of osteoporosis and osteoporotic fractures. Up to now, the association between IBS and the risk of fractures at different anatomic sites has not been completely clear. We conducted a population-based cohort analysis to investigate the fracture risk of IBS in comparison with a non-IBS group. We identified 29,505 adults aged ≥ 20 years with newly diagnosed IBS using the Taiwan National Health Insurance Research Database in 2000-2012. A comparison group was constructed of patients without IBS who were matched according to gender and age. The occurrence of fracture was monitored until the end of 2013. We analyzed the risk of fracture events in IBS by using Cox proportional hazards regression models. Patients with IBS had a higher incidence of osteoporotic fractures than the non-IBS group (12.34 versus 9.45 per 1,000 person-years) and an increased risk of osteoporotic fractures (adjusted hazard ratio [aHR] = 1.27, 95% confidence interval [CI] = 1.20-1.35). Site-specific analysis showed that the IBS group had a higher risk of fractures of the spine, forearm, hip and hand than did the non-IBS group. With further stratification by gender and age, a higher aHR for osteoporotic fractures in the IBS group was seen across all age groups in males, but only in the elderly among females. In addition, female gender, old age, low income, hypertension, coronary artery disease, cerebrovascular disease, and depressive disorders were identified as independent osteoporotic fracture risk factors in IBS patients.
IBS is considered a risk factor for osteoporotic fractures, particularly in female individuals and for fracture sites located at the spine, forearm, hip and hand.
Keywords: irritable bowel syndrome, fracture, gender difference, longitudinal health insurance database, public health
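The per-1,000 person-year incidence rates reported above follow directly from event counts and follow-up time; a sketch with hypothetical counts chosen only to reproduce the reported rates (the paper's aHR of 1.27 is additionally covariate-adjusted via Cox regression, which this crude ratio is not):

```python
def incidence_rate_per_1000(events, person_years):
    """Crude incidence rate per 1,000 person-years of follow-up."""
    return events / person_years * 1000

# Hypothetical event counts and follow-up totals (not from the paper),
# chosen so the rates match the reported 12.34 and 9.45 per 1,000 py.
ibs_rate = incidence_rate_per_1000(2184, 177000)
control_rate = incidence_rate_per_1000(3344, 354000)
crude_rate_ratio = ibs_rate / control_rate  # crude analogue of the hazard ratio
```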
Procedia PDF Downloads 229
26510 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record
Authors: Raghavi C. Janaswamy
Abstract:
In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structure restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. The patient conditions have been predicted as a node classification task using graph-based open source EHR data, the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged because it is voluminous and a closer representation of real-time data. The graph model is built from the EHR heterogeneous data using Python modules, namely, pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate the node embeddings and eventually performing the node classifications using the node embeddings. The model predicts patient conditions ranging from common to rare situations. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
Keywords: electronic health record, graph neural network, heterogeneous data, prediction
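The aggregation idea behind a GNN layer can be illustrated without PyG; a single mean message-passing step in plain Python over a toy graph (learned weights, embeddings, and the classification head are omitted):

```python
# One unweighted mean-aggregation message-passing step: each node's new
# feature vector is the average of its neighbors' (and its own) features.
def message_passing_step(features, edges):
    n = len(features)
    neighbors = {i: {i} for i in range(n)}  # self-loop so a node keeps its own signal
    for u, v in edges:
        neighbors[u].add(v)
        neighbors[v].add(u)
    dim = len(features[0])
    out = []
    for i in range(n):
        nbrs = neighbors[i]
        out.append([sum(features[j][d] for j in nbrs) / len(nbrs)
                    for d in range(dim)])
    return out

# Toy "patient-condition" graph: 3 nodes with 2-dimensional features.
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
edges = [(0, 1), (1, 2)]
smoothed = message_passing_step(feats, edges)
```

Real GNN layers apply a learned linear transform before or after this aggregation and stack several such steps; the sketch only shows how node information diffuses along edges.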
Procedia PDF Downloads 86
26509 A Proposal to Tackle Security Challenges of Distributed Systems in the Healthcare Sector
Authors: Ang Chia Hong, Julian Khoo Xubin, Burra Venkata Durga Kumar
Abstract:
Distributed systems offer many benefits to the healthcare industry. From big data analysis to business intelligence, the increased computational power and efficiency of distributed systems serve as an invaluable resource for the healthcare sector to utilize. However, as the usage of these distributed systems increases, many issues arise. The main focus of this paper is on security issues. Many security issues stem from distributed systems in the healthcare industry, particularly information security. People's data are especially sensitive in the healthcare industry. If important information gets leaked (e.g. IC number, credit card number, address), a person's identity, financial status, and safety might be compromised. This results in the responsible organization losing a lot of money in compensating these people, and even more resources are expended trying to fix the fault. Therefore, a framework for a blockchain-based data management system for healthcare was proposed. In this framework, the usage of a blockchain network is explored to store the encryption key of the patient's data. The actual data is encrypted, and the resulting ciphertext is stored in a cloud storage platform. Furthermore, some issues have to be emphasized and tackled for future improvements, such as a multi-user scheme that could be proposed, authentication issues that have to be tackled, or migrating the backend processes into the blockchain network. Due to the nature of blockchain technology, the data will be tamper-proof, and its read-only function can only be accessed by authorized users such as doctors and nurses. This guarantees the confidentiality and immutability of the patient's data.
Keywords: distributed, healthcare, efficiency, security, blockchain, confidentiality and immutability
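The tamper-evidence the framework relies on comes from hash-linking blocks; a minimal sketch of a chain that records references to encryption keys while the ciphertext itself stays off-chain in cloud storage (the block fields are illustrative, not the proposed system's schema):

```python
import hashlib
import json

# Each block stores only a key reference; the encrypted patient data
# (ciphertext) lives in cloud storage, not on the chain.
def make_block(prev_hash, key_id, patient_id):
    payload = {"prev": prev_hash, "key_id": key_id, "patient": patient_id}
    digest = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()).hexdigest()
    return {**payload, "hash": digest}

def chain_is_valid(chain):
    """Recompute every hash and check each block links to its predecessor."""
    for i, block in enumerate(chain):
        payload = {k: block[k] for k in ("prev", "key_id", "patient")}
        expected = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()).hexdigest()
        if expected != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block("0" * 64, "key-001", "patient-A")]
chain.append(make_block(chain[-1]["hash"], "key-002", "patient-B"))
```

Altering any recorded field breaks the recomputed hash, which is the property the abstract calls tamper-proofing; a real deployment would add consensus, access control, and proper key management on top.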
Procedia PDF Downloads 184
26508 Erectile Dysfunction among Bangladeshi Men with Diabetes
Authors: Shahjada Selim
Abstract:
Background: Erectile dysfunction (ED) is an important impediment to the quality of life of men. ED is approximately three times more common in diabetic than in non-diabetic men, and diabetic men develop ED earlier than age-matched non-diabetic subjects. Glycemic control and other factors may contribute to developing and/or deteriorating ED. Aim: The aim of the study was to determine the prevalence of ED and its risk factors in type 2 diabetic (T2DM) men in Bangladesh. Methods: During 2013-2014, 3980 diabetic men aged 30-69 years were interviewed at the out-patient departments of seven diabetic centers in Dhaka using the validated Bengali version of the questionnaire of the International Index of Erectile Function (IIEF) for evaluation of baseline erectile function (EF). The indexes indicate a very high correlation between the items, and the questionnaire is consistently reliable. Data were analyzed with the Chi-squared (χ²) test using SPSS software. P ≤ 0.05 was considered significant. Results: Out of 3790, ED was found in 2046 (53.98%) of T2DM men. The prevalence of ED increased with age, from 10.5% in men aged 30-39 years to 33.6% in those aged over 60 years (P < 0.001). The prevalence of ED was higher in those with diabetes of 6-11 years (35.3%) and of 12-30 years (42.5%) than in patients with reported diabetes lasting ≤ 5 years (26.4%) (P < 0.001). ED increased significantly in those who had poor glycemic control. The prevalence of ED in patients with good, fair and poor glycemic control was 22.8%, 42.5% and 47.9%, respectively (P = 0.004). Treatment modalities (medical nutrition therapy, oral agents, insulin, and insulin plus oral agents) had a significant association with ED and its severity (P < 0.001). Conclusion: The prevalence of ED is very high among T2DM men in Bangladesh, and the burden can be reduced by improving glycemic status.
Glycemic control, duration of diabetes, treatment modalities, and increasing age are associated with ED.
Keywords: erectile dysfunction, diabetes, men, Bangladesh
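The Chi-squared (χ²) analysis referred to above can be sketched for a 2x2 table; the counts below are hypothetical, not the study's data:

```python
def chi_squared_2x2(a, b, c, d):
    """Pearson chi-squared statistic (no continuity correction) for a 2x2
    contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: rows = good vs poor glycemic control,
# columns = ED present vs absent.
chi2 = chi_squared_2x2(57, 193, 479, 521)
```

The statistic would then be compared against the χ² distribution with 1 degree of freedom; a value this large corresponds to P well below 0.05.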
Procedia PDF Downloads 265
26507 Design and Implementation of a Geodatabase and WebGIS
Authors: Sajid Ali, Dietrich Schröder
Abstract:
The merging of the internet and the Web has created many disciplines, and Web GIS is one of these disciplines, dealing with geospatial data in a proficient way. Web GIS technologies have provided easy access to and sharing of geospatial data over the internet. However, the European Caribbean Association (Europäische Karibische Gesellschaft, EKG) lacks a single platform for easy, shared access to its data to assist its members and the wider research community. The technique presented in this paper deals with the design of a geodatabase using PostgreSQL/PostGIS as an object-relational database management system (ORDBMS) for competent dissemination and management of spatial data, and of a Web GIS using OpenGeo Suite for the fast sharing and distribution of the data over the internet. The characteristics of the required design for the geodatabase have been studied, and a specific methodology is given for the purpose of designing the Web GIS. Finally, validation of this web-based geodatabase has been performed with two desktop GIS packages and a web map application, and it is discussed how the contribution provides all the desired modules to expedite further research in the area as per the requirements.
Keywords: desktop GIS software, European Caribbean Association, geodatabase, OpenGeo Suite, PostgreSQL/PostGIS, WebGIS, web map application
Procedia PDF Downloads 341
26506 Integration of “FAIR” Data Principles in Longitudinal Mental Health Research in Africa: Lessons from a Landscape Analysis
Authors: Bylhah Mugotitsa, Jim Todd, Agnes Kiragga, Jay Greenfield, Evans Omondi, Lukoye Atwoli, Reinpeter Momanyi
Abstract:
The INSPIRE network aims to build an open, ethical, sustainable, and FAIR (Findable, Accessible, Interoperable, Reusable) data science platform, particularly for longitudinal mental health (MH) data. While studies have been done at the clinical and population levels, limitations still exist in data and research in LMICs, which pose a risk of underrepresentation of mental disorders. It is vital to examine the existing longitudinal MH data, focusing on how FAIR the datasets are. This landscape analysis aimed to provide both an overall level of evidence of the availability of longitudinal datasets and the degree of consistency in the longitudinal studies conducted. Utilizing prompts proved instrumental in streamlining the analysis process, facilitating access, crafting code snippets, and the categorization and analysis of extensive data repositories related to depression, anxiety, and psychosis in Africa. Leveraging artificial intelligence (AI), we filtered through over 18,000 scientific papers spanning from 1970 to 2023. This AI-driven approach enabled the identification of 228 longitudinal research papers meeting the inclusion criteria. Quality assurance revealed 10% incorrectly identified articles and 2 duplicates, underscoring the prevalence of longitudinal MH research in South Africa, focusing on depression. From the analysis, evaluating data and metadata adherence to FAIR principles remains crucial for enhancing the accessibility and quality of MH research in Africa. While AI has the potential to enhance research processes, challenges such as privacy concerns and data security risks must be addressed. Ethical and equity considerations in data sharing and reuse are also vital. There is a need for collaborative efforts across disciplinary and national boundaries to improve the Findability and Accessibility of data. Current efforts should also focus on creating integrated data resources and tools to improve the Interoperability and Reusability of MH data.
Practical steps for researchers include careful study planning, data preservation, machine-actionable metadata, and promoting data reuse to advance science and improve equity. Metrics and recognition should be established to incentivize adherence to FAIR principles in MH research.
Keywords: longitudinal mental health research, data sharing, fair data principles, Africa, landscape analysis
Procedia PDF Downloads 91
26505 Comparison Between the Radiation Resistance of n/p and p/n InP Solar Cell
Authors: Mazouz Halima, Belghachi Abdrahmane
Abstract:
The effects of electron irradiation-induced deep level defects have been studied on both n/p and p/n indium phosphide solar cells with very thin emitters. The simulation results show that the n/p structure offers a somewhat better short circuit current, but the p/n structure offers an improved open circuit voltage, not only before electron irradiation but also after 1 MeV electron irradiation with a fluence of 5×10¹⁵. The simulation also shows that the n/p solar cell structure is more radiation resistant than the p/n structure.
Keywords: InP solar cell, p/n and n/p structure, electron irradiation, output parameters
Procedia PDF Downloads 550
26504 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads
Authors: Gaurav Kumar Sinha
Abstract:
In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments.
It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.
Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies
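Workload placement and cost modeling of the kind surveyed above reduce, in the simplest case, to scoring candidate transfer routes on cost and time; a sketch with invented route characteristics, weights, and scales (none tied to any real provider's pricing):

```python
# Hypothetical per-route characteristics for moving a dataset between clouds:
# egress cost ($/GB) and effective throughput (Gbps); all values illustrative.
routes = {
    "direct":    {"cost_per_gb": 0.09, "throughput_gbps": 1.0},
    "via_edge":  {"cost_per_gb": 0.05, "throughput_gbps": 0.6},
    "dedicated": {"cost_per_gb": 0.12, "throughput_gbps": 5.0},
}

def route_score(dataset_gb, route, cost_weight=0.5, time_weight=0.5,
                cost_scale=100.0, time_scale=600.0):
    """Lower is better: weighted, normalized combination of cost and time."""
    cost = dataset_gb * route["cost_per_gb"]
    seconds = dataset_gb * 8 / route["throughput_gbps"]  # GB -> gigabits
    return cost_weight * cost / cost_scale + time_weight * seconds / time_scale

best = min(routes, key=lambda r: route_score(500, routes[r]))
```

For a 500 GB dataset and these weights, the fast dedicated link wins despite its higher per-GB price; shifting the weights toward cost would favor the edge route instead.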
Procedia PDF Downloads 68
26503 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study
Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos
Abstract:
This article presents a method to analyze the use of indoor spaces based on data analytics obtained from inbuilt digital devices. The study uses the data generated by in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights into space occupancy, user behaviour, and comfort. Those devices, originally installed to facilitate remote operations, report data through the internet that the research uses to analyze information on real-time human use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable solution to analyze building interior spaces without incorporating external data collection systems such as sensors. The methodology is applied to a real case study of coliving, a residential building of 3000 m², 7 floors, and 80 users in the centre of Madrid. The case study applies the method to classify the IoT devices and to assess, clean, and analyze the collected data based on the analysis framework. The information is collected remotely, through the different devices' platforms. The first step is to curate the data and understand what insights can be provided by each device according to the objectives of the study; this generates an analysis framework that can be scaled up for future building assessment, even beyond the residential sector. The method adjusts the parameters to be analyzed to the dataset available in the IoT network of each building. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.
Keywords: in-place devices, IoT, human-centred data-analytics, spatial design
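Occupancy insights of the kind described can be derived from paired smart-lock events; a sketch with a hypothetical log format (the timestamps, user IDs, and event names are assumptions, not the study's actual device schema):

```python
from datetime import datetime

# Hypothetical smart-lock log as the in-place IoT devices might report it:
# (ISO timestamp, user, event) tuples.
events = [
    ("2021-03-01T08:02:00", "user1", "unlock"),
    ("2021-03-01T08:45:00", "user1", "lock"),
    ("2021-03-01T09:10:00", "user2", "unlock"),
    ("2021-03-01T11:30:00", "user2", "lock"),
]

def occupancy_minutes(events):
    """Total minutes each user kept the space open, from unlock/lock pairs."""
    open_since, totals = {}, {}
    for ts, user, kind in events:
        t = datetime.fromisoformat(ts)
        if kind == "unlock":
            open_since[user] = t
        elif kind == "lock" and user in open_since:
            delta = t - open_since.pop(user)
            totals[user] = totals.get(user, 0.0) + delta.total_seconds() / 60
    return totals

usage = occupancy_minutes(events)
```

Aggregating such per-user durations by room, floor, or hour of day is the kind of curation step the analysis framework above would formalize.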
Procedia PDF Downloads 197
26502 A Unique Multi-Class Support Vector Machine Algorithm Using MapReduce
Authors: Aditi Viswanathan, Shree Ranjani, Aruna Govada
Abstract:
With data sizes constantly expanding, and with classical machine learning algorithms that analyze such data requiring larger and larger amounts of computation time and storage space, the need to distribute computation and memory requirements among several computers has become apparent. Although substantial work has been done on distributed binary SVM algorithms and on multi-class SVM algorithms individually, the field of distributed multi-class SVMs remains largely unexplored. This research seeks to develop an algorithm that implements the Support Vector Machine over a multi-class data set and is efficient in a distributed environment. For this, we recursively choose the best binary split of a set of classes using a greedy technique, much like the divide-and-conquer approach. Our algorithm has shown better computation time during the testing phase than the traditional sequential SVM methods (One vs. One, One vs. Rest) and outperforms them as the size of the data set grows. This approach also classifies the data with higher accuracy than the traditional multi-class algorithms. Keywords: distributed algorithm, MapReduce, multi-class, support vector machine
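The greedy recursive class-splitting can be sketched as follows, using the distance between group centroids as a stand-in separability criterion. The criterion and the data layout are assumptions for the sketch; the paper's algorithm would train a binary SVM at each node of the resulting split tree.

```python
# Sketch: recursively pick the binary split of the class set whose two
# groups' centroids are farthest apart, yielding a tree of binary problems.
from itertools import combinations

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def best_split(class_means):
    """Greedy choice: the 2-way partition with maximal centroid separation."""
    labels = sorted(class_means)
    first, rest = labels[0], labels[1:]
    best, best_sep = None, -1.0
    for r in range(len(rest)):            # fix `first` on the left to avoid
        for combo in combinations(rest, r):  # enumerating mirror partitions
            left = (first,) + combo
            right = tuple(l for l in labels if l not in left)
            sep = dist2(centroid([class_means[l] for l in left]),
                        centroid([class_means[l] for l in right]))
            if sep > best_sep:
                best, best_sep = (left, right), sep
    return best

def split_tree(class_means):
    """Nested tuples describing the binary-SVM tree over the classes."""
    if len(class_means) == 1:
        return next(iter(class_means))
    left, right = best_split(class_means)
    return (split_tree({l: class_means[l] for l in left}),
            split_tree({l: class_means[l] for l in right}))

classes = {"A": (0.0, 0.0), "B": (0.0, 1.0), "C": (10.0, 10.0)}
print(split_tree(classes))  # nearby classes A and B pair off before facing C
```

In a MapReduce setting, the per-class means would be computed in a map/reduce pass and each node of this tree would become an independent distributed binary SVM training job.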
Procedia PDF Downloads 401
26501 Outcome of Unilateral Retinoblastoma: A Ten Years Experience of Children's Cancer Hospital Egypt
Authors: Ahmed Elhussein, Hossam El-Zomor, Adel Alieldin, Mahmoud A. Afifi, Abdullah Elhusseiny, Hala Taha, Amal Refaat, Soha Ahmed, Mohamed S. Zagloul
Abstract:
Background: A majority of children with retinoblastoma (60%) have disease in one eye only (unilateral disease). This is a retrospective study to evaluate two different treatment modalities in those patients for saving their lives and vision. Methods: Four hundred and four patients were diagnosed with unilateral intraocular retinoblastoma at Children's Cancer Hospital Egypt (CCHE) from July 2007 until December 2017. Management strategies included primary enucleation versus ocular salvage treatment. Results: Patients presented at a mean age of 24.5 months (range 1.2-154.3 months). According to the international retinoblastoma classification, group D (n=172, 42%) was the most common, followed by group E (n=142, 35%), group C (n=63, 16%), and group B (n=27, 7%). All patients were alive at the end of the study except four who died, for a 5-year overall survival of 98.3% [CI, (96.5-100%)]. Patients presenting with advanced disease and poor visual prognosis (n=241, 59.6%) underwent primary enucleation, with 6 cycles of adjuvant chemotherapy if they had high-risk features in the enucleated eye; only four patients out of 241 ended up with either extraocular metastasis (n=3) or death (n=1). Systemic chemotherapy and focal therapy were the primary treatment for those who presented with favorable disease status and good visual prognosis (n=163, 40.4%); seventy-seven of them (47%) ended up with a pre-defined event (enucleation, EBRT, off-protocol chemotherapy, or secondary malignancy). Ocular survival for patients who received primary chemotherapy plus focal therapy was 50.9% (CI, 43.5-59.6%) at 3 years and 46.9% (CI, 39.3-56%) at 5 years. Comparison between upfront enucleation and primary chemotherapy for the occurrence of extraocular metastasis revealed no statistical difference between them except in group D (p value). For the occurrence of death, there was no statistical difference in any classification group.
Conclusion: In retinoblastoma, primary chemotherapy is a reasonable option with a good probability of ocular salvage, without increasing the risk of metastasis compared to upfront enucleation, except in group D. Keywords: CCHE, chemotherapy, enucleation, retinoblastoma
Procedia PDF Downloads 155
26500 Information Management Approach in the Prediction of Acute Appendicitis
Authors: Ahmad Shahin, Walid Moudani, Ali Bekraki
Abstract:
This research presents a predictive data mining model for the accurate diagnosis of acute appendicitis, with the aims of maximizing health service quality, minimizing morbidity/mortality, and reducing cost. Acute appendicitis is a very common disease that requires timely, accurate diagnosis and surgical intervention. Although its treatment is simple and straightforward, its diagnosis is still difficult because no single sign, symptom, laboratory test, or imaging examination accurately confirms the diagnosis in all cases; this contributes to increased morbidity and negative appendectomy. In this study, the authors propose to generate an accurate model for predicting patients with acute appendicitis, based, firstly, on a segmentation technique combined with the ABC algorithm to segment the patients; secondly, on applying fuzzy logic to process the massive volume of heterogeneous and noisy data (age, sex, fever, white blood cell count, neutrophilia, CRP, urine, ultrasound, CT, appendectomy, etc.) in order to express knowledge and analyze the relationships among the data in a comprehensive manner; and thirdly, on applying a dynamic programming technique to reduce the number of data attributes. The proposed model is evaluated against a set of benchmark techniques and on a set of benchmark classification problems on osteoporosis, diabetes, and heart disease obtained from the UCI repository and other data sources. Keywords: healthcare management, acute appendicitis, data mining, classification, decision tree
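The fuzzy-logic step can be illustrated for a single noisy attribute. The trapezoidal membership functions and their breakpoints below are invented for the sketch and are not clinical thresholds from the study.

```python
# Toy fuzzification of one attribute: white blood cell count (10^3 cells/uL).
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 below a, rises to 1 on [b, c], 0 above d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def wbc_fuzzy(wbc):
    """Degrees of membership of a WBC value in three linguistic terms."""
    return {
        "normal":   trapezoid(wbc, 3.0, 4.5, 10.0, 12.0),
        "elevated": trapezoid(wbc, 10.0, 12.0, 16.0, 18.0),
        "high":     trapezoid(wbc, 16.0, 18.0, 40.0, 50.0),
    }

print(wbc_fuzzy(11.0))  # a borderline value belongs partly to two terms
```

Fuzzy rules over such memberships (combined across age, fever, CRP, etc.) then express the diagnostic knowledge despite noisy, heterogeneous inputs.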
Procedia PDF Downloads 351
26499 Transcriptome Analysis of Protestia brevitarsis seulensis with Focus On Wing Development and Metamorphosis in Developmental Stages
Authors: Jihye Hwang, Eun Hwa Choi, Su Youn Baek, Bia Park, Gyeongmin Kim, Chorong Shin, Joon Ha Lee, Jae-Sam Hwang, Ui Wook Hwang
Abstract:
White-spotted flower chafers are widely distributed in Asian countries and traditionally used for the treatment of chronic fatigue, poor blood circulation, and paralysis in oriental medicine. The evolution and development of insect wings and metamorphosis remain under-explored subjects in arthropod evolutionary research. Gene expression abundance analyses across developmental stages based on large-scale RNA-seq data are also still rarely done. Here we report the de novo assembly of a Protestia brevitarsis seulensis transcriptome across four developmental stages (egg, larva, pupa, and adult) to explore the development and evolution of its wings and metamorphosis. The de novo transcriptome assembly consists of 23,551 high-quality transcripts and is approximately 96.7% complete. Out of 8,545 transcripts, 5,183 correspond to possible orthologs with Drosophila melanogaster. As a result, we found 265 genes related to wing development and 19 genes related to metamorphosis. The comparison of transcript expression abundance across developmental stages revealed stage-specific transcripts, especially those acting during wing development and metamorphosis of P. b. seulensis. This transcriptome quantification along the developmental stages may provide meaningful clues to elucidate the genetic modulation mechanisms of wing development and metamorphosis acquired during insect evolution. Keywords: white-spotted flower chafers, transcriptomics, RNA-seq, network biology, wing development, metamorphosis
Procedia PDF Downloads 229
26498 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery
Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene
Abstract:
Data flows and the purposes of reporting the data differ and depend on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form the dataset constructed for each time point, containing all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrated methodological approach must be developed to respond to the indicated problem. The proposed methodology contains several steps: (1) collecting context data sets and validating the data; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study involves Grubbs outlier analysis, particularly for data cleaning and for identifying the statistical significance of data-reporting event cases. The Grubbs test is often used because it tests one extreme value at a time for exceeding the boundaries of the standard normal distribution. In the study area, the test has not been widely applied by other authors, except where the Grubbs test for outlier detection was used to identify outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select forms of genetic algorithm construction that have more possibilities of extracting the best solution. For freight delivery management, genetic algorithm structural schemas are used as a more effective technique. Accordingly, an adaptable genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor.
The authors suggest a methodology for the multi-objective analysis which evaluates the collected context data sets and uses this evaluation to determine a delivery corridor for freight transfer services in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value in the management of multi-modal transportation processes. Keywords: multi-objective, analysis, data flow, freight delivery, methodology
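The Grubbs screen used in the data-validation step can be sketched as follows. The fuel-consumption numbers are invented, and the critical value is an approximate tabulated figure for n = 8 at the 99% confidence level, quoted here as an assumption (exact values come from Student-t tables).

```python
# Grubbs' test statistic: the largest standardized deviation from the mean.
from statistics import mean, stdev

def grubbs_statistic(data):
    """Return (G, index) for the most extreme value in the sample."""
    m, s = mean(data), stdev(data)
    g, idx = max((abs(x - m) / s, i) for i, x in enumerate(data))
    return g, idx

fuel = [10.2, 10.4, 9.9, 10.1, 10.3, 10.0, 10.2, 14.8]  # litres/100 km
g, idx = grubbs_statistic(fuel)
G_CRIT = 2.27  # approximate two-sided critical value, n = 8, alpha = 0.01
if g > G_CRIT:
    print(f"outlier at index {idx}: {fuel[idx]}")
```

In the methodology, a flagged reading would be removed (or re-examined) before the cleaned dataset feeds the multi-objective genetic algorithm; because Grubbs tests one value at a time, the screen is repeated until no value exceeds the critical bound.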
Procedia PDF Downloads 180
26497 Minimization of Denial of Services Attacks in Vehicular Adhoc Networking by Applying Different Constraints
Authors: Amjad Khan
Abstract:
The security of vehicular ad hoc networking is of great importance, as it involves serious threats to life. Thus, to provide secure communication amongst vehicles on the road, the conventional security system is not enough. It is necessary to protect the network resources from wastage and from malicious nodes, so as to ensure data bandwidth availability to the legitimate nodes of the network. This work provides a non-conventional security system by introducing constraints to minimize DoS (denial of services) attacks, especially on data and bandwidth. The data packets received by a node in the network pass through a number of tests, and if any of the tests fails, the node drops those data packets and does not forward them any further. Also, if a node claims to be the nearest node for forwarding emergency messages, the sender can effectively identify whether the claim is true or false by using these constraints. Consequently, the DoS attack is minimized by the instant availability of data without wasting network resources. Keywords: black hole attack, grey hole attack, intransient traffic tempering, networking
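The chain-of-tests idea can be sketched as below. The specific constraints (a per-sender rate limit, a hop-count bound, and a geographic plausibility check on a claimed position) are illustrative assumptions, not the exact checks of the proposed system.

```python
# Each received packet must pass every constraint, or it is dropped.
import time

MAX_HOPS = 10
MAX_PACKETS_PER_SEC = 50
_last_seen = {}  # sender -> (window_start, count)

def rate_ok(pkt, now=None):
    """Drop floods: cap packets accepted per sender per second."""
    now = time.time() if now is None else now
    start, count = _last_seen.get(pkt["sender"], (now, 0))
    if now - start > 1.0:
        start, count = now, 0
    _last_seen[pkt["sender"]] = (start, count + 1)
    return count + 1 <= MAX_PACKETS_PER_SEC

def hops_ok(pkt):
    """Drop packets with an implausible hop count."""
    return 0 <= pkt["hops"] <= MAX_HOPS

def position_ok(pkt, my_pos, radio_range=300.0):
    """Is the claimed sender position within one radio hop of us?"""
    dx = pkt["pos"][0] - my_pos[0]
    dy = pkt["pos"][1] - my_pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= radio_range

def accept(pkt, my_pos):
    """Forward only packets that pass every constraint."""
    return rate_ok(pkt) and hops_ok(pkt) and position_ok(pkt, my_pos)

pkt = {"sender": "v42", "hops": 3, "pos": (120.0, 80.0)}
print(accept(pkt, my_pos=(0.0, 0.0)))
```

The position check is also how a sender could vet a "nearest node" claim: a node whose claimed position is unreachable at one hop is treated as lying and its packets are dropped, so no bandwidth is spent on them.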
Procedia PDF Downloads 285
26496 Traffic Prediction with Raw Data Utilization and Context Building
Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao
Abstract:
Traffic prediction is essential in a multitude of ways in modern urban life. Earlier work in this domain carries out the investigation chiefly with two major focuses: (1) the accurate forecast of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of the raw data and the full context of the prediction timestamp. Concentrating on these two drawbacks of earlier work, we devise an approach that addresses them in a two-phase framework. First, we utilize the raw trajectories to a greater extent by building a VLA table and applying data compression. We obtain the intra-trajectory features with graph-based encoding and the inter-trajectory ones with a grid-based model and the technique of back projection, which restores their surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and to attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information around it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution, which combines the strength of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction. Keywords: traffic prediction, raw data utilization, context building, data reduction
Procedia PDF Downloads 128
26495 Seismic Interpretation and Petrophysical Evaluation of SM Field, Libya
Authors: Abdalla Abdelnabi, Yousf Abushalah
Abstract:
The G Formation is a major gas-producing reservoir in the SM Field, eastern Libya. It is called the G limestone because it consists of shallow-marine limestone. Well data and 3D seismic data, in conjunction with the results of a previous study, were used to delineate the hydrocarbon reservoir of the Middle Eocene G Formation of the SM Field area. The data include three-dimensional seismic data acquired in 2009, covering approximately 75 mi², with more than 9 wells penetrating the reservoir. The seismic data are used to identify stratigraphic and structural features, such as channels and faults, which may play a significant role in hydrocarbon trapping. The well data are used for the petrophysical analysis of the SM Field. The average porosity of the Middle Eocene G Formation is very good, reaching 24%, especially around well W6. Average water saturation was calculated for each well from porosity and resistivity logs using Archie's formula; the average water saturation for the whole field is 25%. Structural mapping of the top and bottom of the Middle Eocene G Formation revealed that the highest area in the SM Field is at 4800 ft subsea, around wells W4, W5, W6, and W7, and the deepest point is at 4950 ft subsea. Correlation between wells using well data and structural maps created from seismic data revealed that the net thickness of the G Formation ranges from 0 ft in the northern part of the field to 235 ft in the southwest and south. The gas-water contact is found at 4860 ft using the resistivity log. Net isopach maps, using both the trapezoidal and pyramidal rules, are used to calculate the total bulk volume. The original gas in place and the recoverable gas were calculated volumetrically to be 890 billion standard cubic feet (BSCF) and 630 BSCF, respectively. Keywords: 3D seismic data, well logging, petrel, kingdom suite
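The two calculations named above, Archie's formula for water saturation and the volumetric gas-in-place estimate, can be sketched as follows. The Archie constants (a, m, n) are common defaults and all input values are illustrative assumptions, not the study's measured parameters, so the printed numbers only roughly echo the reported figures.

```python
# Archie water saturation and volumetric original gas in place (OGIP).
def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Archie's equation: Sw = ((a*Rw) / (phi^m * Rt))^(1/n)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

def ogip_scf(area_acres, h_ft, phi, sw, bg):
    """Volumetric method, standard cubic feet:
    G = 43560 * A * h * phi * (1 - Sw) / Bg."""
    return 43560.0 * area_acres * h_ft * phi * (1.0 - sw) / bg

# Illustrative inputs: Rw, Rt from logs; porosity 24% as reported.
sw = archie_sw(rw=0.05, rt=20.0, phi=0.24)      # close to the reported 25%
g = ogip_scf(area_acres=5000, h_ft=100, phi=0.24, sw=0.25, bg=0.005)
print(f"Sw = {sw:.2f}, OGIP = {g/1e9:.0f} BSCF")  # Sw = 0.21, OGIP = 784 BSCF
```

The recoverable gas would then be OGIP multiplied by a recovery factor (the reported 630/890 BSCF implies roughly 70%).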
Procedia PDF Downloads 150
26494 Analysis of Spatial and Temporal Data Using Remote Sensing Technology
Authors: Kapil Pandey, Vishnu Goyal
Abstract:
Spatial and temporal data analysis is well known in the field of satellite image processing. When spatial data are correlated with time, time-series analysis gives significant results in change-detection studies. In this paper, GIS and remote sensing techniques have been used to detect change using time-series satellite imagery of Uttarakhand state during the years 1990-2010. Natural vegetation, urban area, forest cover, etc. were chosen as the main land-use classes to study. Land-use/land-cover classes for several years were prepared using satellite images. The maximum likelihood supervised classification technique was adopted in this work; finally, a land-use change index was generated, and graphical models were used to present the changes. Keywords: GIS, landuse/landcover, spatial and temporal data, remote sensing
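A minimal sketch of per-pixel maximum likelihood classification with class-conditional Gaussians (diagonal covariance for simplicity): the class statistics and pixel values below are invented for illustration, where a real workflow would derive them from training areas digitized on the imagery.

```python
# Assign each pixel to the class with the highest Gaussian log-likelihood.
import math

# Per-class (mean, std) for each spectral band -- illustrative numbers.
CLASSES = {
    "forest": [(40.0, 8.0), (90.0, 10.0)],
    "urban":  [(120.0, 15.0), (70.0, 12.0)],
    "water":  [(20.0, 5.0), (15.0, 5.0)],
}

def log_likelihood(pixel, stats):
    """Sum of per-band Gaussian log-densities (constant term dropped)."""
    ll = 0.0
    for x, (mu, sigma) in zip(pixel, stats):
        ll += -math.log(sigma) - 0.5 * ((x - mu) / sigma) ** 2
    return ll

def classify(pixel):
    return max(CLASSES, key=lambda c: log_likelihood(pixel, CLASSES[c]))

print(classify((38.0, 95.0)))   # forest-like reflectance
print(classify((18.0, 12.0)))   # water-like reflectance
```

Classifying the same scene at two dates and differencing the resulting class maps is one simple way to build the land-use change index the abstract mentions.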
Procedia PDF Downloads 433
26493 The Influence of Intrinsic Motivation on the Second Language Learners’ Writing Skill: The Case of Third Year Students of English at Constantine 1 University
Authors: Chadia Nasri
Abstract:
Research in the field of foreign language learning has indicated the importance of mastery of the four language skills: speaking, listening, writing, and reading. As far as writing is concerned, recent studies have shown that this skill is indispensable for learning a second language successfully. Writing is a complex skill that is not easy to master. It has been shown to be affected by a variety of factors, particularly psychological ones: anxiety, intrinsic motivation, aptitude, etc. Intrinsic motivation is said to be among the most influential factors in the foreign language learning process and is considered a key factor for success. To investigate these two aspects, writing and intrinsic motivation, and the correlation between them, our hypothesis is that the degree of learners' intrinsic motivation facilitates their engagement in writing tasks. Two questionnaires, one for teachers and the other for students, were administered to test the validity of this hypothesis. The teachers' questionnaire indicated their awareness of the importance of intrinsic motivation in the learning process and the role it plays in their students' mastery of the writing skill. In addition, teachers mentioned various procedures aimed at raising their students' intrinsic motivation to write. The students' questionnaire, on the other hand, investigated students' reasons for learning a foreign language with regard to their attitudes towards writing as an important skill to master. Their answers to the questionnaire, together with the marks they obtained in the second-term test in the writing module, were compared to see whether students' writing proficiency can be predicted by the degree of their intrinsic motivation.
The comparison of the collected data showed a positive correlation between the two aspects. Keywords: foreign language learning, intrinsic motivation, motivation, writing proficiency
Procedia PDF Downloads 293
26492 Artificial Neural Networks Application on Nusselt Number and Pressure Drop Prediction in Triangular Corrugated Plate Heat Exchanger
Authors: Hany Elsaid Fawaz Abdallah
Abstract:
This study presents a new artificial neural network (ANN) model to predict the Nusselt number and pressure drop for turbulent flow in a triangular corrugated plate heat exchanger for forced air and turbulent water flow. An experimental investigation was performed to create a new dataset of Nusselt number and pressure drop values over the following ranges of dimensionless parameters: plate corrugation angle from 0° to 60°, Reynolds number from 10000 to 40000, pitch-to-height ratio from 1 to 4, and Prandtl number from 0.7 to 200. Based on the ANN performance graph, a three-layer structure with {12-8-6} hidden neurons was chosen. The training procedure includes feed-forward propagation of the input parameters, evaluation of the loss function on the training and validation datasets, and back-propagation with adjustment of the weights and biases. A linear activation function was used at the output layer, while the rectified linear unit activation function was utilized for the hidden layers. To accelerate ANN training, loss function minimization was achieved with the adaptive moment estimation (Adam) algorithm. "MinMax" normalization was utilized to avoid increased training time due to drastic differences in the loss function gradients with respect to the weight values. Since the test dataset is not used for ANN training, a cross-validation technique was applied to the network using the new data. This procedure was repeated until loss function convergence was achieved, or for 4000 epochs with a batch size of 200 points. The program code was written in Python 3 using open-source ANN libraries such as scikit-learn, TensorFlow, and Keras. Mean average percent errors of 9.4% for the Nusselt number and 8.2% for the pressure drop were achieved with the ANN model, a higher accuracy than that of the generalized correlations.
The performance of the obtained model was validated by comparing predicted data with the experimental results, yielding excellent accuracy. Keywords: artificial neural networks, corrugated channel, heat transfer enhancement, Nusselt number, pressure drop, generalized correlations
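A structural sketch of the described network, assuming the four stated inputs and their ranges: the weights below are random placeholders rather than trained values, so only the layer shapes and activations ({12-8-6} ReLU hidden layers, linear two-output layer, MinMax input normalization) reflect the abstract, not the predictions.

```python
# Shape-level sketch: MinMax-normalized inputs through 12-8-6 ReLU layers
# to a linear output layer producing [Nusselt number, pressure drop].
import random

random.seed(0)

def minmax(x, lo, hi):
    """'MinMax' normalization of one input to [0, 1] over its stated range."""
    return (x - lo) / (hi - lo)

def dense(inputs, n_out, activation):
    """One fully connected layer with placeholder random weights (no bias)."""
    weights = [[random.uniform(-0.5, 0.5) for _ in inputs] for _ in range(n_out)]
    out = [sum(w * x for w, x in zip(row, inputs)) for row in weights]
    return [max(0.0, v) for v in out] if activation == "relu" else out

def forward(angle, re, pitch_ratio, pr):
    x = [minmax(angle, 0, 60), minmax(re, 10_000, 40_000),
         minmax(pitch_ratio, 1, 4), minmax(pr, 0.7, 200)]
    for width in (12, 8, 6):            # the {12-8-6} hidden structure
        x = dense(x, width, "relu")
    return dense(x, 2, "linear")        # [Nusselt, pressure drop]

out = forward(angle=30, re=25_000, pitch_ratio=2, pr=7)
print(len(out))  # two outputs per sample
```

In the actual study, an equivalent Keras model would be compiled with the Adam optimizer and trained against the experimental dataset; this sketch only fixes the architecture.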
Procedia PDF Downloads 88
26491 Hidden Markov Model for Financial Limit Order Book and Its Application to Algorithmic Trading Strategy
Authors: Sriram Kashyap Prasad, Ionut Florescu
Abstract:
This study models intraday asset prices as driven by a Markov process. The work identifies the latent states of a Hidden Markov model, using limit order book data (trades and quotes) to continuously estimate the states throughout the day, and builds a trading strategy that uses the estimated states to generate signals. The strategy utilizes the current state to recalibrate buy/sell levels and the transitions between states to trigger a stop-loss when adverse price movements occur. The proposed trading strategy is tested on the Stevens High Frequency Trading (SHIFT) platform. SHIFT is a highly realistic market simulator with functionality for creating an artificial market simulation by deploying agents, trading strategies, distributing initial wealth, etc. In the implementation, several assets on the NASDAQ exchange are used for testing. In comparison to a strategy with static buy/sell levels, this study shows that the number of limit orders that get matched and executed can be increased; executing limit orders earns rebates on NASDAQ. The system can capture jumps in the limit order book prices, provide dynamic buy/sell levels, and trigger stop-loss signals to improve the PnL (profit and loss) performance of the strategy. Keywords: algorithmic trading, Hidden Markov model, high frequency trading, limit order book learning
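The state-estimation idea can be illustrated with a toy discrete HMM decoded by the Viterbi algorithm. The two regimes, the observation alphabet, and all probabilities below are invented for the sketch; the study estimates states continuously from real trades-and-quotes data rather than from fixed toy parameters.

```python
# Toy 2-state HMM over coarse order-flow observations, decoded with Viterbi.
import math

states = ("calm", "volatile")
start = {"calm": 0.8, "volatile": 0.2}
trans = {"calm": {"calm": 0.9, "volatile": 0.1},
         "volatile": {"calm": 0.2, "volatile": 0.8}}
emit = {"calm": {"buy": 0.5, "sell": 0.5},
        "volatile": {"buy": 0.1, "sell": 0.9}}

def viterbi(obs):
    """Most likely hidden state path for an observation sequence (log-space)."""
    v = [{s: math.log(start[s]) + math.log(emit[s][obs[0]]) for s in states}]
    back = []
    for o in obs[1:]:
        col, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda p: v[-1][p] + math.log(trans[p][s]))
            col[s] = v[-1][prev] + math.log(trans[prev][s]) + math.log(emit[s][o])
            ptr[s] = prev
        v.append(col)
        back.append(ptr)
    path = [max(states, key=lambda s: v[-1][s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

print(viterbi(["buy", "sell", "sell", "sell"]))
```

In the strategy, the decoded current state would set the buy/sell levels, and a decoded transition into an adverse state would fire the stop-loss.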
Procedia PDF Downloads 151
26490 Cryptography Based Authentication Methods
Authors: Mohammad A. Alia, Abdelfatah Aref Tamimi, Omaima N. A. Al-Allaf
Abstract:
This paper reviews a comparison study of the most commonly used authentication methods, some of which are based on cryptography. The study describes the main cryptographic services and presents a specific discussion of the authentication service, since authentication methods fall into several categories. The study also gives real-life examples for each of the authentication methods, covering the simplest authentication methods as well as the available biometric methods such as voice, iris, fingerprint, and face authentication. Keywords: information security, cryptography, system access control, authentication, network security
Procedia PDF Downloads 471
26489 An Empirical Investigation of the Challenges of Secure Edge Computing Adoption in Organizations
Authors: Hailye Tekleselassie
Abstract:
Edge computing is a distributed computing paradigm that brings enterprise applications closer to data sources such as IoT devices or local edge servers; potential obstacles can hinder the adoption of such new technologies. This investigation was carried out to assess the awareness of workers in technology and communications organizations, and of computer users who use cloud services. Surveys were used to achieve these objectives. Questions about trust are also key. Problems like data privacy, integrity, and availability are the factors affecting an organization's acceptance of secure edge computing. Keywords: IoT, data, security, edge computing
Procedia PDF Downloads 83
26488 Multi Tier Data Collection and Estimation, Utilizing Queue Model in Wireless Sensor Networks
Authors: Amirhossein Mohajerzadeh, Abolghasem Mohajerzadeh
Abstract:
In this paper, a target parameter is estimated with desirable precision in hierarchical wireless sensor networks (WSNs), while the proposed algorithm also tries to prolong network lifetime as much as possible, using an efficient data-collecting algorithm. The target parameter's distribution function is considered unknown. Sensor nodes sense the environment and send the data to the base station, called the fusion center (FC), using a hierarchical data-collecting algorithm; the FC models the underlying phenomena based on the collected data. Considering the aggregation level x, the goal is to provide the essential infrastructure to find the best value of the aggregation level in order to prolong network lifetime as much as possible while guaranteeing the desired accuracy (the required sample size depends entirely on the desired precision). First, the sample size calculation algorithm is discussed; second, the average queue length based on the M/M[x]/1/K queue model is determined and used for the energy consumption calculation. Nodes can decrease transmission cost by aggregating incoming data. Furthermore, the performance of the new algorithm is evaluated in terms of lifetime and estimation accuracy. Keywords: aggregation, estimation, queuing, wireless sensor network
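The sample-size step can be sketched with the standard large-sample formula n = (z·σ/E)², where E is the desired margin of error. The normal approximation and the example noise level are assumptions for the sketch, since the paper treats the target parameter's distribution as unknown.

```python
# Number of sensor readings needed so the mean estimate falls within a
# margin of error E at a given confidence level (normal approximation).
import math

Z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}  # two-sided normal quantiles

def required_samples(sigma, margin, confidence=0.95):
    """n = ceil((z * sigma / E)^2)."""
    z = Z[confidence]
    return math.ceil((z * sigma / margin) ** 2)

# e.g. sensor noise sigma = 2.0 units, desired precision +/- 0.5 units
print(required_samples(sigma=2.0, margin=0.5))  # 62 readings at 95% confidence
```

Tightening the precision or raising the confidence level inflates n quadratically, which is exactly the lifetime-versus-accuracy trade-off the aggregation level x must balance.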
Procedia PDF Downloads 186