Search results for: data reduction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28530

26250 Increasing the System Availability of Data Centers by Using Virtualization Technologies

Authors: Chris Ewe, Naoum Jamous, Holger Schrödl

Abstract:

Like most entrepreneurs, data center operators pursue goals such as profit maximization, improvement of the company’s reputation, or simply survival on the market. Part of those aims is to guarantee a given quality of service. Quality characteristics are specified in a contract called the service level agreement, the central part of which is the non-functional properties of an IT service. As this paper shows, system availability is one of the most important of these properties. To comply with availability requirements, data center operators can use virtualization technologies. However, a clear model for assessing the effect of virtualization functions on the parts of a data center in relation to system availability is still missing. This paper introduces a basic model of these connections and considers whether the identified effects are positive or negative; it thereby also points out possible disadvantages of the technology. In consequence, the paper shows the opportunities as well as the risks of data center virtualization in relation to system availability.

Keywords: availability, cloud computing, IT service, quality of service, service level agreement, virtualization

Procedia PDF Downloads 531
26249 Using Crowd-Sourced Data to Assess Safety in Developing Countries: The Case Study of Eastern Cairo, Egypt

Authors: Mahmoud Ahmed Farrag, Ali Zain Elabdeen Heikal, Mohamed Shawky Ahmed, Ahmed Osama Amer

Abstract:

Crowd-sourced data are data collected and shared by a large number of individuals or organizations, often through digital technologies such as mobile devices and social media. The shortage of crash data collection in developing countries makes it difficult to fully understand and address road safety issues in these regions. In developing countries, crowd-sourced data can be a valuable tool for improving road safety, particularly in urban areas, where the majority of road crashes occur. This study is, to our best knowledge, the first to develop safety performance functions from crowd-sourced data, adopting a negative binomial structure model and a Full Bayes model to investigate traffic safety for urban road networks and provide insights into the impact of roadway characteristics. Furthermore, as part of the safety management process, network screening was carried out by applying two different methods to rank the most hazardous road segments: the PCR method (adopted in the Highway Capacity Manual, HCM) and a graphical method using GIS tools, the latter used for comparison and validation. Lastly, recommendations are suggested for policymakers to ensure safer roads.
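The safety performance function and screening steps described above can be sketched in code. The model below uses the common negative binomial mean function, E[crashes/yr] = exp(b0) * AADT^b1 * L^b2; the coefficient values and segment fields are illustrative placeholders, not the fitted values from the study:

```python
import math

def spf_expected_crashes(aadt, length_km, b0=-7.5, b1=0.85, b2=1.0):
    """Safety performance function of the common negative binomial form:
    E[crashes/yr] = exp(b0) * AADT^b1 * length^b2.
    Coefficients here are illustrative placeholders, not fitted values."""
    return math.exp(b0 + b1 * math.log(aadt) + b2 * math.log(length_km))

def rank_segments(segments):
    """Network screening: rank road segments by predicted crash frequency,
    most hazardous first."""
    return sorted(segments,
                  key=lambda s: spf_expected_crashes(s["aadt"], s["km"]),
                  reverse=True)
```

Once coefficients are estimated from the crowd-sourced crash counts, the same ranking can be cross-checked against the GIS-based graphical screening.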

Keywords: crowdsourced data, road crashes, safety performance functions, Full Bayes models, network screening

Procedia PDF Downloads 43
26248 Fijian Women’s Role in Disaster Risk Management: Climate Change

Authors: Priyatma Singh, Manpreet Kaur

Abstract:

Climate change is progressively being recognized as a global crisis, and this has immediate repercussions for the Fiji Islands, whose geographical location makes them prone to natural disasters. In the Pacific, it is common to find significant differences between men and women in terms of their roles and responsibilities. In the pursuit of prudent preparedness before disasters, Fijian women’s engagement is constrained by socially constructed roles and expectations of women in Fiji. This vulnerability is aggravated by viewing women as victims rather than as key people who hold vital information about their society, economy, and environment, as well as useful skills which, when recognized and used, can be effective in disaster risk reduction. The focus of this study is to outline ways in which Fijian women can be actively engaged in disaster risk management and decision-making, negating the perceived ideology of women’s constricted roles in Fiji and unveiling the social constraints that limit women’s access to practical disaster management strategy. This paper outlines the importance of gender mainstreaming in disaster risk reduction and ways of mainstreaming gender, based on a literature review. It analyses academic literature as well as papers and reports produced by various national and international institutions, and explores ways to better inform and engage women in climate change and disaster management in Fiji. The empowerment of women is believed to be a critical element in building disaster resilience, as women are often considered the designers of community resilience at the local level. Gender mainstreaming, as a way of bringing a gender perspective into climate-related disasters, can be applied to distinguish the varying needs and capacities of women and to integrate them into climate change adaptation strategies.
This study advocates women’s participation in disaster risk management, thus giving women equal standing in Fiji, and also identifies gaps so as to inform national and local Disaster Risk Management authorities in implementing processes that enhance gender equality and women’s empowerment towards a more equitable and effective disaster practice.

Keywords: disaster risk management, climate change, gender mainstreaming, women empowerment

Procedia PDF Downloads 385
26247 Demonstration of Land Use Changes Simulation Using Urban Climate Model

Authors: Barbara Vojvodikova, Katerina Jupova, Iva Ticha

Abstract:

Cities in their historical evolution have always adapted their internal structure to the needs of society (for example, protective city walls lost their defensive function after the classical era, became unnecessary, were demolished, and gave space to new features such as roads, museums, or parks). Today it is necessary to modify the internal structure of the city in order to minimize the impact of climate change on the population’s environment. This article discusses results obtained with the Urban Climate model owned by VITO, produced as part of a project under the European Union’s Horizon 2020 grant agreement No. 730004, Pan-European Urban Climate Services (Climate-fit.City). The model was applied to changes in land use and land cover in cities in relation to urban heat islands (UHI). The task of the application was to evaluate possible land use change scenarios in connection with city requirements and plans. Two pilot areas in the Czech Republic were selected: Ostrava and Hodonín. The paper demonstrates the application of the model to various possible future development scenarios and assesses their suitability or unsuitability depending on the resulting temperature increase. Cities preparing to reconstruct public space want to eliminate, as early as the assignment phase, proposals that would increase temperature stress. If they have an assessment showing that some type of design is unsuitable, they can exclude it in the proposal phase. Therefore, especially when applying the model at the local level, at 1 m spatial resolution, it was necessary to show which types of proposal would, if implemented, create a significant heat island; such proposals are considered unsuitable. The model shows that a building itself can create a shaded place and thus contribute to the reduction of the UHI.
If new construction sensitively protects the existing greenery, it may not pose a significant problem; more massive interventions that reduce existing greenery create new heat island spaces.

Keywords: climate model, heat islands, Hodonin, land use changes, Ostrava

Procedia PDF Downloads 138
26246 Review of Different Machine Learning Algorithms

Authors: Syed Romat Ali Shah, Bilal Shoaib, Saleem Akhtar, Munib Ahmad, Shahan Sadiqui

Abstract:

Classification is a data mining technique based on Machine Learning (ML) algorithms. It is used to classify individual items in a collection of information into a set of predefined classes or groups. Web mining is also a part of this family of data mining methods. The main purpose of this paper is to analyse and compare the performance of the Naïve Bayes algorithm, Decision Tree, K-Nearest Neighbor (KNN), Artificial Neural Network (ANN), and Support Vector Machine (SVM). The paper describes these ML algorithms with their advantages and disadvantages, and also defines open research issues.
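As one concrete instance of the classifiers compared above, a minimal K-Nearest Neighbor classifier can be sketched in a few lines (Euclidean distance, majority vote; the toy data layout is assumed for illustration only):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest training
    points under Euclidean distance. `train` is a list of
    (feature_vector, label) pairs."""
    by_dist = sorted(train, key=lambda p: math.dist(p[0], query))
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]
```

The same interface (fit on labelled vectors, predict a label) is what the other surveyed algorithms expose in libraries such as scikit-learn.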

Keywords: data mining, web mining, classification, ML algorithms

Procedia PDF Downloads 296
26245 Using Genetic Algorithms and Rough Set Based Fuzzy K-Modes to Improve Centroid Model Clustering Performance on Categorical Data

Authors: Rishabh Srivastav, Divyam Sharma

Abstract:

We propose an algorithm to cluster categorical data, named ‘genetic algorithm initialized rough set based fuzzy K-modes for categorical data’. It is an amalgamation of the simple K-modes algorithm, the rough and fuzzy set based K-modes, and a genetic algorithm, which, we hypothesise, will provide better centroid model clustering results than existing standard algorithms. In the proposed algorithm, the initialization and updating of modes is done by genetic algorithms, while the membership values are calculated using rough sets and fuzzy logic.
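The fuzzy membership step can be sketched as follows. This is a minimal illustration of the matching dissimilarity for categorical records and a Huang-style fuzzy K-modes membership update; the GA-based initialization and mode updating described in the abstract are not shown:

```python
def dissimilarity(record, mode):
    """Simple matching distance for categorical data: the number of
    attributes on which the record and the cluster mode differ."""
    return sum(a != b for a, b in zip(record, mode))

def fuzzy_memberships(record, modes, m=2.0):
    """Fuzzy K-modes membership of one record in each cluster
    (fuzzifier m). A record that matches a mode exactly gets full
    membership in that cluster."""
    d = [dissimilarity(record, mode) for mode in modes]
    if 0 in d:                         # exact match: crisp assignment
        return [1.0 if di == 0 else 0.0 for di in d]
    exp = 1.0 / (m - 1.0)
    inv = [(1.0 / di) ** exp for di in d]
    total = sum(inv)
    return [v / total for v in inv]
```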

Keywords: categorical data, fuzzy logic, genetic algorithm, K-modes clustering, rough sets

Procedia PDF Downloads 244
26244 Forecasting Amman Stock Market Data Using a Hybrid Method

Authors: Ahmad Awajan, Sadam Al Wadi

Abstract:

In this study, a hybrid method based on Empirical Mode Decomposition and Holt-Winters (EMD-HW) is used to forecast Amman stock market data. First, the data are decomposed by the EMD method into Intrinsic Mode Functions (IMFs) and a residual component. Then, all components are forecasted by the HW technique. Finally, the forecasts are aggregated to obtain the forecast of the stock market data. Empirical results showed that EMD-HW outperforms the individual forecasting models. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW has relatively high accuracy compared with eight existing forecasting methods, based on five forecast error measures.
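The hybrid pipeline can be sketched as below. The EMD step itself is assumed to be done upstream (e.g., by an EMD library); the sketch applies Holt's linear smoothing to each component and sums the component forecasts, with smoothing parameters chosen purely for illustration:

```python
def holt_forecast(series, alpha=0.5, beta=0.3, horizon=1):
    """Holt's linear (double exponential smoothing) h-step-ahead
    forecast of one component series (needs at least two points)."""
    level, trend = series[0], series[1] - series[0]
    for x in series[1:]:
        prev = level
        level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
    return level + horizon * trend

def emd_hw_forecast(components, horizon=1):
    """EMD-HW hybrid: forecast each IMF / residual separately, then
    sum the component forecasts (decomposition assumed done upstream)."""
    return sum(holt_forecast(c, horizon=horizon) for c in components)
```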

Keywords: Holt-Winter method, empirical mode decomposition, forecasting, time series

Procedia PDF Downloads 123
26243 Analysis of Poverty Reduction Strategies as Mechanism for Development in Nigeria from 1999 to 2014

Authors: Ahmed Usman Egye, Hamza Muhammad

Abstract:

Poverty alleviation is one of the most difficult challenges facing third world countries in their development efforts. Evidence in Nigeria shows that the number of those in poverty has continued to increase. This paper analyses the performance of the poverty alleviation measures undertaken by successive administrations in Nigeria with a view to addressing the quagmire. The study identifies the whole gamut of factors that served as stumbling blocks to the implementation of each of the strategies, and recommends the involvement of local people in the identification and design of projects so that sufficient participation can be achieved.

Keywords: poverty, development, strategies, Nigeria

Procedia PDF Downloads 423
26242 Building Information Modeling-Based Information Exchange to Support Facilities Management Systems

Authors: Sandra T. Matarneh, Mark Danso-Amoako, Salam Al-Bizri, Mark Gaterell

Abstract:

Today’s facilities are ever more sophisticated, and the need for available and reliable information for operation and maintenance activities is vital. The key challenge for facilities managers is to have real-time, accurate, and complete information to perform their day-to-day activities and to provide their senior management with accurate information for the decision-making process. Currently, various technology platforms, data repositories, or database systems such as Computer-Aided Facility Management (CAFM) are used for these purposes in different facilities. In most current practices, the data are extracted from paper construction documents and re-entered manually into one of these computerized information systems. Construction Operations Building information exchange (COBie) is a non-proprietary data format that contains the non-geometric asset data captured and collected during the design and construction phases for use by owners and facility managers. Recently, software vendors have developed add-in applications to generate the COBie spreadsheet automatically. However, most of these add-ins can generate only a limited amount of COBie data, so considerable time is still required to enter the remaining data manually to complete the COBie spreadsheet. Some of the data that cannot be generated by these COBie add-ins are essential for facilities managers’ day-to-day activities, such as job sheets that include preventive maintenance schedules. To facilitate a seamless data transfer between BIM models and facilities management systems, we developed a framework that automates data generation by transferring the data extracted directly from BIM models to an external web database, and then enables different stakeholders to access the external web database and enter the required asset data directly, generating a rich COBie spreadsheet that contains most of the asset data required for efficient facilities management operations.
The proposed framework is part of ongoing research and will be demonstrated and validated on a typical university building. Moreover, it supplements the existing body of knowledge in the facilities management domain by providing a novel framework that facilitates seamless data transfer between BIM models and facilities management systems.
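The final data-generation step can be illustrated with a minimal sketch that renders extracted asset records as a COBie-style component sheet. The field names and records below are hypothetical and cover only a small fraction of the columns in the real COBie Component sheet:

```python
import csv
import io

# Hypothetical asset records as they might be extracted from a BIM model.
assets = [
    {"Name": "AHU-01", "TypeName": "AirHandlingUnit",
     "Space": "Plant Room 1", "SerialNumber": "SN-12345"},
    {"Name": "Pump-02", "TypeName": "CirculationPump",
     "Space": "Basement", "SerialNumber": "SN-67890"},
]

def write_cobie_components(records):
    """Render extracted asset data as a COBie-style CSV sheet that
    stakeholders could then enrich (e.g. with maintenance schedules)."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["Name", "TypeName", "Space", "SerialNumber"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()
```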

Keywords: building information modeling, BIM, facilities management systems, interoperability, information management

Procedia PDF Downloads 111
26241 Spatio-Temporal Variation of Suspended Sediment Concentration in the near Shore Waters, Southern Karnataka, India

Authors: Ateeth Shetty, K. S. Jayappa, Ratheesh Ramakrishnan, A. S. Rajawat

Abstract:

Suspended Sediment Concentration (SSC) was estimated for a period of four months (November 2013 to February 2014) using Oceansat-2 (Ocean Colour Monitor) satellite images to understand coastal dynamics and regional sediment transport, especially distribution and budgeting in coastal waters. The coastal zone undergoes continuous changes due to natural processes and anthropogenic activities. The importance of the coastal zone, with respect to safety, ecology, economy, and recreation, demands a management strategy in which each of these aspects is taken into account. Monitoring and understanding sediment dynamics and suspended sediment transport is an important issue for coastal engineering activities. A study of the transport mechanism of suspended sediments in the near shore environment is essential not only to safeguard marine installations and navigational channels, but also for coastal structure design, environmental protection, and disaster reduction. Such studies also help in the assessment of pollutants and other biological activities in the region. An accurate description of the sediment transport caused by waves and tidal or wave-induced currents is of great importance in predicting coastal morphological changes. Satellite-derived SSC data have been found to be useful for Indian coasts because of their high spatial (360 m), spectral, and temporal resolutions. The present paper outlines the application of the state-of-the-art operational Indian Remote Sensing satellite Oceansat-2 to study the dynamics of sediment transport.
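Retrieval of SSC from satellite reflectance is typically done with an empirical relationship fitted to in-situ samples. The power-law form below is one common choice for turbid case-II waters; the coefficients are illustrative placeholders, not the calibration used with the OCM data in this study:

```python
def ssc_from_reflectance(rrs_red, a=2.5, b=1.4):
    """Empirical power-law retrieval SSC = a * Rrs^b (mg/L) from
    red-band remote sensing reflectance. Coefficients a and b are
    illustrative and would be fitted to in-situ water samples."""
    return a * (rrs_red ** b)
```

Higher red-band reflectance maps monotonically to higher SSC, which is why sediment plumes show up clearly in the retrieved maps.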

Keywords: suspended sediment concentration, ocean colour monitor, sediment transport, case-II waters

Procedia PDF Downloads 248
26240 There's No End in Sight: An Interpretative Phenomenological Analysis of Quality of Life in Burning Mouth Syndrome Sufferers

Authors: R. McGrath, A. Trace, S. Curtin, C. McCreary

Abstract:

Introduction: Although much energy has been expended on the definition and etiology of Burning Mouth Syndrome (BMS), it remains a contentious issue. There is agreement on the symptoms, but on little else, and approaches to treatment vary widely. However, it has been established that the condition has a detrimental effect on the sufferer’s quality of life. Much research has focused on the physical impact of the syndrome; recently, some literature has turned to social, functional, and psychological factors. However, there is very little qualitative research on how burning mouth syndrome affects the lives of sufferers, and the present study seeks to remedy this. Method: The study recruited five male participants who took part in semi-structured interviews lasting between 30 and 50 minutes. Data were analysed using Interpretative Phenomenological Analysis. Results: The study identified four super-ordinate themes: Lack of Control due to Uncertainty about Condition; Disruption to Internal Sense of Self; Negative Future Expectation due to Chronic Symptoms; and Sense of BMS as an Intrusive Force. Aspects of these themes reflect areas of reduced quality of life. Conclusion: BMS damages an individual’s quality of life in ways that have not been reflected in self-report surveys of health-related quality of life. The condition has serious implications for the individual’s sense of self, identity, and future. The study recommends that further qualitative research be carried out in this area, and that therapeutic interventions be used with BMS sufferers, which would help not only sufferers but also best practice in relation to their treatment.

Keywords: burning mouth syndrome, interpretative phenomenological analysis, qualitative research, quality of life

Procedia PDF Downloads 437
26239 Investigation on the Effect of Titanium (Ti) Plus Boron (B) Addition to the Mg-AZ31 Alloy in the as Cast and After Extrusion on Its Metallurgical and Mechanical Characteristics

Authors: Adnan I. O. Zaid, Raghad S. Hemeimat

Abstract:

Magnesium-aluminum alloys are versatile materials used in manufacturing a number of engineering and industrial parts in the automobile and aircraft industries due to their strength-to-weight ratio. Against these preferable characteristics, magnesium is difficult to deform at room temperature; it is therefore alloyed with other elements, mainly aluminum and zinc, to add required properties, particularly a high strength-to-weight ratio. Mg and its alloys oxidize rapidly, so care should be taken when melting or machining them, although they are not a fire hazard. Grain refinement is an important technology for improving the mechanical properties and the microstructural uniformity of alloys. Grain refinement was introduced in the early fifties, when Cibula showed that the presence of Ti, and of Ti + B, produced a great refining effect in Al; since then it has become industrial practice to grain refine Al. Most published work on grain refinement has been directed toward grain refining Al and Zn alloys; the effect of such additions on the grain size or mechanical behavior of Mg alloys has not been previously investigated. This forms the main objective of this work, in which the effect of Ti addition on the grain size, mechanical behavior, ductility, and the extrusion force and energy consumed in forward extrusion of the Mg-AZ31 alloy is investigated and discussed in two conditions: first as cast, and second after extrusion. It was found that the addition of Ti to the Mg-AZ31 alloy reduced its grain size by 14%; the reduction in grain size after extrusion was much higher. The increase in Vickers hardness was 3% after the addition of Ti in the as cast condition, and higher hardness values were achieved after extrusion.
Furthermore, an increase in the strength coefficient by 36% was achieved with the addition of Ti to the Mg-AZ31 alloy in the as cast condition. Similarly, the work hardening index also increased, indicating an enhancement of ductility and formability. As for the extrusion process, the force and energy required were reduced by 57% and 59%, respectively, with the addition of Ti.

Keywords: cast condition, direct extrusion, ductility, MgAZ31 alloy, superplasticity

Procedia PDF Downloads 452
26238 Waiting Time Reduction in a Government Hospital Emergency Department: A Case Study on AlAdan Hospital, Kuwait

Authors: Bashayer AlRobayaan, Munira Saad, Alaa AlBawab, Fatma AlHamad, Sara AlAwadhi, Sherif Fahmy

Abstract:

This paper addresses the problem of long waiting times in government hospital emergency departments (ED). It aims at finding feasible and simple ways of reducing waiting times that do not require many resources or large expenses. AlAdan Hospital in Kuwait was chosen as the case under study to further understand and capture the problem.

Keywords: healthcare, hospital, Kuwait, waiting times, emergency department

Procedia PDF Downloads 487
26237 Data Security and Privacy Challenges in Cloud Computing

Authors: Amir Rashid

Abstract:

Cloud computing frameworks empower organizations to cut expenses by outsourcing computation resources on demand. At present, customers of cloud service providers have no means of verifying the privacy and ownership of their information and data. To address this issue, we propose a Trusted Cloud Computing Platform (TCCP). TCCP enables Infrastructure as a Service (IaaS) providers, for example Amazon EC2, to provide a closed-box execution environment that guarantees confidential execution of guest virtual machines. It also allows clients to attest to the IaaS provider and determine whether the service is secure before they launch their virtual machines. This paper proposes a TCCP for guaranteeing the privacy and integrity of computed data that are outsourced to IaaS providers. The TCCP provides the abstraction of a closed-box execution environment for a client’s VM, guaranteeing that no privileged administrator of the cloud provider can inspect or tamper with its data. Furthermore, before launching the VM, the TCCP allows a client to reliably and remotely verify that the provider’s backend is running a trusted TCCP. This capability extends attestation to the whole service, and hence allows a client to verify that its data operations run in secure mode.

Keywords: cloud security, IaaS, cloud data privacy and integrity, hybrid cloud

Procedia PDF Downloads 295
26236 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record

Authors: Raghavi C. Janaswamy

Abstract:

In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structure restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offers unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions are predicted as a node classification task using graph-based open-source EHR data from the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged because it is voluminous and closely represents real-world data. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate the node embeddings and eventually performing node classification using those embeddings. The model predicts patient conditions ranging from common to rare. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
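The core message-passing idea behind the GNN layers that PyG provides can be sketched without any deep learning framework: each node's representation is updated by aggregating its neighbours' features (here a simple mean over the node and its neighbours; the node names stand in for hypothetical EHR entities):

```python
def gnn_layer(features, edges):
    """One message-passing step with mean aggregation: each node's new
    feature vector is the average of its own vector and those of its
    neighbours (a bare-bones analogue of a PyG convolution layer)."""
    neighbours = {n: [] for n in features}
    for u, v in edges:                 # treat edges as undirected
        neighbours[u].append(v)
        neighbours[v].append(u)
    out = {}
    for node, feat in features.items():
        pool = [feat] + [features[m] for m in neighbours[node]]
        out[node] = [sum(vals) / len(pool) for vals in zip(*pool)]
    return out
```

Stacking such layers lets information from diagnoses and prescriptions flow into patient nodes before a classifier reads off the final embeddings.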

Keywords: electronic health record, graph neural network, heterogeneous data, prediction

Procedia PDF Downloads 82
26235 A Proposal to Tackle Security Challenges of Distributed Systems in the Healthcare Sector

Authors: Ang Chia Hong, Julian Khoo Xubin, Burra Venkata Durga Kumar

Abstract:

Distributed systems offer many benefits to the healthcare industry. From big data analysis to business intelligence, the increased computational power and efficiency of distributed systems serve as an invaluable resource for the healthcare sector. However, as the usage of these distributed systems increases, many issues arise. The main focus of this paper is security issues. Many security issues stem from distributed systems in the healthcare industry, particularly information security, as personal data are especially sensitive in this sector. If important information is leaked (e.g., identity card number, credit card number, or address), a person’s identity, financial status, and safety may be compromised. The responsible organization then loses a lot of money compensating those affected, with even more resources expended trying to fix the fault. Therefore, a framework for a blockchain-based healthcare data management system is proposed. In this framework, a blockchain network is used to store the encryption key of the patient’s data, while the data themselves are encrypted and the resulting ciphertext is stored on a cloud storage platform. Due to the nature of blockchain technology, the data will be tamper-proof, and its read-only function can only be accessed by authorized users such as doctors and nurses. This guarantees the confidentiality and immutability of the patient’s data. Some issues remain to be tackled in future improvements, such as proposing a multi-user scheme, addressing authentication issues, or migrating the backend processes into the blockchain network.
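The tamper-evidence property that the framework relies on can be illustrated with a toy hash-chained ledger storing key references. This is a deliberate simplification, not the proposed system: real blockchain networks add consensus, signatures, and access control on top of the hash chaining shown here:

```python
import hashlib
import json

def add_block(chain, record):
    """Append a record (e.g. a patient's encryption-key reference) to a
    toy hash-chained ledger; altering any earlier block invalidates
    every later hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = {"record": record, "prev": prev_hash}
    block = dict(payload)
    block["hash"] = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()).hexdigest()
    chain.append(block)
    return block

def verify(chain):
    """Recompute every hash; returns False if any block was altered."""
    prev = "0" * 64
    for b in chain:
        expect = hashlib.sha256(json.dumps(
            {"record": b["record"], "prev": prev},
            sort_keys=True).encode()).hexdigest()
        if b["hash"] != expect or b["prev"] != prev:
            return False
        prev = b["hash"]
    return True
```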

Keywords: distributed, healthcare, efficiency, security, blockchain, confidentiality and immutability

Procedia PDF Downloads 180
26234 Factors Affecting Test Automation Stability and Their Solutions

Authors: Nagmani Lnu

Abstract:

Test automation is a vital requirement for any organization that wants to release products to its customers faster. In most cases, an organization has an approach to developing automation but struggles to maintain it. This results in an increased number of flaky tests, reducing return on investment and stakeholders’ confidence. The challenges grow manifold when the automation targets UI behaviors. This paper describes the approaches taken to identify the root causes of automation instability in an extensive payments application, and the best practices to address them using processes, tools, and technologies, resulting in a 75% reduction of effort.

Keywords: automation stability, test stability, flaky tests, test quality, test automation quality

Procedia PDF Downloads 81
26233 In vitro Inhibitory Action of an Aqueous Extract of Carob on the Release of Myeloperoxidase by Human Neutrophils

Authors: Kais Rtibi, Slimen Selmi, Jamel El-Benna, Lamjed Marzouki, Hichem Sebai

Abstract:

Background: Myeloperoxidase (MPO) is a heme enzyme found in high concentrations in the primary granules of neutrophils. In addition to its peroxidase activity, it has a chlorination activity, using hydrogen peroxide and chloride ions to form hypochlorous acid, a strong oxidant capable of chlorinating molecules. Bioactive compounds contained in medicinal plants could limit the action of this enzyme, reducing both the production of reactive oxygen species and the chlorination activity. The purpose of this study is to evaluate the effect of a carob aqueous extract (CAE) on the release of MPO by human neutrophils in vitro, and on MPO activity, following stimulation of these cells with PMA. Methods: Neutrophils were isolated by simple sedimentation using the dextran/Ficoll method. After stimulation with phorbol 12-myristate 13-acetate (PMA), neutrophils release MPO by degranulation. The effect of CAE on the release of MPO was analyzed by Western blot, while MPO activity was determined biochemically using 3,3',5,5'-tetramethylbenzidine (TMB) and hydrogen peroxide. Data are expressed as mean ± SEM. Results: The carob aqueous extract causes a concentration-dependent decrease in MPO quantity and activity, which reduces the production of reactive oxygen species (ROS) and protects molecules against oxidation and chlorination. Conclusion: Thanks to its richness in bioactive compounds, the aqueous extract of carob could limit the development of damage related to uncontrolled MPO activity.

Keywords: carob, MPO, myeloperoxidase, neutrophils, PMA, phorbol 12-myristate 13-acetate

Procedia PDF Downloads 155
26232 The Different Effects of Mindfulness-Based Relapse Prevention Group Therapy on QEEG Measures in Various Severity Substance Use Disorder Involuntary Clients

Authors: Yu-Chi Liao, Nai-Wen Guo, Chun‑Hung Lee, Yung-Chin Lu, Cheng-Hung Ko

Abstract:

Objective: The incidence of behavioral addictions, especially substance use disorders (SUDs), is gradually being taken more seriously, given the various physical health problems involved. Mindfulness-based relapse prevention (MBRP) has become a treatment option for promoting long-term health behavior change in recent years. MBRP is a structured protocol that integrates formal meditation practices with the cognitive-behavioral approach of relapse prevention treatment, teaching participants not to engage in reappraisal or savoring techniques. However, considering SUDs as a complex brain disease, questionnaires and symptom evaluation are not sufficient to evaluate the effect of MBRP. Neurophysiological biomarkers such as the quantitative electroencephalogram (QEEG) may more accurately represent the curative effects. This study attempted to find a neurophysiological indicator of MBRP in involuntary clients with SUDs of varying severity. Participants and Methods: Thirteen participants (all males) completed an 8-week mindfulness-based treatment provided by trained, licensed clinical psychologists. The behavioral data came from the Severity of Dependence Scale (SDS) and the Negative Mood Regulation Scale (NMR) before and after MBRP treatment. The QEEG data were recorded simultaneously with executive attention tasks, the Comprehensive Nonverbal Attention Test (CNAT). Two-way repeated-measures (treatment * severity) ANOVA and independent t-tests were used for statistical analysis. Results: The thirteen participants were regrouped into high substance dependence (HS) and low substance dependence (LS) groups by the SDS cut-off. The HS group showed a higher SDS total score and a lower gamma wave in the Go/No Go task of the CNAT at pretest. A main effect of treatment showed that both groups had a lower frontal theta/beta ratio (TBR) during the simple reaction time task of the CNAT, and that delay errors on the CNAT were lower after MBRP. There was no other difference in the CNAT between the groups.
However, after MBRP, the HS group showed more pronounced progress than the LS group in improving SDS and NMR scores. On the neurophysiological index, the frontal TBR of the HS group during the Go/No Go task of the CNAT decreased more than that of the LS group. The LS group, in turn, showed a significant reduction in the gamma wave on the Go/No Go task of the CNAT. Conclusion: The QEEG data support that MBRP can restore the prefrontal function of involuntary addicts and lower their errors in executive attention tasks. However, the improvement from MBRP for addicts with high addiction severity is significantly greater than for those with low severity, in both QEEG indicators and negative emotion regulation. Future directions include investigating the reasons for these differences in efficacy across addiction severities.
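The frontal theta/beta ratio (TBR) used as an index above is derived from EEG band powers. As a minimal illustrative sketch (not the study's recording pipeline), assuming conventional band edges of 4-8 Hz for theta and 13-30 Hz for beta, a sample rate of 128 Hz, and a naive DFT:

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Sum of squared DFT magnitudes over bins with f_lo <= freq < f_hi (naive DFT)."""
    n = len(signal)
    total = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq < f_hi:
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            total += abs(coeff) ** 2
    return total

def theta_beta_ratio(signal, fs):
    # Conventional band edges (an assumption of this sketch): theta 4-8 Hz, beta 13-30 Hz.
    return band_power(signal, fs, 4, 8) / band_power(signal, fs, 13, 30)

# Synthetic 1-second epoch: a 6 Hz theta component at twice the amplitude of a 20 Hz beta component.
fs = 128
sig = [2 * math.sin(2 * math.pi * 6 * t / fs) + math.sin(2 * math.pi * 20 * t / fs)
       for t in range(fs)]
print(round(theta_beta_ratio(sig, fs), 2))  # -> 4.0 (power scales with amplitude squared)
```

A higher TBR is commonly read as reduced frontal cortical control, which is why a post-treatment decrease is reported above as a positive sign.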

Keywords: mindfulness, involuntary clients, QEEG, emotion regulation

Procedia PDF Downloads 145
26231 Women Entrepreneurship as an Inventive Approach to Ensure Sustainable Development in Anambra State

Authors: S. Muogbo Uju, Akpunonu Uju

Abstract:

The prevailing harsh environmental factors, coupled with high poverty and unemployment rates, propel a high rate of entrepreneurial activity in developing countries of the world. Women entrepreneurs operate under gender bias, among other constraints that can constitute a threat or create opportunities for them. This empirical paper investigates and critically examines women entrepreneurship as an inventive approach to sustainable development in Anambra State. The study used descriptive statistics (frequencies, means, and percentages) to answer the three research questions posed. Hypothesis testing was done with Pearson product-moment correlation, and multiple regression was employed in the data analysis. SPSS (Statistical Package for the Social Sciences) software was used to run the analysis. Three hundred and fifty-three (353) copies of the questionnaire were administered, and one hundred and forty-six (146) copies were returned. The findings of this study portrayed a significant impact of women entrepreneurship activities on job creation, wealth creation, youth empowerment, poverty reduction, employment generation, and increases in people's standard of living. The findings therefore recommend that government ensure that managerial lessons accompany skill acquisition programs so that women understand the rudiments of owning and sustaining a business. The study also recommends that women entrepreneurs who have overcome the inertia of starting a business come together to create platforms that can help those women who are yet to take the step or kick-start such ventures.

Keywords: women entrepreneurship, skill acquisition, sustainability, wealth creation

Procedia PDF Downloads 435
26230 Design and Implementation of a Geodatabase and WebGIS

Authors: Sajid Ali, Dietrich Schröder

Abstract:

The merging of the internet and the Web has created many disciplines, and Web GIS is one of these disciplines, dealing with geospatial data in a proficient way. Web GIS technologies have enabled easy access to and sharing of geospatial data over the internet. However, the European Caribbean Association (Europäische Karibische Gesellschaft, EKG) lacks a single platform for easy, shared access to its data to assist its members and the wider research community. The technique presented in this paper deals with the design of a geodatabase using PostgreSQL/PostGIS as an object-relational database management system (ORDBMS) for competent dissemination and management of spatial data, and of a Web GIS using OpenGeo Suite for the fast sharing and distribution of the data over the internet. The characteristics of the required geodatabase design have been studied, and a specific methodology is given for designing the Web GIS. Finally, this web-based geodatabase has been validated with two desktop GIS applications and a web map application, and it is discussed how the contribution provides all the modules required to expedite further research in the area.

Keywords: desktop GIS software, European Caribbean Association, geodatabase, OpenGeo Suite, PostgreSQL/PostGIS, WebGIS, web map application

Procedia PDF Downloads 335
26229 Integration of “FAIR” Data Principles in Longitudinal Mental Health Research in Africa: Lessons from a Landscape Analysis

Authors: Bylhah Mugotitsa, Jim Todd, Agnes Kiragga, Jay Greenfield, Evans Omondi, Lukoye Atwoli, Reinpeter Momanyi

Abstract:

The INSPIRE network aims to build an open, ethical, sustainable, and FAIR (Findable, Accessible, Interoperable, Reusable) data science platform, particularly for longitudinal mental health (MH) data. While studies have been done at the clinical and population levels, there are still limitations in data and research in LMICs, which pose a risk of underrepresentation of mental disorders. It is vital to examine the existing longitudinal MH data, focusing on how FAIR the datasets are. This landscape analysis aimed to provide both an overall level of evidence of the availability of longitudinal datasets and the degree of consistency in the longitudinal studies conducted. Utilizing prompts proved instrumental in streamlining the analysis process, facilitating access, crafting code snippets, and enabling categorization and analysis of extensive data repositories related to depression, anxiety, and psychosis in Africa. Leveraging artificial intelligence (AI), we filtered through over 18,000 scientific papers spanning from 1970 to 2023. This AI-driven approach enabled the identification of 228 longitudinal research papers meeting the inclusion criteria. Quality assurance revealed 10% incorrectly identified articles and 2 duplicates, underscoring the prevalence of longitudinal MH research in South Africa, with a focus on depression. From the analysis, evaluating data and metadata adherence to FAIR principles remains crucial for enhancing the accessibility and quality of MH research in Africa. While AI has the potential to enhance research processes, challenges such as privacy concerns and data security risks must be addressed. Ethical and equity considerations in data sharing and reuse are also vital. There is a need for collaborative efforts across disciplinary and national boundaries to improve the Findability and Accessibility of data. Current efforts should also focus on creating integrated data resources and tools to improve the Interoperability and Reusability of MH data.
Practical steps for researchers include careful study planning, data preservation, machine-actionable metadata, and promoting data reuse to advance science and improve equity. Metrics and recognition should be established to incentivize adherence to FAIR principles in MH research.

Keywords: longitudinal mental health research, data sharing, fair data principles, Africa, landscape analysis

Procedia PDF Downloads 82
26228 Edible Active Antimicrobial Coatings onto Plastic-Based Laminates and Its Performance Assessment on the Shelf Life of Vacuum Packaged Beef Steaks

Authors: Andrey A. Tyuftin, David Clarke, Malco C. Cruz-Romero, Declan Bolton, Seamus Fanning, Shashi K. Pankaj, Carmen Bueno-Ferrer, Patrick J. Cullen, Joe P. Kerry

Abstract:

Prolonging shelf-life is essential in order to address issues such as supplier demands across continents, economic profit, customer satisfaction, and the reduction of food wastage. Smart packaging solutions in the form of naturally sourced antimicrobially active packaging may be a solution to these and other issues. A gelatin film-forming solution with added naturally sourced antimicrobials is a promising tool for active smart packaging. The objective of this study was to coat a conventional hydrophobic plastic packaging material with a hydrophilic antimicrobial active beef gelatin coating and conduct shelf-life trials on beef sub-primal cuts. The minimum inhibitory concentrations (MIC) of caprylic acid sodium salt (SO) and the commercially available Auranta FV (AFV) (a bitter orange extract with a mixture of nutritive organic acids) were found to be 1% and 1.5%, respectively, against the bacterial strains Bacillus cereus, Pseudomonas fluorescens, Escherichia coli, and Staphylococcus aureus and against aerobic and anaerobic beef microflora. Therefore, SO or AFV was incorporated into the beef gelatin film-forming solution at twice the MIC, and the solution was coated onto a conventional LDPE/PA plastic film on the inner, cold-plasma-treated polyethylene surface. Beef samples were vacuum packed in this material, stored under chilled conditions, and sampled at weekly intervals during a 42-day shelf-life study. No significant differences (p < 0.05) in cook loss were observed among the different treatments compared to the control samples until day 29; only for the AFV-coated beef samples was it 3% higher (37.3%) than the control (34.4%) on day 36. The antimicrobial films did not protect the beef against discoloration. SO-containing packages significantly (p < 0.05) reduced total viable bacterial counts (TVC) compared to the control and AFV samples until day 35.
No significant reduction in TVC was observed between the SO and AFV films on day 42, but a significant difference was observed compared to the control samples, with a 1.40 log reduction in bacteria on day 42. AFV films significantly (p < 0.05) reduced TVC compared to the control samples from day 14 until day 42. Control samples reached the set value of 7 log CFU/g on day 27 of testing; AFV films did not reach this limit until day 35, and SO films not until day 42. The antimicrobial AFV and SO coated films thus significantly prolonged the shelf-life of the beef steaks by 33% or 55% (7 or 14 days, respectively) compared to the control film samples. It is concluded that antimicrobial coated films were successfully developed by coating the inner polyethylene layer of conventional LDPE/PA laminated films after plasma surface treatment. The results indicated that the use of antimicrobial active packaging coated with SO or AFV significantly (p < 0.05) increased the shelf life of the beef sub-primals. Overall, AFV- or SO-containing gelatin coatings have the potential to be used as effective antimicrobials in active packaging applications for muscle-based food products.

Keywords: active packaging, antimicrobials, edible coatings, food packaging, gelatin films, meat science

Procedia PDF Downloads 300
26227 Monitoring of Rice Phenology and Agricultural Practices from Sentinel 2 Images

Authors: D. Courault, L. Hossard, V. Demarez, E. Ndikumana, D. Ho Tong Minh, N. Baghdadi, F. Ruget

Abstract:

In the global change context, efficient management of the available resources has become one of the most important topics, particularly for sustainable crop development. Timely assessment with high precision is crucial for water resource and pest management. Rice cultivated in the Camargue region of Southern France faces a dual challenge: reducing soil salinity by flooding while at the same time reducing the number of herbicides that negatively impact the environment. This context has led farmers to diversify their crop rotations and agricultural practices. The objective of this study was to evaluate this crop diversity, both in cropping systems and in the agricultural practices applied to paddy rice, in order to quantify the impact on the environment and on crop production. The proposed method is based on the combined use of crop models and multispectral data acquired from the recent Sentinel-2 satellite sensors launched by the European Space Agency (ESA) within the framework of the Copernicus program. More than 40 images at fine spatial resolution (10 m in the optical range) were processed for 2016 and 2017 (with a revisit time of 5 days) to map crop types using the random forest method and to estimate biophysical variables (LAI) retrieved by inversion of the PROSAIL canopy radiative transfer model. Thanks to the high revisit frequency of Sentinel-2 data, it was possible to monitor the soil tillage before flooding and the second sowing made by some farmers to better control weeds. The temporal trajectories of the remote sensing data were analyzed for various rice cultivars to define the main parameters describing the phenological stages, which were used to calibrate two crop models (STICS and SAFY). Results were compared to surveys conducted with 10 farms. A large variability in LAI was observed at the farm scale (up to 2-3 m²/m²), which induced significant variability in the simulated yields (up to 2 t/ha). Observations on land use were also collected for more than 300 fields.
Various maps were produced: land use, LAI, flooding and sowing dates, and harvest dates. Together, these maps allow a new typology to be proposed for classifying these paddy cropping systems. Key phenological dates can be estimated from inverse procedures and were validated against ground surveys. The proposed approach made it possible to compare the years and to detect anomalies. The methods proposed here can be applied to different crops in various contexts and confirm the potential of remote sensing acquired at fine resolution, such as the Sentinel-2 system, for agricultural applications and environmental monitoring. This study was supported by the French space agency (CNES) through the TOSCA program.

Keywords: agricultural practices, remote sensing, rice, yield

Procedia PDF Downloads 271
26226 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments.
It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.
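The workload-placement and cost-modeling ideas discussed above can be sketched as a toy optimization: pick the provider minimizing estimated egress plus compute cost for a given workload. The provider names and rates below are invented for illustration and are not real cloud pricing:

```python
def cheapest_placement(workload_gb, compute_hours, providers):
    """Pick the provider minimizing estimated data-transfer + compute cost.

    `providers` maps name -> (egress $/GB, compute $/hour); all rates here are
    illustrative assumptions, not actual cloud pricing.
    """
    def cost(rates):
        egress_per_gb, compute_per_hour = rates
        return workload_gb * egress_per_gb + compute_hours * compute_per_hour
    return min(providers, key=lambda name: cost(providers[name]))

providers = {
    "cloud_a": (0.09, 0.40),   # cheaper compute, pricier egress
    "cloud_b": (0.05, 0.55),   # cheaper egress, pricier compute
}
# For a transfer-heavy workload, egress dominates, so the low-egress provider wins.
print(cheapest_placement(workload_gb=5000, compute_hours=100, providers=providers))  # -> cloud_b
```

A predictive cost model would replace the static rate tuples with forecasts, but the decision structure is the same.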

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 64
26225 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study

Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos

Abstract:

This article presents a method to analyze the use of indoor spaces based on data analytics obtained from inbuilt digital devices. The study uses the data generated by in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights into space occupancy, user behaviour, and comfort. Those devices, originally installed to facilitate remote operations, report data through the internet, which the research uses to analyze real-time human use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable solution for analyzing building interior spaces without incorporating external data collection systems such as dedicated sensors. The methodology is applied to a real coliving case study: a residential building of 3,000 m², 7 floors, and 80 users in the centre of Madrid. The case study applies the method to classify the IoT devices and to assess, clean, and analyze the collected data based on the analysis framework. The information is collected remotely through the devices' different platforms. The first step is to curate the data and understand what insights each device can provide according to the objectives of the study; this generates an analysis framework that can be scaled for future building assessment, even beyond the residential sector. The method adjusts the parameters to be analyzed to the dataset available in the IoT network of each building. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.

Keywords: in-place devices, IoT, human-centred data-analytics, spatial design

Procedia PDF Downloads 192
26224 A Unique Multi-Class Support Vector Machine Algorithm Using MapReduce

Authors: Aditi Viswanathan, Shree Ranjani, Aruna Govada

Abstract:

With data sizes constantly expanding, and with the classical machine learning algorithms that analyze such data requiring ever larger amounts of computation time and storage space, the need to distribute computation and memory requirements among several computers has become apparent. Although substantial work has been done in developing distributed binary SVM algorithms and multi-class SVM algorithms individually, the field of multi-class distributed SVMs remains largely unexplored. This research seeks to develop an algorithm that implements the Support Vector Machine over a multi-class data set and is efficient in a distributed environment. For this, we recursively choose the best binary split of a set of classes using a greedy technique, much like the divide-and-conquer approach. Our algorithm has shown better computation time during the testing phase than the traditional sequential SVM methods (One vs. One, One vs. Rest) and outperforms them as the size of the data set grows. This approach also classifies the data with higher accuracy than the traditional multi-class algorithms.
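The recursive greedy splitting of classes can be sketched as building a binary tree over class labels. For brevity, a nearest-group-centroid rule stands in for the binary SVM trained at each node, and the seeding heuristic (the farthest pair of class centroids) is an assumption of this sketch, not necessarily the paper's exact split criterion:

```python
import statistics

def centroid(points):
    return tuple(statistics.fmean(c) for c in zip(*points))

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def build_tree(X, y):
    """Recursively split the set of classes into two groups (greedy divide-and-conquer)."""
    classes = sorted(set(y))
    if len(classes) == 1:
        return classes[0]  # leaf: a single class remains
    cents = {c: centroid([x for x, lab in zip(X, y) if lab == c]) for c in classes}
    # Greedy split: seed the two groups with the farthest pair of class centroids.
    s1, s2 = max(((a, b) for a in classes for b in classes if a < b),
                 key=lambda p: dist2(cents[p[0]], cents[p[1]]))
    left = [c for c in classes if dist2(cents[c], cents[s1]) <= dist2(cents[c], cents[s2])]
    right = [c for c in classes if c not in left]
    # A nearest-group-centroid rule stands in for the binary SVM at this node.
    g_left = centroid([cents[c] for c in left])
    g_right = centroid([cents[c] for c in right])
    def subset(group):
        pairs = [(x, lab) for x, lab in zip(X, y) if lab in group]
        return [p[0] for p in pairs], [p[1] for p in pairs]
    return (g_left, g_right, build_tree(*subset(left)), build_tree(*subset(right)))

def predict(tree, x):
    while not isinstance(tree, str):  # descend until a single-class leaf is reached
        g_left, g_right, l_sub, r_sub = tree
        tree = l_sub if dist2(x, g_left) <= dist2(x, g_right) else r_sub
    return tree

# Four well-separated classes in 2D.
X = [(0, 0), (1, 0), (0, 10), (1, 10), (10, 0), (10, 1), (10, 10), (10, 9)]
y = ["a", "a", "b", "b", "c", "c", "d", "d"]
tree = build_tree(X, y)
print(predict(tree, (9, 9)))  # -> d
```

Prediction then requires only O(log k) binary decisions for k classes, which is where the speedup over One vs. One and One vs. Rest comes from.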

Keywords: distributed algorithm, MapReduce, multi-class, support vector machine

Procedia PDF Downloads 398
26223 Information Management Approach in the Prediction of Acute Appendicitis

Authors: Ahmad Shahin, Walid Moudani, Ali Bekraki

Abstract:

This research presents a predictive data mining model for the accurate diagnosis of acute appendicitis in patients, with the purpose of maximizing health service quality, minimizing morbidity and mortality, and reducing cost. Acute appendicitis is among the most common diseases requiring timely, accurate diagnosis and surgical intervention. Although the treatment of acute appendicitis is simple and straightforward, its diagnosis is still difficult because no single sign, symptom, laboratory test, or imaging examination accurately confirms the diagnosis in all cases. This contributes to increased morbidity and negative appendectomy rates. In this study, the authors propose to generate an accurate model for predicting acute appendicitis based, firstly, on a segmentation technique associated with the ABC algorithm to segment the patients; secondly, on applying fuzzy logic to process the massive volume of heterogeneous and noisy data (age, sex, fever, white blood cell count, neutrophilia, CRP, urine, ultrasound, CT, appendectomy, etc.) in order to express knowledge and analyze the relationships among the data in a comprehensive manner; and thirdly, on applying a dynamic programming technique to reduce the number of data attributes. The proposed model is evaluated against a set of benchmark techniques, as well as on a set of benchmark classification problems for osteoporosis, diabetes, and heart disease obtained from the UCI repository and other data sources.
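The fuzzy logic step maps raw clinical measurements onto overlapping fuzzy sets rather than hard thresholds. A minimal sketch of fuzzification for one such attribute (fever), using triangular membership functions with assumed, uncalibrated cut points (not the paper's values):

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 at a, rising to 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzify_temperature(t):
    # Illustrative fuzzy sets for body temperature in degrees C; cut points are
    # assumptions for this sketch, not calibrated clinical values.
    return {
        "normal": triangular(t, 35.5, 36.8, 37.6),
        "mild_fever": triangular(t, 37.2, 38.0, 38.8),
        "high_fever": triangular(t, 38.4, 39.5, 42.0),
    }

m = fuzzify_temperature(38.6)
print(max(m, key=m.get))  # -> mild_fever (degree 0.25, vs. 0.18 for high_fever)
```

A reading near a boundary thus carries partial membership in two sets at once, which is how noisy, heterogeneous inputs can be combined in fuzzy rules instead of being forced into crisp categories.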

Keywords: healthcare management, acute appendicitis, data mining, classification, decision tree

Procedia PDF Downloads 347
26222 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery

Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene

Abstract:

Data flows and the purposes for which data are reported differ and depend on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form the dataset constructed for each time point, containing all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrated methodological approach must be developed to respond to the indicated problem. The proposed methodology contains several steps: (1) collecting context data sets and validating the data; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study applies Grubbs' outlier analysis, particularly for data cleaning and for identifying the statistical significance of data reporting event cases. The Grubbs test is often used because it tests one extreme value at a time against the bounds expected under a standard normal distribution. In the study area, the test has not been widely applied by authors, except where the Grubbs test for outlier detection was used to identify outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select forms of genetic algorithm construction that have greater potential to extract the best solution. For freight delivery management, genetic algorithm structures are used as a more effective technique; accordingly, an adaptable genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor.
The authors suggest a methodology for multi-objective analysis that evaluates the collected context data sets and uses this evaluation to determine a delivery corridor for freight transfer services in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value for the management of multi-modal transportation processes.
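The Grubbs step above can be sketched directly: the statistic G = max|x_i - mean| / s is computed for the single most extreme value and compared against a critical value taken from a Grubbs table for the sample size and confidence level (99% in the study). The critical value is left to the caller here rather than hard-coded, and the threshold in the demo is arbitrary:

```python
import statistics

def grubbs_statistic(data):
    """Return (G, index) for the single most extreme value: G = max|x_i - mean| / s."""
    mean = statistics.fmean(data)
    s = statistics.stdev(data)  # sample standard deviation
    idx = max(range(len(data)), key=lambda i: abs(data[i] - mean))
    return abs(data[idx] - mean) / s, idx

def grubbs_outlier(data, g_crit):
    """Flag the most extreme value as an outlier if G exceeds the critical value.

    `g_crit` must come from a Grubbs critical-value table for the sample size and
    chosen confidence level; it is supplied by the caller, not computed here.
    """
    g, idx = grubbs_statistic(data)
    return idx if g > g_crit else None

# Fuel-consumption-style readings with one suspicious report.
readings = [10.0, 10.5, 9.8, 10.2, 30.0]
g, idx = grubbs_statistic(readings)
print(idx, round(g, 2))  # -> 4 1.79
```

Because the test removes one extreme value at a time, it would be re-run on the cleaned series until no value exceeds the critical bound.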

Keywords: multi-objective, analysis, data flow, freight delivery, methodology

Procedia PDF Downloads 175
26221 Improving Grid Interconnection Capabilities through Implementation of Power Electronics

Authors: Ashhar Ahmed Shaikh, Ayush Tandon

Abstract:

The swift depletion of fossil fuels has created a crucial need for alternative energy sources to meet vital demand. It is essential to develop alternative energy sources to cover the continuously increasing demand for energy while minimizing negative environmental impacts. Solar energy is one reliable source: it is freely available in nature, completely eco-friendly, and considered among the most promising power-generating sources due to its easy availability and other advantages for local power generation. This paper reviews the implementation of power electronic devices through the Solar Energy Grid Integration System (SEGIS) to increase efficiency. The paper also concentrates on the future grid infrastructure and various other applications that can make the grid smart. The development and implementation of power electronic devices such as PV inverters and power controllers play an important role in power supply in the modern energy economy. SEGIS opens pathways to promising solutions for new electronic and electrical components, such as advanced innovative inverter/controller topologies and their functions, economical energy management systems, innovative energy storage systems equipped with advanced control algorithms, advanced maximum power point tracking (MPPT) suited to all PV technologies, and the associated protocols and communications. In addition to advanced grid interconnection capabilities and features, the new hardware designs result in small size, less maintenance, and higher reliability. SEGIS will help the 'advanced integrated system' and 'smart grid' evolutionary processes run in a better way. Over the last few years, there has been major development in the field of power electronics, which has led to more efficient systems and a reduction in the cost per kilowatt.
Inverters have reached efficiencies in excess of 98%, and commercial solar modules have reached almost 21% efficiency.

Keywords: solar energy grid integration systems, smart grid, advanced integrated system, power electronics

Procedia PDF Downloads 181