Search results for: data loss
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27823

26323 Ergonomics Management and Sustainability: An Exploratory Study Applied to Automaker Industry in South of Brazil

Authors: Giles Balbinotti, Lucas Balbinotti, Paula Hembecker

Abstract:

The management of production-process design activities, both for the conception of future work and for the financial health of companies, is an important condition in an organizational model that supports the management of human aspects and their variabilities in work. It is important to seek, at all levels of the organization, understanding and consequent cultural change, so that factors associated with human aspects are considered and prioritized in projects. In this scenario, the central research question of this study is placed in the work context in which managers and project coordinators operate: how is top management convinced, during the design stages, to take ergonomics as a strategy for the performance and sustainability of the business? In this perspective, the general objective of this research is to analyze the application of human-aspects management in a real production-process project in the automotive industry, including the activity of the project manager and coordinator and the strategies used to convince top management to act on ergonomics in design. For this, a sociotechnical and ergonomic approach is adopted, given its anthropocentric premise of acting on the social system simultaneously with the technical system, supported by the MODAPTS system, which measures non-value-added times and their correlation with critical positions. The methodological approach adopted in this study is based on a literature review and on the analysis of the activity of the project coordinators of an industry, including the management of human aspects in the context of work variability and the strategies applied in project activities. The study observed that the performance loss of serial production lines reaches a figure on the order of 30%, which can render operations non-value-adding; ergonomic problems present in the professional activity are among the causes of this loss.

Keywords: human aspects in production process project, ergonomics in design, sociotechnical project management, sociotechnical, ergonomic principles, sustainability

Procedia PDF Downloads 257
26322 Fuzzy Wavelet Model to Forecast the Exchange Rate of IDR/USD

Authors: Tri Wijayanti Septiarini, Agus Maman Abadi, Muhammad Rifki Taufik

Abstract:

The IDR/USD exchange rate can serve as an indicator for analyzing the Indonesian economy. The exchange rate is an important factor because it strongly affects the Indonesian economy overall, so analysis of exchange-rate data is needed. The IDR/USD exchange-rate data are decomposed into frequency and time components, which can help the government monitor the Indonesian economy. This method is very effective at identifying such cases, yields highly accurate results, and has a simple structure. In this paper, the exchange-rate data used are weekly data from December 17, 2010 to November 11, 2014.
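
A minimal sketch of the decomposition step the abstract describes: a discrete wavelet transform splits the weekly IDR/USD series into time-frequency components. The wavelet ('db4'), the level, and the input file are illustrative assumptions, not details from the paper.

```python
# Hedged sketch: DWT decomposition of a weekly exchange-rate series.
import numpy as np
import pywt

rate = np.loadtxt("idr_usd_weekly.csv")         # hypothetical data file
coeffs = pywt.wavedec(rate, wavelet="db4", level=3)
approx, details = coeffs[0], coeffs[1:]         # trend + 3 detail bands

# Reconstruct the smoothed trend alone by zeroing the detail bands;
# a fuzzy (Mamdani) model would then be fitted on such components.
trend = pywt.waverec([approx] + [np.zeros_like(d) for d in details], "db4")
```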

Keywords: the exchange rate, fuzzy mamdani, discrete wavelet transforms, fuzzy wavelet

Procedia PDF Downloads 578
26321 Humanising Digital Healthcare to Build Capacity by Harnessing the Power of Patient Data

Authors: Durhane Wong-Rieger, Kawaldip Sehmi, Nicola Bedlington, Nicole Boice, Tamás Bereczky

Abstract:

Patient-generated health data should be seen as the expression of patients' experience, including outcomes reflecting the impact a treatment or service had on their physical health and wellness. We discuss how the healthcare system can reach a place where digital is a determinant of health: where data generated by patients is respected and their contribution to science is acknowledged. We also explore the biggest barriers facing this. The International Experience Exchange with Patient Organisations' Position Paper is based on a global patient survey conducted in Q3 2021 that received 304 responses; the results were discussed and validated by 15 patient experts and supplemented with literature research, and the results reported here are a subset of that work. Our research showed that patient communities want to influence how their data is generated, shared, and used, and the results clearly highlight that the community feels there is a lack of clear policies on data sharing. Our study concludes that a reasonable framework is needed to protect the integrity of patient data, minimise abuse, and build trust, and that patient communities need more influence and control over how health data is generated, shared, and used.

Keywords: digital health, equitable access, humanise healthcare, patient data

Procedia PDF Downloads 87
26320 An Unexpected Hand Injury with Pluridigital Fractures Due to Premature Explosion of a Ramadan Cannon

Authors: Hakan Akgul

Abstract:

Purpose: The use of firecrackers (i.e., the Ramadan cannon) during the month of Ramadan is a traditional way of indicating that the fasting period is over in Muslim countries. Here, we report the rehabilitation of a case of hand injury with pluridigital fractures due to the premature explosion of a Ramadan cannon. Materials and Methods: A 48-year-old man was admitted to the Emergency Department with a left hand injury resulting from the premature explosion of a Ramadan cannon. The patient was immediately taken to the operating room because of multiple fractures, tendon loss, and soft tissue loss in the left hand. Range of motion (ROM) of the joints was measured with a goniometer, pain and oedema were assessed, and splinting was performed. Results: The rehabilitation team took over the patient in the ninth postoperative week. During the 3-month rehabilitation, range of motion increased, oedema was brought under control, pain was reduced, and the colour of the skin returned to its normal tone. According to the visual analog scale (VAS), pain decreased from 9 to 4. Oedema around the metacarpophalangeal (MCP) joints decreased from 27.5 cm to 23.5 cm. Total active range of motion of the wrist increased from 5 degrees to 50 degrees, and total active range of motion of supination and pronation increased from 55 degrees to 70 degrees. Discussion: The rehabilitation of multiple hand injuries is quite difficult, and the different aspects of the trauma should be taken into consideration when rehabilitation is planned. Factors such as waiting for bone union, wound healing, and the use of external fixators may delay the rehabilitation process. Joint mobilization, massage for reducing oedema and preventing scar tissue, and exercise within the range of motion are efficient measures, while poor patient compliance with treatment may lead to poor outcomes. First of all, oedema and scar formation must be brought under control; removal of fixators should not be delayed once bone union allows, and exercise within the range of motion should be started.

Keywords: explosion, fracture, hand, injury

Procedia PDF Downloads 245
26319 Use of Machine Learning in Data Quality Assessment

Authors: Bruno Pinto Vieira, Marco Antonio Calijorne Soares, Armando Sérgio de Aguiar Filho

Abstract:

Nowadays, a massive amount of information is produced by different data sources, including mobile devices and transactional systems. In this scenario, concerns arise about how to establish and maintain data quality, which is now treated as a product to be defined, measured, analyzed, and improved to meet the needs of consumers, who use these data in decision-making and company strategy. Information of low quality can lead to issues that consume time and money, such as missed business opportunities, inadequate decisions, and bad risk-management actions. The step of identifying, evaluating, and selecting data sources of sufficient quality for the need at hand has become a costly task for users, since the sources do not provide information about their own quality. Traditional data quality control methods are based on user experience or business rules, limiting performance and slowing down the process with less-than-desirable accuracy. Using advanced machine learning algorithms, it is possible to take advantage of computational resources to overcome these challenges and add value for companies and users. In this study, machine learning is applied to data quality analysis on different datasets, comparing the performance of the techniques across the dimensions of quality assessment. As a result, we were able to create a ranking of the approaches used, as well as a system able to carry out data quality assessment automatically.
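
As one hedged illustration of the approach, a classifier can be trained on per-record quality-dimension scores to automate the assessment; the features, file, and label below are invented, since the abstract does not name its datasets.

```python
# Sketch: automated data quality assessment with a supervised model (assumed setup).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("records_with_quality_labels.csv")   # hypothetical corpus
X = df[["completeness", "validity", "consistency"]]   # quality-dimension scores
y = df["fit_for_use"]                                 # curator's quality label

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())        # assessment accuracy
```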

Keywords: machine learning, data quality, quality dimension, quality assessment

Procedia PDF Downloads 154
26318 Exploring Data Leakage in EEG Based Brain-Computer Interfaces: Overfitting Challenges

Authors: Khalida Douibi, Rodrigo Balp, Solène Le Bars

Abstract:

In the medical field, applications involving human experiments are frequently limited by reduced sample sizes, which makes the training of machine learning models quite sensitive and therefore neither very robust nor generalizable. This is notably the case in Brain-Computer Interface (BCI) studies, where the sample size rarely exceeds 20 subjects or a small number of trials. To address this problem, several resampling approaches are often used during the data preparation phase, an especially critical step in a data science analysis process. One naive approach commonly applied by data scientists consists in transforming the entire database before the resampling phase. However, this can cause a model's performance to be incorrectly estimated when making predictions on unseen data. In this paper, we explore the effect of data leakage observed during our BCI experiments for device control through the real-time classification of SSVEPs (Steady-State Visually Evoked Potentials). We also study potential ways to ensure optimal validation of the classifiers during the calibration phase to avoid overfitting. The results show that the scaling step is crucial for some algorithms and should be applied after the resampling phase to avoid data leakage and improve results.
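
A minimal sketch of the leakage-safe ordering the abstract recommends: wrap the scaler in a pipeline so it is fitted only on the training split of each fold, never on the whole dataset before resampling. The classifier and data shapes are placeholders, not the paper's SSVEP setup.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

X = np.random.randn(40, 64)        # placeholder: 40 trials x 64 features
y = np.random.randint(0, 2, 40)    # placeholder: two stimulation classes

# WRONG: StandardScaler().fit(X) on all data before splitting -> leakage.
# RIGHT: the pipeline re-fits the scaler inside each training fold only.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print(cross_val_score(model, X, y, cv=5).mean())
```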

Keywords: data leakage, data science, machine learning, SSVEP, BCI, overfitting

Procedia PDF Downloads 156
26317 Performance of Shariah-Based Investment: Evidence from Pakistani Listed Firms

Authors: Mohsin Sadaqat, Hilal Anwar Butt

Abstract:

Following the stock-selection guidelines provided by the Sharia Board (SB), we segregate the firms listed on the Pakistan Stock Exchange (PSX) into Sharia-Compliant (SC) and Non-Sharia-Compliant (NSC) stocks. Subsequently, we form portfolios within each group based on market capitalization and volatility. The purpose is to analyze and compare the performance of the two groups, as SC stocks have fewer diversification opportunities due to SB restrictions. Using data ranging from January 2004 until June 2016, our results indicate that in most cases the risk-adjusted returns (alphas) for the return differential between SC and NSC firms are positive. In addition, SC firms, in comparison to their PSX counterparts, provide excess returns that are hedged against market, size, and value-based systematic risk factors. Overall, these results reconcile with the prevailing notion that SC stocks, which have lower financial leverage and higher investment in real assets, are less exposed to market-based risks. Further, SC firms that are more capitalized and less volatile perform better than less capitalized and more volatile SC and NSC firms. To sum up, we do not find any substantial evidence of an opportunity loss due to limited diversification opportunities in the case of SC firms. To optimally utilize scarce resources, investors should consider SC firms as candidates in portfolio construction.
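
A hedged sketch of how such alphas are typically estimated: regress the SC-minus-NSC return spread on market, size, and value factors and read off the intercept. The column names and input file are assumptions for illustration; the paper does not publish its code.

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("psx_factors.csv")                # hypothetical monthly data
spread = df["ret_sc"] - df["ret_nsc"]              # SC minus NSC portfolio return
X = sm.add_constant(df[["mkt_rf", "smb", "hml"]])  # market, size, value factors
res = sm.OLS(spread, X).fit()
print(res.params["const"])                         # alpha of the return differential
```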

Keywords: diversification, performance, sharia compliant stocks, risk adjusted returns

Procedia PDF Downloads 201
26316 Hazardous Effects of Metal Ions on the Thermal Stability of Hydroxylammonium Nitrate

Authors: Shweta Hoyani, Charlie Oommen

Abstract:

HAN-based liquid propellants are perceived as a potential substitute for hydrazine in space propulsion. Storage stability over a long service life in orbit is one of the key concerns for HAN-based monopropellants because of their reactivity with metallic and non-metallic impurities, which could be entrained from the surfaces of fuel tanks and tubing. The result of this reactivity directly affects the handling, performance, and storability of the liquid propellant: gaseous products resulting from decomposition of the propellant can lead to deleterious pressure build-up in storage vessels, and the partial loss of an energetic component can change the ignition and combustion behavior and alter the performance of the thruster. In this context, the effects of the most plausible metal contaminants (iron, copper, chromium, nickel, manganese, molybdenum, zinc, titanium, and cadmium) on the thermal decomposition mechanism of HAN have been investigated. Studies involving different concentrations of metal ions in HAN at different preheat temperatures have been carried out. The effect of metal ions on the decomposition behavior of HAN was studied earlier in the context of HAN-based gun propellants; the current investigation, however, pertains to the decomposition mechanism of HAN as a monopropellant for space propulsion. Decomposition onset temperature, rate of weight loss, and heat of reaction were studied using DTA-TGA, and the total pressure rise and rate of pressure rise during decomposition were evaluated using an in-house-built constant-volume batch reactor. In addition, the reaction mechanism and product profile were studied using a TGA-FTIR setup. Iron and copper showed the strongest effect: initial results indicate that both act as sensitizers at concentrations as low as 50 ppm in 60% HAN solution at 80°C, whereas 50 ppm zinc shows no effect on the thermal decomposition of even 90% HAN solution at 80°C.

Keywords: hydroxylammonium nitrate, monopropellant, reaction mechanism, thermal stability

Procedia PDF Downloads 427
26315 Nuclear Decay Data Evaluation for 217Po

Authors: S. S. Nafee, A. M. Al-Ramady, S. A. Shaheen

Abstract:

Evaluated nuclear decay data for the 217Po nuclide are presented in this work. These data include recommended values for the half-life T1/2 and for the α-, β−-, and γ-ray emission energies and probabilities. Decay data from 221Rn α decay and 217Bi β− decay are presented. Q(α) has been updated based on the recently published Atomic Mass Evaluation, AME2012. In addition, the log ft values were calculated using the LOGFT program from the ENSDF evaluation package, and the total internal-conversion coefficients were calculated using the BrIcc program. Recommended values for the multipolarities have been assigned based on recent measurements, yielding a better intensity balance at the 254 keV and 264 keV gamma transitions.

Keywords: nuclear decay data evaluation, mass evaluation, total conversion coefficients, atomic mass evaluation

Procedia PDF Downloads 436
26314 Outbreak of Pulmonary Tuberculosis in Cojutepeque Military Brigade, El Salvador, July 2013

Authors: Juan Santos Garcia

Abstract:

Introduction: Tuberculosis is a chronic granulomatous disease caused by the microorganism Mycobacterium tuberculosis, which has the capacity to spread from the lungs to other parts of the body. Globally, the rate per 100,000 inhabitants varied from 136 in 2007 to 122 in 2012, while in the region of the Americas it was much lower, from 32 cases per 100,000 in 2007 to 29 in 2012. In El Salvador, incidence rates also varied over the same period, from 27.4 cases per 100,000 population in 2007 to 32 in 2012. Methods: Screening with smear and chest X-ray was performed on 80 military personnel from Military Brigade No. 5 of El Salvador. HIV tests were performed on the positive cases, who were also interviewed to collect demographic, clinical, laboratory, and risk-factor data. Frequencies, percentages, and rates were calculated using Excel. Rates were calculated for each of the five military dormitories (labelled A, B, C, D, and E). Results: The attack rate was 18.75% in dormitory C. The index case and two secondary cases were identified, with an exposure period of 59 days. Only the index case presented symptoms: cough, fever, and weight loss; the other two cases were asymptomatic. Discussion: We found a tuberculosis rate 526 times higher than the national rate. It was also 12.5 times higher than rates found in other studies of closed populations, such as school facilities. It was not possible to perform an association analysis.
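
The attack-rate arithmetic can be reproduced directly; the dormitory headcount of 16 below is inferred from 3 cases / 18.75% and is an assumption for illustration only.

```python
cases = 3                      # index case plus two secondary cases
exposed = 16                   # hypothetical occupants of dormitory C
attack_rate = cases / exposed
print(f"{attack_rate:.2%}")    # 18.75%
```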

Keywords: tuberculosis, outbreak, military brigade, chronic granulomatous disease

Procedia PDF Downloads 263
26313 Environmental Monitoring by Using Unmanned Aerial Vehicle (UAV) Images and Spatial Data: A Case Study of Mineral Exploitation in Brazilian Federal District, Brazil

Authors: Maria De Albuquerque Bercot, Caio Gustavo Mesquita Angelo, Daniela Maria Moreira Siqueira, Augusto Assucena De Vasconcellos, Rodrigo Studart Correa

Abstract:

Mining is an important socioeconomic activity in Brazil, although it negatively impacts the environment. Mineral operations cause irreversible changes in topography, removal of vegetation and topsoil, habitat destruction, displacement of fauna, loss of biodiversity, soil erosion, and siltation of watercourses, and they have the potential to exacerbate climate change. Because of these impacts and its pollution potential, mining activity in Brazil is legally subject to environmental licensing, and unlicensed mining operations, or operations that do not abide by the terms of an obtained license, are treated as environmental crimes in the country. This work reports a case analyzed at the Forensic Institute of the Brazilian Federal District Civil Police. The case consisted of detecting illegal aspects of sand exploitation at a licensed mine in the Federal District, near the city of Brasília. The fieldwork covered an area of roughly 6 ha, which was surveyed with an unmanned aerial vehicle (UAV) (PHANTOM 3 ADVANCED). The UAV overflight took about 20 min, with a maximum flight height of 100 m. A total of 592 georeferenced UAV images were obtained and processed in photogrammetric software (AGISOFT PHOTOSCAN 1.1.4), which generated a mosaic of georeferenced images and a 3D model in less than six working hours. The 3D model was analyzed in forensic software (MAPTEK I-SITE FORENSIC 2.2) for accurate modeling and volumetric analysis. To ensure the 3D model was a true representation of the mine site, the coordinates of ten control points and reference measures were taken during fieldwork and compared to the respective spatial data in the model. Finally, these spatial data were used for measuring the mining area, excavation depth, and volume of exploited sand. Results showed that the mine holder had not complied with some terms and conditions stated in the granted license, such as sand exploration beyond the authorized extent, depth, and volume. The ease, accuracy, and speed of the procedures used in this case highlight UAV imagery and computational photogrammetry as efficient tools for outdoor forensic exams, especially on environmental issues.
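
A minimal sketch of the volumetric step: the exploited volume follows from differencing a reference surface against the UAV-derived surface and summing over grid cells. The file names and the 10 cm cell size are illustrative assumptions, not values from the case.

```python
import numpy as np

before = np.load("reference_surface.npy")   # hypothetical pre-mining DEM (m)
after = np.load("uav_surface.npy")          # DEM from the photogrammetric model
cell_area = 0.10 * 0.10                     # assumed 10 cm x 10 cm cells (m^2)

depth = np.clip(before - after, 0, None)    # keep only excavated (lowered) cells
volume = depth.sum() * cell_area            # exploited volume in cubic metres
print(f"excavated volume: {volume:.1f} m3, max depth: {depth.max():.2f} m")
```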

Keywords: computational photogrammetry, environmental monitoring, mining, UAV

Procedia PDF Downloads 322
26312 Geographic Information System Using Google Fusion Table Technology for the Delivery of Disease Data Information

Authors: I. Nyoman Mahayasa Adiputra

Abstract:

Data in the field of health can be useful for data analysis; one example of such health data is disease data. Disease data are usually plotted geographically according to the area where they were collected, in this case the city of Denpasar, Bali. Disease data reports are still published in tabular form, and disease information has not yet been mapped in GIS form. In this research, disease information in Denpasar city is digitized into a geographic information system, with the district as the smallest administrative area; Denpasar city consists of four districts: North Denpasar, East Denpasar, West Denpasar, and South Denpasar. We use Google Fusion Tables technology for the map digitization process, since this technology offers convenience both to the administrator and to the information recipient: on the administrator side, disease data entry can be done easily and quickly, while on the receiving side, the resulting GIS application can be published as a website-based application accessible anywhere and anytime. In general, the results obtained in this study are divided into two parts. (1) Geolocation of Denpasar and all of its districts: the digitization of the map of Denpasar city produces a polygon geolocation for each district, and these results can be reused in subsequent GIS studies of the same administrative area. (2) Dengue fever mapping for 2014 and 2015: the disease data used in this study are dengue fever case data from 2014 and 2015, taken from the Denpasar Health Department profile reports of 2015 and 2016. This mapping can be useful for analyzing the spread of dengue hemorrhagic fever in the city of Denpasar.

Keywords: geographic information system, Google fusion table technology, delivery of disease data information, Denpasar city

Procedia PDF Downloads 135
26311 Inclusive Practices in Health Sciences: Equity Proofing Higher Education Programs

Authors: Mitzi S. Brammer

Abstract:

Given that the cultural make-up of programs of study in institutions of higher learning is becoming increasingly diverse, much has been written about cultural diversity from a university-level perspective. However, there are few data on specific programs and how they address inclusive practices when teaching and working with marginalized populations. This research study aimed to discover the baseline knowledge and attitudes of health sciences faculty, instructional staff, and students related to inclusive teaching/learning and interactions. Quantitative data were collected via an anonymous online survey (one designed for students and another for faculty/instructional staff) using a web-based program called Qualtrics, and were analyzed for faculty/instructional staff and students, respectively, using descriptive and comparative statistics (t-tests). Additionally, some participants voluntarily engaged in a focus group discussion in which qualitative data were collected on these same variables; collecting qualitative data to triangulate the quantitative data added trustworthiness to the overall findings. The research team analyzed the collected data, identified categories and trends, compared those data between faculty/staff and students, and reported the results as well as implications for future study and professional practice.

Keywords: inclusion, higher education, pedagogy, equity, diversity

Procedia PDF Downloads 69
26310 An Analysis of Sequential Pattern Mining on Databases Using Approximate Sequential Patterns

Authors: J. Suneetha, Vijayalaxmi

Abstract:

Sequential pattern mining involves applying data mining methods to large data repositories to extract usage patterns; its methodologies are used to analyze the data and identify patterns. Such patterns have been used to implement efficient systems that can make recommendations based on previously observed patterns, make predictions, improve the usability of systems, detect events, and, in general, help in making strategic product decisions. In this paper, we analyze the performance of approximate sequential pattern mining, defined as identifying patterns approximately shared by many sequences. Approximate sequential patterns can effectively summarize and represent databases by identifying the underlying trends in the data. We conduct an extensive and systematic performance study over synthetic and real data. The results demonstrate that ApproxMAP is effective and scalable in mining large sequence databases with long patterns.
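
A toy illustration of the idea behind approximate patterns (not the actual ApproxMAP alignment algorithm): within a cluster of similar sequences, keep the items supported by at least a threshold fraction of the sequences, ordered by average relative position. The data and threshold are invented.

```python
from collections import defaultdict

cluster = [list("abcxd"), list("abcd"), list("abzcd"), list("acd")]
theta = 0.75                           # minimum support within the cluster

positions = defaultdict(list)          # assumes each item occurs once per sequence
for seq in cluster:
    for pos, item in enumerate(seq):
        positions[item].append(pos / max(len(seq) - 1, 1))   # relative position

support = {it: len(ps) / len(cluster) for it, ps in positions.items()}
pattern = sorted((it for it, s in support.items() if s >= theta),
                 key=lambda it: sum(positions[it]) / len(positions[it]))
print(pattern)                         # ['a', 'b', 'c', 'd'] summarizes the cluster
```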

Keywords: multiple data, performance analysis, sequential pattern, sequence database scalability

Procedia PDF Downloads 353
26309 Physics-Informed Neural Network for Predicting Strain Demand in Inelastic Pipes under Ground Movement with Geometric and Soil Resistance Nonlinearities

Authors: Pouya Taraghi, Yong Li, Nader Yoosef-Ghodsi, Muntaseer Kainat, Samer Adeeb

Abstract:

Buried pipelines play a crucial role in the transportation of energy products such as oil, gas, and various chemical fluids, ensuring their efficient and safe distribution. However, these pipelines are often susceptible to ground movements caused by geohazards such as landslides, fault movements, and lateral spreading. Such ground movements can lead to strain-induced failures, resulting in leaks or explosions and, in turn, fires, financial losses, environmental contamination, and even loss of human life. It is therefore essential to study how buried pipelines respond when traversing geohazard-prone areas in order to assess the potential impact of ground movement on pipeline design. This study introduces a Physics-Informed Neural Network (PINN) approach to predict the strain demand in inelastic pipes subjected to permanent ground displacement (PGD). The method uses a deep learning framework that requires no training data and makes it feasible to consider more realistic assumptions regarding the existing nonlinearities; it leverages the underlying physics, described by differential equations, to approximate the solution. The study analyzes various scenarios involving different geohazard types, PGD values, and crossing angles, comparing the predictions with results obtained from finite element methods. The findings demonstrate good agreement between the proposed method and the finite element method, highlighting its potential as a simulation-free, data-free, and meshless alternative. This work paves the way for further advancements, such as the simulation-free reliability assessment of pipes subjected to PGD, as part of ongoing research that leverages the proposed method.
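
A minimal PINN sketch under strong simplifying assumptions: a linear Winkler soil spring and an Euler-Bernoulli pipe, EI*w'''' + k*w = q, trained by penalizing the ODE residual at random collocation points with no training data. The paper's actual model (inelastic pipe, nonlinear soil resistance) is far richer; all constants here are invented.

```python
import torch

EI, k, q, L = 1.0, 50.0, 1.0, 1.0                 # illustrative constants
net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                          torch.nn.Linear(32, 32), torch.nn.Tanh(),
                          torch.nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def derivative(y, x, order):
    # repeated autograd gives higher-order derivatives of y w.r.t. x
    for _ in range(order):
        y = torch.autograd.grad(y, x, torch.ones_like(y), create_graph=True)[0]
    return y

for step in range(2000):
    x = (torch.rand(128, 1) * L).requires_grad_(True)   # collocation points
    w = x * (L - x) * net(x)          # hard-encodes the BCs w(0) = w(L) = 0
    residual = EI * derivative(w, x, 4) + k * w - q
    loss = (residual ** 2).mean()     # physics loss only: no training data
    opt.zero_grad(); loss.backward(); opt.step()
```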

Keywords: strain demand, inelastic pipe, permanent ground displacement, machine learning, physics-informed neural network

Procedia PDF Downloads 64
26308 Medical Knowledge Management from the Integration of Heterogeneous Data to Knowledge Exploitation in a Decision-Making System

Authors: Nadjat Zerf Boudjettou, Fahima Nader, Rachid Chalal

Abstract:

Knowledge management means acquiring and representing knowledge relevant to a domain, a task, or a specific organization in order to facilitate its access, reuse, and evolution. This usually means building, maintaining, and evolving an explicit representation of knowledge; the next step is to provide access to that knowledge, that is, to disseminate it so as to enable effective use. Knowledge management in the medical field aims to improve the performance of the medical organization by allowing individuals in the care facility (doctors, nurses, paramedics, etc.) to capture, share, and apply collective knowledge in order to make optimal decisions in real time. In this paper, we propose a knowledge management approach based on a technique for integrating heterogeneous medical data by creating a data warehouse, a technique for extracting knowledge from the medical data by choosing a data mining method, and finally a technique for exploiting that knowledge in a case-based reasoning system.
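
A hedged sketch of the final, case-based reasoning step: retrieve the most similar past cases from warehouse-derived features and reuse their recorded decisions. The features, file, and number of neighbours are illustrative, as the abstract does not specify its case representation.

```python
import pandas as pd
from sklearn.neighbors import NearestNeighbors

cases = pd.read_csv("warehouse_cases.csv")       # hypothetical mined case base
X = cases[["age", "blood_pressure", "glucose"]]  # invented case features

retriever = NearestNeighbors(n_neighbors=3).fit(X)
_, idx = retriever.kneighbors([[54, 130, 6.1]])  # a new patient's features
print(cases.iloc[idx[0]]["decision"])            # decisions of the 3 closest cases
```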

Keywords: data warehouse, data mining, knowledge discovery in database, KDD, medical knowledge management, Bayesian networks

Procedia PDF Downloads 399
26307 Brain Atrophy in Alzheimer's Patients

Authors: Tansa Nisan Gunerhan

Abstract:

Dementia comes in different forms, including Alzheimer's disease, which is the most common dementia diagnosis among elderly individuals. On average, life expectancy for patients with Alzheimer's is around 4-8 years after diagnosis; however, it can reach twenty years or more, depending on the degree of brain shrinkage. Normally, the brain shrinks to some extent with aging but does not lose a vast number of neurons. In Alzheimer's patients, by contrast, neurons are destroyed rapidly; hence problems with memory, communication, and other metabolic activities begin. Toxic changes in the brain affect the stability of the neurons. Beta-amyloid and tau are two proteins believed to play a role in the development of Alzheimer's disease through such toxic changes. Beta-amyloid is a protein produced in the brain that is normally broken down and removed from the body; in people with Alzheimer's disease, however, its production increases and it accumulates in the brain, forming amyloid plaques, deposits that build up between nerve cells. These plaques are thought to disrupt communication between nerve cells and may contribute to the death of brain cells. Tau is a protein that helps stabilize microtubules, which are essential for the transport of nutrients and other substances within brain cells. In people with Alzheimer's disease, tau becomes abnormal and accumulates inside brain cells, forming neurofibrillary tangles that disrupt the normal functioning of brain cells and may contribute to their death. The accumulation of amyloid plaques and neurofibrillary tangles is thought to contribute to the shrinkage of brain tissue; as the brain shrinks, brain volume decreases. Brain atrophy in Alzheimer's disease is often accompanied by changes in the structure and function of brain cells and the connections between them, leading to a decline in brain function and to symptoms such as memory loss, difficulty with thinking and problem-solving, and changes in behavior and personality.

Keywords: Alzheimer, amyloid-beta, brain atrophy, neuron, shrinkage

Procedia PDF Downloads 98
26306 Effects of the Transtheoretical Model on Nutritional Behavior Change and Weight Loss in Obese and Overweight Women

Authors: Abdmohammad Mousavi, Mohsen Shams, Mehdi Akbartabar Toori, Ali Mousavizadeh, Mohammad Ali Morowatisharifabad

Abstract:

The effectiveness of the Transtheoretical Model (TTM) for nutritional behavior change and weight loss has been called into question by some studies. The objective of this study was to determine the effect of TTM-based nutritional behavior change and weight-loss interventions in obese and overweight women. This experimental study was an 8-month nutritional behavior change and weight-loss trial based on the TTM, with two conditions and pre- and post-intervention measurements of mean weight. A total of 299 obese and overweight women aged 20-44 years were selected from two health centers in Yasuj, a city in southwest Iran, and assigned to training (n = 142) and control (n = 157) groups. Data were analyzed using paired t-tests and one-way ANOVA. At baseline, adherence to healthy nutritional behavior differed significantly between the training group (9.4%) and the control group (38.8%) (p = .003), whereas mean weight did not (training: mean = 78.02 kg, SD = 11.67; control: mean = 77.23 kg, SD = 10.25; p = .66). At post-test, adherence to healthy nutritional behavior again differed significantly between the training group (70.1%) and the control group (37.4%) (p = .000); mean weight in the training group (mean = 74.65 kg, SD = 10.93) differed significantly from its pre-test value (p = .000), whereas the control group's did not (mean = 77.43 kg, SD = 10.43, p = .411). The training group lost 3.37 kg, whereas the control group gained 0.2 kg. These results support the applicability of the TTM for weight-loss interventions in women.
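
A hedged sketch of the paired pre/post comparison reported above, with invented arrays standing in for the actual weights.

```python
import numpy as np
from scipy import stats

pre = np.array([78.5, 82.1, 75.0, 90.3, 68.9])     # hypothetical baseline weights (kg)
post = pre - np.array([3.1, 4.0, 2.5, 3.8, 3.0])   # hypothetical 8-month weights

t, p = stats.ttest_rel(pre, post)                  # paired t-test within one group
print(t, p)
```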

Keywords: nutritional behavior, Transtheoretical Model, weight loss, women

Procedia PDF Downloads 487
26305 Cyber Supply Chain Resilience: Enhancing Security through Leadership to Protect National Security

Authors: Katie Wood

Abstract:

Cyber criminals are constantly on the lookout for new opportunities to exploit organisations and cause destruction. This can lead to significant economic loss for organisations in the form of financial damage, reputational harm, and even threats to the overall survival of the organisation, and it has serious consequences for national security. The threat of possible cyber attacks places further pressure on organisations to ensure they are secure, at a time when international-scale cyber attacks have occurred across a range of sectors and stakeholders want confidence that their data is protected. This is only achievable if a business fosters a resilient supply chain strategy, implemented throughout its supply chain through a strong cyber leadership culture. This paper discusses the essential role of, and need for, organisations to adopt a cyber leadership culture and direction, learning about their own internal processes in order to mitigate the systemic vulnerability of their supply chains. It argues that, to protect national security, there is an urgent need for a change towards a culture of cyber awareness. This is required in all organisations, regardless of sector or size, and must be implemented throughout the whole supply chain to support and protect economic prosperity and make the UK more resilient to cyber attacks. Businesses' understanding of their own supply chain and risk management cycle has to be the starting point for effective cyber mitigation strategies.

Keywords: cyber leadership, cyber mitigation strategies, resilient supply chain strategy, cybersecurity

Procedia PDF Downloads 244
26304 Mean Shift-Based Preprocessing Methodology for Improved 3D Buildings Reconstruction

Authors: Nikolaos Vassilas, Theocharis Tsenoglou, Djamchid Ghazanfarpour

Abstract:

In this work, we explore the capability of the mean shift algorithm as a powerful preprocessing tool for improving the quality of spatial data acquired by airborne scanners over densely built urban areas. On the one hand, high-resolution image data corrupted by noise from lossy compression techniques are appropriately smoothed while preserving the optical edges; on the other hand, low-resolution LiDAR data in the form of a normalized Digital Surface Model (nDSM) are upsampled through the joint mean shift algorithm. Experiments on both the edge-preserving smoothing and the upsampling capabilities, using synthetic RGB-z data, show that the mean shift algorithm is superior to bilateral filtering as well as to other classical smoothing and upsampling algorithms. Application of the proposed methodology to the 3D reconstruction of buildings in a pilot region of Athens, Greece, results in a significant visual improvement of the 3D building block model.
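
A minimal sketch of the edge-preserving smoothing step using OpenCV's mean shift filtering; the spatial and range bandwidths are illustrative, and the paper's joint RGB-z upsampling is not reproduced here.

```python
import cv2

img = cv2.imread("orthophoto_tile.png")            # hypothetical RGB tile
smoothed = cv2.pyrMeanShiftFiltering(img, 12, 20)  # spatial radius 12, range radius 20
cv2.imwrite("orthophoto_smoothed.png", smoothed)   # edges kept, compression noise smoothed
```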

Keywords: 3D buildings reconstruction, data fusion, data upsampling, mean shift

Procedia PDF Downloads 318
26303 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company

Authors: Rahma Al Balushi

Abstract:

Oman Electricity Transmission Company's (OETC) vision is to be a renowned, world-class transmission grid by 2025, and one of the indications of achieving this vision is obtaining Asset Management ISO 55001 certification, which requires setting out documented Standard Operating Procedures (SOPs). Hence, a documented SOP for the geographic information system (GIS) data process has been established. Furthermore, to effectively manage and improve OETC's power transmission, asset data and information need to be governed, as such, by the Asset Information & GIS department. This paper describes the GIS data submission process in detail and the journey to develop the current process. The methodology used to develop the process is based on three main pillars: system and end-user requirements, risk evaluation, and data availability and accuracy. The outcome of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate, and up-to-date data. Furthermore, thanks to this process, GIS has been, and is ready to be, integrated with other systems, and it serves as the source of data for all OETC users. Some decisions related to issuing No Objection Certificates (NOCs) and to scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS) are made on the basis of GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data system updates, data release/reporting, and data alterations also helped reduce the missing attributes in GIS transmission data. A considerable difference in geodatabase (GDB) completeness percentage was observed between 2017 and 2021. Overall, we conclude that through governance, the Asset Information & GIS department can control the GIS data process and can collect, properly record, and manage asset data and information within the OETC network. This control extends to other applications and systems integrated with, or related to, GIS systems.

Keywords: asset management ISO55001, standard procedures process, governance, geodatabase, NOC, CMMS

Procedia PDF Downloads 211
26302 Importance of Ethics in Cloud Security

Authors: Pallavi Malhotra

Abstract:

This paper examines the importance of ethics in cloud computing. In modern society, cloud computing offers individuals and businesses virtually unlimited space for storing and processing data and information. Much of the data and information stored in the cloud by various users, such as banks, doctors, architects, engineers, lawyers, consulting firms, and financial institutions, requires a high level of confidentiality and safeguarding. Cloud computing offers centralized storage and processing of data, and this has contributed immensely to the growth of businesses and improved the sharing of information over the internet. However, the accessibility and management of data and servers by a third party raise concerns regarding the privacy of clients' information and possible manipulation of the data by third parties. This paper suggests approaches that the various stakeholders should take to address the ethical issues involved in cloud-computing services. Ethical education and training are key for all stakeholders involved in handling the data and information stored or processed in the cloud.

Keywords: IT ethics, cloud computing technology, cloud privacy and security, ethical education

Procedia PDF Downloads 327
26301 The Feminism of Data Privacy and Protection in Africa

Authors: Olayinka Adeniyi, Melissa Omino

Abstract:

The field of data privacy and data protection in Africa is still evolving, with many African countries yet to enact legislation on the subject. While African governments are bringing their legislation up to speed in this field, the way patriarchy pervades every sector of African thought and manifests in society needs to be considered; moreover, the laws enacted ought to be inclusive, especially towards women. This, in a nutshell, is the essence of data feminism. Data feminism is a new way of thinking about data science and data ethics that is informed by the ideas of intersectional feminism. Feminising data privacy and protection involves considering women in issues of data privacy and protection, particularly in legislation, as is the case in this paper. This line of thought is not unfamiliar: international and regional human rights instruments specific to women came long after the general human rights instruments, when arguably such protections should have been included in the original general instruments in the first place. Since legislation on data privacy is being enacted in this century, with the rights and shortcomings of the earlier instruments in view, the cue should be taken to ensure inclusive, holistic legislation for data privacy and protection from the first instance. Data feminism is arguably an area that has been scantily researched, albeit a needful one. With violence against women spiraling in the cyber world, compounded by COVID-19 and governments' necessary responses to it, and with the effects of these on women and their rights, research on the feminism of data privacy and protection in Africa becomes inevitable. This paper seeks to answer the following questions: what is data feminism in the African context, and why is it important to data privacy and protection legislation; what laws, if any, exist on data privacy and protection in Africa, are they women-inclusive, and if not, why; and what measures are in place for the privacy and protection of women in Africa, and how can this be made possible? The paper investigates the issue of data privacy and protection in Africa, the legal framework, and the protection or provision it offers women, if any. It further researches the importance and necessity of feminising data privacy and protection, the effect of its absence, the challenges and bottlenecks in attaining this feat, and the possibilities of achieving data privacy and protection for African women. The paper also examines emerging practices for the data privacy and protection of women in other jurisdictions. It approaches the research through a review of papers, laws, and reports. It seeks to contribute to the existing literature in the field and is explorative in its suggestions, proposing draft clauses to make any data privacy and protection legislation women-inclusive. It would be useful for policymaking and for academic and public enlightenment.

Keywords: feminism, women, law, data, Africa

Procedia PDF Downloads 213
26300 Evaluation of Practicality of On-Demand Bus Using Actual Taxi-Use Data through Exhaustive Simulations

Authors: Jun-ichi Ochiai, Itsuki Noda, Ryo Kanamori, Keiji Hirata, Hitoshi Matsubara, Hideyuki Nakashima

Abstract:

We conducted exhaustive simulations for data assimilation and for the evaluation of service quality under various settings of a new shared transportation system called SAVS. Computational social simulation is a key technology for designing new social services such as SAVS. One open issue in SAVS was determining the service scale through social simulation. Using our exhaustive simulation framework, OACIS, we performed data assimilation and evaluated the effects of SAVS based on actual taxi-use data from Tajimi city, Japan. Finally, we obtained the conditions for realizing the new service with a reasonable service quality.

Keywords: on-demand bus system, social simulation, data assimilation, exhaustive simulation

Procedia PDF Downloads 324
26299 Optimal Pricing Based on Real Estate Demand Data

Authors: Vanessa Kummer, Maik Meusel

Abstract:

Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data do, however, suffer from missing information; for example, many users do not specify how many rooms they would like or what price they would be willing to pay. In economic analyses, it is often the case that only complete data are used. Usually, however, the proportion of complete data is rather small, which leads to most of the information being neglected; moreover, the complete records may form a strongly distorted subsample. In addition, the reason that data are missing might itself contain information, which is ignored under that approach. An interesting question is therefore whether, for economic analyses such as the one at hand, there is added value in using the whole data set with imputed missing values, compared to using the usually small share of complete data (the baseline). It is also interesting to see how different algorithms affect that result. The imputation of the missing data is done using unsupervised learning: of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, and neural network techniques, are applied. By training the model iteratively on the imputed data and thereby incorporating the information of all data into the model, the distortion of the first training set, the complete data, vanishes. In the next step, the performance of the algorithms is measured. This is done by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms under several parameter combinations, and comparing the estimates to the actual data. After finding the optimal parameter set for each algorithm, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates based on the imputed data sets do not differ significantly from each other, whereas the demand estimate derived from the baseline data does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Demand estimates derived from the whole data set are also much more accurate than the baseline estimation. Thus, in order to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.
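
A hedged sketch of the evaluation idea described above: mask known entries, impute them, and score the imputation against the held-out truth. KNN imputation stands in for the paper's unspecified algorithms, and the data are invented.

```python
import numpy as np
from sklearn.impute import KNNImputer

rng = np.random.default_rng(0)
X_true = rng.normal(size=(500, 4))       # stand-in for search-subscription features
mask = rng.random(X_true.shape) < 0.2    # artificially delete 20% of entries
X_miss = np.where(mask, np.nan, X_true)

X_hat = KNNImputer(n_neighbors=5).fit_transform(X_miss)
rmse = np.sqrt(np.mean((X_hat[mask] - X_true[mask]) ** 2))
print(f"imputation RMSE on masked entries: {rmse:.3f}")
```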

Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning

Procedia PDF Downloads 292
26298 Wildlife Habitat Corridor Mapping in Urban Environments: A GIS-Based Approach Using Preliminary Category Weightings

Authors: Stefan Peters, Phillip Roetman

Abstract:

The global loss of biodiversity threatens the benefits nature provides to human populations; it has become an even more pressing issue than climate change and requires immediate attention. While there have been successful global agreements for environmental protection, such as the Montreal Protocol, these are rare, and we cannot rely on them alone. It is therefore crucial to take national and local actions to support biodiversity. Australia is one of the 17 countries in the world with a particularly high level of biodiversity, and its cities are vital habitats for endangered species, with more of them found in urban areas than in non-urban ones. However, the protection of biodiversity in metropolitan Adelaide has been inadequate, with over 130 species disappearing since European colonization in 1836. In this research project, we conceptualized, developed, and implemented a framework for wildlife habitat hotspot and habitat corridor modelling in an urban context using geographic data and GIS modelling and analysis. We used detailed topographic and other geographic data provided by a local council, including the spatial and attributive properties of trees, parcels, water features, vegetated areas, roads, verges, traffic, and census data. Weighted factors considered in our raster-based Habitat Hotspot model include parcel size, parcel shape, population density, canopy cover, habitat quality, and proximity to habitats and water features; weighted factors considered in our raster-based Habitat Corridor model include habitat potential (the output of the Habitat Hotspot model), verge size, road hierarchy, road widths, human density, and the presence of remnant indigenous vegetation species. We developed a GIS model, using Python scripting and ArcGIS Pro ModelBuilder, to establish an automated, reproducible, and adjustable geoprocessing workflow, adaptable to any study area of interest. Our habitat hotspot and corridor modelling framework allows existing habitat hotspots and wildlife habitat corridors to be determined and mapped. The framework was applied to the case study of Burnside, a local council area in Adelaide, Australia, which encompasses an area of 30 km2. We applied end-user, expertise-based category weightings to refine our models and optimize the use of our habitat map outputs towards informing local strategic decision-making.
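
A minimal sketch of the weighted raster overlay behind such a Habitat Hotspot model: normalized factor rasters combined with preliminary category weightings. The factor names and weights are illustrative only, not the study's calibrated values.

```python
import numpy as np

canopy = np.load("canopy_cover.npy")        # hypothetical factor rasters,
parcel_size = np.load("parcel_size.npy")    # each normalized to [0, 1]
water_prox = np.load("water_proximity.npy")

weights = {"canopy": 0.5, "parcel": 0.3, "water": 0.2}   # preliminary weightings
hotspot = (weights["canopy"] * canopy
           + weights["parcel"] * parcel_size
           + weights["water"] * water_prox)
np.save("habitat_hotspot.npy", hotspot)     # feeds the corridor model
```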

Keywords: biodiversity, GIS modeling, habitat hotspot, wildlife corridor

Procedia PDF Downloads 121
26297 Unlocking the Puzzle of Borrowing Adult Data for Designing Hybrid Pediatric Clinical Trials

Authors: Rajesh Kumar G

Abstract:

A challenging aspect of any clinical trial is to plan the study design carefully so as to meet the study objective in an optimal way, and to validate the assumptions made during protocol design. When it is a pediatric study, there is the added challenge of stringent guidelines and difficulty in recruiting the necessary subjects. Unlike adult trials, pediatric trials have little historical data available, which is what is needed to validate assumptions when planning them. Typically, pediatric studies are initiated as soon as approval is obtained to market a drug for adults, so the pediatric study can be well planned using the historical information from the adult study together with available pediatric pilot study data or simulated pediatric data. Generalizing a historical adult study to a new pediatric study is a tedious task; however, it is possible by integrating various statistical techniques and exploiting the advantages of a hybrid study design, which helps achieve the study objective more smoothly even in the presence of many constraints. This paper explains how a hybrid study design can be planned together with an integrated technique, SEV, to plan the pediatric study. In brief, the SEV technique (Simulation, Estimation using borrowed adult data and Bayesian methods, and Validation) simulates the planned study data and obtains the desired estimates in order to validate the assumptions. This method of validation can be used to improve the accuracy of the data analysis, ensuring that results are as valid and reliable as possible, and allowing informed decisions to be made well ahead of study initiation. Based on the collected data, this technique provides insight into best practices when using data from historical studies and simulated data alike.
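
One common way to borrow adult data in a Bayesian estimation step is a power prior, in which the adult likelihood is down-weighted by a factor a0 before being combined with the pediatric data; the normal-normal sketch below uses invented numbers and is not the paper's SEV implementation.

```python
import numpy as np

adult_mean, adult_n = 1.8, 400                   # hypothetical adult treatment effect
ped = np.array([1.2, 2.1, 1.7, 0.9, 1.5, 2.3])   # small pediatric sample
sigma2, a0 = 1.0, 0.3                            # known variance; borrowing weight in [0, 1]

# Posterior under a flat initial prior: pediatric data at full weight,
# adult data down-weighted by a0.
prec = (a0 * adult_n + len(ped)) / sigma2
post_mean = (a0 * adult_n * adult_mean + ped.sum()) / (a0 * adult_n + len(ped))
print(post_mean, 1 / prec)                       # posterior mean and variance of the effect
```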

Keywords: adaptive design, simulation, borrowing data, bayesian model

Procedia PDF Downloads 81
26296 Analyzing Test Data Generation Techniques Using Evolutionary Algorithms

Authors: Arslan Ellahi, Syed Amjad Hussain

Abstract:

Software testing is a vital process in the software development life cycle: software quality is attained by passing the software through the testing phase. We examine automatic test data generation techniques, a key research area of software testing aimed at achieving test automation, which can eventually decrease testing time. In this paper, we review some of the approaches presented in the literature that use evolutionary search-based algorithms, such as the Genetic Algorithm and Particle Swarm Optimization (PSO), to validate the test data generation process. We also look into the quality of the generated test data, which increases or decreases the efficiency of testing. We propose test data generation techniques for model-based testing and work on the tuning and fitness function of the PSO algorithm.
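
A minimal PSO sketch for search-based test data generation: particles are candidate inputs and the fitness is a branch distance that reaches zero when the target branch (here, if a + b == 100) is covered. The function under test and all PSO constants are invented for illustration.

```python
import numpy as np

def fitness(x):                        # branch distance: 0 means branch covered
    a, b = x
    return abs(a + b - 100.0)

rng = np.random.default_rng(1)
pos = rng.uniform(-1000, 1000, (30, 2))          # 30 particles, 2 test inputs
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.apply_along_axis(fitness, 1, pos)
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(200):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    f = np.apply_along_axis(fitness, 1, pos)
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()

print(gbest, fitness(gbest))           # inputs driving a + b toward 100
```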

Keywords: search based, evolutionary algorithm, particle swarm optimization, genetic algorithm, test data generation

Procedia PDF Downloads 196
26295 Comparative Analysis of the Third Generation of Research Data for Evaluation of Solar Energy Potential

Authors: Claudineia Brazil, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Rafael Haag

Abstract:

Renewable energy sources depend on climatic variability, so adequate energy planning requires observations of the meteorological variables, preferably as long-period series. Despite the scientific and technological advances that meteorological measurement systems have undergone in recent decades, there is still a considerable lack of meteorological observations forming long-period series. Reanalysis is a data assimilation system built on general atmospheric circulation models that combines data collected at surface stations, ocean buoys, satellites, and radiosondes, allowing the production of long-period data for a wide range of variables. The third generation of reanalysis data emerged in 2010; among these products is the Climate Forecast System Reanalysis (CFSR) developed by the National Centers for Environmental Prediction (NCEP), with a spatial resolution of 0.5° x 0.5°. To overcome the difficulties described above, this study aims to evaluate the estimation of solar radiation from alternative databases, such as reanalysis data and meteorological satellite data, which can satisfactorily substitute for missing observations of solar radiation at the global and/or regional level. The analysis of the solar radiation data indicated that the reanalysis data from the CFSR model performed well relative to the observed data, with a coefficient of determination around 0.90. We therefore conclude that these data have the potential to be used as an alternative source at locations lacking stations or long series of solar radiation observations, which is important for the evaluation of solar energy potential.
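
The validation statistic quoted above is the coefficient of determination between observed and reanalysis series; the short arrays below are invented stand-ins for the station and CFSR data.

```python
import numpy as np
from sklearn.metrics import r2_score

observed = np.array([18.2, 21.5, 24.1, 22.8, 19.9, 17.3])     # e.g., MJ/m2/day
reanalysis = np.array([17.8, 22.0, 23.5, 22.1, 20.4, 18.0])
print(r2_score(observed, reanalysis))    # values near 0.9 indicate good agreement
```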

Keywords: climate, reanalysis, renewable energy, solar radiation

Procedia PDF Downloads 211
26294 Data Mining Spatial: Unsupervised Classification of Geographic Data

Authors: Chahrazed Zouaoui

Abstract:

In recent years, the volume of geospatial information has been increasing due to the evolution of information and communication technologies; this information is often presented through geographic information systems (GIS) and stored in spatial databases (SDB). Classical data mining has revealed a weakness in knowledge extraction from these enormous amounts of data, owing to the particularity of spatial entities, which are characterized by their interdependence (the first law of geography). This gave rise to spatial data mining, the process of analyzing geographic data that allows the extraction of knowledge and spatial relationships from geospatial data. Among the methods of this process, we distinguish the monothematic and the thematic; geo-clustering, one of the main tasks of spatial data mining, belongs to the monothematic methods. It groups similar geospatial entities into the same class and assigns more dissimilar ones to different classes; in other words, it maximizes intra-class similarity and minimizes inter-class similarity, taking into account the particularity of geospatial data. Two approaches to geo-clustering exist: dynamic processing, which applies algorithms designed for the direct treatment of spatial data, and the approach based on spatial data pre-processing, which applies classical clustering algorithms to pre-processed data (with spatial relationships integrated). The pre-processing approach is quite complex in several cases, so the search for approximate solutions involves approximation algorithms; among these, we are interested in dedicated approaches (partitioning-based and density-based clustering methods) and the bees algorithm (a biomimetic approach). Our study proposes a design of clear significance for this problem: using different algorithms to detect geospatial neighbourhoods automatically in order to implement geo-clustering by pre-processing, and applying the bees algorithm to this problem for the first time in the geospatial field.
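
A minimal sketch of one of the dedicated density-based approaches mentioned above: DBSCAN over latitude/longitude points with a haversine metric, so that geographic neighbourhoods are detected automatically. The sample points and the 2 km radius are invented.

```python
import numpy as np
from sklearn.cluster import DBSCAN

latlon_deg = np.array([[36.75, 3.06], [36.76, 3.05],   # hypothetical entities
                       [36.46, 2.83], [36.47, 2.84]])
coords = np.radians(latlon_deg)                # haversine expects radians

eps = 2.0 / 6371.0                             # 2 km expressed in radians
labels = DBSCAN(eps=eps, min_samples=2, metric="haversine").fit_predict(coords)
print(labels)                                  # cluster id per entity, -1 = noise
```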

Keywords: mining, GIS, geo-clustering, neighborhood

Procedia PDF Downloads 376