Search results for: regional data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26149

24979 Potential Impacts of Warming Climate on Contributions of Runoff Components from Two Catchments of Upper Indus Basin, Karakoram, Pakistan

Authors: Syed Hammad Ali, Rijan Bhakta Kayastha, Ahuti Shrestha, Iram Bano

Abstract:

The hydrology of the Upper Indus basin is not well understood owing to the complexity of its climate and topography and the scarcity of data above 5000 m a.s.l., where most precipitation falls as snow. The main objective of this study is to estimate the contributions of the different runoff components in the Upper Indus basin. To this end, the Modified Positive Degree-Day Model (MPDDM) was used to simulate runoff and investigate its components in two catchments of the Upper Indus basin, the Hunza and Gilgit River basins. These two catchments were selected because of their different glacier coverage, contrasting area distribution at high altitudes, and significant contribution to Upper Indus River flow. The model separates runoff into snow-ice melt and rainfall-base flow components. The simulation results show good agreement between observed and modeled runoff for both catchments, and indicate that the snow-ice contribution depends mainly on catchment characteristics and glaciated area. For the Gilgit River basin, the largest contributor to runoff is rain-base flow, whereas a large contribution of snow-ice melt is observed in the Hunza River basin due to its large glaciated fraction. This research will not only contribute to a better understanding of the impacts of climate change on the hydrological response of the Upper Indus, but will also provide guidance for the development of hydropower potential and water resources management, and offer a possible evaluation of future water quantity and availability in these catchments.
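The core of a positive degree-day melt model can be sketched in a few lines: melt is proportional to the sum of positive daily mean temperatures via a degree-day factor (DDF). The Python snippet below is a minimal illustration, not the MPDDM itself; the DDF value and temperatures are invented for the example.

```python
def pdd_melt(daily_temps_c, ddf_mm_per_degday=7.0):
    """Melt (mm water equivalent) = DDF * sum of positive degree-days.

    ddf_mm_per_degday is an illustrative degree-day factor; real models
    calibrate separate factors for snow and ice.
    """
    pdd = sum(t for t in daily_temps_c if t > 0)  # positive degree-day sum
    return ddf_mm_per_degday * pdd

# Example: one week of mean daily temperatures at a hypothetical station
melt = pdd_melt([-2.0, 1.5, 3.0, 0.5, -1.0, 4.0, 2.0])  # PDD = 11.0
```

Temperatures below freezing contribute nothing; here the positive degree-day sum is 11.0, giving 77 mm of melt for the week.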

Keywords: future discharge projection, positive degree day, regional climate model, water resource management

Procedia PDF Downloads 348
24978 Research on Coordinated Development Mechanism of Semi-urbanized Areas under the Background of Guangdong-Hong Kong-Macao Greater Bay Area: A Case Study of 'Baiyun-Nanhai' Pilot Area

Authors: Cheng Fang Wang, Fu Li Gao, Jian Ying Zhou

Abstract:

The '1+4' integration pilot area on the border between Guangzhou and Foshan is an important platform for Guangzhou-Foshan strategic cooperation and a typical semi-urbanized area with a mixed urban-rural landscape; the Baiyun-Nanhai pilot area is one of its components. Baiyun district and Nanhai district are separated only by the Pearl River. This paper puts forward three dimensions (production, living, and ecology) and discusses a cross-regional multi-agency negotiation mechanism. Taking the 'Baiyun-Nanhai' pilot area as a case study, POI (Point of Interest) data are used to analyze the distribution characteristics of 'production-living-ecological space' from the spatial dimension, and land-use change of 'production-living-ecological space' in the western region of Baiyun district between 2007 and 2017 is analyzed from the temporal dimension. Based on this analysis, an integration development strategy for cross-administrative regions grounded in a 'production-living-ecological integration' mechanism is then discussed, exploring mechanisms for industrial collaborative innovation, infrastructure co-construction, and ecological co-protection in cross-border semi-urbanized areas. The study is expected to provide a reference for the integrated construction of the Guangdong-Hong Kong-Macao Greater Bay Area.

Keywords: semi-urbanization, production-living-ecological integration, multi-agency negotiation, Guangzhou-Foshan integration, synergetic development

Procedia PDF Downloads 144
24977 Fuzzy Wavelet Model to Forecast the Exchange Rate of IDR/USD

Authors: Tri Wijayanti Septiarini, Agus Maman Abadi, Muhammad Rifki Taufik

Abstract:

The IDR/USD exchange rate is an indicator for analyzing the Indonesian economy. It is an important factor because it strongly affects the Indonesian economy as a whole, so analysis of exchange rate data is needed. In this study, the IDR/USD exchange rate series is decomposed into frequency and time components, which can help the government monitor the Indonesian economy. The proposed fuzzy wavelet method is effective for this task, gives highly accurate results, and has a simple structure. The data used are weekly exchange rates from December 17, 2010 to November 11, 2014.
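The time-frequency decomposition mentioned above is typically performed with a discrete wavelet transform. As an illustration, the Python sketch below applies a one-level Haar transform, splitting a series into a smooth approximation part and a detail part; the exchange rate values are invented for the example, and the study's actual model further combines the DWT with a Mamdani fuzzy system.

```python
import math

def haar_dwt(series):
    """One-level Haar wavelet transform: split a series into a smooth
    approximation (low-frequency) part and a detail (high-frequency) part."""
    approx, detail = [], []
    for i in range(0, len(series) - 1, 2):
        a, b = series[i], series[i + 1]
        approx.append((a + b) / math.sqrt(2))  # local average -> trend
        detail.append((a - b) / math.sqrt(2))  # local difference -> fluctuation
    return approx, detail

# Illustrative weekly IDR/USD rates (made-up values)
rates = [9000.0, 9010.0, 9005.0, 9030.0]
approx, detail = haar_dwt(rates)
```

Deeper decompositions repeat the same step on the approximation part, which is what lets the model separate slow trends from short-term fluctuations.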

Keywords: the exchange rate, fuzzy mamdani, discrete wavelet transforms, fuzzy wavelet

Procedia PDF Downloads 566
24976 Humanising Digital Healthcare to Build Capacity by Harnessing the Power of Patient Data

Authors: Durhane Wong-Rieger, Kawaldip Sehmi, Nicola Bedlington, Nicole Boice, Tamás Bereczky

Abstract:

Patient-generated health data should be seen as an expression of patients' experience, including outcomes that reflect the impact a treatment or service had on their physical health and wellness. We discuss how the healthcare system can reach a point where digital technology is a determinant of health, where data generated by patients are respected and their contribution to science is acknowledged, and we explore the biggest barriers to getting there. The International Experience Exchange with Patient Organisations' position paper is based on a global patient survey conducted in Q3 2021 that received 304 responses; the results were discussed and validated by 15 patient experts and supplemented with literature research, and a subset is reported here. Our research showed that patient communities want more influence and control over how their data are generated, shared, and used. We conclude that a reasonable framework is needed to protect the integrity of patient data, minimise abuse, and build trust. The results also clearly highlight that the community perceives a lack of clear policies on data sharing.

Keywords: digital health, equitable access, humanise healthcare, patient data

Procedia PDF Downloads 80
24975 Congenital Diaphragmatic Hernia Outcomes in a Low-Volume Center

Authors: Michael Vieth, Aric Schadler, Hubert Ballard, J. A. Bauer, Pratibha Thakkar

Abstract:

Introduction: Congenital diaphragmatic hernia (CDH) is a condition characterized by the herniation of abdominal contents into the thoracic cavity requiring postnatal surgical repair. Previous literature suggests improved CDH outcomes at high-volume regional referral centers compared to low-volume centers. The purpose of this study was to examine CDH outcomes at Kentucky Children’s Hospital (KCH), a low-volume center, compared to the Congenital Diaphragmatic Hernia Study Group (CDHSG). Methods: A retrospective chart review was performed at KCH for neonates with CDH from 2007-2019, subdivided into two cohorts: those requiring ECMO therapy and those not requiring ECMO therapy. Basic demographic data and measures of mortality and morbidity, including ventilator days and length of stay, were compared to the CDHSG. For the ECMO cohort, duration of ECMO, clinical bleeding, intracranial hemorrhage, sepsis, need for continuous renal replacement therapy (CRRT), need for sildenafil at discharge, timing of surgical repair, and total ventilator days were collected. Statistical analysis was performed using IBM SPSS Statistics version 28; one-sample t-tests and one-sample Wilcoxon signed-rank tests were used as appropriate. Results: There were 27 neonatal patients with CDH at KCH from 2007-2019; 9 of the 27 required ECMO therapy. Birth weight and gestational age were similar between KCH and the CDHSG (2.99 kg vs 2.92 kg, p = 0.655; 37.0 weeks vs 37.4 weeks, p = 0.51). About half of the patients were inborn in both cohorts (52% vs 56%, p = 0.676). The KCH cohort had significantly more Caucasian patients (96% vs 55%, p < 0.001). Unadjusted mortality was similar in both groups (KCH 70% vs CDHSG 72%, p = 0.857). Using ECMO utilization (KCH 78% vs CDHSG 52%, p = 0.118) and need for surgical repair (KCH 95% vs CDHSG 85%, p = 0.060) as proxies for severity, mortality in the two groups was comparable. No significant difference was noted for pulmonary outcomes such as average ventilator days (KCH 43.2 vs CDHSG 17.3, p = 0.078) and home oxygen dependency (KCH 44% vs CDHSG 24%, p = 0.108). Average length of hospital stay at KCH was similar to the CDHSG (64.4 vs 49.2 days, p = 1.000). Conclusion: Our study suggests that outcomes in CDH patients are independent of a center's case volume. Management of CDH with a standardized approach in a low-volume center can yield similar outcomes. These data support treating patients with CDH at low-volume centers rather than transferring them to higher-volume centers.

Keywords: ECMO, case volume, congenital diaphragmatic hernia, congenital diaphragmatic hernia study group, neonate

Procedia PDF Downloads 94
24974 Use of Machine Learning in Data Quality Assessment

Authors: Bruno Pinto Vieira, Marco Antonio Calijorne Soares, Armando Sérgio de Aguiar Filho

Abstract:

Nowadays, a massive amount of information is produced by different data sources, including mobile devices and transactional systems. In this scenario, concerns arise about how to establish and maintain data quality, which is now treated as a product to be defined, measured, analyzed, and improved to meet the needs of consumers, i.e., those who use the data in decision making and company strategy. Information of low quality can lead to issues that consume time and money, such as missed business opportunities, inadequate decisions, and poor risk management. Identifying, evaluating, and selecting data sources of adequate quality has become a costly task for users, since the sources do not provide information about their own quality. Traditional data quality control methods are based on user experience or business rules, which limits performance and slows the process with less than desirable accuracy. Using advanced machine learning algorithms, it is possible to take advantage of computational resources to overcome these challenges and add value for companies and users. In this study, machine learning is applied to data quality analysis on different datasets, comparing the performance of the techniques across the dimensions of quality assessment. As a result, we were able to rank the approaches used and to build a system capable of carrying out data quality assessment automatically.
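As a concrete illustration of one quality dimension that such a system might compute as an input feature, the sketch below scores field completeness over a set of records. This is a hypothetical Python example, not the system described in the study; the record layout and field names are invented.

```python
def completeness(records, fields):
    """Share of non-missing values per field: a simple data quality
    dimension a learned model could consume as a feature."""
    scores = {}
    for f in fields:
        filled = sum(1 for r in records if r.get(f) not in (None, ""))
        scores[f] = filled / len(records)
    return scores

# Toy records: the second row has an empty email, counted as missing
rows = [{"id": 1, "email": "a@x.com"}, {"id": 2, "email": ""}]
print(completeness(rows, ["id", "email"]))  # {'id': 1.0, 'email': 0.5}
```

Analogous scores for other dimensions (validity, consistency, timeliness) could be stacked into a feature vector per data source.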

Keywords: machine learning, data quality, quality dimension, quality assessment

Procedia PDF Downloads 146
24973 Exploring Data Leakage in EEG Based Brain-Computer Interfaces: Overfitting Challenges

Authors: Khalida Douibi, Rodrigo Balp, Solène Le Bars

Abstract:

In the medical field, applications involving human experiments are frequently limited to small sample sizes, which makes the training of machine learning models quite sensitive and therefore neither very robust nor generalizable. This is notably the case in Brain-Computer Interface (BCI) studies, where the sample size rarely exceeds 20 subjects or a few trials. To address this problem, several resampling approaches are often used during the data preparation phase, a critical step in any data science analysis. One naive approach commonly applied by data scientists consists in transforming the entire database before the resampling phase. However, this can cause a model's performance to be incorrectly estimated when making predictions on unseen data. In this paper, we explore the effect of data leakage observed during our BCI experiments on device control through real-time classification of SSVEPs (Steady-State Visually Evoked Potentials). We also study ways to ensure proper validation of the classifiers during the calibration phase to avoid overfitting. The results show that the scaling step is crucial for some algorithms and should be applied after the resampling phase to avoid data leakage and improve results.
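The leakage pitfall described above can be made concrete with a toy standardization example: the correct procedure fits the scaler on the training fold only and then applies it to the held-out fold, so test statistics never influence the transform. The snippet below is an illustrative Python sketch, not the authors' pipeline; the data values are made up.

```python
def fit_scaler(train):
    """Compute mean and (population) standard deviation on the training fold only."""
    mean = sum(train) / len(train)
    var = sum((x - mean) ** 2 for x in train) / len(train)
    return mean, var ** 0.5

def transform(values, mean, std):
    """Standardize values with previously fitted statistics."""
    return [(x - mean) / std for x in values]

# Leaky variant (avoid): statistics computed on ALL data before the split.
# Correct variant (below): fit on the training fold, apply to held-out data.
data = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
train, test = data[:4], data[4:]
mean, std = fit_scaler(train)            # fitted on train only
test_scaled = transform(test, mean, std)  # test data never touched the fit
```

With cross-validation the same rule applies per fold: refit the scaler inside each training split rather than once on the whole dataset.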

Keywords: data leakage, data science, machine learning, SSVEP, BCI, overfitting

Procedia PDF Downloads 152
24972 The Impact of Corporate Finance on Financial Stability in the Western Balkan Countries

Authors: Luan Vardari, Dena Arapi-Vardari

Abstract:

Financial stability is a critical component of economic growth and development, and it has been recognized as a key policy objective in many countries around the world. In the Western Balkans, financial stability has been a key issue in recent years, with a number of challenges facing the region, including high levels of public debt, weak banking systems, and economic volatility. Corporate finance, which refers to the financial management practices of firms, is an important factor that can impact financial stability. This paper aims to investigate corporate finance's impact on financial stability in Western Balkan countries. This study will use a mixed-methods approach to investigate the impact of corporate finance on financial stability in the Western Balkans. The study will begin with a comprehensive review of the existing literature on corporate finance and financial stability, focusing on the Western Balkan region. This will be followed by an empirical analysis of regional corporate finance practices using data from various industries and firms. The analysis will explore the relationship between corporate finance practices and financial stability, taking into account factors such as regulatory frameworks, economic conditions, and firm size. The results of the study are expected to provide insights into the impact of corporate finance on financial stability in the Western Balkans. Specifically, the study will identify the key corporate finance practices that contribute to financial stability in the region, as well as the challenges and obstacles that firms face in implementing effective corporate finance strategies. The study will also provide recommendations for policymakers and firms looking to enhance financial stability and resilience in the region.

Keywords: financial regulation, debt management, investment decisions, dividend policies, economic volatility, banking systems, public debt, prudent financial management, firm size, policy recommendations

Procedia PDF Downloads 74
24971 Consumer Over-Indebtedness in Germany: An Investigation of Key Determinants

Authors: Xiaojing Wang, Ann-Marie Ward, Tony Wall

Abstract:

The problem of over-indebtedness has increased since deregulation of the banking industry in the 1980s, and it has now become a major problem for most countries in Europe, including Germany. Consumer debt issues have attracted the attention not only of academics but also of government and debt counselling institutions. Overall, this research aims to address the knowledge gap regarding the causes of consumer over-indebtedness in Germany and to develop predictive models for assessing over-indebtedness risk at the consumer level. The situation is serious in Germany: the relatively high level of social welfare support suggests that consumer debt problems are caused by factors other than over-spending and income volatility alone. Prior literature suggests that the overall stability of the economy and the level of welfare support for individuals in the structural environment contribute to consumers' debt problems. In terms of cultural influence, conspicuous consumption theory in consumer behaviour suggests that consumers spend beyond their means in order to appear similar to consumers in a higher socio-economic class. This results in consumers taking on more debt than they can afford and eventually becoming over-indebted. Studies have also shown that financial literacy is negatively related to over-indebtedness risk. Whilst prior literature has examined structural and cultural influences separately, no study has taken a collective approach. To address this gap, a model is developed to investigate the association between consumer over-indebtedness and proxies for structural and cultural influences based on the above theories. The model also controls for consumer demographic characteristics identified as influential in prior literature, such as gender and age, and for adverse shocks, such as divorce or bereavement in the household.
Benefiting from SOEP regional data, this study is able to conduct quantitative empirical analysis to test both structural and cultural influences at a localised level. Using German Socio-Economic Panel (SOEP) study data from 2006 to 2016, this study finds that social benefits, financial literacy and the existence of conspicuous consumption all contribute to being over-indebted. Generally speaking, the risk of becoming over-indebted is high when consumers are in a low-welfare community, have little awareness of their own financial situation and always over-spend. In order to tackle the problem of over-indebtedness, countermeasures can be taken, for example, increasing consumers’ financial awareness, and the level of welfare support. By analysing causes of consumer over-indebtedness in Germany, this study also provides new insights on the nature and underlying causes of consumer debt issues in Europe.

Keywords: consumer, debt, financial literacy, socio-economic

Procedia PDF Downloads 211
24970 Nuclear Decay Data Evaluation for 217Po

Authors: S. S. Nafee, A. M. Al-Ramady, S. A. Shaheen

Abstract:

Evaluated nuclear decay data for the 217Po nuclide are presented in this work. These data include recommended values for the half-life T1/2 and for the α-, β−-, and γ-ray emission energies and probabilities. Decay data from 221Rn α decay and 217Bi β− decay are presented. Q(α) has been updated based on the recently published Atomic Mass Evaluation AME2012. In addition, log ft values were calculated using the Logft program from the ENSDF evaluation package, and total internal conversion coefficients were calculated using the BrIcc program. Recommended values for the multipolarities have been assigned based on recent measurements, yielding a better intensity balance at the 254 keV and 264 keV gamma transitions.

Keywords: nuclear decay data evaluation, mass evaluation, total conversion coefficients, atomic mass evaluation

Procedia PDF Downloads 432
24969 Geographic Information System Using Google Fusion Table Technology for the Delivery of Disease Data Information

Authors: I. Nyoman Mahayasa Adiputra

Abstract:

Data in the field of health can be useful for data analysis; one example is disease data, which is usually plotted geographically according to the area where it was collected, in this case the city of Denpasar, Bali. Disease data reports are still published in tabular form, and disease information has not yet been mapped in GIS form. In this research, disease information in Denpasar city is digitized as a geographic information system with the district as the smallest administrative unit. Denpasar City consists of 4 districts: North Denpasar, East Denpasar, West Denpasar, and South Denpasar. We use Google Fusion Table technology for the map digitization process, which simplifies the work both for the administrator and for the recipient of the information. On the administrator side, disease data can be entered easily and quickly; on the receiving side, the resulting GIS application can be published as a website-based application that can be accessed anywhere and at any time. In general, the results of this study fall into two parts: (1) Geolocation of Denpasar and all its districts: digitizing the map of Denpasar city produces a polygon geolocation for each district, which can be reused in subsequent GIS studies of the same administrative area. (2) Dengue fever mapping for 2014 and 2015: the disease data used in this study are dengue fever case data for 2014 and 2015, taken from the Denpasar Health Department profile reports of 2015 and 2016. This mapping can be useful for analyzing the spread of dengue hemorrhagic fever in the city of Denpasar.

Keywords: geographic information system, Google fusion table technology, delivery of disease data information, Denpasar city

Procedia PDF Downloads 127
24968 Inclusive Practices in Health Sciences: Equity Proofing Higher Education Programs

Authors: Mitzi S. Brammer

Abstract:

Given that the cultural make-up of programs of study in institutions of higher learning is becoming increasingly diverse, much has been written about cultural diversity from a university-level perspective. However, there are few data on how specific programs address inclusive practices when teaching and working with marginalized populations. This research study aimed to discover the baseline knowledge and attitudes of health sciences faculty, instructional staff, and students related to inclusive teaching/learning and interactions. Quantitative data were collected via an anonymous online survey (one designed for students and another for faculty/instructional staff) using the web-based program Qualtrics. Quantitative data were analyzed for faculty/instructional staff and students, respectively, using descriptive and comparative statistics (t-tests). Additionally, some participants voluntarily engaged in a focus group discussion in which qualitative data were collected on the same variables; triangulating the quantitative data with these qualitative data added trustworthiness to the overall findings. The research team analyzed the collected data, compared identified categories and trends between faculty/staff and students, and reports the results as well as implications for future study and professional practice.

Keywords: inclusion, higher education, pedagogy, equity, diversity

Procedia PDF Downloads 66
24967 Leadership Values in Succession Processes

Authors: Peter Heimerl, Alexander Plaikner, Mike Peters

Abstract:

Background and Significance of the Study: Family-run businesses are a decisive economic factor in the Alpine tourism and leisure industry. Within the next few years, a large number of family-run small and medium-sized businesses are expected to transfer ownership due to demographic developments. Several empirical studies identify four stages of succession processes: (1) the preparation phase, (2) the succession planning phase, (3) the development of the succession concept, and (4) the implementation of the business transfer. Family business research underlines the importance of individual and family values: leadership values in particular shape the preparation phase, which strongly determines the following stages. Aim of the Study: The study aims to answer the following research question: Which leadership values dominate during succession processes in family-run businesses in the Austrian Alpine tourism industry? Methodology: Twenty-two problem-centred individual interviews with 11 transferors and their 11 transferees were conducted. Data analysis was carried out with the software program MAXQDA, following an inductive approach to data coding. Major Findings: Data analysis shows that nine values particularly influence succession processes, especially during the vulnerable preparation phase. Participation is the most dominant value (162 references); it covers a style of cooperation, communication, and controlling. Discipline (142) is especially prevalent from the transferor's perspective and addresses entrepreneurial honesty and customer orientation. Development (138) is seen as an important value, though transferors and transferees emphasize it differently, with a main focus on strategic positioning and new technologies. Trust (105) is interpreted as a basic prerequisite for running the family firm smoothly. Interviewees underline the importance of being able to take a break from family-business management; however, this is only possible when openness and honesty establish trust within the family firm. Loyalty (102): almost all interviewees perceive that they can influence employees' loyalty through their own role-model behaviour. A good work-life balance (90) is very important to most transferors, especially for their employees; despite its communicated importance, however, commitment to the company is usually prioritised. Considerations of regionality (82) and regional responsibility are also frequently raised. Appreciation (75) is of great importance to both the handover and takeover generations, as appreciation towards the employees in the company and especially in connection with the family. Familiarity (66) and the blurring of boundaries between private and professional life are very common in family businesses; familial contact and open communication with employees were mentioned in almost all handovers. Conclusions: In the preparation phase of succession, successors and incumbents have to consider and discuss their leadership and family values of family-business management. Quite often, assistance is needed to discuss these values openly and jointly in the early stages of succession processes; a large majority of handovers fail because of conflicts around these values. Implications can be drawn to support family businesses, e.g., consulting initiatives at chambers of commerce and business consultancies must address this problem.

Keywords: leadership values, family business, succession processes, succession phases

Procedia PDF Downloads 97
24966 An Analysis of Sequential Pattern Mining on Databases Using Approximate Sequential Patterns

Authors: J. Suneetha, Vijayalaxmi

Abstract:

Sequential pattern mining applies data mining methods to large data repositories to extract usage patterns. The extracted patterns can be used to build systems that make recommendations based on previously observed behaviour, support predictions, improve the usability of systems, detect events, and, in general, help in making strategic product decisions. In this paper, we analyze the performance of approximate sequential pattern mining, defined as identifying patterns approximately shared by many sequences. Approximate sequential patterns can effectively summarize and represent a database by identifying its underlying trends. We conduct an extensive and systematic performance study over synthetic and real data. The results demonstrate that ApproxMAP is effective and scalable in mining large sequence databases with long patterns.
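One simple way to operationalise "approximately shared" is to count a sequence as supporting a pattern when it lies within a small edit distance of it. The Python sketch below illustrates that idea with a standard Levenshtein distance; it is a toy illustration of approximate support, not the ApproxMAP algorithm itself, which derives consensus patterns via multiple alignment and clustering.

```python
def edit_distance(a, b):
    """Dynamic-programming Levenshtein distance between two sequences."""
    # Border cells: transforming a prefix to/from the empty sequence
    dp = [[i + j if i * j == 0 else 0 for j in range(len(b) + 1)]
          for i in range(len(a) + 1)]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            dp[i][j] = min(dp[i - 1][j] + 1,        # delete from a
                           dp[i][j - 1] + 1,        # insert into a
                           dp[i - 1][j - 1] + (a[i - 1] != b[j - 1]))  # substitute
    return dp[len(a)][len(b)]

def approx_supports(sequence, pattern, max_dist=1):
    """A sequence 'approximately shares' a pattern when it is within
    max_dist edit operations of it."""
    return edit_distance(sequence, pattern) <= max_dist

print(approx_supports(list("abcd"), list("abd")))  # True: one deletion away
```

Counting how many database sequences satisfy `approx_supports` for a candidate pattern gives its approximate support.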

Keywords: multiple data, performance analysis, sequential pattern, sequence database scalability

Procedia PDF Downloads 339
24965 Perceptions of Chinese Top-up Students Transitioning through a Regional UK University: A Longitudinal Study Using the U-Curve Model

Authors: Xianghan O'Dea

Abstract:

This article argues that there is an urgent need to better understand the personal experiences of Chinese top-up students studying in the UK, since the number of Chinese students taking year-long top-up programmes in the UK has risen rapidly in recent years. This lack of knowledge could have implications for the reputation of some UK institutions and for the attractiveness of the UK higher education sector to future international students. This longitudinal study explored in depth the academic and social experiences of twelve Chinese top-up students at a UK institution and revealed that the students felt their experiences were significantly influenced by their surrounding contexts at the macro and meso levels, which have been largely overlooked in existing research. The article suggests the importance of improving communication between the partner institutions in China and the UK, and of providing sufficient pre-departure and post-arrival support to Chinese top-up students at the institutional level.

Keywords: articulation agreements, Chinese top-up students, top-up programmes, U-curve

Procedia PDF Downloads 168
24964 Medical Knowledge Management since the Integration of Heterogeneous Data until the Knowledge Exploitation in a Decision-Making System

Authors: Nadjat Zerf Boudjettou, Fahima Nader, Rachid Chalal

Abstract:

Knowledge management means acquiring and representing knowledge relevant to a domain, a task, or a specific organization in order to facilitate its access, reuse, and evolution. This usually entails building, maintaining, and evolving an explicit representation of knowledge. The next step is to provide access to that knowledge, that is, to disseminate it so that it can be used effectively. Knowledge management in the medical field aims to improve the performance of the medical organization by allowing individuals in the care facility (doctors, nurses, paramedics, etc.) to capture, share, and apply collective knowledge in order to make optimal decisions in real time. In this paper, we propose a knowledge management approach based on integrating heterogeneous medical data by creating a data warehouse, extracting knowledge from the medical data with a chosen data mining technique, and finally exploiting that knowledge in a case-based reasoning system.

Keywords: data warehouse, data mining, knowledge discovery in database, KDD, medical knowledge management, Bayesian networks

Procedia PDF Downloads 394
24963 Mean Shift-Based Preprocessing Methodology for Improved 3D Buildings Reconstruction

Authors: Nikolaos Vassilas, Theocharis Tsenoglou, Djamchid Ghazanfarpour

Abstract:

In this work we explore the capability of the mean shift algorithm as a powerful preprocessing tool for improving the quality of spatial data, acquired from airborne scanners, from densely built urban areas. On one hand, high resolution image data corrupted by noise caused by lossy compression techniques are appropriately smoothed while at the same time preserving the optical edges and, on the other, low resolution LiDAR data in the form of normalized Digital Surface Map (nDSM) is upsampled through the joint mean shift algorithm. Experiments on both the edge-preserving smoothing and upsampling capabilities using synthetic RGB-z data show that the mean shift algorithm is superior to bilateral filtering as well as to other classical smoothing and upsampling algorithms. Application of the proposed methodology for 3D reconstruction of buildings of a pilot region of Athens, Greece results in a significant visual improvement of the 3D building block model.
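The core of the mean shift procedure the authors build on can be illustrated in one dimension: a point is repeatedly shifted to the average of its neighbours within a bandwidth until it settles on a local mode of the density, which is what makes the filter edge-preserving. The Python sketch below is a minimal flat-kernel version; the paper's joint mean shift operates on multi-dimensional RGB-z data, and the sample points and bandwidth here are illustrative.

```python
def mean_shift_1d(points, x, bandwidth, iters=20):
    """Shift x toward the local mean of points within `bandwidth`
    (flat kernel) until it converges on a mode of the density."""
    for _ in range(iters):
        window = [p for p in points if abs(p - x) <= bandwidth]
        if not window:
            break  # no neighbours: nothing to shift toward
        new_x = sum(window) / len(window)
        if abs(new_x - x) < 1e-9:
            break  # converged on a mode
        x = new_x
    return x

# Two well-separated clusters; a start near the first converges to its centre
pts = [1.0, 1.1, 0.9, 5.0, 5.1, 4.9]
mode = mean_shift_1d(pts, 1.3, bandwidth=1.0)
```

Because points from the other cluster never enter the window, the estimate stays on its own side of the gap, which is the 1D analogue of smoothing without blurring across an edge.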

Keywords: 3D buildings reconstruction, data fusion, data upsampling, mean shift

Procedia PDF Downloads 314
24962 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company

Authors: Rahma Al Balushi

Abstract:

Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one indication of achieving this vision is obtaining Asset Management ISO 55001 certification, which requires setting out documented Standard Operating Procedures (SOPs). Hence, a documented SOP for the geographic information system (GIS) data process has been established. Also, to effectively manage and improve OETC's power transmission, asset data and information need to be governed as such by the Asset Information & GIS department. This paper describes the GIS data submission process in detail, along with the journey to develop the current process. The methodology used to develop the process rests on three main pillars: system and end-user requirements; risk evaluation; and data availability and accuracy. The outcome shows a dramatic change in the process used, which subsequently results in more efficient, accurate, and up-to-date data. Furthermore, thanks to this process, GIS has been, and remains, ready to be integrated with other systems and to serve as the source of data for all OETC users. Some decisions, such as issuing no-objection certificates (NOCs) and scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS), are now made on the basis of GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data system updates, data release/reporting, and data alterations also helped reduce the missing attributes of GIS transmission data. A considerable difference in geodatabase (GDB) completeness percentage was observed between 2017 and 2021. Overall, we conclude that through governance, the Asset Information & GIS department can control the GIS data process and collect, properly record, and manage asset data and information within the OETC network. This control extends to other applications and systems integrated with or related to GIS.

Keywords: asset management ISO55001, standard operating procedures, governance, geodatabase, NOC, CMMS

Procedia PDF Downloads 206
24961 Importance of Ethics in Cloud Security

Authors: Pallavi Malhotra

Abstract:

This paper examines the importance of ethics in cloud computing. In modern society, cloud computing offers individuals and businesses virtually unlimited space for storing and processing data and information. Much of the data and information stored in the cloud by various users, such as banks, doctors, architects, engineers, lawyers, consulting firms, and financial institutions, requires a high level of confidentiality and safeguarding. Cloud computing offers centralized storage and processing of data, which has immensely contributed to the growth of businesses and improved sharing of information over the internet. However, the accessibility and management of data and servers by a third party raise concerns regarding the privacy of clients' information and possible manipulation of the data by third parties. This paper suggests the approaches various stakeholders should take to address the ethical issues involved in cloud-computing services. Ethical education and training are key for all stakeholders involved in handling the data and information stored or processed in the cloud.

Keywords: IT ethics, cloud computing technology, cloud privacy and security, ethical education

Procedia PDF Downloads 322
24960 Evaluation of Practicality of On-Demand Bus Using Actual Taxi-Use Data through Exhaustive Simulations

Authors: Jun-ichi Ochiai, Itsuki Noda, Ryo Kanamori, Keiji Hirata, Hitoshi Matsubara, Hideyuki Nakashima

Abstract:

We conducted exhaustive simulations for data assimilation and evaluation of service quality under various settings of a new shared transportation system called SAVS. Computational social simulation is a key technology for designing recent social services such as SAVS as a new transportation service. One open issue in SAVS was determining the service scale through social simulation. Using our exhaustive simulation framework, OACIS, we performed data assimilation and evaluated the effects of SAVS based on actual taxi-use data from Tajimi city, Japan. Finally, we obtained the conditions under which the new service can be realized with reasonable service quality.
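The exhaustive-simulation idea can be sketched as a parameter sweep over service-scale settings, with each configuration scored for service quality. Everything below is an illustrative assumption: the parameter names (`num_vehicles`, `capacity`, `demand_per_hour`), the toy quality proxy, and the target threshold are stand-ins for the actual SAVS simulator and the sweeps that OACIS would manage.

```python
import itertools

def simulate_savs(num_vehicles, capacity, demand_per_hour):
    """Toy stand-in for one SAVS run: returns a mock service-quality score.
    A real study would call the transport simulator here instead."""
    # Simple analytic proxy: quality rises with fleet size and capacity,
    # and falls once demand outstrips supply.
    supply = num_vehicles * capacity
    return min(1.0, supply / demand_per_hour)

# Exhaustive sweep over the service-scale parameters (hypothetical values).
grid = {
    "num_vehicles": [10, 20, 40, 80],
    "capacity": [4, 8],
    "demand_per_hour": [100, 200],
}
results = []
for values in itertools.product(*grid.values()):
    params = dict(zip(grid.keys(), values))
    params["quality"] = simulate_savs(**params)
    results.append(params)

# Keep only configurations meeting an assumed target service quality.
feasible = [r for r in results if r["quality"] >= 0.8]
print(f"{len(feasible)} of {len(results)} configurations meet the target")
```

In a real deployment, OACIS would distribute these runs and collect the results; the sketch only shows the shape of the sweep and the feasibility filter.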

Keywords: on-demand bus system, social simulation, data assimilation, exhaustive simulation

Procedia PDF Downloads 319
24959 Optimal Pricing Based on Real Estate Demand Data

Authors: Vanessa Kummer, Maik Meusel

Abstract:

Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data, however, suffer from missing information: for example, many users do not specify how many rooms they would like or what price they would be willing to pay. Economic analyses often use only the complete records. Usually, however, the proportion of complete data is rather small, so most of the information is neglected; moreover, the complete records themselves may be strongly biased, and the very reason that data is missing may itself carry information, which such an approach ignores. An interesting question is therefore whether, for economic analyses such as the one at hand, there is added value in using the whole data set with imputed missing values compared to using the usually small share of complete data (the baseline), and how different algorithms affect that result. The imputation of the missing data is done using unsupervised learning. Out of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, and neural network techniques, are applied. By training the model iteratively on the imputed data and thereby including the information of all data in the model, the distortion of the first training set, the complete data, vanishes. In a next step, the performance of the algorithms is measured by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms under several parameter combinations, and comparing the estimates to the actual data.
After finding the optimal parameter set for each algorithm, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates based on imputed data sets do not differ significantly from each other, whereas the demand estimate derived from the baseline data does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Demand estimates derived from the whole data set are also much more accurate than the baseline estimate. Thus, to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.
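The survival-function step can be illustrated with a minimal sketch. The abstract fits parametric price distributions per segment; here, as a simplifying assumption, an empirical survival function is computed on synthetic lognormal prices, so all numbers and the segment itself are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic asking prices for one segment (e.g. 3-room flats in one
# region) -- a stand-in for the fitted price distribution.
prices = rng.lognormal(mean=14.2, sigma=0.3, size=5000)

def survival(prices, p):
    """Empirical survival function S(p): share of demand willing to pay
    at least price p -- a proxy for the selling probability at p."""
    return float(np.mean(prices >= p))

median_price = float(np.median(prices))
assert abs(survival(prices, median_price) - 0.5) < 0.02  # S(median) ~ 0.5

# Selling probability drops monotonically with the asking price.
grid = np.quantile(prices, [0.1, 0.5, 0.9])
probs = [survival(prices, p) for p in grid]
print(probs)  # roughly [0.9, 0.5, 0.1]
```

Comparing such curves computed from the baseline data and from the imputed data sets is what reveals whether the baseline sample is representative.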

Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning

Procedia PDF Downloads 285
24958 Unlocking the Puzzle of Borrowing Adult Data for Designing Hybrid Pediatric Clinical Trials

Authors: Rajesh Kumar G

Abstract:

A challenging aspect of any clinical trial is carefully planning the study design to meet the study objective in an optimal way and to validate the assumptions made during protocol design. When it is a pediatric study, there are the added challenges of stringent guidelines and difficulty in recruiting the necessary subjects. Unlike adult trials, little historical data is available for pediatrics, yet such data is needed to validate assumptions when planning pediatric trials. Typically, pediatric studies are initiated as soon as a drug is approved for marketing to adults, so the historical information from the adult study, together with available pediatric pilot-study data or simulated pediatric data, can be used to plan the pediatric study well. Generalizing a historical adult study to a new pediatric study is a tedious task; however, it is possible by integrating various statistical techniques and exploiting the advantages of a hybrid study design, which helps achieve the study objective smoothly even in the presence of many constraints. This research paper explains how a hybrid study design can be planned along with an integrated technique, SEV, to plan the pediatric study. In brief, the SEV technique (Simulation, Estimation (using borrowed adult data and Bayesian methods), and Validation) simulates the planned study data and derives the desired estimates to validate the assumptions. This method of validation can improve the accuracy of the data analysis, ensuring that results are as valid and reliable as possible and allowing informed decisions to be made well ahead of study initiation. Based on the collected data, this technique offers insight into best practices when using data from a historical study and simulated data alike.
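One common way to borrow adult data in a Bayesian estimation step is a power prior, where the adult likelihood is down-weighted by a factor a0 in [0, 1]. The sketch below is a minimal conjugate-normal illustration of that idea, not the authors' SEV implementation: the endpoint, sample sizes, and the known-sigma assumption are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical endpoint values: a large historical adult study and a
# small simulated pediatric cohort (stand-ins, not real trial data).
adult = rng.normal(loc=10.0, scale=2.0, size=400)
peds = rng.normal(loc=9.0, scale=2.0, size=30)

def power_prior_posterior(current, historical, a0, sigma=2.0):
    """Posterior mean/sd for a normal mean with known sigma, where the
    historical likelihood is down-weighted by a0 (power prior) and a
    flat initial prior is assumed."""
    n_eff = len(current) + a0 * len(historical)
    mean = (current.sum() + a0 * historical.sum()) / n_eff
    sd = sigma / np.sqrt(n_eff)
    return mean, sd

no_borrow = power_prior_posterior(peds, adult, a0=0.0)
full_borrow = power_prior_posterior(peds, adult, a0=1.0)

# Borrowing pulls the pediatric estimate toward the adult mean and
# tightens the posterior (smaller sd).
print(no_borrow, full_borrow)
```

Repeating this over many simulated pediatric data sets is what would let the planned assumptions be validated before study initiation.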

Keywords: adaptive design, simulation, borrowing data, Bayesian model

Procedia PDF Downloads 75
24957 Analyzing Test Data Generation Techniques Using Evolutionary Algorithms

Authors: Arslan Ellahi, Syed Amjad Hussain

Abstract:

Software testing is a vital process in the software development life cycle, and the quality of software can only be assured once it has passed through the testing phase. Automatic test data generation is a key research area of software testing, aimed at achieving test automation that can eventually decrease testing time. In this paper, we review some of the approaches presented in the literature that use evolutionary search-based algorithms, such as the Genetic Algorithm and Particle Swarm Optimization (PSO), for the test data generation process. We also examine the quality of the generated test data, which increases or decreases the efficiency of testing. We propose test data generation techniques for model-based testing and work on tuning the parameters and the fitness function of the PSO algorithm.
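The search-based idea can be sketched as PSO minimizing a branch-distance fitness: a test input is "good" when it drives a hard-to-reach branch condition toward being satisfied. The function under test, the branch condition, and all PSO parameter values below are illustrative assumptions, not the authors' actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def branch_distance(x):
    """Fitness for covering the branch `if a*a + b*b == 25` in a unit
    under test: distance to satisfying the condition (0 => covered)."""
    a, b = x
    return abs(a * a + b * b - 25.0)

# Minimal PSO: positions, velocities, personal and global bests.
n_particles, dim, iters = 30, 2, 200
pos = rng.uniform(-10, 10, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([branch_distance(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
for _ in range(iters):
    r1 = rng.random((n_particles, dim))
    r2 = rng.random((n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([branch_distance(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print(gbest, pbest_val.min())  # a test input driving the branch distance to ~0
```

Tuning w, c1, and c2, and shaping the fitness function itself, is exactly the kind of work the abstract refers to.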

Keywords: search based, evolutionary algorithm, particle swarm optimization, genetic algorithm, test data generation

Procedia PDF Downloads 188
24956 The Impact of Artificial Intelligence on Human Developments Obligations and Theories

Authors: Seham Elia Moussa Shenouda

Abstract:

The relationship between development and human rights has long been the subject of academic debate. To understand the dynamics between these two concepts, various principles have been adopted, from the right to development to development-based human rights. Despite the initiatives taken, the relationship between development and human rights remains unclear. However, the overlap between these two views, and the idea that efforts should be made in the field of human rights, have increased in recent years. It is then evaluated whether the right to sustainable development is acceptable or not. The article concludes that the principles of sustainable development are directly or indirectly recognized in various human rights instruments, which is a good answer to the question posed above. It therefore cites regional and international human rights agreements, as well as the jurisprudence and interpretative guidelines of human rights institutions, to support this hypothesis.

Keywords: sustainable development, human rights, the right to development, the human rights-based approach to development, environmental rights, economic development, social sustainability, human rights protection, human rights violations, workers’ rights, justice, security

Procedia PDF Downloads 34
24955 Development of 4D Dynamic Simulation Tool for the Evaluation of Left Ventricular Myocardial Functions

Authors: Deepa, Yashbir Singh, Shi Yi Wu, Michael Friebe, Joao Manuel R. S. Tavares, Hu Wei-Chih

Abstract:

Cardiovascular disease can be detected by measuring the regional and global wall motion of the left ventricle (LV) of the heart. In this study, we designed a dynamic simulation tool using Computed Tomography (CT) images to assess the difference between actual and simulated left ventricular function. Thirteen healthy subjects were involved in the study. We found a high correlation between actual and simulated left ventricular wall motion, confirming that our simulation tool is feasible for simulating left ventricular motion.
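The actual-versus-simulated comparison amounts to correlating two wall-motion curves sampled over the cardiac cycle. The sketch below shows that computation on synthetic traces; the 20-phase sampling, the sinusoidal displacement, and the noise level are all assumptions, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: regional wall displacement over one cardiac
# cycle (20 CT phases) for an "actual" and a "simulated" ventricle.
phases = np.linspace(0.0, 2.0 * np.pi, 20)
actual = 5.0 * np.sin(phases)                         # mm displacement
simulated = 5.0 * np.sin(phases) + rng.normal(0, 0.4, size=20)

# Pearson correlation between the actual and simulated motion curves.
r = np.corrcoef(actual, simulated)[0, 1]
print(f"correlation r = {r:.3f}")
```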

Keywords: cardiac imaging, left-ventricular remodeling, cardiac wall motion, myocardial functions

Procedia PDF Downloads 341
24954 Data Mining Spatial: Unsupervised Classification of Geographic Data

Authors: Chahrazed Zouaoui

Abstract:

In recent years, the volume of geospatial information has been increasing with the evolution of information and communication technologies; this information is often presented through geographic information systems (GIS) and stored in spatial databases (SDB). Classical data mining has shown a weakness in extracting knowledge from these enormous amounts of data, owing to the particularity of spatial entities, which are characterized by interdependence between them (the first law of geography). This gave rise to spatial data mining: the process of analyzing geographic data that allows the extraction of knowledge and spatial relationships from geospatial data. Among the methods of this process, we distinguish the monothematic and the thematic. Geo-clustering, one of the main tasks of spatial data mining, falls under the monothematic method: it groups similar geo-spatial entities into the same class and assigns dissimilar entities to different classes. In other words, it maximizes intra-class similarity and minimizes inter-class similarity, taking into account the particularity of geo-spatial data. Two approaches to geo-clustering exist: dynamic processing of the data, which applies algorithms designed for the direct treatment of spatial data, and an approach based on pre-processing the spatial data, which applies classical clustering algorithms to pre-processed data (by integrating spatial relationships).
This pre-processing-based approach is quite complex in several cases, so the search for approximate solutions involves approximation algorithms, among which we are interested in dedicated approaches (partitioning and density-based clustering methods) and the bees approach (a biomimetic approach). Our study proposes a design for this problem that uses different algorithms to automatically detect geo-spatial neighborhoods, in order to implement geo-clustering by pre-processing, and applies the bees algorithm to this problem for the first time in the geo-spatial field.
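The partitioning branch of the approach can be illustrated with plain k-means on pre-processed point coordinates. This is only a minimal sketch of the "classical clustering on pre-processed data" step: the synthetic two-group point cloud stands in for real GIS entities, and the pre-processing by spatial relationships is reduced here to raw (x, y) coordinates.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic geo-entities: two spatial groupings of (x, y) coordinates,
# standing in for pre-processed GIS point data.
pts = np.vstack([
    rng.normal([0.0, 0.0], 0.5, size=(50, 2)),
    rng.normal([5.0, 5.0], 0.5, size=(50, 2)),
])

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: maximize intra-class similarity by assigning each
    entity to its nearest centroid, then recomputing the centroids."""
    r = np.random.default_rng(seed)
    centroids = points[r.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(points[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        new = []
        for j in range(k):
            members = points[labels == j]
            # Keep the old centroid if a cluster happens to empty out.
            new.append(members.mean(axis=0) if len(members) else centroids[j])
        centroids = np.array(new)
    return labels, centroids

labels, centroids = kmeans(pts, k=2)
order = centroids.sum(axis=1).argsort()
print(centroids[order].round(2))  # one centroid near (0, 0), one near (5, 5)
```

A density-based or bees-algorithm variant would replace the centroid update rule, but the intra-class/inter-class objective stays the same.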

Keywords: mining, GIS, geo-clustering, neighborhood

Procedia PDF Downloads 374
24953 Sustainable Water Resource Management and Challenges in Indian Agriculture

Authors: Rajendra Kumar Isaac, Monisha Isaac

Abstract:

India, having a vast cultivable area and regional climatic variability, encounters water resource management problems at various levels. The agricultural production of India needs to be increased to meet projected population growth. Sustainable water resource management is the only option to ensure food security, especially in the northern Indian states, where ground and surface water resources are fast depleting. Various tools and technologies available for the management of scarce water resources are discussed. It is concluded that multiple uses of water, adoption of the latest water management options, and identification of climate-adaptable cropping and farming systems can enhance water productivity and help address the fast-growing water management and water shortage problems in Indian agriculture.

Keywords: water resource management, sustainable, water management technologies, water productivity, agriculture

Procedia PDF Downloads 397
24952 The Re-Emergence of Russia's Foreign Policy (Case Study: The Middle East)

Authors: Maryam Azish

Abstract:

Russia, as an emerging global player in recent years, has claimed a special place in the Middle East. Despite all the challenges it has faced over the years, it has consistently pursued its presence in various fields, with a strategy that defines its maneuvering power at the level of competition, and even confrontation, with the United States. Its current approach is therefore considered important, as it is an influential actor in the Middle East. After the collapse of the Soviet Union, when the Russians withdrew completely from the Middle East, the regional scene remained almost uncontested for the Americans. With the start of the US-led wars in Iraq and Afghanistan and the subsequent developments that led to US military and political setbacks, a new chapter in regional security opened, in which ISIL and Taliban terrorism, alongside the Arab Spring, destabilized the Middle East. Because of this, the Americans took every opportunity to strengthen their military presence. Iraq, Syria, and Afghanistan have been the three arenas where terrorism took shape, and the countries of the region have each reacted to this phenomenon accordingly. The West dealt with it on a case-by-case basis amid the fluid situation in the Arab countries and the region. Russian President Vladimir Putin accused the US of falling asleep in the face of ISIS and terrorism in Syria. In fact, this was an opportunity for the Russians to revive their presence in Syria. This article suggests that combining the politics of recognition with constructivist theory offers a better understanding of Russia's endeavors to assert its international position. Accordingly, Russia's distinctiveness and its ambition for great-power status have played a vital role in shaping its national interests and, subsequently, its foreign policy, particularly in the Putin era.
The focal claim of the paper is that Russia's foreign policy cannot be adequately explained with realist methods. Consequently, to fill the prevailing vacuum, this study employs the politics of recognition within a constructivist framework to examine Russia's foreign policy in the Middle East. The results of this paper show that the key aim of Russian foreign policy discourse, alongside increasing power and wealth, is to be recognized and to restore its position as a great power in the global system. After years of active and pervasive presence in the Middle East, the Syrian crisis has created an opportunity for Russia to consolidate its position in the evolving global and regional order and to counter US unilateralism. In the meantime, the author argues that the question of the West's recognition of Russia's position in the global system has played a foremost role in serving its national interests.

Keywords: constructivism, foreign policy, Middle East, Russia, regionalism

Procedia PDF Downloads 148
24951 Analysis and Prediction of Netflix Viewing History Using Netflixlatte as an Enriched Real Data Pool

Authors: Amir Mabhout, Toktam Ghafarian, Amirhossein Farzin, Zahra Makki, Sajjad Alizadeh, Amirhossein Ghavi

Abstract:

The high number of Netflix subscribers makes it attractive for data scientists to extract valuable knowledge from analyses of viewers' behaviour. This paper presents a set of statistical insights into viewers' viewing histories. A deep learning model is then used to predict users' future watching behaviour based on their previous watching history within the Netflixlatte data pool. Netflixlatte is an aggregated and anonymized data pool of 320 Netflix viewers, comprising roughly 250,000 data points recorded between 2008 and 2022. We observe insightful correlations between the distribution of viewing time and the COVID-19 pandemic outbreak. The presented deep learning model predicts future movie and TV series viewing habits with an average loss of 0.175.
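The recurrent part of such a model can be illustrated with a single LSTM cell consuming a viewing-history sequence step by step. The sketch below is a numpy-only forward pass under stated assumptions: the abstract does not specify the architecture or features, so the per-step feature count, hidden size, and sequence length are all hypothetical, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step: gates computed from input x and hidden h."""
    z = W @ x + U @ h + b                    # stacked gate pre-activations
    i, f, o, g = np.split(z, 4)
    i, f, o = (1.0 / (1.0 + np.exp(-v)) for v in (i, f, o))  # sigmoid gates
    c_new = f * c + i * np.tanh(g)           # update the cell state
    h_new = o * np.tanh(c_new)               # expose the hidden state
    return h_new, c_new

n_features, hidden = 8, 16                   # e.g. per-day viewing features
W = rng.normal(0, 0.1, size=(4 * hidden, n_features))
U = rng.normal(0, 0.1, size=(4 * hidden, hidden))
b = np.zeros(4 * hidden)

# Run a 30-step viewing-history sequence through the cell.
h, c = np.zeros(hidden), np.zeros(hidden)
for x in rng.normal(size=(30, n_features)):
    h, c = lstm_step(x, h, c, W, U, b)

print(h.shape)  # the final hidden state would feed a prediction head
```

A trained model (e.g. in a deep learning framework) would stack such cells and attach a loss over the predicted next viewing event; the cell mechanics are the same.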

Keywords: data analysis, deep learning, LSTM neural network, Netflix

Procedia PDF Downloads 248
24950 Analysis of User Data Usage Trends on Cellular and Wi-Fi Networks

Authors: Jayesh M. Patel, Bharat P. Modi

Abstract:

Measurements on mobile devices have demonstrated that the total data demand from users is far higher than previously articulated by measurements based solely on a cellular-centric view of smartphone usage, and the ratio of Wi-Fi to cellular traffic varies significantly between countries. This paper presents a comparison between users' cellular and Wi-Fi data usage. This perspective helps operators understand the growing importance and application of yield management strategies designed to squeeze maximum returns from their investments in the networks and devices that enable the mobile data ecosystem. The transition from unlimited data plans towards tiered pricing and, in the future, towards more value-centric pricing offers significant revenue upside potential for mobile operators; however, without complete insight into all aspects of smartphone customer behavior, operators are unlikely to capture the maximum return from this billion-dollar market opportunity.

Keywords: cellular, Wi-Fi, mobile, smartphone

Procedia PDF Downloads 364