Search results for: data security architecture
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27986

25256 The Development and Provision of a Knowledge Management Ecosystem, Optimized for Genomics

Authors: Matthew I. Bellgard

Abstract:

The field of bioinformatics has made, and continues to make, substantial progress and contributions to life science research and development. However, this paper contends that a systems approach is needed to integrate the bioinformatics activities of any project in a defined manner. The application of critical control points in this bioinformatics systems approach may be useful for identifying and evaluating points in a pathway where the risk of a specified activity can be reduced and monitored and its quality enhanced.

Keywords: bioinformatics, food security, personalized medicine, systems approach

Procedia PDF Downloads 422
25255 On the Network Packet Loss Tolerance of SVM Based Activity Recognition

Authors: Gamze Uslu, Sebnem Baydere, Alper K. Demir

Abstract:

In this study, the data loss tolerance of a Support Vector Machine (SVM) based activity recognition model, and its multi-activity classification performance when data are received over a lossy wireless sensor network, are examined. Initially, the classification algorithm we use is evaluated in terms of resilience to random data loss, with 3D acceleration sensor data for the sitting, lying, walking, and standing actions. The results show that the proposed classification method can recognize these activities successfully despite high data loss. Secondly, the effect of differentiated quality-of-service performance on activity recognition success is measured with activity data acquired from a multi-hop wireless sensor network, which introduces high data loss. The effect of the number of nodes on reliability and multi-activity classification success is demonstrated in a simulation environment. To the best of our knowledge, the effect of data loss in a wireless sensor network on the activity detection success rate of an SVM based classification algorithm has not been studied before.
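
The robustness-to-loss experiment described above can be sketched in a few lines. The snippet below is a generic illustration on synthetic one-axis acceleration windows, using a simple nearest-centroid classifier as a stand-in for the paper's SVM; the activity signals, loss model, and features are all assumptions, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(42)
T = np.linspace(0, 2 * np.pi, 128)

def make_windows(label, n):
    # synthetic 1-axis acceleration: 'lying' is flat, 'walking' oscillates
    base = np.zeros_like(T) if label == "lying" else np.sin(4 * T)
    return [base + rng.normal(0, 0.1, T.size) for _ in range(n)]

def features(window, loss_rate):
    # drop a random fraction of samples (simulated packet loss), then
    # summarise the surviving samples with their mean and standard deviation
    kept = window[rng.random(window.size) >= loss_rate]
    return np.array([kept.mean(), kept.std()])

labels = ("lying", "walking")
centroids = {lbl: np.mean([features(w, 0.0) for w in make_windows(lbl, 20)], axis=0)
             for lbl in labels}

def classify(window, loss_rate):
    f = features(window, loss_rate)
    return min(labels, key=lambda lbl: np.linalg.norm(f - centroids[lbl]))

# even with 50% simulated packet loss the summary features remain separable
accuracy = np.mean([classify(w, 0.5) == lbl
                    for lbl in labels for w in make_windows(lbl, 10)])
```

With well-separated activity signatures, the mean/std features survive heavy random loss, which mirrors the abstract's finding that recognition remains successful despite high data loss.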

Keywords: activity recognition, support vector machines, acceleration sensor, wireless sensor networks, packet loss

Procedia PDF Downloads 475
25254 Colonialism and Modernism in Architecture, the Case of a Blank Page Opportunity in Casablanca

Authors: Nezha Alaoui

Abstract:

The early 1950s French colonial context in Morocco provided an opportunity for architects to question the modernist established order by building dwellings for the local population. The dwellings were originally designed to encourage Muslims to adopt an urban lifestyle based on local customs. However, the inhabitants transformed their dwelling into a hybrid habitation. This paper aims to prove the relevance of the design process in accordance with the local colonial context by analyzing the dwellers' appropriation process and the modification of their habitat.

Keywords: colonial heritage, appropriation process, islamic spatial habit, housing experiment, modernist mass housing

Procedia PDF Downloads 128
25253 Formal Models of Sanitary Inspection Teams' Activities

Authors: Tadeusz Nowicki, Radosław Pytlak, Robert Waszkowski, Jerzy Bertrandt, Anna Kłos

Abstract:

This paper presents methods for the formal modeling of the activities of sanitary inspection teams during outbreaks of food-borne disease. The models make it possible to measure the characteristics of sanitary inspection activities and, as a result, to improve the performance of sanitary services and thus food security.

Keywords: food-borne disease, epidemic, sanitary inspection, mathematical models

Procedia PDF Downloads 302
25252 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company

Authors: Rahma Saleh Hussein Al Balushi

Abstract:

Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one of the indications of achieving this vision is obtaining Asset Management ISO55001 certification, which requires setting out documented Standard Operating Procedures (SOPs). Hence, a documented SOP for the Geographical Information System (GIS) data process has been established. Also, to effectively manage and improve OETC's power transmission, asset data and information need to be governed by the Asset Information & GIS department. This paper describes in detail the current GIS data submission process and the journey of developing it. The methodology used to develop the process is based on three main pillars: system and end-user requirements; risk evaluation; and data availability and accuracy. The output of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate, and up-to-date data. Furthermore, owing to this process, GIS has been, and remains, ready to be integrated with other systems, as well as serving as the source of data for all OETC users. Some decisions related to issuing No Objection Certificates (NOC) for excavation permits and to scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS) have consequently been made based upon GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data system updates, data release/reporting, and data alterations has also contributed to reducing missing attributes and enhancing the data quality index of GIS transmission data. A considerable difference in Geodatabase (GDB) completeness percentage was observed between 2017 and 2022. Overall, it is concluded that through governance, the Asset Information & GIS department can control the GIS data process and collect, properly record, and manage asset data and information within the OETC network. This control extends to other applications and systems integrated with or related to GIS systems.

Keywords: asset management ISO55001, standard procedures process, governance, CMMS

Procedia PDF Downloads 125
25251 Effects of Data Correlation in a Sparse-View Compressive Sensing Based Image Reconstruction

Authors: Sajid Abas, Jon Pyo Hong, Jung-Ryun Le, Seungryong Cho

Abstract:

Computed tomography and laminography are heavily investigated in a compressive sensing based image reconstruction framework to reduce the dose to patients as well as to radiosensitive devices such as multilayer microelectronic circuit boards. Nowadays, researchers are actively working on optimizing compressive sensing based iterative image reconstruction algorithms to obtain better quality images. However, the effects of the sampled data's properties on the reconstructed image's quality, particularly under insufficiently sampled data conditions, have not been explored in computed laminography. In this paper, we investigate the effects of two data properties, i.e., sampling density and data incoherence, on the image reconstructed by conventional computed laminography and by a recently proposed method called the spherical sinusoidal scanning scheme. We found that in a compressive sensing based image reconstruction framework, image quality mainly depends upon data incoherence when the data are uniformly sampled.

Keywords: computed tomography, computed laminography, compressive sensing, low-dose

Procedia PDF Downloads 464
25250 Fuzzy Wavelet Model to Forecast the Exchange Rate of IDR/USD

Authors: Tri Wijayanti Septiarini, Agus Maman Abadi, Muhammad Rifki Taufik

Abstract:

The IDR/USD exchange rate can serve as an indicator for analyzing the Indonesian economy. The exchange rate is an important factor because it has a large effect on the Indonesian economy overall, so analysis of exchange rate data is needed. In this work, the IDR/USD exchange rate data are decomposed into frequency and time components, which can help the government monitor the Indonesian economy. The method is very effective at identifying patterns, gives highly accurate results, and has a simple structure. In this paper, the data used are weekly exchange rates from December 17, 2010 until November 11, 2014.
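
The time-frequency decomposition step described above can be illustrated with a single level of the Haar discrete wavelet transform; this is a generic sketch on synthetic weekly data, not the paper's fuzzy Mamdani model, and the trend/noise parameters are assumptions.

```python
import numpy as np

def haar_dwt(x):
    # one level of the Haar DWT: 'approx' carries the low-frequency trend,
    # 'detail' carries the high-frequency fluctuations (x must have even length)
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

# synthetic weekly 'exchange rate': slow linear trend plus weekly noise
t = np.arange(208)  # four years of weekly observations
rate = 9000.0 + 5.0 * t + np.random.default_rng(1).normal(0.0, 20.0, t.size)
approx, detail = haar_dwt(rate)
```

The approximation coefficients summarise the trend a forecaster would model, while the detail coefficients isolate short-term noise; the transform is exactly invertible, so no information is lost by the split.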

Keywords: the exchange rate, fuzzy mamdani, discrete wavelet transforms, fuzzy wavelet

Procedia PDF Downloads 571
25249 Humanising Digital Healthcare to Build Capacity by Harnessing the Power of Patient Data

Authors: Durhane Wong-Rieger, Kawaldip Sehmi, Nicola Bedlington, Nicole Boice, Tamás Bereczky

Abstract:

Patient-generated health data should be seen as an expression of the experience of patients, including outcomes reflecting the impact a treatment or service had on their physical health and wellness. We discuss how the healthcare system can reach a place where digital is a determinant of health: where data is generated by patients, is respected, and patients' contribution to science is acknowledged. We also explore the biggest barriers facing this. The International Experience Exchange with Patient Organisation's Position Paper is based on a global patient survey conducted in Q3 2021 that received 304 responses. The results were discussed and validated by 15 patient experts and supplemented with literature research; the results reported here are a subset of that work. Our research showed that patient communities want to influence how their data is generated, shared, and used. Our study concludes that a reasonable framework is needed to protect the integrity of patient data, minimise abuse, and build trust. The results also demonstrated a need for patient communities to have more influence and control over how health data is generated, shared, and used, and they clearly highlight that the community feels there is a lack of clear policies on sharing data.

Keywords: digital health, equitable access, humanise healthcare, patient data

Procedia PDF Downloads 82
25248 Study of the Influence of Eccentricity Due to Configuration and Materials on Seismic Response of a Typical Building

Authors: A. Latif Karimi, M. K. Shrimali

Abstract:

Seismic design is a critical stage in the process of design and construction of a building. It includes strategies for designing earthquake-resistant buildings to ensure the health, safety, and security of the building occupants and assets. Hence, it becomes very important to understand the behavior of structural members precisely for the construction of buildings that can yield a better response to seismic forces. This paper investigates the behavior of a typical structure when subjected to ground motion. The corresponding mode shapes and modal frequencies are studied to interpret the response of an actual structure using different fabricated models and 3D visual models. In this study, three different structural configurations are subjected to horizontal ground motion, and the effects of “stiffness eccentricity” and the placement of infill walls are checked to determine how each parameter contributes to a building’s response to dynamic forces. The deformation data from lab experiments and the analysis in SAP2000 software are reviewed to obtain the results. This study revealed that the seismic response of a building can be improved by introducing higher deformation capacity into the building. Also, proper design of infill walls and maintaining a symmetrical configuration are key factors in building stability during an earthquake.

Keywords: eccentricity, seismic response, mode shape, building configuration, building dynamics

Procedia PDF Downloads 200
25247 Use of Machine Learning in Data Quality Assessment

Authors: Bruno Pinto Vieira, Marco Antonio Calijorne Soares, Armando Sérgio de Aguiar Filho

Abstract:

Nowadays, a massive amount of information is produced by different data sources, including mobile devices and transactional systems. In this scenario, concerns arise about how to establish and maintain data quality, which is now treated as a product to be defined, measured, analyzed, and improved to meet the needs of consumers, i.e., those who use these data in decision-making and company strategies. Information that reaches low levels of quality can lead to issues that consume time and money, such as missed business opportunities, inadequate decisions, and bad risk management actions. The step of identifying, evaluating, and selecting data sources of adequate quality for a given need has become a costly task for users, since the sources do not provide information about their quality. Traditional data quality control methods are based on user experience or business rules, limiting performance and slowing down the process with less than desirable accuracy. Using advanced machine learning algorithms, it is possible to take advantage of computational resources to overcome these challenges and add value for companies and users. In this study, machine learning is applied to data quality analysis on different datasets, seeking to compare the performance of the techniques according to the dimensions of quality assessment. As a result, we were able to create a ranking of the approaches used, as well as a system able to carry out data quality assessment automatically.
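
As a minimal sketch of one quality dimension of the kind assessed above, the snippet below scores column-wise completeness; the dataset is hypothetical, and the paper's ML-based ranking of approaches is not reproduced here.

```python
import numpy as np

# hypothetical dataset: rows are records, columns are attributes, NaN = missing
data = np.array([
    [1.0, 2.5, np.nan],
    [2.0, np.nan, 4.1],
    [3.0, 1.7, np.nan],
    [4.0, 3.3, 5.0],
])

# completeness dimension: fraction of non-missing values per attribute
# (here: 1.0 for the first column, 0.75 for the second, 0.5 for the third)
completeness = 1.0 - np.isnan(data).mean(axis=0)
```

Scores like these, computed per dimension (completeness, validity, consistency, ...), are what a learned model can aggregate when ranking candidate data sources.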

Keywords: machine learning, data quality, quality dimension, quality assessment

Procedia PDF Downloads 148
25246 Exploring Data Leakage in EEG Based Brain-Computer Interfaces: Overfitting Challenges

Authors: Khalida Douibi, Rodrigo Balp, Solène Le Bars

Abstract:

In the medical field, applications related to human experiments are frequently limited to reduced sample sizes, which makes the training of machine learning models quite sensitive and therefore not very robust or generalizable. This is notably the case in Brain-Computer Interface (BCI) studies, where the sample size rarely exceeds 20 subjects or a small number of trials. To address this problem, several resampling approaches are often used during the data preparation phase, which is a critical step in a data science analysis process. One naive approach usually applied by data scientists consists of transforming the entire database before the resampling phase. However, this can cause a model's performance to be incorrectly estimated when making predictions on unseen data. In this paper, we explore the effect of data leakage observed during our BCI experiments for device control through the real-time classification of SSVEPs (Steady State Visually Evoked Potentials). We also study potential ways to ensure optimal validation of the classifiers during the calibration phase to avoid overfitting. The results show that the scaling step is crucial for some algorithms, and it should be applied after the resampling phase to avoid data leakage and improve results.
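
The scale-after-resampling principle reported above can be sketched generically in a few lines (numpy only; this is not the authors' SSVEP pipeline, and the synthetic data and split are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(5.0, 2.0, size=(100, 4))
X_train, X_test = X[:80], X[80:]

# Leaky: scaling statistics estimated on ALL data, test rows included,
# so information from the held-out set contaminates the training phase
mu_leaky, sd_leaky = X.mean(axis=0), X.std(axis=0)

# Correct: estimate the scaler on the training split only, then apply
# the SAME statistics to the held-out data
mu, sd = X_train.mean(axis=0), X_train.std(axis=0)
X_train_s = (X_train - mu) / sd
X_test_s = (X_test - mu) / sd
```

The leaky and train-only statistics differ, which is exactly why fitting any transformation on the full dataset before resampling biases the estimated performance on unseen data.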

Keywords: data leakage, data science, machine learning, SSVEP, BCI, overfitting

Procedia PDF Downloads 153
25245 Free Radical Scavenging Activity and Total Phenolic Assessment of Drug Repurposed Medicinal Plant Metabolites: Promising Tools against Post COVID-19 Syndromes and Non-Communicable Diseases in Botswana

Authors: D. Motlhanka, M. Mine, T. Bagaketse, T. Ngakane

Abstract:

There is a plethora of evidence from numerous sources that highlights the success of naturally derived medicinal plant metabolites with antioxidant capability as repurposed therapeutics. As post-COVID-19 syndromes and non-communicable diseases are on the rise, there is an urgent need for new therapeutic strategies to address the problem. Non-communicable diseases and post-COVID-19 syndromes are classified as socio-economic diseases and rank high among threats to health security because of the economic burden they pose to any government's budget commitments. Research has shown a strong link between the accumulation of free radicals and the oxidative stress critical to the pathogenesis of non-communicable diseases and COVID-19 syndromes. Botswana has embarked on a robust programme, derived from ethno-pharmacognosy and drug repurposing, to address these threats to health security. In the current approach, a number of medicinally active plant-derived polyphenolics are repurposed and combined into new medicinal tools to target diabetes, hypertension, prostate cancer, and oxidative-stress-induced post-COVID-19 syndromes such as “brain fog”. All four formulants demonstrated free radical scavenging capacities above 95% at 200 µg/ml using the diphenylpicrylhydrazyl (DPPH) free radical scavenging assay, and total phenolic contents between 6899 and 15000 GAE (g/L) using the Folin-Ciocalteu assay. These repurposed medicinal tools offer new hope and potential in the fight against emerging health threats driven by hyper-inflammation and free-radical-induced oxidative stress.
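
For context, DPPH scavenging percentages like those quoted above are conventionally derived from absorbance readings using the standard percent-inhibition formula; the absorbance values in the sketch below are hypothetical, not the authors' measurements.

```python
def scavenging_percent(a_control, a_sample):
    # standard DPPH formula: percent decrease in absorbance (read at ~517 nm)
    # a_control: absorbance of the DPPH solution without extract
    # a_sample: absorbance after reaction with the extract
    return (a_control - a_sample) / a_control * 100.0

# hypothetical readings giving a capacity above the 95% level reported above
activity = scavenging_percent(0.80, 0.04)  # 95.0
```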

Keywords: drug repurposed plant polyphenolics, free radical damage, non-communicable diseases, post COVID 19 syndromes

Procedia PDF Downloads 128
25244 Solving LWE by Progressive Pumps and Its Optimization

Authors: Leizhang Wang, Baocang Wang

Abstract:

General Sieve Kernel (G6K) is currently considered the fastest algorithm for the shortest vector problem (SVP) and is the record holder of the open SVP challenge. We study the lattice basis quality improvement effects of the Workout proposed in G6K, which is composed of a series of Pumps to solve SVP. Firstly, we use a low-dimensional Pump output basis to propose a predictor of the quality of high-dimensional Pump output bases. Both theoretical analysis and experimental tests are performed to illustrate that solving LWE problems with G6K's default SVP solving strategy (Workout) is more computationally expensive than with lattice reduction algorithms (e.g., BKZ 2.0, Progressive BKZ, Pump and Jump BKZ) that use sieving as their SVP oracle. Secondly, the default Workout in G6K is optimized to achieve a stronger reduction at a lower computational cost. Thirdly, we combine the optimized Workout and the Pump output basis quality predictor to further reduce the computational cost by optimizing the LWE instance selection strategy. In fact, we can solve the TU LWE challenge (n = 65, q = 4225, α = 0.005) 13.6 times faster than with the G6K default Workout. Fourthly, we consider a combined two-stage LWE solving strategy (preprocessing by BKZ followed by a big Pump). Both stages use the dimension-for-free technique to give new theoretical security estimations of several LWE-based cryptographic schemes. The security estimations show that the security of these schemes under the conservative NewHope core-SVP model is somewhat overestimated. In addition, in the case of the LAC scheme, the LWE instance selection strategy can be optimized to further improve LWE-solving efficiency, even by 15% and 57%. Finally, experiments are conducted to examine the effects of our strategies on Normal Form LWE problems, and the results demonstrate that the combined strategy is four times faster than that of NewHope.

Keywords: LWE, G6K, pump estimator, LWE instances selection strategy, dimension for free

Procedia PDF Downloads 60
25243 Artificial Intelligence Impact on the Australian Government Public Sector

Authors: Jessica Ho

Abstract:

AI has helped governments, businesses, and industries transform the way they operate. AI is used to automate tasks, improving decision-making and efficiency. AI is embedded in sensors and used in automation to save time and eliminate human error in repetitive tasks. Today, we see AI growing through the collection of vast amounts of data, used to forecast with greater accuracy, inform decision-making, adapt to changing market conditions, and offer more personalised service based on consumer habits and preferences. Governments around the world share the opportunity to leverage these disruptive technologies to improve productivity while reducing costs. In addition, these intelligent solutions can help streamline government processes to deliver more seamless and intuitive user experiences for employees and citizens. This is a critical challenge for the NSW Government, as we are unable to determine the risk brought by the unprecedented pace of adoption of AI solutions in government. Government agencies must ensure that their use of AI complies with relevant laws and regulatory requirements, including those related to data privacy and security. Furthermore, there will always be ethical concerns surrounding the use of AI, such as the potential for bias, intellectual property rights, and its impact on job security. Within NSW's public sector, agencies are already testing AI for crowd control, infrastructure management, fraud compliance, public safety, transport, and police surveillance. Citizens are also attracted to the ease of use and accessibility of AI solutions, which do not require specialised technical skills. This increased accessibility, however, must be balanced against higher risk and exposure to the health and safety of citizens.
On the other side, public agencies struggle to keep up with this pace while minimising risks, and the low entry cost and open-source nature of generative AI have led to a rapid, organic increase in the development of AI-powered apps, hence "There is an AI for That" in government. Other challenges include the fact that there appear to be no legislative provisions that expressly authorise the NSW Government to use AI to make decisions. On the global stage, there are too many actors in the regulatory space, and a sovereign response is needed to minimise multiplicity and regulatory burden. Therefore, traditional corporate risk and governance frameworks, together with regulatory and legislative frameworks, will need to be re-evaluated against AI's unique challenges, given its rapidly evolving nature, the ethical considerations it raises, and the heightened regulatory scrutiny affecting consumer safety and increasing risks for government. Creating an effective, efficient NSW Government governance regime, adapted to the range of different approaches to the application of AI, is not a mere matter of overcoming technical challenges. Technologies have a wide range of social effects on our surroundings and behaviours. There is compelling evidence that Australia's sustained social and economic advancement depends on AI's ability to spur economic growth, boost productivity, and address a wide range of societal and political issues. AI may also inflict significant damage; if such harm is not addressed, the public's confidence in this kind of innovation will be weakened. This paper suggests several AI regulatory approaches for consideration that are forward-looking and agile while simultaneously fostering innovation and human rights. The anticipated outcome is to ensure that the NSW Government matches the rising levels of innovation in AI technologies with appropriate and balanced innovation in AI governance.

Keywords: artificial intelligence, machine learning, rules, governance, government

Procedia PDF Downloads 70
25242 Smart Demand Response: A South African Pragmatic, Non-Destructive and Alternative Advanced Metering Infrastructure-Based Maximum Demand Reduction Methodology

Authors: Christo Nicholls

Abstract:

The National Electricity Grid (NEG) in South Africa has been under strain for the last five years. This overburdening of the NEG led Eskom (the state-owned entity responsible for the NEG) to implement a blunt methodology, called load shedding, to reduce the maximum demand (MD) on the NEG when required. The challenge with this methodology is that it not only leads to immense technical issues with distribution network equipment, e.g., transformers, due to the frequent abrupt switching off and on, but also has a broader negative fiscal impact on distributors, as their key consumers (commercial and industrial) are now grid-defecting due to the lack of Electricity Security Provision (ESP). This paper provides a pragmatic alternative methodology utilizing specific functionalities embedded within direct-connect single- and three-phase Advanced Metering Infrastructure (AMI) solutions deployed within the distribution network, in conjunction with a multi-agent-system-based AI implementation focused on automated negotiation peer-to-peer trading. The results of this research clearly illustrate that not only does this methodology provide a factual percentage contribution towards the NEG MD at the point of consideration, it also allows the distributor to leverage real-time MD data from key consumers to activate complex, yet impact-measurable, demand response (DR) programs.

Keywords: AI, AMI, demand response, multi-agent

Procedia PDF Downloads 112
25241 Nuclear Decay Data Evaluation for 217Po

Authors: S. S. Nafee, A. M. Al-Ramady, S. A. Shaheen

Abstract:

Evaluated nuclear decay data for the 217Po nuclide are presented in this work. These data include recommended values for the half-life T1/2, the α-, β⁻-, and γ-ray emission energies, and the emission probabilities. Decay data from 221Rn α-decay and 217Bi β⁻-decay are presented. Q(α) has been updated based on the recently published Atomic Mass Evaluation AME2012. In addition, the logft values were calculated using the Logft program from the ENSDF evaluation package, and the total internal conversion coefficients were calculated using the BrIcc program. Meanwhile, recommended values for the multipolarities have been assigned based on recent measurements, which yield a better intensity balance at the 254 keV and 264 keV gamma transitions.

Keywords: nuclear decay data evaluation, mass evaluation, total conversion coefficients, atomic mass evaluation

Procedia PDF Downloads 433
25240 Geographic Information System Using Google Fusion Table Technology for the Delivery of Disease Data Information

Authors: I. Nyoman Mahayasa Adiputra

Abstract:

Data in the field of health can be useful for the purposes of data analysis; one example of health data is disease data. Disease data are usually plotted geographically according to the area where they were collected, in this case the city of Denpasar, Bali. Disease data reports are still published in tabular form; disease information has not yet been mapped in GIS form. In this research, disease information in Denpasar city is digitized in the form of a geographic information system, with the district as the smallest administrative area. Denpasar City consists of four districts: North Denpasar, East Denpasar, West Denpasar, and South Denpasar. In this research, we use Google Fusion Table technology for the map digitization process; this technology simplifies the work of both the administrator and the information recipient. On the administrator side, disease data entry can be done easily and quickly. On the receiving end, the resulting GIS application can be published as a website-based application so that it can be accessed anywhere and at any time. In general, the results obtained in this study are divided into two parts. (1) Geolocation of Denpasar and all of its districts: the process of digitizing the map of Denpasar city produces a polygon geolocation for each district. These results can be reused in subsequent GIS studies that use the same administrative areas. (2) Dengue fever mapping for 2014 and 2015: the disease data used in this study are dengue fever case data from 2014 and 2015, taken from the Denpasar Health Department profile reports of 2015 and 2016. This mapping can be useful for analyzing the spread of dengue hemorrhagic fever in the city of Denpasar.

Keywords: geographic information system, Google fusion table technology, delivery of disease data information, Denpasar city

Procedia PDF Downloads 129
25239 Inclusive Practices in Health Sciences: Equity Proofing Higher Education Programs

Authors: Mitzi S. Brammer

Abstract:

Given that the cultural make-up of programs of study in institutions of higher learning is becoming increasingly diverse, much has been written about cultural diversity from a university-level perspective. However, there are little data in the way of specific programs and how they address inclusive practices when teaching and working with marginalized populations. This research study aimed to discover baseline knowledge and attitudes of health sciences faculty, instructional staff, and students related to inclusive teaching/learning and interactions. Quantitative data were collected via an anonymous online survey (one designed for students and another designed for faculty/instructional staff) using a web-based program called Qualtrics. Quantitative data were analyzed amongst the faculty/instructional staff and students, respectively, using descriptive and comparative statistics (t-tests). Additionally, some participants voluntarily engaged in a focus group discussion in which qualitative data were collected around these same variables. Collecting qualitative data to triangulate the quantitative data added trustworthiness to the overall data. The research team analyzed collected data and compared identified categories and trends, comparing those data between faculty/staff and students, and reported results as well as implications for future study and professional practice.

Keywords: inclusion, higher education, pedagogy, equity, diversity

Procedia PDF Downloads 67
25238 An Analysis of Sequential Pattern Mining on Databases Using Approximate Sequential Patterns

Authors: J. Suneetha, Vijayalaxmi

Abstract:

Sequential pattern mining involves applying data mining methods to large data repositories to extract usage patterns. Sequential pattern mining methodologies are used to analyze the data and identify patterns. The patterns have been used to implement efficient systems that can make recommendations based on previously observed patterns, make predictions, improve the usability of systems, detect events, and in general help in making strategic product decisions. In this paper, the performance of approximate sequential pattern mining is analyzed, where approximate sequential patterns are defined as patterns approximately shared by many sequences. Approximate sequential patterns can effectively summarize and represent the databases by identifying the underlying trends in the data. An extensive and systematic performance study was conducted over synthetic and real data. The results demonstrate that ApproxMAP is effective and scalable in mining large sequence databases with long patterns.

Keywords: multiple data, performance analysis, sequential pattern, sequence database scalability

Procedia PDF Downloads 342
25237 Medical Knowledge Management since the Integration of Heterogeneous Data until the Knowledge Exploitation in a Decision-Making System

Authors: Nadjat Zerf Boudjettou, Fahima Nader, Rachid Chalal

Abstract:

Knowledge management aims to acquire and represent knowledge relevant to a domain, a task, or a specific organization in order to facilitate its access, reuse, and evolution. This usually means building, maintaining, and evolving an explicit representation of knowledge. The next step is to provide access to that knowledge, that is to say, to disseminate it in order to enable effective use. Knowledge management in the medical field aims to improve the performance of the medical organization by allowing individuals in the care facility (doctors, nurses, paramedics, etc.) to capture, share, and apply collective knowledge in order to make optimal decisions in real time. In this paper, we propose a knowledge management approach based on a technique for integrating heterogeneous data in the medical field by creating a data warehouse, a technique for extracting knowledge from medical data by choosing a data mining technique, and finally a technique for exploiting that knowledge in a case-based reasoning system.
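
The case-based reasoning stage at the end of the pipeline above retrieves previously solved cases similar to a new one. A minimal nearest-neighbour retrieval sketch might look like the following; the case features, decisions, and similarity measure are all hypothetical, and a full CBR cycle would also adapt, revise, and retain cases.

```python
import numpy as np

# hypothetical case base: each case is (feature vector, recorded decision),
# e.g. temperature, systolic and diastolic pressure for past patients
case_base = [
    (np.array([37.0, 120.0, 80.0]), "treatment A"),
    (np.array([39.5, 140.0, 95.0]), "treatment B"),
    (np.array([36.8, 118.0, 78.0]), "treatment A"),
]

def retrieve(query, k=1):
    # rank stored cases by Euclidean distance of their features to the query
    ranked = sorted(case_base, key=lambda case: np.linalg.norm(case[0] - query))
    return [decision for _, decision in ranked[:k]]

# a new case close to the two 'treatment A' cases retrieves their decision
suggested = retrieve(np.array([37.1, 119.0, 79.0]))
```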

Keywords: data warehouse, data mining, knowledge discovery in database, KDD, medical knowledge management, Bayesian networks

Procedia PDF Downloads 395
25236 Mean Shift-Based Preprocessing Methodology for Improved 3D Buildings Reconstruction

Authors: Nikolaos Vassilas, Theocharis Tsenoglou, Djamchid Ghazanfarpour

Abstract:

In this work we explore the capability of the mean shift algorithm as a powerful preprocessing tool for improving the quality of spatial data, acquired from airborne scanners, from densely built urban areas. On one hand, high resolution image data corrupted by noise caused by lossy compression techniques are appropriately smoothed while at the same time preserving the optical edges and, on the other, low resolution LiDAR data in the form of normalized Digital Surface Map (nDSM) is upsampled through the joint mean shift algorithm. Experiments on both the edge-preserving smoothing and upsampling capabilities using synthetic RGB-z data show that the mean shift algorithm is superior to bilateral filtering as well as to other classical smoothing and upsampling algorithms. Application of the proposed methodology for 3D reconstruction of buildings of a pilot region of Athens, Greece results in a significant visual improvement of the 3D building block model.
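
The core mean shift iteration behind the smoothing and upsampling described above can be sketched with a Gaussian kernel on generic 2D point data; this is not the authors' joint RGB-z formulation, and the cluster layout and bandwidth are assumptions.

```python
import numpy as np

def mean_shift(points, x, bandwidth=1.0, iters=50):
    # repeatedly move x to the kernel-weighted mean of its neighbourhood;
    # x converges to a local mode of the point density (an "edge-preserving"
    # average, since distant points receive negligible weight)
    for _ in range(iters):
        d2 = np.sum((points - x) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * bandwidth ** 2))
        x = (w[:, None] * points).sum(axis=0) / w.sum()
    return x

rng = np.random.default_rng(3)
# two well-separated clusters; a start point in the first cluster should
# converge to that cluster's mode while ignoring the distant one
pts = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
                 rng.normal(10.0, 0.3, (50, 2))])
mode = mean_shift(pts, pts[0].copy(), bandwidth=1.0)
```

The same weighting idea, applied jointly over spatial, colour, and height (z) coordinates, is what lets mean shift smooth noise while preserving edges.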

Keywords: 3D buildings reconstruction, data fusion, data upsampling, mean shift

Procedia PDF Downloads 315
25235 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company

Authors: Rahma Al Balushi

Abstract:

Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one of the indications of achieving this vision is obtaining Asset Management ISO 55001 certification, which requires setting out documented Standard Operating Procedures (SOPs). Hence, a documented SOP for the Geographical Information System (GIS) data process has been established. Also, to effectively manage and improve OETC power transmission, asset data and information need to be governed by the Asset Information & GIS department. This paper describes in detail the GIS data submission process and the journey to develop the current process. The methodology used to develop the process is based on three main pillars: system and end-user requirements, risk evaluation, and data availability and accuracy. The output of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate, and up-to-date data. Furthermore, thanks to this process, GIS is ready to be integrated with other systems and to serve as the source of data for all OETC users. Some decisions related to issuing No Objection Certificates (NOC) and scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS) have consequently been made upon GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data system updates, data release/reporting, and data alterations also helped to reduce the missing attributes in GIS transmission data. A considerable difference in Geodatabase (GDB) completeness percentage was observed between the year 2017 and the year 2021. Overall, it is concluded that through governance, the Asset Information & GIS department can control the GIS data process and collect, properly record, and manage asset data and information within the OETC network. This control extends to other applications and systems integrated with or related to GIS systems.

Keywords: asset management ISO55001, standard procedures process, governance, geodatabase, NOC, CMMS

Procedia PDF Downloads 207
25234 The Feminism of Data Privacy and Protection in Africa

Authors: Olayinka Adeniyi, Melissa Omino

Abstract:

The field of data privacy and data protection in Africa is still evolving, with many African countries yet to enact legislation on the subject. While African governments bring their legislation up to speed in this field, the way patriarchy pervades every sector of African thought and manifests in society needs to be considered. Moreover, the laws enacted ought to be inclusive, especially towards women. This, in a nutshell, is the essence of data feminism. Data feminism is a new way of thinking about data science and data ethics that is informed by the ideas of intersectional feminism. Feminising data privacy and protection involves centring women in issues of data privacy and protection, particularly in legislation, as is the case in this paper. This line of thought is not new: international and regional human rights instruments specific to women came only long after the general human rights instruments, even though such provisions could have been included in the original general instruments in the first place. Since legislation on data privacy is only arriving in this century, with the rights and shortcomings of earlier instruments in plain view, the cue should be taken to ensure inclusive, holistic legislation for data privacy and protection from the outset. Data feminism is arguably an area that has been only scantily researched, albeit a needful one. With violence against women spiralling in the cyber world, compounded by COVID-19 and the necessary responses of governments, and the effect of these on women and their rights, research on the feminism of data privacy and protection in Africa becomes inevitable.
This paper seeks to answer the following questions: What is data feminism in the African context, and why is it important to data privacy and protection legislation? What laws, if any, exist on data privacy and protection in Africa, and are they women-inclusive; if not, why not? What measures are in place for the privacy and protection of women in Africa, and how can such protection be achieved? The paper aims to investigate data privacy and protection in Africa, the legal framework, and the protection or provision it makes for women, if any. It further aims to examine the importance and necessity of feminising data privacy and protection, the effect of its absence, the challenges or bottlenecks in attaining this feat, and the possibilities of securing data privacy and protection for African women. The paper also examines emerging practices in the data privacy and protection of women in other jurisdictions. It approaches the research through a review of papers and an analysis of laws and reports. It seeks to contribute to the existing literature in the field and is exploratory in its suggestions. It proposes draft clauses to make any data privacy and protection legislation women-inclusive, and would be useful for policymaking, academia, and public enlightenment.

Keywords: feminism, women, law, data, Africa

Procedia PDF Downloads 205
25233 Liberation as a Method for Monument Valorisation: The Case of the Defence Heritage Restoration

Authors: Donatella R. Fiorino, Marzia Loddo

Abstract:

The practice of freeing monuments from subsequent additions runs through the entire history of conservation and is traditionally connected to the aim of valorisation, both for cultural and educational purposes and, recently, even for touristic exploitation. Defence heritage has been widely affected by these cultural and technical trends, from philological restoration to critical innovations. A renewed critical analysis of Italian episodes, and in particular the Sardinian case of the San Pancrazio area in Cagliari, constitutes an important lesson about the limits of this practice and the uncertainty of its results, towards the definition of a sustainable good practice in the restoration of military architectures.

Keywords: defensive architecture, liberation, valorisation for tourism, historical restoration

Procedia PDF Downloads 342
25232 Evaluation of Practicality of On-Demand Bus Using Actual Taxi-Use Data through Exhaustive Simulations

Authors: Jun-ichi Ochiai, Itsuki Noda, Ryo Kanamori, Keiji Hirata, Hitoshi Matsubara, Hideyuki Nakashima

Abstract:

We conducted exhaustive simulations for data assimilation and the evaluation of service quality under various settings of a new shared transportation system called SAVS. Computational social simulation is a key technology for designing recent social services such as SAVS. One open issue in SAVS was determining the service scale through social simulation. Using our exhaustive simulation framework, OACIS, we performed data assimilation and evaluated the effects of SAVS based on actual taxi-use data from Tajimi city, Japan. Finally, we obtained the conditions under which the new service can be realized at a reasonable service quality.

Keywords: on-demand bus system, social simulation, data assimilation, exhaustive simulation

Procedia PDF Downloads 321
25231 Predicting Long-Term Meat Productivity for the Kingdom of Saudi Arabia

Authors: Ahsan Abdullah, Ahmed A. S. Bakshwain

Abstract:

Livestock is one of the fastest-growing sectors in agriculture and, if carefully managed, offers potential opportunities for economic growth, food sovereignty and food security. In this study we analyse and compare the long-term (year 2030) impact of climate variability on the predicted productivity of meat (beef, mutton and poultry) for the Kingdom of Saudi Arabia with respect to three factors: i) climatic-change vulnerability, ii) CO2 fertilization and iii) water scarcity, and compare the results with two countries of the region, Iraq and Yemen. The analysis uses data from diverse sources, which was extracted, transformed and integrated before usage. The collective impact of the three factors had an overall negative effect on the production of meat for all three countries, with the most adverse impact on Iraq. High similarity was found between CO2 fertilization (affecting animal fodder) and water scarcity, higher than that between the production of beef and mutton for the three countries considered. Overall, the three factors do not seem favorable for the three Middle East countries considered. This points to the possibility of a vegetarian year 2030 if dependency on the indigenous livestock population persists.

Keywords: prediction, animal-source foods, pastures, CO2 fertilization, climatic-change vulnerability, water scarcity

Procedia PDF Downloads 321
25230 Micro-Oculi Facades as a Sustainable Urban Facade

Authors: Ok-Kyun Im, Kyoung Hee Kim

Abstract:

We live in an era that faces the global challenges of climate change and resource depletion. With rapid urbanization and growing energy consumption in the built environment, building facades become ever more important in architectural practice and environmental stewardship. Furthermore, building facades undergo complex dynamics of social, cultural, environmental and technological change. Kinetic facades have drawn the attention of architects, designers, and engineers in the field of adaptable, responsive and interactive architecture since the 1980s. Materials and building technologies have gradually evolved to address the technical implications of kinetic facades. The kinetic façade is becoming an independent system of the building, transforming the design methodology towards sustainable building solutions. Accordingly, there is a need for a new design methodology to guide the design of a kinetic façade and evaluate its sustainable performance. The research objectives are two-fold: first, to establish a new design methodology for kinetic facades and, second, to develop a micro-oculi façade system and assess its performance using the established design method. The design approach to the micro-oculi façade comprises 1) façade geometry optimization and 2) dynamic building energy simulation. The façade geometry optimization utilizes a multi-objective optimization process, aiming to balance quantitative and qualitative performance to address the sustainability of the built environment. The dynamic building energy simulation was carried out using the EnergyPlus and Radiance simulation engines with scripted interfaces. The micro-oculi office was compared with an office tower with a glass façade in accordance with ASHRAE 90.1-2013 to understand its energy efficiency. The micro-oculi façade is constructed with an array of circular frames attached to a pair of micro-shades called a micro-oculus.
The micro-oculi are encapsulated between two glass panes to protect the kinetic mechanisms and ensure longevity. The micro-oculus incorporates rotating gears that transmit power to adjacent micro-oculi to minimize the number of mechanical parts. The micro-oculus rotates around its center axis with a step size of 15 degrees depending on the sun's position, while maximizing daylighting potential and view-outs. A 2 ft by 2 ft prototype was built to identify operational challenges and material implications of the micro-oculi façade. In this research, a systematic design methodology was proposed that integrates the multiple objectives of kinetic façade design criteria and whole-building energy performance simulation within a holistic design process. This design methodology is expected to encourage multidisciplinary collaboration between designers and engineers on issues of energy efficiency, daylighting performance and user experience during the design phases. The preliminary energy simulation indicated that, compared to a glass façade, the micro-oculi façade showed energy savings due to its improved thermal properties, daylighting attributes, and dynamic solar performance across the day and seasons. It is expected that the micro-oculi façade provides a cost-effective, environmentally friendly, sustainable, and aesthetically pleasing alternative to glass facades. Recommendations for future studies include lab testing to validate the simulated energy and optical properties of the micro-oculi façade. A 1:1 performance mock-up of the micro-oculi façade can provide an in-depth understanding of long-term operability and new development opportunities for urban façade applications.
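The 15-degree stepping described above amounts to quantizing a sun-tracking angle to the mechanism's step size. A hypothetical one-line sketch of that control idea (the paper's actual controller also weighs daylighting and view-out objectives, which are not modelled here):

```python
# Hypothetical sketch: snap a micro-oculus to the 15-degree rotation step
# nearest the current solar azimuth. Only the stepping logic is modelled.

STEP = 15  # degrees, the step size stated in the abstract

def oculus_angle(sun_azimuth_deg):
    """Quantize the tracking angle to the mechanism's 15-degree step size."""
    return (round(sun_azimuth_deg / STEP) * STEP) % 360

print(oculus_angle(97.0))  # 90
```

In practice such a discretization trades tracking accuracy for a simpler, more durable gear train, which matches the abstract's emphasis on minimizing mechanical parts.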

Keywords: energy efficiency, kinetic facades, sustainable architecture, urban facades

Procedia PDF Downloads 257
25229 Evaluation of Information Technology Governance Frameworks for Better Governance in South Africa

Authors: Memory Ranga, Phillip Pretorious

Abstract:

The South African Government has invested a lot of money in Information Technology Governance (ITG) within its government departments. The ITG framework was spearheaded by the Department of Public Service and Administration (DPSA). This led to the development of a governing ITG DPSA framework and, later, the Government Wide Enterprise Architecture (GWEA) Framework to assist the departments in implementing ITG. In addition, the government departments have adopted the Information Systems Audit and Control Association (ISACA) Control Objectives for Information and Related Technology (COBIT) for ITG processes. Despite all these available frameworks, departments fail to fully capitalise on and improve their ITG processes, mainly because the frameworks are too generic and difficult to apply to specific governance needs. Little research has been done to evaluate the progress of ITG initiatives within the government departments. This paper aims to evaluate the existing ITG frameworks within selected government departments in South Africa. A quantitative research approach was used in this study. Data was collected through an online questionnaire targeting ICT managers and directors from government departments. The study was undertaken as a case study, with only the Eastern Cape Province selected for the research. Document review, mainly of ITG frameworks and best practices, was also used. Data was analysed using Google Analytics tools and SPSS. A one-sample chi-squared test was used to verify the evaluation findings. Findings show evidence that the current guiding national governance framework (DPSA) is outdated and does not accommodate the changes in other governance frameworks. The Eastern Cape government departments have spent a huge amount of money on ITG but are not yet able to identify the benefits of the ITG initiatives.
The guiding framework is rigid and does not address some of the departmental needs, making it difficult to apply the DPSA framework flexibly. Furthermore, despite the large budget for ITG, the departments still face many challenges and are unable to improve some of their processes and services. All the engaged Eastern Cape departments have adopted the COBIT framework, but none has been conducting COBIT maturity assessments, which are a core functionality of COBIT. There is evidence of too many ITG frameworks and of underutilisation of these frameworks. The study provides a comprehensive evaluation of the ITG frameworks that have been adopted by the South African government departments in the Eastern Cape Province. The evaluation guides and recommends that the government departments rethink and adopt ITG frameworks that can be customised to their needs. The adoption and application of ITG by government departments should assist in better governance and service delivery to citizens.
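As a hedged illustration of how a one-sample chi-squared test can verify such survey findings, the snippet below tests made-up 5-point-scale response counts against a uniform "no opinion trend" expectation (the study's actual counts and categories are not given in the abstract):

```python
# Illustrative one-sample chi-squared goodness-of-fit test. The observed
# counts are invented: e.g. 50 respondents rating a statement such as
# "the DPSA framework is outdated" on a 5-point Likert scale.

observed = [4, 6, 8, 15, 17]  # strongly disagree ... strongly agree
expected = [sum(observed) / len(observed)] * len(observed)  # uniform: 10 each

# Chi-squared statistic: sum of (O - E)^2 / E over the categories.
chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Critical value from standard chi-squared tables: df = 5 - 1 = 4, alpha = 0.05.
CRITICAL_4DF_05 = 9.488

print(round(chi2, 2), chi2 > CRITICAL_4DF_05)  # 13.0 True
```

Here the statistic exceeds the critical value, so the uniform null hypothesis would be rejected: the responses lean significantly toward agreement rather than being evenly spread.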

Keywords: information technology governance, COBIT, evaluate, framework, governance, DPSA framework

Procedia PDF Downloads 123
25228 Ethnic Tourism and Real Estate Development: A Case of Yiren Ancient Town, China

Authors: Li Yang

Abstract:

Tourism is employed by many countries to facilitate socioeconomic development and to assist in heritage preservation. An "ethnic culture boom" is currently driving the tourism industry in China. Ethnic minorities, commonly portrayed as primitive, colorful and exotic, have become a big tourist draw. Many cultural attractions have been built throughout China to meet the demands of domestic tourists, and sacred cultural heritage sites have been rehabilitated as a major component of ethnic tourism. The purpose of this study is to examine the interconnected consequences of tourism development and tourism-related leisure property development, and to discuss, in a broader context, issues and considerations that are pertinent to the management and development of ethnic attractions. The role of real estate in tourism development and its sociocultural consequences are explored. Empirical research was conducted in Yiren Ancient Town (literally, "Ancient Town of the Yi People") in Chuxiong City, Yunnan Province, China. Multiple research methods, including in-depth interviews, informal discussions, on-site observations, and secondary data review, were employed to measure residents' and tourism decision-makers' perceptions of ethnic tourism and to explore the impacts of tourism on the local community. Key informants among government officials, tourism developers and local communities were interviewed individually to gather what they think about the benefits and costs of tourism, and what their concerns about and hopes for tourism development are. Yiren Ancient Town was constructed in the classical Yi architecture style, featuring tranquil garden scenery. Commercial streets, entertainment complexes, and accommodation facilities occupy the center of the town, creating culturally distinctive and visually stimulating places for tourists.
A variety of activities are presented to visitors, including walking tours of the town, staged dance shows, musical performances, ethnic festivals and ceremonies, minority food tasting and wedding shows. This study reveals that tourism real estate has transformed the town from a traditional neighborhood into diverse real estate landscapes. Ethnic architecture, costumes, festivals and folk culture have been represented, altered and reinvented through the tourist gaze and mechanisms of cultural production. Tourism is now a new economic driver of the community, providing opportunities for the creation of small businesses. There was a general appreciation in the community that tourism has created many employment opportunities, especially for self-employment. However, profit-seeking is a primary motivation for the government, developers, businesses, and other actors involved in the tourism development process. As the town has attracted an increasing number of visitors, commercialization and business competition in the town are intense. Many residents complained about elevated land prices, which have made the town and its surroundings comparatively high-value locales. The local community is also concerned about the decline of traditional ethnic culture and an erosion of the sense of identity and place. A balance between protection and development is difficult to maintain. The preservation of ethnic culture and heritage should be enhanced if long-term sustainable development of tourism is to occur and the loss of ethnic identities is to be avoided.

Keywords: ancient town, ethnic tourism, local community, real estate, China

Procedia PDF Downloads 279
25227 Optimal Pricing Based on Real Estate Demand Data

Authors: Vanessa Kummer, Maik Meusel

Abstract:

Real estate demand estimates are typically derived from transaction data. In regions with excess demand, however, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used instead. These data do, however, suffer from missing information: for example, many users do not specify how many rooms they would like or what price they would be willing to pay. Economic analyses often use only complete records. Usually, however, the proportion of complete data is rather small, which means most of the information is neglected. Also, the complete data might be strongly distorted. In addition, the reason that data is missing might itself contain information, which is ignored by that approach. An interesting question is, therefore, whether for economic analyses such as the one at hand there is added value in using the whole data set with imputed missing values compared to using the usually small percentage of complete data (the baseline), and how different algorithms affect that result. The imputation of the missing data is done using unsupervised learning. Out of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, and neural network techniques, are applied. By training the model iteratively on the imputed data and thereby including the information of all data in the model, the distortion of the first training set (the complete data) vanishes. In a next step, the performance of the algorithms is measured. This is done by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms and several parameter combinations, and comparing the estimates to the actual data.
After having found the optimal parameter set for each algorithm, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates based on the imputed data sets do not differ significantly from each other; the demand estimate derived from the baseline data, however, does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Demand estimates derived from the whole data set are also much more accurate than the baseline estimation. Thus, in order to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.
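The final step, linking asking price to selling probability via a survival function, can be sketched with an empirical estimate. The willingness-to-pay sample below is invented for illustration (the paper fits parametric price distributions per property segment rather than using raw counts):

```python
# Hedged sketch: empirical survival function over a sample of stated
# willingness-to-pay values for one property segment (illustrative numbers,
# e.g. monthly rents in CHF). S(p) estimates the share of searchers willing
# to pay at least price p, i.e. a proxy for the selling probability.

wtp = [1400, 1500, 1500, 1600, 1700, 1800, 2000, 2200, 2500, 3000]

def survival(p, sample=wtp):
    """Estimated probability that a randomly drawn searcher accepts price p."""
    return sum(1 for w in sample if w >= p) / len(sample)

# Selling probability falls as the asking price rises.
for price in (1500, 2000, 2600):
    print(price, survival(price))
```

Comparing such curves computed from the baseline (complete-cases) sample and from the imputed full sample is exactly the check the abstract describes: if the baseline were representative, the two survival functions would coincide.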

Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning

Procedia PDF Downloads 285