Search results for: data space connector
25718 Time Series Regression with Meta-Clusters
Authors: Monika Chuchro
Abstract:
This paper presents a preliminary attempt to apply classification of time series using meta-clusters in order to improve the quality of regression models. Clustering was used to obtain subgroups of normally distributed time series data from inflow data of a wastewater treatment plant, which consisted of several groups differing in mean value. Two simple algorithms, k-means and EM, were chosen as clustering methods, and the Rand index was used to measure similarity. After this simple meta-clustering, a regression model was built for each subgroup; the final model was the sum of the subgroup models. The quality of the obtained model was compared with that of a regression model built on the same explanatory variables but without clustering of the data. Results were compared using the coefficient of determination (R2), the mean absolute percentage error (MAPE) as a measure of prediction accuracy, and comparison on a linear chart. Preliminary results suggest the potential of the presented technique.
Keywords: clustering, data analysis, data mining, predictive models
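The pipeline the abstract describes (cluster the series into subgroups, fit a model per subgroup, score with MAPE) can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the authors' code; for brevity the per-subgroup "model" is reduced to the subgroup mean, whereas the paper fits a full regression model per subgroup.

```python
import random
import statistics

def kmeans_1d(values, k, iters=50):
    """Plain 1-D k-means with deterministic spread-out initialization."""
    svals = sorted(values)
    centers = [svals[i * (len(svals) - 1) // (k - 1)] for i in range(k)]
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: abs(v - centers[j])) for v in values]
        for j in range(k):
            members = [v for v, lab in zip(values, labels) if lab == j]
            if members:
                centers[j] = statistics.fmean(members)
    return labels

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * statistics.fmean(abs((a - p) / a) for a, p in zip(actual, predicted))

# Synthetic "inflow" series: two regimes differing in mean value.
rng = random.Random(1)
series = [100 + rng.gauss(0, 2) for _ in range(50)] + [200 + rng.gauss(0, 2) for _ in range(50)]

labels = kmeans_1d(series, k=2)

# Per-subgroup model: just the subgroup mean here; the paper instead fits a
# regression model per subgroup and sums the subgroup models.
centers = {j: statistics.fmean(v for v, lab in zip(series, labels) if lab == j)
           for j in set(labels)}
predictions = [centers[lab] for lab in labels]
print(f"MAPE: {mape(series, predictions):.2f}%")
```

Once the two mean-level regimes are separated, the per-subgroup predictions track the series far better than a single pooled mean would, which is the effect the abstract reports.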
Procedia PDF Downloads 469
25717 Python Implementation for S1000D Applicability Depended Processing Model - SALERNO
Authors: Theresia El Khoury, Georges Badr, Amir Hajjam El Hassani, Stéphane N’Guyen Van Ky
Abstract:
The widespread adoption of machine learning and artificial intelligence across different domains can be attributed to the digitization of data over several decades, resulting in vast amounts of data, types, and structures. Thus, data processing and preparation turn out to be a crucial stage. However, applying these techniques to S1000D standard-based data poses a challenge due to its complexity and the need to preserve logical information. This paper describes SALERNO, an S1000D AppLicability dEpended pRocessiNg mOdel. This Python-based model analyzes and converts XML S1000D-based files into an easier data format that can be used in machine learning techniques while preserving the different logic and relationships in the files. The model parses the files in a given folder, filters them, and extracts the required information to be saved in appropriate data frames and Excel sheets. Its main idea is to group the extracted information by applicability. In addition, it extracts the full text by replacing internal and external references while maintaining the relationships between files, as well as the necessary requirements. The resulting files can then be saved in databases and used in different models. Documents in both English and French were tested, and special characters were decoded. Updates to the technical manuals were taken into consideration as well. The model was tested on different versions of S1000D, and the results demonstrated its ability to effectively handle applicability, requirements, references, and relationships across all files and at different levels.
Keywords: aeronautics, big data, data processing, machine learning, S1000D
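As a rough illustration of the grouping-by-applicability idea, the sketch below parses a tiny XML fragment and collects paragraph text by an applicability attribute. The tag and attribute names are simplified placeholders, not the real S1000D schema, and this is not the SALERNO code.

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# Simplified stand-in for an S1000D data module; real S1000D uses a much
# richer schema, so the tag names here are illustrative placeholders only.
SAMPLE = """
<dataModule>
  <para applic="MODEL-A">Check hydraulic pressure.</para>
  <para applic="MODEL-B">Check electric actuator.</para>
  <para applic="MODEL-A">Replace filter element.</para>
</dataModule>
"""

def group_by_applicability(xml_text):
    """Extract paragraph text and group it by its applicability annotation."""
    root = ET.fromstring(xml_text)
    groups = defaultdict(list)
    for para in root.iter("para"):
        groups[para.get("applic", "ALL")].append(para.text.strip())
    return dict(groups)

groups = group_by_applicability(SAMPLE)
print(groups["MODEL-A"])  # ['Check hydraulic pressure.', 'Replace filter element.']
```

Each applicability group could then be written to its own data frame or Excel sheet, mirroring the grouping step the abstract describes.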
Procedia PDF Downloads 165
25716 Life Prediction Method of Lithium-Ion Battery Based on Grey Support Vector Machines
Authors: Xiaogang Li, Jieqiong Miao
Abstract:
To address the low prediction accuracy of the grey forecasting model, an improved grey prediction model is put forward. First, a trigonometric function is used to transform the original data sequence in order to improve its smoothness; this model is called SGM (smoothed grey prediction model). The improved grey model is then combined with a support vector machine to give the grey support vector machine model (SGM-SVM). Before establishing the model, the data are preprocessed with trigonometric functions and the accumulated generating operation to enhance smoothness and weaken randomness; a support vector machine (SVM) prediction model is then built on the preprocessed data, with model parameters selected by a genetic algorithm to obtain a global optimum. Finally, the forecast data are restored through the "regressive generation" operation. To show that the SGM-SVM model is superior to other models, battery life data from CALCE were selected. The presented model was used to predict battery life, and the prediction was compared with those of the grey model and of support vector machines. For a more intuitive comparison, this paper presents the root mean square error of the three models. The results show that the grey support vector machine (SGM-SVM) predicts life best, with a root mean square error of only 3.18%.
Keywords: grey prediction model, trigonometric functions, support vector machines, genetic algorithms, root mean square error
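The pre- and post-processing steps around the SVM (trigonometric smoothing, the accumulated generating operation, and the final "regressive generation") can be sketched as an invertible round trip. The exact trigonometric mapping the paper uses is not given in the abstract, so the sine-based mapping below is an assumption, and the SVM training step is omitted.

```python
import math

def ago(xs):
    """Accumulated generating operation (1-AGO): running sums."""
    out, s = [], 0.0
    for x in xs:
        s += x
        out.append(s)
    return out

def iago(ys):
    """Inverse AGO ("regressive generation"): first differences."""
    return [ys[0]] + [b - a for a, b in zip(ys, ys[1:])]

def trig_smooth(xs, lo=0.1, hi=1.4):
    """Map data into (0, pi/2) and apply sin(); this particular mapping is
    an assumed example of the paper's trigonometric transform."""
    mn, mx = min(xs), max(xs)
    scaled = [lo + (hi - lo) * (x - mn) / (mx - mn) for x in xs]
    return [math.sin(s) for s in scaled], (mn, mx)

def trig_unsmooth(ys, bounds, lo=0.1, hi=1.4):
    """Invert trig_smooth to restore the original scale."""
    mn, mx = bounds
    return [mn + (mx - mn) * (math.asin(y) - lo) / (hi - lo) for y in ys]

data = [112.0, 118.0, 132.0, 129.0, 121.0, 135.0, 148.0, 148.0]
smoothed, bounds = trig_smooth(data)
cumulative = ago(smoothed)
# ... an SVM/SVR model would be trained on `cumulative` here ...
restored = trig_unsmooth(iago(cumulative), bounds)
print([round(v, 6) for v in restored])  # matches the original data
```

Because sin is monotonic on (0, pi/2), the smoothing step is exactly invertible, so forecasts made in the transformed domain can be restored to the original scale, which is the role of the "regressive generation" step.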
Procedia PDF Downloads 467
25715 Development of a Data Security Model Using Steganography
Authors: Terungwa Simon Yange, Agana Moses A.
Abstract:
This paper studied steganography and designed a simple steganographic tool for hiding information in image files, with a view to addressing security challenges by hiding data from unauthorized users to improve its security. The Structured Systems Analysis and Design Method (SSADM) was used in this work. The system was developed using the Java Development Kit (JDK) 1.7.0_10 with MySQL Server as its backend. The system was tested with some hypothetical health records, which proved the possibility of protecting data from unauthorized users by making it secret so that its existence cannot easily be recognized by fraudulent users. It further strengthens the confidentiality of patient records kept by medical practitioners in the health setting. In conclusion, this work produced a user-friendly steganography tool that is fast to install and easy to operate, ensuring privacy and secrecy of sensitive data. It also produced an exact copy of the original image and of the one carrying the secret message when the two were compared with each other.
Keywords: steganography, cryptography, encryption, decryption, secrecy
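The core hiding technique, writing message bits into the least significant bits of pixel data, can be sketched in a few lines. The published tool is Java-based with a MySQL backend; the fragment below is only a Python illustration of the LSB idea, operating on raw bytes rather than a real image file.

```python
def embed(cover, message):
    """Hide message bytes in the least significant bits of cover bytes."""
    payload = len(message).to_bytes(4, "big") + message  # length prefix
    bits = [(byte >> (7 - i)) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for message")
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit  # overwrite only the LSB
    return bytes(stego)

def extract(stego):
    """Recover the hidden message from the stego bytes."""
    def read_bytes(offset, n):
        out = bytearray()
        for b in range(n):
            value = 0
            for i in range(8):
                value = (value << 1) | (stego[offset + b * 8 + i] & 1)
            out.append(value)
        return bytes(out)
    length = int.from_bytes(read_bytes(0, 4), "big")
    return read_bytes(32, length)

cover = bytes(range(256)) * 4          # stand-in for raw image pixel data
secret = b"patient record #1024"
stego = embed(cover, secret)
print(extract(stego) == secret)  # True: message recovered bit-for-bit
```

Since only the least significant bit of each byte changes, the stego data is the same size as the cover and visually near-identical when the bytes are pixel values, which matches the abstract's observation that the two images appear as exact copies.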
Procedia PDF Downloads 268
25714 Analysis of Citation Rate and Data Reuse for Openly Accessible Biodiversity Datasets on Global Biodiversity Information Facility
Authors: Nushrat Khan, Mike Thelwall, Kayvan Kousha
Abstract:
Making research data openly accessible has been mandated by most funders over the last 5 years, as it promotes reproducibility in science and reduces duplication of effort to collect the same data. There is evidence that articles that publicly share research data have higher citation rates in the biological and social sciences. However, how and whether shared data is being reused is not always apparent, as such information is not easily accessible from the majority of research data repositories. This study aims to understand the practice of data citation and how data is being reused over the years, focusing on biodiversity since research data is frequently reused in this field. Metadata of 38,878 datasets, including citation counts, were collected through the Global Biodiversity Information Facility (GBIF) API for this purpose. GBIF was used as a data source since it provides citation counts for datasets, not a commonly available feature for most repositories. Analysis of dataset types, citation counts, and creation and update times of datasets suggests that citation rate varies for different types of datasets: occurrence datasets, which have more granular information, have higher citation rates than checklist and metadata-only datasets. Another finding is that biodiversity datasets on GBIF are frequently updated, which is unique to this field. The majority of datasets from the earliest year, 2007, were updated after 11 years, with no dataset left unupdated since creation. For each year between 2007 and 2017, we compared the correlations between update time and citation rate for four different types of datasets. While recent datasets do not show any correlations, 3 to 4 year old datasets show a weak correlation, where datasets that were updated more recently received more citations. The results suggest that it takes several years to accumulate citations for research datasets.
However, this investigation found that when the same datasets are searched on Google Scholar or Scopus, the number of citations is often not the same as on GBIF. Hence, a future aim is to further explore the citation count system adopted by GBIF to evaluate its reliability and whether it is applicable to other fields of study as well.
Keywords: data citation, data reuse, research data sharing, webometrics
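A minimal sketch of the kind of comparison described, average citation counts per dataset type, is shown below against an inline sample shaped like a response from the GBIF dataset API (https://api.gbif.org/v1/dataset). The `type` and `created` fields follow GBIF's dataset schema; the `citations` field name and all values are assumptions for illustration, not real GBIF data.

```python
import json
from collections import defaultdict
from statistics import fmean

# Illustrative response shaped like the GBIF dataset API. "type" and
# "created" are real GBIF dataset fields; the citation-count field name
# and every value here are assumed for the example.
SAMPLE = json.loads("""
{"results": [
  {"key": "d-1", "type": "OCCURRENCE", "created": "2007-04-01", "citations": 31},
  {"key": "d-2", "type": "CHECKLIST",  "created": "2012-06-15", "citations": 4},
  {"key": "d-3", "type": "OCCURRENCE", "created": "2010-02-20", "citations": 18},
  {"key": "d-4", "type": "METADATA",   "created": "2015-09-03", "citations": 1}
]}
""")

def mean_citations_by_type(response):
    """Average citation count per dataset type, as compared in the study."""
    by_type = defaultdict(list)
    for ds in response["results"]:
        by_type[ds["type"]].append(ds["citations"])
    return {t: fmean(v) for t, v in by_type.items()}

print(mean_citations_by_type(SAMPLE))
```

In this toy sample the occurrence datasets average higher citation counts than the checklist and metadata-only ones, mirroring the pattern the study reports at scale.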
Procedia PDF Downloads 185
25713 Significance of Transient Data and Its Applications in Turbine Generators
Authors: Chandra Gupt Porwal, Preeti C. Porwal
Abstract:
Transient data reveals much about a machine's condition that steady-state data cannot. New technologies make this information much more available for evaluating the mechanical integrity of a machine train. Recent surveys at various stations indicate that simplicity is preferred over completeness in machine audits throughout the power generation industry. This is most clearly shown by the number of rotating machinery predictive maintenance programs in which only steady-state vibration amplitude is trended while important transient vibration data is not even acquired. Efforts have been made to explain what transient data is, its importance, the types of plots used for its display, and its effective utilization for analysis. To demonstrate the value of measuring transient data and its practical application in rotating machinery for resolving complex and persistent issues with turbine generators, the authors present a few case studies from Indian power plants: rotor instabilities due to the shaft moving towards the bearing centre in a 100 MW LMZ unit in the Northern Capital Region (NCR); heavy misalignment, noticed especially after 2993 rpm and caused by loose coupling bolts, which prevented the machine from being synchronized for more than four months in a 250 MW KWU unit in the Western Region (WR); and heavy preload at the intermediate pressure turbine (IPT) bearing near the HP-IP coupling, caused by high points on the coupling faces, at a 500 MW KWU unit in the Northern Region (NR).
Keywords: transient data, steady-state data, intermediate pressure turbine, high points
Procedia PDF Downloads 74
25712 Geographic Information System for District Level Energy Performance Simulations
Authors: Avichal Malhotra, Jerome Frisch, Christoph van Treeck
Abstract:
The utilization of semantic, cadastral, and topological data from geographic information systems (GIS) has increased exponentially for building- and urban-scale energy performance simulations. Urban planners, simulation scientists, and researchers use virtual 3D city models for energy analysis, algorithms, and simulation tools. For dynamic energy simulations at the city and district level, this paper provides an overview of the available GIS data models and their levels of detail. Adhering to different norms and standards, these models also intend to describe building and construction industry data. For further investigation, CityGML data models are considered for simulations. Though geographical information modelling has many different implementations, virtual city data can also be extended for domain-specific applications. Highlighting the use of extended CityGML models for energy research, a brief introduction to the Energy Application Domain Extension (ADE) along with its significance is given. Subsequently, addressing specific simulation input data, a workflow using Modelica is presented, underlining the usage of GIS information and quantifying its significance for annual heating energy demand.
Keywords: CityGML, EnergyADE, energy performance simulation, GIS
Procedia PDF Downloads 175
25711 Synthesis, Structural and Vibrational Studies of a New Lacunar Apatite: LiPb2Ca2(PO4)3
Authors: A. Chari, A. El Bouari, B. Orayech, A. Faik, J. M. Igartua
Abstract:
Phosphate is a natural resource of great importance in Morocco. In order to exploit this wealth, the synthesis and study of new phosphate-based materials were carried out. The apatite structure presents many notable characteristics; one of the main ones is that it allows large and various substitutions for both cations and anions. Besides their biological importance in hard tissue (bone and teeth), apatites have been extensively studied for their potential use as fluorescent lamp phosphors or laser host materials. Apatites also have interesting possible fields of application, such as in medicine as bone-filling materials and coatings for dental implants, and in agrochemistry as artificial fertilizers. LiPb2Ca2(PO4)3 was synthesized by the solid-state method, and its crystal structure was investigated by Rietveld analysis using XRPD data. This material crystallizes with an anion-deficient lacunar apatite structure. LiPb2Ca2(PO4)3 is a hexagonal apatite at room temperature, adopting the space group P63/m (ITA No. 176). Rietveld refinements showed that the 4f site is shared by three cations, Ca, Pb, and Li, while the 6h site is occupied by the Pb and Li cations. The structure can be described as built up from PO4 tetrahedra and sixfold-coordinated cavities, which delimit hexagonal tunnels along the c-axis direction. These tunnels are linked by the cations occupying the 4f sites. Raman and infrared spectroscopy analyses were carried out; the observed frequencies were assigned and discussed on the basis of unit-cell group analysis and by comparison to other apatite-type materials.
Keywords: apatite, lacunar, crystal structure, Rietveld method, LiPb2Ca2(PO4)3, phase transition
Procedia PDF Downloads 406
25710 Performance Analysis of Air-Tunnel Heat Exchanger Integrated into Raft Foundation
Authors: Chien-Yeh Hsu, Yuan-Ching Chiang, Zi-Jie Chien, Sih-Li Chen
Abstract:
In this study, a field experiment and performance analysis of an air-tunnel heat exchanger integrated with the water-filled raft foundation of a residential building were performed. Conventional applications of air tunnels inevitably have a high initial cost or suffer from insufficient installation space. To improve the feasibility of air-tunnel heat exchangers in high-density housing, an integrated system consisting of air pipes immersed in the water-filled raft foundation is presented, taking advantage of the immense amount of water and relatively stable temperature in the raft foundation of a building. The foundation-integrated air tunnel was applied to a residential building located in Yilan, Taiwan, and its thermal performance was measured in the field experiment. The results indicated that the cooling potential of the integrated system was close to that of a soil-based EAHE at 2 m depth or deeper. An analytical model based on the thermal resistance method was validated against the measurement results and was used to carry out the dimensioning of the foundation-integrated air tunnel; discrepancies between calculated values and measured data were less than 2.7%. In addition, the return on investment with regard to thermal performance and economics was evaluated. Because installation of the air tunnel is scheduled during construction of the building foundation, the integrated system incurs less construction cost than a conventional earth-air tunnel.
Keywords: air tunnel, ground heat exchanger, raft foundation, residential building
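The thermal resistance method mentioned in the abstract can be sketched as a series of convective and conductive resistances feeding a constant-wall-temperature outlet model. All numerical values below are assumed example figures, not data from the study.

```python
import math

def series_ua(h_in, area_in, k_pipe, thickness, area_mean, h_out, area_out):
    """Overall conductance UA from convective and conductive resistances
    in series: air-side convection, pipe wall conduction, water-side
    convection."""
    r = 1.0 / (h_in * area_in) + thickness / (k_pipe * area_mean) + 1.0 / (h_out * area_out)
    return 1.0 / r

def outlet_temperature(t_in, t_water, ua, m_dot, cp=1005.0):
    """Air outlet temperature for a pipe immersed in water at t_water,
    treating the pipe as a heat exchanger with constant wall temperature:
    T_out = T_w + (T_in - T_w) * exp(-UA / (m_dot * cp))."""
    return t_water + (t_in - t_water) * math.exp(-ua / (m_dot * cp))

# Assumed example values (W/m2K, m2, W/mK, m, kg/s); not from the paper.
ua = series_ua(h_in=25.0, area_in=3.0, k_pipe=0.4, thickness=0.005,
               area_mean=3.1, h_out=300.0, area_out=3.2)
t_out = outlet_temperature(t_in=32.0, t_water=24.0, ua=ua, m_dot=0.05)
print(round(t_out, 2))  # air is cooled toward the raft-water temperature
```

Sweeping pipe length (and hence area and UA) in such a model is one simple way to perform the dimensioning step the abstract refers to.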
Procedia PDF Downloads 337
25709 Achieving High Renewable Energy Penetration in Western Australia Using Data Digitisation and Machine Learning
Authors: A. D. Tayal
Abstract:
The energy industry is undergoing significant disruption. This research outlines that, whilst challenging, this disruption is also an emerging opportunity for electricity utilities. One such opportunity is leveraging the developments in data analytics and machine learning. As the uptake of renewable energy technologies and complementary control systems increases, electricity grids will likely transform towards dense microgrids with high penetration of renewable generation sources, rich in network and customer data, and linked through intelligent, wireless communications. Data digitisation and analytics have already impacted numerous industries, and their influence on the energy sector is growing as computational capabilities increase to manage big data and as machines develop algorithms to solve the energy challenges of the future. The objective of this paper is to address how far the uptake of renewable technologies can go given the constraints of existing grid infrastructure, and to provide a qualitative assessment of how higher levels of renewable energy penetration can be facilitated by incorporating even broader technological advances in the fields of data analytics and machine learning. Western Australia is used as a contextualised case study, given its abundant and diverse renewable resources (solar, wind, biomass, and wave) and isolated networks, making a high penetration of renewables a feasible target for policy makers over coming decades.
Keywords: data, innovation, renewable, solar
Procedia PDF Downloads 371
25708 A New Paradigm to Make Cloud Computing Greener
Authors: Apurva Saxena, Sunita Gond
Abstract:
Demand for computation and large-scale data storage is increasing rapidly day by day. Cloud computing technology fulfills today's computational demand, but this leads to high power consumption in cloud data centers. Green IT initiatives try to reduce power consumption and its adverse environmental impacts. This paper also focuses on various green computing techniques, proposed models, and efficient ways to make the cloud greener.
Keywords: virtualization, cloud computing, green computing, data center
Procedia PDF Downloads 560
25707 Physiological Action of Anthraquinone-Containing Preparations
Authors: Dmitry Yu. Korulkin, Raissa A. Muzychkina, Evgenii N. Kojaev
Abstract:
This review presents generalized data on the biological activity of anthraquinone-containing plants and preparations based on them. Data from traditional medicine and the results of bioscreening and clinical studies of the preparations are analyzed.
Keywords: anthraquinones, physiologically active substances, phytopreparation, Ramon
Procedia PDF Downloads 379
25706 Tertiary Education Trust Fund Intervention Projects and Resource Utilization in Universities in South Western States, Nigeria
Authors: Oluwlola Felicia Kikelomo
Abstract:
This study examined the influence of Tertiary Education Trust Fund (TETF) intervention projects on resource utilization in universities in the South Western States of Nigeria. The study was a descriptive design of the correlational type. A purposive sampling technique was used to select six of the 14 beneficiary universities in the states. The instruments used to collect data were the TETF Intervention Projects Checklist (TETFIPC), Educational Facilities Checklist (EFC), and Resource Utilization Checklist (RUC). The research questions raised were answered using percentages and utilization rates, while the Pearson product-moment correlation statistic was used to test the hypotheses formulated to guide the study at the 0.05 level of significance. Findings of the study indicated that building construction had the highest TETF allocation (64.5%), while staff development opportunities had the least (1.1%) in the sampled universities. Significant and positive relationships existed between time and space utilization rates and student academic performance in the universities (r (1,800) = 0.63 and r (1,800) = 0.59, p ≤ 0.05, respectively). Based on these findings, it was recommended that there should be periodic evaluation of completed TETF projects and their utilization to ensure that TETF funds are properly used for the approved projects, and that TETF should improve the provision of educational facilities to universities for staff and student use by increasing the education tax from 2% to 4%, in collaboration with the World Bank and other funding agencies, as is practiced in other countries of the world such as Norway, Spain, and the United Kingdom.
Keywords: tertiary education trust fund, intervention, education, human development
Procedia PDF Downloads 386
25705 Worldbuilding as Critical Architectural Pedagogy
Authors: Jesse Rafeiro
Abstract:
This paper discusses worldbuilding as a pedagogical approach to the first-year architectural design studio. The studio ran for three consecutive terms between 2016-2018. Taking its departure from the fifty-five city narratives in Italo Calvino’s Invisible Cities, students collectively designed in a “nowhere” space where intersecting and diverging narratives could be played out. Along with Calvino, students navigated between three main exercises and their imposed limits to develop architectural insight at three scales simulating the considerations of architectural practice: detail, building, and city. The first exercise asked each student to design and model a ruin based on randomly assigned incongruent fragments. Each student was given one plan fragment and two section fragments from different Renaissance Treatises. The students were asked to translate these in alternating axonometric projection and model-making explorations. Although the fragments themselves were imposed, students were free to interpret how the drawings fit together by imagining new details and atypical placements. An undulating terrain model was introduced in the second exercise to ground the worldbuilding exercises. Here, students were required to negotiate with one another to design a city of ruins. Free to place their models anywhere on the site, the students were restricted by the negotiation of territories marked by other students and the requirement to provide thresholds, open spaces, and corridors. The third exercise introduced new life into the ruined city through a series of design interventions. Each student was assigned an atypical building program suggesting a place for an activity, human or nonhuman. The atypical nature of the programs challenged the triviality of functional planning through explorations in spatial narratives free from preconceived assumptions. 
By contesting, playing out, or dreaming responses to realities taught in other coursework, this third exercise actualized learnings that are too often self-contained in the silos of differing course agendas. As such, the studio fostered an initial worldbuilding space within which to sharpen sensibility and criticality for subsequent years of education.
Keywords: architectural pedagogy, critical pedagogy, Italo Calvino, worldbuilding
Procedia PDF Downloads 136
25704 Place-Making Theory behind Claremont Court
Authors: Sandra Costa-Santos, Nadia Bertolino, Stephen Hicks, Vanessa May, Camilla Lewis
Abstract:
This paper aims to elaborate the architectural theory on place-making that supported Claremont Court housing scheme (Edinburgh, United Kingdom). Claremont Court (1959-62) is a large post-war mixed development housing scheme designed by Basil Spence, which included ‘place-making’ as one of its founding principles. Although some stylistic readings of the housing scheme have been published, the theory on place-making that allegedly ruled the design has yet to be clarified. The architecture allows us to mark or make a place within space in order to dwell. Under the framework of contemporary philosophical theories of place, this paper aims to explore the relationship between place and dwelling through a cross-disciplinary reading of Claremont Court, with a view to develop an architectural theory on place-making. Since dwelling represents the way we are immersed in our world in an existential manner, this theme is not just relevant for architecture but also for philosophy and sociology. The research in this work is interpretive-historic in nature. It examines documentary evidence of the original architectural design, together with relevant literature in sociology, history, and architecture, through the lens of theories of place. First, the paper explores how the dwelling types originally included in Claremont Court supported ideas of dwelling or meanings of home. Then, it traces shared space and social ties in order to study the symbolic boundaries that allow the creation of a collective identity or sense of belonging. Finally, the relation between the housing scheme and the supporting theory is identified. The findings of this research reveal Scottish architect Basil Spence’s exploration of the meaning of home, as he changed his approach to the mass housing while acting as President of the Royal Incorporation of British Architects (1958-60). 
When the British Government was engaged in various ambitious building programmes, he sought to bring architecture into a wider socio-political debate as president of the RIBA, hence moving towards a more ambitious and innovative socio-architectural approach. Rather than trying to address the 'genius loci' with an architectural proposition, as has been stated, the research shows that the place-making theory behind the housing scheme was supported by notions of community based on shared space and dispositions. The design of the housing scheme was steered by a desire to foster social relations and collective identities, rather than by the idea of keeping the spirit of the place. This research is part of a cross-disciplinary project funded by the Arts and Humanities Research Council. The findings present Claremont Court as a signifier of Basil Spence's attempt to address the post-war political debate on housing in the United Kingdom. They highlight the architect's theoretical agenda and challenge current purely stylistic readings of Claremont Court, which fail to acknowledge its social relevance.
Keywords: architectural theory, dwelling, place-making, post-war housing
Procedia PDF Downloads 267
25703 Dental Ethics versus Malpractice, as Phenomenon with a Growing Trend
Authors: Saimir Heta, Kers Kapaj, Rialda Xhizdari, Ilma Robo
Abstract:
Dealing with emerging cases of dental malpractice justified by appeal to the clear rules of dental ethics is a phenomenon with an increasing trend in today's dental practice. Dentists should clearly understand where the limit of malpractice lies, with or without minimal or major consequences for the affected patient, and when harm can be justified as a complication of dental treatment under the rules of dental ethics in the dental office. Indeed, malpractice can occur in cases of lack of professionalism, but it can also come as a consequence of anatomical and physiological limitations on the implementation of the dental protocols predetermined and indicated for the patient in the treatment plan section of his personal record. This study is a review of the latest findings published in the literature on this problem. Keywords were combined so as to collect the relevant information from publication databases in this field, always from the point of view of the dentist rather than that of the lawyer or jurist. The findings included in this article show that the diversity of approaches towards the phenomenon depends on the legal basis of different countries. There is a lack of, or only a small number of, articles that touch on this topic, and these articles present a limited amount of data on it. Conclusions: Dental malpractice should not be hidden under the guise of various dental complications justified by the strict rules of ethics for patients treated in the dental chair. Individual experiences of dental malpractice should be published with the aim of serving as a source of experience for future generations of dentists.
Keywords: dental ethics, malpractice, professional protocol, random deviation
Procedia PDF Downloads 101
25702 Personal Data Protection: A Legal Framework for Health Law in Turkey
Authors: Veli Durmus, Mert Uydaci
Abstract:
Every patient who needs medical treatment must share health-related personal data with healthcare providers. Personal health data therefore plays an important role in making health decisions and identifying health threats during every encounter between a patient and caregivers. In other words, health data can be defined as private and sensitive information that is protected by various health laws and regulations. In many cases, the data are an outcome of the confidential relationship between patients and their healthcare providers. Globally, almost all nations have their own laws, regulations, or rules to protect personal data. There is a variety of instruments that allow authorities to use health data or to set barriers to data sharing across international borders. For instance, Directive 95/46/EC of the European Union (EU) (also known as the EU Data Protection Directive) establishes harmonized rules within European borders. In addition, the General Data Protection Regulation (GDPR) will set further common principles in 2018. Because of the close policy relationship with the EU, this study provides not only information on regulations and directives but also on how they play a role in the legislative process in Turkey. Even if the decision is controversial, the Board has recently stated that private and public healthcare institutions are responsible for the patient call system used by doctors to call people waiting outside a consultation room, in order to prevent unlawful processing of, and unlawful access to, personal data during treatment. In Turkey, the vast majority of private and public health organizations provide a service that uses personal data (i.e., the patient's name and ID number) to call the patient. According to the Board's decision, hospitals and other healthcare institutions are obliged to take all necessary administrative precautions and provide technical support to protect patient privacy.
However, this is not performed effectively and efficiently in most health services. For this reason, it is important to draw a legal framework for personal health data by stating the main purpose of this regulation and how to deal with complicated issues concerning personal health data in Turkey. The research is descriptive, covering data protection law for the healthcare setting in Turkey. Primary as well as secondary data have been used for the study. The primary data include information collected under current national and international regulations or law. Secondary data include publications, books, journals, and empirical legal studies. Consequently, privacy and data protection regimes in health law show there are obligations, principles, and procedures which shall be binding upon natural or legal persons who process health-related personal data. A comparative approach shows that there are significant differences among some EU member states due to different legal competencies, policies, and cultural factors. This study provides theoretical and practitioner implications by highlighting the need to illustrate the relationship between privacy and confidentiality in personal data protection in health law. Furthermore, this paper helps to define the legal framework for health law case studies on data protection and privacy.
Keywords: data protection, personal data, privacy, healthcare, health law
Procedia PDF Downloads 228
25701 Formalizing a Procedure for Generating Uncertain Resource Availability Assumptions Based on Real Time Logistic Data Capturing with Auto-ID Systems for Reactive Scheduling
Authors: Lars Laußat, Manfred Helmus, Kamil Szczesny, Markus König
Abstract:
As one result of the project “Reactive Construction Project Scheduling using Real Time Construction Logistic Data and Simulation”, a procedure for using data about uncertain resource availability assumptions in reactive scheduling processes has been developed. Prediction data about resource availability are generated in a formalized way using real-time monitoring data, e.g. from auto-ID systems on the construction site and in the supply chains. The paper focuses on the formalization of the procedure for monitoring construction logistic processes, for detecting disturbances, and for generating new and uncertain scheduling assumptions for the reactive resource-constrained simulation procedure that is described further in other papers.
Keywords: auto-ID, construction logistic, fuzzy, monitoring, RFID, scheduling
Procedia PDF Downloads 520
25700 Wavelet Based Advanced Encryption Standard Algorithm for Image Encryption
Authors: Ajish Sreedharan
Abstract:
With the fast evolution of digital data exchange, information security becomes increasingly important in data storage and transmission. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. As the encryption process is applied to the whole image in AES, it is difficult to improve efficiency. In this paper, wavelet decomposition is used to concentrate the main information of the image into the low-frequency part. Then, AES encryption is applied to the low-frequency part. The high-frequency parts are XORed with the encrypted low-frequency part, and a wavelet reconstruction is applied. Theoretical analysis and experimental results show that the proposed algorithm has high efficiency and satisfactory security, suited to image data transmission.
Keywords: discrete wavelet transforms, AES, dynamic SBox
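The decompose-encrypt-XOR pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a one-level 1D Haar transform on a single image row instead of a full 2D wavelet decomposition, and, since the standard library has no AES, a SHA-256-derived keystream XOR stands in for the AES step (labeled as such in the comments).

```python
import hashlib

def haar_1d(signal):
    """One-level 1D Haar transform: returns (approximation, detail)."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def inverse_haar_1d(approx, detail):
    """Exact inverse of haar_1d."""
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out

def keystream_xor(values, key):
    """Stand-in for the AES step (illustration only): XOR byte values
    with a SHA-256-derived keystream. Involutive, so applying it twice
    with the same key decrypts."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(values):
        stream += hashlib.sha256(stream).digest()
    return [int(v) ^ stream[i] for i, v in enumerate(values)]

pixels = [52, 55, 61, 66, 70, 61, 64, 73]  # one image row
approx, detail = haar_1d(pixels)
# "encrypt" only the low-frequency (approximation) coefficients
enc_low = keystream_xor([int(a) for a in approx], b"secret-key")
# high-frequency coefficients are XORed with the encrypted low part
enc_high = [(int(d) & 0xFF) ^ enc_low[i] for i, d in enumerate(detail)]
```

Because only the approximation band goes through the (expensive) cipher, the cost scales with a fraction of the image size, which is the efficiency argument the abstract makes.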
Procedia PDF Downloads 434
25699 Using Data from Foursquare Web Service to Represent the Commercial Activity of a City
Authors: Taras Agryzkov, Almudena Nolasco-Cirugeda, Jose L. Oliver, Leticia Serrano-Estrada, Leandro Tortosa, Jose F. Vicent
Abstract:
This paper aims to represent the commercial activity of a city taking the social network Foursquare as its data source. The city of Murcia is selected as a case study, and the location-based social network Foursquare is the main source of information. After reorganising the user-generated data extracted from Foursquare, it is possible to graphically display on a map the various city spaces and venues, especially those related to commercial, food and entertainment sector businesses. The obtained visualisation provides information about activity patterns in the city of Murcia according to people's interests and preferences and, moreover, interesting facts about certain characteristics of the town itself.
Keywords: social networks, spatial analysis, data visualization, geocomputation, Foursquare
Procedia PDF Downloads 431
25698 An Experimental Determination of the Limiting Factors Governing the Operation of High-Hydrogen Blends in Domestic Appliances Designed to Burn Natural Gas
Authors: Haiqin Zhou, Robin Irons
Abstract:
The introduction of hydrogen into local networks may, in many cases, require the initial operation of those systems on natural gas/hydrogen blends, either because of a lack of sufficient hydrogen to allow a 100% conversion or because existing infrastructure imposes limitations on the percentage of hydrogen that can be burned before the end-use technologies are replaced. In many systems, the largest number of end-use technologies are small-scale but numerous appliances used for domestic and industrial heating and cooking. In such a scenario, it is important to understand exactly how much hydrogen can be introduced into these appliances before their performance becomes unacceptable, and what imposes that limitation. This study explores a range of significantly higher hydrogen blends and a broad range of factors that might limit operability or environmental acceptability. We will present tests from a burner designed for space heating and optimized for natural gas, as blends of increasing hydrogen content (starting from 25%) were burned, and explore the range of parameters that might govern the acceptability of operation. These include gaseous emissions (particularly NOx and unburned carbon), temperature, flame length, stability and general operational acceptability. Results will show emissions, temperature, and flame length as a function of thermal load and percentage of hydrogen in the blend. The relevant application and regulation will ultimately determine the acceptability of these values, so it is important to understand the full operational envelope of the burners in question through the sort of extensive parametric testing we have carried out. The present dataset should represent a useful data source for designers interested in exploring appliance operability. In addition to this, we present data on two factors that may be absolutes in determining allowable hydrogen percentages. The first of these is flame blowback.
Our results show that, for our system, the threshold between acceptable and unacceptable performance lies between 60 and 65 mol% hydrogen. Another factor that may limit operation, and which would be important in domestic applications, is the acoustic performance of these burners. We will describe a range of operational conditions in which hydrogen blend burners produce a loud and invasive ‘screech’. It will be important for equipment designers and users to find ways to avoid or mitigate this if performance is to be deemed acceptable.
Keywords: blends, operational, domestic appliances, future system operation
Procedia PDF Downloads 34
25697 Application of Data Mining Techniques for Tourism Knowledge Discovery
Authors: Teklu Urgessa, Wookjae Maeng, Joong Seek Lee
Abstract:
Five implementations of three data mining classification techniques were experimented with for extracting important insights from tourism data. The aim was to find the best-performing algorithm among those compared for tourism knowledge discovery. The knowledge discovery process from data was used as a process model, and the 10-fold cross-validation method was used for testing. Various data preprocessing activities were performed to obtain the final dataset for model building. Classification models of the selected algorithms were built with different scenarios on the preprocessed dataset. The best-performing algorithm on the tourism dataset was Random Forest (76%) before applying information-gain-based attribute selection, and J48 (C4.5) (75%) after selecting the attributes most relevant to the class (target) attribute. In terms of model-building time, attribute selection improves the efficiency of all algorithms; the Artificial Neural Network (multilayer perceptron) showed the highest improvement (90%). The rules extracted from the decision tree model are presented; they reveal intricate, non-trivial knowledge and insights that would otherwise not be discovered by simple statistical analysis.
Keywords: classification algorithms, data mining, knowledge discovery, tourism
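The 10-fold cross-validation protocol used for testing can be sketched as follows. This is a generic illustration, not the authors' code: the fold-splitting logic is standard, and the `fit`/`predict` classifier interface is an assumed convention (any of the compared algorithms could be plugged in behind it).

```python
def k_fold_indices(n_samples, k=10):
    """Yield (train_idx, test_idx) pairs partitioning the samples into k folds."""
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    indices = list(range(n_samples))
    start = 0
    for size in fold_sizes:
        test_idx = indices[start:start + size]
        train_idx = indices[:start] + indices[start + size:]
        yield train_idx, test_idx
        start += size

def cross_validated_accuracy(classifier, X, y, k=10):
    """Mean accuracy over k folds for any fit/predict-style classifier."""
    scores = []
    for train_idx, test_idx in k_fold_indices(len(X), k):
        classifier.fit([X[i] for i in train_idx], [y[i] for i in train_idx])
        preds = classifier.predict([X[i] for i in test_idx])
        correct = sum(p == y[i] for p, i in zip(preds, test_idx))
        scores.append(correct / len(test_idx))
    return sum(scores) / len(scores)
```

Each sample appears in exactly one test fold, so the reported accuracy (e.g. the 76% for Random Forest) is an average over held-out predictions rather than a single train/test split.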
Procedia PDF Downloads 300
25696 Developing an Out-of-Distribution Generalization Model Selection Framework through Impurity and Randomness Measurements and a Bias Index
Authors: Todd Zhou, Mikhail Yurochkin
Abstract:
Out-of-distribution (OOD) detection is receiving increasing attention in the machine learning research community, boosted by recent technologies such as autonomous driving and image processing. This newly-burgeoning field has created a need for more effective and efficient out-of-distribution generalization methods. Without access to label information, deploying machine learning models to out-of-distribution domains becomes extremely challenging, since it is impossible to evaluate model performance on unseen domains. To tackle this difficulty, we designed a model selection pipeline algorithm and developed a model selection framework with different impurity and randomness measurements to evaluate and choose the best-performing models for out-of-distribution data. By exploring different randomness scores based on predicted probabilities, we adopted the out-of-distribution entropy and developed a custom-designed score, ”CombinedScore,” as the evaluation criterion. This score is created by adding labeled source information into the judging space of the uncertainty entropy score using the harmonic mean. Furthermore, prediction bias was explored through the equality-of-opportunity violation measurement. We also improved machine learning model performance through model calibration. The effectiveness of the framework with the proposed evaluation criteria was validated on the Folktables American Community Survey (ACS) datasets.
Keywords: model selection, domain generalization, model fairness, randomness measurements, bias index
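The two ingredients the abstract names, an entropy-based randomness score over predicted probabilities and a harmonic-mean combination with a labeled-source score, can be sketched as follows. The function names and the exact form of the combination are illustrative assumptions; the paper's actual "CombinedScore" may weight or transform the inputs differently.

```python
import math

def prediction_entropy(probs):
    """Shannon entropy of a predicted probability vector: high entropy means
    the model is uncertain, a common randomness signal for OOD inputs."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def combined_score(entropy_score, source_score):
    """Harmonic-mean combination of the uncertainty entropy score with a
    labeled-source score, in the spirit of the paper's 'CombinedScore'.
    Both inputs are assumed to be positive; the harmonic mean is dominated
    by the smaller of the two, so a model must do well on both criteria."""
    return 2 * entropy_score * source_score / (entropy_score + source_score)
```

For example, a uniform prediction over four classes has entropy ln 4 (maximal uncertainty), while a one-hot prediction has entropy 0; the harmonic mean then penalizes candidate models that score well on only one of the two signals.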
Procedia PDF Downloads 126
25695 Adaptation Mechanism and Planning Response to Resiliency Shrinking of Small Towns Based on Complex Adaptive System by Taking Wuhan as an Example
Abstract:
The rapid urbanization process, with big cities as its main body, leads to an unequal configuration of urban and rural areas in the aspects of land supply, industrial division of labor, service supply and space allocation, and induces shrinkage of service energy, industrial systems and population vitality in small towns. As an important spatial unit in the spectrum of urbanization that serves, connects and couples urban and rural areas, the shrinking phenomenon faced by small towns has an important influence on the healthy development of urbanization. Based on a census of small towns in the Wuhan metropolitan area, we have found that the shrinking of small towns is a passive contraction of elastic tension under the squeeze of cities. Once affected by external forces such as policy regulation, planning guidance, and population return, small towns will achieve expansion and growth. Based on the theory of complex adaptive systems, this paper comprehensively constructs a development index evaluation system for small towns covering the five aspects of population, economy, space, society and ecology, measures the shrinking level of small towns, further analyzes their shrinking characteristics, and identifies whether the shrinking is elastic or not. The paper then measures the resilience ability index of small-town contraction in the same five aspects. Finally, it proposes an adaptive mechanism of urban-rural interaction evolution under a fine division of labor to respond to the passive shrinking of small towns in Wuhan. On this basis, the paper puts forward planning response measures for small towns in the aspects of spatial layout, function orientation and service support, which can provide a reference for other regions.
Keywords: complex adaptive systems, resiliency shrinking, adaptation mechanism, planning response
Procedia PDF Downloads 130
25694 Data Integrity: Challenges in Health Information Systems in South Africa
Authors: T. Thulare, M. Herselman, A. Botha
Abstract:
Poor system use, including inappropriate design of health information systems, causes difficulties in communication with patients and increases the time healthcare professionals spend recording the necessary health information for medical records. System features like pop-up reminders, complex menus, and poor user interfaces can make medical records far more time-consuming than paper cards, as well as affect decision-making processes. Although errors associated with health information, and their real and likely effect on the quality of care and patient safety, have been documented for many years, more research is needed to measure the occurrence of these errors and determine their causes in order to implement solutions. Therefore, the purpose of this paper is to identify data integrity challenges in hospital information systems through a scoping review and, based on the results, provide recommendations on how to manage them. Only 34 papers were found suitable out of 297 publications initially identified in the field. The results indicated that human and computerized systems are the most common challenges associated with data integrity, and that factors such as policy, environment, health workforce, and lack of awareness contribute to these challenges; if measures are taken, however, the data integrity challenges can be managed.
Keywords: data integrity, data integrity challenges, hospital information systems, South Africa
Procedia PDF Downloads 190
25693 Detection of Keypoint in Press-Fit Curve Based on Convolutional Neural Network
Authors: Shoujia Fang, Guoqing Ding, Xin Chen
Abstract:
The quality of press-fit assembly is closely related to the reliability and safety of the product. This paper proposes a keypoint detection method based on a convolutional neural network to improve the accuracy of keypoint detection in press-fit curves, providing an auxiliary basis for judging the quality of press-fit assembly. The press-fit curve is a curve of press-fit force against displacement; both the force data and the distance data are time-series data. Therefore, a one-dimensional convolutional neural network is used to process the press-fit curve. After the obtained press-fit data are filtered, a multi-layer one-dimensional convolutional neural network performs automatic learning of press-fit curve features, which are then sent to a multi-layer perceptron that finally outputs the keypoint of the curve. We used data from press-fit assembly equipment in the actual production process to train the CNN model, and different data from the same equipment to evaluate detection performance. Compared with existing research results, detection performance was significantly improved. This method can provide a reliable basis for the judgment of press-fit quality.
Keywords: keypoint detection, curve feature, convolutional neural network, press-fit assembly
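The core operation of the abstract's approach, a 1D convolution sliding over the force-displacement series, can be illustrated with a minimal hand-rolled example. This is not the authors' multi-layer network: a single fixed first-difference kernel stands in for the learned filters, and the argmax of its response stands in for the multi-layer perceptron's keypoint output.

```python
def conv1d(series, kernel, stride=1):
    """Valid-mode 1D convolution (cross-correlation) over a time series."""
    k = len(kernel)
    return [sum(series[i + j] * kernel[j] for j in range(k))
            for i in range(0, len(series) - k + 1, stride)]

def relu(xs):
    """Standard rectified-linear activation applied elementwise."""
    return [max(0.0, x) for x in xs]

# A first-difference kernel responds strongly where the press-fit force
# changes abruptly, i.e. near candidate keypoints of the curve. In the
# real model such filters are learned from production data, not fixed.
force = [0.0, 0.1, 0.2, 0.3, 2.5, 5.0, 5.1, 5.2]  # toy force samples
response = relu(conv1d(force, [-1.0, 1.0]))
keypoint = max(range(len(response)), key=response.__getitem__)
```

Stacking several such convolution-plus-activation layers, with learned kernels, yields the feature maps that the paper's multi-layer perceptron consumes.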
Procedia PDF Downloads 236
25692 Employing a Knime-based and Open-source Tools to Identify AMI and VER Metabolites from UPLC-MS Data
Authors: Nouf Alourfi
Abstract:
This study examines the metabolism of amitriptyline (AMI) and verapamil (VER) using a KNIME-based method. KNIME is an open-source data-analytics platform whose improved workflow integrates a number of open-source metabolomics tools, such as CFMID and MetFrag, to provide standard data visualisations, predict candidate metabolites, assess them against experimental data, and produce reports on identified metabolites. The use of this workflow is demonstrated by employing three types of liver microsomes (human, rat, and guinea pig) to study the in vitro metabolism of the two drugs (AMI and VER). The workflow is used to create and process UPLC-MS (Orbitrap) data, and the formulas and structures of these drugs' metabolites can be assigned automatically. The key metabolic routes for amitriptyline are hydroxylation, N-dealkylation, N-oxidation, and conjugation, while N-demethylation, O-demethylation, N-dealkylation, and conjugation are the primary metabolic routes for verapamil. The identified metabolites are consistent with those published, confirming the solidity of the workflow technique and the usefulness of computational tools like KNIME in supporting the integration and interoperability of emerging novel software packages in the metabolomics area.
Keywords: KNIME, CFMID, MetFrag, data analysis, metabolomics
Procedia PDF Downloads 126
25691 GIS for Simulating Air Traffic by Applying Different Multi-radar Positioning Techniques
Authors: Amara Rafik, Bougherara Maamar, Belhadj Aissa Mostefa
Abstract:
Radar data is one of the many data sources used by ATM (Air Traffic Management) systems. These data come from air navigation radar antennas. The radars intercept signals emitted by the various aircraft crossing the controlled airspace, calculate the positions of these aircraft, and retransmit those positions to the Air Traffic Management system. For greater reliability, the radars are positioned so that their coverage areas overlap; an aircraft will therefore be detected by at least one of them. However, the position coordinates of the same aircraft sent by these different radars are not necessarily identical. Therefore, the ATM system must calculate a single position (the radar track), which is ultimately sent to the control position and displayed on the air traffic controller's monitor. There are several techniques for calculating the radar track. Furthermore, the geographical nature of the problem requires the use of a Geographic Information System (GIS), i.e. a geographical database on the one hand and geographical processing on the other. The objective of this work is to propose a GIS for traffic simulation which reconstructs the evolution of aircraft positions over time from a multi-source radar data set, applying these different techniques.
Keywords: ATM, GIS, radar data, air traffic simulation
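One of the simpler track-calculation techniques the abstract alludes to, inverse-variance weighting of the overlapping radars' position reports, can be sketched as follows. This is an illustrative assumption, not the paper's specific method; operational ATM trackers typically use more elaborate filters (e.g. Kalman-based tracking), and the (x, y, variance) plot format is a simplification.

```python
def fuse_plots(plots):
    """Combine position reports for one aircraft from several radars into a
    single track point by inverse-variance weighting.

    Each plot is an assumed (x, y, variance) tuple: the measured position
    plus an uncertainty estimate for that radar. More accurate radars
    (smaller variance) pull the fused track point toward their report.
    """
    weights = [1.0 / var for _, _, var in plots]
    total = sum(weights)
    x = sum(w * px for w, (px, _, _) in zip(weights, plots)) / total
    y = sum(w * py for w, (_, py, _) in zip(weights, plots)) / total
    return x, y
```

With two equally reliable radars the fused track point is simply the midpoint of their reports; when one radar is far more accurate, its report dominates, which matches the intuition behind overlapping coverage.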
Procedia PDF Downloads 92
25690 Integrating of Multi-Criteria Decision Making and Spatial Data Warehouse in Geographic Information System
Authors: Zohra Mekranfar, Ahmed Saidi, Abdellah Mebrek
Abstract:
This work aims to develop multi-criteria decision making (MCDM) and spatial data warehouse (SDW) methods, which will be integrated into a GIS according to a ‘GIS dominant’ approach, with the GIS operating tools used to operate the SDW. MCDM methods can provide many solutions to a set of problems with various and multiple criteria. When the problem is complex enough to involve a spatial dimension, it makes sense to combine the MCDM process with other approaches such as data mining and ascending analyses. We present in this paper an experiment showing a geo-decisional methodology for SDW construction. On-line analytical processing (OLAP) technology, which combines basic multidimensional analysis with the concepts of data mining, provides powerful tools to highlight inductions and information not obvious to traditional tools. However, these OLAP tools become more complex in the presence of the spatial dimension, and the integration of OLAP with a GIS is the future geographic and spatial information solution. GIS offers advanced functions for the acquisition, storage, analysis, and display of geographic information, but its effectiveness for complex spatial analysis is questionable due to its determinism and decisional rigor. A prerequisite for the implementation of any analysis or exploration of spatial data is the construction and structuring of a spatial data warehouse (SDW). This SDW must be easily usable by the GIS and by the tools offered by an OLAP system.
Keywords: data warehouse, GIS, MCDM, SOLAP
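A classical MCDM building block of the kind the abstract refers to is the weighted-sum model: normalize each criterion, weight it, and rank alternatives by total score. The sketch below is a generic illustration under that assumption, not the paper's specific geo-decisional method; in their setting, the alternatives would be spatial units drawn from the SDW.

```python
def normalize(column, benefit=True):
    """Min-max normalize one criterion column; invert it for cost criteria
    (where smaller raw values are better)."""
    lo, hi = min(column), max(column)
    if hi == lo:
        return [1.0] * len(column)
    scaled = [(v - lo) / (hi - lo) for v in column]
    return scaled if benefit else [1.0 - s for s in scaled]

def weighted_sum_rank(alternatives, weights, benefit_flags):
    """Score each alternative (a row of criterion values) with the
    weighted-sum MCDM model; return (indices ranked best-first, scores)."""
    n_criteria = len(weights)
    columns = [normalize([alt[c] for alt in alternatives], benefit_flags[c])
               for c in range(n_criteria)]
    scores = [sum(weights[c] * columns[c][i] for c in range(n_criteria))
              for i in range(len(alternatives))]
    ranking = sorted(range(len(alternatives)), key=lambda i: -scores[i])
    return ranking, scores
```

For example, ranking three candidate sites on (accessibility, cost) with equal weights marks accessibility as a benefit criterion and cost as a cost criterion, so the site with high accessibility and low cost comes out first.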
Procedia PDF Downloads 182
25689 Using Open Source Data and GIS Techniques to Overcome Data Deficiency and Accuracy Issues in the Construction and Validation of Transportation Network: Case of Kinshasa City
Authors: Christian Kapuku, Seung-Young Kho
Abstract:
An accurate representation of the transportation system serving the region is one of the important aspects of transportation modeling. Such representation often requires developing an abstract model of the system elements, which in turn requires a substantial amount of data, surveys and time. However, in some cases, such as in developing countries, data deficiencies and time and budget constraints do not always allow such an accurate representation, leaving room for assumptions that may negatively affect the quality of the analysis. With the emergence of open-source Internet data, especially in mapping technologies, as well as advances in Geographic Information Systems, opportunities to tackle these issues have arisen. Therefore, the objective of this paper is to demonstrate such an application through the practical case of developing the transportation network for the city of Kinshasa. GIS geo-referencing was used to construct the digitized map of Transportation Analysis Zones using available scanned images. Centroids were then dynamically placed at the center of activities using an activity density map. Next, the road network with its characteristics was built using OpenStreetMap data and other official road inventory data by intersecting their layers and cleaning up unnecessary links such as residential streets. The accuracy of the final network was then checked by comparing it with satellite images from Google and Bing. For validation, the final network was exported into Emme3 to check for potential network coding issues. Results show high accuracy between the built network and the satellite images, which can mostly be attributed to the use of open source data.
Keywords: geographic information system (GIS), network construction, transportation database, open source data
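The intersect-and-clean step the abstract describes can be sketched in miniature. The link-record format (from-node, to-node, road-class tuples) and function names are illustrative assumptions; in practice, this filtering happens on full GIS layers with geometry, not plain tuples.

```python
def clean_network(links, excluded_classes=("residential",)):
    """Drop link records whose road class is irrelevant for the model
    (e.g. residential streets), keeping the rest of the inventory intact.
    Links are assumed to be (from_node, to_node, road_class) tuples."""
    return [link for link in links if link[2] not in excluded_classes]

def intersect_layers(osm_links, inventory_links):
    """Keep only links present in both the OpenStreetMap layer and the
    official road inventory, matched by their node pair."""
    inventory_pairs = {(a, b) for a, b, _ in inventory_links}
    return [link for link in osm_links
            if (link[0], link[1]) in inventory_pairs]
```

The intersection retains only links corroborated by both sources, which is one way the combination of open-source and official data can compensate for errors in either layer alone.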
Procedia PDF Downloads 171