Search results for: data protection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26257

24967 Comparative Analysis of the Third Generation of Research Data for Evaluation of Solar Energy Potential

Authors: Claudineia Brazil, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Rafael Haag

Abstract:

Renewable energy sources depend on climatic variability, so adequate energy planning requires observations of the meteorological variables, preferably as long-period series. Despite the scientific and technological advances that meteorological measurement systems have undergone in recent decades, there is still a considerable lack of meteorological observations forming long-period series. Reanalysis is a data-assimilation system built on general atmospheric circulation models that combines data collected at surface stations, ocean buoys, satellites, and radiosondes, allowing the production of long-period series for a wide range of variables. The third generation of reanalysis data emerged in 2010; among its products is the Climate Forecast System Reanalysis (CFSR) developed by the National Centers for Environmental Prediction (NCEP), with a spatial resolution of 0.5° x 0.5°. To address this shortage of observations, the study evaluates the performance of solar radiation estimation from alternative databases, such as reanalysis and meteorological satellite data, which can satisfactorily compensate for the absence of solar radiation observations at the global and/or regional level. The analysis of the solar radiation data indicated that the CFSR reanalysis data performed well against the observed data, with a determination coefficient around 0.90. It is therefore concluded that these data can serve as an alternative source at locations with no stations or without long series of solar radiation observations, which is important for the evaluation of solar energy potential.
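The determination coefficient reported above (around 0.90) is the standard R² between the observed series and the reanalysis estimate; a minimal sketch of the computation (the series values below are illustrative, not the study's data):

```python
def r_squared(observed, estimated):
    """Coefficient of determination between an observed series and an estimate."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - e) ** 2 for o, e in zip(observed, estimated))  # residual sum of squares
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)              # total sum of squares
    return 1 - ss_res / ss_tot
```

An R² near 1 means the reanalysis series explains nearly all the variance of the station observations.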

Keywords: climate, reanalysis, renewable energy, solar radiation

Procedia PDF Downloads 198
24966 L. rhamnosus GG Lysate Can Inhibit Cytotoxic Effects of S. aureus on Keratinocytes in vitro

Authors: W. Mohammed Saeed, A. J. Mcbain, S. M. Cruickshank, C. A. O’Neill

Abstract:

In the gut, probiotics have been shown to protect epithelial cells from pathogenic bacteria through a number of mechanisms: 1) increasing epithelial barrier function, 2) modulation of the immune response, especially the innate immune response, and 3) inhibition of pathogen adherence and down-regulation of virulence factors. Since probiotics have positive impacts in the gut, their potential effects on other body tissues, such as the skin, have begun to be investigated. The purpose of this project is to characterize the potential of a probiotic bacterial lysate as a therapeutic agent for preventing or reducing S. aureus infection. Normal human primary keratinocytes (KCs) were exposed to S. aureus (10⁶/ml) in the presence or absence of L. rhamnosus GG lysate (extracted from 10⁸ cfu/ml). The viability of the KCs was measured after 24 hours using a trypan blue exclusion assay. When KCs were treated with S. aureus alone, only 25% of the KCs remained viable at 24 hours post-infection. However, in the presence of L. rhamnosus GG lysate, the viability of pathogen-infected KCs increased to 58% (p=0.008, n=3). Furthermore, when KCs were co-exposed, pre-exposed, or post-exposed to L. rhamnosus GG lysate, their viability increased to ≈60%; the lysate afforded equal protection under the different conditions. These data suggest that two separate mechanisms may be involved in the protective effects of L. rhamnosus GG: reduction of S. aureus growth and inhibition of pathogen adhesion. Interestingly, the lysate significantly reduced both the growth of S. aureus and the adhesion of S. aureus that remained viable following 24 hours of incubation. Therefore, a series of reversed-phase liquid chromatography (RP-LC) methods was adopted to partially purify the lysate, in combination with functional assays to elucidate which fractions contained the efficacious molecules.
In addition, mass spectrometry-based protein sequencing was used to identify putative proteins in the fractions. The data from the purification process demonstrated that L. rhamnosus GG lysate has the potential to protect keratinocytes from the toxic effects of the skin pathogen S. aureus. Three potential mechanisms were identified: inhibition of pathogen growth, competitive exclusion, and displacement of the pathogen from keratinocyte binding sites. 'Moonlighting' proteins were identified in the MS/MS data for the L. rhamnosus GG lysate, which could explain the ability of the lysate to competitively exclude and displace S. aureus from keratinocyte binding sites. Taken together, it can be speculated that L. rhamnosus GG lysate uses different mechanisms to protect keratinocytes from S. aureus toxicity. The present study indicates that proteinaceous substances are involved in the anti-adhesion activity, displacing the pathogen and reducing the severity of infection, and that moonlighting proteins may be involved in inhibiting pathogen adhesion.

Keywords: lysate, fractions, adhesion, L. rhamnosus GG, S. aureus toxicity

Procedia PDF Downloads 283
24965 Implementation of Inclusive Education in DepEd-Dasmarinas: Basis for Inclusion Program Framework

Authors: Manuela S. Tolentino, John G. Nepomuceno

Abstract:

The purpose of this investigation was to assess the implementation of inclusive education (IE) in 6 elementary and 5 secondary public schools in the City Schools Division of Dasmarinas. Participants in this study were 11 school heads, 73 teachers, 22 parents, and 22 students (regular and with special needs) who were selected using purposive sampling. A 30-item questionnaire was used to gather data on the extent of the implementation of IE in the division, while focus group discussion (FGD) was used to gather insights on what facilitates and hinders the implementation of the IE program. This study assessed the following variables: school culture and environment, inclusive education policy implementation, and curriculum design and practices. Data were analyzed using frequency count, mean, and ranking. Results revealed that participants had similar assessments of the extent of the implementation of IE. School heads rated school culture and environment highest in terms of implementation, while teachers and pupils chose curriculum design and practices. On the other hand, parents felt that inclusive education policies were implemented best. School culture and environment were given high ratings: participants perceived that the IE program in the division makes everyone feel welcome regardless of age, sex, social status, or physical, mental, and emotional state; that students with or without disability are equally valued; and that students help each other. However, some aspects of the IE program implementation were given low ratings, namely: partnership between staff, parents, and caregivers; the school's effort to minimize discriminatory practice; and stakeholders sharing the philosophy of inclusion. As regards education policy implementation, the highest-ranked indicators were the school's effort to admit students from the locality, especially students with special needs, and the implementation of the child protection and anti-bullying policies.
The results of the FGD revealed that both school heads and teachers showed a welcoming attitude toward accommodating students with special needs, which can be linked to the increasing enrolment of special needs education (SNE) learners in the division. However, limitations in teachers' knowledge of handling such learners, in facilities, and in collaboration among stakeholders hinder the implementation of the IE program. Based on the findings, an inclusion program framework was developed for program enhancement. This will be the basis for improving the program's efficiency, strengthening relationships between stakeholders, and formulating solutions.

Keywords: inclusion, inclusive education, framework, special education

Procedia PDF Downloads 161
24964 Spatial Data Mining: Unsupervised Classification of Geographic Data

Authors: Chahrazed Zouaoui

Abstract:

In recent years, the volume of geospatial information has been increasing due to the evolution of communication and information technologies; this information is often presented through geographic information systems (GIS) and stored in spatial databases (BDS). Classical data mining has revealed a weakness in knowledge extraction from such enormous amounts of data, due to the particularity of spatial entities, which are characterized by interdependence between them (the first law of geography). This gave rise to spatial data mining. Spatial data mining is a process of analyzing geographic data that allows the extraction of knowledge and spatial relationships from geospatial data; among the methods of this process, we distinguish the monothematic and the thematic. Geo-clustering, one of the main tasks of spatial data mining, belongs to the monothematic methods. It groups similar geo-spatial entities in the same class and assigns more dissimilar entities to different classes; in other words, it maximizes intra-class similarity and minimizes inter-class similarity, taking into account the particularity of geo-spatial data. Two approaches to geo-clustering exist: dynamic processing of the data, which applies algorithms designed for the direct treatment of spatial data, and an approach based on pre-processing the spatial data, which applies classic clustering algorithms to pre-processed data (integrating the spatial relationships).
Since the pre-processing-based approach is quite complex in many cases, the search for approximate solutions involves approximation algorithms; among these, we are interested in dedicated approaches (partitioning and density-based clustering methods) and in the bees algorithm (a biomimetic approach). Our study proposes to address this problem by using different algorithms for automatically detecting geo-spatial neighborhoods in order to implement geo-clustering by pre-processing, and by applying the bees algorithm to this problem for the first time in the geo-spatial field.

Keywords: mining, GIS, geo-clustering, neighborhood

Procedia PDF Downloads 367
24963 Measurement of Viscosity and Moisture of Oil in Supradistribution Transformers Using Ultrasonic Waves

Authors: Ehsan Kadkhodaie, Shahin Parvar, Soroush Senemar, Mostafa Shriat, Abdolrasoul Malekpour

Abstract:

The role of oil in supradistribution transformers is critical, and several standards for determining oil quality have been established. To date, the moisture, viscosity, and insulating protection of the oil have been measured with mechanical and chemical methods and systems such as Karl Fischer titration, the falling-ball method, and the TDM 4000; most of these techniques are destructive and suffer from problems such as contamination. In this study, drawing on the properties of oil and the physical behavior of ultrasonic waves, a new method was designed for the determination of oil indicators, including viscosity and moisture. The results show that the oil viscosity can be found from the relationship μ = 42.086/√E and the moisture from the relationship (PLUS+) = −15.65 (PPM) + 26040.

Keywords: oil, viscosity, moisture, ultrasonic waves

Procedia PDF Downloads 570
24962 Analysis and Prediction of Netflix Viewing History Using Netflixlatte as an Enriched Real Data Pool

Authors: Amir Mabhout, Toktam Ghafarian, Amirhossein Farzin, Zahra Makki, Sajjad Alizadeh, Amirhossein Ghavi

Abstract:

The high number of Netflix subscribers makes it attractive for data scientists to extract valuable knowledge from analyses of viewers' behaviour. This paper presents a set of statistical insights into viewers' viewing history. A deep learning model is then used to predict the future watching behaviour of users based on their previous watching history within the Netflixlatte data pool. Netflixlatte is an aggregated and anonymized data pool of 320 Netflix viewers with around 250,000 data points recorded between 2008 and 2022. We observe insightful correlations between the distribution of viewing time and the COVID-19 pandemic outbreak. The presented deep learning model predicts future movie and TV series viewing habits with an average loss of 0.175.

Keywords: data analysis, deep learning, LSTM neural network, Netflix

Procedia PDF Downloads 227
24961 Analysis of User Data Usage Trends on Cellular and Wi-Fi Networks

Authors: Jayesh M. Patel, Bharat P. Modi

Abstract:

The availability of Wi-Fi on mobile devices has demonstrated that the total data demand from users is far higher than previously articulated by measurements based solely on a cellular-centric view of smartphone usage. The ratio of Wi-Fi to cellular traffic varies significantly between countries. This paper presents a comparison between users' cellular data usage and Wi-Fi data usage. This perspective helps operators understand the growing importance and application of yield management strategies designed to squeeze maximum returns from their investments in the networks and devices that enable the mobile data ecosystem. The transition from unlimited data plans towards tiered pricing and, in the future, towards more value-centric pricing offers significant revenue upside potential for mobile operators; but without a complete insight into all aspects of smartphone customer behavior, operators are unlikely to capture the maximum return from this billion-dollar market opportunity.

Keywords: cellular, Wi-Fi, mobile, smart phone

Procedia PDF Downloads 353
24960 Data-Driven Infrastructure Planning for Offshore Wind Farms

Authors: Isha Saxena, Behzad Kazemtabrizi, Matthias C. M. Troffaes, Christopher Crabtree

Abstract:

The calculations done at the beginning of the life of a wind farm are rarely reliable, which makes it important to study the failure and repair rates of the wind turbines under various conditions. The miscalculation arises because current models make the simplifying assumption that the failure/repair rate remains constant over time, meaning the reliability function is exponential in nature. This research aims to create a more accurate model using sensor data and a data-driven approach. Data cleaning and processing are done by comparing the power curve data of the wind turbines with SCADA data, which are then converted to time-to-repair and time-to-failure time-series data. Several mathematical functions are fitted to the time-to-failure and time-to-repair data of the wind turbine components using Maximum Likelihood Estimation and the posterior expectation method for Bayesian parameter estimation. Initial results indicate that the two-parameter Weibull function and the exponential function produce almost identical results. Further analysis is being done using complex system analysis, considering the failures of each electrical and mechanical component of the wind turbine. The aim of this project is to perform a more accurate reliability analysis that can help engineers schedule maintenance and repairs to decrease the downtime of the turbine.
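A minimal sketch of the two-parameter Weibull fit by Maximum Likelihood Estimation mentioned above, using the standard fixed-point iteration for the shape parameter (variable names and the synthetic times-to-failure in the usage are illustrative, not the study's SCADA data):

```python
import math

def weibull_mle(times, iters=200):
    """Two-parameter Weibull MLE: fixed-point iteration for the shape k,
    then the scale lam in closed form from the shape estimate."""
    logs = [math.log(t) for t in times]
    mean_log = sum(logs) / len(logs)
    k = 1.0  # starting guess; the iteration oscillates toward the MLE shape
    for _ in range(iters):
        tk = [t ** k for t in times]
        weighted = sum(x * l for x, l in zip(tk, logs)) / sum(tk)
        k = 1.0 / (weighted - mean_log)
    lam = (sum(t ** k for t in times) / len(times)) ** (1.0 / k)
    return k, lam
```

On data drawn from a Weibull distribution the returned shape and scale recover the generating parameters; a shape near 1 would indicate the constant-rate (exponential) special case the abstract criticizes.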

Keywords: reliability, Bayesian parameter inference, maximum likelihood estimation, Weibull function, SCADA data

Procedia PDF Downloads 69
24959 Empirical Acceleration Functions and Fuzzy Information

Authors: Muhammad Shafiq

Abstract:

In accelerated life testing approaches, lifetime data are obtained under conditions considered more severe than usual. Classical techniques are based on precise measurements and are used to model variation among the observations. In fact, there are two types of uncertainty in data: variation among the observations and fuzziness. Analysis techniques that do not consider fuzziness and are based only on precise lifetime observations lead to pseudo-results. This study aimed to examine the behavior of empirical acceleration functions using fuzzy lifetime data. The results showed increased fuzziness in the transformed lifetimes as compared to the input data.

Keywords: acceleration function, accelerated life testing, fuzzy number, non-precise data

Procedia PDF Downloads 287
24958 Evaluating Alternative Structures for Prefix Trees

Authors: Feras Hanandeh, Izzat Alsmadi, Muhammad M. Kwafha

Abstract:

Prefix trees, or tries, are data structures used to store data or indexes of data. The goal is to store and retrieve data by executing queries in a quick and reliable manner. In principle, the structure of the trie depends on having letters in nodes at the different levels that point to the actual words in the leaves. However, the exact structure of the trie may vary in several respects. In this paper, we evaluated different structures for building tries using datasets of words of different sizes. Results showed that some characteristics may significantly impact, positively or negatively, the size and the performance of the trie. Among the forms and structures investigated, using an array of pointers in each level to represent the different alphabet letters proved to be the best choice.
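The "array of pointers per level" structure the authors favour can be sketched as follows (a minimal lowercase-ASCII version; class and field names are illustrative):

```python
class TrieNode:
    __slots__ = ("children", "is_word")

    def __init__(self):
        self.children = [None] * 26  # one slot per lowercase letter
        self.is_word = False

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        node = self.root
        for ch in word:
            i = ord(ch) - ord("a")
            if node.children[i] is None:
                node.children[i] = TrieNode()
            node = node.children[i]
        node.is_word = True  # mark the end of a stored word

    def search(self, word):
        node = self.root
        for ch in word:
            node = node.children[ord(ch) - ord("a")]
            if node is None:
                return False
        return node.is_word
```

The fixed array gives O(1) child lookup per letter at the cost of 26 slots per node, which is exactly the size/speed trade-off the abstract evaluates.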

Keywords: data structures, indexing, tree structure, trie, information retrieval

Procedia PDF Downloads 445
24957 Data Management System for Environmental Remediation

Authors: Elizaveta Petelina, Anton Sizo

Abstract:

Environmental remediation projects deal with a wide spectrum of data, including data collected during site assessment, execution of remediation activities, and environmental monitoring. Therefore, an appropriate data management is required as a key factor for well-grounded decision making. The Environmental Data Management System (EDMS) was developed to address all necessary data management aspects, including efficient data handling and data interoperability, access to historical and current data, spatial and temporal analysis, 2D and 3D data visualization, mapping, and data sharing. The system focuses on support of well-grounded decision making in relation to required mitigation measures and assessment of remediation success. The EDMS is a combination of enterprise and desktop level data management and Geographic Information System (GIS) tools assembled to assist to environmental remediation, project planning, and evaluation, and environmental monitoring of mine sites. EDMS consists of seven main components: a Geodatabase that contains spatial database to store and query spatially distributed data; a GIS and Web GIS component that combines desktop and server-based GIS solutions; a Field Data Collection component that contains tools for field work; a Quality Assurance (QA)/Quality Control (QC) component that combines operational procedures for QA and measures for QC; Data Import and Export component that includes tools and templates to support project data flow; a Lab Data component that provides connection between EDMS and laboratory information management systems; and a Reporting component that includes server-based services for real-time report generation. The EDMS has been successfully implemented for the Project CLEANS (Clean-up of Abandoned Northern Mines). Project CLEANS is a multi-year, multimillion-dollar project aimed at assessing and reclaiming 37 uranium mine sites in northern Saskatchewan, Canada. 
The EDMS has effectively facilitated integrated decision-making for CLEANS project managers and transparency amongst stakeholders.

Keywords: data management, environmental remediation, geographic information system, GIS, decision making

Procedia PDF Downloads 144
24956 An Efficient Approach to Speed Up Non-Negative Matrix Factorization for High Dimensional Data

Authors: Bharat Singh Om Prakash Vyas

Abstract:

Nowadays, applications dealing with high-dimensional data are tremendously widespread in popular areas, and various approaches to handle such data have been developed by researchers in the last few decades. One problem with NMF approaches is that their random initialization cannot provide absolute (global) optimization in a limited number of iterations, only local optimization. For this reason, we have proposed a new approach that chooses the initial values of the decomposition to tackle the issue of computational expense. We have devised an algorithm for initializing the values of the decomposed matrices based on Particle Swarm Optimization (PSO). Through the experimental results, we show that the proposed method converges very fast in comparison to other low-rank approximation techniques such as simple multiplicative NMF and ACLS.
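For reference, the multiplicative-update NMF that the proposed method is compared against can be sketched as below; the PSO-based initialization itself is not reproduced here, so simple random initialization stands in, and all names are illustrative:

```python
import random

def matmul(A, B):
    """Plain-list matrix product."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def nmf(V, r, iters=500, seed=0):
    """Lee-Seung multiplicative updates: V (n x m) ~ W (n x r) @ H (r x m)."""
    rng = random.Random(seed)
    n, m = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(r)] for _ in range(n)]
    H = [[rng.random() + 0.1 for _ in range(m)] for _ in range(r)]
    eps = 1e-9  # guards against division by zero
    for _ in range(iters):
        WH = matmul(W, H)
        Wt = list(map(list, zip(*W)))
        num, den = matmul(Wt, V), matmul(Wt, WH)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)] for i in range(r)]
        WH = matmul(W, H)
        Ht = list(map(list, zip(*H)))
        num, den = matmul(V, Ht), matmul(WH, Ht)
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(r)] for i in range(n)]
    return W, H
```

A better initialization (the paper's PSO step) would simply replace the two `rng.random()` lines, which is why initialization alone can change the convergence speed so much.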

Keywords: ALS, NMF, high dimensional data, RMSE

Procedia PDF Downloads 333
24955 Integrating Time-Series and High-Spatial Remote Sensing Data Based on Multilevel Decision Fusion

Authors: Xudong Guan, Ainong Li, Gaohuan Liu, Chong Huang, Wei Zhao

Abstract:

Due to the low spatial resolution of MODIS data, the accuracy of small-area patch extraction in landscapes with a high degree of fragmentation is greatly limited. To this end, the study combines Landsat data, with higher spatial resolution, and MODIS data, with higher temporal resolution, for decision-level fusion. Considering the importance of the land heterogeneity factor in the fusion process, it is incorporated as a weighting factor that linearly weights the Landsat classification result and the MODIS classification result. Three levels were used to complete the data fusion process: the pixel level of MODIS data, the pixel level of Landsat data, and an object level that connects these two levels. The multilevel decision fusion scheme was tested at two sites in the lower Mekong basin. A comparison test proved that the classification accuracy was improved over the single-data-source classification results in terms of overall accuracy. The method was also compared with two-level combination results and a weighted-sum decision-rule-based approach. The decision fusion scheme is extensible to other multi-resolution data decision fusion applications.
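At its simplest, the linear weighting step described above combines the two classifiers' per-class scores with a single weight (a sketch; the actual heterogeneity-derived weighting factor is not reproduced, and the score vectors are illustrative):

```python
def fuse(p_landsat, p_modis, w):
    """Linearly weight two per-class score vectors; w in [0, 1] favours Landsat.
    Returns the index of the winning class."""
    fused = [w * a + (1 - w) * b for a, b in zip(p_landsat, p_modis)]
    return max(range(len(fused)), key=fused.__getitem__)
```

Raising `w` where the landscape is heterogeneous lets the higher-spatial-resolution Landsat result dominate, which is the intent of the heterogeneity weighting.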

Keywords: image classification, decision fusion, multi-temporal, remote sensing

Procedia PDF Downloads 114
24954 Metagenomics Analysis of Bacteria in Sorghum Using Next-Generation Sequencing

Authors: Kedibone Masenya, Memory Tekere, Jasper Rees

Abstract:

Sorghum is an important cereal crop in the world. In particular, it has attracted breeders due to its capacity to serve as a food, feed, fiber, and bioenergy crop. Like any other plant, sorghum hosts a variety of microbes, which can have a neutral, negative, or positive influence on the plant. In the current study, the V3/V4 regions of the 16S rRNA gene were targeted to extensively assess bacterial multitrophic interactions in the phyllosphere of sorghum. The results demonstrated that the presence of a pathogen has a significant effect on the endophytic bacterial community. Understanding these interactions is key to developing new strategies for plant protection.

Keywords: bacteria, multitrophic, sorghum, target sequencing

Procedia PDF Downloads 270
24953 Analysis of Cooperative Learning Behavior Based on the Data of Students' Movement

Authors: Wang Lin, Li Zhiqiang

Abstract:

The purpose of this paper is to analyze cooperative learning behavior patterns based on data on students' movement. The study first reviews cooperative learning theory and its research status and briefly introduces the k-means clustering algorithm. It then uses the clustering algorithm and mathematical statistics to analyze the activity rhythms of individual students and groups in different functional areas, based on the movement data provided by 10 first-year graduate students. It also focuses on analyzing students' behavior in the learning area and explores the law of cooperative learning behavior. The results show that the cooperative learning behavior analysis method based on movement data proposed in this paper is feasible: from the data analysis, the behavioral characteristics of students and their cooperative learning behavior patterns could be identified.
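The k-means step applied to the movement records reduces to Lloyd's algorithm; a pure-Python sketch (the coordinates in the usage are illustrative, not the study's movement data):

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Lloyd's algorithm on coordinate tuples: assign to nearest center,
    then move each center to its cluster mean."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        centers = [tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers, clusters
```

Clustering the per-student positions this way groups records by functional area, from which activity rhythms can then be read off.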

Keywords: behavior pattern, cooperative learning, data analysis, k-means clustering algorithm

Procedia PDF Downloads 173
24952 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis

Authors: Meng Su

Abstract:

High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
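The core Diffusion Map construction described above (Gaussian affinities, row-normalization to a Markov matrix, then a nontrivial eigenvector as the embedding coordinate) can be sketched with power iteration; a one-coordinate toy version under illustrative parameter choices:

```python
import math

def diffusion_embedding(points, eps=1.0, steps=200):
    """One diffusion-map coordinate: the second right eigenvector of the
    row-normalized Gaussian affinity (Markov) matrix, found by power
    iteration with the trivial all-ones eigenvector projected out."""
    n = len(points)
    W = [[math.exp(-sum((a - b) ** 2 for a, b in zip(p, q)) / eps)
          for q in points] for p in points]
    P = [[w / sum(row) for w in row] for row in W]   # Markov (row-stochastic) matrix
    v = [(-1.0) ** i for i in range(n)]              # arbitrary non-constant start
    for _ in range(steps):
        v = [sum(P[i][j] * v[j] for j in range(n)) for i in range(n)]
        mean = sum(v) / n
        v = [x - mean for x in v]                    # remove the trivial component
        norm = math.sqrt(sum(x * x for x in v))
        v = [x / norm for x in v]
    return v
```

For data with two well-separated groups, the sign of this coordinate separates the groups, illustrating how the embedding preserves manifold structure before a DPM models the low-dimensional representation.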

Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis

Procedia PDF Downloads 90
24951 A Secure Cloud Storage Scheme Based on Accountable Key-Policy Attribute-Based Encryption without Key Escrow

Authors: Ming Lun Wang, Yan Wang, Ning Ruo Sun

Abstract:

With the development of cloud computing, more and more users are starting to use cloud storage services. However, several issues exist: 1) the cloud server steals the shared data, 2) sharers collude with the cloud server to steal the shared data, 3) the cloud server tampers with the shared data, and 4) sharers and the key generation center (KGC) conspire to steal the shared data. In this paper, we use the advanced encryption standard (AES), hash algorithms, and accountable key-policy attribute-based encryption without key escrow (WOKE-AKP-ABE) to build a secure cloud storage scheme. The data are encrypted to protect privacy, and hash algorithms prevent the cloud server from tampering with the data uploaded to the cloud. Analysis results show that this scheme can resist collusion attacks.
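The tamper-detection idea (a hash stored with the uploaded data, recomputed on retrieval) reduces to a few lines; SHA-256 stands in here for whichever hash algorithm the scheme actually adopts:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Digest stored alongside the (encrypted) blob at upload time."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, stored_digest: str) -> bool:
    """Recompute on download; any server-side modification changes the digest."""
    return fingerprint(data) == stored_digest
```

This only detects tampering; confidentiality and access control come from the AES and WOKE-AKP-ABE layers of the scheme.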

Keywords: cloud storage security, sharing storage, attributes, Hash algorithm

Procedia PDF Downloads 379
24950 Comparative Study of Impedance Parameters for 42CrMo4 Steel Nitrided and Exposed at Electrochemical Corrosion

Authors: M. H. Belahssen, S. Benramache

Abstract:

This paper presents the corrosion behavior of 42CrMo4 alloy steel nitrided by plasma. Different nitrided samples were tested. The corrosion behavior was evaluated by electrochemical impedance spectroscopy, and the tests were carried out in a 1M acid chloride solution. The best corrosion protection was observed for the nitrided samples. The aim of this work is to compare the equivalent circuits corresponding to simulated and experimental Nyquist curves and to select the one that gives the best impedance parameter results with the lowest error.

Keywords: plasma nitriding, steel, alloy 42CrMo4, electrochemistry, corrosion behavior

Procedia PDF Downloads 349
24949 Determination of Four Anions in the Ground Layer of Tomb Murals by Ion Chromatography

Authors: Liping Qiu, Xiaofeng Zhang

Abstract:

The ion chromatography method for the rapid determination of four anions (F⁻, Cl⁻, SO₄²⁻, NO₃⁻) in the ground layer of tomb murals was optimized. The L₉(3⁴) orthogonal test was used to determine the optimal parameters of sample pretreatment: accurately weigh 2.000 g of sample, add 10 mL of ultrapure water, and extract for 40 min at a shaking temperature of 40℃ and a shaking speed of 180 r·min⁻¹. The eluent was 25 mmol/L KOH solution, the analytical column was an Ion Pac® AS11-SH (250 mm × 4.0 mm), and the purified filtrate was measured with a conductivity detector. Under this method, the detection limit of each ion is 0.066~0.078 mg/kg, the relative standard deviation is 0.86%~2.44% (n=7), and the recovery rate is 94.6%~101.9%.
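The relative standard deviation quoted above (0.86%~2.44%, n=7) is the usual precision statistic for replicate determinations; as a quick sketch (the replicate values are illustrative):

```python
import statistics

def rsd_percent(replicates):
    """Relative standard deviation (%) = 100 x sample st. dev. / mean."""
    return 100 * statistics.stdev(replicates) / statistics.mean(replicates)
```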

Keywords: ion chromatography, tomb, anion (F⁻, Cl⁻, SO₄²⁻, NO₃⁻), environmental protection

Procedia PDF Downloads 89
24948 Development of Superhydrophobic Cotton Fabrics and Their Functional Properties

Authors: Muhammad Zaman Khan, Vijay Baheti, Jiri Militky

Abstract:

The present study focuses on the development of a multifunctional cotton fabric that retains good physiological comfort properties. The functional properties developed include superhydrophobicity (the lotus effect) and UV protection. For this, TiO₂ nanoparticles along with a fluorocarbon and an organic-inorganic binder were used to optimize the multifunctional properties. Deposition of TiO₂ nanoparticles with a water-repellent finish on cotton fabric was carried out using the pad-dry-cure method at fixed parameters. The morphology and elemental composition of the as-deposited particles were studied using SEM and EDS; the chemical composition of the nanoparticles was determined using energy-dispersive spectroscopy. The treated samples exhibited excellent water repellency and UV protection factor, and the study of the comfort properties showed that the fabric retained excellent physiological comfort. An optimized concentration of water-repellent chemical (50 g/l) was used in formulations with TiO₂ nanoparticles and the organic-inorganic binder. Four formulations were prepared according to the design of the experiment and applied to the cotton fabric by roller padding at room temperature (15–20°C). Surface morphology was investigated via SEM images, and EDS analysis was carried out to determine the composition and atomic percentage of elements. The water contact angle (WCA) of the cotton fabric increases with TiO₂ nanoparticle concentration and reaches its maximum value (157°) at a TiO₂ concentration of 20 g/l. The water sliding angle (WSA) decreases and reaches its minimum at the same TiO₂ concentration at which the WCA is highest. Samples treated with TiO₂ nanoparticle formulations exhibit excellent UPF and UV-A and UV-B blocking. There was no significant deterioration of air permeability, and the water vapor permeability was only slightly decreased (4%), which is acceptable.
It can be concluded that there is no significant change in either air or water vapor permeability after nanoparticle coating of the cotton fabric surface. The coating also has little effect on stiffness: the stiffness of coated samples was not increased significantly, so the comfort of the cotton fabric is not reduced. This functionalized cotton fabric thus exhibits good physiological comfort properties. The authors are also thankful for student grant competition 21312 provided at the Technical University of Liberec.

Keywords: comfort, functional, nanoparticles, UV protective

Procedia PDF Downloads 135
24947 The Study on Life of Valves Evaluation Based on Tests Data

Authors: Binjuan Xu, Qian Zhao, Ping Jiang, Bo Guo, Zhijun Cheng, Xiaoyue Wu

Abstract:

Astronautical valves are key units in the engine systems of astronautical products; their reliability influences the outcome of rocket or missile launches and can even lead to damage to staff and equipment on the ground. Moreover, failures in the engine system may affect the hitting accuracy and flight range of missiles, so high reliability is essential for astronautical products. Since the existing literature largely estimates valve reliability from the few available failure test data, this paper proposes a new method for reliability estimation. Using test data acquired from temperature, vibration, and action tests, each corresponding to a different failure mode, the reliability of each failure mode is estimated; the three kinds of tests are then regarded as three stages in the product's process, and their results are integrated to obtain the valve's overall reliability. A comparison of results obtained from test data and from simulated data illustrates how to obtain valve reliability from few failure data with distinct failure modes and shows that the results are effective and rational.
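The staged-integration idea can be sketched as follows, under the illustrative assumption of exponential lifetimes and a series combination of the three stages; the test data, stage names, and the `stage_reliability` helper are hypothetical and are not the authors' actual model.

```python
import math

def stage_reliability(failure_times, censored_times, mission_time):
    """Exponential-lifetime MLE of reliability at mission_time from censored
    life data: lambda_hat = number of failures / total time on test."""
    lam = len(failure_times) / (sum(failure_times) + sum(censored_times))
    return math.exp(-lam * mission_time)

# Hypothetical test data (hours) for the three stages; units that survived a
# test without failing are treated as right-censored at the test duration.
stages = {
    "temperature": ([1200.0], [1500.0, 1500.0, 1500.0]),
    "vibration":   ([900.0, 1100.0], [1500.0, 1500.0]),
    "action":      ([1300.0], [1500.0, 1500.0, 1500.0]),
}

mission_time = 100.0  # hours
per_stage = {name: stage_reliability(f, c, mission_time)
             for name, (f, c) in stages.items()}

# Treating the three tests as stages of a series process: the valve must
# survive every failure mode, so the overall reliability is the product.
overall = math.prod(per_stage.values())
```

The series-product step mirrors the paper's integration of the three test stages; a different dependence structure between failure modes would change this last line.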

Keywords: censored data, temperature tests, valves, vibration tests

Procedia PDF Downloads 330
24946 Development of Energy Benchmarks Using Mandatory Energy and Emissions Reporting Data: Ontario Post-Secondary Residences

Authors: C. Xavier Mendieta, J. J McArthur

Abstract:

Governments are playing an increasingly active role in reducing carbon emissions, and a key strategy has been the introduction of mandatory energy disclosure policies. These policies have resulted in a significant amount of publicly available data, providing researchers with a unique opportunity to develop location-specific energy and carbon emission benchmarks, which can then be used to develop building archetypes and to inform urban energy models. This study presents the development of such a benchmark using the public reporting data. Data from Ontario's Ministry of Energy for post-secondary educational institutions are used to develop a series of building-archetype dynamic building loads and energy benchmarks, filling a gap in the currently available building database. This paper presents the development of a benchmark for college and university residences within ASHRAE climate zone 6 areas in Ontario using the mandatory disclosure energy and greenhouse gas emissions data. The methodology includes data cleaning, statistical analysis, and benchmark development, and lessons learned from this investigation are presented and discussed to inform the development of future energy benchmarks from this larger data set. The key findings from this initial benchmarking study are: (1) careful data screening and outlier identification are essential to develop a valid dataset; (2) the key features used to model the data are building age, size, and occupancy schedules, and these can be used to estimate energy consumption; and (3) policy changes affecting primary energy generation significantly affected greenhouse gas emissions, and consideration of these factors was critical to evaluating the validity of the reported data.
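As an illustration of the screening step, here is a minimal sketch of interquartile-range (IQR) outlier removal applied to hypothetical reported energy use intensities; the field names, data values, and the choice of a simple mean benchmark are invented for this example, not taken from the study.

```python
import statistics

def iqr_filter(records, key):
    """Drop records whose value for `key` lies outside 1.5*IQR of the
    quartiles, a common first-pass outlier screen for reported energy data."""
    values = sorted(r[key] for r in records)
    q1, _, q3 = statistics.quantiles(values, n=4)
    lo, hi = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
    return [r for r in records if lo <= r[key] <= hi]

# Hypothetical reported energy use intensities (ekWh/m^2) for residences;
# the 500 entry mimics a data-entry error in a mandatory report.
residences = [{"name": f"res{i}", "eui": v}
              for i, v in enumerate([100, 105, 108, 110, 112, 115, 120, 500])]

clean = iqr_filter(residences, "eui")
benchmark = statistics.mean(r["eui"] for r in clean)  # simple archetype benchmark
```

In practice the benchmark would be modeled on building age, size, and occupancy schedules rather than a plain mean, but the screening step above is the same.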

Keywords: building archetypes, data analysis, energy benchmarks, GHG emissions

Procedia PDF Downloads 293
24945 Exploring Factors Associated with Substance Use among Pregnant Women in a Cape Town Community

Authors: Mutshinye Manguvhewa, Maria Florence, Mansoo Yu, Elize Koch, Kamal Kamaloodien

Abstract:

Substance use among pregnant women is a perennial problem in the Western Cape Province of South Africa, and many influential factors are associated with substance use among women of childbearing age. This study explored factors associated with substance use among pregnant women using a qualitative research design, with the bio-ecological theoretical framework guiding the researcher throughout the study. Participants were selected using purposive sampling; only participants accessed through the Department of Social Development who met the inclusion criteria were interviewed, using semi-structured interviews. Immediate referral for psychological intervention during the interview was available for participants who needed it. Braun and Clarke's six phases of thematic analysis were utilised to analyse the data. The study adhered to ethical guidelines for the participants' protection: participants were informed about the study before the initiation of the interviews, and the details of their voluntary participation were explained. The key findings illustrate that socio-cultural factors, personal factors, emotional responses, and intimate relationships are the major contributors to substance use among pregnant women in this sample. The results outline the preventative measures that pregnant women implement, and the study reveals the positive and negative perceptions of substance use programmes that participants share. Some of the findings are similar to the existing literature, while others differ. Recommendations emanating from the study include that stakeholders, rehabilitation centres, the Department of Health, and future researchers should act proactively against substance use during pregnancy.

Keywords: substance addiction, antenatal care, pregnancy, substance use

Procedia PDF Downloads 113
24944 Collision Detection Algorithm Based on Data Parallelism

Authors: Zhen Peng, Baifeng Wu

Abstract:

Modern computing technology has entered the era of parallel computing, with a trend toward sustainable and scalable parallelism. Single Instruction Multiple Data (SIMD) is an important way to follow this trend: it gathers more and more computing capability by increasing the number of processor cores without requiring modification of the program. Meanwhile, in the fields of scientific computing and engineering design, many computation-intensive applications face the challenge of increasingly large amounts of data, and data-parallel computing will be an important way to further improve their performance. In this paper, we take accurate collision detection in building information modeling as an example and demonstrate a model for constructing a data-parallel algorithm. According to the model, a complex object is decomposed into sets of simple objects, and collision detection among complex objects is converted into detection among simple objects. The resulting algorithm is a typical SIMD algorithm, and its advantages in parallelism and scalability clearly exceed those of traditional algorithms.
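The decomposition idea can be sketched in a few lines; this is an illustrative reconstruction using axis-aligned bounding boxes as the "simple objects", not the authors' actual implementation.

```python
def aabb_overlap(a, b):
    """Overlap test for axis-aligned boxes given as
    (min_x, min_y, min_z, max_x, max_y, max_z)."""
    return all(a[i] <= b[i + 3] and b[i] <= a[i + 3] for i in range(3))

def complex_collide(parts_a, parts_b):
    """Collision between complex objects, reduced to tests among simple parts.
    Every (a, b) pair is independent of the others, so the inner loop maps
    directly onto SIMD lanes or other data-parallel execution."""
    return any(aabb_overlap(a, b) for a in parts_a for b in parts_b)

# Two hypothetical building elements, each decomposed into unit boxes.
wall = [(x, 0, 0, x + 1, 1, 1) for x in range(3)]
duct = [(2.5, 0.5, 0.5, 3.5, 1.5, 1.5)]
```

Because each pairwise test reads its own data and writes an independent result, the same structure vectorizes cleanly once the box coordinates are stored in contiguous arrays.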

Keywords: data parallelism, collision detection, single instruction multiple data, building information modeling, continuous scalability

Procedia PDF Downloads 279
24943 Relevance of Copyright and Trademark in the Gaming Industry

Authors: Deeksha Karunakar

Abstract:

The gaming industry is one of the biggest industries in the world. Video games are interactive works of authorship that require the execution of a computer programme on specialized hardware but also incorporate a wide variety of other artistic media, such as music, scripts, stories, video, paintings, and characters, in which the player takes an active role. Video games are therefore not made as singular, simple works but rather as collections of elements, each of which can be copyrighted on its own if it reaches a certain level of originality and creativity. A video game is made up of a wide variety of parts, all of which combine to form the overall sensation that we, the players, have while playing. All of these components are implemented as software code, which is then translated into the game's user interface. While copyright protection is already in place for software code, the works produced by that code can also be protected by copyright, including the game's storyline or narrative, its characters, and even elements of the code themselves. Every sector requires an appropriate legal framework, and the gaming industry is no exception; this underscores the importance of intellectual property laws in each sector. This paper explores the beginnings of video games, the various aspects of game copyrights, and the approach of the courts, with examples from several instances. Although the creative arts have always drawn inspiration from and built upon the works of others, it has not always been simple to evaluate whether a game has been cloned. The video game business is experiencing unprecedented growth, and the majority of today's video games are both pieces of software and works of audio-visual art. 
Even though the existing legal framework does not have a clause specifically addressing video games, there are clearly many alternative means by which this protection can be granted. This paper demonstrates the importance of copyright and trademark laws in the gaming industry and their regulation with the help of relevant case laws, utilizing a doctrinal methodology to support its findings. The aim of the paper is to raise awareness of the applicability of intellectual property laws in the gaming industry and of how the justice system is evolving to adapt to such new industries, and to provide in-depth knowledge of their relationship with each other.

Keywords: copyright, DMCA, gaming industry, trademark, WIPO

Procedia PDF Downloads 57
24942 Changing Arbitrary Data Transmission Period by Using Bluetooth Module on Gas Sensor Node of Arduino Board

Authors: Hiesik Kim, Yong-Beom Kim, Jaheon Gu

Abstract:

Internet of Things (IoT) applications are widely serviced and spread worldwide, and local wireless data transmission techniques must be developed to keep pace with them. Bluetooth is a wireless data communication technique developed by the Bluetooth Special Interest Group (SIG) that uses the 2.4 GHz frequency range and exploits frequency hopping to avoid collisions with other devices. For the experiment, equipment for transmitting measured data was built from an Arduino open-source hardware board, a gas sensor, and a Bluetooth module, and an algorithm controlling the transmission rate is demonstrated. The experiment also involved developing an Android application that receives the measured data, and the experimental results show that controlling the transmission rate is feasible. In the future, the communication algorithm should be improved, because a few errors occur when data are transmitted or received.
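The period-changing logic can be sketched independently of the hardware; the following is a host-side Python sketch in which the receiver sends a hypothetical "SET &lt;ms&gt;" command over the Bluetooth serial link (the command format and `PeriodController` class are assumptions for illustration, not the paper's actual protocol).

```python
class PeriodController:
    """Sketch of arbitrary transmission-period control: the receiving side
    sends 'SET <ms>' over the serial link, and the sender adjusts the
    interval at which it transmits gas-sensor readings."""

    def __init__(self, period_ms=1000):
        self.period_ms = period_ms
        self._last_sent = 0

    def handle_command(self, line):
        """Parse one line received from the Bluetooth module."""
        parts = line.strip().split()
        if len(parts) == 2 and parts[0] == "SET" and parts[1].isdigit():
            self.period_ms = int(parts[1])

    def due(self, now_ms):
        """Return True when a new sensor reading should be transmitted."""
        if now_ms - self._last_sent >= self.period_ms:
            self._last_sent = now_ms
            return True
        return False

ctrl = PeriodController(period_ms=1000)
ctrl.handle_command("SET 200")  # receiver asks for a faster 200 ms period
```

On the Arduino side the same logic would sit in the main loop, comparing `millis()` against the last transmission time.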

Keywords: Arduino, Bluetooth, gas sensor, IoT, transmission

Procedia PDF Downloads 269
24941 Real-Time Sensor Fusion for Mobile Robot Localization in an Oil and Gas Refinery

Authors: Adewole A. Ayoade, Marshall R. Sweatt, John P. H. Steele, Qi Han, Khaled Al-Wahedi, Hamad Karki, William A. Yearsley

Abstract:

Understanding the behavioral characteristics of sensors is a crucial step in fusing data from several sensors of different types. This paper introduces a practical, real-time approach to integrating heterogeneous sensor data so as to localize a mobile robot with higher accuracy than would be possible from any individual sensor. We use this approach in both indoor and outdoor environments, and it is especially appropriate for environments such as oil and gas refineries because of their sparse and featureless nature. We have studied the individual contribution of each sensor's data to the overall accuracy achieved by the fusion process. A sequential-update Extended Kalman Filter (EKF) with validation gates was used to integrate GPS data, compass data, WiFi data, Inertial Measurement Unit (IMU) data, vehicle velocity, and pose estimates from a fiducial marker system. Results show that the approach can enable a mobile robot to navigate autonomously in any environment using a priori information.
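The gated sequential update can be illustrated with a one-dimensional sketch; the scalar state, the noise values, and the 6.63 gate threshold (chi-square at 99% for one degree of freedom) are illustrative choices, not the paper's actual tuning.

```python
def gated_update(x, P, z, h, r, gate=6.63):
    """One sequential scalar EKF measurement update with a validation gate.
    The innovation nu = z - h*x is accepted only if nu^2/S falls inside the
    chi-square gate; gated-out measurements are skipped rather than fused."""
    nu = z - h * x
    s = h * P * h + r               # innovation variance S
    if nu * nu / s > gate:
        return x, P, False          # reject: outside the validation gate
    k = P * h / s                   # Kalman gain
    return x + k * nu, (1.0 - k * h) * P, True

# Fuse a plausible GPS-like measurement, then reject a gross outlier.
x, P = 0.0, 4.0
x, P, ok1 = gated_update(x, P, z=1.0, h=1.0, r=1.0)   # accepted
x, P, ok2 = gated_update(x, P, z=20.0, h=1.0, r=1.0)  # gated out
```

Processing measurements one at a time like this is what makes the sequential form convenient for heterogeneous sensors: each sensor's reading is gated and fused on arrival, at its own rate.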

Keywords: inspection mobile robot, navigation, sensor fusion, sequential update extended Kalman filter

Procedia PDF Downloads 461
24940 Fault Location Detection in Active Distribution System

Authors: R. Rezaeipour, A. R. Mehrabi

Abstract:

The recent increase of DGs and microgrids in distribution systems disturbs the traditional structure of the system, and coordination between protection devices in such a system becomes a concern for network operators. This paper presents a new method for fault location detection in active distribution networks, independent of the fault type or its resistance. The method uses synchronized voltage and current measurements at the interconnections of DG units and is able to adapt to changes in the topology of the system. The method has been tested on a 38-bus distribution system, with very encouraging results.

Keywords: fault location detection, active distribution system, microgrids, network operators

Procedia PDF Downloads 772
24939 Energy Efficient Massive Data Dissemination Through Vehicle Mobility in Smart Cities

Authors: Salman Naseer

Abstract:

One of the main challenges of operating a smart city (SC) is collecting the massive data generated by multiple data sources (DS) and transmitting them to the control units (CU) for further processing and analysis. These ever-increasing data demands not only require more and more transmission-channel capacity but also result in resource over-provisioning to meet resilience requirements, and thus in unavoidable waste because of data fluctuations throughout the day. In addition, the high energy consumption (EC) and carbon discharges from these data transmissions pose serious issues for the environment we live in. Therefore, to overcome the intensive EC and carbon emissions (CE) of massive data dissemination in smart cities, we propose an energy-efficient, carbon-reducing approach that utilizes the daily mobility of existing vehicles as an alternative communication channel to accommodate data dissemination in smart cities. To illustrate the effectiveness and efficiency of our approach, we take Auckland City in New Zealand as an example, assuming massive data generated by various sources geographically scattered throughout the Auckland region and destined for control centres located in the city centre. The numerical results show that our proposed approach can provide up to 5 times lower delay when transferring large volumes of data via the existing vehicles' daily mobility than via the conventional transmission network. Moreover, our proposed approach offers about 30% lower EC and CE than the conventional network transmission approach.

Keywords: smart city, delay tolerant network, infrastructure offloading, opportunistic network, vehicular mobility, energy consumption, carbon emission

Procedia PDF Downloads 130
24938 Exploring Data Stewardship in Fog Networking Using Blockchain Algorithm

Authors: Ruvaitha Banu, Amaladhithyan Krishnamoorthy

Abstract:

IoT networks today solve various consumer problems, from home automation systems to aiding the driving of autonomous vehicles, through the deployment of multiple devices. For example, in an autonomous vehicle environment, multiple sensors are available on roads to monitor weather and road conditions and interact with each other to help the vehicle reach its destination safely and on time. IoT systems are predominantly dependent on the cloud environment for data storage and computing needs, which results in latency problems. With the advent of fog networks, some of this storage and computing is pushed to the edge/fog nodes, saving network bandwidth and reducing latency proportionally. Managing the data stored in these fog nodes becomes crucial, as they may also store sensitive information required by certain applications. Data management in fog nodes is strenuous because fog networks are dynamic in their availability and hardware capability; it becomes more challenging when nodes in the network are also short-lived, detaching and joining frequently. When an end user or fog node wants to access, read, or write data stored in another fog node, a new protocol becomes necessary, because a conventional static way of managing the data does not work in fog networks. The proposed solution is a protocol that defines sensitivity levels for the data being written and read. Additionally, a distinct data distribution and replication model among the fog nodes is established to decentralize the access mechanism. In this paper, the proposed model implements stewardship of the data stored in the fog node using reinforcement learning, so that access to the data is determined dynamically based on the requests.
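A minimal sketch of the reinforcement-learning idea, reduced here to a one-step (bandit-style) tabular update per sensitivity level; the sensitivity levels, grant/deny actions, and reward signal are assumptions made for illustration and are not the paper's actual model.

```python
import random

class AccessPolicy:
    """Tabular sketch: state = data sensitivity level, action = grant/deny.
    Rewards (supplied by the environment) are assumed to penalize exposing
    highly sensitive data; the policy adapts its access decisions over time."""
    ACTIONS = ("grant", "deny")

    def __init__(self, levels=3, alpha=0.5, epsilon=0.1):
        self.q = {(s, a): 0.0 for s in range(levels) for a in self.ACTIONS}
        self.alpha, self.epsilon = alpha, epsilon

    def decide(self, level):
        if random.random() < self.epsilon:      # occasional exploration
            return random.choice(self.ACTIONS)
        return max(self.ACTIONS, key=lambda a: self.q[(level, a)])

    def learn(self, level, action, reward):
        """One-step value update toward the observed reward."""
        key = (level, action)
        self.q[key] += self.alpha * (reward - self.q[key])
```

A full treatment would also condition the state on requester identity and node churn, but this shows how repeated feedback can make access decisions dynamic rather than statically configured.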

Keywords: IoT, fog networks, data stewardship, dynamic access policy

Procedia PDF Downloads 47