Search results for: data integrity and privacy
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25449

24429 Thermomechanical Behaviour of Various Pressurized Installations Subjected to Thermal Load Due to the Combustion of Metal Particles

Authors: Khaled Ayfi, Morgan Dal, Frederic Coste, Nicolas Gallienne, Martina Ridlova, Philippe Lorong

Abstract:

In the gas industry, contamination of equipment by metal particles is one of the feared phenomena. Indeed, particles inside equipment can be driven by the gas flow and accumulate in places where the velocity is low. As they constitute a potential ignition hazard, particular attention is paid to the presence of particles in the oxygen industry. Indeed, the heat release from ignited particles may damage the equipment and even result in a loss of integrity. The objective of this work is to support the development of new design criteria. Studying the thermomechanical behavior of this equipment, thanks to numerical simulations, allows us to test the influence of various operating parameters (oxygen pressure, wall thickness, initial operating temperature, nature of the metal, etc.). Therefore, in this study, we propose a numerical model that describes the thermomechanical behavior of various pressurized installations heated locally by the combustion of small particles. This model takes into account the geometric and material nonlinearity and has been validated by the comparison of simulation results with experimental measurements obtained by a new device developed in this work.

Keywords: ignition, oxygen, numerical simulation, thermomechanical behaviour

Procedia PDF Downloads 147
24428 The Influence of High Temperatures on HVFA Concrete Columns by NDT Methods

Authors: D. Jagath Kumari, K. Srinivasa Rao

Abstract:

Quality assurance of structures subjected to high temperatures is now an enforced measure for structural engineers. The existing relations between strength and nondestructive measurements, established under normal conditions, are not suitable for concretes that have been exposed to high temperatures. The scope of this work is to investigate the influence of high temperatures of short duration on the residual properties of reinforced HVFA concrete columns that affect strength, using non-destructive tests (NDT). Fly ash concrete is increasingly used in the design of normal-strength, high-strength and high-performance concretes. In this paper, the authors examine the influence of high temperatures on HVFA concrete columns. The columns were heated from 100 °C to 800 °C in increments of 100 °C and allowed to cool to room temperature by two methods: air cooling and immediate water quenching. All specimens were tested identically, before and after heating, for compressive strength and material integrity by rebound hammer and ultrasonic pulse velocity (UPV) meter, respectively. HVFA concrete retained more residual strength with the water quenching method than with air cooling.

Keywords: HVFA concrete, NDT methods, residual strength, non-destructive tests

Procedia PDF Downloads 451
24427 Authorization of Commercial Communication Satellite Grounds for Promoting Turkish Data Relay System

Authors: Celal Dudak, Aslı Utku, Burak Yağlioğlu

Abstract:

Uninterrupted, continuous satellite communication throughout the whole orbit is becoming more indispensable every day. Data relay systems are developed and built for various high/low data rate information exchanges, such as TDRSS of the USA and EDRS of Europe. In these missions, a few task-dedicated communication satellites exist. In this regard, a data relay system for Turkey is defined for exchanging low data rate information (i.e. TTC) with Earth-observing LEO satellites by appointing commercial GEO communication satellites all over the world. First, a justification of this approach is given, demonstrating the duration enhancements in the link. A discussion of the preference for RF communication instead of laser communication is also given. Then, the preferred GEO communication satellites (including TURKSAT4A, which already belongs to Turkey) are presented, together with the coverage enhancements obtained through STK simulations and the corresponding link budget. A block diagram of the communication system on the LEO satellite is also given.

Keywords: communication, GEO satellite, data relay system, coverage

Procedia PDF Downloads 436
24426 The Development of Encrypted Near Field Communication Data Exchange Format Transmission in an NFC Passive Tag for Checking the Genuine Product

Authors: Tanawat Hongthai, Dusit Thanapatay

Abstract:

This paper presents the development of encrypted near field communication (NFC) data exchange format transmission in an NFC passive tag, to assess the feasibility of implementing genuine product authentication. We organize the research on encryption and genuine-product checking into four major categories: concept, infrastructure, development and applications. The results show that a passive NFC Forum Type 2 tag can be configured to be compatible with the NFC data exchange format (NDEF), and that its data can be automatically and partially updated when an NFC field is present.
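
As a rough illustration of the idea, the Python sketch below encrypts a product identifier and wraps it in a minimal NDEF text record. The record layout is simplified, and the symmetric-key scheme, payload and helper names are our own assumptions, not the authors' protocol.

```python
from cryptography.fernet import Fernet

def ndef_text_record(payload: bytes, lang: bytes = b"en") -> bytes:
    """Wrap a payload in a minimal single NDEF 'T' (text) short record.
    Layout (simplified): flags | type-len | payload-len | type | payload."""
    body = bytes([len(lang)]) + lang + payload
    assert len(body) < 256, "short-record form only"
    return bytes([0xD1, 0x01, len(body)]) + b"T" + body

key = Fernet.generate_key()                        # shared secret (assumed scheme)
token = Fernet(key).encrypt(b"product-id:12345")   # hypothetical product ID
record = ndef_text_record(token)                   # bytes written to the Type 2 tag

# A verifier reads the tag, skips the record header and language code,
# and authenticates the product by decrypting the payload.
status = record[4]                                 # length of the language code
assert Fernet(key).decrypt(record[5 + status:]) == b"product-id:12345"
```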

Keywords: near field communication, NFC data exchange format, checking the genuine product, encrypted NFC

Procedia PDF Downloads 272
24425 Data Hiding by Vector Quantization in Color Image

Authors: Yung Gi Wu

Abstract:

With the growth of computers and networks, digital data can be spread anywhere in the world quickly. In addition, digital data can easily be copied or tampered with, so security has become an important topic in the protection of digital data. A digital watermark is a method to protect the ownership of digital data, although embedding the watermark inevitably affects image quality. In this paper, vector quantization (VQ) is used to embed the watermark into the image to achieve data hiding. This kind of watermarking is invisible, meaning that users will not be conscious of the embedded watermark even though the embedded image differs slightly from the original image. Because VQ carries a heavy computational burden, we adopt a fast VQ encoding scheme based on partial distortion search (PDS) and a mean approximation scheme to speed up the data hiding process. The watermarks hidden in the image can be gray-level, bi-level or color images; text can also be embedded as a watermark. To test the robustness of the system, we use Photoshop to sharpen, crop and alter the image and check whether the extracted watermark is still recognizable. Experimental results demonstrate that the proposed system can resist these three kinds of tampering in general cases.
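
The fast encoding step the abstract mentions can be illustrated as follows: a minimal Python sketch of partial distortion search (PDS), which abandons a candidate codeword as soon as its accumulated distortion exceeds the best found so far. The block and codebook sizes are arbitrary, and the authors' mean approximation pre-filter is not shown.

```python
import numpy as np

def pds_encode(blocks, codebook):
    """Map each image block to its nearest codeword using partial
    distortion search: the squared-error sum is accumulated dimension by
    dimension and a candidate is rejected early once it exceeds the
    best distortion found so far."""
    indices = np.empty(len(blocks), dtype=int)
    for n, x in enumerate(blocks):
        best_d, best_i = np.inf, -1
        for i, c in enumerate(codebook):
            d = 0.0
            for j in range(len(x)):
                d += (x[j] - c[j]) ** 2
                if d >= best_d:          # early rejection: partial sum
                    break                # already exceeds current best
            else:                        # full sum computed and smaller
                best_d, best_i = d, i
        indices[n] = best_i
    return indices

rng = np.random.default_rng(0)
blocks = rng.random((100, 16))           # 4x4 image blocks, flattened
codebook = rng.random((64, 16))          # 64 codewords
print(pds_encode(blocks, codebook)[:10])
```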

Keywords: data hiding, vector quantization, watermark, color image

Procedia PDF Downloads 358
24424 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model

Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin

Abstract:

Early detection of anomalies in data centers is important to reduce downtime and the cost of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers that combines sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct its inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained on the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. Performance is assessed a posteriori through the F1-score by comparing detected anomalies with the data center's history. The proposed model, with an F1-score of 83.60%, outperforms the state-of-the-art reconstruction method, which uses a single autoencoder taking multivariate sequences, detects an anomaly with a threshold on the reconstruction error, and scores 24.16%.
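
A minimal sketch of the described pipeline, one LSTM autoencoder per sensor feeding residual features to a random forest, is given below in Python with synthetic placeholder data. Layer sizes, window lengths and feature choices are illustrative assumptions, not the authors' configuration.

```python
import numpy as np
import tensorflow as tf
from sklearn.ensemble import RandomForestClassifier

def make_lstm_autoencoder(timesteps, n_features=1):
    """LSTM encoder-decoder that reconstructs its input window."""
    inputs = tf.keras.Input(shape=(timesteps, n_features))
    z = tf.keras.layers.LSTM(16)(inputs)                       # encode
    z = tf.keras.layers.RepeatVector(timesteps)(z)
    out = tf.keras.layers.LSTM(16, return_sequences=True)(z)   # decode
    out = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(n_features))(out)
    model = tf.keras.Model(inputs, out)
    model.compile(optimizer="adam", loss="mse")
    return model

# Placeholder windows of shape (n_windows, timesteps, 1) per sensor; in the
# paper these come from real measurements and training uses normal data only.
rng = np.random.default_rng(0)
sensors = {s: rng.normal(size=(256, 32, 1)).astype("float32")
           for s in ("temperature", "humidity", "power")}
labels = rng.integers(0, 2, 256)           # placeholder normal/anomaly labels

residual_features = []
for name, windows in sensors.items():      # one autoencoder per sensor
    ae = make_lstm_autoencoder(windows.shape[1])
    ae.fit(windows, windows, epochs=5, batch_size=64, verbose=0)
    err = np.abs(windows - ae.predict(windows, verbose=0))   # difference signal
    residual_features.append(np.c_[err.mean(axis=(1, 2)), err.max(axis=(1, 2))])

X = np.hstack(residual_features)           # residual features from all sensors
clf = RandomForestClassifier(n_estimators=200).fit(X, labels)
```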

Keywords: anomaly detection, autoencoder, data centers, deep learning

Procedia PDF Downloads 187
24423 Genetic Screening of Sahiwal Bulls for Higher Fertility

Authors: Atul C. Mahajan, A. K. Chakravarty, V. Jamuna, C. S. Patil, Neeraj Kashyap, Bharti Deshmukh, Vijay Kumar

Abstract:

The selection of Sahiwal bulls on the basis of their dams' best lactation milk yield under the country's herd breeding programme, while neglecting fertility traits, leads to deterioration in performance and economy. The goal of this study was to explore polymorphism of the CRISP2 gene and its association with semen traits (post-thaw motility, hypo-osmotic swelling test (HOST), acrosome integrity, DNA fragmentation and capacitation status), scrotal circumference, and expected predicted difference (EPD) for milk yield and fertility. The Sahiwal bulls included in the present study were 60 bulls used in the breeding programme as well as 50 young bulls yet to be included in it. All Sahiwal bulls were found to be polymorphic for the CRISP2 gene (AA, AG and GG genotypes) within exon 7 at position 589 of the CRISP2 mRNA, as determined by PCR-SSCP and sequencing. Semen analyses were done on frozen semen doses of the 60 breeding bulls pertaining to four seasons (winter, summer, rainy and autumn). The scrotal circumference was measured on the existing Sahiwal breeding bulls in the herd (n=47). The effects of non-genetic factors on reproduction traits were studied by the least-squares technique, and the significance of differences in means between subclasses of season, period, parity and age group was tested. The data were adjusted for the significant non-genetic factors to remove differential environmental effects. The adjusted data were used to generate traits such as waiting period (WP), pregnancy rate (PR) and expected predicted difference (EPD) of fertility. Genetic and phenotypic parameters of reproduction traits were estimated. The overall least-squares means of age at first calving (AFC), service period (SP) and WP were estimated as 36.69 ± 0.18 months, 120.47 ± 8.98 days and 79.78 ± 3.09 days, respectively. Season and period of birth had a significant effect (p < 0.01) on AFC. AFC was highest for the autumn season of birth, followed by summer, winter and rainy. Season and period of calving had a significant effect (p < 0.01) on the SP and WP of Sahiwal cows. The WP for Sahiwal cows was standardized based on four developed prediction models for pregnancy rate at 42, 63, 84 and 105 days using all lactation records, and was standardized as 42 days. A selection criterion was developed for Sahiwal breeding bulls and young Sahiwal bulls on the basis of EPD of fertility. Genotype had a significant effect on the expected predicted difference of fertility and on some semen parameters, such as post-thaw motility and HOST. The AA genotype of the CRISP2 gene revealed a better EPD for fertility than for milk yield. Bulls with the AA genotype also had a higher scrotal circumference than those with other genotypes. Among the young Sahiwal bulls, only AA genotypes were present, with similar patterns. Thus, on the basis of the association of genotype with seminal traits, EPD of milk yield and EPD for fertility, the AA and AG genotypes of the CRISP2 gene were better for higher fertility in Sahiwal bulls.

Keywords: expected predicted difference, fertility, Sahiwal, waiting period

Procedia PDF Downloads 583
24422 Consumer Trust in User-Generated Brand Recommendations on Social Networking Sites

Authors: Minimol M. C.

Abstract:

The study provides insights into consumers' trust in user-generated brand recommendations on social networking sites and investigates the role of ad scepticism in generating that trust. The work contributes to a better understanding of trust development in the context of social networking sites. Specifically, the study reveals that not all dimensions of trustworthiness are equal, and that individual user characteristics vary from person to person. The major finding of this study is that high degrees of trust toward user-generated brand recommendations can be generated on the basis of high trust toward social networking sites and ad scepticism. Consumers trust user-generated brand recommendations based on their trust in the particular social networking platform and their individual level of ad scepticism. The study pinpoints that because consumers' trust in user-generated brand recommendations is affected by their trust in social networking sites, and is influenced to a great extent by benevolence, integrity, the propensity to trust, and individual user characteristics, brands should attempt to build on these factors so that they can engage consumers to generate user-generated content on social media.

Keywords: consumer trust, user-generated brand recommendations, ad scepticism, social networking sites

Procedia PDF Downloads 97
24421 Influence of Build Orientation on Machinability of Selective Laser Melted Titanium Alloy-Ti-6Al-4V

Authors: Manikandakumar Shunmugavel, Ashwin Polishetty, Moshe Goldberg, Junior Nomani, Guy Littlefair

Abstract:

Selective laser melting (SLM), a promising additive manufacturing (AM) technology, has huge potential in the fabrication of Ti-6Al-4V near-net shape components. However, the poor surface finish of components fabricated by this technology requires secondary machining to achieve the desired accuracy and tolerance. Therefore, a systematic understanding of the machinability of SLM-fabricated Ti-6Al-4V components is paramount to improving productivity and product quality. Considering the significance of machining for SLM-fabricated Ti-6Al-4V components, this research aims to study the influence of build orientation on machinability characteristics by performing low-speed orthogonal cutting tests. In addition, the machinability of SLM-fabricated Ti-6Al-4V is compared with that of conventionally produced wrought Ti-6Al-4V to understand the influence of SLM technology on machining. This paper attempts to provide evidence for the hypothesis that build orientation influences cutting forces, chip formation and surface integrity during orthogonal cutting of SLM Ti-6Al-4V samples. Results obtained from the low-speed orthogonal cutting tests highlight the practical importance of microstructure and build orientation for the machinability of SLM Ti-6Al-4V.

Keywords: additive manufacturing, build orientation, machinability, titanium alloys (Ti-6Al-4V)

Procedia PDF Downloads 279
24420 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R

Authors: Pavel H. Llamocca, Victoria Lopez

Abstract:

The main objective of our work is the comparative analysis of environmental data from open data bases belonging to different governments, which requires integrating data from various sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, so the number of applications based on Open Data is increasing. However, each government has its own procedures for publishing its data, which results in a variety of data set formats, because there are no international standards specifying the formats of data sets in open data bases. Due to this variety of formats, we must build a data integration process that is able to put together all kinds of formats. Some software tools have been developed to support the integration process, e.g. Data Tamer and Data Wrangler. The problem with these tools is that they need a data scientist to take part in the integration process as a final step. In our case, we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve an automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, wind speed, etc. For the past two years, the government of Madrid has been publishing its open data bases of environmental indicators in real time. In the same way, other governments have published open data sets relative to the environment (such as Andalucia or Bilbao). But all of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to run and visualize analyses over the real-time data. Once the integration task is done, all the data from any government have the same format, and the analysis process can start in a computationally better way. So the tool presented in this work has two goals: 1. an integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). However, in order to open our software tool, as a second approach we also developed an implementation in the R language as a mature open-source technology. R is a really powerful open-source programming language that allows us to process and analyze a huge amount of data with high performance. There are also R libraries for building a graphic interface, such as shiny. A performance comparison between both implementations was made, and no significant differences were found. In addition, our work provides an official real-time integrated data set about environmental data in Spain to any developer, so that they can build their own applications.
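
As an illustration of the kind of format harmonization the integration process performs, here is a minimal Python/pandas sketch (the authors' implementations use Java/Oracle and R). The source schemas, column names and date formats are invented for the example.

```python
import pandas as pd

# Invented per-source schemas: each portal publishes the same indicators
# under different column names and date formats.
SOURCE_SCHEMAS = {
    "madrid":    {"fecha": "date", "temperatura": "temperature_c", "no2": "no2_ugm3"},
    "andalucia": {"day": "date", "temp": "temperature_c", "NO2": "no2_ugm3"},
}

def harmonize(frames):
    """Rename columns to a common schema, parse dates, and concatenate."""
    parts = []
    for source, df in frames.items():
        mapped = df.rename(columns=SOURCE_SCHEMAS[source])
        mapped["date"] = pd.to_datetime(mapped["date"], dayfirst=True)
        mapped["source"] = source
        parts.append(mapped[["source", "date", "temperature_c", "no2_ugm3"]])
    return pd.concat(parts, ignore_index=True).sort_values("date")

madrid = pd.DataFrame({"fecha": ["01/06/2023"], "temperatura": [28.4], "no2": [31.0]})
andalucia = pd.DataFrame({"day": ["02/06/2023"], "temp": [30.1], "NO2": [22.5]})
print(harmonize({"madrid": madrid, "andalucia": andalucia}))
```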

Keywords: open data, R language, data integration, environmental data

Procedia PDF Downloads 309
24419 FCNN-MR: A Parallel Instance Selection Method Based on Fast Condensed Nearest Neighbor Rule

Authors: Lu Si, Jie Yu, Shasha Li, Jun Ma, Lei Luo, Qingbo Wu, Yongqi Ma, Zhengji Liu

Abstract:

The instance selection (IS) technique is used to reduce data size to improve the performance of data mining methods. Recently, to process very large data sets, several proposed methods divide the training set into disjoint subsets and apply IS algorithms independently to each subset. In this paper, we analyze the limitations of these methods and give our viewpoint on how to divide and conquer in the IS procedure. Then, based on the fast condensed nearest neighbor (FCNN) rule, we propose an instance selection method for large data sets within the MapReduce framework. Besides ensuring prediction accuracy and reduction rate, it has two desirable properties: first, it reduces the workload in the aggregation node; second, and most important, it produces the same result as the sequential version, which other parallel methods cannot achieve. We evaluate the performance of FCNN-MR on one small data set and two large data sets. The experimental results show that it is effective and practical.
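
For reference, the serial FCNN rule the method builds on can be sketched as follows in Python. The MapReduce parallelization that is the paper's actual contribution is not shown, and the seeding and tie-handling details are simplified assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def fcnn_reduce(X, y):
    """Serial FCNN-style condensation sketch: seed with the point nearest
    each class centroid, then repeatedly add, for each selected prototype,
    its nearest misclassified point, until the selected set classifies the
    whole training set correctly with 1-NN."""
    S = []
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        centroid = X[idx].mean(axis=0)
        S.append(idx[np.argmin(((X[idx] - centroid) ** 2).sum(axis=1))])
    S = list(dict.fromkeys(S))                   # de-duplicate seeds
    while True:
        knn = KNeighborsClassifier(n_neighbors=1).fit(X[S], y[S])
        wrong = np.setdiff1d(np.where(knn.predict(X) != y)[0], S)
        if len(wrong) == 0:
            return np.array(S)
        # nearest prototype (by position in S) of each misclassified point
        nearest = knn.kneighbors(X[wrong], return_distance=False)[:, 0]
        new = set()
        for p in range(len(S)):                  # one representative per prototype
            mask = nearest == p
            if mask.any():
                d = ((X[wrong[mask]] - X[S[p]]) ** 2).sum(axis=1)
                new.add(int(wrong[mask][np.argmin(d)]))
        S.extend(sorted(new))

from sklearn.datasets import make_blobs
X, y = make_blobs(n_samples=500, centers=3, random_state=0)
print(f"kept {len(fcnn_reduce(X, y))} of {len(X)} instances")
```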

Keywords: instance selection, data reduction, MapReduce, kNN

Procedia PDF Downloads 250
24418 Structural Health Monitoring of Buildings and Infrastructure

Authors: Mojtaba Valinejadshoubi, Ashutosh Bagchi, Osama Moselhi

Abstract:

Structures such as buildings, bridges, dams and wind turbines need to be maintained against various factors such as deterioration, excessive loads, environment and temperature. Choosing an appropriate monitoring system is important for detecting any critical damage to a structure and addressing it to avoid adverse consequences. Structural Health Monitoring (SHM) has emerged as an effective technique to monitor the health of structures. SHM refers to an ongoing structural performance assessment using different kinds of sensors attached to or embedded in the structures to evaluate their integrity and safety and to help engineers decide on rehabilitation measures. The ability of SHM to identify the location and severity of structural damage, by considering changes in structural characteristics such as frequency, stiffness and mode shapes, helps engineers monitor structures and take the most effective corrective actions to maintain their safety and extend their service life. The main objective of this study is to review the overall SHM process, specifically determining the natural frequency of an instrumented simply supported concrete beam using modal testing and finite element model updating.
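
As a small illustration of the modal-testing idea, the Python sketch below estimates a dominant natural frequency from an acceleration record by peak-picking the amplitude spectrum. The paper's full workflow, including finite element model updating, is not reproduced here.

```python
import numpy as np

def natural_frequency(accel, fs):
    """Estimate the dominant natural frequency (Hz) from an acceleration
    record by locating the peak of its windowed amplitude spectrum."""
    accel = accel - accel.mean()                     # remove DC offset
    spectrum = np.abs(np.fft.rfft(accel * np.hanning(len(accel))))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]        # skip the zero-frequency bin

# Example: a 5 Hz mode sampled at 200 Hz, with measurement noise.
fs = 200.0
t = np.arange(0, 10, 1 / fs)
signal = np.sin(2 * np.pi * 5.0 * t) + 0.1 * np.random.randn(len(t))
print(natural_frequency(signal, fs))                 # approximately 5.0
```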

Keywords: structural health monitoring, natural frequency, modal analysis, finite element model updating

Procedia PDF Downloads 332
24417 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking

Authors: Trevor Toy, Josef Langerman

Abstract:

Around a quarter of the world's data is generated by the financial industry, with an estimated 708.5 billion global non-cash transactions. With Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders to openly, legitimately and consensually share the data required to enable it. Integration and sharing of anonymised transactional data are still operated in silos, centralised among the large corporate entities in the ecosystem that have the resources to do so. Smaller fintechs generating data and businesses looking to consume data are largely excluded from the process. There is therefore a growing demand for accessible transactional data for analytical purposes and to support the rapid global adoption of Open Banking. This research provides a solution framework that aims to offer a secure, decentralised marketplace for 1) data providers to list their transactional data, 2) data consumers to find and access that data, and 3) data subjects (the individuals making the transactions that generate the data) to manage and sell the data that relates to themselves. The platform also provides an integrated system for downstream transaction-related data from merchants, enriching the data product available to build a comprehensive view of a data subject's spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers. This core component is built on a decentralised blockchain contract with a market layer that manages the transaction, user, pricing, payment, tagging, contract, control, and lineage features pertaining to user interactions on the platform. One of the platform's key features is enabling individuals to participate in and manage the personal data they generate. The framework was developed as a proof of concept on the Ethereum blockchain, in which an individual can securely manage access to their own personal data and their identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour in correlation with key demographic information. This platform solution can ultimately support the growth, prosperity and development of economies, businesses, communities and individuals by providing accessible and relevant transactional data for big data analytics and open banking.

Keywords: big data markets, open banking, blockchain, personal data management

Procedia PDF Downloads 70
24416 Experimental Evaluation of Succinct Ternary Tree

Authors: Dmitriy Kuptsov

Abstract:

Tree data structures, such as binary or, in general, k-ary trees, are essential in computer science. Applications of these data structures range from data search and retrieval to sorting and ranking algorithms. Naive implementations can consume prohibitively large volumes of random access memory, limiting their applicability in certain solutions; in such cases, a more advanced representation of these data structures is essential. In this paper, we present the design of a compact version of the ternary tree data structure and demonstrate the results of an experimental evaluation using the static dictionary problem. We compare these results with results for binary and regular ternary trees. The evaluation shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experimental evaluation) than a regular ternary tree, and in certain configurations shows performance comparable to regular ternary trees. We evaluated the performance of the algorithms on both 32- and 64-bit operating systems.
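
For contrast with the succinct design, a plain pointer-based ternary search tree for the static dictionary problem looks like the Python sketch below. The paper's compact representation replaces these per-node pointers with a denser encoding, which is not shown here.

```python
class TSTNode:
    __slots__ = ("ch", "lo", "eq", "hi", "is_word")   # keep nodes small
    def __init__(self, ch):
        self.ch, self.lo, self.eq, self.hi, self.is_word = ch, None, None, None, False

def insert(root, word, i=0):
    """Insert `word` starting at character index i; returns the (new) root."""
    ch = word[i]
    if root is None:
        root = TSTNode(ch)
    if ch < root.ch:
        root.lo = insert(root.lo, word, i)
    elif ch > root.ch:
        root.hi = insert(root.hi, word, i)
    elif i + 1 < len(word):
        root.eq = insert(root.eq, word, i + 1)
    else:
        root.is_word = True
    return root

def contains(root, word, i=0):
    """Iterative membership test: follow lo/hi on mismatch, eq on match."""
    while root is not None:
        ch = word[i]
        if ch < root.ch:
            root = root.lo
        elif ch > root.ch:
            root = root.hi
        elif i + 1 == len(word):
            return root.is_word
        else:
            root, i = root.eq, i + 1
    return False

root = None
for w in ["cat", "car", "dog"]:
    root = insert(root, w)
print(contains(root, "car"), contains(root, "cow"))   # True False
```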

Keywords: algorithms, data structures, succinct ternary tree, performance evaluation

Procedia PDF Downloads 156
24415 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement

Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti

Abstract:

Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow and has always resorted to over-provisioning: data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted, with little thought, is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories. We then establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
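
One common way to realize the quantile idea, provisioning to a high percentile of predicted demand rather than to the mean, is sketched below in Python with synthetic data. The features, the model choice and the 95th percentile are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Placeholder workload history: hour of day and recent usage as features,
# CPU demand as the target. All names and shapes are illustrative.
rng = np.random.default_rng(1)
hour = rng.integers(0, 24, 2000)
recent = rng.uniform(0, 1, 2000)
X = np.c_[hour, recent]
y = 0.4 * recent + 0.3 * np.sin(hour / 24 * 2 * np.pi) + rng.normal(0, 0.05, 2000)

# Quantile regression for the 95th percentile of demand: provisioning at
# this level keeps SLA-violation probability near 5% while reclaiming the
# head-room above it as energy savings.
model = GradientBoostingRegressor(loss="quantile", alpha=0.95)
model.fit(X, y)
upper = model.predict(X[:5])   # provision to these levels, not the mean
print(upper)
```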

Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing

Procedia PDF Downloads 103
24414 Prosperous Digital Image Watermarking Approach by Using DCT-DWT

Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar

Abstract:

Every day, tons of data are embedded in digital media or distributed over the internet. The data are so widely distributed that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information known as a watermark into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image or text impressed onto paper, which provides evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this project, we have concentrated on implementing watermarking in images. The main consideration for any watermarking scheme is its robustness to various attacks.
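
A minimal Python sketch of one typical hybrid DWT-DCT embedding, additively marking mid-band DCT coefficients of the DWT approximation subband, is shown below. The wavelet, subband, coefficient positions and strength alpha are assumptions, not necessarily the authors' scheme.

```python
import numpy as np
import pywt
from scipy.fft import dctn, idctn

def embed_watermark(image, bits, alpha=8.0):
    """Hybrid DWT-DCT sketch: take a one-level Haar DWT, apply a DCT to the
    LL (approximation) subband, and additively embed each watermark bit as
    +alpha / -alpha in a mid-frequency coefficient."""
    LL, (LH, HL, HH) = pywt.dwt2(image.astype(float), "haar")
    coeffs = dctn(LL, norm="ortho")
    # mid-band positions; the offset of 100 skips the lowest frequencies
    idx = np.unravel_index(np.arange(100, 100 + len(bits)), coeffs.shape)
    coeffs[idx] += alpha * (2 * np.asarray(bits) - 1)
    LL_marked = idctn(coeffs, norm="ortho")
    return pywt.idwt2((LL_marked, (LH, HL, HH)), "haar")

rng = np.random.default_rng(0)
host = rng.integers(0, 256, (128, 128)).astype(float)  # stand-in host image
bits = rng.integers(0, 2, 64)                          # 64 watermark bits
marked = embed_watermark(host, bits)
print(np.abs(marked - host).max())                     # embedding distortion
```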

Keywords: watermarking, digital, DCT-DWT, security

Procedia PDF Downloads 417
24413 Machine Learning Data Architecture

Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap

Abstract:

Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and the vending of model output into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort into building the other components of the architecture, which we do not believe is the best use of scientists' bandwidth. We propose a system architecture, created using AWS services, that brings industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists' work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.

Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning

Procedia PDF Downloads 59
24412 Keratin Reconstruction: Evaluation of Green Peptides Technology on Hair Performance

Authors: R. Di Lorenzo, S. Laneri, A. Sacchi

Abstract:

Hair surface properties affect hair texture and shine, whereas the healthy state of the hair cortex sways the hair ends. Even if cosmetic treatments are intrinsically safe, they can have a potentially damaging action on the hair fibers. Loss of luster, frizz, split ends, and other hair problems are particularly prevalent among people who repeatedly alter the natural style of their hair or among people with intrinsically weak hair. Technological and scientific innovations in hair care thus become invaluable allies to preserve its natural well-being and shine. The study evaluated restoring keratin-like ingredients that improve the structural integrity of hair fibers, increase tensile strength, and improve hair manageability and moisturization. The hair shaft is composed of 65-95% keratin, which gives the hair resistance, elasticity and plastic properties and also contributes to its waterproofing. Providing exogenous keratin is therefore a practical approach to protecting and nourishing the hair. By analyzing the amino acid composition of keratin, we find a high frequency of hydrophobic amino acids, which confirms the critical role of interactions, mainly hydrophobic, between cosmetic products and hair. The active ingredient analyzed comes from vegetable proteins through an enzymatic cutting process that selects only oligo- and polypeptides (> 3500 kDa) rich in amino acids with apolar hydrocarbon or sulfur side chains. These chemical components are the amino acids most expressed in the capillary keratin structure, which ensures the greatest possible compatibility with the target substrate. Given the biological variability of the sources, it is difficult to define a constant and reproducible molecular formula of the product; still, it consists of hydroxypropyltrimonium vegetable peptides with keratin-like performance. Twenty natural hair tresses (30 cm in length and 0.50 g in weight) were treated with the investigated product (5% v/v aqueous solution) following a specific protocol and compared with non-treated (control) and benchmark-keratin-treated strands (benchmark). Their brightness, moisture content, cortical and surface integrity, and tensile strength were evaluated and statistically compared. Keratin-like-treated hair tresses showed better results than the other two groups (control and benchmark). The product improves the surface with a significant regularization of cuticle closure, improves the filling of the cortex and the peri-medullar area, gives a highly organized and tidy structure, delivers a significant amount of sulfur to the hair, provides more efficient moisturization and imbibition power, and increases hair brightness. The quaternized hydroxypropyltrimonium group added to the C-terminal end interacts with the negative charges that form on the hair after washing, when it is disheveled and tangled. These interactions anchor the product to the hair surface, keeping the cuticles adhered to the shaft. Their small size allows the peptides to penetrate and give body to the hair, together with a conditioning effect that gives an image of healthy hair. The results suggest that the product is a valid ally in numerous restructuring/conditioning, shaft protection, and straightener/dryer-damage prevention hair care products.

Keywords: conditioning, hair damage, hair, keratin, polarized light microscopy, scanning electron microscope, thermogravimetric analysis

Procedia PDF Downloads 122
24411 Applying ASHRAE Standards on the Hospital Buildings of UAE

Authors: Hanan M. Taleb

Abstract:

Energy consumption associated with buildings has a significant impact on the environment. As the transaction between inside and outside, and between the building and urban space, the building skin plays an especially important role in this respect. It provides protection from the elements, demarcates private property and creates privacy; more importantly, it controls the admission of solar radiation. Therefore, designing the building skin sustainably will help to achieve optimal performance in terms of both energy consumption and thermal comfort. Unfortunately, with accelerating construction expansion, many recent buildings do not pay attention to the importance of envelope design. This piece of research highlights the importance of this part of building creation by providing evidence of a significant reduction in energy consumption when envelopes are redesigned. Consequently, the aim of this paper is to enhance the performance of a hospital envelope in order to achieve sustainable performance. A hospital building sited in Abu Dhabi, in the UAE, has been chosen as a case study. A detailed analysis of the annual energy performance of the case study is performed using computerised simulation, in order to explore its energy performance shortcomings. The energy consumption of the base case is then compared with that resulting from the newly proposed building skin. The results will inform architects and designers of the savings potential of various strategies.

Keywords: ASHRAE, building skin, building envelopes, hospitals, Abu Dhabi, UAE, IES software

Procedia PDF Downloads 358
24410 A Comparison of Image Data Representations for Local Stereo Matching

Authors: André Smith, Amr Abdel-Dayem

Abstract:

The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine the effectiveness of these data representations in reducing the cost for the correct correspondence relative to other possible matches.
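
To make the comparison concrete, the Python sketch below computes the same sum-of-absolute-differences (SAD) cost over several representations of a synthetic stereo pair. SAD is only one common local cost, and the representations and window size are illustrative choices, not the paper's full protocol.

```python
import cv2
import numpy as np

def sad_cost(left, right, x, y, d, win=5):
    """SAD cost of matching pixel (x, y) in the left image to (x - d, y)
    in the right image, over a square window of side `win`."""
    h = win // 2
    patch_l = left[y - h:y + h + 1, x - h:x + h + 1].astype(float)
    patch_r = right[y - h:y + h + 1, x - d - h:x - d + h + 1].astype(float)
    return np.abs(patch_l - patch_r).sum()

# Synthetic pair: the left image is the right image shifted by 7 pixels,
# so the correct disparity is 7 for any representation.
rng = np.random.default_rng(0)
right_bgr = rng.integers(0, 256, (200, 300, 3), dtype=np.uint8)
left_bgr = np.roll(right_bgr, 7, axis=1)

representations = {
    "gray": (cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY),
             cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)),
    "rgb":  (left_bgr, right_bgr),
    "lab":  (cv2.cvtColor(left_bgr, cv2.COLOR_BGR2LAB),
             cv2.cvtColor(right_bgr, cv2.COLOR_BGR2LAB)),
}
for name, (L, R) in representations.items():
    costs = [sad_cost(L, R, x=100, y=100, d=d) for d in range(32)]
    print(name, int(np.argmin(costs)))   # each should recover disparity 7
```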

Keywords: colour data, local stereo matching, stereo correspondence, disparity map

Procedia PDF Downloads 366
24409 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of a high volume and broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purpose of developing their business. Therefore, the recently decentralized data management environment relies on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.

Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)

Procedia PDF Downloads 424
24408 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design

Authors: Qing K. Zhu

Abstract:

Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates in one chip, following Moore's law. Designers encounter numerous report files during design iterations using timing and noise analysis tools. This paper presents our work using data mining techniques, combined with HTML tables, to extract and represent critical timing/noise data. When we apply this data mining tool in real applications, running speed is important; the software employs table look-up techniques in its programming to achieve reasonable running speed, based on performance testing results. We added several advanced features for the application in an industrial chip design.
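
A toy Python sketch of the report-mining idea, regex extraction of failing paths, a dictionary (table look-up) index, and HTML-table output, is given below. The report line format is invented: real timing tools emit different formats.

```python
import re
from collections import defaultdict

# Hypothetical timing-report line format; real tool reports differ.
LINE = re.compile(r"^(?P<path>\S+)\s+slack\s+(?P<slack>-?\d+\.\d+)")

def mine_report(lines, threshold=0.0):
    """Extract failing timing paths and index them by top-level block;
    the dictionary acts as a table look-up for fast repeated queries."""
    by_block = defaultdict(list)
    for line in lines:
        m = LINE.match(line)
        if m and float(m["slack"]) < threshold:
            block = m["path"].split("/")[0]
            by_block[block].append((m["path"], float(m["slack"])))
    return by_block

def to_html(by_block):
    """Render the index as an HTML table, worst slack first per block."""
    rows = "".join(
        f"<tr><td>{blk}</td><td>{path}</td><td>{slack:.3f}</td></tr>"
        for blk, paths in sorted(by_block.items())
        for path, slack in sorted(paths, key=lambda p: p[1])
    )
    return ("<table><tr><th>Block</th><th>Path</th><th>Slack (ns)</th></tr>"
            + rows + "</table>")

report = [
    "cpu/alu/add1 slack -0.120",
    "cpu/alu/add2 slack 0.340",
    "mem/ctrl/req slack -0.045",
]
print(to_html(mine_report(report)))
```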

Keywords: VLSI design, data mining, big data, HTML forms, web, EDA, timing, noise

Procedia PDF Downloads 251
24407 Introduction of Electronic Health Records to Improve Data Quality in Emergency Department Operations

Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe

Abstract:

In its simplest form, data quality can be defined as 'fitness for use', and it is a multi-dimensional concept. Emergency Departments (ED) require information to treat patients and, on the other hand, are the primary source of information regarding accidents, injuries, emergencies, etc. They are also the starting point of various patient registries, databases and surveillance systems. This interventional study was carried out to improve data quality at the ED of the National Hospital of Sri Lanka (NHSL), the premier trauma care centre in Sri Lanka, by introducing an e-health solution. The study consisted of three components. A research study was conducted to assess the quality of data in relation to five selected dimensions of data quality, namely accuracy, completeness, timeliness, legibility and reliability. The intervention was to develop and deploy an electronic emergency department information system (eEDIS). Post-intervention assessment confirmed that all five dimensions of data quality had improved, with the most significant improvements in the accuracy and timeliness dimensions.

Keywords: electronic health records, electronic emergency department information system, emergency department, data quality

Procedia PDF Downloads 267
24406 Data Presentation of Lane-Changing Events Trajectories Using HighD Dataset

Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba

Abstract:

We present a descriptive analysis of lane-changing event data on multi-lane roads. The data come from the Highway Drone Dataset (HighD), which contains microscopic vehicle trajectories recorded on highways. This paper describes and analyses the role of the different parameters and their significance. Using the HighD data, we aim to find the most frequent reasons that motivate drivers to change lanes. We used the programming language R for processing these data. We analyze the involvement and relationships of the variables describing the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration. This was studied according to the class of the vehicle (car or truck) and according to the maneuver it undertook (overtaking or falling back).
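
A Python equivalent of the kind of event extraction described (the authors used R) might look like the sketch below. The column names follow the published HighD track-file convention, but the miniature data frame here is synthetic.

```python
import pandas as pd

# Synthetic stand-in for a HighD track file; real data are read from the
# released CSV files (columns such as id, frame, laneId, xVelocity, dhw, thw).
tracks = pd.DataFrame({
    "id":        [1, 1, 1, 2, 2, 2],
    "frame":     [1, 2, 3, 1, 2, 3],
    "laneId":    [2, 2, 3, 4, 4, 4],   # vehicle 1 changes lane at frame 3
    "xVelocity": [31.0, 31.2, 31.5, 27.0, 27.1, 27.0],
    "dhw":       [45.0, 44.0, 60.0, 80.0, 79.0, 78.0],   # distance headway (m)
    "thw":       [1.4, 1.4, 1.9, 3.0, 2.9, 2.9],         # time headway (s)
})

# A lane change is a frame where a vehicle's laneId differs from its own
# previous frame; the surrounding-vehicle variables at that frame describe
# the conditions that motivated the maneuver.
tracks = tracks.sort_values(["id", "frame"])
tracks["lane_change"] = tracks.groupby("id")["laneId"].diff().fillna(0).ne(0)
events = tracks[tracks["lane_change"]]
print(events[["id", "frame", "xVelocity", "dhw", "thw"]])
```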

Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process

Procedia PDF Downloads 250
24405 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photons Beams in Varian Linear Accelerator

Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain

Abstract:

Objective: The main purpose of this study is to compare the percent depth dose (PDD) and the in-plane and cross-plane profiles of the Varian golden beam data with measured data for 6 and 18 MV photons, for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires an extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles and dose rate tables for open and wedged fields into the treatment planning system, enabling it to calculate the MUs and dose distribution. Varian offers a generic set of beam data as reference data, which is, however, not recommended for clinical use. In this study, we compared the generic beam data with measured beam data to evaluate the reliability of the generic beam data for clinical purposes. Methods and Material: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian's algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a semi-flex ion chamber and MEPHYSTO software. The Varian golden beam data available online were compared with the measured data to evaluate their accuracy for the commissioning of the Eclipse treatment planning system. Results: The deviation between measured and golden beam data was at most 2%. In the PDDs, the deviation increases at greater depths compared to shallower depths. Similarly, the profiles show the same trend of increasing deviation at large field sizes and increasing depths. Conclusion: The study shows that the percentage deviation between measured and golden beam data is within the acceptable tolerance, and the golden beam data can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.
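
The acceptance check itself is simple; a Python sketch with illustrative (not real) PDD values is given below.

```python
import numpy as np

def percent_deviation(measured, golden):
    """Pointwise percentage deviation of measured PDD values from the
    golden beam data, normalized to the golden value at each depth."""
    measured = np.asarray(measured, dtype=float)
    golden = np.asarray(golden, dtype=float)
    return 100.0 * (measured - golden) / golden

# Illustrative values only (not real beam data): PDD at four depths.
depths = np.array([5, 10, 15, 20])              # cm
measured = np.array([86.1, 67.0, 51.9, 40.2])   # % of maximum dose
golden = np.array([86.0, 67.3, 52.4, 40.8])     # % of maximum dose
dev = percent_deviation(measured, golden)
print(dev, "max |dev| = %.2f%%" % np.abs(dev).max())  # check against 2% tolerance
```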

Keywords: percent depth dose, flatness, symmetry, golden beam data

Procedia PDF Downloads 483
24404 Variable-Fidelity Surrogate Modelling with Kriging

Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans

Abstract:

Variable-fidelity surrogate modelling offers an efficient way to approximate function data available in multiple degrees of accuracy, each with varying computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, enhanced with gradient data of different degrees of accuracy, are constructed. These gradient-enhanced Kriging surrogate models are then strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data are useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile to incorporate gradient data of varying degrees of accuracy.
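
A stripped-down two-fidelity version of the recursive idea, without the gradient enhancement, can be sketched in Python as below. The test function pair, the kernel and the fixed scaling factor are standard illustrative choices, not the paper's setup.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Two-fidelity toy problem: a cheap low-fidelity approximation of an
# expensive high-fidelity function (a common multi-fidelity test case).
def f_hi(x): return (6 * x - 2) ** 2 * np.sin(12 * x - 4)
def f_lo(x): return 0.5 * f_hi(x) + 10 * (x - 0.5) - 5

X_lo = np.linspace(0, 1, 11)[:, None]           # many cheap samples
X_hi = np.array([[0.0], [0.4], [0.6], [1.0]])   # few expensive samples

gp_lo = GaussianProcessRegressor(kernel=RBF(0.2)).fit(X_lo, f_lo(X_lo).ravel())

# Recursive construction: model the high-fidelity data as a scaled
# low-fidelity prediction plus a GP on the residual. In CoKriging the
# scaling rho is estimated; here it is fixed for simplicity.
rho = 2.0
residual = f_hi(X_hi).ravel() - rho * gp_lo.predict(X_hi)
gp_delta = GaussianProcessRegressor(kernel=RBF(0.2)).fit(X_hi, residual)

def predict_hi(X):
    return rho * gp_lo.predict(X) + gp_delta.predict(X)

X_test = np.linspace(0, 1, 5)[:, None]
print(predict_hi(X_test))
```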

Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients

Procedia PDF Downloads 549
24403 Robust Barcode Detection with Synthetic-to-Real Data Augmentation

Authors: Xiaoyan Dai, Hsieh Yisan

Abstract:

Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes deep learning-based barcode detection using synthetic-to-real data augmentation. We first augment the barcodes themselves; we then augment the images containing the barcodes to generate a large variety of data close to actual shooting environments. Comparisons with previous works and evaluations on our original data show that this approach achieves state-of-the-art performance on various real images. In addition, the system uses hybrid resolution for the barcode "scan" and is applicable to real-time applications.
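
A Python sketch of synthetic-to-real style augmentation on a stand-in barcode image is shown below. The specific transforms (perspective jitter, motion blur, illumination shift, sensor noise) and their ranges are assumptions about typical shooting conditions, not the paper's exact pipeline.

```python
import cv2
import numpy as np

def augment_barcode(img, rng):
    """Perturb a clean synthetic barcode image toward realistic
    shooting conditions; ranges are illustrative assumptions."""
    h, w = img.shape[:2]
    # random perspective distortion (camera angle)
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    jitter = (rng.uniform(-0.05, 0.05, (4, 2)) * [w, h]).astype(np.float32)
    M = cv2.getPerspectiveTransform(src, src + jitter)
    img = cv2.warpPerspective(img, M, (w, h), borderValue=255)
    # horizontal motion blur, illumination change, sensor noise
    k = int(rng.integers(3, 9))
    img = cv2.blur(img, (k, 1))
    img = cv2.convertScaleAbs(img, alpha=rng.uniform(0.6, 1.4),
                              beta=rng.uniform(-30, 30))
    noise = rng.normal(0, 5, img.shape)
    return np.clip(img.astype(float) + noise, 0, 255).astype(np.uint8)

# Stand-in "synthetic barcode": alternating vertical bars.
clean = np.tile(np.repeat([0, 255] * 20, 4)[None, :].astype(np.uint8), (80, 1))
rng = np.random.default_rng(0)
samples = [augment_barcode(clean, rng) for _ in range(16)]
```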

Keywords: barcode detection, data augmentation, deep learning, image-based processing

Procedia PDF Downloads 159
24402 Improving Machine Learning Translation of Hausa Using Named Entity Recognition

Authors: Aishatu Ibrahim Birma, Aminu Tukur, Abdulkarim Abbass Gora

Abstract:

Machine translation plays a vital role in the field of natural language processing (NLP), breaking down language barriers and enabling communication across diverse communities. In the context of Hausa, a language widely spoken in West Africa, mainly in Nigeria, effective translation systems are essential for enabling seamless communication and promoting cultural exchange. However, due to the unique linguistic characteristics of Hausa, accurate translation remains a challenging task. This research proposes an approach to improving machine translation of Hausa by integrating named entity recognition (NER) techniques. Named entities, such as person names, locations, organizations, and dates, are critical components of a language's structure and meaning. Incorporating NER into the translation process can enhance the quality and accuracy of translations by preserving the integrity of named entities, maintaining consistency when translating entities (e.g., proper names), and addressing cultural references specific to Hausa. The NER will be incorporated into neural machine translation (NMT) for Hausa-to-English translation.
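
One common way to wire NER into an NMT pipeline, masking entities with placeholders so the translator cannot corrupt them and restoring them afterwards, is sketched below in Python. Both the toy tagger and the toy translator are stand-ins, not the paper's models.

```python
import re

def tag_entities(sentence):
    """Stand-in NER: a real system would use a trained Hausa NER model.
    Here we simply treat capitalized tokens as named entities."""
    return [tok for tok in sentence.split() if tok[:1].isupper()]

def translate_with_ner(sentence, translate):
    """Mask entities with placeholders, translate, then restore them
    verbatim, so the NMT model cannot alter the named entities."""
    entities = tag_entities(sentence)
    masked = sentence
    for i, ent in enumerate(entities):
        masked = masked.replace(ent, f"<ENT{i}>", 1)
    out = translate(masked)                       # NMT sees placeholders only
    for i, ent in enumerate(entities):
        out = out.replace(f"<ENT{i}>", ent, 1)    # restore entities verbatim
    return out

# Toy "translator" that uppercases non-placeholder words, just to show the
# placeholders passing through the model untouched.
demo = lambda s: " ".join(w if re.fullmatch(r"<ENT\d+>", w) else w.upper()
                          for w in s.split())
print(translate_with_ner("Aminu ya tafi Kano jiya", demo))
```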

Keywords: machine translation, natural language processing (NLP), named entity recognition (NER), neural machine translation (NMT)

Procedia PDF Downloads 36
24401 Microseismics: Application in Hydrocarbon Reservoir Management

Authors: Rahul Kumar Singh, Apurva Sharma, Dilip Kumar Srivastava

Abstract:

The shift of interest towards unconventional exploitation of hydrocarbons has raised serious concern among environmentalists. Emerging technologies for exploiting conventional and unconventional hydrocarbon reservoirs, such as horizontal/multi-lateral drilling with subsequent hydraulic fracturing (fracking), create micro-level seismic events below the surface of the earth. Monitoring these micro-level seismic events is not possible with the conventional seismic method. To tackle this issue, a new technology, microseismic monitoring, is being widely discussed around the globe. Multiple research efforts are being carried out worldwide to establish microseismics as a new essential in the E&P industry, especially for unconventional reservoir management. Microseismic monitoring is now used for reservoir surveillance, and its best application is checking the integrity of the caprock and the containment of fluid within it. In general, however we use the monitoring of microseismic events and the understanding of stimulation effectiveness it provides, this technology offers a lot of value in terms of insight into subsurface characteristics and processes, and this makes it a genuinely useful geophysical method for the future.

Keywords: microseismic, monitoring, hydraulic fracturing or fracking, reservoir surveillance, seismic hazards

Procedia PDF Downloads 176
24400 Analysis of Delivery of Quad Play Services

Authors: Rahul Malhotra, Anurag Sharma

Abstract:

Fiber-based access networks can deliver performance that supports the increasing demand for high-speed connections. One of the new technologies that has emerged in recent years is the Passive Optical Network (PON). This paper aims to demonstrate the simultaneous delivery of triple play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to the increase in bit error rate.

Keywords: FTTH, quad play, play service, access networks, data rate

Procedia PDF Downloads 403