Search results for: image and telemetric data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27126


24816 Nazca: A Context-Based Matching Method for Searching Heterogeneous Structures

Authors: Karine B. de Oliveira, Carina F. Dorneles

Abstract:

Structure-level matching is the problem of combining elements of a structure, which can be represented as entities, classes, XML elements, web forms, and so on. It is challenging because semantically similar structures have a large number of distinct representations. This paper describes a structure-based matching method applied to searching for different representations in data sources, considering both the similarity between elements of two structures and the data source context. Using real data sources, we conducted an experimental study comparing our approach with a baseline implementation and with another important schema matching approach. We demonstrate that our proposal reaches higher precision than the baseline.
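
The abstract does not detail the matching algorithm, so the following Python fragment is only a minimal illustrative sketch (not the authors' Nazca method) of structure-level matching that combines an element-name similarity term with a crude data-source context term; the function names, weights and threshold are assumptions.

    from difflib import SequenceMatcher

    def name_similarity(a, b):
        # Normalized string similarity between two element labels.
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    def match_structures(elems_a, elems_b, context_a, context_b,
                         w_name=0.7, w_ctx=0.3, threshold=0.6):
        """Greedy structure-level matching: each element of A is paired with the
        most similar unmatched element of B if the combined score passes a threshold."""
        ctx_score = name_similarity(context_a, context_b)  # crude data-source context term
        matches, used = [], set()
        for a in elems_a:
            best, best_score = None, 0.0
            for b in elems_b:
                if b in used:
                    continue
                score = w_name * name_similarity(a, b) + w_ctx * ctx_score
                if score > best_score:
                    best, best_score = b, score
            if best is not None and best_score >= threshold:
                matches.append((a, best, round(best_score, 3)))
                used.add(best)
        return matches

    # Example: two web forms describing the same real-world entity.
    print(match_structures(
        ["first_name", "surname", "zip"],
        ["firstName", "lastName", "postalCode"],
        context_a="customer registration form", context_b="client signup form"))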

Keywords: context, data source, index, matching, search, similarity, structure

Procedia PDF Downloads 366
24815 Surgical Imaging in Ancient Egypt

Authors: Ahmed Hefny Mohamed El-Badwy

Abstract:

This research aims to study surgical science and imaging in ancient Egypt: how surgical cases, whether caused by injuries or by diseases requiring surgical intervention, were diagnosed and how they were treated. The ancient Egyptian physicians tried to move away from magical and theological thinking towards a stand-alone experimental science; they were able to distinguish between diseases and divided them into internal and external diseases, a division that exists to date in modern medicine. There is no evidence of the extent of prehistoric knowledge of medicine and surgery except skeletons, yet it is not unlikely that humans in those times were familiar with some means of treatment. Surgery in the Stone Age was rudimentary: flint, trimmed in a certain way, was used as a lancet to slit and open the skin, and wooden tree branches were used to make splints to treat bone fractures. Surgery developed further when copper was discovered, which contributed to the advancement of Egyptian civilization; more advanced tools then appeared in the operating theater, like the knife or scalpel, and there is evidence of surgery performed in ancient Egypt during the dynastic period (3200-323 BC). The climate and environmental conditions have preserved medical papyri and human remains that confirm their knowledge of surgical methods, including sedation. The ancient Egyptians attained great skill in surgery, as evidenced by scenes that depict the pathological condition and the surgical process, but an image alone is not sufficient to prove a pathology, its presence in ancient Egypt and its treatment method. There are, however, a number of medical papyri, especially the Edwin Smith and Ebers papyri, which prove the ancient Egyptian surgeon's knowledge of pathological conditions requiring surgical intervention; otherwise, their diagnosis and methods of treatment would not have been described with such accuracy in these texts. Some surgeries are described in the surgical section of the Ebers papyrus (recipes 863 to 877). The level of surgery in ancient Egypt was high, and operations such as those for hernias and aneurysms were performed; however, no lengthy explanation of the various surgeries has come down to us, and the surgeon usually only noted "treated surgically". It is evident from the Ebers papyrus that they used sharp surgical tools and cautery in operations where bleeding was expected, such as those for hernias, arterial sacs and tumors.

Keywords: ancient egypt, egypt, archaeology, the ancient egyptian

Procedia PDF Downloads 70
24814 Spatially Random Sampling for Retail Food Risk Factors Study

Authors: Guilan Huang

Abstract:

In 2013 and 2014, the U.S. Food and Drug Administration (FDA) collected data from selected fast food restaurants and full-service restaurants to track changes in the occurrence of foodborne illness risk factors. This paper discusses how we customized a spatially random sampling method by considering the financial position and availability of FDA resources, and how we enriched the restaurant data with location information. Location information of restaurants provides the opportunity to quantitatively determine random samples within non-governmental units (e.g., 240 kilometers around each data collector). Spatial analysis could also optimize data collectors' work plans and resource allocation. A spatial analytics and processing platform helped us handle the spatially random sampling challenges. Our method fits the FDA's need to pinpoint features of foodservice establishments and reduced both the time and the expense of data collection.
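
As a rough illustration of spatially constrained random sampling (not the FDA study's actual procedure), the sketch below keeps only establishments within a fixed radius of a data collector before drawing a random sample; the 240 km radius follows the abstract, everything else (field names, sample size) is assumed.

    import math, random

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance in kilometers.
        r = 6371.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def sample_restaurants(restaurants, collectors, radius_km=240.0, n=50, seed=42):
        """Spatially constrained random sample: only restaurants within radius_km
        of at least one data collector are eligible for selection."""
        eligible = [r for r in restaurants
                    if any(haversine_km(r["lat"], r["lon"], c["lat"], c["lon"]) <= radius_km
                           for c in collectors)]
        random.seed(seed)
        return random.sample(eligible, min(n, len(eligible)))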

Keywords: geospatial technology, restaurant, retail food risk factor study, spatially random sampling

Procedia PDF Downloads 352
24813 Automatic MC/DC Test Data Generation from Software Module Description

Authors: Sekou Kangoye, Alexis Todoskoff, Mihaela Barreau

Abstract:

Modified Condition/Decision Coverage (MC/DC) is a structural coverage criterion that is highly recommended or required for safety-critical software. Many testing standards therefore include this criterion and require it to be satisfied at particular levels of testing (e.g. validation and unit levels). However, a significant amount of time is needed to meet those requirements. In this paper, we propose to automate MC/DC test data generation. We present an approach to automatically generate MC/DC test data from software module descriptions written in a dedicated language. We introduce a new merging approach that provides high MC/DC coverage of the description with only a small number of test cases.
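
The paper's merging approach is not described in the abstract; the following sketch only illustrates the core MC/DC notion it builds on: finding, for each condition of a decision, a pair of test vectors that differ in that condition alone and flip the outcome. Selecting a minimal covering set of such pairs (what the authors automate from module descriptions) is the harder step and is not shown.

    from itertools import product

    def mcdc_pairs(decision, n_conditions):
        """For each condition, find a pair of input vectors that differ only in that
        condition and flip the decision outcome (independent-effect pairs)."""
        vectors = list(product([False, True], repeat=n_conditions))
        pairs = {}
        for i in range(n_conditions):
            for v in vectors:
                w = list(v); w[i] = not w[i]; w = tuple(w)
                if i not in pairs and decision(*v) != decision(*w):
                    pairs[i] = (v, w)
        return pairs

    # Example decision with three conditions: (a and b) or c
    decision = lambda a, b, c: (a and b) or c
    for cond, (v, w) in mcdc_pairs(decision, 3).items():
        print(f"condition {cond}: {v} vs {w}")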

Keywords: domain-specific language, MC/DC, test data generation, safety-critical software coverage

Procedia PDF Downloads 448
24812 Blockchain-Based Approach on Security Enhancement of Distributed System in Healthcare Sector

Authors: Loong Qing Zhe, Foo Jing Heng

Abstract:

A variety of data files are now available on the internet due to the advancement of technology across the globe. As more and more data are uploaded to the internet, people are becoming increasingly concerned that their private data, particularly medical health records, may be compromised and sold to others for money. Hence, the accessibility and confidentiality of patients' medical records have to be protected through electronic means. Blockchain technology is introduced to offer patients security against adversaries or unauthorised parties. In the blockchain network, only authorised personnel or organisations that have been validated as nodes may share information and data. For any change within the network, including adding a new block or modifying existing information about a block, a two-thirds majority vote is required to confirm its legitimacy. Additionally, a consortium permissioned blockchain connects all the entities within the same community, so all medical data in the network can be safely shared with all authorised entities, and synchronization can be performed within the cloud since the data are handled in real time. This paper discusses an efficient method for storing and sharing electronic health records (EHRs). It also examines the framework of roles within the blockchain and proposes a new approach to maintaining EHRs with keyword indexes for searching patients' medical records while ensuring data privacy.
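
For illustration only, the sketch below shows the two ideas the abstract highlights: hash-chained blocks and a two-thirds approval vote before a block is appended. It is a toy model, not the authors' consortium-blockchain implementation, and the function and field names are assumptions.

    import hashlib, json, time

    def block_hash(block):
        # Deterministic hash over the block contents (excluding the hash field itself).
        payload = {k: v for k, v in block.items() if k != "hash"}
        return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

    def propose_block(chain, ehr_record, votes, validators):
        """Append a new block holding an EHR reference only if at least two-thirds
        of the authorised validator nodes approve it."""
        required = (2 * len(validators) + 2) // 3   # ceil(2/3 * N) approvals
        if len(votes) < required:
            raise PermissionError("less than two-thirds of validators approved the block")
        block = {"index": len(chain),
                 "timestamp": time.time(),
                 "record": ehr_record,   # in practice: an encrypted pointer, not raw data
                 "prev_hash": chain[-1]["hash"] if chain else "0" * 64}
        block["hash"] = block_hash(block)
        chain.append(block)
        return block

    chain = []
    propose_block(chain, {"patient": "P-001", "index_keywords": ["diabetes"]},
                  votes={"hospital_A", "lab_B"}, validators=["hospital_A", "lab_B", "clinic_C"])
    print(chain[0]["hash"][:16], "...")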

Keywords: healthcare sectors, distributed system, blockchain, electronic health records (EHR)

Procedia PDF Downloads 196
24811 Demographic Factors Influencing Employees’ Salary Expectations and Labor Turnover

Authors: M. Osipova

Abstract:

Thanks to the development of information technologies, every sphere of the economy is becoming more and more data-centric, as people generate huge datasets containing information on every aspect of their lives. Applying analysis of such data to human resources management yields otherwise scarce statistics on the state of the labor market, including salary expectations and the typical career behavior of potential employees, and this information can become a reliable basis for management decisions. The following article presents the results of career behavior research based on freely accessible resume data. The information used in the study is much broader than what is usually collected in human resources surveys, so there is enough data for statistically significant results even in subgroup analyses.

Keywords: human resources management, salary expectations, statistics, turnover

Procedia PDF Downloads 357
24810 Modelling High Strain Rate Tear Open Behavior of a Bilaminate Consisting of Foam and Plastic Skin Considering Tensile Failure and Compression

Authors: Laura Pytel, Georg Baumann, Gregor Gstrein, Corina Klug

Abstract:

Premium cars often coat the instrument panel with a bilaminate consisting of a soft foam and a plastic skin. This coating is torn open during passenger airbag deployment under high strain rates. Characterizing and simulating the top coat layer is crucial for predicting the attenuation that delays airbag deployment, which affects the design of the restraint system, and for reducing the need to adjust simulations through expensive physical component testing. Up to now, bilaminates used in cars have been modelled either with a two-dimensional shell formulation for the whole coating system, which misses the interaction of the two layers, or by combining a three-dimensional foam layer with a two-dimensional skin layer while omitting the foam in significant regions such as the expected tear line area and the hinge where high compression is expected. In both cases, the properties of the coating that cause the attenuation are not considered. Further, the currently available material information, such as the failure dependencies of the two layers, is insufficient, and existing data cover strain rates of only up to 200 1/s. The velocity of the passenger airbag flap during an airbag shot has been measured at about 11.5 m/s during first ripping; the digital image correlation evaluation showed resulting strain rates above 1500 1/s. This paper provides a high strain rate material characterization of a bilaminate consisting of a thin polypropylene foam and a thermoplastic olefin (TPO) skin and the creation of validated material models. With the help of a split Hopkinson tension bar, strain rates of 1500 1/s were within reach. The experimental data were used to calibrate and validate a more physical modelling approach of the forced ripping of the bilaminate. In the presented model, the three-dimensional foam layer is continuously tied to the two-dimensional skin layer, allowing failure in both layers at any possible position. The simulation results show better agreement in terms of the trajectory of the flaps and their velocity during ripping. The resulting attenuation of the airbag deployment, measured by the contact force between airbag and flaps, increases and provides usable data for dimensioning modules of an airbag system.

Keywords: bilaminate ripping behavior, high strain rate material characterization and modelling, induced material failure, TPO and foam

Procedia PDF Downloads 73
24809 Multi-Dimensional Experience of Processing Textual and Visual Information: Case Study of Allocations to Places in the Mind’s Eye Based on Individual’s Semantic Knowledge Base

Authors: Joanna Wielochowska, Aneta Wielochowska

Abstract:

Whilst the relationship between scientific areas such as cognitive psychology, neurobiology and philosophy of mind has been emphasized in recent decades of scientific research, concepts and discoveries made in these fields overlap and complement each other in their quest for answers to similar questions. The objective of the following case study is to describe, analyze and illustrate the nature and characteristics of a certain cognitive experience which appears to display features of synaesthesia, or rather high-level synaesthesia (ideasthesia). The research was conducted on two subjects, the authors, monozygotic twins (both polysynaesthetes) experiencing involuntary associations of identical nature. The authors attempted to identify which cognitive and conceptual dependencies may guide this experience. Operating on self-introduced nomenclature, the described phenomenon, multi-dimensional processing of textual and visual information, denotes a relationship that involuntarily and immediately couples content introduced by means of text or image with a sensation of appearing in a certain place in the mind's eye. More precisely: (I) defining a concept introduced by means of textual content during the activity of reading or writing, or (II) defining a concept introduced by means of visual content during the activity of looking at image(s), is accompanied by a simultaneous sensation of being allocated to a given place in the mind's eye. A place can then be defined as a cognitive representation of a certain concept. During the activity of processing information, a person has an immediate and involuntary feeling of appearing in a certain place themselves, just like a character in a story, 'observing' a venue or scenery from one or more perspectives and angles. That forms a unique and unified experience, constituting a background mental landscape of the text or image being looked at. We came to the conclusion that semantic allocations to a given place can be divided and classified into categories and subcategories and are naturally linked with an individual's semantic knowledge base. A place can be defined as a representation of one's unique idea of a given concept that has been established in their semantic knowledge base. A multi-level structure of selectivity of places in the mind's eye, as a reaction to given information (one stimulus), invites comparison with structures and patterns found in botany. Double-flowered varieties of flowers and the whorl arrangement characteristic of the components of some flower species were given as illustrative examples. A composition of petals that fan out from a single point and wrap around a stem inspired the idea that, just as in nature, there are patterns in the philosophy of mind driven by the logic specific to a given phenomenon. The study intertwines terms perceived through the philosophical lens, such as the definition of meaning, the subjectivity of meaning, the mental atmosphere of places, and others. Analysis of this rare experience aims to contribute to the constantly developing theoretical framework of the philosophy of mind and to influence the way the human semantic knowledge base, and the processing of content in terms of distinguishing between information and meaning, are researched.

Keywords: information and meaning, information processing, mental atmosphere of places, patterns in nature, philosophy of mind, selectivity, semantic knowledge base, senses, synaesthesia

Procedia PDF Downloads 130
24808 Exploring Electroactive Polymers for Dynamic Data Physicalization

Authors: Joanna Dauner, Jan Friedrich, Linda Elsner, Kora Kimpel

Abstract:

Active materials such as electroactive polymers (EAPs) are promising for the development of novel shape-changing interfaces. This paper explores the potential of EAPs in a multilayer unimorph structure from a design perspective to investigate the visual qualities of the material for dynamic data visualization and data physicalization. We discuss various concepts of how the material can be used for this purpose. Multilayer unimorph EAPs are of particular interest to designers because they can be easily prototyped using everyday materials and tools. By changing the structure and geometry of the EAPs, their movement and behavior can be modified. We present the results of our preliminary user testing, in which we evaluated different movement patterns. As a result, we introduce a prototype display built with EAPs for dynamic data physicalization. Finally, we discuss the potential and drawbacks and identify further open research questions for the design discipline.

Keywords: electroactive polymer, shape-changing interfaces, smart material interfaces, data physicalization

Procedia PDF Downloads 104
24807 Surgical Imaging in Ancient Egypt

Authors: Haitham Nabil Zaghlol Hasan

Abstract:

This research aims to study surgical science and imaging in ancient Egypt: how surgical cases, whether caused by injuries or by diseases requiring surgical intervention, were diagnosed and how they were treated. The ancient Egyptian physicians tried to move away from magical and theological thinking towards a stand-alone experimental science; they were able to distinguish between diseases and divided them into internal and external diseases, a division that exists to date in modern medicine. There is no evidence of the extent of prehistoric knowledge of medicine and surgery except skeletons, yet it is not unlikely that humans in those times were familiar with some means of treatment. Surgery in the Stone Age was rudimentary: flint, trimmed in a certain way, was used as a lancet to slit and open the skin, and wooden tree branches were used to make splints to treat bone fractures. Surgery developed further when copper was discovered, which contributed to the advancement of Egyptian civilization; more advanced tools then appeared in the operating theater, like the knife or scalpel, and there is evidence of surgery performed in ancient Egypt during the dynastic period (3200-323 BC). The climate and environmental conditions have preserved medical papyri and human remains that confirm their knowledge of surgical methods, including sedation. The ancient Egyptians attained great skill in surgery, as evidenced by scenes that depict the pathological condition and the surgical process, but an image alone is not sufficient to prove a pathology, its presence in ancient Egypt and its treatment method. There are, however, a number of medical papyri, especially the Edwin Smith and Ebers papyri, which prove the ancient Egyptian surgeon's knowledge of pathological conditions requiring surgical intervention; otherwise, their diagnosis and methods of treatment would not have been described with such accuracy in these texts. Some surgeries are described in the surgical section of the Ebers papyrus (recipes 863 to 877). The level of surgery in ancient Egypt was high, and operations such as those for hernias and aneurysms were performed; however, no lengthy explanation of the various surgeries has come down to us, and the surgeon usually only noted: "treated surgically". It is evident from the Ebers papyrus that they used sharp surgical tools and cautery in operations where bleeding was expected, such as those for hernias, arterial sacs and tumors.

Keywords: egypt, ancient_egypt, civilization, archaeology

Procedia PDF Downloads 71
24806 Research and Implementation of Cross-domain Data Sharing System in Net-centric Environment

Authors: Xiaoqing Wang, Jianjian Zong, Li Li, Yanxing Zheng, Jinrong Tong, Mao Zhan

Abstract:

With the rapid development of network and communication technology, a great deal of data has been generated in different domains of a network. These data show a trend of increasing scale and more complex structure. Therefore, an effective and flexible cross-domain data-sharing system is needed. The Cross-domain Data Sharing System (CDSS) in a net-centric environment is composed of three sub-systems. The data distribution sub-system provides a data exchange service through publish-subscribe technology that supports asynchronous and many-to-many communication, which adapts to the needs of a dynamic and large-scale distributed computing environment. The access control sub-system adopts Attribute-Based Access Control (ABAC) technology to uniformly model various data attributes such as subject, object, permission and environment; it effectively monitors the activities of users accessing resources and ensures that legitimate users obtain effective access rights within a valid time period. The cross-domain access security negotiation sub-system automatically determines the access rights between different security domains during the interactive disclosure of digital certificates and access control policies, using trust policy management and negotiation algorithms, which provides an effective means for establishing cross-domain trust relationships and access control in a distributed environment. The CDSS's asynchronous, many-to-many and loosely coupled communication features can adapt well to data exchange and sharing in dynamic, distributed and large-scale network environments. In future work, we will extend the CDSS with new features to support mobile computing environments.
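
As an illustration of the ABAC idea used by the access control sub-system (not the CDSS implementation itself), the following sketch evaluates a request against policies expressed over subject, object, action and environment attributes; the attribute names and the working-hours rule are invented for the example.

    def abac_permit(request, policies):
        """Attribute-Based Access Control: grant access only if some policy's
        subject, object, action and environment conditions all hold."""
        for policy in policies:
            if all(request["subject"].get(k) == v for k, v in policy["subject"].items()) and \
               all(request["object"].get(k) == v for k, v in policy["object"].items()) and \
               request["action"] in policy["actions"] and \
               policy["env"](request["env"]):
                return True
        return False

    policies = [{
        "subject": {"domain": "domain-A", "role": "analyst"},
        "object": {"classification": "shared"},
        "actions": {"read"},
        "env": lambda env: env["hour"] in range(8, 18),   # access only during working hours
    }]
    request = {"subject": {"domain": "domain-A", "role": "analyst"},
               "object": {"classification": "shared"},
               "action": "read",
               "env": {"hour": 10}}
    print(abac_permit(request, policies))  # True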

Keywords: data sharing, cross-domain, data exchange, publish-subscribe

Procedia PDF Downloads 128
24805 Routing Protocol in Ship Dynamic Positioning Based on WSN Clustering Data Fusion System

Authors: Zhou Mo, Dennis Chow

Abstract:

In the dynamic positioning system (DPS) for vessels, reliable information transmission between nodes basically relies on wireless protocols. From the perspective of cluster-based routing protocols for wireless sensor networks, a data fusion technology based on a sleep scheduling mechanism and the remaining energy in the network layer is proposed. It applies the sleep scheduling mechanism to the routing protocols and considers the remaining energy and location information of each node when selecting the cluster head. The problem of uneven distribution of nodes across clusters is solved by an equilibrium mechanism. At the same time, a classified forwarding mechanism as well as a redelivery policy are adopted to avoid congestion in the transmission of huge amounts of data, reduce the delay in data delivery and enhance the real-time response. In this paper, a simulation test is conducted on the improved routing protocols, which turn out to reduce the energy consumption of nodes and increase the efficiency of data delivery.
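
The abstract does not specify the election formula, so the following is only a simplified LEACH-style sketch of cluster-head selection biased by residual energy and distance to the sink; the weighting and probability p are assumptions, and the sleep scheduling, classified forwarding and redelivery mechanisms are not shown.

    import math, random

    def elect_cluster_heads(nodes, sink, p=0.1, seed=1):
        """LEACH-style cluster-head election biased by residual energy and distance
        to the sink: nodes with more energy and closer to the sink are favoured."""
        random.seed(seed)
        e_max = max(n["energy"] for n in nodes)
        d_max = max(math.dist((n["x"], n["y"]), sink) for n in nodes)
        heads = []
        for n in nodes:
            d = math.dist((n["x"], n["y"]), sink)
            weight = (n["energy"] / e_max) * (1.0 - 0.5 * d / d_max)  # assumed weighting
            if random.random() < p * weight:
                heads.append(n["id"])
        return heads

    nodes = [{"id": i, "x": random.uniform(0, 100), "y": random.uniform(0, 100),
              "energy": random.uniform(0.2, 1.0)} for i in range(50)]
    print(elect_cluster_heads(nodes, sink=(50.0, 50.0)))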

Keywords: DPS for vessel, wireless sensor network, data fusion, routing protocols

Procedia PDF Downloads 530
24804 Advanced Data Visualization Techniques for Effective Decision-making in Oil and Gas Exploration and Production

Authors: Deepak Singh, Rail Kuliev

Abstract:

This research article explores the significance of advanced data visualization techniques in enhancing decision-making processes within the oil and gas exploration and production domain. With the oil and gas industry facing numerous challenges, effective interpretation and analysis of vast and diverse datasets are crucial for optimizing exploration strategies, production operations, and risk assessment. The article highlights the importance of data visualization in managing big data, aiding the decision-making process, and facilitating communication with stakeholders. Various advanced data visualization techniques, including 3D visualization, augmented reality (AR), virtual reality (VR), interactive dashboards, and geospatial visualization, are discussed in detail, showcasing their applications and benefits in the oil and gas sector. The article presents case studies demonstrating the successful use of these techniques in optimizing well placement, real-time operations monitoring, and virtual reality training. Additionally, the article addresses the challenges of data integration and scalability, emphasizing the need for future developments in AI-driven visualization. In conclusion, this research emphasizes the immense potential of advanced data visualization in revolutionizing decision-making processes, fostering data-driven strategies, and promoting sustainable growth and improved operational efficiency within the oil and gas exploration and production industry.

Keywords: augmented reality (AR), virtual reality (VR), interactive dashboards, real-time operations monitoring

Procedia PDF Downloads 93
24803 Object-Scene: Deep Convolutional Representation for Scene Classification

Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang

Abstract:

Traditional image classification is based on an encoding scheme (e.g. Fisher Vector, Vector of Locally Aggregated Descriptors) with low-level image features (e.g. SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNNs) carry richer information but lack geometric invariance. In scene classification, scenes contain scattered objects of different sizes, categories, layouts, numbers and so on, so it is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit object-centric and scene-centric information, two CNNs trained separately on the ImageNet and Places datasets are used as pre-trained models to extract deep convolutional features at multiple scales, producing dense local activations. By analyzing the performance of different CNNs at multiple scales, we find that each CNN works better in a different scale range. A scale-wise CNN adaptation is reasonable since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, and these are then merged into a single vector using a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of first- and second-order differences; hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different numbers of features are extracted at each scale. Third, the Fisher Vector representation based on the deep convolutional features is fed to a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories. Experimental results show that scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which shows that the representation can be applied to other visual recognition tasks.
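
A compact sketch of the encoding step described above, a per-scale Fisher Vector followed by scale-wise L2 normalization and average pooling, is given below. It uses a diagonal GMM from scikit-learn, keeps only first-order terms for brevity, and feeds random arrays in place of real multi-scale CNN activations, so it illustrates the pipeline rather than reproduces the paper's results.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    def fisher_vector(descriptors, gmm):
        """Simplified Fisher Vector: soft-assignment-weighted first-order residuals
        to each GMM component (second-order terms omitted for brevity)."""
        q = gmm.predict_proba(descriptors)                        # (n, K) posteriors
        diff = descriptors[:, None, :] - gmm.means_[None, :, :]   # (n, K, d) residuals
        fv = (q[:, :, None] * diff / np.sqrt(gmm.covariances_)[None]).mean(axis=0)
        return fv.ravel()

    def encode_image(features_per_scale, gmm):
        """Scale-wise normalization: build one Fisher Vector per scale, L2-normalize
        each, then average-pool across scales into a single representation."""
        fvs = []
        for feats in features_per_scale:          # list of (n_i, d) arrays, one per scale
            fv = fisher_vector(feats, gmm)
            fvs.append(fv / (np.linalg.norm(fv) + 1e-12))
        return np.mean(fvs, axis=0)

    # Toy example with random "deep convolutional features" at three scales.
    rng = np.random.default_rng(0)
    gmm = GaussianMixture(n_components=4, covariance_type="diag", random_state=0)
    gmm.fit(rng.normal(size=(500, 64)))
    image_repr = encode_image([rng.normal(size=(n, 64)) for n in (36, 100, 196)], gmm)
    print(image_repr.shape)   # (4 * 64,) = (256,) vector, ready for a linear SVM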

Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization

Procedia PDF Downloads 335
24802 The Data Quality Model for the IoT based Real-time Water Quality Monitoring Sensors

Authors: Rabbia Idrees, Ananda Maiti, Saurabh Garg, Muhammad Bilal Amin

Abstract:

IoT devices are the basic building blocks of an IoT network and generate an enormous volume of real-time, high-speed data that helps organizations and companies make intelligent decisions. Integrating this enormous amount of data from multiple sources and transferring it to the appropriate client is fundamental to IoT development. Handling this huge number of devices along with the huge volume of data is very challenging. IoT devices are battery-powered and resource-constrained; to provide energy-efficient communication, they sleep or wake up periodically and aperiodically depending on the traffic load in order to reduce energy consumption. Sometimes these devices get disconnected due to battery depletion. If a node is not available in the network, the IoT network delivers incomplete, missing, and inaccurate data. Moreover, many IoT applications, like vehicle tracking and patient tracking, require the IoT devices to be mobile. Due to this mobility, if the distance of a device from the sink node becomes greater than allowed, the connection is lost, and other devices join the network to replace the broken-down and departed devices. This makes IoT devices dynamic in nature, which brings uncertainty and unreliability into the IoT network and hence produces data of poor quality. Because of this dynamic behaviour, the actual reason for abnormal data is often unknown. If data are of poor quality, decisions are likely to be unsound. It is therefore highly important to process data and estimate data quality before using them in IoT applications. In the past, many researchers tried to estimate data quality and provided several machine learning (ML), stochastic and statistical methods to analyse stored data in the data processing layer, without focusing on the challenges and issues arising from the dynamic nature of IoT devices and how it impacts data quality. This research comprehensively reviews the impact of the dynamic nature of IoT devices on data quality and presents a data quality model that can deal with this challenge and produce good-quality data. The model targets sensors monitoring water quality. DBSCAN clustering and weather sensors are used to build the data quality model for the water quality sensors. An extensive study was carried out on the relationship between the data of the weather sensors and of the sensors monitoring the water quality of lakes and beaches, and a detailed theoretical analysis of the correlation between the two independent data streams is presented. With the help of this analysis and DBSCAN, a data quality model is prepared. The model encompasses five dimensions of data quality: it detects and removes outliers, assesses completeness and the patterns of missing values, and checks the accuracy of the data with the help of cluster positions. Finally, a statistical analysis is performed on the clusters formed by DBSCAN, and consistency is evaluated through the coefficient of variation (CoV).
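
To make the outlier-detection and consistency dimensions concrete, the sketch below clusters a joint water-quality/weather stream with DBSCAN, flags noise points (label -1) as outliers and reports the coefficient of variation of the cleaned stream. The data, eps and min_samples values are invented; the completeness and missing-pattern dimensions of the proposed model are not covered here.

    import numpy as np
    from sklearn.cluster import DBSCAN
    from sklearn.preprocessing import StandardScaler

    def flag_outliers(water, weather, eps=0.8, min_samples=10):
        """Cluster joint water-quality and weather readings with DBSCAN;
        points labelled -1 fall in no dense cluster and are flagged as outliers."""
        X = StandardScaler().fit_transform(np.column_stack([water, weather]))
        labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(X)
        return labels == -1, labels

    rng = np.random.default_rng(0)
    temperature = rng.normal(20, 2, 500)                       # weather stream (assumed)
    turbidity = 0.5 * temperature + rng.normal(0, 0.5, 500)    # correlated water-quality stream
    turbidity[::97] += 8                                       # inject a few faulty readings
    is_outlier, labels = flag_outliers(turbidity, temperature)
    cov = np.std(turbidity[~is_outlier]) / np.mean(turbidity[~is_outlier])  # coefficient of variation
    print(is_outlier.sum(), "outliers flagged; CoV of cleaned stream:", round(cov, 3))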

Keywords: clustering, data quality, DBSCAN, Internet of Things (IoT)

Procedia PDF Downloads 144
24801 Riesz Mixture Model for Brain Tumor Detection

Authors: Mouna Zitouni, Mariem Tounsi

Abstract:

This research introduces an application of the Riesz mixture model to medical image segmentation for the accurate diagnosis and treatment of brain tumors. We propose a pixel classification technique based on the Riesz distribution, derived from an extended Bartlett decomposition. To our knowledge, this is the first study addressing this approach. The Expectation-Maximization algorithm is implemented for parameter estimation. A comparative analysis, using both synthetic and real brain images, demonstrates the superiority of the Riesz model over a recent method based on the Wishart distribution.

Keywords: EM algorithm, segmentation, Riesz probability distribution, Wishart probability distribution

Procedia PDF Downloads 27
24800 The Impact of Social Customer Relationship Management on Brand Loyalty and Reducing Co-Destruction of Value by Customers

Authors: Sanaz Farhangi, Habib Alipour

Abstract:

The main objective of this paper is to explore how social media, as a critical platform, can increase the interactions between the tourism sector and its stakeholders. Nowadays, human interactions through social media in many areas, especially in tourism, provide various experiences and information that users share and discuss. Organizations and firms can gain customer loyalty through social media platforms even when consumers hold a negative image of the product or services. Such a negative image can be reduced through constant communication between producers and consumers, especially with the availability of new technology. Therefore, effective management of customer relationships on social media creates an extraordinary opportunity for organizations to enhance value and brand loyalty. In this study, we seek to develop a conceptual model addressing how factors such as social media, SCRM, and customer engagement affect brand loyalty and diminish co-destruction. To support this model, we scanned the relevant literature using a comprehensive set of concepts from marketing and customer relationship management. This allows us to explore whether there is any relationship between social media, customer engagement, social customer relationship management (SCRM), co-destruction, and brand loyalty. SCRM is explored as a moderating factor in the relationship between customer engagement and social media to secure brand loyalty and diminish co-destruction of the company's value. Although numerous studies have been conducted on the impact of social media on customers and marketing behavior, there are few studies investigating the relationship between SCRM, brand loyalty, and negative e-WOM, which results in the reduction of the co-destruction of value by customers. This study is an important contribution to the tourism and hospitality industry in orienting customer behavior on social media using SCRM. The study revealed that, through social media platforms, management can generate discussion and engagement about the product and services, which helps customers feel positive towards the firm and its product. The study also revealed that customers' complaints through social media have a multi-purpose effect: they can degrade the value of the product, but at the same time they motivate the firm to overcome its weaknesses and correct its shortcomings. The study also has implications for managers and practitioners, especially in the tourism and hospitality sector. Future research directions and limitations of the research are also discussed.

Keywords: brand loyalty, co-destruction, customer engagement, SCRM, tourism and hospitality

Procedia PDF Downloads 119
24799 New Security Approach of Confidential Resources in Hybrid Clouds

Authors: Haythem Yahyaoui, Samir Moalla, Mounir Bouden, Skander Ghorbel

Abstract:

Nowadays, cloud environments are becoming a necessity for companies. This new technology gives the opportunity to access data anywhere and at any time, provides optimized and secured access to resources, and gives more security for the data stored in the platform. However, some companies do not trust cloud providers; in their view, providers can access and modify confidential data such as bank accounts. Much work has been done in this context, concluding that encryption performed by providers ensures confidentiality, yet overlooking the fact that cloud providers can also decrypt the confidential resources. The best solution is therefore to apply some modifications to the data before sending them to the cloud, with the objective of making them unreadable to the provider. This work aims at enhancing the quality of service of providers and improving the trust of customers.
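
The abstract leaves the data transformation unspecified; purely as an illustration of making data unreadable before it reaches the provider, the snippet below applies client-side symmetric encryption with the third-party cryptography library. This is an assumed technique, not the authors' method.

    from cryptography.fernet import Fernet

    # Client-side encryption: the key never leaves the customer's premises, so the
    # cloud provider stores only unreadable ciphertext.
    key = Fernet.generate_key()        # keep this secret, e.g. in an on-premises key store
    cipher = Fernet(key)

    confidential = b"IBAN: XX00 1234 5678 9012"
    ciphertext = cipher.encrypt(confidential)   # what is actually sent to the hybrid cloud
    # ... later, after downloading the object back from the provider ...
    assert cipher.decrypt(ciphertext) == confidential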

Keywords: cloud, confidentiality, cryptography, security issues, trust issues

Procedia PDF Downloads 382
24798 Estimation of Chronic Kidney Disease Using Artificial Neural Network

Authors: Ilker Ali Ozkan

Abstract:

In this study, an artificial neural network model has been developed to estimate chronic kidney failure, which is a common disease. The patients' age, their blood and biochemical values, and various chronic diseases form the 24 inputs used for the estimation process. The input data were subjected to preprocessing because they contain both missing values and nominal values. The 147 patient records obtained after preprocessing were divided into 70% training and 30% testing data. As a result of the study, the artificial neural network model with 25 neurons in the hidden layer was found to be the model with the lowest error value. Chronic kidney failure could be estimated accurately at a rate of 99.3% using this artificial neural network model. The developed artificial neural network was found to be successful for the estimation of chronic kidney failure using clinical data.
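
A minimal reconstruction of the described setup (24 inputs, mean imputation of missing values, a 70/30 split and a single hidden layer of 25 neurons) is sketched below with scikit-learn; the data are synthetic stand-ins, so the 99.3% figure is not reproduced.

    import numpy as np
    from sklearn.impute import SimpleImputer
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Toy stand-in for the 147 preprocessed patient records with 24 inputs.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(147, 24))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic label: CKD yes/no

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    model = make_pipeline(SimpleImputer(strategy="mean"),     # missing values -> column mean
                          StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(25,), max_iter=2000, random_state=0))
    model.fit(X_train, y_train)
    print("test accuracy:", model.score(X_test, y_test))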

Keywords: estimation, artificial neural network, chronic kidney failure disease, disease diagnosis

Procedia PDF Downloads 449
24797 Impact of Map Generalization in Spatial Analysis

Authors: Lin Li, P. G. R. N. I. Pussella

Abstract:

When representing spatial data and their attributes on different types of maps, the scale plays a key role in the process of map generalization. The process consists of two main operators: selection and omission. Once data are selected, they undergo several geometric transformations such as elimination, simplification, smoothing, exaggeration, displacement, aggregation and size reduction. As a result of these operations at different levels of detail, the geometry of spatial features, such as length, sinuosity, orientation, perimeter and area, is altered. This is most severe in the preparation of small-scale maps, since the cartographer does not have enough space to represent all the features on the map. When GIS users want to analyze a set of spatial data, they typically retrieve a dataset and perform the analysis without considering very important characteristics such as the scale, the purpose of the map and the degree of generalization. Further, GIS users use and compare different maps with different degrees of generalization. Sometimes, GIS users go beyond the scale of the source map using the zoom-in facility and violate the basic cartographic rule that it is not suitable to create a larger-scale map from a smaller-scale map. The main objective of this study is to discuss the effect of map generalization on GIS analysis. Three digital maps at different scales (1:10,000, 1:50,000 and 1:250,000), prepared by the Survey Department of Sri Lanka, the national mapping agency of Sri Lanka, were used. Features common to all three maps were used, and an overlay analysis was repeated with different combinations of the data. Road, river and land-use data sets were used for the study. A simple model, finding the best place for a wildlife park, was used to identify the effects. The results show remarkable effects of the different degrees of generalization: different locations with different geometries were obtained as the outputs of the analysis. The study suggests that reasonable methods are needed to overcome this effect; as a solution, it is recommended to bring all data sets to a common scale before performing the analysis.
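
For readers who want to reproduce the kind of overlay experiment described, the toy sketch below (using geopandas/shapely, which the study itself may not have used) buffers roads and rivers, subtracts the buffers from a land-use layer and shows how the "suitable area" changes once the road geometry is simplified to mimic a generalized, smaller-scale source. All geometries and buffer distances are invented.

    import geopandas as gpd
    from shapely.geometry import LineString, Polygon

    # Toy layers standing in for the road, river and land-use data sets.
    roads = gpd.GeoDataFrame(
        geometry=[LineString([(0, 0), (2, 1), (4, 0), (6, 1), (8, 0), (10, 1)])], crs="EPSG:32644")
    rivers = gpd.GeoDataFrame(geometry=[LineString([(0, 5), (10, 5)])], crs="EPSG:32644")
    forest = gpd.GeoDataFrame({"landuse": ["forest"]},
                              geometry=[Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])],
                              crs="EPSG:32644")

    def suitable_area(roads, rivers, landuse, road_buf, river_buf):
        """Simple siting model: land-use area that lies at least road_buf from roads
        and river_buf from rivers (distance units follow the layer CRS)."""
        excl_geom = roads.buffer(road_buf).unary_union.union(rivers.buffer(river_buf).unary_union)
        excluded = gpd.GeoDataFrame(geometry=[excl_geom], crs=landuse.crs)
        return gpd.overlay(landuse, excluded, how="difference").area.sum()

    # The same analysis gives different answers when the input geometry is generalized
    # (here: the road simplified to mimic a smaller-scale source map).
    print(suitable_area(roads, rivers, forest, road_buf=1.0, river_buf=2.0))
    roads_gen = roads.copy(); roads_gen["geometry"] = roads.simplify(1.0)
    print(suitable_area(roads_gen, rivers, forest, road_buf=1.0, river_buf=2.0))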

Keywords: generalization, GIS, scales, spatial analysis

Procedia PDF Downloads 331
24796 Identity Verification Based on Multimodal Machine Learning on Red Green Blue (RGB), Red Green Blue-Depth (RGB-D), and Voice Data

Authors: Luo Jiaoyang, Yu Hongyang

Abstract:

In this paper, we experiment with a new approach to multimodal identification using RGB, RGB-D and voice data. The multimodal combination of RGB and voice data has been applied in tasks such as emotion recognition with good results and stability, and the same holds for identity recognition tasks. We believe that data from different modalities can enhance the performance of a model through mutual reinforcement. We extend the dual-modality setup to three modalities and try to improve the effectiveness of the network by increasing the number of modalities. We also implemented single-modality identification systems separately, tested the data of these different modalities under clean and noisy conditions, and compared their performance with the multimodal model. In the process of designing the multimodal model, we tried a variety of fusion strategies and finally chose the fusion method with the best performance. The experimental results show that the performance of the multimodal system is better than that of the single modalities, especially in dealing with noise, and the multimodal system achieves an average improvement of 5%.
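
The winning fusion strategy is not named in the abstract; the sketch below shows one generic option, weighted score-level fusion of per-modality match scores, with toy numbers and an assumed lower weight for the noisier voice modality.

    import numpy as np

    def fuse_scores(scores, weights):
        """Weighted score-level fusion of per-modality identity scores.
        scores[m][i] is the match score of modality m for enrolled identity i."""
        weights = np.asarray(weights, dtype=float)
        weights /= weights.sum()
        fused = sum(w * np.asarray(s) for w, s in zip(weights, scores))
        return int(np.argmax(fused)), fused

    # Toy scores for 4 enrolled identities from RGB, RGB-D and voice matchers.
    rgb   = [0.35, 0.80, 0.40, 0.30]
    rgbd  = [0.30, 0.75, 0.45, 0.25]
    voice = [0.60, 0.55, 0.20, 0.10]   # noisy modality, given a lower weight
    identity, fused = fuse_scores([rgb, rgbd, voice], weights=[0.4, 0.4, 0.2])
    print("predicted identity:", identity)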

Keywords: multimodal, three modalities, RGB-D, identity verification

Procedia PDF Downloads 75
24795 Assessment of Seeding and Weeding Field Robot Performance

Authors: Victor Bloch, Eerikki Kaila, Reetta Palva

Abstract:

Field robots are an important tool for enhancing efficiency and decreasing the climatic impact of food production. A number of commercial field robots exist; however, since this technology is still new, the robots' advantages and limitations, as well as methods for their optimal use, are still unclear. In this study, the performance of a commercial field robot for seeding and weeding was assessed. A 2-ha research sugar beet field with 0.5 m row width was used for testing, which included robotic sowing of sugar beet and weeding five times during the first two months of growth. About three and five percent of the field were used as untreated and chemically weeded control areas, respectively. Plant detection was based on the exact plant location, without image processing. The robot was equipped with six seeding and weeding tools, including passive between-row harrow hoes and active hoes cutting inside the rows between the plants, and it moved with a maximal speed of 0.9 km/h. The robot's performance was assessed by image processing. The field images were collected by an action camera with a resolution of 27 Mpixels installed on the robot at a height of 2 m and by a drone with a 16 Mpixel camera flying at 4 m height. To detect plants and weeds, a YOLO model was trained with transfer learning from two available datasets. A preliminary analysis of the entire field showed that in the areas treated by the robot, the average weed density varied across the field from 6.8 to 9.1 weeds/m² (compared with 0.8 in the chemically treated area and 24.3 in the untreated area), the average weed density inside rows was 2.0-2.9 weeds/m (compared with 0 in the chemically treated area), and the emergence rate was 90-95%. Information about the robot's performance is highly important for the application of robotics to field tasks. With the help of the developed method, performance can be assessed several times during the growing season according to the robotic weeding frequency. When the method is used by farmers, they can know the field condition and the efficiency of the robotic treatment over the whole field. Farmers and researchers can develop optimal strategies for using the robot, such as seeding and weeding timing, robot settings, and plant and field parameters and geometry. Robot producers can obtain quantitative information from an actual working environment and improve their robots accordingly.
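
The assessment itself relied on a trained YOLO detector, which is not reproduced here; the small helper below only illustrates how per-image metrics such as overall weed density and in-row weed density could be derived from detections once they are mapped to ground coordinates, with the 0.5 m row spacing from the abstract and an assumed in-row band width.

    def summarize(detections, ground_area_m2, rows_total_length_m,
                  row_spacing=0.5, in_row_width=0.06):
        """Per-image performance metrics from detections: overall weed density (weeds/m2)
        and in-row weed density (weeds per metre of crop row). Detections are
        (label, x_m, y_m) in ground coordinates with crop rows along y = k * row_spacing."""
        weeds = [(x, y) for label, x, y in detections if label == "weed"]

        def dist_to_row(y):
            r = y % row_spacing
            return min(r, row_spacing - r)

        in_row = [w for w in weeds if dist_to_row(w[1]) <= in_row_width / 2]
        return {"weeds_per_m2": len(weeds) / ground_area_m2,
                "in_row_weeds_per_m": len(in_row) / rows_total_length_m}

    detections = [("crop", 0.10, 0.50), ("weed", 0.30, 0.26),
                  ("weed", 0.70, 0.52), ("weed", 1.20, 0.01)]
    print(summarize(detections, ground_area_m2=2.0, rows_total_length_m=4.0))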

Keywords: agricultural robot, field robot, plant detection, robot performance

Procedia PDF Downloads 90
24794 Non-Linear Causality Inference Using BAMLSS and Bi-CAM in Finance

Authors: Flora Babongo, Valerie Chavez

Abstract:

Inferring causality from observational data is one of the fundamental subjects, especially in quantitative finance. So far, most papers analyze additive noise models with either linearity, nonlinearity or Gaussian noise. We fill the gap by providing a nonlinear and non-Gaussian causal multiplicative noise model that aims to distinguish the cause from the effect using a two-step method based on Bayesian additive models for location, scale and shape (BAMLSS) and on causal additive models (CAM). We tested our method on simulated and real data and reached an accuracy of 0.86 on average. As real data, we considered the causality between financial indices, such as the S&P 500, Nasdaq, CAC 40 and Nikkei, and companies' log-returns. Our results can be useful for inferring causality when the data are heteroskedastic or non-injective.

Keywords: causal inference, DAGs, BAMLSS, financial index

Procedia PDF Downloads 155
24793 Managing Incomplete PSA Observations in Prostate Cancer Data: Key Strategies and Best Practices for Handling Loss to Follow-Up and Missing Data

Authors: Madiha Liaqat, Rehan Ahmed Khan, Shahid Kamal

Abstract:

Multiple imputation with delta adjustment is a versatile and transparent technique for addressing univariate missing data in the presence of various missing mechanisms. This approach allows for the exploration of sensitivity to the missing-at-random (MAR) assumption. In this review, we outline the delta-adjustment procedure and illustrate its application for assessing the sensitivity to deviations from the MAR assumption. By examining diverse missingness scenarios and conducting sensitivity analyses, we gain valuable insights into the implications of missing data on our analyses, enhancing the reliability of our study's conclusions. In our study, we focused on assessing logPSA, a continuous biomarker in incomplete prostate cancer data, to examine the robustness of conclusions against plausible departures from the MAR assumption. We introduced several approaches for conducting sensitivity analyses, illustrating their application within the pattern mixture model (PMM) under the delta adjustment framework. This proposed approach effectively handles missing data, particularly loss to follow-up.
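
As a compact illustration of the delta-adjustment idea (not the study's actual implementation), the sketch below imputes a partially missing logPSA column several times under MAR, shifts the imputed values by a range of delta offsets and tracks how the pooled mean responds; scikit-learn's IterativeImputer is used as a stand-in for a full multiple-imputation engine, and the data are simulated.

    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    def delta_adjusted_mi(X, col, deltas, m=20, seed=0):
        """MNAR sensitivity analysis via delta adjustment: impute under MAR with m
        imputations, shift only the imputed values of `col` by delta, and report
        how the pooled column mean responds for each delta."""
        missing = np.isnan(X[:, col])
        results = {}
        for delta in deltas:
            means = []
            for i in range(m):
                imp = IterativeImputer(random_state=seed + i, sample_posterior=True)
                Xi = imp.fit_transform(X)
                Xi[missing, col] += delta          # penalise/shift imputed values only
                means.append(Xi[:, col].mean())
            results[delta] = (np.mean(means), np.std(means))
        return results

    rng = np.random.default_rng(1)
    log_psa = rng.normal(1.0, 0.4, 300)
    age = 60 + 5 * log_psa + rng.normal(0, 2, 300)
    X = np.column_stack([log_psa, age])
    X[rng.random(300) < 0.25, 0] = np.nan           # 25% of logPSA lost to follow-up
    for delta, (mu, sd) in delta_adjusted_mi(X, col=0, deltas=[0.0, -0.2, -0.4]).items():
        print(f"delta={delta:+.1f}: pooled mean logPSA = {mu:.3f} (sd {sd:.3f})")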

Keywords: loss to follow-up, incomplete response, multiple imputation, sensitivity analysis, prostate cancer

Procedia PDF Downloads 94
24792 General Architecture for Automation of Machine Learning Practices

Authors: U. Borasi, Amit Kr. Jain, Rakesh, Piyush Jain

Abstract:

Data collection, data preparation, model training, model evaluation, and deployment are all processes in a typical machine learning workflow. Training data need to be gathered and organised; this often entails collecting a sizable dataset and cleaning it to remove or correct inaccurate or missing information. Preparing the data for use in the machine learning model requires pre-processing after acquisition, which often involves actions like scaling or normalising the data, handling outliers, selecting appropriate features, reducing dimensionality, etc. This pre-processed data is then used to train a model with some machine learning algorithm. After the model has been trained, it needs to be assessed by computing metrics like accuracy, precision, and recall on a test dataset. Every time a new model is built, both data pre-processing and model training, two crucial processes in the machine learning (ML) workflow, must be carried out. Various machine learning algorithms can be employed with every single approach to data pre-processing, generating a large set of combinations to choose from. For example, for every method of handling missing values (dropping records, replacing with the mean, etc.), for every scaling technique, and for every combination of selected features, a different algorithm can be used. As a result, in order to get the optimum outcome, these tasks are frequently repeated in different combinations. This paper suggests a simple architecture for organizing this large combination set of pre-processing steps and algorithms into an automated workflow, which simplifies the task of carrying out all possibilities.
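
The proposed operator-pool/scheduler architecture is not reproduced here; the snippet below merely illustrates the combination explosion the paper automates by enumerating every (imputer, scaler, model) configuration as a scikit-learn Pipeline and cross-validating each one.

    from itertools import product
    from sklearn.datasets import load_breast_cancer
    from sklearn.impute import SimpleImputer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import MinMaxScaler, StandardScaler
    from sklearn.tree import DecisionTreeClassifier

    # Operator pools: every pre-processing option can be combined with every algorithm.
    imputers = {"mean": SimpleImputer(strategy="mean"), "median": SimpleImputer(strategy="median")}
    scalers = {"standard": StandardScaler(), "minmax": MinMaxScaler()}
    models = {"logreg": LogisticRegression(max_iter=5000), "tree": DecisionTreeClassifier(random_state=0)}

    X, y = load_breast_cancer(return_X_y=True)
    results = {}
    for (i_name, imp), (s_name, sc), (m_name, mdl) in product(imputers.items(), scalers.items(), models.items()):
        pipe = Pipeline([("impute", imp), ("scale", sc), ("model", mdl)])   # one configuration
        results[(i_name, s_name, m_name)] = cross_val_score(pipe, X, y, cv=5).mean()

    best = max(results, key=results.get)
    print("best configuration:", best, "accuracy:", round(results[best], 4))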

Keywords: machine learning, automation, AUTOML, architecture, operator pool, configuration, scheduler

Procedia PDF Downloads 61
24791 Chloroform-Formic Acid Solvent Systems for Nanofibrous Polycaprolactone Webs

Authors: I. Yalcin Enis, J. Vojtech, T. Gok Sadikoglu

Abstract:

In this study, polycaprolactone (PCL) was dissolved in a chloroform:ethanol solvent system at a concentration of 18 w/v%. 1, 2, 4, and 6 droplets of formic acid were added separately to the prepared 10 ml PCL-chloroform:ethanol solutions. Fibrous webs were produced by the electrospinning technique. The morphology of the webs was investigated using scanning electron microscopy (SEM), whereas fiber diameters were measured with the ImageJ software. The effect of adding formic acid to the commonly used chloroform solvent on fiber morphology was examined.

Keywords: chloroform, electrospinning, formic acid, polycaprolactone, fiber

Procedia PDF Downloads 280
24790 Collaboration of Game-Based Learning with the Roaming The Stairs Model Using the Tajribi Method in PAI Subject Lessons at the Ummul Mukminin Islamic Boarding School, Makassar, South Sulawesi

Authors: Ratna Wulandari, Shahidin

Abstract:

This article aims to show how the Game-Based Learning model with the Roaming The Stairs game, combined with the tajribi method, can make PAI lessons active and interactive. This research uses a qualitative approach with a case study design. Data were collected using interviews, observation, and documentation. Data analysis was carried out through the stages of data reduction, data display, and verification and conclusion drawing. The validity of the data was tested using the triangulation method. The results of the research show that (1) children in grades 9A, 9B, and 9C like learning PAI using the Roaming The Stairs game, (2) children in grades 9A, 9B, and 9C are active and can work in groups to solve problems in the Roaming The Stairs game, and (3) the class atmosphere becomes fun with this learning method, namely learning while playing.

Keywords: game based learning, Roaming The Stairs, Tajribi PAI

Procedia PDF Downloads 27
24789 Wedding Organizer Strategy in the COVID-19 Pandemic Era in Surabaya, Indonesia

Authors: Rifky Cahya Putra

Abstract:

The coronavirus pandemic has made conditions difficult in many countries. As a result, many traders and companies find it hard to operate in this pandemic era, so human activities in some fields must adopt a new lifestyle, known as the new normal. The transition from one way of working to another certainly requires considerable adaptation, and almost all sectors experience the impact of this phase, one of which is the wedding organizer business. This research aims to find out what strategies are used so that the company can keep running during the pandemic. Data were collected through interviews with the owner of the wedding organizer and his team. Qualitative descriptive data analysis used the interactive model, consisting of three main steps: data reduction, data presentation, and conclusion drawing. From the interviews, the conclusion is that there are three strategies: social media, sponsorship, and promotion.

Keywords: strategy, wedding organizer, pandemic, indonesia

Procedia PDF Downloads 139
24788 Research on Routing Protocol in Ship Dynamic Positioning Based on WSN Clustering Data Fusion System

Authors: Zhou Mo, Dennis Chow

Abstract:

In the dynamic positioning system (DPS) for vessels, reliable information transmission between nodes basically relies on wireless protocols. From the perspective of cluster-based routing protocols for wireless sensor networks, a data fusion technology based on a sleep scheduling mechanism and the remaining energy in the network layer is proposed. It applies the sleep scheduling mechanism to the routing protocols and considers the remaining energy and location information of each node when selecting the cluster head. The problem of uneven distribution of nodes across clusters is solved by an equilibrium mechanism. At the same time, a classified forwarding mechanism as well as a redelivery policy are adopted to avoid congestion in the transmission of huge amounts of data, reduce the delay in data delivery and enhance the real-time response. In this paper, a simulation test is conducted on the improved routing protocols, which turn out to reduce the energy consumption of nodes and increase the efficiency of data delivery.

Keywords: DPS for vessel, wireless sensor network, data fusion, routing protocols

Procedia PDF Downloads 473
24787 MapReduce Algorithm for Geometric and Topological Information Extraction from 3D CAD Models

Authors: Ahmed Fradi

Abstract:

In a digital world in perpetual evolution and acceleration, with data that are ever more voluminous, rich and varied, the new software solutions that have emerged with the Big Data phenomenon offer companies new opportunities, enabling them not only to optimize their business and evolve their production model, but also to reorganize themselves to increase competitiveness and to identify new strategic axes. Industrial design and manufacturing companies, like others, face these challenges; data represent a major asset, provided that they know how to capture, refine, combine and analyze them. The objective of our paper is to propose a solution allowing the extraction of geometric and topological information from databases of 3D CAD models (specifically STEP files), with a dedicated algorithm based on the MapReduce programming paradigm. Our proposal is the first step of our future approach to 3D CAD object retrieval.
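
The paper's algorithm and target attributes are not given in the abstract, so the sketch below is only a toy, single-process imitation of the MapReduce pattern over STEP content: the map phase emits (entity type, 1) pairs from raw STEP lines and the reduce phase sums them into a rudimentary geometric/topological signature.

    from collections import defaultdict

    def map_phase(step_file_chunk):
        """Map: emit (entity_type, 1) for every STEP entity line in a chunk,
        e.g. "#12 = CARTESIAN_POINT(...);" -> ("CARTESIAN_POINT", 1)."""
        out = []
        for line in step_file_chunk:
            line = line.strip()
            if line.startswith("#") and "=" in line:
                entity = line.split("=", 1)[1].strip().split("(", 1)[0].strip()
                out.append((entity, 1))
        return out

    def reduce_phase(pairs):
        """Reduce: sum the counts per entity type (a tiny geometric/topological signature)."""
        counts = defaultdict(int)
        for key, value in pairs:
            counts[key] += value
        return dict(counts)

    chunk1 = ["#10 = CARTESIAN_POINT('', (0., 0., 0.));", "#11 = VERTEX_POINT('', #10);"]
    chunk2 = ["#20 = CARTESIAN_POINT('', (1., 0., 0.));", "#21 = EDGE_CURVE('', #11, #11, #30, .T.);"]
    pairs = [kv for chunk in (chunk1, chunk2) for kv in map_phase(chunk)]   # would run in parallel
    print(reduce_phase(pairs))   # {'CARTESIAN_POINT': 2, 'VERTEX_POINT': 1, 'EDGE_CURVE': 1}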

Keywords: Big Data, MapReduce, 3D object retrieval, CAD, STEP format

Procedia PDF Downloads 543