Search results for: real-time data acquisition and reporting
24081 Advanced Data Visualization Techniques for Effective Decision-making in Oil and Gas Exploration and Production
Authors: Deepak Singh, Rail Kuliev
Abstract:
This research article explores the significance of advanced data visualization techniques in enhancing decision-making processes within the oil and gas exploration and production domain. With the oil and gas industry facing numerous challenges, effective interpretation and analysis of vast and diverse datasets are crucial for optimizing exploration strategies, production operations, and risk assessment. The article highlights the importance of data visualization in managing big data, aiding the decision-making process, and facilitating communication with stakeholders. Various advanced data visualization techniques, including 3D visualization, augmented reality (AR), virtual reality (VR), interactive dashboards, and geospatial visualization, are discussed in detail, showcasing their applications and benefits in the oil and gas sector. The article presents case studies demonstrating the successful use of these techniques in optimizing well placement, real-time operations monitoring, and virtual reality training. Additionally, the article addresses the challenges of data integration and scalability, emphasizing the need for future developments in AI-driven visualization. In conclusion, this research emphasizes the immense potential of advanced data visualization in revolutionizing decision-making processes, fostering data-driven strategies, and promoting sustainable growth and improved operational efficiency within the oil and gas exploration and production industry.
Keywords: augmented reality (AR), virtual reality (VR), interactive dashboards, real-time operations monitoring
Procedia PDF Downloads 86
24080 The Data Quality Model for the IoT based Real-time Water Quality Monitoring Sensors
Authors: Rabbia Idrees, Ananda Maiti, Saurabh Garg, Muhammad Bilal Amin
Abstract:
IoT devices are the basic building blocks of an IoT network; they generate enormous volumes of real-time, high-speed data that help organizations and companies take intelligent decisions. Integrating this enormous data from multiple sources and transferring it to the appropriate client is fundamental to IoT development. Handling this huge quantity of devices, along with the huge volume of data, is very challenging. IoT devices are battery-powered and resource-constrained; to provide energy-efficient communication, they go to sleep and wake up periodically or aperiodically, depending on the traffic load, to reduce energy consumption. Sometimes these devices get disconnected due to battery depletion. If a node is not available in the network, the IoT network provides incomplete, missing, and inaccurate data. Moreover, many IoT applications, like vehicle tracking and patient tracking, require the IoT devices to be mobile. Due to this mobility, if the distance of a device from the sink node becomes greater than required, the connection is lost, and other devices join the network to replace the broken-down and departed devices. This makes IoT devices dynamic in nature, which brings uncertainty and unreliability into the IoT network and hence produces bad-quality data; because of this dynamism, the actual reason for abnormal data is often unknown. If data are of poor quality, decisions are likely to be unsound. It is therefore highly important to process data and estimate data quality before putting it to use in IoT applications. In the past, many researchers tried to estimate data quality and provided several machine learning (ML), stochastic, and statistical methods to analyze stored data in the data-processing layer, without addressing the challenges and issues that arise from the dynamic nature of IoT devices and how they impact data quality.
This research presents a comprehensive review of the impact of the dynamic nature of IoT devices on data quality and proposes a data quality model that can deal with this challenge and produce good-quality data. The model targets sensors monitoring water quality and is built using DBSCAN clustering together with weather sensors. An extensive study has been carried out on the relationship between the data of weather sensors and the data of sensors monitoring the water quality of lakes and beaches, and a detailed theoretical analysis is presented of the correlation between the independent data streams of the two sets of sensors. With the help of this analysis and DBSCAN, a data quality model is prepared. The model encompasses five dimensions of data quality: outlier detection and removal, completeness, patterns of missing values, accuracy (checked with the help of cluster positions), and consistency. Finally, a statistical analysis is performed on the clusters formed by DBSCAN, and consistency is evaluated through the coefficient of variation (CoV).
Keywords: clustering, data quality, DBSCAN, Internet of Things (IoT)
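The outlier-removal and consistency steps of the model can be sketched as follows. The minimal one-dimensional DBSCAN below, and the `eps`/`min_pts` settings, are illustrative assumptions, not the parameters or implementation used in the study:

```python
import statistics

def dbscan_1d(values, eps=0.5, min_pts=3):
    """Minimal DBSCAN for a 1-D sensor stream.
    Returns one label per reading: a cluster id (0, 1, ...) or -1 for noise/outlier."""
    labels = [None] * len(values)
    cluster = 0
    for i, v in enumerate(values):
        if labels[i] is not None:
            continue
        neighbours = [j for j, w in enumerate(values) if abs(w - v) <= eps]
        if len(neighbours) < min_pts:
            labels[i] = -1          # provisionally noise (may become a border point)
            continue
        labels[i] = cluster
        seeds = list(neighbours)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # absorbed as a border point; do not expand
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = [k for k, w in enumerate(values) if abs(w - values[j]) <= eps]
            if len(jn) >= min_pts:   # core point: keep expanding the cluster
                seeds.extend(jn)
        cluster += 1
    return labels

def coefficient_of_variation(values):
    # CoV = population std dev / mean, the consistency measure named in the abstract.
    return statistics.pstdev(values) / statistics.fmean(values)

readings = [7.1, 7.2, 7.0, 7.3, 7.1, 12.9, 7.2, 7.0]   # pH-like stream with one spike
labels = dbscan_1d(readings, eps=0.4, min_pts=3)
clean = [v for v, lab in zip(readings, labels) if lab != -1]
```

The spike at 12.9 has too few neighbours to be a core or border point, so it is labelled noise and dropped before the CoV consistency check.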
Procedia PDF Downloads 139
24079 New Security Approach of Confidential Resources in Hybrid Clouds
Authors: Haythem Yahyaoui, Samir Moalla, Mounir Bouden, Skander Ghorbel
Abstract:
Nowadays, Cloud environments are becoming a necessity for companies. This technology gives them the opportunity to access data anywhere and anytime, offers optimized and secured access to resources, and provides more security for the data stored on the platform. However, some companies do not trust Cloud providers: in their view, providers can access and modify confidential data such as bank accounts. Many works have been done in this context; they conclude that the encryption methods applied by providers ensure confidentiality, but they overlook the fact that Cloud providers themselves can decrypt the confidential resources. A better solution is to apply some modifications to the data before sending them to the Cloud, with the objective of making them unreadable to the provider. This work aims at enhancing the quality of service of providers and improving the trust of customers.
Keywords: cloud, confidentiality, cryptography, security issues, trust issues
Procedia PDF Downloads 378
24078 Estimation of Chronic Kidney Disease Using Artificial Neural Network
Authors: Ilker Ali Ozkan
Abstract:
In this study, an artificial neural network model has been developed to estimate chronic kidney failure, a common disease. The patients’ age, their blood and biochemical values, and various chronic diseases form the 24 input attributes used for the estimation process. The input data have been subjected to preprocessing because they contain both missing values and nominal values. The 147 patient records obtained after preprocessing have been divided into 70% training and 30% testing data. As a result of the study, the artificial neural network model with 25 neurons in the hidden layer was found to have the lowest error value. Chronic kidney failure was estimated with an accuracy of 99.3% using this model. The developed artificial neural network has thus proved successful for the estimation of chronic kidney failure from clinical data.
Keywords: estimation, artificial neural network, chronic kidney failure disease, disease diagnosis
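The preprocessing and 70/30 split described above can be sketched as follows. The field names (`age`, `creatinine`, `diabetes`) and the mean-imputation/integer-encoding choices are invented for illustration; they are not the study's actual 24 inputs or its exact preprocessing:

```python
import random

def impute_and_encode(records, nominal_values):
    # Mean-impute missing numeric entries (None) and map nominal labels to integers.
    observed = [r["creatinine"] for r in records if r["creatinine"] is not None]
    mean = sum(observed) / len(observed)
    rows = []
    for r in records:
        creatinine = r["creatinine"] if r["creatinine"] is not None else mean
        rows.append([r["age"], creatinine, nominal_values.index(r["diabetes"])])
    return rows

def split_70_30(rows, seed=42):
    # Shuffle deterministically, then cut at 70% for training.
    rows = rows[:]
    random.Random(seed).shuffle(rows)
    cut = int(len(rows) * 0.7)
    return rows[:cut], rows[cut:]

# Ten toy patient records; every fourth creatinine value is missing.
records = [{"age": 40 + i,
            "creatinine": (1.0 + 0.1 * i) if i % 4 else None,
            "diabetes": "yes" if i % 2 else "no"} for i in range(10)]
encoded = impute_and_encode(records, nominal_values=["no", "yes"])
train, test = split_70_30(encoded)
```

With 10 records the split yields 7 training and 3 testing rows, mirroring the 147-record 70/30 division in the study.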
Procedia PDF Downloads 447
24077 Impact of Map Generalization in Spatial Analysis
Authors: Lin Li, P. G. R. N. I. Pussella
Abstract:
When representing spatial data and their attributes on different types of maps, scale plays a key role in the process of map generalization. The process consists of two main operators: selection and omission. Once data are selected, they undergo several geometric transformations such as elimination, simplification, smoothing, exaggeration, displacement, aggregation, and size reduction. As a result of these operations at different levels of data, the geometry of spatial features such as length, sinuosity, orientation, perimeter, and area is altered. The effect is worst when preparing small-scale maps, since the cartographer does not have enough space to represent all the features on the map. In practice, when GIS users want to analyze a set of spatial data, they retrieve a data set and carry out the analysis without considering very important characteristics such as the scale, the purpose of the map, and the degree of generalization. Further, GIS users use and compare different maps with different degrees of generalization. Sometimes, GIS users go beyond the scale of the source map using the zoom-in facility and violate the basic cartographic rule that it is not suitable to create a larger-scale map from a smaller-scale map. The main objective of this study is to discuss the effect of map generalization on GIS analysis. Three digital maps at different scales, 1:10000, 1:50000, and 1:250000, prepared by the Survey Department of Sri Lanka, the national mapping agency of Sri Lanka, were used. Features common to all three maps were selected, and an overlay analysis was carried out by repeating the analysis with different combinations of the data. Road, river, and land use data sets were used for the study. A simple model, finding the best place for a wildlife park, was used to identify the effects.
The results show remarkable effects at different degrees of generalization. Different locations with different geometries were obtained as the outputs of the analysis. The study suggests that there should be reasonable methods to overcome this effect; as a solution, it would be very reasonable to bring all the data sets to a common scale before doing the analysis.
Keywords: generalization, GIS, scales, spatial analysis
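Among the generalization operators listed above, simplification is the easiest to make concrete. The sketch below uses the classic Douglas-Peucker algorithm, a standard choice for line simplification; the abstract does not say which algorithms the Survey Department's maps actually used, so this is purely illustrative:

```python
def perpendicular_distance(p, a, b):
    """Distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * (px - ax) - dx * (py - ay)) / (dx * dx + dy * dy) ** 0.5

def douglas_peucker(points, tolerance):
    """Recursively drop vertices that lie within `tolerance` of the chord
    joining the segment endpoints; keep the farthest vertex and recurse."""
    if len(points) < 3:
        return list(points)
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= tolerance:
        return [points[0], points[-1]]          # whole span is "straight enough"
    left = douglas_peucker(points[: index + 1], tolerance)
    right = douglas_peucker(points[index:], tolerance)
    return left[:-1] + right                    # avoid duplicating the split vertex

# A nearly straight road: all intermediate wiggle is below tolerance.
simplified = douglas_peucker([(0, 0), (1, 0.05), (2, 0), (3, 0.05), (4, 0)],
                             tolerance=0.5)
```

This makes the abstract's point tangible: the simplified line has different length and sinuosity than the source, so any overlay analysis run on it can yield different results.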
Procedia PDF Downloads 328
24076 Interconnections of Circular Economy, Circularity, and Sustainability: A Systematic Review and Conceptual Framework
Authors: Anteneh Dagnachew Sewenet, Paola Pisano
Abstract:
The concepts of circular economy, circularity, and sustainability are interconnected and promote a more sustainable future. However, previous studies have mainly focused on each concept individually, neglecting the relationships and gaps in the existing literature. This study aims to integrate and link these concepts to expand the theoretical and practical methods of scholars and professionals in pursuit of sustainability. The aim of this systematic literature review is to comprehensively analyze and summarize the interconnections between circular economy, circularity, and sustainability. Additionally, it seeks to develop a conceptual framework that can guide practitioners and serve as a basis for future research. The review employed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) protocol. A total of 78 articles were analyzed, utilizing the Scopus and Web of Science databases. The analysis involved summarizing and systematizing the conceptualizations of circularity and its relationship with the circular economy and long-term sustainability. The review provided a comprehensive overview of the interconnections between circular economy, circularity, and sustainability. Key themes, theoretical frameworks, empirical findings, and conceptual gaps in the literature were identified. Through a rigorous analysis of scholarly articles, the study highlighted the importance of integrating these concepts for a more sustainable future. This study contributes to the existing literature by integrating and linking the concepts of circular economy, circularity, and sustainability. It expands the theoretical understanding of how these concepts relate to each other and provides a conceptual framework that can guide future research in this field. The findings emphasize the need for a holistic approach in achieving sustainability goals. The data collection for this review involved identifying relevant articles from the Scopus and Web of Science databases.
The selection of articles was made based on predefined inclusion and exclusion criteria. The PRISMA protocol guided the systematic analysis of the selected articles, including summarizing and systematizing their content. This study addressed the question of how circularity is conceptualized and related to both the circular economy and long-term sustainability. It aimed to identify the interconnections between these concepts and bridge the gap in the existing literature. The review provided a comprehensive analysis of the interconnections between the circular economy, circularity, and sustainability. It presented a conceptual framework that can guide practitioners in implementing circular economy strategies and serve as a basis for future research. By integrating these concepts, scholars and professionals can enhance the theoretical and practical methods in pursuit of a more sustainable future. The findings emphasize the importance of taking a holistic approach to achieve sustainability goals and highlight conceptual gaps that can be addressed in future studies.
Keywords: circularity, circular economy, sustainability, innovation
Procedia PDF Downloads 106
24075 Identity Verification Based on Multimodal Machine Learning on Red Green Blue (RGB), Red Green Blue-Depth (RGB-D), and Voice Data
Authors: LuoJiaoyang, Yu Hongyang
Abstract:
In this paper, we experimented with a new approach to multimodal identification using RGB, RGB-D, and voice data. The multimodal combination of RGB and voice data has been applied in tasks such as emotion recognition with good results and stability, and the same holds in identity recognition tasks. We believe that data from different modalities can enhance the effect of the model through mutual reinforcement. We extend the dual-modality setup to three modalities and try to improve the effectiveness of the network by increasing the number of modalities. We also implemented each single-modality identification system separately, tested the data of these different modalities under clean and noisy conditions, and compared the performance with the multimodal model. In the process of designing the multimodal model, we tried a variety of different fusion strategies and finally chose the fusion method with the best performance. The experimental results show that the performance of the multimodal system is better than that of the single modality, especially in dealing with noise, and the multimodal system can achieve an average improvement of 5%.
Keywords: multimodal, three modalities, RGB-D, identity verification
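Score-level (late) fusion is one of the fusion strategies such a comparison would typically include; the weighted sum below is a generic baseline, and the identities, scores, and weights are made up for illustration, not taken from the paper:

```python
def late_fusion(modality_scores, weights):
    """Combine per-modality identity scores (dict: identity -> confidence)
    with a weighted sum, and return the top-scoring identity."""
    fused = {}
    for scores, weight in zip(modality_scores, weights):
        for identity, s in scores.items():
            fused[identity] = fused.get(identity, 0.0) + weight * s
    return max(fused, key=fused.get)

# Hypothetical per-modality confidences for two enrolled identities.
rgb   = {"alice": 0.7, "bob": 0.3}
depth = {"alice": 0.4, "bob": 0.6}
voice = {"alice": 0.8, "bob": 0.2}

# Down-weight the noisier depth channel; weights sum to 1.
best = late_fusion([rgb, depth, voice], weights=[0.4, 0.2, 0.4])
```

Even though the depth channel disagrees, the weighted combination still identifies "alice", which is the intuition behind modalities reinforcing each other.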
Procedia PDF Downloads 70
24074 Low Power CMOS Amplifier Design for Wearable Electrocardiogram Sensor
Authors: Ow Tze Weng, Suhaila Isaak, Yusmeeraz Yusof
Abstract:
The trend of health care screening devices in the world is increasingly towards portability and wearability, especially for the most common electrocardiogram (ECG) monitoring systems, because these wearable screening devices do not restrict the patient’s freedom and daily activities. While the demand for low-power and low-cost biomedical system-on-chip (SoC) designs is increasing exponentially, front-end ECG sensors still suffer from flicker noise during low-frequency cardiac signal acquisition, 50 Hz power-line electromagnetic interference, and large unstable input offsets caused by improperly attached electrode-skin interfaces. In this paper, a high-performance CMOS amplifier for ECG sensors that is suitable for low-power wearable cardiac screening is proposed. The amplifier adopts the highly stable folded cascode topology, which is then implemented with an RC feedback circuit for low-frequency DC offset cancellation. Using 0.13 µm CMOS technology from Silterra, the simulation results show that this front-end circuit can achieve a very low input-referred noise of 1 pV/√Hz and a high common-mode rejection ratio (CMRR) of 174.05 dB. It also gives a voltage gain of 75.45 dB with a good power supply rejection ratio (PSRR) of 92.12 dB. The total power consumption is only 3 µW, and the circuit is thus suitable for integration with further signal-processing and classification back ends in a low-power biomedical SoC.
Keywords: CMOS, ECG, amplifier, low power
Procedia PDF Downloads 248
24073 Non-Linear Causality Inference Using BAMLSS and Bi-CAM in Finance
Authors: Flora Babongo, Valerie Chavez
Abstract:
Inferring causality from observational data is one of the fundamental subjects, especially in quantitative finance. So far, most papers analyze additive noise models with either linearity, nonlinearity, or Gaussian noise. We fill in the gap by providing a nonlinear and non-Gaussian causal multiplicative noise model that aims to distinguish the cause from the effect using a two-step method based on Bayesian additive models for location, scale and shape (BAMLSS) and on causal additive models (CAM). We have tested our method on simulated and real data and reached an accuracy of 0.86 on average. As real data, we considered the causality between financial indices such as the S&P 500, Nasdaq, CAC 40, and Nikkei, and companies' log-returns. Our results can be useful for inferring causality when the data are heteroskedastic or non-injective.
Keywords: causal inference, DAGs, BAMLSS, financial index
Procedia PDF Downloads 151
24072 Antibiotic and Fungicide Exposure Reveal the Evolution of Soil-Lettuce System Resistome
Authors: Chenyu Huang, Minrong Cui, Hua Fang, Luqing Zhang, Yunlong Yu
Abstract:
The emergence and spread of antibiotic resistance genes (ARGs) have become a pressing issue in global agricultural production. However, understanding how these ARGs spread across different spatial scales, especially when exposed to both pesticides and antibiotics, has remained a challenge. Here, metagenomic assembly and binning methodologies were used to determine the mechanism of ARG propagation within soil-lettuce systems exposed to both fungicides and antibiotics. The results of our study showed that the presence of fungicide and antibiotic stresses had a significant impact on certain bacterial communities. Notably, we observed that ARGs were primarily transferred from the soil to the plant through plasmids. The selective pressure exerted by fungicides and antibiotics contributed to an increase in unique ARGs present on lettuce leaves. Moreover, ARGs located on chromosomes and plasmids followed different transmission patterns. The presence of diverse selective pressures, a result of compound treatments involving antibiotics and fungicides, amplifies this phenomenon. Consequently, there is a higher probability of bacteria developing multi-antibiotic resistance under the combined pressure of fungicides and antibiotics. In summary, our findings highlight that combined fungicide and antibiotic treatments are more likely to drive the acquisition of ARGs within the soil-plant system and may increase the risk of human ingestion.
Keywords: soil-lettuce system, fungicide, antibiotic, ARG, transmission
Procedia PDF Downloads 110
24071 Managing Incomplete PSA Observations in Prostate Cancer Data: Key Strategies and Best Practices for Handling Loss to Follow-Up and Missing Data
Authors: Madiha Liaqat, Rehan Ahmed Khan, Shahid Kamal
Abstract:
Multiple imputation with delta adjustment is a versatile and transparent technique for addressing univariate missing data in the presence of various missing mechanisms. This approach allows for the exploration of sensitivity to the missing-at-random (MAR) assumption. In this review, we outline the delta-adjustment procedure and illustrate its application for assessing the sensitivity to deviations from the MAR assumption. By examining diverse missingness scenarios and conducting sensitivity analyses, we gain valuable insights into the implications of missing data on our analyses, enhancing the reliability of our study's conclusions. In our study, we focused on assessing logPSA, a continuous biomarker in incomplete prostate cancer data, to examine the robustness of conclusions against plausible departures from the MAR assumption. We introduced several approaches for conducting sensitivity analyses, illustrating their application within the pattern mixture model (PMM) under the delta adjustment framework. This proposed approach effectively handles missing data, particularly loss to follow-up.
Keywords: loss to follow-up, incomplete response, multiple imputation, sensitivity analysis, prostate cancer
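A minimal sketch of delta adjustment: impute each missing logPSA value from the observed distribution, then shift it by delta to represent a departure from MAR (delta = 0 recovers the MAR analysis). The normal draw and pooling by simple averaging are simplifying assumptions relative to the paper's pattern mixture model machinery:

```python
import random
import statistics

def delta_adjusted_mean(observed, n_missing, delta, m=20, seed=0):
    """Draw each missing value from a normal fitted to the observed data,
    shift it by `delta` (the MNAR sensitivity parameter), repeat over m
    imputed datasets, and pool the completed-data means by averaging
    (Rubin's rules, point estimate only)."""
    rng = random.Random(seed)
    mu, sigma = statistics.fmean(observed), statistics.stdev(observed)
    pooled = []
    for _ in range(m):
        imputed = [rng.gauss(mu, sigma) + delta for _ in range(n_missing)]
        pooled.append(statistics.fmean(observed + imputed))
    return statistics.fmean(pooled)

# Toy logPSA values; the real study's data are of course not reproduced here.
observed_logpsa = [1.0, 1.2, 0.9, 1.1, 1.0, 1.3, 0.8, 1.1]
mar_estimate = delta_adjusted_mean(observed_logpsa, n_missing=4, delta=0.0)
mnar_estimate = delta_adjusted_mean(observed_logpsa, n_missing=4, delta=-0.5)
```

Sweeping delta over a range and re-running the analysis at each value is the sensitivity analysis: if conclusions survive plausible deltas, they are robust to MAR violations.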
Procedia PDF Downloads 89
24070 Vibration-Based Data-Driven Model for Road Health Monitoring
Authors: Guru Prakash, Revanth Dugalam
Abstract:
A road’s condition often deteriorates due to harsh loading, such as overloading by trucks, and severe environmental conditions such as heavy rain, snow load, and cyclic loading. In the absence of proper maintenance planning, this results in potholes, wide cracks, bumps, and increased road roughness. In this paper, a data-driven model is developed to detect these damages using vibration and image signals. The key idea of the proposed methodology is that road anomalies manifest in these signals and can be detected by training a machine learning algorithm. The use of various machine learning techniques, such as the support vector machine and the random forest method, will be investigated. The proposed model will first be trained and tested with artificially simulated data, and the model architecture will be finalized by comparing the accuracies of the various models. Once a model is fixed, a field study will be performed and data will be collected. The field data will be used to validate the proposed model and to predict the future health condition of the road. The proposed model will help to automate the road condition monitoring, repair cost estimation, and maintenance planning processes.
Keywords: SVM, data-driven, road health monitoring, pot-hole
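Before an SVM or random forest can be trained, raw vibration traces are usually reduced to features. A windowed RMS energy with a median baseline is one simple, assumed example of such a feature; the paper's actual feature set and thresholds are not specified in the abstract:

```python
import math

def window_rms(signal, window):
    # RMS energy per non-overlapping window; potholes and bumps appear as spikes.
    return [math.sqrt(sum(x * x for x in signal[i:i + window]) / window)
            for i in range(0, len(signal) - window + 1, window)]

def flag_anomalies(rms, factor=3.0):
    # Flag windows whose energy exceeds `factor` times the median
    # (the median serves as the smooth-road baseline).
    baseline = sorted(rms)[len(rms) // 2]
    return [i for i, r in enumerate(rms) if r > factor * baseline]

# Simulated accelerometer trace: smooth road with one pothole at samples 8-11.
vibration = [0.1] * 8 + [2.0] * 4 + [0.1] * 8
rms_values = window_rms(vibration, window=4)
```

In a learned pipeline, features like these (per window) would become the inputs to the SVM or random forest instead of a fixed threshold.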
Procedia PDF Downloads 86
24069 General Architecture for Automation of Machine Learning Practices
Authors: U. Borasi, Amit Kr. Jain, Rakesh, Piyush Jain
Abstract:
Data collection, data preparation, model training, model evaluation, and deployment are all processes in a typical machine learning workflow. Training data needs to be gathered and organised. This often entails collecting a sizable dataset and cleaning it to remove or correct any inaccurate or missing information. Preparing the data for use in the machine learning model requires pre-processing it after it has been acquired. This often entails actions like scaling or normalising the data, handling outliers, selecting appropriate features, reducing dimensionality, etc. This pre-processed data is then used to train a model with some machine learning algorithm. After the model has been trained, it needs to be assessed by determining metrics like accuracy, precision, and recall, utilising a test dataset. Every time a new model is built, both data pre-processing and model training, two crucial processes in the machine learning (ML) workflow, must be carried out. Thus, various machine learning algorithms can be employed for every single approach to data pre-processing, generating a large set of combinations to choose from. For example: for every method to handle missing values (dropping records, replacing with the mean, etc.), for every scaling technique, and for every combination of selected features, a different algorithm can be used. As a result, in order to get the optimum outcomes, these tasks are frequently repeated in different combinations. This paper suggests a simple architecture for organizing this large combination set of pre-processing steps and algorithms into an automated workflow which simplifies the task of carrying out all possibilities.
Keywords: machine learning, automation, AutoML, architecture, operator pool, configuration, scheduler
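The "combination set" the paper describes can be enumerated directly as a cross product of operator pools. The pools below are hypothetical examples (the abstract does not enumerate its actual operators), and `toy_score` stands in for the real train-and-evaluate step the scheduler would run:

```python
from itertools import product

# Hypothetical operator pools; the paper's actual pools are not listed in the abstract.
missing_value_ops = ["drop_rows", "mean_impute", "median_impute"]
scaling_ops = ["none", "min_max", "standardize"]
algorithms = ["logistic_regression", "random_forest", "svm"]

# Every pre-processing choice crossed with every algorithm: 3 * 3 * 3 = 27 pipelines.
pipelines = list(product(missing_value_ops, scaling_ops, algorithms))

def best_pipeline(pipelines, evaluate):
    # The scheduler's job in miniature: score every combination, keep the winner.
    return max(pipelines, key=evaluate)

def toy_score(p):
    # Toy stand-in for training and scoring a pipeline on held-out data.
    return 1.0 if p == ("mean_impute", "standardize", "random_forest") else 0.5
```

A real architecture would replace exhaustive enumeration with a scheduler that prunes or prioritizes combinations, since the product grows multiplicatively with each new operator pool.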
Procedia PDF Downloads 58
24068 Collaboration of Game-Based Learning with the Roaming The Stairs Model Using the Tajribi Method in PAI Subjects at the Ummul Mukminin Islamic Boarding School, Makassar, South Sulawesi
Authors: Ratna Wulandari, Shahidin
Abstract:
This article examines how the game-based learning model with the Roaming The Stairs game, combined with the tajribi method, can make PAI lessons active and interactive. This research uses a qualitative approach with a case-study design. Data were collected using interviews, observation, and documentation, and analyzed through the stages of data reduction, data display, and verification and conclusion-drawing. Data validity was tested using the triangulation method. The results show that (1) children in grades 9A, 9B, and 9C like learning PAI using the Roaming The Stairs game, (2) they are active and can work in groups to solve problems in the game, and (3) the class atmosphere becomes fun with this learning method, namely learning while playing.
Keywords: game-based learning, Roaming The Stairs, tajribi, PAI
Procedia PDF Downloads 22
24067 Wedding Organizer Strategy in the COVID-19 Pandemic Era in Surabaya, Indonesia
Authors: Rifky Cahya Putra
Abstract:
The corona pandemic has put many countries in difficulty. As a result, many traders and companies find it hard to operate in this pandemic era, so human activities in several fields must adopt a new lifestyle, known as the new normal. The transition from one way of working to another certainly requires a high degree of adaptation, and almost all sectors experience the impact of this phase; one of them is the wedding organizer business. This research aims to find out what strategies are used so that such a company can keep running during the pandemic. Data were collected through interviews with the owner of a wedding organizer and his team. The qualitative descriptive analysis uses the interactive model, which consists of three main steps: data reduction, data presentation, and conclusion-drawing. From the interviews, the conclusion is that there are three strategies: social media, sponsorship, and promotion.
Keywords: strategy, wedding organizer, pandemic, Indonesia
Procedia PDF Downloads 135
24066 Research on Routing Protocol in Ship Dynamic Positioning Based on WSN Clustering Data Fusion System
Authors: Zhou Mo, Dennis Chow
Abstract:
In the dynamic positioning system (DPS) for vessels, reliable information transmission between nodes basically relies on wireless protocols. From the perspective of cluster-based routing protocols for wireless sensor networks, a data fusion technique based on a sleep-scheduling mechanism and the remaining energy at the network layer is proposed. It applies the sleep-scheduling mechanism to the routing protocols and considers the remaining energy and location information of each node when selecting the cluster head. The problem of uneven distribution of nodes across clusters is solved by an equilibrium mechanism. At the same time, a classified forwarding mechanism and a redelivery policy are adopted to avoid congestion in the transmission of huge amounts of data, reduce the delay in data delivery, and enhance real-time response. A simulation test conducted on the improved routing protocol shows that it reduces the energy consumption of nodes and increases the efficiency of data delivery.
Keywords: DPS for vessels, wireless sensor network, data fusion, routing protocols
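The cluster-head selection step, remaining energy weighed against location, can be sketched as below. The abstract does not give the exact scoring function, so the energy/(1 + distance-to-sink) weighting, the node fields, and the example topology are all assumptions:

```python
import math

def select_cluster_heads(nodes, sink, n_heads):
    """Rank candidate nodes by remaining energy penalized by distance to the
    sink, and return the ids of the top n_heads nodes."""
    def score(node):
        # Illustrative weighting only: prefer energetic nodes near the sink.
        return node["energy"] / (1.0 + math.dist(node["pos"], sink))
    return [n["id"] for n in sorted(nodes, key=score, reverse=True)[:n_heads]]

nodes = [
    {"id": 1, "energy": 0.9, "pos": (1.0, 1.0)},
    {"id": 2, "energy": 0.9, "pos": (5.0, 5.0)},   # high energy but far away
    {"id": 3, "energy": 0.2, "pos": (1.0, 0.0)},   # close but nearly depleted
    {"id": 4, "energy": 0.8, "pos": (0.0, 1.0)},
]
heads = select_cluster_heads(nodes, sink=(0.0, 0.0), n_heads=2)
```

Node 2 loses despite its full battery because of its distance, and node 3 loses despite its proximity because it is nearly depleted, which is the trade-off the protocol's selection rule encodes.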
Procedia PDF Downloads 467
24065 Tourism Policy Challenges in Post-Soviet Georgia
Authors: Merab Khokhobaia
Abstract:
The research of Georgian tourism policy challenges is important, as tourism can play an increasing role in the economic growth and standard of living of the country, even with scanty resources, through improved creative approaches. It is also important to make correct decisions at the macroeconomic level, which will be reflected in the successful functioning of travel companies and, finally, in the improvement of the economic indicators of the country. In order to orient sectoral policy correctly, it is important to determine precisely its role in the economy. Development of the travel industry has been considered one of the priorities in Georgia; the country has a unique cultural heritage and traditions, as well as plenty of natural resources, which are significant preconditions for the development of tourism. Despite these factors, the existing resources are not completely utilized and exploited. This work studies the subjective as well as objective reasons for the ineffective functioning of the sector. During the years of transformation experienced by Georgia, the role of the travel industry in the economic development of the country was the subject of continual discussion. Such assessments were often biased and did not rest on specific calculations. This topic became especially pertinent under the market economy, because reliable statistical data have particular significance in the design of tourism policy. To study this issue in depth, this paper analyzes monetary as well as non-monetary indicators. The research covered the tourism indicator system and analyzed the flaws in the reporting of tourism-sector results in Georgia. Existing defects are identified, and recommendations for their improvement are offered.
For stable development, tourism, like other economic sectors, needs a well-designed policy from the perspective of national as well as local and regional development. Tourism policy must be drawn up to efficiently achieve the goals established in short-term and long-term dynamics on the national or regional scale of a specific country. The article focuses on the role and responsibility of state institutions in planning and implementing tourism policy. The government has various tools and levers which may positively influence these processes; these levers are especially important for international as well as domestic tourism development. Within the framework of this research, the regulatory documents in force in this industry were also analyzed. The main attention is given to their modernization and the necessity of their compliance with European standards. It is a current issue to direct state policy towards supporting business by implementing infrastructure projects as well as by developing human resources, which may be achieved by supporting the relevant higher and vocational educational programs.
Keywords: regional development, tourism industry, tourism policy, transition
Procedia PDF Downloads 263
24064 Stereoselective Glycosylation and Functionalization of Unbiased Site of Sweet System via Dual-Catalytic Transition Metal Systems/Wittig Reaction
Authors: Mukul R. Gupta, Rajkumar Gandhi, Rajitha Sachan, Naveen K. Khare
Abstract:
The field of glycoscience has burgeoned in the last several decades, leading to the identification of many glycosides which could serve critical roles in a wide range of biological processes. This has prompted a resurgence in synthetic interest, with a particular focus on new approaches to construct glycosidic bonds selectively. Despite the numerous elegant strategies and methods developed for the formation of glycosidic bonds, stereoselective construction of glycosides remains challenging. We have recently developed a novel hexafluoroisopropanol (HFIP)-catalyzed stereoselective glycosylation method using a KDN imidate glycosyl donor and a variety of alcohols, giving excellent yields. This method is broadly applicable to a wide range of substrates with excellent selectivity of the glycoside. Herein, we also report the functionalization of the unbiased site of the newly formed glycosides by dual-catalytic transition metal systems (Ru- or Fe-based). We use the innovative "reverse and catalyst" strategy, that is, a reversible activation reaction by one catalyst combined with a functionalization reaction by another catalyst, together enabling functionalization of substrates at their inherently unreactive sites. We also target the synthesis of diSia derivatives via the Wittig reaction. This synthetic method operates under mild conditions, demonstrates the functional-group tolerance of the dual-catalytic systems, and highlights the potential of the multicatalytic approach to address challenging transformations while avoiding multistep procedures in carbohydrate synthesis.
Keywords: KDN, stereoselective glycosylation, dual-catalytic functionalization, Wittig reaction
Procedia PDF Downloads 193
24063 MapReduce Algorithm for Geometric and Topological Information Extraction from 3D CAD Models
Authors: Ahmed Fradi
Abstract:
In a digital world in perpetual evolution and acceleration, with data ever more voluminous, rich, and varied, the new software solutions that have emerged with the Big Data phenomenon offer companies new opportunities: not only to optimize their business and evolve their production model, but also to reorganize themselves to increase competitiveness and to identify new strategic axes. Industrial design and manufacturing companies, like others, face these challenges; data represent a major asset, provided that companies know how to capture, refine, combine, and analyze them. The objective of our paper is to propose a solution for extracting geometric and topological information from databases of 3D CAD models (specifically STEP files), using an algorithm based on the MapReduce programming paradigm. Our proposal is the first step of our future approach to 3D CAD object retrieval.
Keywords: Big Data, MapReduce, 3D object retrieval, CAD, STEP format
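The abstract does not include code; a minimal single-process sketch of the MapReduce pattern applied to STEP (ISO 10303-21) entity records might look like the following. The entity lines and the counting task are illustrative assumptions, not taken from the paper:

```python
from collections import defaultdict

# Toy STEP data lines; entity names follow the ISO 10303-21 clear-text format.
step_lines = [
    "#10=CARTESIAN_POINT('',(0.,0.,0.));",
    "#11=CARTESIAN_POINT('',(1.,0.,0.));",
    "#12=EDGE_CURVE('',#10,#11,#20,.T.);",
    "#20=LINE('',#10,#21);",
]

def map_phase(line):
    """Emit (entity_type, 1) for each STEP data line."""
    entity = line.split("=", 1)[1].split("(", 1)[0]
    yield (entity, 1)

def shuffle(pairs):
    """Group mapped pairs by key, as the MapReduce framework would."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    return (key, sum(values))

pairs = [p for line in step_lines for p in map_phase(line)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(counts)  # {'CARTESIAN_POINT': 2, 'EDGE_CURVE': 1, 'LINE': 1}
```

In a real deployment the map and reduce functions would run distributed over a cluster; the point here is only the shape of the computation.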
Procedia PDF Downloads 541
24062 Data Hiding in Gray Image Using ASCII Value and Scanning Technique
Authors: R. K. Pateriya, Jyoti Bharti
Abstract:
This paper presents a data hiding method that provides secret communication between sender and receiver. The data is hidden in gray-scale images, and the boundary of the gray-scale image is used to store the mapping information. In this approach the secret data is in ASCII format, and a mapping is established between the ASCII value of each character of the hidden message and a pixel value of the cover image; since pixel values of an image, like ASCII values, lie in the range 0 to 255, the mapping information occupies only 1 bit per character of the hidden message instead of 8 bits per character, thus maintaining good quality of the stego image.
Keywords: ASCII value, cover image, PSNR, pixel value, stego image, secret message
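The abstract does not fully specify the scanning scheme, so the following is only an illustrative sketch of the core idea: mapping each character's ASCII value to the position of a matching pixel value in a toy gray-scale "image". The image values and the position-based encoding are assumptions for illustration:

```python
# Toy gray-scale cover image: a 2D list of pixel values in 0-255.
cover = [
    [72, 101, 108, 108],
    [111, 32, 87, 111],
    [114, 108, 100, 33],
]

def hide(message, image):
    """For each character, scan the image and return the (row, col) of the
    first pixel whose value equals the character's ASCII code."""
    positions = []
    for ch in message:
        target = ord(ch)
        found = None
        for r, row in enumerate(image):
            for c, value in enumerate(row):
                if value == target:
                    found = (r, c)
                    break
            if found:
                break
        positions.append(found)
    return positions

def reveal(positions, image):
    """Recover the message from the stored pixel positions."""
    return "".join(chr(image[r][c]) for r, c in positions)

mapping = hide("Hello", cover)
print(mapping)                  # [(0, 0), (0, 1), (0, 2), (0, 2), (1, 0)]
print(reveal(mapping, cover))   # Hello
```

The paper's actual method additionally stores the mapping information along the image boundary and handles characters without a matching pixel; neither detail is reproduced here.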
Procedia PDF Downloads 416
24061 DCASH: Dynamic Cache Synchronization Algorithm for Heterogeneous Reverse Y Synchronizing Mobile Database Systems
Authors: Gunasekaran Raja, Kottilingam Kottursamy, Rajakumar Arul, Ramkumar Jayaraman, Krithika Sairam, Lakshmi Ravi
Abstract:
The synchronization server maintains a dynamically changing cache, which contains the data items that were requested and collected by the mobile node from the server. The order and presence of tuples in the cache change dynamically according to the frequency of updates performed on the data by the server and client. To synchronize, the data modified by the client and the server at a given instant are collected, batched together by type of modification (insert/update/delete), and sorted according to their update frequencies. This ensures that DCASH (Dynamic Cache Synchronization Algorithm for Heterogeneous Reverse Y synchronizing Mobile Database Systems) gives priority to frequently accessed data with high usage. An optimal memory management algorithm is proposed to manage data items according to their frequency, theorems are given to show that current mobile data activity is reverse Y in nature, and experiments were run on 2G and 3G networks with various mobile devices to show the reduced response time and energy consumption.
Keywords: mobile databases, synchronization, cache, response time
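The batching step described above (group pending modifications by operation type, then order each batch by update frequency) can be sketched as follows. This is a hedged reading of the abstract, not the paper's implementation; the record format and item names are invented:

```python
from collections import Counter

# Pending modifications as (operation, item) pairs -- an assumed format.
pending = [
    ("update", "item_a"),
    ("insert", "item_b"),
    ("update", "item_a"),
    ("delete", "item_c"),
    ("update", "item_d"),
    ("update", "item_a"),
]

def batch_for_sync(modifications):
    """Group modifications by type; within each batch, put the most
    frequently touched items first so they are synchronized with priority."""
    freq = Counter(item for _, item in modifications)
    batches = {"insert": [], "update": [], "delete": []}
    for op, item in modifications:
        if item not in batches[op]:
            batches[op].append(item)
    for op in batches:
        batches[op].sort(key=lambda item: -freq[item])
    return batches

batches = batch_for_sync(pending)
print(batches)
# {'insert': ['item_b'], 'update': ['item_a', 'item_d'], 'delete': ['item_c']}
```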
Procedia PDF Downloads 406
24060 Unified Structured Process for Health Analytics
Authors: Supunmali Ahangama, Danny Chiang Choon Poo
Abstract:
Health analytics (HA) is used in healthcare systems for effective decision-making, management, and planning of healthcare and related activities. However, user resistance, the unique content and structure of medical data (including heterogeneous and unstructured data), and impromptu HA projects have held up progress in HA applications. Notably, the accuracy of outcomes depends on the skills and domain knowledge of the data analyst working on the healthcare data. The success of HA depends on having a sound process model, effective project management, and the availability of supporting tools. Thus, to overcome these challenges through an effective process model, we propose an HA process model combining features from the rational unified process (RUP) model and agile methodology.
Keywords: agile methodology, health analytics, unified process model, UML
Procedia PDF Downloads 506
24059 Use of Life Cycle Data for State-Oriented Maintenance
Authors: Maximilian Winkens, Matthias Goerke
Abstract:
State-oriented maintenance enables preventive intervention before the failure of a component and guarantees the avoidance of expensive breakdowns. Because the timing of maintenance is defined by the component's state, the remaining service life can be exhausted to the limit. The basic requirement for state-oriented maintenance is the ability to determine the component's state. New potential for this is offered by gentelligent components, developed at Collaborative Research Centre 653 of the German Research Foundation (DFG). Because of their sensory ability, they enable the registration of stresses during the component's use. The data is gathered and evaluated, and the methodology developed determines the current state of the gentelligent component based on this data. This article presents the methodology as well as current research. The main focus of the current scientific work is to improve the quality of state determination based on life-cycle data analysis. The methodology developed so far evaluates data from the usage phase and, based on it, predicts the timing of the gentelligent component's failure. The real failure timing, though, deviates from the predicted one because effects from the production phase are not considered. The goal of the current research is to develop a methodology for state determination which considers both production and usage data.
Keywords: state-oriented maintenance, life-cycle data, gentelligent component, preventive intervention
Procedia PDF Downloads 495
24058 Understanding of Malaysian Community Disaster Resilience: Australian Scorecard Adaptation
Authors: Salizar Mohamed Ludin, Mohd Khairul Hasyimi Firdaus, Paul Arbon
Abstract:
Purpose: This paper aims to develop Malaysian Government and community-level critical thinking, planning and action for improving community disaster resilience by reporting Phase 1, Part 1 of a larger community disaster resilience measurement study about adapting the Torrens Resilience Institute Australian Community Disaster Resilience Scorecard to the Malaysian context. Methodology: Participatory action research encouraged key people involved in managing the six areas most affected by the 2014 flooding of Kelantan in Malaysia's north-east to participate in discussions about adapting and self-testing the Australian Community Disaster Resilience Scorecard to measure and improve their communities' disaster resilience. Findings: Communities need to strengthen their disaster resilience through better communication, cross-community cooperation, maximizing opportunities to compare their plans, actions and reactions with those reported in research publications, and aligning their community disaster management with reported best practice internationally, while acknowledging the need to adapt such practice to local contexts. Research implications: There is a need for a Malaysia-wide, simple-to-use, standardized disaster resilience scorecard to improve the quality, quantity and capability of healthcare and emergency services' preparedness, and to facilitate urgent reallocation of aid. Value: This study is the first of its kind in Malaysia. The resulting community disaster resilience guideline, based on participants' feedback about the Kelantan floods and scorecard self-testing, has the potential for further adaptation to suit contexts across Malaysia, as well as demonstrating how the scorecard can be adapted for international use.
Keywords: community disaster resilience, CDR Scorecard, participatory action research, flooding, Malaysia
Procedia PDF Downloads 336
24057 Preclinical Evidence of Pharmacological Effect from Medicinal Hemp
Authors: Muhammad nor Farhan Sa'At, Xin Y. Lim, Terence Y. C. Tan, Siti Hajar M. Rosli, Syazwani S. Ali, Ami F. Syed Mohamed
Abstract:
INTRODUCTION: Hemp (Cannabis sativa subsp. sativa), commonly used for industrial purposes, differs from marijuana in containing lower levels of delta-9-tetrahydrocannabinol, the principal psychoactive constituent of cannabis. Due to its non-psychoactive nature, there has been growing interest in hemp's therapeutic potential, which has been investigated through pre-clinical and clinical study modalities. OBJECTIVE: To provide an overview of the current landscape of hemp research, through recent scientific findings specific to the pharmacological effects of the medicinal hemp plant and its derived compounds. METHODS: This review was conducted through a systematic search strategy according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) checklist, on electronic databases including MEDLINE, OVID (OVFT, APC Journal Club, EBM Reviews), Cochrane Library Central, and ClinicalTrials.gov. RESULTS: Of the 65 primary articles reviewed, 47 were pre-clinical studies related to medicinal hemp. Interestingly, the hemp derivatives showed several potential activities, such as anti-oxidative, anti-hypertensive, anti-inflammatory, anti-diabetic, anti-neuroinflammatory, anti-arthritic, anti-acne, and anti-microbial activities. Renal protective effects and estrogenic properties were also exhibited in vitro. CONCLUSION: Medicinal hemp possesses various pharmacological effects tested in vitro and in vivo. The information provided in this review could be used as a tool to strengthen the study design of future clinical trial research.
Keywords: preclinical, herbal medicine, hemp, cannabis
Procedia PDF Downloads 136
24056 A Hybrid System for Boreholes Soil Sample
Authors: Ali Ulvi Uzer
Abstract:
Data reduction is an important topic in the field of pattern recognition applications. The basic concept is the reduction of multitudinous amounts of data down to their meaningful parts. The principal component analysis (PCA) method is frequently used for data reduction. The support vector machine (SVM) is a discriminative classifier formally defined by a separating hyperplane: given labeled training data, the algorithm outputs an optimal hyperplane which categorizes new examples. This study offers a hybrid approach that uses PCA for data reduction and SVMs for classification. In order to assess the accuracy of the suggested system, samples from two boreholes of soil were used. The classification accuracies for this dataset were obtained through ten-fold cross-validation. As the results suggest, this system, which performs dimensionality reduction before classification, enables faster recognition of the dataset, so our study results appear very promising.
Keywords: feature selection, sequential forward selection, support vector machines, soil sample
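The PCA-then-SVM pipeline with ten-fold cross-validation described above can be sketched as follows. The borehole data itself is not available, so a synthetic two-class dataset stands in for the soil-sample features; the component count and kernel are assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

# Synthetic stand-in for the soil-sample features: two classes, 10 features.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 10)),
               rng.normal(2.0, 1.0, (50, 10))])
y = np.array([0] * 50 + [1] * 50)

model = Pipeline([
    ("reduce", PCA(n_components=3)),   # data reduction step
    ("classify", SVC(kernel="rbf")),   # discriminative SVM classifier
])

scores = cross_val_score(model, X, y, cv=10)  # ten-fold cross-validation
print(round(scores.mean(), 3))
```

Chaining the reducer and classifier in a single `Pipeline` ensures the PCA projection is refit on each training fold, avoiding leakage into the validation folds.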
Procedia PDF Downloads 455
24055 Predicting Customer Purchasing Behaviour in Retail Marketing: A Research for a Supermarket Chain
Authors: Sabri Serkan Güllüoğlu
Abstract:
Analysis can be defined as the process of gathering, recording and researching data related to products and services in order to learn something. For marketers, however, analyses are not only used for learning but are also an essential and critical part of the business, because they allow companies to offer products or services which are focused and well targeted. Market analysis also identifies market trends, demographics, customers' buying habits and important information about the competition. Data mining is used instead of traditional research because it extracts predictive information about customers and sales from large databases. In contrast to traditional research, data mining relies on information that is already available. Simply put, the goal is to improve the efficiency of supermarkets. In this study, the purpose is to find dependencies between products, for instance which items are bought together, using association rules in data mining. Moreover, this information will be used for improving the profitability of customers, such as increasing shopping time and sales of less frequently sold items.
Keywords: data mining, association rule mining, market basket analysis, purchasing
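The core of association rule mining over baskets is computing support and confidence for item pairs. A minimal sketch over a toy set of transactions (the study's real supermarket data is not available) might look like:

```python
from collections import Counter
from itertools import combinations

# Toy transactions: each set is one customer's basket.
transactions = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "cereal"},
    {"bread", "butter", "cereal"},
    {"bread", "milk"},
]

def pair_rules(transactions, min_support=0.4):
    """Return rules (lhs, rhs, support, confidence) for frequent item pairs."""
    n = len(transactions)
    item_count = Counter(i for t in transactions for i in t)
    pair_count = Counter(p for t in transactions
                         for p in combinations(sorted(t), 2))
    rules = []
    for (a, b), c in pair_count.items():
        support = c / n
        if support >= min_support:
            # Rule a -> b: confidence = support(a, b) / support(a).
            rules.append((a, b, support, c / item_count[a]))
            rules.append((b, a, support, c / item_count[b]))
    return rules

for lhs, rhs, sup, conf in pair_rules(transactions):
    print(f"{lhs} -> {rhs}: support={sup:.2f}, confidence={conf:.2f}")
```

Here butter -> bread reaches confidence 1.0 (butter never appears without bread), exactly the kind of dependency the study seeks; the Apriori algorithm generalizes this counting to larger itemsets.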
Procedia PDF Downloads 483
24054 Predicting Medical Check-Up Patient Re-Coming Using Sequential Pattern Mining and Association Rules
Authors: Rizka Aisha Rahmi Hariadi, Chao Ou-Yang, Han-Cheng Wang, Rajesri Govindaraju
Abstract:
With the increasing popularity of medical check-ups, a huge number of medical check-up records are stored in databases without being put to use. These data can actually be very useful for future strategic planning if mined correctly. On the other hand, many patients come at unpredictable times, and the limited available facilities mean that the medical check-up service offered by a hospital is not maximal. To solve this problem, this study used medical check-up data to predict patient re-coming. Sequential pattern mining (SPM) and association rules were chosen because these methods are suitable for predicting patient re-coming from sequential data. First, based on patient personal information, the data was grouped into … groups, and discriminant analysis was performed to check the significance of the grouping. Second, for each group, frequent patterns were generated using the SPM method. Third, based on the frequent patterns of each group, pairs of variables were extracted using association rules to obtain a general pattern of re-coming patients. Last, discussion and conclusions give some implications of the results.
Keywords: patient re-coming, medical check-up, health examination, data mining, sequential pattern mining, association rules, discriminant analysis
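The second step, generating frequent patterns from ordered visit data, can be illustrated with a minimal sequential-pattern counter over invented patient visit sequences (the study's actual data and SPM algorithm are not specified in the abstract):

```python
from collections import Counter
from itertools import combinations

# One visit-type sequence per patient -- invented stand-in data.
sequences = [
    ["checkup", "lab", "checkup"],
    ["checkup", "lab"],
    ["lab", "checkup"],
    ["checkup", "imaging", "checkup"],
]

def frequent_pairs(sequences, min_support=0.5):
    """Support of each ordered pair (a occurs before b) across patients."""
    n = len(sequences)
    counts = Counter()
    for seq in sequences:
        # Distinct ordered pairs occurring in this patient's sequence.
        seen = {(seq[i], seq[j]) for i, j in combinations(range(len(seq)), 2)}
        counts.update(seen)
    return {p: c / n for p, c in counts.items() if c / n >= min_support}

print(frequent_pairs(sequences))
# e.g. ('checkup', 'checkup'): 0.5 -- half the patients came back after a checkup
```

A full SPM algorithm such as PrefixSpan extends this idea to subsequences of arbitrary length; counting ordered pairs is enough to show how "re-coming" patterns emerge.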
Procedia PDF Downloads 640
24053 Aviation versus Aerospace: A Differential Analysis of Workforce Jobs via Text Mining
Authors: Sarah Werner, Michael J. Pritchard
Abstract:
From pilots to engineers, skills development within the aerospace industry is exceptionally broad. Employers often struggle to find the right mixture of qualified skills to fill their organizational demands. This effort to find qualified talent is further complicated by the industrial delineation between two key areas: aviation and aerospace. In a broad sense, the aerospace industry overlaps with the aviation industry; in turn, the aviation industry is a smaller sector segment within the broader definition of the aerospace industry. Furthermore, it could be conceptually argued that, in practice, there is little distinction between these two sectors. However, through our unstructured text analysis of over 6,000 captured job listings, our team found a clear delineation between aviation-related jobs and aerospace-related jobs. Using techniques from natural language processing, our research identifies an integrated workforce skill pattern that clearly breaks between these two sectors. While the aviation sector has largely maintained its need for pilots, mechanics, and associated support personnel, the staffing needs of the aerospace industry are increasingly driven by integrative engineering needs. This is leading many aerospace-based organizations towards the acquisition of 'system level' staffing requirements. This research helps to better align higher educational institutions with the current industrial staffing complexities within the broader aerospace sector.
Keywords: aerospace industry, job demand, text mining, workforce development
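A differential term analysis of the kind described, which terms characterize one set of job listings versus another, can be sketched on toy data. The listings below are invented stand-ins; the study's corpus of 6,000+ real postings and its NLP pipeline are not reproduced here:

```python
from collections import Counter

aviation_jobs = [
    "commercial pilot with instrument rating",
    "aircraft mechanic for line maintenance",
    "pilot and flight support personnel",
]
aerospace_jobs = [
    "systems engineer for satellite integration",
    "propulsion engineer with systems experience",
    "integration and test engineer",
]

def term_freq(docs):
    """Relative frequency of each whitespace token across the documents."""
    counts = Counter(w for d in docs for w in d.split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def differential_terms(a_docs, b_docs):
    """Vocabulary sorted by frequency difference: front = typical of a_docs,
    back = typical of b_docs."""
    fa, fb = term_freq(a_docs), term_freq(b_docs)
    vocab = set(fa) | set(fb)
    return sorted(vocab, key=lambda w: fb.get(w, 0) - fa.get(w, 0))

ranking = differential_terms(aviation_jobs, aerospace_jobs)
print(ranking[0], ranking[-1])  # pilot engineer
```

Even this crude frequency contrast surfaces the split the paper reports: pilot- and mechanic-type terms on the aviation side, engineering-type terms on the aerospace side. A production pipeline would add tokenization, stop-word removal, and TF-IDF or similar weighting.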
Procedia PDF Downloads 273
24052 Continual Learning Using Data Generation for Hyperspectral Remote Sensing Scene Classification
Authors: Samiah Alammari, Nassim Ammour
Abstract:
When a massive number of tasks is provided successively to a deep learning process, good model performance requires preserving the data of previous tasks in order to retrain the model for each upcoming classification; otherwise, the model performs poorly due to the catastrophic forgetting phenomenon. To overcome this shortcoming, we developed a successful continual learning deep model for remote sensing hyperspectral image region classification. The proposed neural network architecture encapsulates two trainable subnetworks. The first module adapts its weights by minimizing the discrimination error between the land-cover classes during new task learning, and the second module learns to replicate the data of previous tasks by discovering the latent data structure of the new task's dataset. We conducted experiments on the Indian Pines HSI dataset. The results confirm the capability of the proposed method.
Keywords: continual learning, data reconstruction, remote sensing, hyperspectral image segmentation
Procedia PDF Downloads 266