Search results for: Dynamic Data Envelopment Analysis

41602 The Right to Data Portability and Its Influence on the Development of Digital Services

Authors: Roman Bieda

Abstract:

The General Data Protection Regulation (GDPR) will come into force on 25 May 2018, creating a new legal framework for the protection of personal data in the European Union. Article 20 of the GDPR introduces a right to data portability. This right allows data subjects to receive the personal data which they have provided to a data controller in a structured, commonly used and machine-readable format, and to transmit those data to another data controller. The right to data portability, by facilitating the transfer of personal data between IT environments (e.g., applications), will also facilitate changing the provider of services (e.g., changing a bank or a cloud computing service provider). It will therefore contribute to the development of competition and the digital market. The aim of this paper is to discuss the right to data portability and its influence on the development of new digital services.

Keywords: data portability, digital market, GDPR, personal data

Procedia PDF Downloads 454
41601 End-to-End Spanish-English Sequence Learning Translation Model

Authors: Vidhu Mitha Goutham, Ruma Mukherjee

Abstract:

The low availability of well-trained, unlimited, dynamic-access models for specific languages makes it hard for corporate users to adopt quick translation techniques and incorporate them into product solutions. As translation tasks increasingly require dynamic sequence learning, stable, cost-free open-source models are scarce. We survey and compare current translation techniques and propose a modified sequence-to-sequence model augmented with attention techniques. Sequence learning using an encoder-decoder model is now paving the path for higher precision levels in translation. Using a Convolutional Neural Network (CNN) encoder and a Recurrent Neural Network (RNN) decoder, we use Fairseq tools to produce an end-to-end, bilingually trained Spanish-English machine translation model that includes source language detection. We achieve competitive results with a model trained on a dual-language corpus, providing a prospective, ready-made plug-in for compound-sentence and document translation. Our model serves as a capable system for large, organizational data translation needs. While acknowledging its shortcomings and future scope, we present it as a well-optimized deep neural network model and solution.
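
As a rough illustration of the architecture described above (not the authors' released code), the sketch below wires a convolutional encoder to a recurrent decoder with dot-product attention in PyTorch; all layer sizes are illustrative assumptions, and a real Fairseq setup would additionally handle tokenization, batching, and beam search.

    import torch
    import torch.nn as nn

    class ConvEncoder(nn.Module):
        def __init__(self, vocab, emb=256, hid=256):
            super().__init__()
            self.embed = nn.Embedding(vocab, emb)
            self.conv = nn.Conv1d(emb, hid, kernel_size=3, padding=1)

        def forward(self, src):                      # src: (batch, src_len)
            x = self.embed(src).transpose(1, 2)      # (batch, emb, src_len)
            return torch.relu(self.conv(x)).transpose(1, 2)  # (batch, src_len, hid)

    class AttnRNNDecoder(nn.Module):
        # Simplified teacher-forced decoder; emb must equal hid for the
        # dot-product attention below to be dimensionally valid.
        def __init__(self, vocab, emb=256, hid=256):
            super().__init__()
            self.embed = nn.Embedding(vocab, emb)
            self.gru = nn.GRU(emb + hid, hid, batch_first=True)
            self.out = nn.Linear(hid, vocab)

        def forward(self, tgt, enc, h=None):
            e = self.embed(tgt)                                # (batch, tgt_len, emb)
            scores = torch.bmm(e, enc.transpose(1, 2))         # dot-product attention
            ctx = torch.bmm(torch.softmax(scores, -1), enc)    # (batch, tgt_len, hid)
            o, h = self.gru(torch.cat([e, ctx], -1), h)
            return self.out(o), h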

Keywords: attention, encoder-decoder, Fairseq, Seq2Seq, Spanish, translation

Procedia PDF Downloads 160
41600 Anomaly Detection in Financial Markets Using Tucker Decomposition

Authors: Salma Krafessi

Abstract:

The financial markets have a multifaceted, intricate environment, and enormous volumes of data are produced every day. Accurate anomaly identification in this data is essential for finding investment possibilities, possible fraudulent activity, and market oddities. Conventional methods for detecting anomalies frequently fail to capture the complex organization of financial data. In order to improve the identification of abnormalities in financial time series data, this study presents Tucker Decomposition as a reliable multi-way analysis approach. We start by gathering closing prices for the S&P 500 index across a number of decades. The information is converted to a three-dimensional tensor format, which contains internal characteristics and temporal sequences in a sliding-window structure. The tensor is then broken down using Tucker Decomposition into a core tensor and matching factor matrices, allowing latent patterns and relationships in the data to be captured. The reconstruction error from the Tucker Decomposition is a possible sign of abnormalities: by setting a statistical threshold, we are able to identify large deviations that indicate unusual behavior. A thorough examination that contrasts the Tucker-based method with traditional anomaly detection approaches validates our methodology. The outcomes demonstrate the superiority of Tucker Decomposition in identifying intricate and subtle abnormalities that are otherwise missed. This work opens the door for more research into multi-way data analysis approaches across a range of disciplines and emphasizes the value of tensor-based methods in financial analysis.
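
A minimal sketch of the pipeline described above, using the open-source tensorly package and synthetic stand-in prices; the window length, Tucker ranks, and three-sigma threshold are illustrative assumptions, not the study's settings.

    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import tucker

    # Placeholder daily closing prices (the paper uses S&P 500 data)
    prices = np.cumsum(np.random.randn(1000)) + 1000.0
    rets = np.diff(np.log(prices))
    feats = np.stack([prices[1:], rets], axis=1)           # daily (price, return) features

    win = 30
    windows = np.stack([feats[i:i + win] for i in range(len(feats) - win)])
    tensor = tl.tensor(windows)                            # (n_windows, win, n_features)

    core, factors = tucker(tensor, rank=[10, 5, 2])        # illustrative ranks
    recon = tl.tucker_to_tensor((core, factors))
    err = np.linalg.norm(windows - recon, axis=(1, 2))     # per-window reconstruction error

    threshold = err.mean() + 3 * err.std()                 # statistical threshold
    anomalous_windows = np.flatnonzero(err > threshold)
    print("flagged windows:", anomalous_windows)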

Keywords: tucker decomposition, financial markets, financial engineering, artificial intelligence, decomposition models

Procedia PDF Downloads 41
41599 Analyzing Keyword Networks for the Identification of Correlated Research Topics

Authors: Thiago M. R. Dias, Patrícia M. Dias, Gray F. Moita

Abstract:

The production and publication of scientific works have increased significantly in recent years, with the Internet as the main means of access and distribution of these works. Consequently, there is growing interest in understanding how scientific research has evolved, in order to exploit this knowledge and encourage research groups to become more productive. The objective of this work is therefore to explore repositories containing data on scientific publications and to characterize the keyword networks of these publications, in order to identify the most relevant keywords and to highlight those that have the greatest impact on the network. To do this, the keywords of each article in the study repository are extracted and used to characterize the network, after which several social network analysis metrics are applied to identify the highlighted keywords.
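
A small sketch of this kind of keyword-network characterization, assuming a networkx-based implementation (the keyword lists below are placeholders, not the study's repository):

    import itertools
    import networkx as nx

    # Illustrative publication records; in the paper, keywords are extracted per article
    papers = [
        ["data mining", "bibliometrics", "social networks"],
        ["bibliometrics", "scientometrics"],
        ["social networks", "data mining", "scientometrics"],
    ]

    G = nx.Graph()
    for kws in papers:
        for a, b in itertools.combinations(sorted(set(kws)), 2):
            w = G[a][b]["weight"] + 1 if G.has_edge(a, b) else 1
            G.add_edge(a, b, weight=w)   # edge weight = co-occurrence count

    # Social-network-analysis metrics to highlight influential keywords
    degree = nx.degree_centrality(G)
    betweenness = nx.betweenness_centrality(G, weight="weight")
    for kw in sorted(G, key=degree.get, reverse=True):
        print(f"{kw}: degree={degree[kw]:.2f}, betweenness={betweenness[kw]:.2f}")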

Keywords: bibliometrics, data analysis, extraction and data integration, scientometrics

Procedia PDF Downloads 228
41598 A Novel Heuristic for Analysis of Large Datasets by Selecting Wrapper-Based Features

Authors: Bushra Zafar, Usman Qamar

Abstract:

Large data sample sizes and dimensions undermine the effectiveness of conventional data mining methodologies. Data mining techniques are important tools for collecting knowledge from a variety of databases; they provide supervised learning in the form of classification, designing models that describe vital data classes, with the structure of the classifier based on the class attribute. Classification efficiency and accuracy are often influenced to a great extent by noisy and undesirable features in real application data sets. The inherent nature of a data set greatly masks its quality analysis and leaves us with few practical approaches to use. To our knowledge, we present for the first time an approach for investigating the structure and quality of datasets by providing a targeted analysis of the localization of noisy and irrelevant features. Machine learning relies on feature selection as a pre-processing step, which allows us to select a subset of features by reducing the space according to a certain evaluation criterion. The primary objective of this study is to trim down the scope of a given data sample by searching for a small set of important features which may result in good classification performance. For this purpose, a heuristic for wrapper-based feature selection using a genetic algorithm is used, with an external classifier for discriminative feature selection. Features are selected based on their number of occurrences in the chosen chromosomes. Sample datasets are used to demonstrate the proposed idea. The proposed method improves the average accuracy across different datasets to about 95%, and experimental results illustrate that the proposed algorithm increases the accuracy of prediction of different diseases.
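
A hedged sketch of one way such a wrapper could look: a genetic algorithm whose chromosomes are feature bitmasks, scored by cross-validated KNN accuracy. The population size, generation count, and mutation rate are illustrative assumptions, not the authors' settings.

    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_breast_cancer(return_X_y=True)   # stand-in medical dataset
    rng = np.random.default_rng(0)
    n_feat, pop_size, gens = X.shape[1], 20, 15

    def fitness(mask):
        if not mask.any():
            return 0.0
        clf = KNeighborsClassifier(n_neighbors=5)
        return cross_val_score(clf, X[:, mask], y, cv=3).mean()

    pop = rng.random((pop_size, n_feat)) < 0.5   # random initial bitmasks
    for _ in range(gens):
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]        # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_feat)                         # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_feat) < 0.02                      # mutation
            children.append(child ^ flip)
        pop = np.vstack([parents, np.array(children)])

    best = pop[np.argmax([fitness(m) for m in pop])]
    print("selected features:", np.flatnonzero(best), "accuracy:", fitness(best))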

Keywords: data mining, genetic algorithm, KNN algorithms, wrapper-based feature selection

Procedia PDF Downloads 304
41597 Effect of Traffic Volume and Its Composition on Vehicular Speed under Mixed Traffic Conditions: A Kriging Based Approach

Authors: Subhadip Biswas, Shivendra Maurya, Satish Chandra, Indrajit Ghosh

Abstract:

The use of speed prediction models sometimes appears to be a feasible alternative to laborious field measurement, particularly when field data cannot fulfill the designer's requirements. However, developing speed models is a challenging task, specifically in the context of developing countries like India, where vehicles with diverse static and dynamic characteristics use the same right of way without any segregation. Here the traffic composition plays a significant role in determining vehicular speed. The present research examines the effects of traffic volume and its composition on vehicular speed under mixed traffic conditions. Classified traffic volume and speed data were collected from geometrically identical six-lane divided arterials in New Delhi. Based on these field data, speed prediction models were developed for each vehicle category by adopting the Kriging approximation technique, an alternative to commonly used regression. The models were validated against a data set kept aside for validation purposes. The predicted speeds showed a great deal of agreement with the observed values, and the models also outperformed all existing speed models. Finally, the proposed models were used to evaluate the effect of traffic volume and its composition on speed.
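
Kriging is closely related to Gaussian process regression, so a minimal sketch of this kind of speed model can be built with scikit-learn; the features, kernel length scales, and data values below are illustrative placeholders, not the study's calibrated model.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    # Illustrative field data: [total volume (veh/h), heavy-vehicle share] -> car speed (km/h)
    X = np.array([[1200, 0.10], [2400, 0.15], [3600, 0.20],
                  [4800, 0.25], [6000, 0.30]], dtype=float)
    y = np.array([62.0, 55.0, 48.0, 41.0, 33.0])

    kernel = ConstantKernel(1.0) * RBF(length_scale=[1000.0, 0.1])
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

    speed, sd = gp.predict(np.array([[3000, 0.18]]), return_std=True)
    print(f"predicted speed: {speed[0]:.1f} km/h (+/- {sd[0]:.1f})")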

Keywords: speed, Kriging, arterial, traffic volume

Procedia PDF Downloads 338
41596 Autonomic Threat Avoidance and Self-Healing in Database Management System

Authors: Wajahat Munir, Muhammad Haseeb, Adeel Anjum, Basit Raza, Ahmad Kamran Malik

Abstract:

Databases are key components of software systems. Given the exponential growth of data, a central concern is that data remain accurate and available. The data in databases are vulnerable to internal and external threats, especially when they contain sensitive information, as in medical or military applications. Whenever data are changed with malicious intent, data analysis results may lead to disastrous decisions. Autonomic self-healing in computer systems is modeled on the autonomic system of the human body. In order to guarantee the accuracy and availability of data, we propose a technique which, on a priority basis, tries to prevent any malicious transaction from executing, and, in case a malicious transaction does affect the system, heals the system in an isolated mode in such a way that the availability of the system is not compromised. Using this autonomic system, the management cost and time of DBAs can be minimized. Finally, we test our model and present the findings.

Keywords: autonomic computing, self-healing, threat avoidance, security

Procedia PDF Downloads 490
41595 Exploring the Role of Data Mining in Crime Classification: A Systematic Literature Review

Authors: Faisal Muhibuddin, Ani Dijah Rahajoe

Abstract:

This in-depth exploration, through a systematic literature review, scrutinizes the nuanced role of data mining in the classification of criminal activities. The research focuses on investigating various methodological aspects and recent developments in leveraging data mining techniques to enhance the effectiveness and precision of crime categorization. Commencing with an exposition of the foundational concepts of crime classification and its evolutionary dynamics, this study details the paradigm shift from conventional methods towards approaches supported by data mining, addressing the challenges and complexities inherent in the modern crime landscape. Specifically, the research delves into various data mining techniques, including K-means clustering, Naïve Bayes, K-nearest neighbour, and clustering methods. A comprehensive review of the strengths and limitations of each technique provides insights into their respective contributions to improving crime classification models. The integration of diverse data sources takes centre stage in this research. A detailed analysis explores how the amalgamation of structured data (such as criminal records) and unstructured data (such as social media) can offer a holistic understanding of crime, enriching classification models with more profound insights. Furthermore, the study explores the temporal implications in crime classification, emphasizing the significance of considering temporal factors to comprehend long-term trends and seasonality. The availability of real-time data is also elucidated as a crucial element in enhancing responsiveness and accuracy in crime classification.

Keywords: data mining, classification algorithm, naïve Bayes, k-means clustering, k-nearest neighbor, crime, data analysis, systematic literature review

Procedia PDF Downloads 46
41594 Confirmatory Factor Analysis of Smartphone Addiction Inventory (SPAI) in the Yemeni Environment

Authors: Mohammed Al-Khadher

Abstract:

Currently, we are witnessing rapid advancements in the field of information and communications technology, forcing us, as psychologists, to address the psychological and social effects of such developments. It also drives us to continually develop and prepare measurement tools compatible with the changes brought about by the digital revolution. In this context, the current study aimed to examine the factor structure of the Smartphone Addiction Inventory (SPAI) in the Republic of Yemen. The sample consisted of 1,920 university students (1,136 males and 784 females) who answered the inventory, and the data were analyzed using AMOS v25. The results showed a good fit of the five-factor model to the data, with excellent indices: RMSEA = .052, CFI = .910, GFI = .931, AGFI = .915, TLI = .897, NFI = .895, RFI = .880, and RMR = .032, all within the ranges required to support the factor structure of the scale. The confirmatory factor analysis showed that 4 items loaded on Time Spent, 4 items on Compulsivity, 8 items on Daily Life Interference, 5 items on Craving, and 3 items on Sleep Interference; all standardized factor loadings were statistically significant at the .001 level.
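
For readers without AMOS, an analogous five-factor CFA can be sketched with the open-source semopy package; the item-to-factor assignments mirror the counts reported above, but all item names and the data file are placeholder assumptions.

    import pandas as pd
    import semopy

    # Five-factor CFA specification in lavaan-style syntax; item names are placeholders
    desc = """
    TimeSpent    =~ i1 + i2 + i3 + i4
    Compulsivity =~ i5 + i6 + i7 + i8
    DailyLife    =~ i9 + i10 + i11 + i12 + i13 + i14 + i15 + i16
    Craving      =~ i17 + i18 + i19 + i20 + i21
    Sleep        =~ i22 + i23 + i24
    """

    data = pd.read_csv("spai_responses.csv")   # hypothetical file of item responses
    model = semopy.Model(desc)
    model.fit(data)
    print(model.inspect())                     # loadings and their p-values
    print(semopy.calc_stats(model).T)          # fit indices incl. RMSEA, CFI, GFI, TLI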

Keywords: smartphone addiction inventory (SPAI), confirmatory factor analysis (CFA), yemeni students, people at risk of smartphone addiction

Procedia PDF Downloads 69
41593 A Safety Analysis Method for Multi-Agent Systems

Authors: Ching Louis Liu, Edmund Kazmierczak, Tim Miller

Abstract:

Safety analysis for multi-agent systems is complicated by the potentially nonlinear interactions between agents. This paper proposes a method for analyzing the safety of multi-agent systems by explicitly focusing on interactions and on the accident data of systems that are similar in structure and function to the system being analyzed. The method creates a Bayesian network using the accident data from similar systems. A feature of our method is that the events in the accident data are labeled with HAZOP guide words. Our method uses an ontology to abstract away from the details of a multi-agent implementation. Using the ontology, our method then constructs an “interaction map,” a graphical representation of the patterns of interactions between agents and other artifacts. Interaction maps, combined with statistical data from accidents and the HAZOP classifications of events, can be converted into a Bayesian network. Bayesian networks allow designers to explore “what if” scenarios and make design trade-offs that maintain safety. We show how to use the Bayesian networks and the interaction maps to improve multi-agent system designs.
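
A toy version of the final step — turning HAZOP-labeled interaction events and accident statistics into a queryable Bayesian network — might look as follows with pgmpy; the events and probabilities are invented for illustration.

    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    # Two HAZOP-labeled interaction events influencing the probability of an accident
    net = BayesianNetwork([("NoSignal", "Accident"), ("LateResponse", "Accident")])

    cpd_ns = TabularCPD("NoSignal", 2, [[0.95], [0.05]])
    cpd_lr = TabularCPD("LateResponse", 2, [[0.90], [0.10]])
    cpd_acc = TabularCPD(
        "Accident", 2,
        [[0.999, 0.90, 0.85, 0.40],    # P(no accident | parents)
         [0.001, 0.10, 0.15, 0.60]],   # P(accident | parents), from accident statistics
        evidence=["NoSignal", "LateResponse"], evidence_card=[2, 2])

    net.add_cpds(cpd_ns, cpd_lr, cpd_acc)
    infer = VariableElimination(net)
    # "What if" query: accident risk when a message is lost (NoSignal = 1)
    print(infer.query(["Accident"], evidence={"NoSignal": 1}))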

Keywords: multi-agent system, safety analysis, safety model, interaction map

Procedia PDF Downloads 400
41592 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today’s era, data security is an important concern and among the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble the data so that an intruder is not able to retrieve it; steganography, by contrast, hides the data in a cover file so that the very presence of communication is concealed. This paper presents the implementation of the Ron Rivest, Adi Shamir, and Leonard Adleman (RSA) algorithm with image and audio steganography, and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that the combined techniques perform better than the individual ones. The risk of unauthorized access is mitigated to a certain extent by using these techniques, which could be used in banks, intelligence agencies such as RAW, and other settings where highly confidential data are transferred. Finally, comparisons of the two techniques are given in tabular form.
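
The paper's implementation is in MATLAB; as a hedged Python analogue, the sketch below encrypts a message with DES (via pycryptodome) and hides the ciphertext in an image's least significant bits. ECB mode and the random cover image are purely for brevity of illustration, not a secure configuration.

    import numpy as np
    from Crypto.Cipher import DES   # pycryptodome

    def des_encrypt(msg: bytes, key: bytes) -> bytes:
        pad = 8 - len(msg) % 8                       # pad to DES block size
        return DES.new(key, DES.MODE_ECB).encrypt(msg + bytes([pad]) * pad)

    def lsb_embed(img: np.ndarray, payload: bytes) -> np.ndarray:
        bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
        flat = img.flatten()
        flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits   # overwrite lowest bit
        return flat.reshape(img.shape)

    def lsb_extract(img: np.ndarray, n_bytes: int) -> bytes:
        bits = img.flatten()[: n_bytes * 8] & 1
        return np.packbits(bits).tobytes()

    cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in cover image
    cipher = des_encrypt(b"confidential report", b"8bytekey")
    stego = lsb_embed(cover, cipher)
    assert lsb_extract(stego, len(cipher)) == cipher             # recover ciphertext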

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 272
41591 The Development of the Website Learning the Local Wisdom in Phra Nakhon Si Ayutthaya Province

Authors: Bunthida Chunngam, Thanyanan Worasesthaphong

Abstract:

The objectives of this research were to develop a website for learning the local wisdom of Phra Nakhon Si Ayutthaya province and to study the satisfaction of system users. The research sample was a multistage sample of 100 questionnaires; the reliability of the instrument, calculated with Cronbach's alpha coefficient, was α = 0.82. The system was evaluated on three aspects: system usability, system features, and system accuracy. Descriptive statistics (frequency, percentage, mean, and standard deviation) were used to describe the sample. The analysis found that system usability received a good level of satisfaction (mean = 4.44), system features received a good level of satisfaction (mean = 4.11), and system accuracy received a good level of satisfaction (mean = 3.74).
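
For reference, Cronbach's alpha as used above is α = k/(k−1) · (1 − Σσ²ᵢ/σ²ₜ), where k is the number of items, σ²ᵢ the variance of item i, and σ²ₜ the variance of the total score. A small self-contained computation (with made-up Likert responses) is:

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: (n_respondents, n_items) matrix of questionnaire scores."""
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()     # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)      # variance of total score
        return k / (k - 1) * (1 - item_var / total_var)

    # Illustrative responses from 5 participants on 4 Likert items
    scores = np.array([[4, 5, 4, 4], [3, 4, 4, 3], [5, 5, 5, 4],
                       [2, 3, 3, 2], [4, 4, 5, 4]])
    print(f"alpha = {cronbach_alpha(scores):.2f}")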

Keywords: website, learning, local wisdom, Phra Nakhon Si Ayutthaya province

Procedia PDF Downloads 104
41590 A Multi-Agent Urban Traffic Simulator for Generating Autonomous Driving Training Data

Authors: Florin Leon

Abstract:

This paper describes a simulator of traffic scenarios tailored to facilitate the training of autonomous driving models for urban environments. With the rising prominence of self-driving vehicles, the need for diverse training datasets is pressing. The proposed simulator provides a flexible framework that allows the generation of the custom scenarios needed for the validation and enhancement of trajectory prediction algorithms. Its controlled yet dynamic environment addresses the challenges associated with real-world data acquisition and ensures adaptability to diverse driving scenarios. By providing an adaptable solution for scenario creation and algorithm testing, this tool proves to be a valuable resource for advancing autonomous driving technology toward safe and efficient self-driving vehicles.

Keywords: autonomous driving, car simulator, machine learning, model training, urban simulation environment

Procedia PDF Downloads 33
41589 Probability Sampling in Matched Case-Control Study in Drug Abuse

Authors: Surya R. Niraula, Devendra B Chhetry, Girish K. Singh, S. Nagesh, Frederick A. Connell

Abstract:

Background: Although random sampling is generally considered to be the gold standard for population-based research, the majority of drug abuse research is based on non-random sampling despite the well-known limitations of this kind of sampling. Method: We compared the statistical properties of two surveys of drug abuse in the same community: one using snowball sampling of drug users who then identified “friend controls,” and the other using a random sample of non-drug users (controls) who then identified “friend cases.” Models to predict drug abuse based on risk factors were developed for each data set using conditional logistic regression. We compared the precision of each model using the bootstrap method and the predictive properties of each model using receiver operating characteristic (ROC) curves. Results: Analysis of 100 random bootstrap samples drawn from the snowball-sample data set showed wide variation in the standard errors of the beta coefficients of the predictive model, none of which achieved statistical significance. On the other hand, bootstrap analysis of the random-sample data set showed less variation and did not change the significance of the predictors at the 5% level when compared to the non-bootstrap analysis. The area under the ROC curve for the model derived from the random-sample data set was similar when fitted to either data set (0.93 for the random-sample data vs. 0.91 for the snowball-sample data, p = 0.35); however, when the model derived from the snowball-sample data set was fitted to each of the data sets, the areas under the curve were significantly different (0.98 vs. 0.83, p < .001). Conclusion: The proposed method of random sampling of controls appears to be superior from a statistical perspective to snowball sampling and may represent a viable alternative to snowball sampling.
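
A simplified sketch of the bootstrap-precision and ROC comparison described above; ordinary (unconditional) logistic regression stands in for the study's conditional logistic regression, and the data are synthetic.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    # Illustrative case-control data: rows = subjects, cols = risk factors
    X = rng.normal(size=(200, 3))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=200) > 0).astype(int)

    coefs = []
    for _ in range(100):                      # 100 bootstrap resamples, as in the study
        idx = rng.integers(0, len(y), len(y))
        m = LogisticRegression().fit(X[idx], y[idx])
        coefs.append(m.coef_[0])
    print("bootstrap SE of coefficients:", np.array(coefs).std(axis=0))

    model = LogisticRegression().fit(X, y)
    print("AUC:", roc_auc_score(y, model.predict_proba(X)[:, 1]))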

Keywords: drug abuse, matched case-control study, non-probability sampling, probability sampling

Procedia PDF Downloads 477
41588 Customers’ Acceptability of Islamic Banking: Employees’ Perspective in Peshawar

Authors: Tahira Imtiaz, Karim Ullah

Abstract:

This paper incorporates bank employees’ perspectives on the acceptability of Islamic banking to customers in Peshawar. A qualitative approach was adopted, with six in-depth interviews conducted with employees of Islamic banks. The employees were asked to share their experience of customers’ attitudes toward Islamic banking. The collected data were analyzed using thematic analysis and synthesized with the current literature. Through the data analysis, a theoretical framework was developed which highlights the factors that drive customers toward Islamic banking, as witnessed by the employees. The analysis suggests that a new model could be developed on the basis of four determinants of human preference, namely inner satisfaction, time, faith, and market forces.

Keywords: customers’ attraction, employees’ perspective, Islamic banking, Riba

Procedia PDF Downloads 314
41587 Simulation of Stretching and Fragmenting DNA by Microfluidic for Optimizing Microfluidic Devices

Authors: Shuyi Wu, Chuang Li, Quanshui Zheng, Luping Xu

Abstract:

Stretching and snipping DNA molecules by microfluidics has important applications in gene analysis on a lab-on-a-chip. The movement, deformation, and fragmenting of DNA in microfluidics are typical fluid-solid coupling problems, and an efficient, common simulation system for studying them has not been well developed. In our study, the Brownian dynamics-finite element method (BD-FEM) is used to simulate the dynamic process of stretching and fragmenting DNA by contraction flow. The shape and parameters of the micro-channels are varied to optimize the stretching and fragmenting properties of DNA. Our results indicate that the strain rate resulting from the contraction microchannel is the main control parameter for stretching and fragmenting DNA. The simulation data show good consistency with previous experimental results on single-DNA-molecule behavior and averaged fragmenting properties. The BD-FEM is an efficient computational tool for studying the stretching and fragmenting behavior of single DNA molecules and for optimizing microfluidic devices that manipulate, stretch, and fragment DNA.

Keywords: fragmenting, DNA, microfluidic, optimization

Procedia PDF Downloads 310
41586 Dynamic Test for Sway-Mode Buckling of Columns

Authors: Boris Blostotsky, Elia Efraim

Abstract:

Testing of columns in sway mode is performed in order to determine the maximal allowable load, limited by plastic deformation of the columns or their end connections, and the critical load, limited by column stability. The motivation for determining an accurate value of the critical force arises from its uses, as follows: the critical load is the maximal allowable load for a given column configuration and can be used as a criterion of perfection; it is used in calculations prescribed by standards for the design of structural elements under combined compression and bending; and it is used to verify theoretical stability analyses for various column end conditions. In the present work, a new non-destructive method for determining a column's critical buckling load in sway mode is proposed. The method allows measurements to be performed under loads that exceed the column's critical load without loss of stability. The possibility of such loading is achieved by the structure of the loading system, which is built as a frame with a rigid girder; one of the columns is the tested column and the other is an additional two-hinged strut. The frame is loaded by a flexible traction element attached to the girder. Through the choice of parameters of the traction element and the additional strut, the load applied to the tested column can reach values that exceed the critical load. The lateral stiffness of the system and the critical load of the column are obtained by the dynamic method. Experiment planning and the comparison between experimental and theoretical values were based on the developed dependence of the lateral stiffness of the system on the vertical load, taking into account the semi-rigid connections of the column's ends. Good agreement between the obtained results was established. The method can be used for testing real full-size columns in industrial conditions.
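
A brief note on the relation underlying such dynamic methods (our paraphrase of the standard approach, not the authors' exact formulation): for a sway frame, the lateral stiffness typically decreases almost linearly with the vertical load,

    k(P) \approx k_0 \left(1 - \frac{P}{P_{cr}}\right),

so measuring the natural frequency ω at several load levels gives k = ω²m, and extrapolating the fitted line to k = 0 yields the critical load P_cr without driving the tested column itself to instability.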

Keywords: buckling, columns, dynamic method, semi-rigid connections, sway mode

Procedia PDF Downloads 299
41585 Seismic Performance Evaluation of Existing Building Using Structural Information Modeling

Authors: Byungmin Cho, Dongchul Lee, Taejin Kim, Minhee Lee

Abstract:

The procedure for the seismic retrofit of existing buildings includes a seismic evaluation step in which it is assessed whether the buildings perform satisfactorily under seismic load; based on the results, the buildings are upgraded. Evaluating the seismic performance of buildings usually requires a model transformation from elastic analysis to inelastic analysis. However, when the data are not transferred automatically between the two models, engineers must input them manually, which leads to loss of information and makes the analysis results less accurate. Therefore, this study proposes a process for the seismic evaluation of existing buildings using structural information modeling, which makes the work economical and accurate. To this end, the process for seismic evaluation based on ASCE 41 was investigated to determine which parts could be computerized. The structural information modeling process was developed for seismic evaluation using the Perform 3D program, which is commonly used for nonlinear response history analysis. To validate the process, the seismic performance of an existing building is investigated.

Keywords: existing building, nonlinear analysis, seismic performance, structural information modeling

Procedia PDF Downloads 365
41584 Dynamic Cardiac Mitochondrial Proteome Alterations after Ischemic Preconditioning

Authors: Abdelbary Prince, Said Moussa, Hyungkyu Kim, Eman Gouda, Jin Han

Abstract:

We compared the dynamic alterations of the mitochondrial proteome in control, ischemia-reperfusion (IR), and ischemic preconditioned (IPC) rabbit hearts. Using 2-DE, we identified 29 mitochondrial proteins that were differentially expressed in the IR heart compared with the control and IPC hearts. For two of the spots, the expression patterns were confirmed by Western blot analysis. These proteins included succinate dehydrogenase complex, acyl-CoA dehydrogenase, carnitine acetyltransferase, dihydrolipoamide dehydrogenase, ATPase, ATP synthase, dihydrolipoamide succinyltransferase, ubiquinol-cytochrome c reductase, translation elongation factor, actin alpha, succinyl-CoA ligase, dihydrolipoamide S-succinyltransferase, citrate synthase, acetyl-coenzyme A dehydrogenase, creatine kinase, isocitrate dehydrogenase, pyruvate dehydrogenase, prohibitin, NADH dehydrogenase (ubiquinone) Fe-S protein, enoyl-coenzyme A hydratase, superoxide dismutase [Mn], and the 24-kDa subunit of complex I. Interestingly, most of these proteins are associated with the mitochondrial respiratory chain, the antioxidant enzyme system, and energy metabolism. The results provide clues to the cardioprotective mechanism of ischemic preconditioning at the protein level and may serve as potential biomarkers for the detection of ischemia-induced cardiac injury.

Keywords: ischemic preconditioning, mitochondria, proteome, cardioprotection

Procedia PDF Downloads 333
41583 Estimates of Freshwater Content from ICESat-2 Derived Dynamic Ocean Topography

Authors: Adan Valdez, Shawn Gallaher, James Morison, Jordan Aragon

Abstract:

Global climate change has raised atmospheric temperatures, contributing to rising sea levels, decreasing sea ice, and increased freshening of high-latitude oceans. This freshening has increased stratification, inhibiting local mixing and nutrient transport and modifying regional circulations in polar oceans. In recent years, the Western Arctic has seen an increase in freshwater volume at an average rate of 397 ± 116 km³/year. The majority of the freshwater volume resides in the Beaufort Gyre surface lens, driven by anticyclonic wind forcing, sea ice melt, and Arctic river runoff. The total climatological freshwater content is typically defined as water fresher than a reference salinity of 34.8. The near-isothermal nature of Arctic seawater and non-linearities in the equation of state for near-freezing waters result in a salinity-driven pycnocline, as opposed to the temperature-driven density structure seen at lower latitudes. In this study, we investigate the relationship between freshwater content and remotely sensed dynamic ocean topography (DOT). In-situ measurements of freshwater content are useful in providing information on the freshening rate of the Beaufort Gyre; however, their collection is costly and time consuming. Dynamic ocean topography derived from NASA's Advanced Topographic Laser Altimeter System (ATLAS) and freshwater content derived from Airborne Expendable CTDs (AXCTDs) are used to develop a linear regression model. In-situ data for the regression model are collected along the 150° West meridian, which typically defines the centerline of the Beaufort Gyre. Two freshwater content models are determined by integrating the freshwater volume between the surface and the isopycnals corresponding to reference salinities of 28.7 and 34.8, which correspond to the winter pycnocline and the total climatological freshwater content, respectively. Using each model, we determine the strength of the linear relationship between freshwater content and satellite-derived DOT. The results of this modeling study could provide a future predictive capability for freshwater volume changes in the Beaufort-Chukchi Sea using non-in-situ methods. Successful employment of the ICESat-2 DOT approximation of freshwater content could reduce reliance on field deployment platforms to characterize physical ocean properties.
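
For reference, the freshwater content integral described above follows the standard oceanographic definition, FWC = ∫ (S_ref − S(z))/S_ref dz, taken from the depth of the S_ref isohaline to the surface. A hedged sketch of the computation and the linear DOT regression, with placeholder numbers, is:

    import numpy as np

    def freshwater_content(z, S, S_ref=34.8):
        """Integrate (S_ref - S)/S_ref over depths where S < S_ref.
        z: depths (m), S: salinity profile at those depths."""
        mask = S < S_ref
        return np.trapz((S_ref - S[mask]) / S_ref, z[mask])

    # Linear regression of in-situ FWC on ICESat-2 DOT (illustrative placeholder arrays)
    dot = np.array([0.10, 0.25, 0.40, 0.55])    # dynamic ocean topography (m)
    fwc = np.array([8.0, 14.0, 19.5, 26.0])     # freshwater content (m)
    slope, intercept = np.polyfit(dot, fwc, 1)
    print(f"FWC ~ {slope:.1f} * DOT + {intercept:.1f}")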

Keywords: ICESat-2, dynamic ocean topography, freshwater content, beaufort gyre

Procedia PDF Downloads 62
41582 TAXAPRO, A Streamlined Pipeline to Analyze Shotgun Metagenomes

Authors: Sofia Sehli, Zainab El Ouafi, Casey Eddington, Soumaya Jbara, Kasambula Arthur Shem, Islam El Jaddaoui, Ayorinde Afolayan, Olaitan I. Awe, Allissa Dillman, Hassan Ghazal

Abstract:

The ability to promptly sequence whole genomes at a relatively low cost has revolutionized the way we study the microbiome. Microbiologists are no longer limited to studying what can be grown in a laboratory and instead are given the opportunity to rapidly identify the makeup of microbial communities in a wide variety of environments. Analyzing whole genome sequencing (WGS) data is a complex process that involves multiple moving parts and might be rather unintuitive for scientists who don't typically work with this type of data. Thus, to help lower the barrier for less computationally inclined individuals, TAXAPRO was developed at the first Omics Codeathon, held virtually by the African Society for Bioinformatics and Computational Biology (ASBCB) in June 2021. TAXAPRO is an advanced metagenomics pipeline that accurately assembles organelle genomes from whole-genome sequencing data. TAXAPRO seamlessly combines WGS analysis tools to create a pipeline that automatically processes raw WGS data and presents organism abundance information in both tabular and graphical formats. TAXAPRO was evaluated using COVID-19 patient gut microbiome data. Analysis performed by TAXAPRO demonstrated a high abundance of Clostridia and Bacteroidia genera and a low abundance of Proteobacteria genera relative to others in the gut microbiome of patients hospitalized with COVID-19, consistent with the original findings derived using a different analysis methodology. This provides crucial evidence that the TAXAPRO workflow delivers reliable organism abundance information overnight, without the hassle of performing the analysis manually.

Keywords: metagenomics, shotgun metagenomic sequence analysis, COVID-19, pipeline, bioinformatics

Procedia PDF Downloads 189
41581 Numerical Investigation of Dynamic Stall over a Wind Turbine Pitching Airfoil by Using OpenFOAM

Authors: Mahbod Seyednia, Shidvash Vakilipour, Mehran Masdari

Abstract:

Computations of two-dimensional flow past stationary and harmonically pitching wind turbine airfoils at a moderate Reynolds number (400,000) are carried out by progressively increasing the angle of attack for the stationary airfoil and at fixed pitching frequencies for the pitching one. The incompressible Navier-Stokes equations, in conjunction with Unsteady Reynolds-Averaged Navier-Stokes (URANS) turbulence modeling, are solved with the OpenFOAM package to investigate the aerodynamic phenomena that occur under stationary and pitching conditions on a NACA 6-series wind turbine airfoil. The aim of this study is to enhance the accuracy of numerical simulation in predicting the aerodynamic behavior of an oscillating airfoil in OpenFOAM. Hence, for turbulence modeling, k-ω SST with a low-Reynolds correction is employed to capture the unsteady phenomena occurring in the stationary and oscillating motion of the airfoil. Using aerodynamic and pressure coefficients along with flow patterns, the unsteady aerodynamics in the pre-, near-, and post-static-stall regions are analyzed for the harmonically pitching airfoil, and the results are validated against the corresponding experimental data possessed by the authors. The results indicate that the chosen turbulence model leads to accurate prediction of the static stall angle for the stationary airfoil and of flow separation, the dynamic stall phenomenon, and flow reattachment for the pitching one. Due to the geometry of the studied 6-series airfoil, the vortex on the upper surface of the airfoil during upstrokes forms at the trailing edge. Therefore, the flow pattern obtained in our numerical simulations represents the formation and evolution of the trailing-edge vortex in the near- and post-stall regions, where this process determines the dynamic stall phenomenon.

Keywords: CFD, moderate Reynolds number, OpenFOAM, pitching oscillation, unsteady aerodynamics, wind turbine

Procedia PDF Downloads 188
41580 Adsorption Kinetics and Equilibria at an Air-Liquid Interface of Biosurfactant and Synthetic Surfactant

Authors: Sagheer A. Onaizi

Abstract:

The adsorption of an anionic biosurfactant (surfactin) and an anionic synthetic surfactant (sodium dodecylbenzenesulphonate, abbreviated SDOBS) from phosphate buffer containing high concentrations of co- and counter-ions to the air-buffer interface has been investigated. The self-assembly of the two surfactants at the interface was monitored through dynamic surface tension measurements. The equilibrium surface pressure-surfactant concentration data in the premicellar region were regressed using the Gibbs adsorption equation. The predicted surface saturations for SDOBS and surfactin are and, respectively. The occupied area per an SDOBS molecule at the interface saturation condition is while that occupied by a surfactin molecule is. The surface saturations reported in this work for both surfactants are in very good agreement with those obtained using expensive techniques such as neutron reflectometry, suggesting that surface tension measurements coupled with appropriate theoretical analysis can provide information comparable to that obtained using highly sophisticated techniques.
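
For context, the Gibbs adsorption analysis referred to above conventionally takes the form (our restatement of the standard equations, not the authors'):

    \Gamma_{max} = -\frac{1}{nRT}\,\frac{\partial \gamma}{\partial \ln C},
    \qquad A_{min} = \frac{1}{N_A \, \Gamma_{max}}

where γ is the surface tension, C the surfactant concentration, n the Gibbs prefactor set by the ionic conditions (n = 1 with swamping electrolyte), Γ_max the surface saturation, N_A Avogadro's number, and A_min the limiting area per molecule.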

Keywords: adsorption, air-liquid interface, biosurfactant, surface tension

Procedia PDF Downloads 694
41579 An Intelligent Prediction Method for Annular Pressure Driven by Mechanism and Data

Authors: Zhaopeng Zhu, Xianzhi Song, Gensheng Li, Shuo Zhu, Shiming Duan, Xuezhe Yao

Abstract:

Accurate calculation of wellbore pressure is of great significance for preventing wellbore risk during drilling. The traditional mechanism model requires many iterative solution procedures, which reduces calculation efficiency and makes it difficult to meet the demand for dynamic control of wellbore pressure. In recent years, many scholars have introduced artificial intelligence algorithms into wellbore pressure calculation, significantly improving its efficiency and accuracy. However, due to the 'black box' property of intelligent algorithms, existing intelligent wellbore pressure models struggle outside the scope of their training data and overreact to data noise, often producing abnormal calculation results. In this study, the multiphase flow mechanism is embedded into the objective function of a neural network model as a constraint condition, and an intelligent prediction model of wellbore pressure under this constraint is established based on more than 400,000 sets of pressure measurement while drilling (MPD) data. The multiphase flow constraint makes the predictions of the neural network model more consistent with the distribution law of wellbore pressure, overcoming the black-box attribute of the neural network to some extent. In particular, accuracy on an independent test data set is further improved, and abnormal calculated values essentially disappear. This method, driven jointly by MPD data and the multiphase flow mechanism, is the main way to predict wellbore pressure accurately and efficiently in the future.
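
One plausible reading of "mechanism as a constraint in the objective function" is a penalty term added to the data loss; the PyTorch sketch below uses a deliberately simplified hydrostatic-plus-friction stand-in for the paper's multiphase-flow model, whose exact form the abstract does not give.

    import torch
    import torch.nn as nn

    net = nn.Sequential(nn.Linear(6, 64), nn.ReLU(), nn.Linear(64, 64),
                        nn.ReLU(), nn.Linear(64, 1))

    def mechanism_residual(x, p_pred):
        # Placeholder column layout: density, depth, friction term
        rho, depth, fric = x[:, 0:1], x[:, 1:2], x[:, 2:3]
        p_model = rho * 9.81 * depth + fric        # simplified mechanism estimate
        return (p_pred - p_model).pow(2).mean()

    def loss_fn(x, p_meas, lam=0.1):               # lam weights the mechanism constraint
        p_pred = net(x)
        data_term = (p_pred - p_meas).pow(2).mean()
        return data_term + lam * mechanism_residual(x, p_pred)

    x = torch.rand(32, 6)                          # placeholder MPD feature batch
    p = torch.rand(32, 1)                          # placeholder measured pressures
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    opt.zero_grad()
    loss_fn(x, p).backward()
    opt.step()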

Keywords: multiphase flow mechanism, pressure while drilling data, wellbore pressure, mechanism constraints, combined drive

Procedia PDF Downloads 162
41578 Geographic Information Systems and Remotely Sensed Data for the Hydrological Modelling of Mazowe Dam

Authors: Ellen Nhedzi Gozo

Abstract:

The unavailability of adequate hydro-meteorological data has always limited the analysis and understanding of the hydrological behaviour of several dam catchments, including that of Mazowe Dam in Zimbabwe. The problem of insufficient data for the Mazowe Dam catchment was addressed by extracting catchment characteristics and areal hydro-meteorological data from ASTER, LANDSAT, and Shuttle Radar Topography Mission (SRTM) remote sensing (RS) images using the ILWIS, ArcGIS, and ERDAS Imagine geographic information systems (GIS) software. Available observed hydrological and meteorological data complemented the remotely sensed information. Ground-truth land cover was mapped using a Garmin eTrex global positioning system (GPS) receiver; this information was then used to validate the land cover classification obtained from the remote sensing images. A bathymetric survey was conducted using a sonar system connected to the GPS. Hydrological modelling with the HBV model was then performed to simulate the hydrological processes of the catchment and verify the reliability of the derived parameters. The model output shows a high Nash-Sutcliffe coefficient, close to 1, indicating that the parameters derived from remote sensing and GIS can be applied with confidence in the analysis of the Mazowe Dam catchment.
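
For reference, the Nash-Sutcliffe efficiency used to judge the HBV fit is NSE = 1 − Σ(Q_obs − Q_sim)² / Σ(Q_obs − Q̄_obs)², where 1 indicates a perfect fit. A minimal computation (with made-up discharge values) is:

    import numpy as np

    def nash_sutcliffe(q_obs: np.ndarray, q_sim: np.ndarray) -> float:
        """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
        return 1.0 - np.sum((q_obs - q_sim) ** 2) / np.sum((q_obs - q_obs.mean()) ** 2)

    # Illustrative daily discharge series (m^3/s)
    q_obs = np.array([12.0, 15.5, 20.1, 18.3, 14.2])
    q_sim = np.array([11.5, 16.0, 19.4, 18.9, 13.8])
    print(f"NSE = {nash_sutcliffe(q_obs, q_sim):.2f}")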

Keywords: geographic information systems, hydrological modelling, remote sensing, water resources management

Procedia PDF Downloads 313
41577 An Architectural Model for APT Detection

Authors: Nam-Uk Kim, Sung-Hwan Kim, Tai-Myoung Chung

Abstract:

Typical security management systems are not suitable for detecting APT attacks because they cannot draw the big picture from the trivial events reported by individual security solutions. Although SIEM solutions have security analysis engines for this purpose, their security analysis mechanisms still need to be verified in the academic field. This paper proposes an architectural model for APT detection; we will continue to study the correlation analysis mechanism in future work.

Keywords: advanced persistent threat, anomaly detection, data mining

Procedia PDF Downloads 508
41576 Unlocking Synergy: Exploring the Impact of Integrating Knowledge Management and Competitive Intelligence for Synergistic Advantage for Efficient, Inclusive and Optimum Organizational Performance

Authors: Godian Asami Mabindah

Abstract:

The convergence of knowledge management (KM) and competitive intelligence (CI) has gained significant attention in recent years as organizations seek to enhance their competitive advantage in an increasingly complex and dynamic business environment. This research study aims to explore and understand the synergistic relationship between KM and CI and its impact on organizational performance. By investigating how the integration of KM and CI practices can contribute to decision-making, innovation, and competitive advantage, this study seeks to unlock the potential benefits and challenges associated with this integration. The research employs a mixed-methods approach to gather comprehensive data. A quantitative analysis is conducted using survey data collected from a diverse sample of organizations across different industries. The survey measures the extent of integration between KM and CI practices and examines the perceived benefits and challenges associated with this integration. Additionally, qualitative interviews are conducted with key organizational stakeholders to gain deeper insights into their experiences, perspectives, and best practices regarding the synergistic relationship. The findings of this study are expected to reveal several significant outcomes. Firstly, it is anticipated that organizations that effectively integrate KM and CI practices will outperform those that treat them as independent functions. The study aims to highlight the positive impact of this integration on decision-making, innovation, organizational learning, and competitive advantage. Furthermore, the research aims to identify critical success factors and enablers for achieving constructive interaction between KM and CI, such as leadership support, culture, technology infrastructure, and knowledge-sharing mechanisms. The implications of this research are far-reaching. Organizations can leverage the findings to develop strategies and practices that facilitate the integration of KM and CI, leading to enhanced competitive intelligence capabilities and improved knowledge management processes. Additionally, the research contributes to the academic literature by providing a comprehensive understanding of the synergistic relationship between KM and CI and proposing a conceptual framework that can guide future research in this area. By exploring the synergies between KM and CI, this study seeks to help organizations harness their collective power to gain a competitive edge in today's dynamic business landscape. The research provides practical insights and guidelines for organizations to effectively integrate KM and CI practices, leading to improved decision-making, innovation, and overall organizational performance.

Keywords: competitive intelligence, knowledge management, organizational performance, inclusivity, optimum performance

Procedia PDF Downloads 67
41575 Multi-Scale Damage and Mechanical Behavior of Sheet Molding Compound Composites Subjected to Fatigue, Dynamic, and Post-Fatigue Dynamic Loadings

Authors: M. Shirinbayan, J. Fitoussi, N. Abbasnezhad, A. Lucas, A. Tcharkhtchi

Abstract:

Sheet Molding Compounds (SMCs) with special microstructures are very attractive for use in automobile structures, especially when those structures are accidentally subjected to collision-type accidents, because of their high energy absorption capacity. These materials are designated standard SMC, Advanced Sheet Molding Compound (A-SMC), Low-Density SMC (LD-SMC), and so on. In this study, tests were performed to compare the mechanical responses and damage phenomena of SMC, LD-SMC, and A-SMC under quasi-static and high-strain-rate tensile loading. The paper also investigates the effect of initial pre-damage induced by fatigue on the dynamic tensile behavior of A-SMC. In the case of SMCs and A-SMCs, whatever the fiber orientation and applied strain rate, the first observed damage phenomenon is decohesion of the fiber-matrix interface, which is followed by coalescence and multiplication of these micro-cracks and their propagation. For LD-SMCs, the damage mechanisms depend on the presence of Hollow Glass Microspheres (HGM) and on fiber orientation.

Keywords: SMC, Sheet Molding Compound, LD-SMC, Low-Density SMC, A-SMC, Advanced Sheet Molding Compounds, HGM, Hollow Glass Microspheres, damage

Procedia PDF Downloads 192
41574 Positive Affect, Negative Affect, Organizational and Motivational Factor on the Acceptance of Big Data Technologies

Authors: Sook Ching Yee, Angela Siew Hoong Lee

Abstract:

Big data technologies have become a trend for exploiting business opportunities and providing valuable business insights through the analysis of big data. However, many organizations have yet to adopt big data technologies, especially small and medium enterprises (SMEs). This study uses the technology acceptance model (TAM), examining several of its constructs together with additional constructs: positive affect, negative affect, organizational factor, and motivational factor. The conceptual model proposed in the study tests the relationship and influence of positive affect, negative affect, organizational factor, and motivational factor on the intention to use big data technologies. Data are collected empirically through a survey.

Keywords: big data technologies, motivational factor, negative affect, organizational factor, positive affect, technology acceptance model (TAM)

Procedia PDF Downloads 339
41573 Electrical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a process for managing energy consumption with a view to energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one load monitoring method used for disaggregation purposes: it identifies individual appliances by analyzing whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, event detection, and feature extraction, with general appliance modeling and identification at the final stage. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, which are required for the accurate identification of household devices. In this research work, we aim to develop a new event detection methodology with accurate load disaggregation for extracting appliance features. Extracted time-domain features are used to tune general appliance models for the appliance identification and classification steps, and we use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand, and then detecting the times at which each selected appliance changes state. In order to fit the capabilities of existing smart meters, we work on low-sampling-rate data with a frequency of 1/60 Hz. The data are simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature; LPG is a numerical tool that simulates the behaviour of people inside a house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of the specific features used for general appliance modeling. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract the power interval within which the selected appliance operates, along with a time vector delimiting the state transitions of the appliance. Appliance signatures are then formed from the extracted power, geometrical, and statistical features, and these signatures are used to tune general model types for appliance identification with unsupervised algorithms. The method is evaluated using both data simulated with LPG and the real Reference Energy Disaggregation Dataset (REDD). For that, we compute confusion-matrix-based performance metrics: accuracy, precision, recall, and error rate. The performance of our methodology is then compared with detection techniques previously used in the literature that are based on statistical variations and abrupt changes (variance sliding window and cumulative sum).
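
A compact illustration of the DTW matching step — a textbook dynamic-programming DTW on 1-D power signatures; the appliance signatures are invented for the example.

    import numpy as np

    def dtw_distance(s: np.ndarray, t: np.ndarray) -> float:
        """Classic O(len(s)*len(t)) dynamic-programming DTW."""
        n, m = len(s), len(t)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(s[i - 1] - t[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    # Match an observed power transient against stored appliance signatures (watts)
    observed = np.array([0, 5, 120, 118, 121, 119, 4, 0])
    signatures = {"fridge": np.array([0, 118, 120, 119, 0]),
                  "kettle": np.array([0, 1800, 1795, 0])}
    best = min(signatures, key=lambda k: dtw_distance(observed, signatures[k]))
    print("best match:", best)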

Keywords: electrical disaggregation, DTW, general appliance modeling, event detection

Procedia PDF Downloads 60