Search results for: data dependency graph
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24847

24637 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering  

Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi

Abstract:

In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA (cfDNA). Accordingly, identifying the tissue of origin of these DNA fragments from plasma can lead to more accurate and faster disease diagnosis and more precise treatment protocols. Open chromatin regions (OCRs) are important epigenetic features of DNA that reflect the cell type of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. Several studies in the area of cancer liquid biopsy have integrated distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which leads to considerable cost and time. To overcome these limitations, the idea of predicting OCRs from whole genome sequencing (WGS) data is of particular importance. In this regard, we propose a computational approach to predict open chromatin regions, an important epigenetic feature, from cell-free DNA whole genome sequencing data. To fulfill this objective, local sequencing depth is fed to our proposed algorithm, and the most probable open chromatin regions are predicted from the whole genome sequencing data. Our method integrates signal processing with sequencing depth data and includes count normalization, Discrete Fourier Transform conversion, graph construction, graph cut optimization by linear programming, and clustering. To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions from human blood samples in the ATAC-DB database. The overlap between predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates meaningful prediction. As OCRs are mostly located at the transcription start sites (TSS) of genes, we also compared the concordance between the predicted OCRs and human gene TSS regions obtained from refTSS, finding agreement of around 52.04% with all genes and ~78% with housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to several confounding factors, such as technical and biological variations. Although this approach is in its infancy, there has already been an attempt to apply it, which led to a tool named OCRDetector, with some restrictions such as the need for high-depth cfDNA WGS data, prior information about the distribution of OCRs, and reliance on multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised learning manner, which resulted in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used along with other tools to investigate the genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid-biopsy-related analysis.
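
A minimal sketch of the pipeline stages named above (count normalization, DFT conversion, graph construction, clustering), assuming synthetic Poisson depth counts. The paper's graph cut optimization by linear programming is replaced here by a simple correlation threshold followed by connected components, and every parameter value is illustrative rather than taken from the study:

```python
import numpy as np
import networkx as nx

def predict_ocr_bins(depth, bin_size=1000, corr_threshold=0.8):
    """Cluster genomic bins into candidate OCR+/OCR- groups from cfDNA depth.

    Simplification: thresholded correlations + connected components stand
    in for the paper's LP-based graph cut optimization.
    """
    # 1. Count normalization: scale binned depth by the mean coverage.
    bins = depth.reshape(-1, bin_size)
    norm = bins / (bins.mean() + 1e-9)

    # 2. DFT conversion: keep low-frequency magnitudes as per-bin features.
    feats = np.abs(np.fft.rfft(norm, axis=1))[:, :8]

    # 3. Graph construction: nodes are bins, edges join correlated bins.
    corr = np.corrcoef(feats)
    g = nx.Graph()
    g.add_nodes_from(range(len(feats)))
    for i in range(len(feats)):
        for j in range(i + 1, len(feats)):
            if corr[i, j] >= corr_threshold:
                g.add_edge(i, j)

    # 4. Clustering: each connected component is one candidate cluster.
    return list(nx.connected_components(g))

clusters = predict_ocr_bins(np.random.poisson(30, 20_000).astype(float))
print(len(clusters), "candidate clusters")
```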

Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering

Procedia PDF Downloads 126
24636 Encapsulation of Volatile Citronella Essential Oil by Coacervation: Efficiency and Release Kinetic Study

Authors: Rafeqah Raslan, Mastura AbdManaf, Junaidah Jai, Istikamah Subuki, Ana Najwa Mustapa

Abstract:

Volatile citronella essential oil was encapsulated by simple coacervation and by complex coacervation using gum Arabic and gelatin as wall materials. Glutaraldehyde was used as the crosslinking agent. A standard calibration graph for citronella was developed, with R² equal to 0.9523, for the accurate determination of encapsulation efficiency and for the release study. The release kinetics were analyzed based on Fick’s law of diffusion for polymeric systems, and a linear graph of log fractional release versus log time was constructed to determine the release rate constant k and the diffusion exponent n. Both coacervation methods in the present study produced encapsulation efficiencies of around 94%. Analysis of the capsule morphology supported the release kinetic mechanisms of the produced capsules for both coacervation processes.
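
The log-log construction described above is the Korsmeyer-Peppas treatment of polymeric release: with Mt/M∞ = k·tⁿ, it follows that log(Mt/M∞) = log k + n·log t, so a straight-line fit yields both parameters. A minimal sketch with made-up release data (the study's actual measurements are not reproduced here):

```python
import numpy as np

# Hypothetical fractional-release data (illustrative only).
t = np.array([5.0, 10, 20, 40, 60, 90])                # time, minutes
frac = np.array([0.12, 0.18, 0.26, 0.38, 0.47, 0.58])  # Mt / M_inf

# Fit log(Mt/M_inf) = log(k) + n*log(t): the slope is the diffusion
# exponent n and the intercept gives the release rate constant k.
n, log_k = np.polyfit(np.log10(t), np.log10(frac), 1)
print(f"n = {n:.3f}, k = {10 ** log_k:.4f}")
```

For a thin-film geometry, n close to 0.5 indicates Fickian diffusion, while 0.5 < n < 1 points to anomalous (non-Fickian) transport.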

Keywords: simple coacervation, complex coacervation, encapsulation efficiency, release kinetic study

Procedia PDF Downloads 298
24635 Data Collection with Bounded-Sized Messages in Wireless Sensor Networks

Authors: Min Kyung An

Abstract:

In this paper, we study the data collection problem in Wireless Sensor Networks (WSNs) under two interference models: the graph model and the more realistic physical interference model based on the Signal-to-Interference-plus-Noise Ratio (SINR). The main issue is to compute schedules with the minimum number of timeslots, that is, minimum-latency schedules, such that data from every node can be collected at a sink node without any collision or interference. While existing works studied the problem with unit-sized and unbounded-sized message models, we investigate the problem with the bounded-sized message model and introduce a constant-factor approximation algorithm. To the best of our knowledge, ours is the first result for the data collection problem with the bounded-sized message model under both interference models.
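
To make the graph-model setting concrete, the sketch below greedily assigns timeslots to the links of a collection tree so that no two conflicting links share a slot. This is a toy illustration only, assuming distance-1 interference and ignoring the precedence constraints of real data collection; it is not the constant-factor algorithm of the paper:

```python
import networkx as nx

def greedy_collection_schedule(g, sink):
    """Greedily assign timeslots to collection-tree links (graph model).

    Two links conflict if they share an endpoint or an endpoint of one is
    adjacent to an endpoint of the other. Precedence constraints (a node
    forwarding only after its children have sent) are ignored for brevity.
    """
    tree = nx.bfs_tree(g, sink).reverse()          # edges point toward sink
    slots = {}
    for link in tree.edges():
        busy = set()
        for other, s in slots.items():
            ends, o_ends = set(link), set(other)
            if ends & o_ends or any(g.has_edge(x, y)
                                    for x in ends for y in o_ends):
                busy.add(s)
        slots[link] = min(s for s in range(len(slots) + 1) if s not in busy)
    return slots

g = nx.grid_2d_graph(3, 4)                         # toy 12-node network
schedule = greedy_collection_schedule(g, sink=(0, 0))
print("slots used:", max(schedule.values()) + 1)
```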

Keywords: data collection, collision-free, interference-free, physical interference model, SINR, approximation, bounded-sized message model, wireless sensor networks

Procedia PDF Downloads 193
24634 Ontology-Based Systematization of Scientific Information on Waste Utilization by Methanogenesis

Authors: Ye. Shapovalov, V. Shapovalov, O. Stryzhak, A. Salyuk

Abstract:

Over the past decades, the amount of scientific information has grown exponentially, and processing and systematizing this volume of data has become increasingly difficult. An approach to the systematization of scientific information on biogas production, based on the ontological IT platform “T.O.D.O.S.”, has been developed. It is proposed to select the semantic characteristics of each work for their subsequent introduction into the IT platform “T.O.D.O.S.”. An ontological graph with a ranking function has been worked out, both for previous scientific research and for a system for the selection of microorganisms. These systems provide high-performance management of scientific information.

Keywords: ontology-based analysis, analysis of scientific data, methanogenesis, microorganism hierarchy, 'T.O.D.O.S.'

Procedia PDF Downloads 139
24633 A Methodology to Integrate Data in the Company Based on the Semantic Standard in the Context of Industry 4.0

Authors: Chang Qin, Daham Mustafa, Abderrahmane Khiat, Pierre Bienert, Paulo Zanini

Abstract:

Nowadays, companies face many challenges in the process of digital transformation, which can be a complex and costly undertaking. Digital transformation involves the collection and analysis of large amounts of data, which creates challenges around data management and governance. Integrating data from multiple systems and technologies is a further challenge. Despite these pains, companies still pursue digitalization because, by embracing advanced technologies, they can improve efficiency, quality, decision-making, and customer experience while also creating different business models and revenue streams. This paper focuses on the issue of data being stored in silos with different schemas and structures. Conventional approaches to addressing this issue involve data warehousing, data integration tools, data standardization, and business intelligence tools. However, these approaches primarily focus on the grammar and structure of the data and neglect the importance of semantic modeling and semantic standardization, which are essential for achieving data interoperability. In this work, the challenge of data silos in Industry 4.0 is addressed by developing a semantic modeling approach compliant with Asset Administration Shell (AAS) models, an efficient standard for communication in Industry 4.0. The paper highlights how our approach can facilitate the data mapping process and semantic lifting according to existing industry standards such as ECLASS and other industrial dictionaries. It also incorporates the Asset Administration Shell technology to model and map the company’s data and utilizes a knowledge graph for data storage and exploration.
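
To give a flavor of the semantic lifting step, the sketch below lifts one record from a hypothetical ERP silo into RDF triples, typing the asset in an AAS-style namespace and mapping the column to a dictionary-defined property. The namespaces, the record, and the ECLASS-style identifier are all assumptions for illustration, not the paper's actual model:

```python
from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import XSD

# Hypothetical namespaces and identifiers: the real AAS and ECLASS IRIs
# differ; this only illustrates the lifting of one silo record to triples.
AAS = Namespace("https://example.org/aas/")
ECLASS = Namespace("https://example.org/eclass/")

silo_record = {"asset_id": "motor-47", "max_rotation_speed": 3000}  # ERP row

g = Graph()
asset = AAS[silo_record["asset_id"]]
g.add((asset, RDF.type, AAS.Asset))
# The silo column becomes a dictionary-defined property rather than a
# silo-local string ("0173-1#02-BAA120" mimics an ECLASS IRDI).
g.add((asset, ECLASS["0173-1#02-BAA120"],
       Literal(silo_record["max_rotation_speed"], datatype=XSD.integer)))

print(g.serialize(format="turtle"))
```

Once lifted this way, "max_rotation_speed" is no longer a silo-local column name but a globally resolvable property that any AAS-aware consumer can interpret, which is what enables the knowledge-graph storage and exploration step.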

Keywords: data interoperability in industry 4.0, digital integration, industrial dictionary, semantic modeling

Procedia PDF Downloads 71
24632 Row Detection and Graph-Based Localization in Tree Nurseries Using a 3D LiDAR

Authors: Ionut Vintu, Stefan Laible, Ruth Schulz

Abstract:

Agricultural robotics has been developing steadily over recent years, with the goals of reducing or even eliminating pesticide use in crops and of increasing productivity by taking over human labor. The majority of crops are arranged in rows. The first step towards autonomous robots, capable of driving in fields and performing crop-handling tasks, is for robots to robustly detect the rows of plants. Recent work on autonomous driving between plant rows offers large robotic platforms equipped with various expensive sensors as a solution to this problem. These platforms need to be driven over the rows of plants. This approach lacks flexibility and scalability with respect to plant height or the distance between rows. This paper instead proposes an algorithm that makes use of cheaper sensors and offers greater flexibility. The main application is in tree nurseries. Here, plant height can range from a few centimeters to a few meters. Moreover, trees are often removed, leading to gaps within the plant rows. The core idea is to combine row detection algorithms with graph-based localization methods as they are used in SLAM. Nodes in the graph represent the estimated poses of the robot, and the edges embed constraints between these poses or between the robot and certain landmarks. This setup aims to improve individual plant detection and to deal with exception handling, such as row gaps, which are falsely detected as row ends. Four methods were developed for detecting row structures in the fields, all using a point cloud acquired with a 3D LiDAR as input. Comparing field coverage and the number of damaged plants, the method that uses a local map around the robot proved to perform best, with 68% of rows covered and 25% damaged plants. This method was then combined with a graph-based localization algorithm, which uses the local map features to estimate the robot’s position within the greater field. Testing the upgraded algorithm in a variety of simulated fields shows that the additional information obtained from localization provides a boost in performance over methods that rely purely on perception to navigate. The final algorithm achieved a row coverage of 80% with 27% damaged plants. Future work will focus on achieving a perfect score of 100% covered rows and 0% damaged plants. The main challenges the algorithm needs to overcome are fields where the plants are too small to be detected and fields where it is hard to distinguish between individual plants when they overlap. The method was also tested on a real robot in a small field with artificial plants. The tests were performed using a small robot platform equipped with wheel encoders, an IMU, and an FX10 3D LiDAR. Over ten runs, the system achieved 100% coverage and 0% damaged plants. The framework built within the scope of this work can further be used to integrate data from additional sensors, with the goal of achieving even better results.
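
The pose-graph idea (poses as nodes, odometry and landmark constraints as edges) reduces, in its simplest linear form, to a least-squares problem. Below is a deliberately tiny 1-D sketch with made-up odometry and one plant-landmark observation, showing how the constraints jointly determine the pose estimates; it stands in for, and is far simpler than, the authors' actual implementation:

```python
import numpy as np

# Toy 1-D pose graph: x0..x3 are robot positions along a row. Odometry
# edges constrain consecutive poses; one landmark edge anchors pose 2 to
# a plant detected at a known position. All numbers are made up.
n = 4
odometry = [(0, 1, 1.0), (1, 2, 1.1), (2, 3, 0.9)]  # (i, j, measured x_j - x_i)
landmarks = [(2, 2.0)]                               # (pose index, observed x)

rows, rhs = [], []
a = np.zeros(n); a[0] = 1.0                          # gauge constraint: x0 = 0
rows.append(a); rhs.append(0.0)
for i, j, d in odometry:
    a = np.zeros(n); a[i], a[j] = -1.0, 1.0
    rows.append(a); rhs.append(d)
for i, z in landmarks:
    a = np.zeros(n); a[i] = 1.0
    rows.append(a); rhs.append(z)

x, *_ = np.linalg.lstsq(np.vstack(rows), np.array(rhs), rcond=None)
print("estimated poses:", x.round(3))                # landmark pulls x2 toward 2.0
```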

Keywords: 3D LiDAR, agricultural robots, graph-based localization, row detection

Procedia PDF Downloads 113
24631 Plotting of an Ideal Logic versus Resource Outflow Graph through Response Analysis on a Strategic Management Case Study Based Questionnaire

Authors: Vinay A. Sharma, Shiva Prasad H. C.

Abstract:

The initial stages of any project are often observed to be in a mixed set of conditions. Setting up the project is a tough task, but taking the initial decisions is relatively uncomplicated, as some of the critical factors are yet to be introduced into the scenario. These simple initial decisions potentially shape the timeline and the subsequent events that might later be plotted on it. Proceeding towards the solution for a problem is the primary objective in the initial stages. Optimization of the solutions can come later, and hence, the resources deployed towards attaining the solution are higher than they would be in the optimized versions. A ‘logic’ that counters the problem is essentially the core of the desired solution. Thus, if the problem is solved, the deployment of resources has led to the required logic being attained. As the project proceeds, the individuals working on it face fresh challenges as a team and become better accustomed to their surroundings. The developed, optimized solutions are then considered for implementation, as the individuals are now experienced, know better the consequences and causes of possible failure, and thus integrate adequate tolerances wherever required. Furthermore, as the team matures in strength, acquires substantial knowledge, and begins to transfer it efficiently, the individuals in charge of the project along with the managers focus more on the optimized solutions rather than the traditional ones to minimize the required resources. Hence, as time progresses, the authorities prioritize attainment of the required logic at a lower amount of dedicated resources. For empirical analysis of the stated theory, leaders and key figures in organizations were surveyed for their ideas on the appropriate logic required for tackling a problem. Key pointers spotted in successfully implemented solutions were noted from the analysis of the responses, and a metric for measuring logic was developed. A graph is plotted with the quantifiable logic on the Y-axis and the resources dedicated to the solutions of various problems on the X-axis. The dedicated resources are plotted over time, and hence the X-axis is also a measure of time. In the initial stages of the project, the graph is rather linear, as the required logic will be attained, but the consumed resources are also high. With time, the authorities begin focusing on optimized solutions, since the logic attained through them is higher while the resources deployed are comparatively lower. Hence, the difference between consecutively plotted ‘resources’ decreases and, as a result, the slope of the graph gradually increases. Overall, the graph takes a parabolic shape (beginning at the origin), as with each resource investment, ideally, the difference keeps decreasing and the logic attained through the solution keeps increasing. Even if the resource investment is higher, the managers and authorities ideally make sure that the investment is being made for a proportionally high logic on a larger problem; that is, ideally, the slope of the graph increases with the plotting of each point.

Keywords: decision-making, leadership, logic, strategic management

Procedia PDF Downloads 90
24630 Mechanical Properties and Microstructures of the Directionally Solidified Zn-Al-Cu Alloy

Authors: Mehmet Izzettin Yilmazer, Emin Cadirli

Abstract:

A Zn-7wt.%Al-2.96wt.%Cu eutectic alloy was directionally solidified upwards with different temperature gradients (from 6.70 K/mm to 10.67 K/mm) at a constant growth rate (16.4 μm/s), and also with different growth rates (from 8.3 μm/s to 166 μm/s) at a constant temperature gradient (10.67 K/mm), using a Bridgman-type growth apparatus. The values of the eutectic spacing were measured on longitudinal and transverse sections of the samples. The dependency of the microstructures on G and V was determined by linear regression analysis, and the experimental equations were found to be λl = 8.953·V^(-0.49), λt = 5.942·V^(-0.42) and λl = 0.008·G^(-1.23), λt = 0.024·G^(-0.93). Microhardness measurements of the directionally solidified samples were obtained using a microhardness test device. The dependence of the microhardness HV on temperature gradient and growth rate was analyzed and likewise determined by linear regression analysis as HVl = 110.66·V^(0.02), HVt = 111.94·V^(0.02) and HVl = 69.66·G^(0.17), HVt = 68.86·G^(0.18). The experimental results show that the microhardness of the directionally solidified Zn-Al-Cu alloy increases with increasing growth rate. The results obtained in this work were compared with previous similar experimental results.
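
Relations of the form λ = a·V^b above come from fitting power laws to the measured spacings. A small sketch of such a fit with scipy, on synthetic spacings generated around the reported λl(V) relation (not the study's actual measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(V, a, b):
    return a * V ** b

# Synthetic spacings (micrometres) generated around the reported relation
# lambda_l = 8.953 * V^-0.49, with 3% noise; not the measured data.
V = np.array([8.3, 16.4, 41.0, 83.0, 166.0])          # growth rate, um/s
noise = 1 + 0.03 * np.random.default_rng(0).standard_normal(V.size)
lam = 8.953 * V ** -0.49 * noise

(a, b), _ = curve_fit(power_law, V, lam, p0=(10.0, -0.5))
print(f"lambda_l = {a:.3f} * V^{b:.3f}")
```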

Keywords: directional solidification, eutectic alloys, microstructure, microhardness

Procedia PDF Downloads 430
24629 A Combinatorial Representation for the Invariant Measure of Diffusion Processes on Metric Graphs

Authors: Michele Aleandri, Matteo Colangeli, Davide Gabrielli

Abstract:

We study a generalization to a continuous setting of the classical Markov chain tree theorem. In particular, we consider an irreducible diffusion process on a metric graph. The unique invariant measure has an atomic component on the vertices and an absolutely continuous part on the edges. We show that the corresponding density at x can be represented by a normalized superposition of the weights associated to the metric arborescences oriented toward the point x. A metric arborescence is a metric tree oriented towards its root. The weight of each oriented metric arborescence is obtained as the product of the exponentials of integrals of the form ∫2b/σ², where b is the drift and σ² is the diffusion coefficient, along the oriented edges, times a weight for each node determined by the local orientation of the arborescence around the node, times the inverse of the diffusion coefficient at x. The metric arborescences are obtained by cutting the original metric graph along some edges.
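
Schematically, and with notation assumed here rather than taken verbatim from the paper, the representation described above reads:

```latex
% Schematic restatement of the representation (notation assumed,
% not verbatim from the paper):
\[
  \rho(x) \;\propto\; \frac{1}{\sigma^{2}(x)}
  \sum_{T \in \mathcal{T}_x} \kappa(T)
  \prod_{e \in T} \exp\!\left( \int_{e} \frac{2\,b}{\sigma^{2}} \right)
\]
% Here T_x is the set of metric arborescences oriented toward x, kappa(T)
% collects the node weights fixed by the local orientation of T, and b and
% sigma^2 are the drift and diffusion coefficient along the oriented edges.
```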

Keywords: diffusion processes, metric graphs, invariant measure, reversibility

Procedia PDF Downloads 144
24628 KPI and Tool for the Evaluation of Competency in Warehouse Management for Furniture Business

Authors: Kritchakhris Na-Wattanaprasert

Abstract:

The objective of this research is to design and develop a prototype key performance indicator (KPI) system suitable for warehouse management, based on a case study and user requirements. The methodology proceeded in steps: identifying the scope of the research and studying related papers, gathering the necessary data and user requirements, developing key performance indicators based on the balanced scorecard, designing the program and database for the key performance indicators, coding the program and setting up the database relationships, and finally testing and debugging each module. This study uses the Balanced Scorecard (BSC) for selecting and grouping the key performance indicators. Microsoft SQL Server 2010 is used to create the system database. As the visual programming language, Microsoft Visual C# 2010 was chosen as the graphical user interface development tool. The system consists of six main menus: login, main data, financial perspective, customer perspective, internal process perspective, and learning and growth perspective. Each menu consists of key performance indicator forms. Each form contains a data import section, a data input section, a data search-and-edit section, and a report section. The system generates output in five main reports: the KPI detail report, KPI summary report, KPI graph report, benchmarking summary report, and benchmarking graph report. The user selects the report conditions and time period. Development and testing showed the system to be one way of judging the extent to which warehouse objectives have been achieved; moreover, it encourages warehouse functions to proceed with more efficiency. To be useful for other industries, the system can be adjusted appropriately. To increase the usefulness of the key performance indicator system, the recommendations for further development are as follows: the warehouse should periodically review the target values and set more suitable targets as the situation fluctuates in the future, and it should likewise periodically review the key performance indicators themselves and replace them with more suitable ones, in order to increase competitiveness and take advantage of new opportunities.

Keywords: key performance indicator, warehouse management, warehouse operation, logistics management

Procedia PDF Downloads 416
24627 Efficient Heuristic Algorithm to Speed Up GraphCut on GPU for Image Stitching

Authors: Tai Nguyen, Minh Bui, Huong Ninh, Tu Nguyen, Hai Tran

Abstract:

The GraphCut algorithm has been widely utilized to solve various types of computer vision problems. Its expensive computational cost has encouraged many researchers to improve the speed of the algorithm. Recent works proposed schemes that work on parallel computing platforms such as CUDA. However, the problem of low convergence speed prevents the usage of GraphCut for real-time applications. In this paper, we propose a global suppression heuristic to boost the convergence process of the algorithm. A parallel implementation of the GraphCut algorithm on CUDA, designed for the image stitching problem, is introduced. Our method achieves up to a 3× speedup on a graph of size 80 × 480 compared to the best sequential GraphCut algorithm, while producing satisfactorily stitched images suitable for panorama applications. Our source code will soon be available for further research.
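
For image stitching, the usual construction casts the seam between two overlapping images as a minimum cut on a grid graph whose edge capacities are local color differences. The sketch below shows that construction on a toy single-channel overlap, solved on the CPU with networkx; the paper's contribution is precisely about making this step fast on CUDA, which this sketch does not attempt:

```python
import numpy as np
import networkx as nx

# Toy single-channel overlap region: source = left border, sink = right
# border, capacities = local colour differences across neighbouring pixels.
rng = np.random.default_rng(0)
overlap = rng.integers(0, 255, size=(6, 8)).astype(float)
h, w = overlap.shape

g = nx.DiGraph()
for y in range(h):
    for x in range(w):
        for dy, dx in ((0, 1), (1, 0)):              # 4-neighbour grid
            ny_, nx_ = y + dy, x + dx
            if ny_ < h and nx_ < w:
                cap = abs(overlap[y, x] - overlap[ny_, nx_]) + 1e-3
                g.add_edge((y, x), (ny_, nx_), capacity=cap)
                g.add_edge((ny_, nx_), (y, x), capacity=cap)
for y in range(h):                                   # terminal links
    g.add_edge("src", (y, 0), capacity=float("inf"))
    g.add_edge((y, w - 1), "snk", capacity=float("inf"))

cut_value, (left, _right) = nx.minimum_cut(g, "src", "snk")
print(f"seam cost {cut_value:.1f}; left side keeps {len(left) - 1} pixels")
```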

Keywords: CUDA, graph cut, image stitching, texture synthesis, maxflow/mincut algorithm

Procedia PDF Downloads 107
24626 Quantifying Multivariate Spatiotemporal Dynamics of Malaria Risk Using Graph-Based Optimization in Southern Ethiopia

Authors: Yonas Shuke Kitawa

Abstract:

Background: Although malaria incidence has fallen sharply over the past few years, the rate of decline varies by district, time, and malaria type. Despite this decline, malaria remains a major public health threat in various districts of Ethiopia. Consequently, the present study is aimed at developing a predictive model that helps to identify the spatio-temporal variation in malaria risk by multiple Plasmodium species. Methods: We propose a multivariate spatio-temporal Bayesian model to obtain a more coherent picture of the temporally varying spatial variation in disease risk. The spatial autocorrelation in such a data set is typically modeled by a set of random effects assigned a conditional autoregressive (CAR) prior distribution. However, the autocorrelation considered in such cases depends on a binary neighborhood matrix specified through the border-sharing rule. Here, we propose a graph-based optimization algorithm for estimating the neighborhood matrix that represents the spatial correlation by treating the areal units as the vertices of a graph and the neighbor relations as its edges. Furthermore, we used aggregated malaria counts in southern Ethiopia from August 2013 to May 2019. Results: We found that precipitation, temperature, and humidity are positively associated with malaria threat in the area. On the other hand, enhanced vegetation index, nighttime light (NTL), and distance from coastal areas are negatively associated. Moreover, nonlinear relationships were observed between malaria incidence and precipitation, temperature, and NTL. Additionally, lagged effects of temperature and humidity have a significant effect on malaria risk for either species. A more elevated risk of P. falciparum was observed following the rainy season, and unstable transmission of P. vivax was observed in the area. Finally, P. vivax risks are less sensitive to environmental factors than those of P. falciparum. Conclusion: Improved inference was gained by employing the proposed approach in comparison to the commonly used border-sharing rule. Additionally, different covariates were identified, including delayed effects, and elevated risks for either species were observed in districts in the central and western regions. As malaria transmission operates in a spatially continuous manner, a spatially continuous model should be employed when it is computationally feasible.
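
The neighborhood matrix being estimated is the W that parameterizes the CAR prior; in the common proper-CAR form, the prior precision is Q = τ(D − ρW), with D the diagonal matrix of neighbor counts. A minimal sketch with a hypothetical four-district adjacency (the paper's algorithm estimates the edge set itself, which is not reproduced here):

```python
import numpy as np

# Binary neighborhood matrix W from an edge list over districts, and the
# proper-CAR prior precision Q = tau * (D - rho * W). The adjacency and
# parameter values are hypothetical; the paper estimates the edge set
# itself by graph-based optimization instead of the border-sharing rule.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]
n = 4
W = np.zeros((n, n))
for i, j in edges:
    W[i, j] = W[j, i] = 1.0

D = np.diag(W.sum(axis=1))       # neighbour counts per district
tau, rho = 1.0, 0.9              # precision and spatial-dependence parameters
Q = tau * (D - rho * W)

print("positive definite:", bool(np.all(np.linalg.eigvalsh(Q) > 0)))
```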

Keywords: disease mapping, MSTCAR, graph-based optimization algorithm, P. falciparum, P. vivax, weighting matrix

Procedia PDF Downloads 51
24625 China's Middle East Policy and the Competition with the United States

Authors: Shabnam Dadparvar, Laijin Shen

Abstract:

This paper focuses on China’s policy in the Middle East and its rivalry with the U.S. The question is: what are the main factors shaping China’s Middle East policy and its competition with the U.S.? The hypothesis points to three factors: China’s energy dependency on the Middle East, the economy, and support for stability in the Middle East. What is important in China’s competition with the U.S. regarding its Middle East policy is the substantial difference in their ways of treating the countries of the region; China is committed to the Westphalian model, based on non-interference in the internal affairs of countries and respect for the sovereignty of governments. The U.S., however, since 9/11, has been seeking a balance between stability and change through intervention in international affairs and, in some cases, has pursued regime change. On the other hand, China, due to its dependency on the region’s energy, welcomes America’s military presence in the region insofar as it provides stability. Using a descriptive analytical method, the authors try to explain the rivalry between China and the United States in the Middle East. China is an ‘emerging power’ with high economic growth and a growing demand for energy supply. The problem is that a rising power in the region is often a source of concern for the hegemon.

Keywords: China's foreign policy, energy, hegemony, the Middle East

Procedia PDF Downloads 326
24624 An Iberian Study about Location of Parking Areas for Dangerous Goods

Authors: María Dolores Caro, Eugenio M. Fedriani, Ángel F. Tenorio

Abstract:

When lorries transport dangerous goods, the European Union imposes legal stipulations to assure the safety of other road users as well as of the goods being transported. In this respect, lorry drivers cannot park in the usual parking areas, because they must use parking areas with special conditions, including permanent supervision by security personnel. Moreover, drivers are compelled to satisfy additional regulations about resting and driving times, which affect the practical possibility of reaching suitable parking areas within these time limits. The “European Agreement concerning the International Carriage of Dangerous Goods by Road” (ADR) is the basic regulation on the transportation of dangerous goods, imposed under the recommendations of the United Nations Economic Commission for Europe. Indeed, there are currently not enough parking areas adapted for dangerous goods, and no comprehensive study has suggested the best locations to build new areas, or to adapt existing ones, so that lorry drivers can follow all the regulations. The goal of this paper is to show how many additional parking areas should be built in the Iberian Peninsula so that lorry drivers may park in such areas under their restrictions on resting and driving time. To do so, we have modeled the problem via graph theory, and we have applied a new efficient algorithm which determines an optimal solution for the problem of locating new parking areas to complement those already existing in the ADR for the Iberian Peninsula. The solution can be considered minimal, since the number of additional parking areas returned by the algorithm is minimal. Graph theory is a natural way to model and solve the problem proposed here, because we have considered as nodes the already-existing parking areas, the loading-and-unloading locations, and the bifurcations of roads, while each edge between two nodes represents the existence of a road between both nodes (the distance between nodes is the edge's weight). Except for bifurcations, all the nodes correspond to already-existing parking areas, and hence the problem corresponds to determining the additional nodes in the graph such that there are at most 100 km between consecutive nodes representing parking areas (the maximal distance allowed by the European regulations).
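
Under the simplifying assumption that candidate stops sit at the graph's nodes and extra areas may be placed anywhere along an edge, the number of additional areas needed on a single stretch of length d is ceil(d/100) − 1. A toy sketch with hypothetical distances (the paper's algorithm instead optimizes placement globally over the whole Iberian road graph):

```python
import math
import networkx as nx

# Simplified count: on a stretch of length d km between consecutive
# candidate nodes, ceil(d/100) - 1 extra areas are needed so consecutive
# areas are at most 100 km apart. Distances are hypothetical.
MAX_KM = 100
g = nx.Graph()
g.add_weighted_edges_from([
    ("Lisboa", "Badajoz", 230), ("Badajoz", "Madrid", 400),
    ("Madrid", "Zaragoza", 320), ("Zaragoza", "Barcelona", 310),
])

extra = {(u, v): math.ceil(d / MAX_KM) - 1
         for u, v, d in g.edges(data="weight")}
print(extra, "->", sum(extra.values()), "additional areas in total")
```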

Keywords: dangerous goods, parking areas, Iberian peninsula, graph-based modeling

Procedia PDF Downloads 565
24623 Altered Network Organization in Mild Alzheimer's Disease Compared to Mild Cognitive Impairment Using Resting-State EEG

Authors: Chia-Feng Lu, Yuh-Jen Wang, Shin Teng, Yu-Te Wu, Sui-Hing Yan

Abstract:

Brain functional networks based on resting-state EEG data were compared between patients with mild Alzheimer’s disease (mAD) and matched patients with the amnestic subtype of mild cognitive impairment (aMCI). We integrated the time-frequency cross mutual information (TFCMI) method, to estimate the EEG functional connectivity between cortical regions, with network analysis based on graph theory, to further investigate the alterations of functional networks in mAD compared with the aMCI group. We aimed at investigating the changes in network integrity, local clustering, information processing efficiency, and fault tolerance in mAD brain networks for different frequency bands, based on several topological properties, including degree, strength, clustering coefficient, shortest path length, and efficiency. Results showed that disruptions of network integrity and reductions of network efficiency in mAD were evident, characterized by lower degree, decreased clustering coefficient, longer shortest path length, and reduced global and local efficiencies in the delta, theta, beta2, and gamma bands. These significant changes in network organization can assist in the discrimination of mAD from aMCI in clinical practice.
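
Given a (here synthetic) TFCMI connectivity matrix, the topological properties listed above can be computed directly with networkx after thresholding; the matrix and the 0.6 threshold below are assumptions for illustration:

```python
import numpy as np
import networkx as nx

# From a (synthetic) TFCMI connectivity matrix to the topological metrics
# used in the study: degree, strength, clustering, and efficiencies.
rng = np.random.default_rng(1)
conn = rng.random((30, 30))
conn = (conn + conn.T) / 2                 # symmetric connectivity
np.fill_diagonal(conn, 0)

adj = np.where(conn > 0.6, conn, 0.0)      # threshold weak connections
g = nx.from_numpy_array(adj)

print({
    "mean degree": np.mean([d for _, d in g.degree()]),
    "mean strength": np.mean([d for _, d in g.degree(weight="weight")]),
    "clustering": nx.average_clustering(g),
    "global efficiency": nx.global_efficiency(g),
    "local efficiency": nx.local_efficiency(g),
})
```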

Keywords: EEG, functional connectivity, graph theory, TFCMI

Procedia PDF Downloads 414
24622 Generation of Knowledge with Self-Learning Methods for Ophthalmic Data

Authors: Klaus Peter Scherer, Daniel Knöll, Constantin Rieder

Abstract:

Problem and Purpose: Intelligent systems are available and helpful to support the human decision process, especially when complex surgical eye interventions are necessary and must be performed. Normally, such a decision support system consists of a knowledge-based module, which is responsible for the real assistance power, provided by explanation and logical reasoning processes. The interview-based acquisition and generation of this complex knowledge is crucial, because there are various correlations between the complex parameters. So, in this project, (semi-)automated self-learning methods are researched and developed to enhance the quality of such a decision support system. Methods: For ophthalmic data sets of real patients in a hospital, advanced data mining procedures seem to be very helpful. In particular, subgroup analysis methods are developed, extended, and used to analyze and find the correlations and conditional dependencies between the structured patient data. After finding causal dependencies, a ranking must be performed for the generation of rule-based representations. For this, anonymized patient data are transformed into a special machine-readable format. The imported data are used as input for conditional probability algorithms to calculate the parameter distributions with respect to a given goal parameter. Results: In the field of knowledge discovery, advanced methods and applications could be performed to produce operation- and patient-related correlations. New knowledge was generated by finding causal relations between the operational equipment, the medical instances, and patient-specific history through a dependency ranking process. After transformation into association rules, logic-based representations were available for the clinical experts to evaluate the new knowledge. The structured data sets take into account about 80 parameters as characteristic features per patient. For extended patient groups of different sizes (100, 300, 500), both single-target and multi-target values were set for the subgroup analysis, so the newly generated hypotheses could be interpreted regarding their dependence on or independence of the number of patients. Conclusions: The aim and advantage of such a semi-automated self-learning process is the extension of the knowledge base by finding new parameter correlations. The discovered knowledge is transformed into association rules and serves as the rule-based representation of the knowledge in the knowledge base. Moreover, more than one goal parameter of interest can be considered by the semi-automated learning process. With ranking procedures, the strongest premises and conjunctively associated conditions can be found for concluding the goal parameter of interest. In this way, knowledge hidden in structured tables or lists can be extracted as a rule-based representation, which is real assistance power for communication with the clinical experts.
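
One standard way to rank such subgroups is by weighted relative accuracy, WRAcc(S) = p(S)·(p(target|S) − p(target)), which trades off subgroup size against the lift in target rate. A minimal sketch over a toy table whose column names are hypothetical stand-ins for the anonymized parameters described above:

```python
import pandas as pd

# Rank single-condition subgroups by weighted relative accuracy,
# WRAcc(S) = p(S) * (p(target | S) - p(target)). Column names and values
# are hypothetical stand-ins for the ~80 anonymized parameters per patient.
df = pd.DataFrame({
    "lens_type":    ["A", "A", "B", "B", "B", "A", "B", "A"],
    "instrument":   ["x", "y", "y", "x", "y", "y", "x", "x"],
    "complication": [1,   0,   1,   0,   1,   0,   1,   0],
})

base_rate = df["complication"].mean()
scores = []
for col in ("lens_type", "instrument"):
    for val, sub in df.groupby(col):
        p_s = len(sub) / len(df)
        wracc = p_s * (sub["complication"].mean() - base_rate)
        scores.append((f"{col} = {val}", round(wracc, 4)))

for rule, score in sorted(scores, key=lambda r: -r[1]):
    print(rule, score)
```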

Keywords: expert systems, knowledge-based support, ophthalmic decision support, self-learning methods

Procedia PDF Downloads 238
24621 A Study on Mesh Size Dependency on Bed Expansion Zone in a Three-Phase Fluidized Bed Reactor

Authors: Liliana Patricia Olivo Arias

Abstract:

The present study focused on the hydrodynamics of a three-phase fluidized bed reactor and the influence of important aspects such as the volume fractions (hold-ups), the velocity magnitudes of the gas, liquid, and solid phases (hydrogen, gasoil, and gamma alumina), and the phase interactions, captured through drag models together with the k-epsilon turbulence model. For this purpose, an Euler-Euler model was employed, which considers the system to be constituted of three phases (gaseous, liquid, and solid), each characterized by its physical and thermal properties and by the transport processes that develop within the transient regime. The proposed model of the three-phase fluidized bed reactor was solved numerically using the ANSYS Fluent software with different mesh refinements in the bed expansion zone, in order to observe the influence on the hydrodynamic parameters and the convergence criteria. With this model and the numerical simulations performed for its resolution, it was possible to predict the volume fractions (hold-ups) and the velocity magnitudes for an unsteady system from the established initial and boundary conditions.

Keywords: three-phase fluidized bed system, CFD simulation, mesh dependency study, hydrodynamic study

Procedia PDF Downloads 143
24620 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis

Authors: Meng Su

Abstract:

High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
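
The Diffusion Maps half of the proposal is classical and easy to sketch: build Gaussian affinities, row-normalize them into a Markov matrix, and embed the data with its leading nontrivial eigenvectors. The sketch below covers only this dimensionality-reduction stage, with arbitrary parameters; the DPM generative stage discussed above is omitted:

```python
import numpy as np

def diffusion_map(X, eps=1.0, dim=2, t=1):
    """Classic diffusion-map embedding (the dimensionality-reduction stage
    only; the DPM generative stage discussed above is not included)."""
    # Gaussian affinities between all pairs of points.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / eps)
    # Symmetric conjugate of the Markov matrix P = D^-1 K, for stable eigs.
    d = K.sum(axis=1)
    A = K / np.sqrt(d[:, None] * d[None, :])
    vals, vecs = np.linalg.eigh(A)                  # ascending order
    lam, psi = vals[::-1], vecs[:, ::-1]
    phi = psi / np.sqrt(d)[:, None]                 # eigenvectors of P
    # Skip the trivial constant eigenvector, scale by eigenvalue^t.
    return phi[:, 1:dim + 1] * lam[1:dim + 1] ** t

X = np.random.default_rng(0).normal(size=(100, 5))
print(diffusion_map(X).shape)                       # (100, 2)
```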

Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis

Procedia PDF Downloads 77
24619 On the Basis Number and the Minimum Cycle Bases of the Wreath Product of Paths with Wheels

Authors: M. M. M. Jaradat

Abstract:

For a given graph G, the set Ԑ of all subsets of E(G) forms an |E(G)|-dimensional vector space over Z₂ with vector addition X⊕Y = (X\Y) ∪ (Y\X) and scalar multiplication 1·X = X and 0·X = ∅ for all X, Y ∈ Ԑ. The cycle space, C(G), of a graph G is the vector subspace of (Ԑ, ⊕, ·) spanned by the cycles of G. Traditionally there have been two notions of minimality among bases of C(G). First, a basis B of C(G) is called d-fold if each edge of G occurs in at most d cycles of the basis B. The basis number, b(G), of G is the least non-negative integer d such that C(G) has a d-fold basis; a required basis of C(G) is a basis for which each edge of G belongs to at most b(G) elements of B. Second, a basis B is called a minimum cycle basis (MCB) if its total length Σ_{C∈B} |C| is minimum among all bases of C(G). The wreath product GρH has the vertex set V(GρH) = V(G) × V(H) and the edge set E(GρH) = {(u1, v1)(u2, v2) | u1 = u2 and v1v2 ∈ E(H), or u1u2 ∈ E(G) and there is α ∈ Aut(H) such that α(v1) = v2}. In this work, a construction of a minimum cycle basis for the wreath product of wheels with paths is presented. Also, the length of the longest cycle in a minimum cycle basis is determined. Moreover, the basis number of the same product is investigated.

Keywords: cycle space, minimum cycle basis, basis number, wreath product

Procedia PDF Downloads 249
24618 An Owen Value for Cooperative Games with Pairwise a Priori Incompatibilities

Authors: Jose M. Gallardo, Nieves Jimenez, Andres Jimenez-Losada, Esperanza Lebron

Abstract:

A game with a priori incompatibilities is a triple (N,v,g) where (N,v) is a cooperative game and (N,g) is a graph which establishes initial incompatibilities between some players. In these games, the negotiation has two stages. In the first stage, players can only negotiate with others with whom they are compatible. In the second stage, the grand coalition is formed. We introduce a value for these games. Given a game with a priori incompatibilities (N,v,g), we consider the family of coalitions without incompatibility relations among their players. This family is a normal set system or coalition configuration Ig. Therefore, we can assign to each game with a priori incompatibilities (N,v,g) a game with coalition configuration (N,v,Ig). Now, in order to obtain a payoff vector for (N,v,g), it suffices to calculate a payoff vector for (N,v,Ig). To this end, we apply a value for games with coalition configuration. In our case, we use the dual configuration value, which has been studied in the literature. With this method, we obtain a value for games with a priori incompatibilities, which we call the Owen value for a priori incompatibilities. We provide a characterization of this value.

Keywords: cooperative game, game with coalition configuration, graph, independent set, Owen value, Shapley value

Procedia PDF Downloads 109
24617 Web Proxy Detection via Bipartite Graphs and One-Mode Projections

Authors: Zhipeng Chen, Peng Zhang, Qingyun Liu, Li Guo

Abstract:

With the Internet becoming the dominant channel for business and life, many IPs are increasingly masked using web proxies for illegal purposes such as propagating malware, impersonating phishing pages to steal sensitive data, or redirecting victims to other malicious targets. Moreover, as Internet traffic continues to grow in size and complexity, it has become an increasingly challenging task to detect proxy services due to their dynamic updates and high anonymity. In this paper, we present an approach based on behavioral graph analysis to study the behavior similarity of web proxy users. Specifically, we use bipartite graphs to model host communications from network traffic and build one-mode projections of the bipartite graphs for discovering the social-behavior similarity of web proxy users. Based on the similarity matrices of end-users derived from the one-mode projection graphs, we apply a simple yet effective spectral clustering algorithm to discover the inherent behavior clusters of web proxy users. A web proxy URL may vary from time to time, but the users’ inherent interests do not. Based on this intuition, and using our private tools implemented with WebDriver, we examine whether the top URLs visited by the web proxy users are themselves web proxies. Our experimental results based on real datasets show that the behavior clusters not only reduce the number of URLs to analyze but also provide an effective way to detect web proxies, especially unknown ones.
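
The pipeline of bipartite modeling, one-mode projection, and spectral clustering can be sketched end to end on toy data. Hosts that visit many of the same URLs end up with large projection weights and fall into the same cluster; all names and values below are made up for illustration:

```python
import numpy as np
import networkx as nx
from networkx.algorithms import bipartite
from sklearn.cluster import SpectralClustering

# Toy host->URL visits; hosts sharing many URLs should cluster together.
visits = [("h1", "u1"), ("h1", "u2"), ("h2", "u1"), ("h2", "u2"),
          ("h3", "u3"), ("h3", "u4"), ("h4", "u3"), ("h4", "u4")]
hosts = sorted({h for h, _ in visits})

B = nx.Graph(visits)                                 # bipartite graph
P = bipartite.weighted_projected_graph(B, hosts)     # one-mode projection

sim = nx.to_numpy_array(P, nodelist=hosts)           # similarity matrix
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(sim + np.eye(len(hosts)))
print(dict(zip(hosts, labels)))
```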

Keywords: bipartite graph, one-mode projection, clustering, web proxy detection

Procedia PDF Downloads 224
24616 Research on Dynamic Practical Byzantine Fault Tolerance Consensus Algorithm

Authors: Cao Xiaopeng, Shi Linkai

Abstract:

The Practical Byzantine Fault Tolerance algorithm does not add nodes dynamically, which limits its practical application. In order to add nodes dynamically, a Dynamic Practical Byzantine Fault Tolerance algorithm (DPBFT) is proposed. Firstly, a new node sends request information to the other nodes in the network, and the nodes in the network decide on its identity and request. Then the nodes in the network reverse-connect to the new node and send the block information of the current network, and the new node updates its information. Finally, the new node participates in the next round of consensus, changes the view, and selects the master node. This paper abstracts the decisions of the nodes into an undirected connected graph, and the eventual consistency of the graph is used to prove that the proposed algorithm can adapt to the network dynamically. Compared with the PBFT algorithm, DPBFT has better fault tolerance and lower network bandwidth usage.

Keywords: practical byzantine, fault tolerance, blockchain, consensus algorithm, consistency analysis

Procedia PDF Downloads 108
24615 Handling Complexity of a Complex System Design: Paradigm, Formalism and Transformations

Authors: Hycham Aboutaleb, Bruno Monsuez

Abstract:

Current systems' complexity has reached a degree that requires addressing conception and design issues while taking into account environmental, operational, social, legal, and financial aspects. Therefore, one of the main challenges is the way complex systems are specified and designed. The exponentially growing effort, cost, and time investment of complex systems in the modeling phase emphasize the need for a paradigm, a framework, and an environment to handle the system model complexity. For that, it is necessary to understand the expectations of the human user of the model and his limits. This paper presents a generic framework for designing complex systems, highlights the requirements a system model needs to fulfill to meet human user expectations, and suggests a graph-based formalism for modeling complex systems. Finally, a set of transformations is defined to handle the model complexity.

Keywords: higraph-based, formalism, system engineering paradigm, modeling requirements, graph-based transformations

Procedia PDF Downloads 381
24614 The Fabric of Culture: Deciphering the Discourse of Permitted and Prohibited Raw Materials for Clothing in Hadith Literature

Authors: Hadas Hirsch

Abstract:

Clothing is aimed at concealing and revealing the body, protecting it, and manifesting religious, political, and social declarations. The material and symbolic meanings of clothing and its raw materials are evaluated in the context of their social, cultural, and religious systems. The raw materials for clothing that were common and familiar in the 7th-century Arabian Peninsula were wool, leather, cotton, and some kinds of silk. The spread of the Muslim empire and its intersections with other religions and cultures enabled the trickling in of new raw materials that were unknown to Muslims or not previously accepted. The sources for this research are hadith collections that discuss in detail various kinds of textiles and their origins, together with legal explanations that permit or prohibit their use. The paper will describe and analyze this discussion by contextualizing it in the social, religious, and cultural reality that creates a structure of socio-religious dependency. The aim is not to identify, catalogue, and technically analyze fabrics but to reveal their role in Muslims’ lives as a means of creating dependency for the community and setting borders inside and outside. The analysis is built upon a scale that starts with the most recommended raw materials, then the permitted ones, and ends with the prohibited raw materials. This mapping will provide insight into the ways textiles, as a cultural medium, help to shape and redefine identities and, at the same time, enable a sphere for creative expression within socio-cultural and religious limits and contexts. To sum up, hadith literature plays the main role in characterizing Muslim clothing, from garments to textiles and colors, including multiple variations and contradictory aspects. The Muslim style of clothing and, in particular, its textiles is a manifestation of the socio-religious structure of dependency that creates a differentiated Muslim identity together with a subdivision of gendered groups. Other aspects include the tension between authenticity and imitation, and the jurists’ pragmatic and practical attitude, which enables an individual sphere of expression within the limits of jurisprudence.

Keywords: Hadith, jurisprudence, medieval Islam, material culture

Procedia PDF Downloads 72
24613 Traffic Prediction with Raw Data Utilization and Context Building

Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao

Abstract:

Traffic prediction is essential in a multitude of ways in modern urban life. Earlier work in this domain has carried out the investigation chiefly with two major focuses: (1) the accurate forecast of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of raw data and the full context of the prediction timestamp. Concentrating on these two drawbacks of earlier work, we devise an approach that addresses both issues in a two-phase framework. First, we utilize the raw trajectories to a greater extent through building a VLA table and data compression. We obtain the intra-trajectory features with graph-based encoding and the inter-trajectory ones with a grid-based model and the technique of back projection, which restores their surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and to attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information that surrounds it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution, which combines the strength of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction.

Keywords: traffic prediction, raw data utilization, context building, data reduction

Procedia PDF Downloads 102
24612 Modal Approach for Decoupling Damage Cost Dependencies in Building Stories

Authors: Haj Najafi Leila, Tehranizadeh Mohsen

Abstract:

Dependencies between the diverse factors involved in probabilistic seismic loss evaluation are recognized as an important issue in acquiring accurate loss estimates. Dependencies among component damage costs can be taken into account by considering the two distinct limiting states of independent or perfectly dependent component damage states; however, to the best of our knowledge, no procedure is available to account for loss dependencies at the story level. This paper presents a method called the "modal cost superposition method" for decoupling story damage costs under earthquake ground motions. It deals with closed-form differential equations between damage cost and engineering demand parameters, which are solved as a coupled system considering all stories' cost equations by means of the introduced "substituted matrices of mass and stiffness". Costs are treated as probabilistic variables with definite statistics (median and standard deviation) and a presumed probability distribution. To supplement the proposed procedure and to demonstrate the straightforwardness of its application, a benchmark study has been conducted. Acceptable agreement has been shown between the damage costs estimated by the proposed modal approach and by the frequently used stochastic approach for the entire building; however, at the story level, the insufficiency of employing a single modification factor to incorporate occurrence probability dependencies between stories has been revealed, due to discrepant amounts of dependency between the damage costs of different stories. Also, a greater contribution of dependency to the occurrence probability of loss could be concluded from the greater compatibility of loss results in higher stories than in lower ones, whereas reducing the number of incorporated cost modes provides an acceptable level of accuracy while avoiding time-consuming calculations that include a large number of cost modes.

Keywords: dependency, story-cost, cost modes, engineering demand parameter

Procedia PDF Downloads 157
24611 EcoTeka, an Open-Source Software for Urban Ecosystem Restoration through Technology

Authors: Manon Frédout, Laëtitia Bucari, Mathias Aloui, Gaëtan Duhamel, Olivier Rovellotti, Javier Blanco

Abstract:

Ecosystems must be resilient to ensure cleaner air, better water and soil quality, and thus healthier citizens. Technology can be an excellent tool to support urban ecosystem restoration projects, especially when based on open source and promoting open data. This is the goal of the ecoTeka application: one single digital tool for tree management which allows decision-makers to improve their urban forestry practices, enabling more responsible urban planning and climate change adaptation. EcoTeka provides city councils with three main functionalities tackling three of their challenges: easier biodiversity inventories, better green space management, and more efficient planning. To answer the cities’ need for reliable tree inventories, the application was first built with open data coming from the websites OpenStreetMap and OpenTrees, but it will also soon include the possibility of creating new data. To achieve this, a multi-source algorithm will be elaborated, based on the existing artificial-intelligence model DeepForest, integrating open-source satellite images, 3D representations from LiDAR, and street views from Mapillary. This data processing will permit the identification of each individual tree's position, height, crown diameter, and taxonomic genus. To support urban forestry management, ecoTeka offers a dashboard for monitoring the city’s tree inventory and triggers alerts to inform about upcoming due interventions. This tool was co-constructed with the green space departments of the French cities of Alès, Marseille, and Rouen. The third functionality of the application is a decision-making tool for urban planning, promoting biodiversity and landscape connectivity metrics to drive an ecosystem restoration roadmap. Based on landscape graph theory, we are currently experimenting with new methodological approaches to scale down regional ecological connectivity principles to local biodiversity conservation and urban planning policies. This methodological framework will couple a graph-theoretic approach with biological data, mainly biodiversity occurrence (presence/absence) data available on international (e.g., GBIF), national (e.g., Système d’Information Nature et Paysage), and local (e.g., Atlas de la Biodiversité Communale) biodiversity data-sharing platforms, in order to support new decisions for the conservation and restoration of ecological networks in urban areas. An experiment on this subject is currently ongoing with Montpellier Méditerranée Métropole. These projects and studies have shown that only 26% of tree inventory data is currently geo-localized in France; the rest is still kept on paper or in Excel sheets. It seems that technology is not yet used enough to enrich the knowledge city councils have about biodiversity in their cities, and that existing open biodiversity data (e.g., occurrences, telemetry, or genetic data), species distribution models, and landscape graph connectivity metrics are still underexploited for making rational decisions in landscape and urban planning projects. This is the goal of ecoTeka: to support easier inventories of urban biodiversity and better management of urban spaces through rational planning and decisions relying on open databases. Future studies and projects will focus on the development of tools for reducing the artificialization of soils, selecting plant species adapted to climate change, and highlighting the need for ecosystem and biodiversity services in cities.

Keywords: digital software, ecological design of urban landscapes, sustainable urban development, urban ecological corridor, urban forestry, urban planning

Procedia PDF Downloads 46
24610 Network Connectivity Knowledge Graph Using Dwave Quantum Hybrid Solvers

Authors: Nivedha Rajaram

Abstract:

Hybrid quantum solvers have recently received prime focus in industrial computational problem-solving applications. D’Wave quantum computers are one such paragon of systems, built on the quantum annealing mechanism. Discrete Quadratic Models are a hybrid quantum computing model class supplied by the D’Wave Ocean SDK, a real-time software platform for hybrid quantum solvers. These hybrid quantum computing modellers can be employed to solve classical problems. One such problem that we consider in this paper is finding a network connectivity knowledge hub in a huge network of systems. Using this quantum solver, we try to find the prime system hub, which acts as a supreme connection point for the set of connected computers in a large network. This paper establishes an innovative problem approach to generate a connectivity system hub plot for a set of systems using D’Wave Ocean SDK hybrid quantum solvers.
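
One plausible quadratic-model encoding of the hub problem, assumed here for illustration rather than taken from the paper, is to pick exactly one node that minimizes its total shortest-path distance to all other systems. The sketch below builds that QUBO with dimod and solves it exactly; the same model could instead be submitted to a D'Wave hybrid sampler:

```python
import itertools
import networkx as nx
import dimod

# QUBO sketch (formulation assumed, not taken from the paper): pick exactly
# one hub x_v = 1 minimizing its total shortest-path distance to all other
# systems. Solved exactly here; the same BQM could be submitted to a D'Wave
# hybrid sampler (e.g. LeapHybridSampler) instead.
G = nx.connected_watts_strogatz_graph(8, 4, 0.3, seed=3)
dist = dict(nx.all_pairs_shortest_path_length(G))
cost = {v: sum(dist[v].values()) for v in G}         # closeness-style cost

P = 2 * max(cost.values())                           # penalty weight (assumed)
bqm = dimod.BinaryQuadraticModel("BINARY")
for v, c in cost.items():
    bqm.add_linear(v, c - P)                         # from P * (sum_v x_v - 1)^2
for u, v in itertools.combinations(G.nodes, 2):
    bqm.add_quadratic(u, v, 2 * P)
bqm.offset += P

best = dimod.ExactSolver().sample(bqm).first.sample
print("selected hub:", [v for v, x in best.items() if x])
```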

Keywords: quantum computing, hybrid quantum solver, DWave annealing, network knowledge graph

Procedia PDF Downloads 97
24609 Effecting the Unaffected through the Effervescent Disk Theory: A Different Perspective on Media Effects Theories

Authors: Tarik Elaujali

Abstract:

This study examines a new media effects theory developed by the author, called ‘The Effervescent Disk Theory’ (EDT). The theory’s main goal is to affect the unaffected audience, those who either do not expose themselves to a particular message or do not show interest in it. EDT suggests dissolving the messages that are meant to affect the audience within media materials that are selected willingly by the audience themselves. A set of procedures to test the EDT hypotheses was carried out and is illustrated in this study. A sample of 342 respondents (males and females) was collected from Tripoli University in Libya during the academic year 2013-2014. The designated sample represents students who had failed to pass the English module for beginners. This study aims to change the students’ negative notion about the importance of learning English and to put their new idea into action. The theory seeks to affect audience cognition, emotions, and behaviors. EDT was applied in the present study alongside media dependency theory. The EDT hypotheses were confirmed: the results showed that 73.6 percent of the students responded positively and passed their English exam for beginners after being selectively exposed to their favorite TV program, which contained dissolved messages about the importance and vitality of learning the English language.

Keywords: effervescent disk theory, selective exposure, media dependency, Libyan students

Procedia PDF Downloads 227
24608 Developing a Deep Understanding of the Immune Response in Hepatitis B Virus Infected Patients Using a Knowledge Driven Approach

Authors: Hanan Begali, Shahi Dost, Annett Ziegler, Markus Cornberg, Maria-Esther Vidal, Anke R. M. Kraft

Abstract:

Chronic hepatitis B virus (HBV) infection can be treated, for example, with nucleot(s)ide analogs (NAs), which inhibit HBV replication. However, they have hardly any influence on the functional cure of HBV, which is defined by hepatitis B surface antigen (HBsAg) loss. NAs need to be taken life-long and are not available to all patients worldwide. Additionally, NA-treated patients are still at risk of developing cirrhosis, liver failure, or hepatocellular carcinoma (HCC). Although each patient has the same components of the immune system, immune responses vary between patients. Therefore, a deeper understanding of the immune response against HBV in different patients is necessary to understand the parameters leading to HBV cure and to use this knowledge to optimize HBV therapies. This requires seamless integration of an enormous amount of diverse and fine-grained data on viral markers, e.g., hepatitis B core-related antigen (HBcrAg) and hepatitis B surface antigen (HBsAg). The data integration system relies on the assumption that profiling human immune systems requires the analysis of various variables (e.g., demographic data, treatments, pre-existing conditions, immune cell response, or HLA typing) rather than only one. However, the values of these variables are collected independently. They are presented in a myriad of formats, e.g., Excel files, textual descriptions, lab book notes, and images of flow cytometry dot plots. Additionally, patients can be identified differently in these analyses. This heterogeneity complicates the integration of variables, as data management techniques are needed to create a unified view in which individual formats and identifiers are made transparent when profiling the human immune systems. The proposed study (HBsRE) aims at integrating heterogeneous data sets of 87 chronically HBV-infected patients, e.g., clinical data, immune cell response, and HLA typing, together with knowledge encoded in biomedical ontologies and open-source databases, into a knowledge-driven framework. This new technique enables us to harmonize and standardize heterogeneous datasets within the defined modeling of the data integration system, which will be evaluated in a knowledge graph (KG). KGs are data structures that represent knowledge and data as factual statements using a graph data model. Finally, the analytic data model will be applied on top of the KG in order to develop a deeper understanding of the immune profiles of various patients and to evaluate the factors that play a role in a holistic profile of patients with HBsAg loss. Additionally, our objective is to utilize this unified approach to stratify patients for new effective treatments. This study is developed in the context of the project “Transforming big data into knowledge: for deep immune profiling in vaccination, infectious diseases, and transplantation (ImProVIT)”, which brings together a multidisciplinary team of computer scientists, infection biologists, and immunologists.

Keywords: chronic hepatitis B infection, immune response, knowledge graphs, ontology

Procedia PDF Downloads 92