Search results for: bond graph modeling
4678 Multi-Level Meta-Modeling for Enabling Dynamic Subtyping for Industrial Automation
Authors: Zoltan Theisz, Gergely Mezei
Abstract:
Modern industrial automation relies on service-oriented concepts of Internet of Things (IoT) device modeling in order to provide a flexible and extendable environment for the service meta-repository. However, state-of-the-art meta-modeling techniques prefer design-time modeling, which results in a heavy usage of class-based, sometimes unnecessary, static subtyping. Although this approach benefits from clear-cut object-oriented design principles, it also seals the model repository against further dynamic extensions. In this paper, a dynamic multi-level modeling approach is introduced that enables dynamic subtyping through a more relaxed partial instantiation mechanism. The approach is demonstrated on a simple sensor network example.
Keywords: meta-modeling, dynamic subtyping, DMLA, industrial automation, arrowhead
Procedia PDF Downloads 358
4677 The Effect of Enamel Surface Preparation on the Self-Etch Bonding of Orthodontic Tubes: An in Vitro Study
Authors: Fernandes A. C. B. C. J., de Jesus V. C., Sepideh N., Vilela OFGG, Somarin K. K., França R., Pinheiro F. H. S. L.
Abstract:
Objective: The purpose of this study was to evaluate the effect of pre-treatment of enamel with pumice and/or 37% phosphoric acid on the shear bond strength (SBS) of orthodontic tubes bonded to enamel, while simultaneously evaluating the efficacy of orthodontic tubes bonded with a self-etch primer (SEP). Materials and Methods: 39 crown halves were divided into 3 groups at random. Group I was the control group utilizing both prophy paste and the conventional double-etching pre-treatment method. Group II excluded the use of prophy paste prior to double etching. Group III excluded the use of both prophy paste and double etching and only utilized SEP. Bond strength of the orthodontic tubes was measured by SBS. One-way ANOVA and Tukey’s HSD test were used to compare SBS values between the three groups. The statistical significance was set to p<0.05. Results: The differences in SBS values of groups I (36.672 ± 9.315 MPa), II (34.242 ± 9.986 MPa), and III (39.055 ± 5.565 MPa) were not statistically significant (p>0.05). Conclusion: This study suggested that the use of prophy paste or pre-acid etching of the enamel surface did not provide a statistically significant difference in SBS between the three groups.
Keywords: shear bond strength, orthodontic bracket, self-etch primer, pumice, prophy
Procedia PDF Downloads 177
4676 Robust Electrical Segmentation for Zone Coherency Delimitation Based on Multiplex Graph Community Detection
Authors: Noureddine Henka, Sami Tazi, Mohamad Assaad
Abstract:
The electrical grid is a highly intricate system designed to transfer electricity from production areas to consumption areas. The Transmission System Operator (TSO) is responsible for ensuring the efficient distribution of electricity and maintaining the grid's safety and quality. However, due to the increasing integration of intermittent renewable energy sources, there is a growing level of uncertainty, which requires a faster responsive approach. A potential solution involves the use of electrical segmentation, which involves creating coherence zones where electrical disturbances mainly remain within the zone. Indeed, by means of coherent electrical zones, it becomes possible to focus solely on the sub-zone, reducing the range of possibilities and aiding in managing uncertainty. It allows faster execution of operational processes and easier learning for supervised machine learning algorithms. Electrical segmentation can be applied to various applications, such as electrical control, minimizing electrical loss, and ensuring voltage stability. Since the electrical grid can be modeled as a graph, where the vertices represent electrical buses and the edges represent electrical lines, identifying coherent electrical zones can be seen as a clustering task on graphs, generally called community detection. Nevertheless, a critical criterion for the zones is their ability to remain resilient to the electrical evolution of the grid over time. This evolution is due to the constant changes in electricity generation and consumption, which are reflected in graph structure variations as well as line flow changes. One approach to creating a resilient segmentation is to design robust zones under various circumstances. This issue can be represented through a multiplex graph, where each layer represents a specific situation that may arise on the grid. Consequently, resilient segmentation can be achieved by conducting community detection on this multiplex graph. The multiplex graph is composed of multiple graphs, and all the layers share the same set of vertices. Our proposal involves a model that utilizes a unified representation to compute a flattening of all layers. This unified situation can be penalized to obtain K connected components representing the robust electrical segmentation clusters. We compare our robust segmentation to the segmentation based on a single reference situation. The robust segmentation proves its relevance by producing clusters with high intra-zone electrical perturbation and low variance of electrical perturbation. The experiments show when robust electrical segmentation provides a benefit and in which context.
Keywords: community detection, electrical segmentation, multiplex graph, power grid
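For orientation, a minimal sketch of the flattening idea, assuming Python with networkx (neither the library nor these function names appear in the abstract): the layers of the multiplex graph are aggregated by summing edge weights, and the weakest aggregated edges are then removed until K connected components remain. The paper's penalization scheme and community-detection details are not reproduced.

```python
import networkx as nx

def flatten_multiplex(layers, weight="weight"):
    """Aggregate the layers of a multiplex graph (shared vertex set) into one
    weighted graph by summing edge weights across layers."""
    flat = nx.Graph()
    for layer in layers:
        flat.add_nodes_from(layer.nodes)
        for u, v, data in layer.edges(data=True):
            w = data.get(weight, 1.0)
            if flat.has_edge(u, v):
                flat[u][v][weight] += w
            else:
                flat.add_edge(u, v, **{weight: w})
    return flat

def robust_zones(layers, k):
    """Toy stand-in for the paper's method: drop the weakest flattened edges
    until the graph splits into k connected components (the candidate zones)."""
    flat = flatten_multiplex(layers)
    for u, v, _ in sorted(flat.edges(data=True), key=lambda e: e[2]["weight"]):
        if nx.number_connected_components(flat) >= k:
            break
        flat.remove_edge(u, v)
    return list(nx.connected_components(flat))
```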
Procedia PDF Downloads 78
4675 Code Embedding for Software Vulnerability Discovery Based on Semantic Information
Authors: Joseph Gear, Yue Xu, Ernest Foo, Praveen Gauravaran, Zahra Jadidi, Leonie Simpson
Abstract:
Deep learning methods have been seeing an increasing application to the long-standing security research goal of automatic vulnerability detection for source code. Attention, however, must still be paid to the task of producing vector representations for source code (code embeddings) as input for these deep learning models. Graphical representations of code, most predominantly Abstract Syntax Trees and Code Property Graphs, have received some use in this task of late; however, for very large graphs representing very large code snippets, learning becomes prohibitively computationally expensive. This expense may be reduced by intelligently pruning this input to only vulnerability-relevant information; however, little research in this area has been performed. Additionally, most existing work comprehends code based solely on the structure of the graph at the expense of the information contained in the nodes of the graph. This paper proposes Semantic-enhanced Code Embedding for Vulnerability Discovery (SCEVD), a deep learning model which uses semantic-based feature selection for its vulnerability classification model. It uses information from the nodes as well as the structure of the code graph in order to select features which are most indicative of the presence or absence of vulnerabilities. This model is implemented and experimentally tested using the SARD Juliet vulnerability test suite to determine its efficacy. It is able to improve on existing code graph feature selection methods, as demonstrated by its improved ability to discover vulnerabilities.
Keywords: code representation, deep learning, source code semantics, vulnerability discovery
Procedia PDF Downloads 155
4674 A Graph SEIR Cellular Automata Based Model to Study the Spreading of a Transmittable Disease
Authors: Natasha Sharma, Kulbhushan Agnihotri
Abstract:
Cellular Automata are discrete dynamical systems which are based on the local character and spatial disparateness of the spreading process. These factors are generally neglected by traditional models based on differential equations for epidemic spread. The aim of this work is to introduce an SEIR model based on cellular automata on graphs to imitate epidemic spreading. Distinctively, it is an SEIR-type model where the population is divided into susceptible, exposed, infected and recovered individuals. The results obtained from simulations are in accordance with the spreading behavior of real epidemics.
Keywords: cellular automata, epidemic spread, graph, susceptible
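A minimal sketch of a graph SEIR cellular automaton of the kind described, assuming Python with networkx; the transition probabilities and the random contact graph are illustrative placeholders, not parameters from the paper.

```python
import random
import networkx as nx

S, E, I, R = "S", "E", "I", "R"  # cell states

def seir_step(g, state, p_exposure=0.3, p_onset=0.2, p_recovery=0.1):
    """One synchronous update: susceptible cells are exposed by infected
    neighbours, exposed cells become infected, infected cells recover."""
    new_state = dict(state)
    for node in g.nodes:
        if state[node] == S:
            infected = sum(state[n] == I for n in g.neighbors(node))
            if infected and random.random() < 1 - (1 - p_exposure) ** infected:
                new_state[node] = E
        elif state[node] == E and random.random() < p_onset:
            new_state[node] = I
        elif state[node] == I and random.random() < p_recovery:
            new_state[node] = R
    return new_state

g = nx.erdos_renyi_graph(200, 0.05)   # placeholder contact graph
state = {n: S for n in g.nodes}
state[0] = I                          # seed a single infected individual
for _ in range(50):
    state = seir_step(g, state)
```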
Procedia PDF Downloads 457
4673 An Approach to Maximize the Influence Spread in the Social Networks
Authors: Gaye Ibrahima, Mendy Gervais, Seck Diaraf, Ouya Samuel
Abstract:
In this paper, we consider influence maximization in social networks. Here we give importance to the initial diffusers, called the seeds. The goal is to efficiently find a subset of k elements in the social network that will begin and maximize the information diffusion process. A new approach, which preprocesses the social network before determining the seeds, is proposed. This treatment eliminates the information feedback toward an element considered as a seed by extracting an acyclic spanning social network. At first, we propose two algorithm versions called SCG-algorithm (v1 and v2) (Spanning Connected Graph algorithm). This algorithm takes as input data a connected social network, directed or not. Finally, a generalization of the SCG-algorithm is proposed. It is called SG-algorithm (Spanning Graph algorithm) and takes as input data any graph. These two algorithms are effective and each has a polynomial complexity. To show the pertinence of our approach, two seed sets are determined, and those given by our approach yield better results. The performance of this approach is clearly visible through the simulations carried out with the R software and the igraph package.
Keywords: acyclic spanning graph, centrality measures, information feedback, influence maximization, social network
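The SCG/SG algorithms themselves are not given in the abstract; the following sketch (assumed Python/networkx, analogous in spirit to the igraph/R implementation mentioned) only illustrates the underlying idea of extracting an acyclic spanning subgraph so that no information feeds back toward a candidate seed.

```python
import networkx as nx

def acyclic_spanning_subgraph(g, seed):
    """Illustration only (not the SCG/SG algorithm): take the BFS arborescence
    rooted at a candidate seed, which keeps the network reachable from the seed
    while removing every cycle, hence every feedback path toward it."""
    directed = g if g.is_directed() else g.to_directed()
    return nx.bfs_tree(directed, seed)

g = nx.karate_club_graph()
tree = acyclic_spanning_subgraph(g, seed=0)
assert nx.is_directed_acyclic_graph(tree)   # no information feedback remains
```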
Procedia PDF Downloads 248
4672 Deep Graph Embeddings for the Analysis of Short Heartbeat Interval Time Series
Authors: Tamas Madl
Abstract:
Sudden cardiac death (SCD) constitutes a large proportion of cardiovascular mortalities, provides little advance warning, and the risk is difficult to recognize based on ubiquitous, low-cost medical equipment such as the standard, 12-lead, ten-second ECG. Autonomic abnormalities have been shown to be strongly predictive of SCD risk; yet current methods are not trivially applicable to the brevity and low temporal and electrical resolution of standard ECGs. Here, we build horizontal visibility graph representations of very short inter-beat interval time series, and perform unsupervised representation learning in order to convert these variable-size objects into fixed-length vectors preserving similarity relations. We show that such representations facilitate classification into healthy vs. at-risk patients on two different datasets, the Multiparameter Intelligent Monitoring in Intensive Care II and the PhysioNet Sudden Cardiac Death Holter Database. Our results suggest that graph representation learning of heartbeat interval time series facilitates robust classification even in sequences as short as ten seconds.
Keywords: sudden cardiac death, heart rate variability, ECG analysis, time series classification
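The horizontal visibility graph construction used as the first step is standard and easy to sketch (assumed Python/networkx; the interval values below are invented, and the unsupervised embedding step is not reproduced):

```python
import networkx as nx

def horizontal_visibility_graph(series):
    """Link samples i and j iff every sample strictly between them is lower
    than both, the defining rule of the horizontal visibility graph."""
    g = nx.Graph()
    g.add_nodes_from(range(len(series)))
    for i in range(len(series)):
        for j in range(i + 1, len(series)):
            if all(series[k] < min(series[i], series[j]) for k in range(i + 1, j)):
                g.add_edge(i, j)
    return g

# a ten-second strip yields only a handful of inter-beat intervals (ms, made up)
rr_intervals = [812, 790, 805, 833, 798, 780, 815, 842, 801, 795]
hvg = horizontal_visibility_graph(rr_intervals)
```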
Procedia PDF Downloads 232
4671 Validation and Interpretation about Precedence Diagram for Start to Finish Relationship by Graph Theory
Authors: Naoki Ohshima, Ken Kaminishi
Abstract:
Four types of dependencies, namely 'Finish-to-start', 'Finish-to-finish', 'Start-to-start' and 'Start-to-finish (S-F)', are modeled as logical relationships based on the PMBOK definition that 'the predecessor activity is defined as an activity to come before a dependent activity in a schedule'. However, a self-contradiction is found in the PMBOK precedence diagram for the S-F relationship. In this paper, the authors validate the logical relationship of S-F by graph theory and propose a new interpretation of the precedence diagram for the S-F relationship.
Keywords: project time management, sequence activity, start-to-finish relationship, precedence diagram, PMBOK
Procedia PDF Downloads 269
4670 Cement Bond Characteristics of Artificially Fabricated Sandstones
Authors: Ashirgul Kozhagulova, Ainash Shabdirova, Galym Tokazhanov, Minh Nguyen
Abstract:
Synthetic rocks have been advantageous over natural rocks in terms of availability and of consistently studying the impact of a particular parameter. Artificial rocks can be fabricated using a variety of techniques, such as mixing sand with Portland cement or gypsum, firing a mixture of sand and fine powder of borosilicate glass, or in-situ precipitation of a calcite solution. In this study, sodium silicate solution has been used as the cementing agent for the quartz sand. The molded soft cylindrical sandstone samples are placed in a gas-tight pressure vessel, where the hardening of the material takes place as the chemical reaction between carbon dioxide and the silicate solution progresses. The vessel allows uniform dispersal of carbon dioxide and control over the ambient gas pressure. The current paper shows how the bonding material is initially distributed in the intergranular space and on the surface of the sand particles by the use of electron microscopy and energy dispersive spectroscopy. During the study, the strength of the cement bond as a function of temperature is observed. The impact of the cementing agent dosage on the micro and macro characteristics of the sandstone is investigated. The analysis of the cement bond at the micro level helps to trace the changes in particle bonding and damage after potential yielding. Shearing behavior and compressional response have been examined, resulting in the estimation of the shearing resistance and cohesion force of the sandstone. These are considered to be the main input values for mathematical models predicting sand production from weak clastic oil reservoir formations.
Keywords: artificial sandstone, cement bond, microstructure, SEM, triaxial shearing
Procedia PDF Downloads 166
4669 Spectral Clustering from the Discrepancy View and Generalized Quasirandomness
Authors: Marianna Bolla
Abstract:
The aim of this paper is to compare spectral, discrepancy, and degree properties of expanding graph sequences. As we can prove equivalences and implications between them and the definition of the generalized (multiclass) quasirandomness of Lovasz–Sos (2008), they can be regarded as generalized quasirandom properties akin to the equivalent quasirandom properties of the seminal Chung–Graham–Wilson paper (1989) in the one-class scenario. Since these properties are valid for deterministic graph sequences, irrespective of stochastic models, the partial implications also justify low-dimensional embedding of large-scale graphs and discrepancy-minimizing spectral clustering.
Keywords: generalized random graphs, multiway discrepancy, normalized modularity spectra, spectral clustering
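As background for readers, a plain normalized-Laplacian spectral clustering routine of the kind the result justifies is sketched below (assumed Python with numpy/scipy/scikit-learn; the discrepancy-minimizing variant analysed in the paper is not reproduced):

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.cluster import KMeans

def spectral_clustering(adjacency, k):
    """Embed vertices with the k bottom eigenvectors of the normalized
    Laplacian, then cluster the embedding with k-means."""
    a = np.asarray(adjacency, dtype=float)
    d = a.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    laplacian = np.eye(len(a)) - d_inv_sqrt @ a @ d_inv_sqrt
    _, vecs = eigh(laplacian)                 # eigenvalues in ascending order
    return KMeans(n_clusters=k, n_init=10).fit_predict(vecs[:, :k])
```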
Procedia PDF Downloads 195
4668 Single Cu‒N₄ Sites Enable Atomic Fe Clusters with High-Performance Oxygen Reduction Reaction
Abstract:
Atomically dispersed Fe‒N₄ catalysts are proven to be promising alternatives to commercial Pt/C for the oxygen reduction reaction. Most reported Fe‒N₄ catalysts suffer from inferior O‒O bond-breaking capability due to superoxo-like O₂ adsorption, even though the isolated dual-atomic metal sites strategy is extensively adopted. Atomic Fe clusters hold greater promise for promoting O‒O bond cleavage by forming peroxo-like O₂ adsorption. However, the excessively strong binding strength between Fe clusters and oxygenated intermediates sacrifices the activity. Here, we first report a Fex/Cu‒N@CF catalyst with atomic Fe clusters functionalized by adjacent single Cu‒N₄ sites anchored on a porous carbon nanofiber membrane. The theoretical calculation indicates that the single Cu‒N₄ sites can modulate the electronic configuration of the Fe clusters to reduce the O₂* protonation reaction free energy, which ultimately enhances the electrocatalytic performance. Particularly, the Cu‒N₄ sites can increase the overlap between the d orbitals of Fe and the p orbitals of O to accelerate O‒O cleavage in OOH*. As a result, this unique atomic catalyst exhibits a half-wave potential (E1/2) of 0.944 V in an alkaline medium, exceeding that of commercial Pt/C, whereas its acidic performance (E1/2 = 0.815 V) is comparable to Pt/C. This work shows the great potential of single atoms for improvements in atomic cluster catalysts.
Keywords: hierarchical porous fibers, atomic Fe clusters, Cu single atoms, oxygen reduction reaction, O-O bond cleavage
Procedia PDF Downloads 114
4667 Chemical Partitioning of Trace Metals in Sub-Surface Sediments of Lake Acigol, Denizli, Turkey
Authors: M. Budakoglu, M. Karaman, D. Kiran, Z. Doner, B. Zeytuncu, B. Tanç, M. Kumral
Abstract:
Lake Acıgöl is one of the large saline lacustrine environments in Turkey. Eleven trace metals (Cr, Mn, Fe, Al, Co, Ni, Cu, Zn, Cd, Pb and As) in 9 surface and subsurface sediment samples from Lake Acıgöl were analyzed with bulk and sequential extraction analysis methods by ICP-MS to obtain the metal distribution patterns in this extreme environment. A five-step sequential extraction technique (1- exchangeable, 2- bound to carbonates, 3- bound to iron and manganese oxides/hydroxides, 4- bound to organic matter and sulphides, and 5- residual fraction incorporated into clay and silicate mineral lattices) was used to characterize the various forms of metals in the <63 μm size sediments. The metal contents (ppm) and their percentages for each extraction step were reported and compared with the results obtained from the total digestion. Results indicate that the sum of the four fractions is in good agreement with the total digestion results of Ni, Cd, As, Zn, Cu and Fe, with satisfactory recoveries (94.04–109.0%), and the method used is reliable and repeatable for these elements. High correlations were found between Fe and Ni loads in fractions F2 and F4, with R² = 0.91 and 0.81, respectively. Comparison of a total of 135 chemical analysis results, over three sampling locations and five fractions, showed elevated correlations for the Fe-Co, Co-Ni and Fe-Ni element couples, with R² = 0.98, 0.92 and 0.91, respectively.
Keywords: Lake Acigol, sequential extraction, recent lake sediment, geochemical speciation of heavy metals
Procedia PDF Downloads 412
4666 Policy Compliance in Information Security
Authors: R. Manjula, Kaustav Bagchi, Sushant Ramesh, Anush Baskaran
Abstract:
In the past century, the emergence of information technology has had a significant positive impact on human life. While companies tend to be more involved in the completion of projects, the turn of the century has seen importance being given to investment in information security policies. These policies are essential to protect important data from adversaries, and thus following these policies has become one of the most important attributes revolving around information security models. In this research, we have focused on the factors affecting information security policy compliance in two models: the theory of planned behaviour, and the integration of the social bond theory and the involvement theory into a single model. Finally, we propose where these theories would be successful.
Keywords: information technology, information security, involvement theory, policies, social bond theory
Procedia PDF Downloads 368
4665 Proposing an Architecture for Drug Response Prediction by Integrating Multiomics Data and Utilizing Graph Transformers
Authors: Nishank Raisinghani
Abstract:
Efficiently predicting drug response remains a challenge in the realm of drug discovery. To address this issue, we propose four model architectures that combine graphical representation with varying positions of multiheaded self-attention mechanisms. By leveraging two types of multi-omics data, transcriptomics and genomics, we create a comprehensive representation of target cells and enable drug response prediction in precision medicine. A majority of our architectures utilize multiple transformer models, one with a graph attention mechanism and the other with a multiheaded self-attention mechanism, to generate latent representations of both drug and omics data, respectively. Our model architectures apply an attention mechanism to both drug and multiomics data, with the goal of procuring more comprehensive latent representations. The latent representations are then concatenated and input into a fully connected network to predict the IC-50 score, a measure of cell drug response. We experiment with all four of these architectures and extract results from all of them. Our study greatly contributes to the future of drug discovery and precision medicine by looking to optimize the time and accuracy of drug response prediction.
Keywords: drug discovery, transformers, graph neural networks, multiomics
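A toy sketch of the fusion idea, assuming PyTorch (the paper's graph attention branch is replaced here by plain self-attention over precomputed drug-atom embeddings, and all dimensions are arbitrary):

```python
import torch
import torch.nn as nn

class DrugResponseFusion(nn.Module):
    """Attend over drug tokens and omics tokens separately, concatenate the
    pooled latent vectors and regress the IC-50 score."""
    def __init__(self, drug_dim=64, omics_dim=64, heads=4):
        super().__init__()
        self.drug_attn = nn.MultiheadAttention(drug_dim, heads, batch_first=True)
        self.omics_attn = nn.MultiheadAttention(omics_dim, heads, batch_first=True)
        self.head = nn.Sequential(
            nn.Linear(drug_dim + omics_dim, 128), nn.ReLU(), nn.Linear(128, 1))

    def forward(self, drug_tokens, omics_tokens):
        d, _ = self.drug_attn(drug_tokens, drug_tokens, drug_tokens)
        o, _ = self.omics_attn(omics_tokens, omics_tokens, omics_tokens)
        latent = torch.cat([d.mean(dim=1), o.mean(dim=1)], dim=-1)
        return self.head(latent).squeeze(-1)      # predicted IC-50

model = DrugResponseFusion()
ic50 = model(torch.randn(8, 30, 64), torch.randn(8, 100, 64))  # batch of 8 cells
```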
Procedia PDF Downloads 152
4664 Coupling of Reticular and Fuzzy Set Modelling in the Analysis of the Action Chains from Socio-Ecosystem, Case of the Renewable Natural Resources Management in Madagascar
Authors: Thierry Ganomanana, Dominique Hervé, Solo Randriamahaleo
Abstract:
Management of Malagasy renewable natural resources allows, in the case of forests, the mobilization of several actors with their norms and/or territories. The interactions in this socio-ecosystem are represented by a graph of two different relationships, in which most of the action chains, from individual activities under the continuous forest dynamics to discrete interventions by institutions, are also studied. Fuzzy set theory is adapted to grade the elements of the set of illegal activities in the space of institutional sanctions by their severity and in the space of forest degradation by their extent.
Keywords: fuzzy set, graph, institution, renewable resource, system
Procedia PDF Downloads 88
4663 Screen Method of Distributed Cooperative Navigation Factors for Unmanned Aerial Vehicle Swarm
Authors: Can Zhang, Qun Li, Yonglin Lei, Zhi Zhu, Dong Guo
Abstract:
Aiming at the problem of factor screening in the distributed collaborative navigation of a dense UAV swarm, an efficient distributed collaborative navigation factor screening method is proposed. The method considers the balance between computing load and positioning accuracy. The proposed algorithm utilizes the factor graph model to implement a distributed collaborative navigation algorithm. The GNSS information of the UAV itself and the ranging information between the UAVs are used as the positioning factors. In this distributed scheme, a local factor graph is established for each UAV. The positioning factors of nodes with good geometric position distribution and small variance are selected to participate in the navigation calculation. To demonstrate and verify the proposed methods, simulations and experiments in different scenarios are performed in this research. Simulation results show that the proposed scheme achieves a good balance between the computing load and positioning accuracy in the distributed cooperative navigation calculation of a UAV swarm. This proposed algorithm has important theoretical and practical value for both industrial and academic areas.
Keywords: screen method, cooperative positioning system, UAV swarm, factor graph, cooperative navigation
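A highly simplified sketch of the screening step, assuming Python/numpy (synthetic numbers; the geometric-distribution criterion and the factor-graph optimization of the paper are omitted):

```python
import numpy as np

def screen_ranging_factors(ranging_history, max_factors=3):
    """Keep the ranging factors (one column per neighbouring UAV) whose recent
    measurements have the smallest sample variance."""
    variances = np.var(ranging_history, axis=0)
    keep = np.argsort(variances)[:max_factors]
    return keep, variances[keep]

# 50 measurement epochs for 6 neighbours, synthetic noise levels
history = np.random.normal(loc=100.0,
                           scale=[0.5, 2.0, 0.2, 5.0, 1.0, 0.8],
                           size=(50, 6))
selected, selected_var = screen_ranging_factors(history)
```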
Procedia PDF Downloads 78
4662 In-situ Performance of Pre-applied Bonded Waterproofing Membranes at Contaminated Test Slabs
Authors: Ulli Heinlein, Thomas Freimann
Abstract:
Pre-applied bonded membranes are used as positive-side waterproofing on concrete basements, are installed before the concrete work, and achieve a tear-resistant and waterproof bond with the subsequently placed fresh concrete. This bond increases redundancy compared to loose waterproofing membranes by preventing lateral water migrations in the event of damage. So far, the membranes have been tested in the laboratory, but it is not yet known how they behave on construction sites in the presence of dirt, soil, cement paste or moisture. This article, therefore, conducts investigations on six construction sites using 18 test slabs in which the pre-applied bonded membranes are selectively contaminated or wetted. Subsequently, cores are taken, and the influence of the contamination on the adhesive tensile strength and the waterproof bond is tested. Pre-applied bonded membranes with smooth or granular but closed surfaces show no sensitivity to wetness, whereas open-pored membranes with nonwovens do not tolerate standing water. Contamination degrades the performance of all pre-applied bonded membranes, since a separating layer is formed between the bonding layer and the concrete. The influence depends on the thickness of the contamination and its mechanical properties.
Keywords: waterproofing, positive-side waterproofing, basement, pre-applied bonded waterproofing membrane, in-situ testing, lateral water migrations
Procedia PDF Downloads 184
4661 N-Heterocyclic Carbene Based Dearomatized Iridium Complex as an Efficient Catalyst towards Carbon-Carbon Bond Formation via Hydrogen Borrowing Strategy
Authors: Mandeep Kaur, Jitendra K. Bera
Abstract:
The search for atom-economical and green synthetic methods for the synthesis of functionalized molecules has attracted much attention. Metal-ligand cooperation (MLC) plays a pivotal role in organometallic catalysis in activating C−H, H−H, O−H, N−H and B−H bonds through reversible bond-breaking and bond-making processes. Towards this goal, a bifunctional N─heterocyclic carbene (NHC) based pyridyl-functionalized amide ligand precursor and the corresponding dearomatized iridium complex were synthesized. NMR and UV/Vis acid titration studies were carried out to prove the proton-responsive nature of the iridium complex. Further, the dearomatized iridium complex was explored as a catalyst, on the platform of MLC via the dearomatization/aromatization mode of action, towards the atom-economical α- and β-alkylation of ketones and secondary alcohols using primary alcohols through the hydrogen borrowing methodology. The key features of the catalysis are high turnover frequency (TOF) values, low catalyst loading, low base loading and no waste product. The greener syntheses of quinoline and lactone derivatives and the selective alkylation of drug molecules like pregnenolone and testosterone were also achieved successfully. Another structurally similar iridium complex was also synthesized with a modified ligand precursor in which the pendant amide unit was absent. The inactivity of this analogous iridium complex towards catalysis authenticated the participation of the proton-responsive imido sidearm of the ligand in accelerating the catalytic reaction. The mechanistic investigation, through control experiments, NMR and deuterium labeling studies, authenticates the borrowing hydrogen strategy.
Keywords: C-C bond formation, hydrogen borrowing, metal ligand cooperation (MLC), n-heterocyclic carbene
Procedia PDF Downloads 178
4660 Merging and Comparing Ontologies Generically
Authors: Xiuzhan Guo, Arthur Berrill, Ajinkya Kulkarni, Kostya Belezko, Min Luo
Abstract:
Ontology operations, e.g., aligning and merging, were studied and implemented extensively in different settings, such as categorical operations, relation algebras, and typed graph grammars, with different concerns. However, aligning and merging operations in these settings share some generic properties, e.g., idempotence, commutativity, associativity, and representativity, labeled by (I), (C), (A), and (R), respectively, which are defined on an ontology merging system (D, ∼, M), where D is a non-empty set of the ontologies concerned, ∼ is a binary relation on D modeling ontology aligning, and M is a partial binary operation on D modeling ontology merging. Given an ontology repository, a finite set O ⊆ D, its merging closure Ô is the smallest set of ontologies that contains the repository and is closed with respect to merging. If (I), (C), (A), and (R) are satisfied, then both D and Ô are partially ordered naturally by merging, and Ô is finite and can be computed, compared, and sorted efficiently, including sorting, selecting, and querying some specific elements, e.g., maximal ontologies and minimal ontologies. We also show that the ontology merging system given by ontology V-alignment pairs and pushouts satisfies the properties (I), (C), (A), and (R), so that the merging system is partially ordered and the merging closure of a given repository with respect to pushouts can be computed efficiently.
Keywords: ontology aligning, ontology merging, merging system, poset, merging closure, ontology V-alignment pair, ontology homomorphism, ontology V-alignment pair homomorphism, pushout
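A direct way to read the merging-closure definition is as a fixed-point computation; the sketch below (assumed Python; `align` and `merge` are user-supplied stand-ins for ∼ and M, and ontologies are assumed hashable) terminates because, per the abstract, the closure of a finite repository is finite when (I), (C), (A), and (R) hold.

```python
def merging_closure(repository, align, merge):
    """Smallest superset of the repository closed under merging: keep merging
    aligned pairs until no new ontology appears."""
    closure = set(repository)
    changed = True
    while changed:
        changed = False
        for a in list(closure):
            for b in list(closure):
                if align(a, b):
                    merged = merge(a, b)       # partial operation: may be None
                    if merged is not None and merged not in closure:
                        closure.add(merged)
                        changed = True
    return closure
```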
Procedia PDF Downloads 892
4659 Numerical Modeling of Large Scale Dam Break Flows
Authors: Amanbek Jainakov, Abdikerim Kurbanaliev
Abstract:
The work presents the results of mathematical modeling of large-scale flows in areas with a complex topographic relief. The Reynolds-averaged Navier-Stokes equations constitute the basis of the three-dimensional unsteady modeling. The well-known Volume of Fluid method, implemented in the solver interFoam of the open-source package OpenFOAM 2.3, is used to track the free-boundary location. The adequacy of the mathematical model is checked by comparing with experimental data. The efficiency of the applied technology is illustrated by the example of modeling the break of the dams of the Andijan (Uzbekistan) and Papan (near the town of Osh, Kyrgyzstan) reservoirs.
Keywords: three-dimensional modeling, free boundary, volume-of-fluid method, dam break, flood, OpenFOAM
Procedia PDF Downloads 401
4658 Allocation of Mobile Units in an Urban Emergency Service System
Authors: Dimitra Alexiou
Abstract:
In an urban area, the placement of emergency service mobile units, such as ambulances and police patrols, must be designed so as to achieve a prompt response to demand locations. In this paper, a partition of a given urban network into distinct sub-networks is performed such that the vertices in each component are close together and, simultaneously, the differences between the population sums of the sub-networks are small, i.e., the population is distributed almost uniformly. The objective here is to position a mobile emergency unit appropriately in each sub-network in order to reduce the response time to the demands. A mathematical model in the framework of graph theory is developed. In order to clarify the corresponding method, a relevant numerical example is presented on a small network.
Keywords: graph partition, emergency service, distances, location
Procedia PDF Downloads 498
4657 DTI Connectome Changes in the Acute Phase of Aneurysmal Subarachnoid Hemorrhage Improve Outcome Classification
Authors: Sarah E. Nelson, Casey Weiner, Alexander Sigmon, Jun Hua, Haris I. Sair, Jose I. Suarez, Robert D. Stevens
Abstract:
Graph-theoretical information from structural connectomes indicated significant connectivity changes and improved acute prognostication in a Random Forest (RF) model in aneurysmal subarachnoid hemorrhage (aSAH), which can lead to significant morbidity and mortality and has traditionally been hampered by poor methods to predict outcome. This study’s hypothesis was that structural connectivity changes occur in canonical brain networks of acute aSAH patients, and that these changes are associated with functional outcome at six months. In a prospective cohort of patients admitted to a single institution for management of acute aSAH, patients underwent diffusion tensor imaging (DTI) as part of a multimodal MRI scan. A weighted undirected structural connectome was created from each patient’s images using Constant Solid Angle (CSA) tractography, with 176 regions of interest (ROIs) defined by the Johns Hopkins Eve atlas. ROIs were sorted into four networks: Default Mode Network, Executive Control Network, Salience Network, and Whole Brain. The resulting nodes and edges were characterized using graph-theoretic features, including Node Strength (NS), Betweenness Centrality (BC), Network Degree (ND), and Connectedness (C). Clinical features (including demographics and the World Federation of Neurologic Surgeons scale) and graph features were used separately and in combination to train RF and Logistic Regression classifiers to predict two outcomes: dichotomized modified Rankin Score (mRS) at discharge and at six months after discharge (favorable outcome mRS 0-2, unfavorable outcome mRS 3-6). A total of 56 aSAH patients underwent DTI a median of 7 days (IQR = 8.5) after admission. The best performing model (RF), combining clinical and DTI graph features, had a mean Area Under the Receiver Operating Characteristic Curve (AUROC) of 0.88 ± 0.00 and Area Under the Precision-Recall Curve (AUPRC) of 0.95 ± 0.00 over 500 trials. The combined model performed better than the clinical model alone (AUROC 0.81 ± 0.01, AUPRC 0.91 ± 0.00). The highest-ranked graph features for prediction were NS, BC, and ND. These results indicate reorganization of the connectome early after aSAH. The performance of clinical prognostic models was increased significantly by the inclusion of DTI-derived graph connectivity metrics. This methodology could significantly improve prognostication of aSAH.
Keywords: connectomics, diffusion tensor imaging, graph theory, machine learning, subarachnoid hemorrhage
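A simplified sketch of the feature-extraction and classification pipeline, assuming Python with networkx and scikit-learn (the exact feature set, tractography and atlas handling of the study are not reproduced):

```python
import networkx as nx
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def connectome_features(connectome):
    """Per-patient vector: node strength, betweenness centrality and degree for
    every ROI, plus a whole-graph connectedness flag."""
    strength = [s for _, s in connectome.degree(weight="weight")]
    betweenness = list(nx.betweenness_centrality(connectome, weight="weight").values())
    degree = [d for _, d in connectome.degree()]
    return np.array(strength + betweenness + degree + [float(nx.is_connected(connectome))])

# X stacks graph features with clinical covariates, y is the dichotomized mRS:
# clf = RandomForestClassifier(n_estimators=500).fit(X_train, y_train)
```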
Procedia PDF Downloads 188
4656 Suboptimal Retiree Allocations with Housing
Authors: Asiye Aydilek, Harun Aydilek
Abstract:
We investigate the costs of various suboptimal allocations in the housing, consumption, bond and stock holdings of a retiree in a setting with recursive utility, considering the extensive empirical evidence that investors make suboptimal decisions in different ways. We find that suboptimal stock holdings impose only modest costs on the retiree. This may have merit in explaining the limited stock investment seen in the data. The cost of suboptimal bond holdings is higher than that of stocks, but still small. This may partially explain why many more people hold bonds compared to stocks. We find that, in suboptimal housing allocations, positive deviations from the optimal level are less costly than negative ones. This may help clarify why the elderly are over-consuming housing, as seen in the housing data. The cost of suboptimal consumption is quite high, the highest of all. Our paper suggests that, in terms of welfare, the decisions of how much liquid wealth to use for consumption and for saving are more important than the decision about the composition of liquid savings. Suboptimal stock holdings are twice as costly under power utility, and suboptimal bond holdings are twenty times as costly under recursive utility. Recursive utility is superior to power utility in terms of rationalizing many people's preference for bonds instead of stocks in investment.
Keywords: housing, recursive utility, retirement, suboptimal decisions, welfare cost
Procedia PDF Downloads 316
4655 Top-K Shortest Distance as a Similarity Measure
Authors: Andrey Lebedev, Ilya Dmitrenok, JooYoung Lee, Leonard Johard
Abstract:
The top-k shortest path routing problem is an extension of finding the shortest path in a given network. The shortest path is one of the most essential measures, as it reveals the relation between two nodes in a network. However, in many real-world networks, whose diameters are small, top-k shortest paths are more interesting as they contain more information about the network topology. Many variations to compute top-k shortest paths have been studied. In this paper, we apply an efficient top-k shortest distance routing algorithm to the link prediction problem and test its efficacy. We compare the results with other baseline and state-of-the-art methods as well as with the shortest path. Then, we also propose a top-k distance based graph matching algorithm.
Keywords: graph matching, link prediction, shortest path, similarity
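A small sketch of using the k shortest distances between two nodes as a similarity signal, assuming Python with networkx (Yen-style enumeration via `shortest_simple_paths`; edge weights default to 1 when absent):

```python
from itertools import islice
import networkx as nx

def top_k_distances(g, u, v, k=3, weight="weight"):
    """Lengths of the k shortest simple paths between u and v; a richer
    node-pair similarity than the single shortest distance."""
    paths = islice(nx.shortest_simple_paths(g, u, v, weight=weight), k)
    return [sum(g[a][b].get(weight, 1) for a, b in zip(p, p[1:])) for p in paths]

g = nx.karate_club_graph()
print(top_k_distances(g, 0, 33, k=3))
```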
Procedia PDF Downloads 356
4654 Equity, Bonds, Institutional Debt and Economic Growth: Evidence from South Africa
Authors: Ashenafi Beyene Fanta, Daniel Makina
Abstract:
Economic theory predicts that finance promotes economic growth. Although the finance-growth link is among the most researched areas in financial economics, our understanding of the link between the two is still incomplete. This is caused by, among other things, wrong econometric specifications, the use of weak proxies of financial development, and the inability to address the endogeneity problem. Studies on the finance-growth link in South Africa consistently report economic growth driving financial development. Early studies found that economic growth drives financial development in South Africa, and recent studies have confirmed this using different econometric models. However, the monetary aggregate (i.e., M2) utilized in these studies is considered a weak proxy for financial development. Furthermore, the fact that the models employed do not address the endogeneity problem in the finance-growth link casts doubt on the validity of the conclusions. For this reason, the current study examines the finance-growth link in South Africa using data for the period 1990 to 2011 by employing a generalized method of moments (GMM) technique that is capable of addressing endogeneity, simultaneity and omitted variable bias problems. Unlike previous cross-country and country case studies that have also used the same technique, our contribution is that we account for the development of bond markets and non-bank financial institutions rather than being limited to stock market and banking sector development. We find that bond market development affects economic growth in South Africa, and no similar effect is observed for the bank and non-bank financial intermediaries and the stock market. Our findings show that examination of individual elements of the financial system is important in understanding the unique effect of each on growth. The observation that bond markets, rather than private credit and stock market development, promote economic growth in South Africa induces an intriguing question as to what unique roles bond markets play that the intermediaries and equity markets are unable to play. Crucially, our results support observations in the literature that using appropriate measures of financial development is critical for policy advice. They also support the suggestion that individual elements of the financial system need to be studied separately to consider their unique roles in advancing economic growth. We believe that our understanding of the channels through which bond markets contribute to growth would be a fertile ground for future research.
Keywords: bond market, finance, financial sector, growth
Procedia PDF Downloads 421
4653 Optimal Resource Configuration and Allocation Planning Problem for Bottleneck Machines and Auxiliary Tools
Authors: Yin-Yann Chen, Tzu-Ling Chen
Abstract:
This study presents the case of an actual Taiwanese semiconductor assembly and testing manufacturer. Three major bottleneck manufacturing processes, namely, die bond, wire bond, and molding, are analyzed to determine how to use finite resources to achieve the optimal capacity allocation. A medium-term capacity allocation planning model is developed by considering the optimal total profit to satisfy the promised volume demanded by customers and to obtain the best migration decision among production lines for machines and tools. Finally, sensitivity analysis based on the actual case is provided to explore the effect of various parameter levels.
Keywords: capacity planning, capacity allocation, machine migration, resource configuration
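As a toy illustration of the allocation side of such a model (not the paper's formulation, and with invented numbers), a linear program assuming Python with scipy could look like:

```python
from scipy.optimize import linprog

# Three products compete for hours on the die-bond, wire-bond and molding lines.
profit = [-30, -45, -25]                 # per-unit profit, negated for minimization
hours_per_unit = [[0.4, 0.6, 0.3],       # die bond
                  [0.5, 0.8, 0.4],       # wire bond
                  [0.3, 0.5, 0.2]]       # molding
capacity = [400, 500, 350]               # available machine hours per line
demand = [(0, 600), (0, 400), (0, 800)]  # promised customer volumes as upper bounds

res = linprog(c=profit, A_ub=hours_per_unit, b_ub=capacity, bounds=demand)
best_mix, total_profit = res.x, -res.fun
```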
Procedia PDF Downloads 457
4652 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis
Authors: Meng Su
Abstract:
High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis
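A bare-bones diffusion map embedding, assuming Python with numpy (kernel bandwidth, diffusion time and the skipped trivial eigenvector follow the standard construction; the DPM generative half of the proposal is not sketched):

```python
import numpy as np

def diffusion_map(X, n_components=2, epsilon=1.0, t=1):
    """Gaussian affinities -> row-normalized Markov matrix -> embedding from
    its leading non-trivial eigenvectors scaled by eigenvalue powers."""
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    kernel = np.exp(-sq_dists / epsilon)
    markov = kernel / kernel.sum(axis=1, keepdims=True)
    vals, vecs = np.linalg.eig(markov)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    return vecs[:, 1:n_components + 1] * vals[1:n_components + 1] ** t
```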
Procedia PDF Downloads 105
4651 Models to Calculate Lattice Spacing, Melting Point and Lattice Thermal Expansion of Ga₂Se₃ Nanoparticles
Authors: Mustafa Saeed Omar
Abstract:
The formula that contains the maximum increase of the mean bond length, the melting entropy and the critical particle radius is used to calculate the lattice volume in nanoscale crystals of Ga₂Se₃. This compound belongs to the binary group III₂VI₃. The critical radius is calculated from the value of the first surface atomic layer height, which is equal to 0.336 nm. The size-dependent mean bond length is calculated by using an equation free from fitting parameters. The size-dependent lattice parameter is then accordingly used to calculate the size-dependent lattice volume. The lattice volume in the nanoscale region increases to about 77.6 Å³, which is up to four times its bulk value of 19.97 Å³. From the size dependence of the lattice volume, the size dependence of the melting temperature is calculated. The melting temperature decreases as the particle size is reduced and becomes zero when the radius reaches its critical value. The bulk melting temperature of Ga₂Se₃, for example, is 1293 K. From the size-dependent melting temperature and mean bond length, the size-dependent lattice thermal expansion is calculated. The lattice thermal expansion decreases with decreasing nanoparticle size and reaches its minimum value as the radius drops to about 5 nm.
Keywords: Ga₂Se₃, lattice volume, lattice thermal expansion, melting point, nanoparticles
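The author's bond-length/entropy-based formula is not reproduced in the abstract; for orientation only, the qualitative behaviour described there (the melting temperature falling to zero as the radius approaches the critical radius) is captured by a first-order relation of the form

\[ T_m(r) \approx T_m(\infty)\left(1 - \frac{r_c}{r}\right), \qquad r > r_c, \]

with \( T_m(\infty) = 1293\,\mathrm{K} \) for Ga₂Se₃. This is an illustrative assumption, not the model used in the paper.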
Procedia PDF Downloads 166
4650 High-pressure Crystallographic Characterization of f-block Element Complexes
Authors: Nicholas B. Beck, Thomas E. Albrecht-Schönzart
Abstract:
High pressure results in decreases in the lengths of metal-ligand bonds, which has proven to be incredibly informative in uncovering differences in bonding between lanthanide and actinide complexes. The degree of f-electron contribution to the metal-ligand bonds has been observed to increase under pressure by a far greater degree in the actinides than in the lanthanides, as revealed by spectroscopic studies. However, the actual changes in bond lengths have yet to be quantified, although they have been predicted computationally. By using high-pressure crystallographic techniques, crystal structures of lanthanide complexes have been obtained at pressures up to 5 GPa for both hard- and soft-donor ligands. These studies have revealed some unpredicted changes in the coordination environment as well as provided experimental support for computational results.
Keywords: crystallography, high-pressure, lanthanide, materials
Procedia PDF Downloads 103
4649 Classification of Equations of Motion
Authors: Amritpal Singh Nafria, Rohit Sharma, Md. Shami Ansari
Abstract:
Up to now, only five different equations of motion can be derived from a velocity-time graph without needing to know the normal and frictional forces acting at the point of contact. In this paper, we obtained all possible requisite conditions for considering an equation as an equation of motion. After that, we classified the equations of motion by considering two equations as fundamental kinematical equations of motion and the other three as additional kinematical equations of motion. After deriving these five equations of motion, we examine the easiest way of solving a wide variety of useful numerical problems. At the end of the paper, we discuss the importance and educational benefits of the classification of equations of motion.
Keywords: velocity-time graph, fundamental equations, additional equations, requisite conditions, importance and educational benefits
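For reference, the five constant-acceleration equations that can be read off a velocity-time graph (standard notation: u initial velocity, v final velocity, a acceleration, s displacement, t time; which two the paper treats as fundamental is not specified in the abstract) are

\[
v = u + at, \qquad
s = ut + \tfrac{1}{2}at^{2}, \qquad
v^{2} = u^{2} + 2as, \qquad
s = \tfrac{1}{2}(u + v)t, \qquad
s = vt - \tfrac{1}{2}at^{2}.
\]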
Procedia PDF Downloads 787