Search results for: random graph
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2443


1843 Citation Analysis of New Zealand Court Decisions

Authors: Tobias Milz, L. Macpherson, Varvara Vetrova

Abstract:

The law is a fundamental pillar of human societies as it shapes, controls and governs how humans conduct business, behave and interact with each other. Recent advances in computer-assisted technologies such as NLP, data science and AI are creating opportunities to support the practice, research and study of this pervasive domain. It is therefore not surprising that there has been an increase in investment in supporting technologies for the legal industry (also known as “legal tech” or “law tech”) over the last decade. A sub-discipline of particular appeal is concerned with assisted legal research. Supporting law researchers and practitioners to retrieve information from the vast amount of ever-growing legal documentation is of natural interest to the legal research community. One tool that has been in use for this purpose since the early nineteenth century is legal citation indexing. Among other use cases, citation indexes provide an effective means of discovering precedent cases. Nowadays, computer-assisted network analysis tools allow for new and more efficient ways to reveal the “hidden” information that is conveyed through citation behavior. Unfortunately, access to openly available legal data is still lacking in New Zealand, and access to such networks is only commercially available via providers such as LexisNexis. Consequently, there is a need to create, analyze and provide a legal citation network with sufficient data to support legal research tasks. This paper describes the development and analysis of a legal citation network for New Zealand containing over 300,000 decisions from 125 different courts across all areas of law and jurisdiction. Using Python, the authors assembled web crawlers, scrapers and an OCR pipeline to collect and convert court decisions from openly available sources such as NZLII into uniform and machine-readable text. This facilitated the use of regular expressions to identify references to other court decisions within the decision text. The data was then imported into a graph-based database (Neo4j) with the courts and their respective cases represented as nodes and the extracted citations as links. Furthermore, additional links between the courts of connected cases were added to indicate an indirect citation between the courts. Neo4j, as a graph-based database, allows efficient querying and the use of network algorithms such as PageRank to reveal the most influential/most cited courts and court decisions over time. This paper shows that the in-degree distribution of the New Zealand legal citation network resembles a power-law distribution, which indicates a possible scale-free behavior of the network. This is in line with findings for the respective citation networks of the U.S. Supreme Court, Austria and Germany. The authors of this paper provide the database as an openly available data source to support further legal research. The decision texts can be exported from the database to be used for NLP-related legal research, while the network can be used for in-depth analysis. For example, users of the database can restrict the network algorithms and metrics to specific courts in order to filter the results to the area of law of interest.
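
As a rough illustration of the kind of analysis described above (not the authors' Neo4j pipeline), the minimal sketch below builds a toy citation graph with NetworkX, ranks decisions with PageRank, and tabulates the in-degree distribution; the case identifiers are invented placeholders.

```python
import networkx as nx
from collections import Counter

# Toy directed citation graph: an edge A -> B means decision A cites decision B.
# Case identifiers are hypothetical placeholders, not real NZ decisions.
citations = [
    ("2020_NZHC_101", "2015_NZCA_55"),
    ("2020_NZHC_101", "2012_NZSC_7"),
    ("2019_NZHC_340", "2012_NZSC_7"),
    ("2021_NZCA_12",  "2012_NZSC_7"),
    ("2021_NZCA_12",  "2015_NZCA_55"),
]

G = nx.DiGraph()
G.add_edges_from(citations)

# PageRank highlights the most influential (most cited, directly and indirectly) decisions.
ranks = nx.pagerank(G, alpha=0.85)
for case, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(f"{case}: {score:.3f}")

# In-degree distribution: on the full network, the tail of this distribution is
# what gets compared against a power law.
in_degree_counts = Counter(dict(G.in_degree()).values())
print(dict(in_degree_counts))
```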

Keywords: case citation network, citation analysis, network analysis, Neo4j

Procedia PDF Downloads 92
1842 Analysing the Moderating Effect of Customer Loyalty on Long Run Repurchase Intentions

Authors: John Akpesiri Olotewo

Abstract:

One of the controversies in the existing marketing literature is how to get existing and new customers to form repurchase intentions in the long run; however, empirical answers to this question are scanty in existing studies. Thus, this study investigates the moderating effect of consumer loyalty on long-run repurchase intentions in the telecommunication industry in Lagos State and its environs. The study adopted a field survey research design, using a questionnaire to elicit responses from 250 respondents who were selected using random and stratified random sampling techniques from the telecommunication industry in Lagos State, Nigeria. The internal consistency of the research instrument was verified using Cronbach’s alpha; the result of 0.89 implies acceptable internal consistency of the survey instrument. The research hypotheses were tested using the Pearson Product-Moment Correlation (PPMC), simple regression analysis and inferential statistics with the aid of the Statistical Package for the Social Sciences version 20.0 (SPSS). The study confirmed that customer satisfaction has a significant relationship with customer loyalty in the telecommunication industry; service quality has a significant relationship with customer loyalty to a brand; loyalty programs have a significant relationship with customer loyalty to a network operator in Nigeria; and customer loyalty has a significant effect on the long-run repurchase intentions of the customer. The study concluded that one of the determinants of the long-term profitability of a business entity is the long-run repurchase intentions of its customers, which hinge on the level of brand loyalty of the customer. Thus, it was recommended that service providers in Nigeria should improve on factors such as customer satisfaction, service quality, and loyalty programs in order to increase the loyalty of their customers to their brands, thereby increasing their repurchase intentions.
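
For readers unfamiliar with the reliability and correlation statistics cited above, the hedged sketch below reproduces them on synthetic Likert-style data; the item scores are invented and the computation is a generic one, not the authors' SPSS workflow.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic responses: 250 respondents x 10 Likert items (1-5), invented for illustration.
items = rng.integers(1, 6, size=(250, 10)).astype(float)

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total score).
k = items.shape[1]
item_var_sum = items.var(axis=0, ddof=1).sum()
total_var = items.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_var_sum / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")

# Pearson product-moment correlation between two composite scores,
# e.g. customer satisfaction vs. customer loyalty (both synthetic here).
satisfaction = items[:, :5].mean(axis=1)
loyalty = items[:, 5:].mean(axis=1)
r, p = stats.pearsonr(satisfaction, loyalty)
print(f"r = {r:.2f}, p = {p:.3f}")
```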

Keywords: customer loyalty, long run repurchase intentions, brands, service quality and customer satisfaction

Procedia PDF Downloads 221
1841 Transformational Leadership Style of Principal and Conflict Management in Public Secondary Schools in North Central Nigeria

Authors: Odeh Regina Comfort, Angelina Okewu Ogwuche

Abstract:

The study investigated the transformational leadership style of principals and conflict management in secondary schools in North Central Nigeria. A descriptive survey design was adopted. The population of the study comprised 34,473 teachers in 1,949 public secondary schools in the study area. Proportionate stratified random sampling and simple random sampling techniques were used to select 39 public secondary schools and 689 respondents, respectively, for the study. The researchers utilized a self-structured questionnaire titled 'Influence of Transformational Leadership Style Questionnaire (ITLSQ)'. Face and content validity were ensured. A reliability index of 0.86 was obtained using Cronbach's alpha. The instrument was a modified Likert rating scale of Very High Extent (4), High Extent (3), Low Extent (2) and Very Low Extent (1). Means and standard deviations were used to answer the two research questions, while the chi-square goodness-of-fit test was used to test the two hypotheses at the 0.05 level of significance. The results indicate, among others, that the intellectual stimulation and individualized consideration components of the transformational leadership style of principals in public secondary schools in the study area have a significant influence on conflict management in secondary schools. Based on the results, it was recommended that principals of secondary schools should be encouraged to practice the intellectual stimulation component of transformational leadership, which helps them take teachers' levels of knowledge into account when deciding what suits them to reach high levels of attainment, thereby minimizing conflict in school settings. Transformational leadership, especially individualized consideration, should also be taught at all levels of secondary school so as to have a positive impact on the overall performance of teachers and thereby help minimize conflict in schools.

Keywords: conflict management, individualized consideration, intellectual stimulation, transformational leadership style

Procedia PDF Downloads 117
1840 Identifying Strategies for Improving Railway Services in Bangladesh

Authors: Armana Sabiha Huq, Tahmina Rahman Chowdhury

Abstract:

In this paper, based on a stated preference experiment, the service quality of Bangladesh Railway has been assessed, and particular importance has been given to investigating whether there exists a relationship between service quality and safety. For investigation purposes, environmental and organizational factors were assumed to determine the safety performance of the railway. Data collected from the survey were analyzed by importance-performance analysis (IPA). In this paper, the well-known importance-performance analysis (IPA) has been modified by adopting importance weights determined through a structural equation modeling (SEM) approach and by plotting the gap between importance and performance on a visual graph. It has been found that there exists a relationship between safety and serviceability to some extent. Limited resources are an important constraint on improving the safety and serviceability of Bangladesh Railway; moreover, it is observed that only limited resources are available to monitor and improve the safety performance of the railway.
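
The gap-based IPA described above can be illustrated with a minimal numpy sketch; the attribute names, SEM-derived weights and mean ratings below are invented placeholders, and the quadrant rule is the standard IPA one rather than the authors' exact modification.

```python
import numpy as np

# Hypothetical service attributes with SEM-derived importance weights and
# mean performance ratings on a 1-5 scale; all numbers are invented.
attributes = ["punctuality", "cleanliness", "safety equipment", "ticketing"]
importance = np.array([0.85, 0.55, 0.90, 0.60])    # normalized SEM weights
performance = np.array([2.8, 3.9, 2.5, 3.6])        # mean survey ratings

# Rescale importance to the same 1-5 range so the importance-performance gap is comparable.
imp_scaled = 1 + 4 * (importance - importance.min()) / (importance.max() - importance.min())
gap = performance - imp_scaled

imp_mid, perf_mid = imp_scaled.mean(), performance.mean()
for name, imp, perf, g in zip(attributes, imp_scaled, performance, gap):
    quadrant = ("Concentrate here" if imp >= imp_mid and perf < perf_mid else
                "Keep up the good work" if imp >= imp_mid else
                "Low priority" if perf < perf_mid else
                "Possible overkill")
    print(f"{name:18s} gap={g:+.2f}  -> {quadrant}")
```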

Keywords: importance-performance analysis, GAP-IPA, SEM, serviceability, safety, factor analysis

Procedia PDF Downloads 123
1839 Deterministic and Stochastic Modeling of a Micro-Grid Management for Optimal Power Self-Consumption

Authors: D. Calogine, O. Chau, S. Dotti, O. Ramiarinjanahary, P. Rasoavonjy, F. Tovondahiniriko

Abstract:

Mafate is a natural circus in the north-western part of Reunion Island, without an electrical grid or road network. A micro-grid concept is being experimented with in this area, composed of photovoltaic production combined with electrochemical batteries, in order to meet the local population's electricity demands through self-consumption. This work develops a discrete model as well as a stochastic model in order to reach an optimal equilibrium between production and consumption for a cluster of houses. The management of the energy leads to a large linearized programming system, where the time interval of interest is 24 hours. The experimental data are solar production, storage energy, and the parameters of the different electrical devices and batteries. The unknown variables to evaluate are the consumptions of the various electrical services, the energy drawn from and stored in the batteries, and the inhabitants’ planning wishes. The objective is to fit the solar production to the electrical consumption of the inhabitants, with an optimal use of the energy in the batteries, by satisfying the users' planning requirements as widely as possible. In the discrete model, the different parameters and solutions of the linear programming system are deterministic scalars. In the stochastic approach, by contrast, the data parameters and the linear programming solutions become random variables, whose distributions can be imposed or established by estimation from samples of real observations or from samples of optimal discrete equilibrium solutions.
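
As a heavily simplified illustration of the deterministic formulation (a toy model only, not the authors' full system), the sketch below schedules battery charging and discharging over 24 hourly steps with scipy's linear-programming solver; the solar profile, demand profile and battery parameters are invented, and the real model's device-level consumptions and planning wishes are omitted.

```python
import numpy as np
from scipy.optimize import linprog

T = 24                                                                     # hourly steps
solar = np.clip(4.0 * np.sin(np.pi * (np.arange(T) - 6) / 12), 0, None)    # kW, invented profile
demand = 1.0 + 0.8 * (np.arange(T) >= 18)                                  # kW, invented profile
cap, c_max, d_max, eta, soc0 = 6.0, 2.0, 2.0, 0.9, 3.0                     # battery data, invented

# Decision vector x = [charge(0..T-1), discharge(0..T-1), unmet(0..T-1), soc(0..T-1)]
n = 4 * T
c_idx, d_idx, u_idx, s_idx = (np.arange(T), T + np.arange(T),
                              2 * T + np.arange(T), 3 * T + np.arange(T))

cost = np.zeros(n)
cost[u_idx] = 1.0                      # objective: minimize total unmet demand

# Power balance each hour: charge - discharge - unmet <= solar - demand.
A_ub = np.zeros((T, n))
A_ub[np.arange(T), c_idx] = 1.0
A_ub[np.arange(T), d_idx] = -1.0
A_ub[np.arange(T), u_idx] = -1.0
b_ub = solar - demand

# Battery dynamics: soc_t - soc_{t-1} - eta*charge_t + discharge_t = 0, with soc_{-1} = soc0.
A_eq = np.zeros((T, n))
b_eq = np.zeros(T)
for t in range(T):
    A_eq[t, s_idx[t]] = 1.0
    A_eq[t, c_idx[t]] = -eta
    A_eq[t, d_idx[t]] = 1.0
    if t == 0:
        b_eq[t] = soc0
    else:
        A_eq[t, s_idx[t - 1]] = -1.0

bounds = [(0, c_max)] * T + [(0, d_max)] * T + [(0, None)] * T + [(0, cap)] * T
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("total unmet demand (kWh):", round(res.x[u_idx].sum(), 3))
```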

Keywords: photovoltaic production, power consumption, battery storage resources, random variables, stochastic modeling, estimations of probability distributions, mixed integer linear programming, smart micro-grid, self-consumption of electricity

Procedia PDF Downloads 95
1838 A Comparison of Image Data Representations for Local Stereo Matching

Authors: André Smith, Amr Abdel-Dayem

Abstract:

The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While at its core the cost is based on comparing the image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine the effectiveness of these data representations in reducing the cost of the correct correspondence relative to other possible matches.
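
A minimal example of such a cost function (a plain sum-of-absolute-differences over a window, computed here on synthetic grayscale data rather than any of the representations compared in the paper) is sketched below.

```python
import numpy as np

def sad_cost(left, right, row, col, disparity, half_window=3):
    """Sum of absolute differences between a window in the left image and the
    window shifted by `disparity` in the right image (grayscale float arrays)."""
    r0, r1 = row - half_window, row + half_window + 1
    c0, c1 = col - half_window, col + half_window + 1
    return np.abs(left[r0:r1, c0:c1] - right[r0:r1, c0 - disparity:c1 - disparity]).sum()

rng = np.random.default_rng(1)
left = rng.random((60, 80))
true_disp = 5
right = np.roll(left, -true_disp, axis=1)   # synthetic right view: left shifted by 5 px

row, col = 30, 40
costs = {d: sad_cost(left, right, row, col, d) for d in range(0, 16)}
best = min(costs, key=costs.get)
print("estimated disparity:", best)         # expect 5 for this synthetic pair
```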

Keywords: colour data, local stereo matching, stereo correspondence, disparity map

Procedia PDF Downloads 356
1837 Understanding Health Behavior Using Social Network Analysis

Authors: Namrata Mishra

Abstract:

The health of a person plays a vital role in the collective health of his or her community and hence the well-being of society as a whole. But in today's fast-paced, technology-driven world, health issues are increasingly being associated with human behaviors and lifestyles. Social networks have a tremendous impact on the health behavior of individuals. Many researchers have used social network analysis to understand human behavior in relation to social and economic environments. It would be interesting to use a similar analysis to understand human behaviors that have health implications. This paper focuses on concepts for analyzing behaviors with health implications using social network analysis and provides possible algorithmic approaches. The results of these approaches can be used by governing authorities to roll out health plans and benefits and to take preventive measures, by pharmaceutical companies to target specific markets, and by health insurance companies to better model their insurance plans.
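
Since the keywords point to breadth-first search over a directed graph, a generic sketch of that traversal is given below; the social graph and its node names are invented, and the snippet only shows how one might collect the contacts reachable within k hops of an individual, which is the kind of neighbourhood such analyses operate on.

```python
from collections import deque

# Hypothetical directed "influences" graph: an edge u -> v means u influences v.
graph = {
    "alice": ["bob", "carol"],
    "bob": ["dave"],
    "carol": ["dave", "erin"],
    "dave": [],
    "erin": ["bob"],
}

def within_k_hops(graph, source, k):
    """Breadth-first search returning each reachable node and its hop distance (<= k)."""
    distance = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        if distance[node] == k:
            continue
        for neighbour in graph.get(node, []):
            if neighbour not in distance:
                distance[neighbour] = distance[node] + 1
                queue.append(neighbour)
    return distance

print(within_k_hops(graph, "alice", 2))   # {'alice': 0, 'bob': 1, 'carol': 1, 'dave': 2, 'erin': 2}
```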

Keywords: breadth first search, directed graph, health behaviors, social network analysis

Procedia PDF Downloads 456
1836 Comparison between Continuous Genetic Algorithms and Particle Swarm Optimization for Distribution Network Reconfiguration

Authors: Linh Nguyen Tung, Anh Truong Viet, Nghien Nguyen Ba, Chuong Trinh Trong

Abstract:

This paper proposes a reconfiguration methodology based on a continuous genetic algorithm (CGA) and particle swarm optimization (PSO) for minimizing active power loss and voltage deviation. Both algorithms are adapted using graph theory to generate feasible individuals, and a modified crossover is used for the continuous variables of the CGA. To demonstrate the performance and effectiveness of the proposed methods, a comparative analysis of the CGA and PSO for network reconfiguration on the 33-node and 119-bus radial distribution systems is presented. The simulation results show that both the CGA and PSO can be used for distribution network reconfiguration, and that the CGA outperformed PSO with a significantly higher success rate in finding the optimal distribution network configuration.

Keywords: distribution network reconfiguration, particle swarm optimization, continuous genetic algorithm, power loss reduction, voltage deviation

Procedia PDF Downloads 169
1835 The Influence of Carbamazepine on the Activity of CYP3A4 in Patients with Alcoholism

Authors: Mikhail S. Zastrozhin, Valery V. Smirnov, Dmitry A. Sychev, Ludmila M. Savchenko, Evgeny A. Bryun, Mark O. Nechaev

Abstract:

Cytochrome P-450 isoenzyme 3A4 takes part in the biotransformation of medicinal drugs. The activity of CYP isoenzymes depends on genetic factors (polymorphisms of the genes encoding them) and phenotypic factors (diet, concomitant drug therapy). The aim of the study was to evaluate the effect of carbamazepine on CYP3A4 activity in patients with alcohol addiction. The study included 25 men with alcohol dependence, who received haloperidol during the exacerbation of the addiction. CYP3A4 activity was assessed by the urinary 6-beta-hydroxycortisol/cortisol ratio measured by high performance liquid chromatography with mass spectrometry. The study produced a graph and an equation of the logarithmic regression reflecting the dependence of CYP3A4 activity on the dose of carbamazepine: y = 5.5 × 9.1 × 10⁻⁵ × x². The study demonstrates a statistically significant effect of carbamazepine on CYP3A4 isoenzyme activity in patients with alcohol addiction.

Keywords: CYP3A4, biotransformation, carbamazepine, alcohol abuse

Procedia PDF Downloads 255
1834 Disadvantages and Drawbacks of Concrete Blocks and Fixing Their Defects

Authors: Ehsan Sadie

Abstract:

Today, the cost of repair and maintenance of structures is very important. By studying the behavior of reinforced concrete structures, several factors that reduce the durability of structures can be identified, such as design and calculation errors, lack of proper implementation of structural changes, damage caused by the introduction of random loads, concrete corrosion and environmental conditions. Meanwhile, alterations to building codes also cause changes in the assessment and review of the design and structure, so that, where necessary, structures can be improved and strengthened in the future.

Keywords: concrete building, expandable cement, honeycombed surface, reinforcement corrosion

Procedia PDF Downloads 423
1833 Quantum Computing with Qudits on a Graph

Authors: Aleksey Fedorov

Abstract:

Building a scalable platform for quantum computing remains one of the most challenging tasks in quantum science and technologies. However, the implementation of the most important quantum operations with qubits (quantum analogues of classical bits), such as the multiqubit Toffoli gate, requires either a polynomial number of operations or a linear number of operations with the use of ancilla qubits. Therefore, reducing the number of operations while preserving scalability is a crucial goal in quantum information processing. One of the most elegant ideas in this direction is to use qudits (multilevel systems) instead of qubits and to rely on the additional levels of the qudits instead of ancillas. Although some previously obtained results demonstrate a reduction in the number of operations, they suffer from high complexity and/or a lack of scalability. We show a strong reduction in the number of operations for the realization of the Toffoli gate by using qudits in a scalable multi-qudit processor. This is done on the basis of a general relation, which we derive, between the dimensionality of the qudits and the topology of their connections.

Keywords: quantum computing, qudits, Toffoli gates, gate decomposition

Procedia PDF Downloads 131
1832 Bayesian Approach for Moving Extremes Ranked Set Sampling

Authors: Said Ali Al-Hadhrami, Amer Ibrahim Al-Omari

Abstract:

In this paper, Bayesian estimation of the mean of the exponential distribution is considered using Moving Extremes Ranked Set Sampling (MERSS). Three priors are used: Jeffreys, conjugate and constant, under both MERSS and Simple Random Sampling (SRS). Some properties of the proposed estimators are investigated. It is found that the suggested estimators using MERSS are more efficient than their counterparts based on SRS.

Keywords: Bayesian, efficiency, moving extreme ranked set sampling, ranked set sampling

Procedia PDF Downloads 496
1831 Migration in Times of Uncertainty

Authors: Harman Jaggi, David Steinsaltz, Shripad Tuljapurkar

Abstract:

Understanding the effect of fluctuations on populations is crucial in the context of increasing habitat fragmentation, climate change, and biological invasions, among others. Migration in response to environmental disturbances enables populations to escape unfavorable conditions, benefit from new environments and thereby ride out fluctuations in variable environments. Would populations disperse if there were no uncertainty? Karlin showed in 1982 that when sub-populations experience distinct but fixed growth rates at different sites, greater mixing of populations will lower the overall growth rate relative to the most favorable site. Here we ask if and when environmental variability favors migration over no migration. Specifically, in random environments, would a small amount of migration increase the overall long-run growth rate relative to the zero-migration case? We use analysis and simulations to show how the long-run growth rate changes with migration rate. Our results show that when fitness (dis)advantages fluctuate over time across sites, migration may allow populations to benefit from variability. When there is one best site with the highest growth rate, the effect of migration on the long-run growth rate depends on the difference in expected growth between sites, scaled by the variance of the difference. When the variance is large, there is a substantial probability of an inferior site experiencing a higher growth rate than its average. Thus, a high variance can compensate for a difference in average growth rates between sites. Positive correlations in growth rates across sites favor less migration. With multiple sites and large fluctuations, the length of the shortest cycle (excursion) from the on-average best site matters, and we explore the interplay between excursion length, average differences between sites and the size of fluctuations. Our findings have implications for conservation biology: even when there are superior sites in a sea of poor habitats, variability and habitat quality across space may be key to determining the importance of migration.
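
A hedged toy simulation of the question posed above is sketched below: two sites with fluctuating log growth rates, a small symmetric migration rate, and the long-run growth rate estimated as the average log growth of the total population. The parameter values are invented and the model is far simpler than the authors' multi-site analysis.

```python
import numpy as np

rng = np.random.default_rng(42)

def long_run_growth(m, steps=100_000):
    """Average log growth of the total population for symmetric migration rate m."""
    mean_log_r = np.array([0.05, 0.00])    # site 1 is better on average (invented values)
    sd_log_r = np.array([0.40, 0.40])      # both sites fluctuate independently
    M = np.array([[1 - m, m],
                  [m, 1 - m]])             # migration applied after growth
    n = np.array([1.0, 1.0])
    total_log_growth = 0.0
    for _ in range(steps):
        r = np.exp(mean_log_r + sd_log_r * rng.standard_normal(2))
        n_new = M @ (r * n)
        total_log_growth += np.log(n_new.sum() / n.sum())
        n = n_new / n_new.sum()            # renormalize to avoid numerical overflow
    return total_log_growth / steps

for m in (0.0, 0.01, 0.05, 0.2):
    print(f"migration {m:4.2f}: long-run growth ~ {long_run_growth(m):.4f}")
```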

Keywords: migration, variable-environments, random, dispersal, fluctuations, habitat-quality

Procedia PDF Downloads 124
1830 Numerical Simulation of Flexural Strength of Steel Fiber Reinforced High Volume Fly Ash Concrete by Finite Element Analysis

Authors: Mahzabin Afroz, Indubhushan Patnaikuni, Srikanth Venkatesan

Abstract:

It is well known that fly ash can be used in high volume as a partial replacement for cement, with beneficial effects on concrete. High volume fly ash (HVFA) concrete is currently emerging as a popular option for strengthening with fibers. Although studies have supported the use of fibers with fly ash, a unified model, along with its incorporation into a finite element software package to estimate maximum flexural loads, needs to be developed. In this study, nonlinear finite element analysis of steel fiber reinforced high strength HVFA concrete beams under static loading was conducted to investigate their failure modes in terms of ultimate load. First, an experimental investigation of the mechanical properties of high strength HVFA concrete was carried out and used to validate the numerical model developed in ANSYS 16.2, with appropriate modeling of element size and mesh. To model the fibers within the concrete, a three-dimensional random fiber distribution was simulated using a spherical coordinate system. Three types of high strength HVFA concrete beams, reinforced with 0.5, 1 and 1.5% volume fractions of steel fibers with specific mechanical and physical properties, were analyzed. The results reveal that the nonlinear finite element analysis technique with three-dimensional random fiber orientation exhibits fairly good agreement with the experimental results for flexural strength, load deflection and crack propagation mechanism. By utilizing this improved model, it is possible to determine the flexural behavior of different types and proportions of steel fiber reinforced HVFA concrete beams under static load. This paper therefore has the originality of predicting the flexural properties of steel fiber reinforced high strength HVFA concrete by numerical simulation.
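
The random three-dimensional fiber orientation mentioned above can be sketched generically as follows: uniformly distributed orientations drawn via spherical coordinates. The fiber count, length and beam dimensions are invented, and this is not the authors' ANSYS routine.

```python
import numpy as np

rng = np.random.default_rng(7)

def random_fiber_directions(n_fibers):
    """Uniformly distributed 3D unit vectors generated from spherical coordinates.
    cos(theta) is drawn uniformly so that orientations are not biased toward the poles."""
    phi = rng.uniform(0.0, 2.0 * np.pi, n_fibers)        # azimuth
    cos_theta = rng.uniform(-1.0, 1.0, n_fibers)          # polar angle via its cosine
    sin_theta = np.sqrt(1.0 - cos_theta**2)
    return np.column_stack((sin_theta * np.cos(phi),
                            sin_theta * np.sin(phi),
                            cos_theta))

# Hypothetical fibers: random midpoints inside a 100 x 100 x 400 mm beam, 30 mm long.
n, length = 500, 30.0
midpoints = rng.uniform([0, 0, 0], [100, 100, 400], size=(n, 3))
directions = random_fiber_directions(n)
start = midpoints - 0.5 * length * directions
end = midpoints + 0.5 * length * directions
print("first fiber endpoints:", np.round(start[0], 1), np.round(end[0], 1))
```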

Keywords: finite element analysis, high volume fly ash, steel fibers, spherical coordinate system

Procedia PDF Downloads 121
1829 Saliency Detection Using a Background Probability Model

Authors: Junling Li, Fang Meng, Yichun Zhang

Abstract:

Image saliency detection has long been studied, but several challenging problems remain unsolved, such as inaccurate saliency detection in complex scenes or the suppression of salient objects near the image borders. In this paper, we propose a new saliency detection algorithm to solve these problems. We represent the image as a graph with superpixels as nodes. By considering the appearance similarity between the boundary and the background, the proposed method chooses non-salient boundary nodes as background priors to construct the background probability model. The probability that each node belongs to the model is computed, which measures its similarity with the background. We can thus calculate saliency using the transformed probability as a metric. We compare our algorithm with ten state-of-the-art saliency detection methods on a public database. Experimental results show that our simple and effective approach can address the challenging problems that have long hindered image saliency detection.

Keywords: visual saliency, background probability, boundary knowledge, background priors

Procedia PDF Downloads 413
1828 A Column Generation Based Algorithm for Airline Cabin Crew Rostering Problem

Authors: Nan Xu

Abstract:

In airlines, the crew scheduling problem is usually decomposed into two stages: crew pairing and crew rostering. In the crew pairing stage, pairings are generated such that each flight is covered by exactly one pairing and the overall cost is minimized. In the crew rostering stage, the pairings generated in the crew pairing stage are combined with off days, training and other breaks to create individual work schedules. This paper focuses on the cabin crew rostering problem, which is challenging due to its extremely large size and the complex working rules involved. In our approach, the objective of rostering consists of two major components. The first is to minimize the number of unassigned pairings and the second is to ensure fairness to crew members. There are two measures of fairness to crew members: the number of overnight duties and the total fly-hours over a given period. Pairings should be assigned to each crew member so that their actual overnight duties and fly-hours are as close to the expected average as possible. Deviations from the expected average are penalized in the objective function. Since several small deviations are preferred to one large deviation, the penalization is quadratic. Our model of the airline crew rostering problem is based on column generation. The problem is decomposed into a master problem and subproblems. The master problem is modeled as a set-partitioning problem in which exactly one roster is picked for each crew member such that the pairings are covered. The restricted linear master problem (RLMP) is considered. The current subproblem tries to find columns with negative reduced costs and add them to the RLMP for the next iteration. When no column with negative reduced cost can be found or a stopping criterion is met, the procedure ends. The subproblem is to generate feasible crew rosters for each crew member. A separate acyclic weighted graph is constructed for each crew member, and the subproblem is modeled as a resource-constrained shortest path problem in this graph. A labeling algorithm is used to solve it. Since the penalization is quadratic, a method to deal with the resulting non-additive shortest path problem using the labeling algorithm is proposed, and the corresponding domination condition is defined. The major contributions of our model are: 1) we propose a method to deal with the non-additive shortest path problem; 2) our algorithm allows some soft rules to be relaxed, which can improve the coverage rate; 3) multi-threading techniques are used to improve the efficiency of the algorithm when generating Lines of Work for crew members. In summary, a column generation based algorithm for the airline cabin crew rostering problem is proposed. The objective is to assign a personalized roster to each crew member that minimizes the number of unassigned pairings and ensures fairness to crew members. The algorithm we propose in this paper has been put into production at a major airline in China, and numerical experiments show that it has good performance.

Keywords: aircrew rostering, aircrew scheduling, column generation, SPPRC

Procedia PDF Downloads 135
1827 Investigating the Neural Heterogeneity of Developmental Dyscalculia

Authors: Fengjuan Wang, Azilawati Jamaludin

Abstract:

Developmental Dyscalculia (DD) is defined as a specific learning difficulty with persistent challenges in learning requisite math skills that cannot be explained by intellectual disability or educational deprivation. Recent studies have increasingly recognized that DD is a heterogeneous, rather than monolithic, learning disorder involving not only cognitive and behavioral deficits but also neural dysfunction. In recent years, neuroimaging studies have employed group comparisons to explore the neural underpinnings of DD, which contradicts the heterogeneous nature of DD and may obscure critical individual differences. This research aimed to investigate the neural heterogeneity of DD using case studies with functional near-infrared spectroscopy (fNIRS). A total of 54 children aged 6-7 years participated in this study, which comprised two comprehensive cognitive assessments, an 8-minute resting state, and an 8-minute one-digit addition task. Nine children met the criteria for DD, scoring at or below 85 (i.e., the 16th percentile) on the Mathematics or Math Fluency subtest of the Wechsler Individual Achievement Test, Third Edition (WIAT-III) (both subtest scores were 90 or below). The remaining 45 children formed the typically developing (TD) group. Resting-state data and brain activation in the inferior frontal gyrus (IFG), superior frontal gyrus (SFG), and intraparietal sulcus (IPS) were collected for comparison between each case and the TD group. Graph theory was used to analyze the brain network under the resting state. This theory represents the brain network as a set of nodes (brain regions) and edges (pairwise interactions across areas) to reveal the architectural organization of the nervous network. Next, a single-case methodology developed by Crawford et al. in 2010 was used to compare each case's brain network indicators and brain activation against the 45 TD children's average data. Results showed that three of the nine DD children displayed significant deviations from the TD children's brain indicators. Case 1 had inefficient nodal network properties. Case 2 showed inefficient brain network properties and weaker activation in the IFG and IPS areas. Case 3 displayed inefficient brain network properties with no differences in activation patterns. In sum, the present study was able to distill differences in the architectural organization and brain activation of DD vis-à-vis TD children using fNIRS and single-case methodology. Although DD is regarded as a heterogeneous learning difficulty, it is noted that all three cases showed lower nodal efficiency in the brain network, which may be one of the neural sources of DD. Importantly, although the current 'brain norm' established for the 45 children is tentative, the results from this study provide insights not only for future work on a 'developmental brain norm' with reliable brain indicators but also into the viability of single-case methodology, which could be used to detect differential brain indicators of DD children for early detection and intervention.
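
A common statistic behind this kind of case-versus-controls comparison is the Crawford-Howell t-test (a single case score tested against a small control sample); the sketch below implements that generic test on invented numbers and is offered only as an illustration of the family of methods the paper cites (Crawford et al., 2010, extend this approach), not the study's actual computation.

```python
import numpy as np
from scipy import stats

def crawford_howell_t(case_score, control_scores):
    """Crawford-Howell test of a single case against a small control sample.
    Returns the t statistic and the two-tailed p value (df = n - 1)."""
    controls = np.asarray(control_scores, dtype=float)
    n = controls.size
    t = (case_score - controls.mean()) / (controls.std(ddof=1) * np.sqrt(1 + 1 / n))
    p = 2 * stats.t.sf(abs(t), df=n - 1)
    return t, p

# Invented example: one child's network-efficiency score vs. 45 TD children.
rng = np.random.default_rng(3)
td_efficiency = rng.normal(0.60, 0.05, size=45)    # hypothetical control values
case_efficiency = 0.45                              # hypothetical DD case value
t, p = crawford_howell_t(case_efficiency, td_efficiency)
print(f"t = {t:.2f}, p = {p:.4f}")
```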

Keywords: brain activation, brain network, case study, developmental dyscalculia, functional near-infrared spectroscopy, graph theory, neural heterogeneity

Procedia PDF Downloads 39
1826 Computational Chemical-Composition of Carbohydrates in the Context of Healthcare Informatics

Authors: S. Chandrasekaran, S. Nandita, M. Shivathmika, Srikrishnan Shivakumar

Abstract:

The objective of this research is to analyze the computational chemical composition of carbohydrates in the context of healthcare informatics. The computation involves representing the complex molecular structure of carbohydrates using graph theory and a deployable Chemical Markup Language (CML). Parallel molecular structures of the molecules, with or without adulterants added for the sake of business profit, can be analyzed in terms of robustness and derivatization measures. Rural healthcare programs should create awareness of malnutrition to reduce the ill effects of decomposition and help consumers know the level of such energy-storage mixtures in a quantitative way. Earlier works were based on empirical wet-lab data, which can vary from time to time and whose mining results cannot be reused. The present work is carried out on the quantitative computational chemistry of carbohydrates to support a safe and secure right-to-food act and its regulations.

Keywords: carbohydrates, chemical-composition, chemical markup, robustness, food safety

Procedia PDF Downloads 361
1825 A Landscape of Research Data Repositories in Re3data.org Registry: A Case Study of Indian Repositories

Authors: Prashant Shrivastava

Abstract:

The purpose of this study is to explore the re3data.org registry to identify the registration workflow for research data repositories. A further objective is to depict a graph of the present development of research data repositories in India. The study first takes an approach to understanding the re3data.org registry framework and schema design, and then proceeds to explore the status of Indian research data repositories in the re3data.org registry. Research data repositories are gaining wider relevance due to e-research concepts. The now-available registry re3data.org is a good tool for users and researchers to identify appropriate research data repositories as per their research requirements. In the Indian environment, a compatible National Research Data Policy is the need of the hour to boost the management of research data. A registry of research data repositories is a crucial tool for discovering specific information in a specific domain. Moreover, research data repositories in India have not been studied. Both the re3data.org registry and the status of Indian research data repositories are discussed in this study.

Keywords: research data, research data repositories, research data registry, re3data.org

Procedia PDF Downloads 308
1824 An Exhaustive All-Subsets Examination of Trade Theory on WTO Data

Authors: Masoud Charkhabi

Abstract:

We examine trade theory with this motivation. The full set of World Trade Organization data is organized into country-year pairs, each treated as a different entity. Topological Data Analysis reveals that among the 16 regions and 240 region-year pairs there exists in fact a distinguishable group of region-period pairs. The generally accepted periods of shifts from dissimilar-dissimilar to similar-similar trade in goods among regions are examined from this new perspective. The period breaks are treated as cumulative and are flexible. This type of all-subsets analysis is motivated by computer science and is made possible with lossy compression and graph theory. The results question many patterns in similar-similar to dissimilar-dissimilar trade. They also show indications of economic shifts that only later become evident in other economic metrics.

Keywords: econometrics, globalization, network science, topological data analysis, trade theory, visualization, world trade

Procedia PDF Downloads 355
1823 A Topological Study of an Urban Street Network and Its Use in Heritage Areas

Authors: Jose L. Oliver, Taras Agryzkov, Leandro Tortosa, Jose F. Vicent, Javier Santacruz

Abstract:

This paper aims to demonstrate how a topological study of an urban street network can be used as a tool to be applied to heritage conservation areas in a city. In recent decades, different kinds of approaches in the disciplines of architecture and urbanism have been based on the so-called sciences of complexity. In this context, this paper uses mathematics from network theory. Hence, it proposes a methodology based on obtaining information from a graph created from a network of urban streets. An algorithm is then used that establishes a ranking of the importance of the nodes of that network from a topological point of view. The results are applied to a heritage area in a particular city, confronting the data obtained from the mathematical model with those from field work in the case study. As a result of this process, we may conclude on the necessity of implementing certain actions in the area, and on where those actions would be most effective for the whole heritage site.

Keywords: graphs, heritage cities, spatial analysis, urban networks

Procedia PDF Downloads 375
1822 Security of Database Using Chaotic Systems

Authors: Eman W. Boghdady, A. R. Shehata, M. A. Azem

Abstract:

Database (DB) security demands permitting the actions of authorized users and prohibiting those of non-authorized users and intruders on the DB and the objects inside it. Organizations that run successfully demand the confidentiality of their DBs. They do not allow unauthorized access to their data/information, and they demand assurance that their data is protected against any malicious or accidental modification. DB protection and confidentiality are the security concerns. There are four types of controls for obtaining DB protection: access control, information flow control, inference control, and cryptographic control. Cryptographic control is considered the backbone of DB security; it secures the DB by encryption during storage and communication. Current cryptographic techniques are classified into two types: traditional classical cryptography using standard algorithms (DES, AES, IDEA, etc.) and chaos cryptography using continuous (Chua, Rössler, Lorenz, etc.) or discrete (logistic, Hénon, etc.) systems. The most important characteristic of chaos is its extreme sensitivity to the initial conditions of the system. In this paper, DB-security systems based on chaotic algorithms are described. Pseudo-Random Number Generators (PRNGs) based on the different chaotic algorithms are implemented using Matlab, and their statistical properties are evaluated using the NIST and other statistical test suites. These algorithms are then used to secure a conventional DB (plaintext), and the statistical properties of the ciphertext are also tested. To increase the complexity of the PRNGs and to pass all the NIST statistical tests, we propose two hybrid PRNGs: one based on two chaotic logistic maps and another based on two chaotic Hénon maps, where each chaotic map runs side by side, starting from random, independent initial conditions and parameters (the encryption keys). The resulting hybrid PRNGs passed the NIST statistical test suite.
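
To make the chaotic-PRNG idea concrete, the hedged sketch below runs two logistic maps side by side, extracts bytes from each and XOR-combines them into a keystream for a toy stream cipher. The parameter values and the byte-extraction rule are illustrative choices, not the scheme evaluated in the paper, and the paper's Matlab implementation is replaced here by Python for consistency with the other examples in this listing.

```python
def logistic_stream(x0, r=3.99):
    """Generator of chaotic logistic-map values x_{n+1} = r * x_n * (1 - x_n), 0 < x < 1."""
    x = x0
    while True:
        x = r * x * (1.0 - x)
        yield x

def hybrid_bytes(x0_a, x0_b, n_bytes):
    """XOR-combine bytes extracted from two logistic maps running side by side.
    Byte extraction (scaling the state to 0-255) is an illustrative choice."""
    a, b = logistic_stream(x0_a), logistic_stream(x0_b)
    out = bytearray()
    for _ in range(n_bytes):
        byte_a = int(next(a) * 256) & 0xFF
        byte_b = int(next(b) * 256) & 0xFF
        out.append(byte_a ^ byte_b)
    return bytes(out)

# The keys are the independent initial conditions; the keystream is XORed with the plaintext.
keystream = hybrid_bytes(0.3141592, 0.2718281, 16)
plaintext = b"record #42: data"
ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))
recovered = bytes(c ^ k for c, k in zip(ciphertext, keystream))
print(ciphertext.hex(), recovered)
```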

Keywords: algorithms and data structure, DB security, encryption, chaotic algorithms, Matlab, NIST

Procedia PDF Downloads 254
1821 Improving Search Engine Performance by Removing Indexes to Malicious URLs

Authors: Durga Toshniwal, Lokesh Agrawal

Abstract:

As the web continues to play an increasing role in information exchange and the conduct of daily activities, computer users have become the target of miscreants who infect hosts with malware or adware for financial gain. Unfortunately, even a single visit to a compromised web site enables the attacker to detect vulnerabilities in the user’s applications and force the download of a multitude of malware binaries. We provide an approach to effectively scan for so-called drive-by downloads on the Internet. Drive-by downloads are the result of URLs that attempt to exploit their visitors and cause malware to be installed and run automatically. To scan the web for malicious pages, the first step is to use a crawler to collect URLs that live on the Internet, and then to apply fast prefiltering techniques to reduce the number of pages that need to be examined by precise, but slower, analysis tools (such as honeyclients or antivirus programs). Although this technique is effective, it requires a substantial amount of resources. A main reason is that the crawler encounters many pages on the web that are legitimate and need to be filtered out. In this paper, to characterize the nature of this rising threat, we present an implementation of a web crawler in Python and an approach to search the web more efficiently for pages that are likely to be malicious, filtering out benign pages and passing the remaining pages to an antivirus program for malware detection. Our approach starts from an initial seed of known malicious web pages. Using these seeds, our system generates search engine queries to identify other malicious pages that are similar to the ones in the initial seed. By doing so, it leverages the crawling infrastructure of search engines to retrieve URLs that are much more likely to be malicious than a random page on the web. The results show that this guided approach is able to identify malicious web pages more efficiently than random crawling-based approaches.
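
A minimal crawl-and-prefilter sketch in the same spirit (not the authors' system) is shown below; the seed URL and the keyword-based prefilter heuristic are invented placeholders, and the "flagged" pages would in practice be handed to an antivirus engine or honeyclient rather than merely printed.

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

# Hypothetical seed; a real system would start from known-malicious URLs.
SEEDS = ["https://example.com/"]
SUSPICIOUS_HINTS = (".exe", "download", "crack", "keygen")   # toy prefilter heuristic

def suspicious(url):
    """Cheap prefilter: pass only URLs worth sending to slower analysis tools."""
    return any(hint in url.lower() for hint in SUSPICIOUS_HINTS)

def crawl(seeds, max_pages=50):
    queue, seen, flagged = deque(seeds), set(seeds), []
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=5)
        except requests.RequestException:
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        for tag in soup.find_all("a", href=True):
            link = urljoin(url, tag["href"])
            if urlparse(link).scheme not in ("http", "https") or link in seen:
                continue
            seen.add(link)
            if suspicious(link):
                flagged.append(link)     # would be handed to an antivirus / honeyclient
            else:
                queue.append(link)       # benign-looking pages are crawled further
    return flagged

print(crawl(SEEDS))
```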

Keywords: web crawler, malwares, seeds, drive-by-downloads, security

Procedia PDF Downloads 217
1820 Conjugal Relationship and Reproductive Decision-Making among Couples in Southwest Nigeria

Authors: Peter Olasupo Ogunjuyigbe, Sarafa Shittu

Abstract:

This paper emphasizes the relevance of conjugal relationships and spousal communication in enhancing men’s involvement in contraceptive use among the Yorubas of South Western Nigeria. An understanding of men's influence and the role they play in reproductive decision-making can throw better light on the mechanisms through which the egalitarianism of husband-wife decision-making influences contraceptive use. The objective of this study was to investigate how close conjugal relationships can be a good indicator of joint decision-making among couples, using data derived from a survey conducted in three states of South Western Nigeria. The study sample consisted of five hundred and twenty-one (521) male respondents aged 15-59 years and five hundred and forty-seven (547) female respondents aged 15-49 years. The study used both quantitative and qualitative approaches to elicit information from the respondents. In order that the study would be truly representative of the towns, each of the study locations in the capital cities was divided into four strata: the traditional area, the migrant area, the mixed area (i.e., traditional and migrant), and the elite area. In the rural areas, selection of the respondents was by simple random sampling. However, the random selection was made in such a way that all the different parts of the locations were represented. Generally, the data collected were analyzed at the univariate, bivariate, and multivariate levels. Logistic regression models were employed to examine the interrelationships between male reproductive behaviour, conjugal relationships and contraceptive use. The study indicates that current contraceptive use is high among this major ethnic group in Nigeria because of the improved level of communication among couples. The problem, however, is that men still have a lower exposure rate when it comes to family planning information, education and counseling. This has serious implications for fertility regulation in Nigeria.

Keywords: behavior, conjugal, communication, counseling, spouse

Procedia PDF Downloads 123
1819 Jordan Curves in the Digital Plane with Respect to the Connectednesses given by Certain Adjacency Graphs

Authors: Josef Slapal

Abstract:

Digital images are approximations of real ones and, therefore, to be able to study them, we need the digital plane Z2 to be equipped with a convenient structure that behaves analogously to the Euclidean topology on the real plane. In particular, such a structure is required to allow for a digital analogue of the Jordan curve theorem. We introduce certain adjacency graphs on the digital plane and prove a digital Jordan curve theorem for them, thus showing that the graphs provide convenient structures on Z2 for the study and processing of digital images. Further convenient structures, including the well-known Khalimsky and Marcus-Wyse adjacency graphs, may be obtained as quotients of the graphs introduced. Since digital Jordan curves represent borders of objects in digital images, the adjacency graphs discussed may be used as background structures on the digital plane for solving problems of digital image processing that are closely related to borders, such as border detection, contour filling, pattern recognition and thinning.

Keywords: digital plane, adjacency graph, Jordan curve, quotient adjacency

Procedia PDF Downloads 359
1818 The Effect of H2S on Crystal Structure

Authors: C. Venkataraman B. E., J. Nagarajan B. E., V. Srinivasan M. Tech

Abstract:

For a better understanding of sulfide stress corrosion cracking, a theoretical approach based on crystal structure, molecular behavior, the flow of electrons and electrochemical reactions is developed. Its impact on different materials, such as carbon steel, low-alloy steel and alloys for sour (H2S) environments, is studied. This paper describes theories on various disasters and failures that have occurred in industry due to stress corrosion cracking (SCC). Parameters such as the pH of the process fluid, the partial pressures of CO2, O2 and chlorine, the effect of internal pressure (crystal structure deformation by stress), and external environmental conditions are considered. An analytical line graph is then created for the process fluid parameters versus time, temperature, and induced/residual stress due to local pressure build-up. By comparison with NACE and ASTM load test results, it is possible to predict and simplify the control of SCC through the use of materials such as ferritic and austenitic alloys in the oil and gas and petroleum industries.

Keywords: crystal structure deformation, failure assessment, alloy-environment combination, H2S

Procedia PDF Downloads 389
1817 A Relationship Extraction Method from Literary Fiction Considering Korean Linguistic Features

Authors: Hee-Jeong Ahn, Kee-Won Kim, Seung-Hoon Kim

Abstract:

Knowledge of the relationships between characters can help readers to understand the overall story or plot of a literary fiction. In this paper, we present a method for extracting specific relationships between characters from Korean literary fiction. Generally, methods for extracting relationships between characters in text are statistical or computational methods based on the sentence distance between characters, without considering Korean linguistic features. Furthermore, it is difficult for them to extract directed relationships, such as one-sided love, because they consider only the weight of a relationship and not its direction. Therefore, in order to identify specific relationships between characters, we propose a statistical method considering linguistic features, such as syntactic patterns and speech verbs in Korean. The result of our method is represented by a weighted directed graph of the relationships between the characters. Furthermore, we expect that the proposed method could be applied to relationship analysis between characters in other content, such as movies or TV dramas.

Keywords: data mining, Korean linguistic feature, literary fiction, relationship extraction

Procedia PDF Downloads 361
1816 Machine Learning Assisted Selective Emitter Design for Solar Thermophotovoltaic System

Authors: Ambali Alade Odebowale, Andargachew Mekonnen Berhe, Haroldo T. Hattori, Andrey E. Miroshnichenko

Abstract:

Solar thermophotovoltaic systems (STPV) have emerged as a promising solution to overcome the Shockley-Queisser limit, a significant impediment in the direct conversion of solar radiation into electricity using conventional solar cells. The STPV system comprises essential components such as an optical concentrator, selective emitter, and a thermophotovoltaic (TPV) cell. The pivotal element in achieving high efficiency in an STPV system lies in the design of a spectrally selective emitter or absorber. Traditional methods for designing and optimizing selective emitters are often time-consuming and may not yield highly selective emitters, posing a challenge to the overall system performance. In recent years, the application of machine learning techniques in various scientific disciplines has demonstrated significant advantages. This paper proposes a novel nanostructure composed of four-layered materials (SiC/W/SiO2/W) to function as a selective emitter in the energy conversion process of an STPV system. Unlike conventional approaches widely adopted by researchers, this study employs a machine learning-based approach for the design and optimization of the selective emitter. Specifically, a random forest algorithm (RFA) is employed for the design of the selective emitter, while the optimization process is executed using genetic algorithms. This innovative methodology holds promise in addressing the challenges posed by traditional methods, offering a more efficient and streamlined approach to selective emitter design. The utilization of a machine learning approach brings several advantages to the design and optimization of a selective emitter within the STPV system. Machine learning algorithms, such as the random forest algorithm, have the capability to analyze complex datasets and identify intricate patterns that may not be apparent through traditional methods. This allows for a more comprehensive exploration of the design space, potentially leading to highly efficient emitter configurations. Moreover, the application of genetic algorithms in the optimization process enhances the adaptability and efficiency of the overall system. Genetic algorithms mimic the principles of natural selection, enabling the exploration of a diverse range of emitter configurations and facilitating the identification of optimal solutions. This not only accelerates the design and optimization process but also increases the likelihood of discovering configurations that exhibit superior performance compared to traditional methods. In conclusion, the integration of machine learning techniques in the design and optimization of a selective emitter for solar thermophotovoltaic systems represents a groundbreaking approach. This innovative methodology not only addresses the limitations of traditional methods but also holds the potential to significantly improve the overall performance of STPV systems, paving the way for enhanced solar energy conversion efficiency.
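
A hedged sketch of the surrogate-plus-genetic-algorithm pattern described above is given below. The training data are synthetic (a made-up response standing in for electromagnetic simulations of the SiC/W/SiO2/W stack), the layer-thickness ranges and GA operators are invented, and the snippet is an illustration of the general workflow rather than the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical training data: layer thicknesses (nm) of a 4-layer stack -> spectral
# selectivity score. The "true" response below is a synthetic stand-in for the full
# electromagnetic simulations a real workflow would use.
def synthetic_selectivity(thicknesses):
    target = np.array([100.0, 20.0, 60.0, 150.0])        # invented optimum
    return np.exp(-np.sum(((thicknesses - target) / 80.0) ** 2, axis=-1))

X_train = rng.uniform(10.0, 300.0, size=(500, 4))
y_train = synthetic_selectivity(X_train)

surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Bare-bones genetic algorithm over the surrogate: truncation selection and
# Gaussian mutation only (no crossover), purely for illustration.
pop = rng.uniform(10.0, 300.0, size=(60, 4))
for generation in range(40):
    fitness = surrogate.predict(pop)
    parents = pop[np.argsort(fitness)[-20:]]                       # keep the best 20
    children = parents[rng.integers(0, 20, size=40)] + rng.normal(0, 10.0, size=(40, 4))
    pop = np.clip(np.vstack([parents, children]), 10.0, 300.0)

best = pop[np.argmax(surrogate.predict(pop))]
print("best thicknesses (nm):", np.round(best, 1))
```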

Keywords: emitter, genetic algorithm, radiation, random forest, thermophotovoltaic

Procedia PDF Downloads 40
1815 Dislocation Density-Based Modeling of the Grain Refinement in Surface Mechanical Attrition Treatment

Authors: Reza Miresmaeili, Asghar Heydari Astaraee, Fereshteh Dolati

Abstract:

In the present study, an analytical model based on a dislocation density model was developed to simulate grain refinement in surface mechanical attrition treatment (SMAT). The correlation between SMAT time and the development of plastic strain, on the one hand, and dislocation density evolution, on the other, was established to simulate the grain refinement in SMAT. A dislocation density-based constitutive material law was implemented using a VUHARD subroutine. A random sequence of shots is taken into consideration in the multiple-impact model, implemented in the Python programming language using a random function. The simulation technique was to model each impact in a separate run and then transfer the results of each run as initial conditions for the next run (impact). The developed finite element (FE) model of multiple impacts describes the coverage evolution in SMAT. Simulations were run to coverage levels as high as 4500%. It is shown that the coverage implemented in the FE model is equal to the experimental coverage. The numerical SMAT coverage parameter is also shown to conform adequately to the well-known Avrami model. Comparison between numerical results and experimental measurements of residual stresses and the depth of the deformation layers confirms the performance of the established FE model for surface engineering evaluations in SMA treatment. X-ray diffraction (XRD) studies of grain refinement, including the resultant grain size and dislocation density, were conducted to validate the established model. The full width at half-maximum of XRD profiles can be used to measure the grain size. Numerical results and experimental measurements of grain refinement illustrate good agreement and show the capability of the established FE model to predict the gradient microstructure in SMA treatment.
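
The coverage-versus-impacts relation mentioned above can be illustrated with a hedged Monte Carlo toy: random circular indents on a square patch, with the resulting covered fraction compared against an Avrami-type law C(n) = 1 - exp(-A·n). The grid size, indent radius and impact count are invented, and this is not the authors' FE procedure.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy surface: a 200 x 200 grid of cells representing a square patch of the sample.
# Each random shot marks a circular indent of radius r_cells; coverage = marked fraction.
size, r_cells, n_impacts = 200, 8, 2000
yy, xx = np.mgrid[0:size, 0:size]
covered = np.zeros((size, size), dtype=bool)

coverage = []
for n in range(1, n_impacts + 1):
    cx, cy = rng.uniform(0, size, 2)
    covered |= (xx - cx) ** 2 + (yy - cy) ** 2 <= r_cells ** 2
    coverage.append(covered.mean())

coverage = np.array(coverage)
impacts = np.arange(1, n_impacts + 1)

# Avrami-type fit C(n) = 1 - exp(-A * n); A estimated by linear regression on -ln(1 - C).
mask = coverage < 0.999                     # avoid log(0) once the surface saturates
A = np.polyfit(impacts[mask], -np.log(1.0 - coverage[mask]), 1)[0]
print(f"fitted Avrami rate A ~ {A:.2e}")
print("true covered fraction after all impacts:", round(coverage[-1] * 100, 1), "%")
```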

Keywords: dislocation density, grain refinement, severe plastic deformation, simulation, surface mechanical attrition treatment

Procedia PDF Downloads 122
1814 Evolving Knowledge Extraction from Online Resources

Authors: Zhibo Xiao, Tharini Nayanika de Silva, Kezhi Mao

Abstract:

In this paper, we present an evolving knowledge extraction system named AKEOS (Automatic Knowledge Extraction from Online Sources). AKEOS consists of two modules: a one-time learning module and an evolving learning module. The one-time learning module takes in a user input query and automatically harvests knowledge from online unstructured resources in an unsupervised way. The output of the one-time learning is a structured vector representing the harvested knowledge. The evolving learning module automatically schedules and performs repeated one-time learning to extract the newest information and track the development of an event. In addition, the evolving learning module summarizes the knowledge learned at different time points to produce a final knowledge vector about the event. With evolving learning, we are able to visualize the key information of an event, discover trends, and track its development.

Keywords: evolving learning, knowledge extraction, knowledge graph, text mining

Procedia PDF Downloads 445