Search results for: multilevel graph
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 589

409 Top-K Shortest Distance as a Similarity Measure

Authors: Andrey Lebedev, Ilya Dmitrenok, JooYoung Lee, Leonard Johard

Abstract:

The top-k shortest path routing problem is an extension of the problem of finding the shortest path in a given network. The shortest path is one of the most essential measures, as it reveals the relation between two nodes in a network. However, in many real-world networks, whose diameters are small, the top-k shortest paths are more interesting, as they carry more information about the network topology. Many variations for computing top-k shortest paths have been studied. In this paper, we apply an efficient top-k shortest distance routing algorithm to the link prediction problem and test its efficacy. We compare the results with baseline and state-of-the-art methods, as well as with the plain shortest path. We also propose a top-k-distance-based graph matching algorithm.
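As a rough illustration of the measure applied here, the following Python sketch computes the k shortest source-to-target distances on a weighted directed graph using a simple heap-based search that lets each node be settled up to k times (so repeated nodes are allowed); the toy graph, weights, and k are assumptions, not the algorithm or data evaluated in the paper.

```python
import heapq
from collections import defaultdict

def top_k_shortest_distances(graph, source, target, k=3):
    """Return up to k shortest walk lengths from source to target.

    graph: dict mapping node -> list of (neighbor, weight) pairs.
    Nodes may be revisited, so the result lists the k shortest
    walk distances rather than strictly simple-path distances.
    """
    pop_count = defaultdict(int)   # how many times each node was settled
    distances = []                 # collected source-to-target distances
    heap = [(0.0, source)]
    while heap and len(distances) < k:
        dist, node = heapq.heappop(heap)
        if pop_count[node] >= k:
            continue
        pop_count[node] += 1
        if node == target:
            distances.append(dist)
        for neighbor, weight in graph.get(node, []):
            heapq.heappush(heap, (dist + weight, neighbor))
    return distances

# Toy example: the three shortest distances between nodes 'a' and 'd'.
g = {
    'a': [('b', 1.0), ('c', 2.5)],
    'b': [('c', 1.0), ('d', 4.0)],
    'c': [('d', 1.0)],
}
print(top_k_shortest_distances(g, 'a', 'd', k=3))   # [3.0, 3.5, 5.0]
```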

Keywords: graph matching, link prediction, shortest path, similarity

Procedia PDF Downloads 334
408 Classification of Equations of Motion

Authors: Amritpal Singh Nafria, Rohit Sharma, Md. Shami Ansari

Abstract:

Up to now, only five different equations of motion can be derived from the velocity-time graph without needing to know the normal and frictional forces acting at the point of contact. In this paper, we obtain all the requisite conditions for considering an equation to be an equation of motion. We then classify the equations of motion by treating two of them as fundamental kinematical equations of motion and the other three as additional kinematical equations of motion. After deriving these five equations, we examine the easiest way of solving a wide variety of useful numerical problems. At the end of the paper, we discuss the importance and educational benefits of this classification.
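For reference, the five standard constant-acceleration equations that can be read off a velocity-time graph (with initial velocity u, final velocity v, acceleration a, time t, and displacement s) are:

```latex
v = u + at, \qquad
s = ut + \tfrac{1}{2}at^{2}, \qquad
v^{2} = u^{2} + 2as, \qquad
s = \tfrac{1}{2}(u + v)t, \qquad
s = vt - \tfrac{1}{2}at^{2}
```

Which of these are treated as fundamental and which as additional is the classification proposed by the authors and is not reproduced here.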

Keywords: velocity-time graph, fundamental equations, additional equations, requisite conditions, importance and educational benefits

Procedia PDF Downloads 758
407 Reinventing Urban Governance: Sustainable Transport Solutions for Mitigating Climate Risks in Smart Cities

Authors: Jaqueline Nichi, Leila Da Costa Ferreira, Fabiana Barbi Seleguim, Gabriela Marques Di Giulio, Mariana Barbieri

Abstract:

The transport sector is responsible for approximately 55% of global greenhouse gas (GHG) emissions, in addition to pollution and other negative externalities, such as road accidents and congestion, that affect the routine of those who live in large cities. The objective of this article is to discuss the application and use of distinct mobility technologies as climate adaptation and mitigation measures in the context of smart cities in the Global South. A documentary analysis is combined with 22 semi-structured interviews with managers who work with mobility technologies in the public and private sectors and in civil society organizations, in order to explore multilevel-governance solutions for smart, low-carbon mobility based on a case study of the city of São Paulo, Brazil. The hypothesis that innovation and technology for mitigating and adapting to climate impacts are not yet sufficient to make mobility more sustainable was confirmed. The results indicate four aspects that are relevant for advancing a climate agenda in smart cities: integrated planning, co-production of knowledge, experiments in governance, and new means of financing to guarantee the sustainable sociotechnical transition of the sector.

Keywords: urban mobility, climate change, smart cities, multilevel governance

Procedia PDF Downloads 24
406 An Efficient Hardware/Software Workflow for Multi-Cores Simulink Applications

Authors: Asma Rebaya, Kaouther Gasmi, Imen Amari, Salem Hasnaoui

Abstract:

Over the last few years, applications such as telecommunications, signal processing, and digital communication with advanced features (multi-antenna, equalization, etc.) have witnessed a rapid evolution accompanied by an increase in user requirements in terms of latency, computational power, and so on. To satisfy these requirements, the use of hardware/software systems is a common solution, in which the hardware is composed of multiple cores and the software is represented by a model of computation, for instance a synchronous data flow (SDF) graph. Moreover, most embedded system designers use Simulink for modeling. The issue is how to simplify the generation of C code, for a multi-core platform, from an application modeled in Simulink. To overcome this problem, we propose a workflow that automatically transforms the Simulink model into an SDF graph and provides an efficient schedule that optimizes the number of cores and minimizes latency. The workflow starts from a Simulink application and a hardware architecture described in the IP-XACT language. Based on the synchronous and hierarchical behavior of both models, the Simulink block diagram is automatically transformed into an SDF graph. Once this transformation succeeds, the scheduler calculates the optimal number of cores by minimizing the maximum density of the whole application. A core is then chosen to execute each graph task in a specific order and, subsequently, compatible C code is generated. To implement this proposal, we extend Preesm, a rapid prototyping tool, to take the Simulink model as input and to support the optimal schedule. We then compare our results with those of the original tool on a simple illustrative application. The comparison shows that our results strictly dominate the Preesm results in terms of the number of cores and latency: if Preesm needs m processors and latency L, our workflow needs m' ≤ m processors and a latency L' < L.
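A central step in any SDF-based flow is solving the balance equations to obtain the repetition vector that a periodic schedule must respect. The following is a minimal, self-contained Python sketch of that step only (not the Preesm-based workflow described above); the actor names and rates are illustrative.

```python
from collections import deque
from fractions import Fraction
from math import lcm

def repetition_vector(actors, edges):
    """Solve the SDF balance equations q[src]*prod == q[dst]*cons for each edge.

    actors: list of actor names.
    edges:  list of (src, prod_rate, dst, cons_rate) tuples.
    Returns the smallest positive integer repetition vector as a dict,
    or raises ValueError if the rates are inconsistent.
    """
    adj = {a: [] for a in actors}
    for src, prod, dst, cons in edges:
        adj[src].append((dst, Fraction(prod, cons)))   # q[dst] = q[src]*prod/cons
        adj[dst].append((src, Fraction(cons, prod)))   # reverse relation
    q = {a: None for a in actors}
    for start in actors:                               # handles disconnected graphs
        if q[start] is not None:
            continue
        q[start] = Fraction(1)
        queue = deque([start])
        while queue:
            a = queue.popleft()
            for b, ratio in adj[a]:
                expected = q[a] * ratio
                if q[b] is None:
                    q[b] = expected
                    queue.append(b)
                elif q[b] != expected:
                    raise ValueError("inconsistent SDF graph: no valid schedule")
    scale = lcm(*(f.denominator for f in q.values()))
    return {a: int(f * scale) for a, f in q.items()}

# Toy three-actor chain A -(2:3)-> B -(1:2)-> C.
print(repetition_vector(["A", "B", "C"], [("A", 2, "B", 3), ("B", 1, "C", 2)]))
# {'A': 3, 'B': 2, 'C': 1}
```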

Keywords: hardware/software system, latency, modeling, multi-cores platform, scheduler, SDF graph, Simulink model, workflow

Procedia PDF Downloads 241
405 Synchrotron Radiation and Inverse Compton Scattering in Astrophysical Plasma

Authors: S. S. Sathiesh

Abstract:

The aim of this project is to study the synchrotron radiation and inverse Compton scattering mechanisms. Theoretically, we discuss the spectral energy distribution for both. A program was written in Fortran 90 to plot the power-law spectrum of synchrotron radiation. The importance of the power-law spectrum is discussed, and model fitting is used to infer its physical parameters. We also discuss how to infer the physical parameters from the theoretically drawn graph: one can infer B (the magnetic field of the source), γmin, γmax, and the spectral indices (p1, p2) while fitting the curve to the observed data.
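The plotting in the study was done in Fortran 90; purely to illustrate the kind of figure involved, the short Python sketch below draws a broken power-law spectrum on log-log axes. The break frequency and the indices p1 and p2 are placeholder values, not parameters fitted in the study.

```python
import numpy as np
import matplotlib.pyplot as plt

nu = np.logspace(9, 19, 400)        # frequency grid in Hz
nu_break = 1e14                     # hypothetical break frequency
p1, p2 = 0.7, 1.3                   # hypothetical spectral indices
flux = np.where(nu < nu_break,
                (nu / nu_break) ** -p1,
                (nu / nu_break) ** -p2)

plt.loglog(nu, flux)
plt.xlabel("frequency (Hz)")
plt.ylabel("relative flux")
plt.title("Broken power-law synchrotron spectrum (illustrative)")
plt.show()
```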

Keywords: blazars/quasars, beaming, synchrotron radiation, Synchrotron Self Compton, inverse Compton scattering, mrk421

Procedia PDF Downloads 391
404 Stock Market Prediction Using Convolutional Neural Network That Learns from a Graph

Authors: Mo-Se Lee, Cheol-Hwi Ahn, Kee-Young Kwahk, Hyunchul Ahn

Abstract:

Over the past decade, deep learning has been in the spotlight among various machine learning algorithms. In particular, CNN (Convolutional Neural Network), which is known as an effective solution for recognizing and classifying images, has been popularly applied to classification and prediction problems in various fields. In this study, we try to apply CNN to stock market prediction, one of the most challenging tasks in machine learning research. Specifically, we propose to apply CNN as a binary classifier that predicts the stock market direction (up or down) by using a graph as its input. That is, our proposal is to build a machine learning algorithm that mimics a person who looks at the graph and predicts whether the trend will go up or down. Our proposed model consists of four steps. In the first step, it divides the dataset into windows of 5, 10, 15, and 20 days. In step 2, it creates a graph for each interval. In the next step, CNN classifiers are trained using the graphs generated in the previous step. In step 4, the hyperparameters of the trained model are optimized using the validation dataset. To validate our model, we will apply it to the prediction of KOSPI200 over 1,986 days in eight years (from 2009 to 2016). The experimental dataset will include 14 technical indicators, such as CCI, momentum, and ROC, as well as the daily closing price of KOSPI200 of the Korean stock market.
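To make the binary-classifier idea concrete, here is a minimal Keras sketch of a CNN that maps a grayscale chart (graph) image to the probability that the market goes up. The input size, layer widths, and training setup are assumptions for illustration, not the architecture evaluated in the study.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Illustrative binary CNN classifier for chart images (up vs. down).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 1)),        # grayscale chart image (assumed size)
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),    # P(market goes up)
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```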

Keywords: convolutional neural network, deep learning, Korean stock market, stock market prediction

Procedia PDF Downloads 407
403 One-off Separation of Multiple Types of Oil-in-Water Emulsions with Surface-Engineered Graphene-Based Multilevel Structure Materials

Authors: Han Longxiang

Abstract:

In the process of treating industrial oily wastewater with complex components, traditional treatment methods (flotation, coagulation, microwave heating, etc.) often suffer from high operating costs, secondary pollution, and other problems. To solve these problems, materials with high flux and high stability for separating surfactant-stabilized emulsions have gained considerable attention in the treatment of oily wastewater. Nevertheless, four types of stable oil-in-water emulsion can be formed depending on the surfactant (surfactant-free, anionic, cationic, and non-ionic), and previously reported advanced materials can separate only one or a few of them rather than all of them effectively in one step. Herein, a facile synthesis method produces graphene-based multilevel filter materials (GMFM) that can efficiently separate oil-in-water emulsions stabilized by different surfactants under gravity alone. The prepared materials remain stable over 20 cycles and show a high flux of ~5000 L m⁻² h⁻¹ with a separation efficiency of >99.9%. GMFM can also effectively separate emulsions stabilized by mixed surfactants and oily wastewater from factories. The results indicate that GMFM has a wide range of applications in oil-in-water emulsion separation in industry and environmental science.

Keywords: emulsion, filtration, graphene, one-step

Procedia PDF Downloads 55
402 Managing Cognitive Load in Accounting: An Analysis of Three Instructional Designs in Financial Accounting

Authors: Seedwell Sithole

Abstract:

One of the persistent problems in accounting education is how to effectively support students' learning. A promising approach to this issue is to investigate the extent to which learning is determined by the design of instructional material. This study examines the academic performance of students using three instructional designs in financial accounting. Students' performance scores and reported mental-effort ratings were used to determine instructional effectiveness. The findings of this study show that accounting students prefer graph and text designs that are integrated. The results suggest that spatially separated graph and text presentations in accounting should be reorganized to align with the requirements of human cognitive architecture.

Keywords: accounting, cognitive load, education, instructional preferences, students

Procedia PDF Downloads 116
401 CTHTC: A Convolution-Backed Transformer Architecture for Temporal Knowledge Graph Embedding with Periodicity Recognition

Authors: Xinyuan Chen, Mohd Nizam Husen, Zhongmei Zhou, Gongde Guo, Wei Gao

Abstract:

Temporal Knowledge Graph Completion (TKGC) has attracted increasing attention for its enormous value; however, existing models lack the capability to capture both local interactions and global dependencies simultaneously along with evolutionary dynamics, while the latest achievements in convolutions and Transformers have not yet been employed in this area. Moreover, periodic patterns in TKGs have not been fully explored either. To this end, a multi-stage hybrid architecture with convolution-backed Transformers is introduced to TKGC tasks for the first time, combined with the Hawkes process to model evolving event sequences in a continuous-time domain. In addition, seasonal-trend decomposition is adopted to identify periodic patterns. Experiments on six public datasets are conducted to verify the model's effectiveness against state-of-the-art (SOTA) methods. An extensive ablation study is also carried out to evaluate architecture variants as well as the contributions of individual components, paving the way for further potential exploitation. Besides complexity analysis, input sensitivity and safety challenges are also thoroughly discussed for comprehensiveness.
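As a small illustration of the periodicity-recognition step only (not of the full model proposed above), the sketch below applies a seasonal-trend (STL) decomposition to a hypothetical daily event-count series for one relation; the synthetic data and the weekly period are assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Hypothetical daily event-count series for one relation in a temporal KG.
dates = pd.date_range("2023-01-01", periods=365, freq="D")
counts = pd.Series(
    10 + 3 * np.sin(2 * np.pi * np.arange(365) / 7)          # weekly periodicity
    + np.random.default_rng(0).normal(0, 0.5, 365),
    index=dates,
)

result = STL(counts, period=7).fit()    # split into trend / seasonal / residual
print(result.seasonal.head(14))         # the recurring weekly pattern
```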

Keywords: temporal knowledge graph completion, convolution, transformer, Hawkes process, periodicity

Procedia PDF Downloads 54
400 One-off Separation of Multiple Types of Oil-in-Water Emulsions with Surface-Engineered Graphene-Based Multilevel Structure Materials

Authors: Han Longxiang

Abstract:

In the process of treating industrial oily wastewater with complex components, traditional treatment methods (flotation, coagulation, microwave heating, etc.) often suffer from high operating costs, secondary pollution, and other problems. To solve these problems, materials with high flux and high stability for separating surfactant-stabilized emulsions have gained considerable attention in the treatment of oily wastewater. Nevertheless, four types of stable oil-in-water emulsion can be formed depending on the surfactant (surfactant-free, anionic, cationic, and non-ionic), and previously reported advanced materials can separate only one or a few of them rather than all of them effectively in one step. Herein, a facile synthesis method produces graphene-based multilevel filter materials (GMFM) that can efficiently separate oil-in-water emulsions stabilized by different surfactants under gravity alone. The prepared materials remain stable over 20 cycles and show a high flux of ~5000 L m⁻² h⁻¹ with a separation efficiency of >99.9%. GMFM can also effectively separate emulsions stabilized by mixed surfactants and oily wastewater from factories. The results indicate that GMFM has a wide range of applications in oil-in-water emulsion separation in industry and environmental science.

Keywords: emulsion, filtration, graphene, one-step

Procedia PDF Downloads 66
399 Modeling and Simulation of Underwater Flexible Manipulator as Rayleigh Beam Using Bond Graph

Authors: Sumit Kumar, Sunil Kumar, Chandan Deep Singh

Abstract:

This paper presents the modeling and simulation of a flexible robot in an underwater environment. The underwater environment contrasts completely with the ground or space environment. A robot operating underwater is subjected to various dynamic forces, such as buoyancy, hydrostatic, and hydrodynamic forces. The underwater robot is modeled as a Rayleigh beam. The developed model further allows the deflection of the tip to be estimated in two directions. The complete dynamics of the underwater robot is analyzed, which is the main focus of this investigation. The control of the robot trajectory is not discussed in this paper. Simulation is performed using Symbol Shakti software.

Keywords: bond graph modeling, dynamics modeling, Rayleigh beam, underwater robot

Procedia PDF Downloads 559
398 Reduced Switch Count Asymmetrical Multilevel Inverter Topology

Authors: Voodi Kalandhar, Veera Reddy, Yuva Tejasree

Abstract:

Researchers have become interested in multilevel inverters (MLIs) because of their potential for medium- and high-power applications. MLIs are becoming more popular as a result of their ability to generate higher voltage levels, minimal power losses, small size, and low price. These inverters are used in high-voltage and high-power applications because the stress on each switch is low. Even though many traditional topologies exist, such as the cascaded H-bridge MLI, the flying capacitor MLI, and the diode-clamped MLI, they all have some drawbacks. A complicated control system is needed for the flying capacitor MLI to balance the voltage across the capacitors, and the diode-clamped MLI requires more diodes as the number of levels increases. Even though the cascaded H-bridge MLI is popular in terms of modularity and simple control, it requires a larger number of isolated DC sources. Therefore, a topology with fewer devices has always been necessary for greater efficiency and reliability. A new single-phase MLI topology is introduced to minimize the required switch count and increase the number of output levels; it was developed with 3 DC voltage sources, 8 switches, and 13 levels at the output. To demonstrate the proposed converter's superiority over the other MLI topologies currently in use, a thorough analysis of the proposed topology will be conducted.

Keywords: DC-AC converter, multi-level inverter (MLI), diodes, H-bridge inverter, switches

Procedia PDF Downloads 57
397 The Influence of Contextual Factors on Long-Term Contraceptive Use in East Java

Authors: Ni'mal Baroya, Andrei Ramani, Irma Prasetyowati

Abstract:

Access to reproductive health services, including safe and effective contraception, is a human right regardless of social stratum and residence. In addition to individual factors, family and contextual factors are also believed to influence the use of contraceptive methods. This study aimed to assess the determinants of the use of long-term contraceptive methods (LTCM) by considering factors at both the individual and contextual levels. The study thereby provides basic information for developing programs to increase the prevalence of LTCM (MKJP) in East Java. The research used a cross-sectional design and the Riskesdas 2013 data for East Java Province, analyzed further with multilevel modeling of LTCM use. The sample consisted of 20,601 married, non-pregnant women drawn by probability sampling following the Riskesdas 2013 sampling technique. The individual-level independent variables were education, age, occupation, access to family planning (KB) services, economic status, and residence. The district-level independent variables were the Human Development Index (HDI, henceforth IPM) of each district of East Java Province and the ratios of field officers, midwives, community health centers, and doctors. The dependent variable was the use of a long-term contraceptive method (LTCM or MKJP). The data were analyzed using the chi-square test and the Pearson product-moment correlation. The multivariable analysis used multilevel logistic regression with 95% confidence intervals (CI), a significance level of p < 0.05, and 80% power. The results showed that a low contraceptive prevalence rate (CPR) for LTCM was concentrated in districts on Madura Island and the north coast. Women aged 25 to 35 or over 35, with at least a high-school education, who were working, and of middle-class social status were more likely to use LTCM (MKJP). The IPM and a low field-officer (PLKB) ratio had implications for a poor LTCM/MKJP CPR.
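As a simplified stand-in for the multilevel logistic regression described above (not the authors' exact model), the Python sketch below fits an ordinary logit with standard errors clustered by district; the file name and variable names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical individual-level extract; one row per woman, names illustrative.
df = pd.read_csv("riskesdas_2013_east_java.csv")

# Simplified alternative to a full multilevel logistic model:
# ordinary logit with district-clustered standard errors.
model = smf.logit(
    "uses_ltcm ~ education + age_group + works + access_fp + wealth + urban",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["district"]})

print(model.summary())
print(model.conf_int(alpha=0.05))   # 95% confidence intervals
```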

Keywords: multilevel, long-term contraceptive methods, east java, contextual factor

Procedia PDF Downloads 220
396 Integrating Time-Series and High-Spatial Remote Sensing Data Based on Multilevel Decision Fusion

Authors: Xudong Guan, Ainong Li, Gaohuan Liu, Chong Huang, Wei Zhao

Abstract:

Due to the low spatial resolution of MODIS data, the accuracy of extracting small patches in landscapes with a high degree of fragmentation is greatly limited. To this end, the study combines Landsat data, with its higher spatial resolution, and MODIS data, with its higher temporal resolution, for decision-level fusion. Considering the importance of land heterogeneity in the fusion process, it is incorporated into the weighting factor, which linearly weights the Landsat classification result and the MODIS classification result. Three levels were used to complete the data fusion: the MODIS pixel level, the Landsat pixel level, and an object level that connects these two levels. The multilevel decision fusion scheme was tested at two sites in the lower Mekong basin. A comparison test showed that the classification accuracy was improved relative to the single-data-source classification results in terms of overall accuracy. The method was also compared with the two-level combination results and a weighted-sum decision-rule-based approach. The decision fusion scheme is extensible to other multi-resolution data decision fusion applications.
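The heterogeneity-weighted linear combination described above can be illustrated at the pixel level as follows; the probability maps, class count, and heterogeneity values are synthetic placeholders, and the full scheme also involves the object level that links the two resolutions.

```python
import numpy as np

# Hypothetical per-pixel class-probability maps resampled to a common grid,
# with axis order (rows, cols, classes); values are synthetic.
rng1, rng2, rng3 = (np.random.default_rng(s) for s in (1, 2, 3))
p_landsat = rng1.dirichlet(np.ones(4), size=(100, 100))
p_modis   = rng2.dirichlet(np.ones(4), size=(100, 100))

# Heterogeneity factor in [0, 1]: more fragmented landscape -> more weight
# on the finer-resolution Landsat result.
heterogeneity = rng3.uniform(0, 1, size=(100, 100))
w = heterogeneity[..., None]

fused = w * p_landsat + (1 - w) * p_modis   # linear decision-level fusion
fused_class = fused.argmax(axis=-1)         # final label per pixel
print(fused_class.shape)
```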

Keywords: image classification, decision fusion, multi-temporal, remote sensing

Procedia PDF Downloads 101
395 Programmed Speech to Text Summarization Using Graph-Based Algorithm

Authors: Hamsini Pulugurtha, P. V. S. L. Jagadamba

Abstract:

Programmed speech-to-text and text summarization using graph-based algorithms can be used in meetings to obtain a short description of the meeting for future reference. The system performs signature verification using a Siamese neural network to confirm the identity of the user, and it converts the audio recording provided by the user, which is in English, into English text using the speech recognition package available in Python. Sometimes only a summary of the meeting is required; text summarization is the solution for this. The transcript is therefore summarized using natural language processing approaches, for example, unsupervised extractive text summarization algorithms.
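As an illustration of the graph-based extractive step only (the signature-verification and speech-recognition stages are not sketched), the following TextRank-style routine ranks sentences by PageRank over a TF-IDF cosine-similarity graph; the example meeting sentences are invented.

```python
import networkx as nx
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def textrank_summary(sentences, num_sentences=2):
    """Graph-based extractive summary: rank sentences by PageRank over a
    cosine-similarity graph and return the top ones in original order."""
    tfidf = TfidfVectorizer().fit_transform(sentences)
    sim = cosine_similarity(tfidf)
    np.fill_diagonal(sim, 0.0)                  # ignore self-similarity
    graph = nx.from_numpy_array(sim)            # weighted sentence graph
    scores = nx.pagerank(graph)
    top = sorted(scores, key=scores.get, reverse=True)[:num_sentences]
    return [sentences[i] for i in sorted(top)]

meeting = [
    "The team agreed to release the new build on Friday.",
    "Lunch options near the office were discussed briefly.",
    "Testing of the new build revealed two blocking bugs.",
    "The release will slip unless the blocking bugs are fixed by Thursday.",
]
print(textrank_summary(meeting))
```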

Keywords: Siamese neural network, English speech, English text, natural language processing, unsupervised extractive text summarization

Procedia PDF Downloads 184
394 The Effects of Subjective and Objective Indicators of Inequality on Life Satisfaction in a Comparative Perspective Using a Multi-Level Analysis

Authors: Atefeh Bagherianziarat, Dana Hamplova

Abstract:

The inverse social gradient in life satisfaction (LS) is a well-established research finding. To estimate the influence of inequality on LS, most studies have explored the effect of the objective aspects of inequality, or individuals' socioeconomic status (SES). However, relatively few studies have recently confirmed the significant effect of the subjective aspect of inequality, or subjective socioeconomic status (SSS), on life satisfaction over and above SES. In other words, some studies confirm that individuals' perception of their unequal status in society (SSS) can moderate the impact of their absolute unequal status on their life satisfaction. Nevertheless, it has not been established whether this moderating link works in the same way in societies with different levels of social inequality, or for people who hold the value of equality to different degrees. In this study, we compared the moderating influence of subjective inequality on the link between objective inequality and life satisfaction. In particular, we focus on differences across welfare state regimes based on Esping-Andersen's theory. We also explored the moderating role of belief in the value of equality in the link between objective and subjective inequality and LS in the given societies. Since the studied variables were measured at both the individual and country levels, we applied a multilevel analysis to the European Social Survey data (round 9). The results showed that people in different regimes reported significantly different levels of life satisfaction, which is explained to different extents by their household income and their perception of income inequality. The findings support previous evidence of the moderating influence of perceived inequality on the link between objective inequality and LS; however, this link differs across welfare state regimes. The multilevel modeling showed that country-level subjective equality is a positive predictor of individuals' life satisfaction, while the Gini coefficient, taken as the indicator of absolute inequality, has a smaller effect on life satisfaction. In addition, country-level subjective equality moderates the confirmed link between individuals' income and their life satisfaction. It can be concluded that both individual- and country-level subjective inequality slightly moderate the effect of individuals' income on life satisfaction.
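A minimal sketch of the kind of random-intercept model with a cross-level interaction described above, assuming an ESS-style data extract; the file name and variable names are illustrative, and the authors' actual specification may differ.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical ESS round 9 extract; one row per respondent.
df = pd.read_csv("ess_round9_subset.csv")

# Random intercept for country, with a cross-level interaction between
# household income and country-level subjective equality (the moderation
# discussed above); the Gini coefficient enters as a country-level control.
model = smf.mixedlm(
    "life_satisfaction ~ income * country_subjective_equality + gini",
    data=df,
    groups=df["country"],
).fit()
print(model.summary())
```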

Keywords: individual values, life satisfaction, multilevel analysis, objective inequality, subjective inequality, welfare state regimes

Procedia PDF Downloads 72
393 Ultraviolet Visible Spectroscopy Analysis on Transformer Oil by Correlating It with Various Oil Parameters

Authors: Rajnish Shrivastava, Y. R. Sood, Priti Pundir, Rahul Srivastava

Abstract:

The power transformer is one of the most important devices used in a power station. Its life is shortened by impending faults, ageing, and other factors, so it becomes necessary to diagnose the oil for fault analysis. Due to chemical, electrical, thermal, and mechanical stresses, the insulating material in the power transformer degrades. It is therefore important to regularly assess the condition of the oil and the remaining life of the power transformer. In this paper, the UV-VIS absorption graph area is correlated with the moisture content, flash point, interfacial tension (IFT), and density of transformer oil. Since the UV-VIS absorption graph area varies with the variation in these transformer oil parameters, the decay contents of the transformer oil can be predicted by obtaining the correlation between the different oil parameters and the UV-VIS absorption area.
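The correlation step can be sketched as follows: compute the area under each absorption curve and correlate it with one oil parameter (moisture content here). All values below are synthetic placeholders.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical data: one absorbance spectrum per oil sample plus its
# measured moisture content (ppm).
wavelengths = np.linspace(300, 700, 401)                    # nm
spectra = np.random.default_rng(0).uniform(0.1, 1.0, (12, 401))
moisture = np.random.default_rng(1).uniform(5, 40, 12)      # ppm

# Area under each UV-VIS absorption curve (trapezoidal rule).
areas = np.trapz(spectra, wavelengths, axis=1)

r, p_value = pearsonr(areas, moisture)
print(f"absorption area vs. moisture: r = {r:.3f}, p = {p_value:.3f}")
```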

Keywords: breakdown voltage (BDV), interfacial Tension (IFT), moisture content, ultra violet-visible rays spectroscopy (UV-VIS)

Procedia PDF Downloads 618
392 Clinical Efficacy of Indigenous Software for Automatic Detection of Stages of Retinopathy of Prematurity (ROP)

Authors: Joshi Manisha, Shivaram, Anand Vinekar, Tanya Susan Mathews, Yeshaswini Nagaraj

Abstract:

Retinopathy of prematurity (ROP) is abnormal blood vessel development in the retina of the eye of a premature infant. The principal objective of this work is to provide a technique for detecting the demarcation line and the ridge in a given ROP image, which facilitates early detection of ROP in stage 1 and stage 2. The demarcation line is an indicator of stage 1 ROP, and the ridge is the hallmark of stage 2 ROP. Thirty RetCam images of Asian Indian infants obtained during routine ROP screening were used for the analysis. A graphical user interface was developed to detect the demarcation line/ridge and to extract the ground truth. The novel algorithm uses multilevel vessel enhancement to enhance tubular structures in the digital ROP images. It has been observed that the orientation of the demarcation line/ridge is normal to the direction of the blood vessels, and this is used for the identification of the ridge/demarcation line. A quantitative analysis is presented based on gold-standard images marked by an expert ophthalmologist. The image-based analysis is based on the length and position of the detected ridge. In the image-based evaluation, the average sensitivity and positive predictive value were found to be 92.30% and 85.71%, respectively. In the pixel-based evaluation, the average sensitivity, specificity, positive predictive value, and negative predictive value achieved were 60.38%, 99.66%, 52.77%, and 99.75%, respectively.
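Multiscale (multilevel) enhancement of tubular structures is commonly done with a Frangi-type vesselness filter; the sketch below shows that generic step only, not the authors' full pipeline, and the image path, scale range, and threshold are assumptions.

```python
from skimage import filters, io

# Hypothetical RetCam image read as a grayscale array.
image = io.imread("retcam_sample.png", as_gray=True)

# Multiscale vesselness: responses over scales of 1-5 pixels are combined,
# enhancing tubular structures such as vessels and the ridge.
vesselness = filters.frangi(image, sigmas=range(1, 6))

# Crude illustrative threshold for ridge/demarcation-line candidates.
candidates = vesselness > vesselness.mean() + 2 * vesselness.std()
print(candidates.sum(), "candidate ridge/demarcation-line pixels")
```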

Keywords: ROP, ridge, multilevel vessel enhancement, biomedical

Procedia PDF Downloads 371
391 Spatial Analysis and Determinants of Number of Antenatal Health Care Visit Among Pregnant Women in Ethiopia: Application of Spatial Multilevel Count Regression Models

Authors: Muluwerk Ayele Derebe

Abstract:

Background: Antenatal care (ANC) is an essential element in the continuum of reproductive health care for preventing pregnancy-related morbidity and mortality. Objective: The aim of this study is to assess the spatial pattern and predictors of ANC visits in Ethiopia. Method: This study used the 2016 Ethiopian Demographic and Health Survey, a nationwide community-based cross-sectional survey, covering 7,174 pregnant women aged 15-49 years. Spatial analysis was done using the Getis-Ord Gi* statistic to identify hot-spot and cold-spot areas of ANC visits. The multilevel glmmTMB package, adjusted for spatial effects, was used in R. Spatial multilevel count regression was conducted to identify predictors of antenatal care visits for pregnant women, and the proportional change in variance was computed to uncover the effects of individual- and community-level factors on ANC visits. Results: The distribution of ANC visits was spatially clustered (Moran's I = 0.271, p < 0.001; ICC = 0.497, p < 0.001). The highest spatial outlier areas of ANC visits were found in the Amhara (South Wollo, West Gojjam, North Shewa), Oromia (West Arsi and East Hararghe), Tigray (Central Tigray), and Benishangul-Gumuz (Asosa and Metekel) regions. The data exhibited excess zeros (34.6%) and over-dispersion. The expected number of ANC visits for pregnant women with pregnancy complications was higher by 0.7868 on the log scale [ARR = 2.1964, 95% CI: 1.8605-2.5928, p < 0.0001] compared with pregnant women who had no pregnancy complications. The expected number of ANC visits for a pregnant woman living in a rural area was higher by 1.2254 on the log scale [ARR = 3.4057, 95% CI: 2.1462-5.4041, p < 0.0001] compared with a pregnant woman living in an urban area. The study also found dissimilar clusters, that is, areas with a low mean number of ANC visits surrounded by clusters with a higher mean number of ANC visits, when other variables were held constant. Conclusion: This study found that the number of ANC visits in Ethiopia has a spatial pattern associated with socioeconomic, demographic, and geographic risk factors. Spatial clustering of ANC visits exists in all regions of Ethiopia. At the individual level, the mother's age, religion, mother's education, husband's education, mother's occupation, husband's occupation, signs of pregnancy complications, wealth index, and marital status had a strong association with the number of ANC visits. At the community level, place of residence, region, mother's age, sex of the household head, signs of pregnancy complications, and distance to a health facility had a strong association with the number of ANC visits.
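The zero-inflated Poisson component named in the keywords can be sketched as below, omitting the spatial and multilevel parts of the model described above; the synthetic data frame and variable names are placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

# Hypothetical extract: ANC visit counts with excess zeros.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "anc_visits": rng.poisson(2, 500) * rng.binomial(1, 0.65, 500),
    "rural": rng.binomial(1, 0.7, 500),
    "complication": rng.binomial(1, 0.2, 500),
})
X = sm.add_constant(df[["rural", "complication"]])

# One part of the model explains the excess zeros, the other the counts.
zip_model = ZeroInflatedPoisson(df["anc_visits"], X, exog_infl=X).fit(disp=False)
print(zip_model.summary())
```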

Keywords: Ethiopia, ANC, spatial, multilevel, zero inflated Poisson

Procedia PDF Downloads 53
390 Power Iteration Clustering Based on Deflation Technique on Large Scale Graphs

Authors: Taysir Soliman

Abstract:

Spectral clustering (SC) is one of the currently popular clustering techniques because of its advantages over conventional approaches such as hierarchical clustering, k-means, and other techniques. However, one of the disadvantages of SC is its time-consuming computation, because it requires computing eigenvectors. To overcome this disadvantage, a number of approaches have been proposed, such as the Power Iteration Clustering (PIC) technique, a variant of SC. Some of PIC's advantages are: 1) its scalability and efficiency, 2) finding one pseudo-eigenvector instead of computing the eigenvectors, and 3) obtaining a linear combination of the eigenvectors in linear time. However, its main disadvantage is the inter-class collision problem, because it uses only one pseudo-eigenvector, which is not enough. Previous researchers developed Deflation-based Power Iteration Clustering (DPIC) to overcome the inter-class collision problem of PIC while retaining PIC's efficiency. In this paper, we develop Parallel DPIC (PDPIC) to improve the time and memory complexity; it runs on the Apache Spark framework using sparse matrices. To test the performance of PDPIC, we compared it with the SC, ESCG, and ESCALG algorithms on four small and nine large graph benchmark datasets, where PDPIC achieved higher accuracy and lower running time than the compared algorithms.
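The basic PIC idea, running power iteration on the row-normalized affinity matrix and clustering the resulting one-dimensional pseudo-eigenvector, can be sketched as follows. This toy version stops after a fixed, small number of iterations (before the vector collapses to a constant) and uses a simple one-dimensional k-means step; it includes neither the deflation of DPIC nor the Spark parallelization, and the affinity matrix is invented.

```python
import numpy as np

def power_iteration_clustering(affinity, k, num_iters=10, seed=0):
    """Minimal PIC sketch: a few power-iteration steps on the row-normalized
    affinity matrix, then 1-D k-means on the pseudo-eigenvector entries."""
    W = affinity / affinity.sum(axis=1, keepdims=True)   # row-normalize
    rng = np.random.default_rng(seed)
    v = rng.uniform(size=affinity.shape[0])
    v /= np.abs(v).sum()
    for _ in range(num_iters):                           # stop early on purpose
        v = W @ v
        v /= np.abs(v).sum()
    centers = np.quantile(v, np.linspace(0, 1, k))       # crude 1-D k-means init
    for _ in range(20):
        labels = np.argmin(np.abs(v[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = v[labels == j].mean()
    return labels

# Two obvious blocks in a toy affinity matrix.
A = np.array([[1.0, 0.9, 0.8, 0.1, 0.1],
              [0.9, 1.0, 0.7, 0.1, 0.1],
              [0.8, 0.7, 1.0, 0.1, 0.1],
              [0.1, 0.1, 0.1, 1.0, 0.9],
              [0.1, 0.1, 0.1, 0.9, 1.0]])
print(power_iteration_clustering(A, k=2))
```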

Keywords: spectral clustering, power iteration clustering, deflation-based power iteration clustering, Apache spark, large graph

Procedia PDF Downloads 155
389 Empirical Roughness Progression Models of Heavy Duty Rural Pavements

Authors: Nahla H. Alaswadko, Rayya A. Hassan, Bayar N. Mohammed

Abstract:

Empirical deterministic models have been developed to predict the roughness progression of heavy-duty spray-sealed pavements for a dataset representing rural arterial roads. The dataset provides a good representation of the relevant network and covers a wide range of operating and environmental conditions. A large sample of historical time-series data for many pavement sections was collected and prepared for use in multilevel regression analysis. The modelling parameters include road roughness as the performance parameter, with traffic loading, time, initial pavement strength, reactivity level of the subgrade soil, climate condition, and condition of the drainage system as predictor parameters. The purpose of this paper is to report the approaches adopted for model development and validation. The study presents multilevel models that can account for the correlation among time-series data of the same section and capture the effect of unobserved variables. Study results show that the models fit the data very well. The contribution and significance of the relevant influencing factors in predicting roughness progression are presented and explained. The paper concludes that the analysis approach used for developing the models confirmed their accuracy and reliability through a good fit to the validation data.

Keywords: roughness progression, empirical model, pavement performance, heavy duty pavement

Procedia PDF Downloads 140
388 A Multilevel Approach of Reproductive Preferences and Subsequent Behavior in India

Authors: Anjali Bansal

Abstract:

Reproductive preferences mainly deal with two questions: when a couple wants children and how many they want. Questions related to these desires are often included in fertility surveys, as they can provide relevant information on subsequent behavior. The aim of the study is to observe whether respondents' answers to these questions change over time. We also try to identify socio-economic and demographic factors associated with the stability (or instability) of fertility preferences. For this purpose, we used IHDS-1 (2004-05) and the follow-up survey IHDS-2 (2011-12) and applied bivariate, multivariate, and multilevel repeated-measures analyses to assess the consistency between responses. From the analysis, we found that women's preferences change over time: the bivariate analysis shows that 52% of women are not consistent in their desired family size, and large inconsistencies are found in the desire to continue childbearing. To get a better overview of these inconsistencies, we computed the intra-class correlation (ICC), which captures the consistency of individuals' fertility responses between the two time periods. We also found that the husband's desire for an additional child, specifically male offspring, contributes to these variations. Our findings lead us to the conclusion that, in India, individuals' fertility preferences changed over the seven-year period, as the intra-class correlation turns out to be very small, reflecting the variation among individuals. Concerted efforts should therefore be made to educate people and conduct motivational programs to promote family planning for family welfare.
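For reference, in a two-level (repeated-measures) random-intercept setting the intra-class correlation used above is the share of the total variance that lies between individuals:

```latex
\mathrm{ICC} \;=\; \frac{\sigma^{2}_{\text{between}}}{\sigma^{2}_{\text{between}} + \sigma^{2}_{\text{within}}}
```

A value close to zero, as reported here, means that responses vary mostly within individuals across the two waves rather than between individuals.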

Keywords: change, consistency, preferences, over time

Procedia PDF Downloads 143
387 Memetic Algorithm for Solving the One-To-One Shortest Path Problem

Authors: Omar Dib, Alexandre Caminada, Marie-Ange Manier

Abstract:

The purpose of this study is to introduce a novel approach to solving the one-to-one shortest path problem. A directed connected graph is assumed, in which all edge weights are positive. Our method is based on a memetic algorithm in which we combine a genetic algorithm (GA) and a variable neighborhood search (VNS) method. We compare our approximate method with two exact algorithms, Dijkstra's algorithm and integer programming (IP). We conducted experiments using randomly generated, complete, and real graph instances. In most case studies, the numerical results show that our method outperforms the exact methods in running time while keeping an average gap to optimality of 5%. On average, our algorithm is 20 times faster than Dijkstra's algorithm and more than 1,000 times faster than IP. The details of the experimental results are also discussed and presented in the paper.

Keywords: shortest path problem, Dijkstra’s algorithm, integer programming, memetic algorithm

Procedia PDF Downloads 439
386 Encapsulation of Volatile Citronella Essential oil by Coacervation: Efficiency and Release Kinetic Study

Authors: Rafeqah Raslan, Mastura AbdManaf, Junaidah Jai, Istikamah Subuki, Ana Najwa Mustapa

Abstract:

The volatile citronella essential oil was encapsulated by simple coacervation and complex coacervation using gum Arabic and gelatin as wall materials. Glutaraldehyde was used as the crosslinking agent. A citronella standard calibration graph was developed, with R² equal to 0.9523, for accurate determination of the encapsulation efficiency and for the release study. The release kinetics were analyzed based on Fick's law of diffusion for polymeric systems, and a linear graph of the log of the fraction released versus log time was constructed to determine the release rate constant, k, and the diffusional exponent, n. Both coacervation methods in the present study produced an encapsulation efficiency of around 94%. The capsule morphology analysis supported the release kinetic mechanisms of the produced capsules for both coacervation processes.
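The log-log analysis described above amounts to fitting a power-law (Korsmeyer-Peppas-type) release model, Mt/M∞ = k·tⁿ, by linear regression on the transformed data; the release values below are invented for illustration.

```python
import numpy as np

# Hypothetical release data: time (h) and cumulative fraction released.
t = np.array([0.5, 1, 2, 4, 8, 12])
fraction = np.array([0.08, 0.12, 0.18, 0.27, 0.41, 0.50])

# Fit log(Mt/Minf) = log(k) + n*log(t): the slope is the exponent n and the
# intercept gives the release rate constant k.
n, log_k = np.polyfit(np.log10(t), np.log10(fraction), 1)
print(f"release exponent n = {n:.2f}, rate constant k = {10**log_k:.3f}")
```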

Keywords: simple coacervation, complex coacervation, encapsulation efficiency, release kinetic study

Procedia PDF Downloads 293
385 Stress Concentration Trend for Combined Loading Conditions

Authors: Aderet M. Pantierer, Shmuel Pantierer, Raphael Cordina, Yougashwar Budhoo

Abstract:

Stress concentration occurs when there is an abrupt change in the geometry of a mechanical part under loading. These geometric features can include holes, notches, or cracks within the component, and they create larger stresses within the part. This maximum stress is difficult to determine, as it occurs directly at the point of minimum area, and strain gauges capable of analyzing stresses over such minute areas have yet to be developed. Therefore, a stress concentration factor must be utilized. The stress concentration factor is a dimensionless parameter calculated solely from the geometry of a part. The factor is multiplied by the nominal, or average, stress of the component, which can be found analytically or experimentally. Stress concentration graphs exist for common loading conditions and geometrical configurations to aid in the determination of the maximum stress a part can withstand. These graphs were developed from historical data yielded by experimentation. This project seeks to verify a stress concentration graph for combined loading conditions. The aforementioned graph was developed using CATIA finite element analysis software. The results of this analysis will be validated through further testing. The 3D-modeled parts will be subjected to further finite element analysis using Patran-Nastran software. The finite element models will then be verified by testing physical specimens using a tensile testing machine. Once the data are validated, the unique stress concentration graph will be submitted for publication so it can aid engineers in future projects.
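For reference, the relationship described above between the maximum stress, the geometry-dependent stress concentration factor, and the nominal stress is usually written as

```latex
\sigma_{\max} \;=\; K_t \, \sigma_{\text{nom}}
```

where K_t depends only on the geometry and loading mode, and σ_nom is the nominal stress computed over the net section.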

Keywords: stress concentration, finite element analysis, finite element models, combined loading

Procedia PDF Downloads 401
384 Plotting of an Ideal Logic versus Resource Outflow Graph through Response Analysis on a Strategic Management Case Study Based Questionnaire

Authors: Vinay A. Sharma, Shiva Prasad H. C.

Abstract:

The initial stages of any project are often observed to be in a mixed set of conditions. Setting up the project is a tough task, but taking the initial decisions is not that complex, as some of the critical factors are yet to be introduced into the scenario. These simple initial decisions potentially shape the timeline and the subsequent events that might later be plotted on it. Proceeding towards a solution for a problem is the primary objective in the initial stages. The optimization of the solutions can come later, and hence the resources deployed towards attaining the solution are higher than they would have been in the optimized versions. A ‘logic’ that counters the problem is essentially the core of the desired solution. Thus, if the problem is solved, the deployment of resources has led to the required logic being attained. As the project proceeds, the individuals working on it face fresh challenges as a team and become better accustomed to their surroundings. The developed, optimized solutions are then considered for implementation, as the individuals are now experienced, know better the causes and consequences of possible failure, and thus integrate adequate tolerances wherever required. Furthermore, as the team grows in strength, acquires substantial knowledge, and begins to transfer it efficiently, the individuals in charge of the project, along with the managers, focus more on the optimized solutions rather than the traditional ones to minimize the required resources. Hence, as time progresses, the authorities prioritize attainment of the required logic at a lower amount of dedicated resources. For an empirical analysis of the stated theory, leaders and key figures in organizations are surveyed for their ideas on the appropriate logic required for tackling a problem. Key pointers spotted in successfully implemented solutions are noted from the analysis of the responses, and a metric for measuring logic is developed. A graph is plotted with the quantifiable logic on the Y-axis and the resources dedicated to the solutions of various problems on the X-axis. The dedicated resources are plotted over time, and hence the X-axis is also a measure of time. In the initial stages of the project, the graph is rather linear, as the required logic is attained but the consumed resources are also high. With time, the authorities begin focusing on optimized solutions, since the logic attained through them is higher while the resources deployed are comparatively lower. Hence, the difference between consecutively plotted ‘resources’ decreases and, as a result, the slope of the graph gradually increases. Overall, the graph takes a parabolic shape (beginning at the origin), as with each resource investment, ideally, the difference keeps decreasing and the logic attained through the solution keeps increasing. Even if the resource investment is higher, the managers and authorities ideally make sure that the investment is being made in a proportionally high logic for a larger problem; that is, ideally, the slope of the graph increases with the plotting of each point.

Keywords: decision-making, leadership, logic, strategic management

Procedia PDF Downloads 86
383 Human Posture Estimation Based on Multiple Viewpoints

Authors: Jiahe Liu, Hongyang Yu, Feng Qian, Miao Luo

Abstract:

This study aimed to address the problem of improving the confidence of key points by fusing multi-view information, thereby estimating human posture more accurately. We first obtained multi-view image information and then used the MvP algorithm to fuse this multi-view information and obtain a set of high-confidence human key points. We used these as the input to the Spatio-Temporal Graph Convolutional Network (ST-GCN). ST-GCN is a deep learning model for processing spatio-temporal data, which can effectively capture spatio-temporal relationships in video sequences. By using the MvP algorithm to fuse multi-view information and feeding it into the spatio-temporal graph convolution model, this study provides an effective method to improve the accuracy of human posture estimation and offers strong support for further research and application in related fields.

Keywords: multi-view, pose estimation, ST-GCN, joint fusion

Procedia PDF Downloads 41
382 A Combinatorial Representation for the Invariant Measure of Diffusion Processes on Metric Graphs

Authors: Michele Aleandri, Matteo Colangeli, Davide Gabrielli

Abstract:

We study a generalization to a continuous setting of the classical Markov chain tree theorem. In particular, we consider an irreducible diffusion process on a metric graph. The unique invariant measure has an atomic component on the vertices and an absolutely continuous part on the edges. We show that the corresponding density at x can be represented by a normalized superposition of the weights associated with the metric arborescences oriented toward the point x. A metric arborescence is a metric tree oriented towards its root. The weight of each oriented metric arborescence is obtained as the product of the exponentials of integrals of the form ∫ b/σ², where b is the drift and σ² is the diffusion coefficient, along the oriented edges, times a weight for each node determined by the local orientation of the arborescence around the node, times the inverse of the diffusion coefficient at x. The metric arborescences are obtained by cutting the original metric graph along some edges.
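For comparison, on a single edge (the one-dimensional case with no vertices) the invariant density of a diffusion with drift b and diffusion coefficient σ² takes the classical form below, where the factor 2 appears in the usual convention; the arborescence formula above generalizes this structure to metric graphs.

```latex
\rho(x) \;\propto\; \frac{1}{\sigma^{2}(x)} \, \exp\!\left( \int^{x} \frac{2\,b(y)}{\sigma^{2}(y)} \, dy \right)
```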

Keywords: diffusion processes, metric graphs, invariant measure, reversibility

Procedia PDF Downloads 136
381 Efficient Heuristic Algorithm to Speed Up Graphcut in Gpu for Image Stitching

Authors: Tai Nguyen, Minh Bui, Huong Ninh, Tu Nguyen, Hai Tran

Abstract:

The GraphCut algorithm has been widely used to solve various types of computer vision problems. Its high computational cost has encouraged many researchers to improve the speed of the algorithm. Recent works proposed schemes that work on parallel computing platforms such as CUDA. However, the problem of low convergence speed prevents the use of GraphCut in real-time applications. In this paper, we propose a global suppression heuristic to boost the convergence of the algorithm. A parallel CUDA implementation of the GraphCut algorithm designed for the image stitching problem is introduced. Our method achieves up to a 3× speedup on a graph of size 80 × 480 compared with the best sequential GraphCut algorithm, while producing satisfactory stitched images suitable for panorama applications. Our source code will soon be available for further research.
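At its core, GraphCut-based stitching reduces seam selection to an s-t minimum-cut / maximum-flow problem. The toy sketch below illustrates only that reduction on a tiny hand-built graph (CPU-based and unrelated to the CUDA implementation above); node names and capacities are invented.

```python
import networkx as nx

# Tiny s-t graph standing in for overlapping pixels of two images.
G = nx.DiGraph()
for u, v, cap in [("s", "a", 4), ("s", "b", 3), ("a", "b", 1),
                  ("a", "t", 2), ("b", "t", 5)]:
    G.add_edge(u, v, capacity=cap)

cut_value, (source_side, sink_side) = nx.minimum_cut(G, "s", "t")
print("cut value:", cut_value)
print("pixels kept from the left image:", sorted(source_side))
print("pixels kept from the right image:", sorted(sink_side))
```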

Keywords: CUDA, graph cut, image stitching, texture synthesis, maxflow/mincut algorithm

Procedia PDF Downloads 98
380 An Iberian Study about Location of Parking Areas for Dangerous Goods

Authors: María Dolores Caro, Eugenio M. Fedriani, Ángel F. Tenorio

Abstract:

When lorries transport dangerous goods, the European Union imposes legal stipulations to ensure the safety of other road users as well as of the goods being transported. In this respect, lorry drivers cannot park in ordinary parking areas; they must use parking areas with special conditions, including permanent supervision by security personnel. Moreover, drivers must satisfy additional regulations on resting and driving times, which affect the practical possibility of reaching suitable parking areas within these time limits. The “European Agreement concerning the International Carriage of Dangerous Goods by Road” (ADR) is the basic regulation on the transportation of dangerous goods, imposed under the recommendations of the United Nations Economic Commission for Europe. Indeed, there are currently not enough parking areas adapted for dangerous goods, and no complete study has suggested the best locations at which to build new areas or to adapt existing ones so that lorry drivers can follow all the regulations. The goal of this paper is to show how many additional parking areas should be built in the Iberian Peninsula to allow lorry drivers to park in such areas while respecting the restrictions on resting and driving times. To do so, we have modeled the problem via graph theory and applied a new efficient algorithm that determines an optimal solution for the problem of locating new parking areas to complement those already listed in the ADR for the Iberian Peninsula. The solution can be considered minimal, since the number of additional parking areas returned by the algorithm is the smallest possible. Graph theory is a natural way to model and solve the proposed problem: the nodes are the already-existing parking areas, the loading-and-unloading locations, and the road bifurcations, while each edge between two nodes represents a road between them, with the distance between the nodes as the edge's weight. Except for the bifurcations, all the nodes correspond to already-existing parking areas; hence, the problem is to determine the additional nodes in the graph such that there are at most 100 km between any two consecutive nodes representing parking areas (the maximal distance allowed by the European regulations).
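A first step in such a graph model, checking which pairs of existing parking areas lie farther apart than the 100 km limit along the road network, can be sketched as follows; the toy graph, node names, and distances are invented, and the paper's algorithm for choosing where to place new areas is not reproduced here.

```python
import networkx as nx

# Toy road graph: nodes are parking areas (P*) or road bifurcations (J*);
# edge weights are road distances in km (illustrative values only).
roads = nx.Graph()
roads.add_weighted_edges_from([
    ("P1", "J1", 60), ("J1", "P2", 55), ("J1", "J2", 70),
    ("J2", "P3", 45), ("P2", "J2", 90),
])
parking = [n for n in roads if n.startswith("P")]

# Flag parking-area pairs whose road distance exceeds the 100 km limit.
dist = dict(nx.all_pairs_dijkstra_path_length(roads, weight="weight"))
too_far = [(a, b, dist[a][b])
           for i, a in enumerate(parking)
           for b in parking[i + 1:] if dist[a][b] > 100]
print(too_far)   # corridors where an additional parking area may be needed
```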

Keywords: dangerous goods, parking areas, Iberian peninsula, graph-based modeling

Procedia PDF Downloads 557