Search results for: Syntax Based Analysis
16400 Dissipation of Higher Mode using Numerical Integration Algorithm in Dynamic Analysis
Authors: Jin Sup Kim, Woo Young Jung, Minho Kwon
Abstract:
In general dynamic analyses, the lower-mode response is of primary interest; the higher modes of spatially discretized equations generally do not represent the real behavior and contribute little to the global response. Some implicit algorithms are therefore introduced to filter out the high-frequency modes by means of intentional numerical error. The objective of this study is to introduce the P-method and the PC α-method and to compare them with the dissipation method and the Newmark method through stability analysis and a numerical example. The PC α-method gives better accuracy than the other methods because it is based on the α-method and inherits the superior properties of the implicit α-method. In finite element analysis, the PC α-method is more useful than the other methods because it is an explicit scheme and achieves second-order accuracy and numerical damping simultaneously.
Keywords: Dynamic, α-Method, P-Method, PC α-Method, Newmark method.
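For orientation, the Newmark family referred to above uses the standard single-step update below (a textbook statement of the scheme, not the paper's specific parameter choices); choosing γ > 1/2 is what introduces the numerical damping of the higher modes.

```latex
\begin{aligned}
\mathbf{u}_{n+1} &= \mathbf{u}_n + \Delta t\,\dot{\mathbf{u}}_n
  + \frac{\Delta t^{2}}{2}\Big[(1-2\beta)\,\ddot{\mathbf{u}}_n + 2\beta\,\ddot{\mathbf{u}}_{n+1}\Big],\\
\dot{\mathbf{u}}_{n+1} &= \dot{\mathbf{u}}_n
  + \Delta t\Big[(1-\gamma)\,\ddot{\mathbf{u}}_n + \gamma\,\ddot{\mathbf{u}}_{n+1}\Big].
\end{aligned}
```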
16399 The Pragmatist Basis of Material Hermeneutics
Authors: Juho Lindholm
Abstract:
Practical hermeneutics explores the emergence of meaning in scientific practice. Visual hermeneutics is a subclass of it that explores the emergence of meaning in instrumentally mediated interactions with scientific objects. What remains to be explained is the theory of meaning on which their discussions are based. Linguistic theories of meaning seem utterly inappropriate for the analysis of the non-linguistic meanings that such hermeneutics invoke. In this article, it will be shown by conceptual analysis that the so-called “pragmatic maxim” provides sufficient resources for the philosophical analysis of such meanings. The “pragmatic maxim” states that the meaning of a thing consists in the potential practical effects of that thing. Because this notion is not confined to language, it can be applied broadly to anything meaningful, including practices and the instruments that are part of practices.
Keywords: Hermeneutics, philosophy of science, pragmatism, theory of meaning.
16398 Application of Neural Network for Contingency Ranking Based on Combination of Severity Indices
Authors: S. Jadid, S. Jalilzadeh
Abstract:
In this paper, an improved technique for contingency ranking using an artificial neural network (ANN) is presented. The proposed approach applies multi-layer perceptrons trained by backpropagation to contingency analysis. Severity indices for dynamic stability assessment are presented; these indices are based on the concept of coherency and three dot products of the system variables. It is well known that some indices work better than others for a particular power system. This paper, along with test results from several different systems, demonstrates that combining the indices with an ANN provides better ranking than a single index. The presented results are obtained through the use of power system simulation (PSS/E) and MATLAB 6.5 software.
Keywords: Composite indices, transient stability, neural network.
16397 Activity-Based Costing in the Hospitality Industry: A Case Study in a Hotel
Authors: Bita Mashayekhi, Mohammad Ara
Abstract:
The purpose of this study is to provide some empirical evidence about implementing Activity-Based Costing (ABC) in the hospitality industry in Iran. For this purpose, we consider the Tabriz International Hotel as our sample hotel and gather the relevant data from its cost accounting system for 2012. We then use ABC as our costing method and compare the cost of each service unit with the cost obtained under the traditional costing method. The results show a different cost per unit for the two methods. Also, because it provides more precise and detailed information, an ABC system facilitates the decision-making process for managers on decisions related to profitability analysis, budgeting, pricing, and so on.
Keywords: Activity-based costing, activity, cost driver, hospitality industry.
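As a rough illustration of the mechanics described above (activity-cost rates multiplied by each service unit's consumption of cost drivers), a minimal sketch is given below; the activities, drivers, and figures are hypothetical, not the hotel's actual data.

```python
# Minimal activity-based costing sketch with hypothetical activities and figures.
activity_costs = {"housekeeping": 120000.0, "front_desk": 80000.0, "laundry": 45000.0}
driver_volumes = {"housekeeping": 24000,   # rooms cleaned per year
                  "front_desk": 60000,     # guest interactions per year
                  "laundry": 30000}        # kg of laundry per year

# Cost-driver rate = total activity cost / total driver volume.
rates = {a: activity_costs[a] / driver_volumes[a] for a in activity_costs}

# Cost-driver consumption of one service unit (e.g., one room-night).
room_night_usage = {"housekeeping": 1, "front_desk": 0.5, "laundry": 2.5}

unit_cost = sum(rates[a] * room_night_usage[a] for a in room_night_usage)
print(f"ABC cost per room-night: {unit_cost:.2f}")
```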
16396 Mathematical Model of Depletion of Forestry Resource: Effect of Synthetic Based Industries
Authors: Manisha Chaudhary, Joydip Dhar, Govind Prasad Sahu
Abstract:
A mathematical model is proposed considering the forest biomass density B(t), the density of wood-based industries W(t) and the density of synthetic industries S(t). It is assumed that the forest biomass grows logistically in the absence of wood-based industries, and that depletion of the forestry biomass is due to the presence of wood-based industries. The growth of wood-based industries depends on B(t), while S(t) grows at a constant rate, independent of B(t). Further, there is competition between W(t) and S(t) according to market demand. The proposed model has four ecologically feasible steady states, namely E1, the forest-biomass-free and wood-industries-free equilibrium; E2, the wood-industries-free equilibrium; and two coexisting equilibria E∗1 and E∗2. The behavior of the system near all feasible equilibria is analyzed using the stability theory of differential equations. In the proposed model, the natural depletion rate h1 is a crucial parameter, and the system exhibits a Hopf bifurcation about the non-trivial equilibrium with respect to h1. The analytical results are verified using numerical simulation.
Keywords: A mathematical model, Competition between wood based and synthetic industries, Hopf-bifurcation, Stability analysis.
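The abstract does not reproduce the paper's exact functional forms; the sketch below only illustrates the general type of system described (logistic forest growth depleted by wood-based industries, a synthetic sector growing at a constant rate, and competition between the two), with every parameter value chosen arbitrarily.

```python
# Illustrative ODE system of the general type described; parameters are arbitrary.
import numpy as np
from scipy.integrate import solve_ivp

r, K = 0.8, 100.0          # logistic growth rate and carrying capacity of biomass B
alpha, beta = 0.01, 0.005  # depletion of B by W, growth of W from B
s0, h1 = 0.5, 0.2          # constant growth of S, natural depletion rate of W
c1, c2 = 0.002, 0.003      # competition coefficients between W and S

def rhs(t, y):
    B, W, S = y
    dB = r * B * (1 - B / K) - alpha * B * W
    dW = beta * B * W - h1 * W - c1 * W * S
    dS = s0 - c2 * W * S
    return [dB, dW, dS]

sol = solve_ivp(rhs, (0, 200), [50.0, 5.0, 1.0], dense_output=True)
print("Final state (B, W, S):", sol.y[:, -1])
```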
16395 Modeling and Performance Evaluation of LTE Networks with Different TCP Variants
Authors: Ghassan A. Abed, Mahamod Ismail, Kasmiran Jumari
Abstract:
Long Term Evolution (LTE) is a 4G wireless broadband technology developed by the Third Generation Partnership Project (3GPP) in Release 8, and it represents the continued competitiveness of the Universal Mobile Telecommunications System (UMTS) for the next 10 years and beyond. The concepts for LTE systems were introduced in 3GPP Release 8 with the objectives of a high-data-rate, low-latency and packet-optimized radio access technology. In this paper, the performance of different TCP variants over an LTE network is investigated. The performance of TCP over LTE is affected mostly by the links of the wired network and the total bandwidth available at the serving base station. This paper describes an NS-2 based simulation analysis of TCP-Vegas, TCP-Tahoe, TCP-Reno, TCP-Newreno, TCP-SACK, and TCP-FACK, with full modeling of all traffic of the LTE system. The evaluation of network performance with all TCP variants is based mainly on throughput, average delay and packet loss. The analysis of TCP performance over LTE shows that all TCP variants achieve similar throughput, with the best performance obtained by TCP-Vegas.
Keywords: LTE, EUTRAN, 3GPP, SAE, TCP variants, NS-2.
16394 Financial Analysis Analogies for Software Risk
Authors: Masood Uzzafer
Abstract:
A dynamic software risk assessment model is presented. Analogies between dynamic financial analysis and software risk assessment models are established, and based on these analogies it is suggested that a dynamic risk model is the way forward for the risk assessment of software projects. It is shown how software risk assessment changes during the different phases of a software project and hence requires a dynamic risk assessment model to capture these variations. Further, the evolution of dynamic financial analysis models is discussed and mapped to the evolution of software risk assessment models.
Keywords: Software risk assessment, software project management, software cost, dynamic modeling.
16393 The Analysis of Deceptive and Truthful Speech: A Computational Linguistic Based Method
Authors: Seham El Kareh, Miramar Etman
Abstract:
Recently, detecting liars and extracting features which distinguish them from truth-tellers have been the focus of a wide range of disciplines. To the authors' best knowledge, most of the work has been done on facial expressions and body gestures, but only a few works have examined the language used by liars and truth-tellers. This paper sheds light on four axes. The first axis copes with building an audio corpus of deceptive and truthful speech for Egyptian Arabic speakers. The second axis focuses on examining the human perception of lies and demonstrating our need for computational linguistic-based methods to extract features which characterize truthful and deceptive speech. The third axis is concerned with building a linguistic analysis program that can extract from the corpus the inter- and intra-linguistic cues for deceptive and truthful speech. The program built here is based on selected categories from the Linguistic Inquiry and Word Count program. Our results demonstrate that Egyptian Arabic speakers, on one hand, preferred to use first-person pronouns and the present tense rather than the past tense when lying, and their lies lacked second-person pronouns; on the other hand, when telling the truth, they preferred to use verbs related to motion and nouns related to time. The results also show that a larger data set is needed to establish the significance of words related to emotions and numbers.
Keywords: Egyptian Arabic corpus, computational analysis, deceptive features, forensic linguistics, human perception, truthful features.
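As a rough illustration of the LIWC-style category counting that this kind of analysis program performs, a sketch follows; the lexicons below are tiny, hypothetical, and English, whereas the study used selected LIWC categories applied to Egyptian Arabic speech.

```python
# Rough illustration of LIWC-style category counting over a transcript.
# The lexicons are tiny, hypothetical stand-ins, not the study's Arabic categories.
import re
from collections import Counter

LEXICONS = {
    "first_person": {"i", "me", "my", "we", "our"},
    "second_person": {"you", "your"},
    "motion_verbs": {"go", "walk", "drive", "move", "run", "walked"},
    "time_nouns": {"day", "hour", "morning", "yesterday", "week"},
}

def category_rates(transcript: str) -> dict:
    tokens = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(tokens)
    total = sum(counts.values()) or 1
    # Share of tokens falling in each category.
    return {cat: sum(counts[w] for w in words) / total
            for cat, words in LEXICONS.items()}

sample = "I walked to the market yesterday morning and we stayed an hour."
print(category_rates(sample))
```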
16392 Machining Parameters Optimization of Developed Yttria Stabilized Zirconia Toughened Alumina Ceramic Inserts While Machining AISI 4340 Steel
Authors: Nilrudra Mandal, B Doloi, B Mondal
Abstract:
An attempt has been made to investigate the machinability of zirconia toughened alumina (ZTA) inserts while turning AISI 4340 steel. The inserts were prepared by a powder metallurgy process route, and the machining experiments were performed based on a Response Surface Methodology (RSM) design called Central Composite Design (CCD). Mathematical models of flank wear, cutting force and surface roughness have been developed using second-order regression analysis. The adequacy of the models has been checked using Analysis of Variance (ANOVA) techniques. It can be concluded that cutting speed and feed rate are the two most influential factors for flank wear and cutting force prediction. For surface roughness, both cutting speed and depth of cut make significant contributions. The effect of the key parameters on each response is also presented as graphical contours for choosing the operating parameters precisely. An 83% desirability level has been achieved using this optimized condition.
Keywords: Analysis of Variance (ANOVA), Central Composite Design (CCD), Response Surface Methodology (RSM), Zirconia Toughened Alumina (ZTA).
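The second-order regression model referred to above has the standard RSM form, stated here generically rather than with the paper's fitted coefficients:

```latex
y = \beta_0 + \sum_{i=1}^{k} \beta_i x_i + \sum_{i=1}^{k} \beta_{ii} x_i^{2}
    + \sum_{i<j} \beta_{ij} x_i x_j + \varepsilon
```

where y is the response (flank wear, cutting force, or surface roughness) and the x_i are the coded machining parameters such as cutting speed, feed rate and depth of cut.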
16391 An Empirical Analysis of the Board Composition Concerning Logistics Competencies
Authors: Ingrid Göpfert, Michael Stephan, Wanja Wellbrock, Malte Ackermann
Abstract:
Empirical insights into the implementation of logistics competencies at the top management level are scarce. This paper addresses this issue with an explorative approach based on a dataset of 872 observations from the years 2000, 2004 and 2008, using quantitative content analysis of the annual reports of the 500 publicly listed firms with the highest global research and development expenditures according to the British Department for Business, Innovation and Skills. We find that logistics competencies are more pronounced in Asian companies than in their European or American counterparts. At the industry level, the results are quite mixed. Using partial point-biserial correlations, we show that logistics competencies are positively related to financial performance.
Keywords: Logistics, supply chain management, content analysis, executive boards, multinational corporations.
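A point-biserial correlation of the kind used above relates a binary indicator (e.g., whether a board exhibits a logistics competency) to a continuous variable (e.g., a financial performance measure); the partial variant additionally controls for covariates. A minimal sketch of the plain version with made-up data follows.

```python
# Point-biserial correlation between a binary indicator and a continuous measure.
# The data below are made up purely for illustration.
import numpy as np
from scipy.stats import pointbiserialr

has_logistics_competency = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1])  # binary indicator
return_on_assets = np.array([7.2, 3.1, 6.8, 5.9, 2.4, 4.0, 8.1, 3.7, 6.0, 7.5])  # percent

r, p_value = pointbiserialr(has_logistics_competency, return_on_assets)
print(f"point-biserial r = {r:.3f}, p = {p_value:.3f}")
```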
16390 Determination of Surface Roughness by Ball Burnishing Process Using Factorial Techniques
Authors: P. S. Dabeer, G. K. Purohit
Abstract:
Burnishing is a method of finishing and hardening machined parts by plastic deformation of the surface. Experimental work based on a central composite second-order rotatable design has been carried out on a lathe machine to establish the effects of ball burnishing parameters on the surface roughness of brass material. Analysis of the results using the analysis of variance technique and the F-test shows that the parameters considered have significant effects on the surface roughness.
Keywords: Ball burnishing, Response Surface Methodology.
16389 Improving Taint Analysis of Android Applications Using Finite State Machines
Authors: Assad Maalouf, Lunjin Lu, James Lynott
Abstract:
We present a taint analysis that can automatically detect when string operations result in a string that is free of taints, i.e., one from which all the tainted patterns have been removed. This is an improvement on the conservative behavior of previous taint analyzers, where a string operation on a tainted string always yields a tainted string unless the operation is manually marked as a sanitizer. The taint analysis is built on top of a string analysis that uses finite state automata to approximate the sets of values that string variables can take during the execution of a program. The proposed approach has been implemented as an extension of FlowDroid, and experimental results show that the resulting taint analyzer is much more precise than the original FlowDroid.
Keywords: Android, static analysis, string analysis, taint analysis.
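The core idea, that a string operation can be shown to eliminate every occurrence of a tainted pattern from the approximated value set, can be illustrated with a toy sketch; regular expressions and finite value sets stand in here for the finite state automata of the actual analysis, and none of this is FlowDroid's or the paper's implementation.

```python
# Toy illustration: track whether a tainted pattern can survive a string operation.
# Regexes and finite sets stand in for the finite state automata of the real analysis.
import re

TAINT_PATTERN = re.compile(r"<script\b", re.IGNORECASE)  # hypothetical tainted pattern

def is_tainted(possible_values):
    """A string variable is tainted if any possible value still contains the pattern."""
    return any(TAINT_PATTERN.search(v) for v in possible_values)

# Abstract value of a string variable: a finite over-approximation of its values.
values = {"<script>alert(1)</script>", "hello"}
print(is_tainted(values))  # True: the tainted pattern may be present

# Model of a string operation: replace() applied pointwise to the value set.
sanitized = {v.replace("<script", "") for v in values}
print(is_tainted(sanitized))  # False: the operation removed every occurrence
```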
16388 An Investigation into Kanji Character Discrimination Process from EEG Signals
Authors: Hiroshi Abe, Minoru Nakayama
Abstract:
The frontal area in the brain is known to be involved in behavioral judgement. Because a Kanji character can be discriminated both visually and linguistically from other characters, we hypothesized that in Kanji character discrimination, frontal event-related potential (ERP) waveforms reflect two discrimination processes in separate time periods: one based on visual analysis and the other based on lexical access. To examine this hypothesis, we recorded ERPs while subjects performed a Kanji lexical decision task. In this task, either a known Kanji character, an unknown Kanji character or a symbol was presented, and the subject had to report whether the presented character was a Kanji character known to them or not. The same response was required for unknown Kanji trials and symbol trials. As a preprocessing step, we examined the performance of a method using independent component analysis for artifact rejection, found it effective, and therefore used it. In the ERP results, there were two time periods in which the frontal ERP waveforms differed significantly between the unknown Kanji trials and the symbol trials: around 170 ms and around 300 ms after stimulus onset. This result supported our hypothesis. In addition, the result suggests that Kanji character lexical access may be fully completed by around 260 ms after stimulus onset.
Keywords: Character discrimination, event-related potential, independent component analysis, Kanji, lexical access.
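The abstract does not detail its ICA pipeline; the sketch below only shows the generic ICA artifact-rejection recipe (decompose, drop artifact components, reconstruct), using scikit-learn's FastICA on synthetic multichannel data rather than the study's EEG recordings.

```python
# Generic ICA artifact-rejection sketch (not the study's actual pipeline):
# decompose multichannel data into independent components, zero out the
# components judged to be artifacts, and reconstruct the cleaned signal.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_samples, n_channels = 2000, 8
eeg = rng.standard_normal((n_samples, n_channels))               # stand-in EEG data
eeg[:, 0] += 5 * np.sin(np.linspace(0, 20 * np.pi, n_samples))   # add an "artifact"

ica = FastICA(n_components=n_channels, random_state=0)
sources = ica.fit_transform(eeg)             # shape: (n_samples, n_components)

artifact_components = [int(np.argmax(np.var(sources, axis=0)))]  # pick by some criterion
sources[:, artifact_components] = 0.0        # reject the artifact components

cleaned = ica.inverse_transform(sources)     # back to channel space
print(cleaned.shape)
```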
16387 Implementation of Conceptual Real-Time Embedded Functional Design via Drive-by-Wire ECU Development
Authors: A. Ukaew, C. Chauypen
Abstract:
Design concepts of a real-time embedded system can be realized initially by introducing novel design approaches. In this work, a model-based design approach and in-the-loop testing were employed early in the conceptual and preliminary phase to formulate design requirements and perform quick real-time verification. The design and analysis methodology includes simulation analysis, model-based testing, and in-the-loop testing. The design of a conceptual drive-by-wire (DBW) algorithm for an electronic control unit (ECU) is presented to demonstrate the conceptual design process, analysis, and functionality evaluation. The DBW ECU functions can be implemented in the vehicle system to improve the drivability of an electric vehicle (EV) conversion. However, within a new development process, conceptual ECU functions and parameters need to be evaluated. As a result, a testing system was employed to support the evaluation of the conceptual DBW ECU functions. For the current setup, the system components consisted of actual DBW ECU hardware, electric vehicle models, and the controller area network (CAN) protocol. The vehicle models and the CAN bus interface were both implemented as real-time applications, where the ECU and CAN protocol functionality were verified according to the design requirements. The proposed system could potentially benefit rapid real-time analysis of design parameters for conceptual system or software algorithm development.
Keywords: Drive-by-wire ECU, in-the-loop testing, model-based design, real-time embedded system.
16386 Graph Cuts Segmentation Approach Using a Patch-Based Similarity Measure Applied for Interactive CT Lung Image Segmentation
Authors: Aicha Majda, Abdelhamid El Hassani
Abstract:
Lung CT image segmentation is a prerequisite for lung CT image analysis. Most conventional methods need post-processing to deal with abnormal lung CT scans, such as those containing lung nodules or other lesions. The simplest similarity measure in the standard graph cuts algorithm consists of directly comparing the pixel values of the two neighboring regions, which is not accurate because this kind of metric is extremely sensitive to minor transformations such as noise or other artifacts. In this work, we propose an improved version of the standard graph cuts algorithm based on a patch-based similarity metric. The boundary penalty term in the graph cut algorithm is defined based on the patch-based similarity measurement instead of the simple intensity measurement of the standard method. The weights between each pixel and its neighboring pixels are based on the resulting new term. The graph is then created using these weights between its nodes. Finally, the segmentation is completed with the min-cut/max-flow algorithm. Experimental results show that the proposed method is very accurate and efficient, and can directly provide explicit lung regions without any post-processing operations, in contrast to the standard method.
Keywords: Graph cuts, lung CT scan, lung parenchyma segmentation, patch-based similarity metric.
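One common way to turn a patch-based similarity into a boundary weight is a Gaussian of the patch distance; the sketch below illustrates that idea with NumPy. The formula and parameters are a common textbook choice, not necessarily the paper's exact definition.

```python
# Illustrative patch-based boundary weight for two neighboring pixels:
# w = exp(-||P_i - P_j||^2 / (2 * sigma^2)), where P_i and P_j are the
# intensity patches centred on the pixels. A common choice, not the paper's
# exact definition.
import numpy as np

def patch(image, y, x, radius=1):
    return image[y - radius:y + radius + 1, x - radius:x + radius + 1].astype(float)

def boundary_weight(image, p, q, radius=1, sigma=10.0):
    d2 = np.sum((patch(image, *p, radius) - patch(image, *q, radius)) ** 2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

img = np.random.default_rng(1).integers(0, 255, size=(64, 64))
print(boundary_weight(img, (10, 10), (10, 11)))  # weight for a horizontal neighbor pair
```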
16385 Fabrication of ZnO Nanorods Based Biosensor via Hydrothermal Method
Authors: Muhammad Tariq, Jafar Khan Kasi, Samiullah, Ajab Khan Kasi
Abstract:
Biosensors play a vital role in industrial, clinical, and chemical analysis applications. Among other techniques, a ZnO-based biosensor is an attractive approach due to its exceptional chemical and electrical properties. ZnO nanorods have a positively charged isoelectric point, which helps immobilize the negatively charged glucose oxidase (GOx). Here, we report ZnO nanorod based biosensors for the immobilization of GOx. The ZnO nanorods were grown by the hydrothermal method on an indium tin oxide (ITO) substrate. The fabrication of the biosensors was carried out through batch processing using conventional photolithography. The GOx buffer solutions were prepared in phosphate with a pH value of around 7.3. The biosensors effectively immobilized the GOx, and the result was analyzed through the voltage and current on the nanostructures.
Keywords: Hydrothermal growth, zinc oxide, biosensors.
16384 Numerical Modelling of Dry Stone Masonry Structures Based on Finite-Discrete Element Method
Authors: Ž. Nikolić, H. Smoljanović, N. Živaljić
Abstract:
This paper presents a numerical model based on the finite-discrete element method for analysis of the structural response of dry stone masonry structures under static and dynamic loads. More precisely, each discrete stone block is discretized by finite elements. Material non-linearity, including fracture and fragmentation of the discrete elements, as well as cyclic behavior during dynamic loading, is considered through contact elements which are implemented within the finite element mesh. The model was applied to several examples of these structures. The performed analysis shows high accuracy of the numerical results in comparison with the experimental ones and demonstrates the potential of the finite-discrete element method for modelling the response of dry stone masonry structures.
Keywords: Finite-discrete element method, dry stone masonry structures, static load, dynamic load.
16383 An Ontology for Investment in Chinese Steel Company
Authors: Liming Chen, Baoxin Xiu, Zhaoyun Ding, Bin Liu, Xianqiang Zhu
Abstract:
In the era of big data, public investors are faced with more complicated information related to investment decisions than ever before. To survive in the fierce competition, it has become increasingly urgent for investors to combine multi-source knowledge and evaluate a company's true value efficiently. For this, a rule-based ontology reasoning method is proposed to support the value assessment of steel companies. Considering the delay in financial disclosure, and based on cost-benefit analysis, this paper introduces financial analysis of supply chain enterprises and constructs an ontology model used to assess the value of a steel company. In addition, domain knowledge is formally expressed with the help of the Web Ontology Language (OWL) and SWRL (Semantic Web Rule Language) rules. Finally, a case study on a steel company in China proves the effectiveness of the proposed method.
Keywords: Financial ontology, steel company, supply chain, ontology reasoning.
16382 A New Approach for Prioritization of Failure Modes in Design FMEA using ANOVA
Authors: Sellappan Narayanagounder, Karuppusami Gurusami
Abstract:
The traditional Failure Mode and Effects Analysis (FMEA) uses the Risk Priority Number (RPN) to evaluate the risk level of a component or process. The RPN index is determined by calculating the product of the severity, occurrence and detection indexes. The most critically debated disadvantage of this approach is that various sets of these three indexes may produce an identical value of RPN. This research paper seeks to address the drawbacks of traditional FMEA and to propose a new approach to overcome these shortcomings. The Risk Priority Code (RPC) is used to prioritize failure modes when two or more failure modes have the same RPN. A new method is proposed to prioritize failure modes when there is a disagreement in the ranking scale for severity, occurrence and detection. An Analysis of Variance (ANOVA) is used to compare the means of the RPN values. The SPSS (Statistical Package for the Social Sciences) statistical analysis package is used to analyze the data. The results presented are based on two case studies. It is found that the proposed methodology resolves the limitations of the traditional FMEA approach.
Keywords: Failure mode and effects analysis, risk priority code, critical failure mode, analysis of variance.
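The RPN computation and the ANOVA comparison described above are simple to express; the sketch below uses made-up severity/occurrence/detection ratings for three failure modes purely for illustration, not the paper's case-study data.

```python
# RPN = severity x occurrence x detection, followed by a one-way ANOVA across
# failure modes. The ratings below are made up for illustration only.
import numpy as np
from scipy.stats import f_oneway

# Ratings given by several team members for three failure modes: (S, O, D).
ratings = {
    "FM1": np.array([[7, 4, 3], [8, 4, 2], [7, 5, 3]]),
    "FM2": np.array([[5, 6, 4], [6, 6, 5], [5, 7, 4]]),
    "FM3": np.array([[9, 2, 2], [8, 3, 2], [9, 2, 3]]),
}

rpn = {fm: r[:, 0] * r[:, 1] * r[:, 2] for fm, r in ratings.items()}
for fm, values in rpn.items():
    print(fm, "RPN values:", values, "mean:", values.mean())

# One-way ANOVA: are the mean RPNs of the failure modes significantly different?
f_stat, p_value = f_oneway(*rpn.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```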
16381 Tools for Analysis and Optimization of Standalone Green Microgrids
Authors: William Anderson, Kyle Kobold, Oleg Yakimenko
Abstract:
Green microgrids, which use mostly renewable energy (RE) for generation, are complex systems with inherent nonlinear dynamics. Among a variety of optimization tools, only a few adequately consider this complexity. This paper evaluates the applicability of two somewhat similar optimization tools tailored for standalone RE microgrids and also assesses a machine learning tool for performance prediction that can enhance the reliability of any chosen optimization tool. It shows that one of these microgrid optimization tools has certain advantages over the other and presents a detailed routine for preparing input data to simulate RE microgrid behavior. The paper also shows how neural-network-based predictive modeling can be used to validate and forecast solar power generation from weather time series data, which improves the overall quality of standalone RE microgrid analysis.
Keywords: Microgrid, renewable energy, complex systems, optimization, predictive modeling, neural networks.
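As a rough illustration of the neural-network forecasting step (not the paper's actual tool, features, or data), a regressor mapping weather features to solar output could be sketched with scikit-learn as follows.

```python
# Illustrative neural-network forecast of solar generation from weather features.
# Synthetic data and a toy relation; not the paper's tool or measurements.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000
irradiance = rng.uniform(0, 1000, n)      # W/m^2
temperature = rng.uniform(-5, 40, n)      # deg C
cloud_cover = rng.uniform(0, 1, n)        # fraction
X = np.column_stack([irradiance, temperature, cloud_cover])
y = 0.15 * irradiance * (1 - 0.6 * cloud_cover) + rng.normal(0, 5, n)  # kW, toy relation

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
```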
16380 Enhanced Frame-based Video Coding to Support Content-based Functionalities
Authors: Prabhudev Hosur, Rolando Carrasco
Abstract:
This paper presents an enhanced frame-based video coding scheme. The input source video to the enhanced frame-based video encoder consists of rectangular-size video frames and the shapes of arbitrarily-shaped objects on those frames. The rectangular frame texture is encoded by the conventional frame-based coding technique, and the video object's shape is encoded using contour-based vertex coding. It is possible to achieve several useful content-based functionalities by utilizing the shape information in the bitstream, at the cost of a very small overhead to the bitrate.
Keywords: Video coding, content-based, hyper video, interactivity, shape coding, polygon.
16379 A Video-Based Observation and Analysis Method to Assess Human Movement and Behaviour in Crowded Areas
Authors: Shahrol Mohamaddan, Keith Case, Ana Sakura Zainal Abidin
Abstract:
Human movement in the real world provides important information for developing human behaviour models and simulations. However, it is difficult to assess ‘real’ human behaviour since there is no established method available. As part of the AUNT-SUE (Accessibility and User Needs in Transport – Sustainable Urban Environments) project, this research aimed to propose a method to assess human movement and behaviour in crowded areas. The method is based on the three major steps of video recording, conceptual behaviour modelling and video analysis. The focus is on individual human movement and behaviour in normal situations (panic situations are not considered) and the interactions between individuals in localized areas. Emphasis is placed on gaining knowledge of the characteristics of human movement and behaviour in the real world that can be modelled in the virtual environment.
Keywords: Video observation, Human movement, Behaviour, Crowds, Ergonomics, AUNT-SUE
16378 Multi-View Neural Network Based Gait Recognition
Authors: Saeid Fazli, Hadis Askarifar, Maryam Sheikh Shoaie
Abstract:
Human identification at a distance has recently gained growing interest from computer vision researchers. Gait recognition aims essentially to address this problem by identifying people based on the way they walk [1]. Gait recognition has three steps: preprocessing, feature extraction and classification. This paper focuses on the classification step, which is essential to increase the CCR (Correct Classification Rate). A Multilayer Perceptron (MLP) is used in this work. Neural networks imitate the human brain to perform intelligent tasks [3]. They can represent complicated relationships between input and output and acquire knowledge about these relationships directly from the data [2]. In this paper we apply an MLP NN to the 11 views in our database and compare the CCR values for these views. Experiments are performed with the NLPR databases, and the effectiveness of the proposed method for gait recognition is demonstrated.
Keywords: Human motion analysis, biometrics, gait recognition, principal component analysis, MLP neural network.
16377 Web Proxy Detection via Bipartite Graphs and One-Mode Projections
Authors: Zhipeng Chen, Peng Zhang, Qingyun Liu, Li Guo
Abstract:
With the Internet becoming the dominant channel for business and life, many IPs are increasingly masked using web proxies for illegal purposes such as propagating malware, impersonating phishing pages to steal sensitive data, or redirecting victims to other malicious targets. Moreover, as Internet traffic continues to grow in size and complexity, it has become an increasingly challenging task to detect proxy services due to their dynamic updates and high anonymity. In this paper, we present an approach based on behavioral graph analysis to study the behavior similarity of web proxy users. Specifically, we use bipartite graphs to model host communications from network traffic and build one-mode projections of the bipartite graphs for discovering the social-behavior similarity of web proxy users. Based on the similarity matrices of end-users derived from the one-mode projection graphs, we apply a simple yet effective spectral clustering algorithm to discover the inherent behavior clusters of web proxy users. The web proxy URLs may vary from time to time, but the underlying interest does not. Based on this intuition, and using our private tools implemented with WebDriver, we examine whether the top URLs visited by web proxy users are web proxies. Our experimental results on real datasets show that the behavior clusters not only reduce the number of URLs to analyze but also provide an effective way to detect web proxies, especially unknown ones.
Keywords: Bipartite graph, clustering, one-mode projection, web proxy detection.
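A minimal sketch of the bipartite-graph / one-mode-projection / spectral-clustering pipeline described above is given below, using NetworkX and scikit-learn on a tiny made-up host-URL dataset; only the pipeline shape mirrors the approach, not the paper's features or data.

```python
# Bipartite host-URL graph -> one-mode projection onto hosts -> spectral clustering.
# Tiny made-up traffic data; illustrative only.
import networkx as nx
import numpy as np
from networkx.algorithms import bipartite
from sklearn.cluster import SpectralClustering

visits = [("h1", "u1"), ("h1", "u2"), ("h2", "u1"), ("h2", "u2"),
          ("h3", "u3"), ("h4", "u3"), ("h4", "u4"), ("h3", "u4")]
hosts = sorted({h for h, _ in visits})

B = nx.Graph()
B.add_nodes_from(hosts, bipartite=0)
B.add_nodes_from({u for _, u in visits}, bipartite=1)
B.add_edges_from(visits)

# One-mode projection: hosts connected by the number of URLs they share.
P = bipartite.weighted_projected_graph(B, hosts)
similarity = nx.to_numpy_array(P, nodelist=hosts, weight="weight")
np.fill_diagonal(similarity, similarity.max() + 1)  # self-similarity

labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(similarity)
print(dict(zip(hosts, labels)))  # hosts with the same label behave similarly
```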
16376 Studying the Causes and Affecting Factors of Motorcycle Accidents: A Case Study on the Road Accidents in Zanjan Province (IRAN) - 2007
Authors: A. Beheshti, S. Salkhordeh, H. Amini
Abstract:
Based on statistics released by the Islamic Republic of Iran Police (IRIP), of the 9555 motorcycle accidents that happened in 2007, 857 riders died and 11219 were injured. If we also consider the deaths and injuries from other vehicles' accidents resulting from traffic violations by motorcycle riders, then paying attention to motorcycle accidents seems very necessary. Therefore, in this study we investigated the traits and issues related to production, application, and training, along with the causes of motorcycle accidents from the four perspectives of road, human, environment and vehicle, based on statistical and geographical analysis of accident sheets prepared by the Iran Road Patrol Department (IRPD). Unfamiliarity of riders with the regulations and techniques of motorcycling, non-use of safety equipment, inefficiency of roads and junction designs for the safe movement of motorcycles, and finally the lack of sufficient control by the responsible organizations are among the major causes which lead to these accidents.
Keywords: Motorcycle, motorcycle riders, road accidents, statistical analysis of accidents.
16375 Analysis of Highway Slope Failure by an Application of the Stereographic Projection
Authors: Chin-Yu Lee, Iau-Teh Wang
Abstract:
Slope failures on mountain roads triggered by earthquake activity and torrential rain frequently create disasters. Province Road No. 24 is a main route to Wutai Township, and the study area is located between mileages 46K and 47K along this road. The road has suffered frequent damage as a result of landslides and slope failures during typhoon seasons, so an understanding of the sliding behavior in the area is necessary. The study aims to understand the mechanism of the slope failures and to find ways to deal with the situation. To achieve these objectives, this paper uses theoretical and structural geology data interpretation to assess the potential slope sliding behavior. The study showed an intimate relationship between the landslide behavior of the slopes and the stratum materials; structural geology analysis methods were used to analyze slope stability and to find the slope safety coefficient in order to predict the locations of the failure layer. According to the case study and parameter analyses, the main slip direction of the slope is toward the southeast of the site, and the rainfall-induced rise of the groundwater level is the main trigger of the landslide mechanism. In the future, effective horizontal drains need to be installed at appropriate locations; these can effectively restrain slope failures on the mountain road and increase the stability of the slope.
Keywords: Slope stability analysis, stereographic projection, wedge failure.
16374 Application of Artificial Neural Network in Assessing Fill Slope Stability
Authors: An-Jui. Li, Kelvin Lim, Chien-Kuo Chiu, Benson Hsiung
Abstract:
This paper details the utilization of artificial intelligence (AI) in the field of slope stability, whereby quick and convenient solutions can be obtained using the developed tool. The AI tool used in this study is the artificial neural network (ANN), while the slope stability analysis methods are the finite element limit analysis methods. The developed tool allows for the prompt prediction of the safety factors of fill slopes and their corresponding probability of failure (depending on the degree of variation of the soil parameters), which can give the practicing engineer a reasonable basis for decision making. In fact, the successful use of the Extreme Learning Machine (ELM) algorithm shows that slope stability analysis is no longer confined to the conventional methods of modeling, which at times may be tedious and repetitive during the preliminary design stage, where the focus is more on cost-saving options than on detailed design. Therefore, similar ANN-based tools can be further developed to assist engineers in this aspect.
Keywords: Landslide, limit analysis, ANN, soil properties.
16373 Market Segmentation and Conjoint Analysis for Apple Family Design
Authors: Abbas Al-Refaie, Nour Bata
Abstract:
A distributor of Apple products experiences numerous difficulties in developing marketing strategies for new and existing mobile product entries that maximize customer satisfaction and the firm's profitability. This research therefore integrates market segmentation, platform-based product family design, and conjoint analysis to identify iSystem combinations that increase customer satisfaction and business profits. First, the enhanced market segmentation grid is created. Then, the estimated demand model is formulated. Finally, the profit models are constructed and used to determine the ideal product family design that maximizes profit. Conjoint analysis is used to explore customer preferences and their satisfaction levels. A total of 200 surveys are collected on customer preferences. Then, simulation is used to determine the importance values for each attribute. Finally, sensitivity analysis is conducted to determine the product family design that maximizes both objectives. In conclusion, the results of this research shall provide great support to Apple distributors in determining the best marketing strategies that enhance their market share.
Keywords: Market segmentation, conjoint analysis, market strategies, optimization.
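Conjoint analysis of the kind used above typically estimates part-worth utilities by regressing stated preference ratings on dummy-coded attribute levels; a minimal sketch with hypothetical attributes, profiles, and responses follows (not the study's survey design or attribute set).

```python
# Minimal conjoint part-worth estimation via dummy-coded least squares.
# Attributes, levels, profiles and ratings are hypothetical.
import numpy as np

# Each profile row: intercept, storage=256GB (vs 128GB), price=high (vs low).
profiles = np.array([
    [1, 0, 0],   # 128GB, low price
    [1, 1, 0],   # 256GB, low price
    [1, 0, 1],   # 128GB, high price
    [1, 1, 1],   # 256GB, high price
])
ratings = np.array([6.0, 8.5, 3.5, 5.5])  # one respondent's preference scores

partworths, *_ = np.linalg.lstsq(profiles, ratings, rcond=None)
print("intercept, 256GB utility, high-price utility:", np.round(partworths, 2))

# Relative attribute importance: range of part-worths per attribute.
ranges = {"storage": abs(partworths[1]), "price": abs(partworths[2])}
total = sum(ranges.values())
print({k: round(v / total, 2) for k, v in ranges.items()})
```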
16372 Correlation Analysis to Quantify Learning Outcomes for Different Teaching Pedagogies
Authors: Kanika Sood, Sijie Shang
Abstract:
A fundamental goal of education is to prepare students to become part of the global workforce by making beneficial contributions to society. In this paper, we analyze student performance for multiple courses that involve different teaching pedagogies: a cooperative learning technique and an inquiry-based learning strategy. Student performance includes student engagement, grades, and attendance records. We perform this study in the Computer Science department for online and in-person courses with 450 students. We perform correlation analysis to study the relationship between student scores and other parameters such as gender and mode of learning. We use natural language processing and machine learning to analyze student feedback data and performance data. We assess the learning outcomes of the two teaching pedagogies for undergraduate and graduate courses to showcase the impact of pedagogical adoption and learning outcome as determinants of academic achievement. Early findings suggest that, when using the specified pedagogies, students become experts on their topics and show enhanced engagement with peers.
Keywords: Bag-of-words, cooperative learning, education, inquiry-based learning, in-person learning, Natural Language Processing, online learning, sentiment analysis, teaching pedagogy.
16371 Multidimensional Visualization Tools for Analysis of Expression Data
Authors: Urska Cvek, Marjan Trutschl, Randolph Stone II, Zanobia Syed, John L. Clifford, Anita L. Sabichi
Abstract:
Expression data analysis is based mostly on statistical approaches that are indispensable for the study of biological systems. Large amounts of multidimensional data resulting from high-throughput technologies are not completely served by biostatistical techniques and are usually complemented with visual, knowledge discovery and other computational tools. In many cases in biological systems we only speculate on the processes that are causing the changes, and it is during the visual explorative analysis of data that a hypothesis is formed. We would like to show the usability of multidimensional visualization tools and promote their use in the life sciences. We survey and show some of the multidimensional visualization tools in the process of data exploration, such as parallel coordinates and radviz, and we extend them by combining them with the self-organizing map algorithm. We use a time course data set of transitional cell carcinoma of the bladder in our examples. Analysis of data with these tools has the potential to uncover additional relationships and non-trivial structures.
Keywords: Microarrays, visualization, parallel coordinates, radviz, self-organizing maps.
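For readers unfamiliar with the two projection techniques named above, the sketch below produces parallel-coordinates and RadViz views with pandas' built-in plotting helpers on a small synthetic data set; it is not the bladder carcinoma data used in the paper, and it does not include the self-organizing map extension.

```python
# Quick look at multidimensional data with parallel coordinates and RadViz,
# using pandas' plotting helpers on synthetic data (not the paper's data set).
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates, radviz

rng = np.random.default_rng(0)
n = 60
df = pd.DataFrame({
    "gene_a": rng.normal(0, 1, n),
    "gene_b": rng.normal(1, 1, n),
    "gene_c": rng.normal(-1, 1, n),
    "gene_d": rng.normal(0.5, 1, n),
    "group": rng.choice(["early", "late"], n),   # e.g., time-course class labels
})

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
parallel_coordinates(df, class_column="group", ax=axes[0])
radviz(df, class_column="group", ax=axes[1])
plt.tight_layout()
plt.show()
```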