Search results for: Data analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13484

12164 Evaluating Emission Reduction Due to a Proposed Light Rail Service: A Micro-Level Analysis

Authors: Saeid Eshghi, Neeraj Saxena, Abdulmajeed Alsultan

Abstract:

Carbon dioxide (CO2), alongside other gases emitted into the atmosphere, causes a greenhouse effect, resulting in an increase in the average temperature of the planet. Transportation vehicles are among the main contributors to CO2 emissions. Stationary vehicles with running engines produce more emissions than moving ones. Intersections with traffic lights, which force vehicles to remain stationary for a period of time, therefore produce more CO2 pollution than other parts of the road. This paper focuses on analyzing the CO2 produced by the traffic flow at the Anzac Parade Road - Barker Street intersection in Sydney, Australia, before and after the implementation of light rail transport (LRT). The data were gathered during the construction phase of the LRT by counting the vehicles on each approach of the intersection for 15 minutes during the evening rush hour over one week (6-7 pm, July 04-31, 2018); the counts were then multiplied by 4 to estimate the hourly flow. The data were analyzed with the microscopic simulation software VISSIM. The traffic flow was processed in three stages: before implementation of the light rail, during the construction phase, and after implementation. Finally, the traffic results were fed into another software package, EnViVer, to calculate the amount of CO2 produced during one hour. The results showed that after the implementation of the light rail, CO2 will drop by a minimum of 13%. This finding provides evidence that light rail is a sustainable mode of transport.
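
As a rough illustration of the flow-scaling and comparison arithmetic described above, the sketch below (with made-up counts and CO2 totals; the study's actual VISSIM/EnViVer outputs are not reproduced here) scales 15-minute approach counts to hourly flows and computes the percentage CO2 reduction between two scenarios.

```python
# Sketch (hypothetical counts): scale 15-minute approach counts to hourly
# flows and compare CO2 totals between the before and after scenarios.
counts_15min = {"north": 310, "south": 295, "east": 180, "west": 165}  # vehicles per 15 min
hourly_flow = {leg: 4 * n for leg, n in counts_15min.items()}          # vehicles per hour

co2_before_kg = 412.0   # illustrative emission total for the pre-LRT scenario
co2_after_kg = 358.4    # illustrative emission total for the post-LRT scenario
reduction_pct = 100 * (co2_before_kg - co2_after_kg) / co2_before_kg

print(hourly_flow)
print(f"CO2 reduction: {reduction_pct:.1f}%")
```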

Keywords: Carbon dioxide, emission modeling, light rail, microscopic model, traffic flow.

12163 Analysis of the Energetic Feature of the Loaded Gait with Variation of the Trunk Flexion Angle

Authors: Ji-il Park, Hyungtae Seo, Jihyuk Park, Kwang jin Choi, Kyung-Soo Kim, Soohyun Kim

Abstract:

The purpose of this research is to investigate the energetic features of a backpack-loaded gait with variation of the trunk flexion angle. It is believed that variation of trunk flexion in loaded gait, which is common in daily life, may cause a significant difference in energy cost. To this end, seven healthy Korean military personnel participated in the experiment and were tested under three different walking postures comprising small, natural and large trunk flexion, with around 5 degrees of difference in waist angle between each condition. The ground reaction forces were collected from force plates, and motion kinematic data were measured by a motion capture system. Based on these data, the impulses, momenta and mechanical work done on the center of body mass (COM) during the double support phase were computed. The results show that the push-off and heel-strike impulses are not affected by the change of trunk flexion; however, the mechanical work done by push-off and heel strike changed with trunk flexion variation. This is because the vertical velocity of the COM during the double support phase increases significantly with an increase in trunk flexion. Therefore, the efficiency of the loaded gait depends on the trunk flexion angle. Also, even though the gravitational impulse and pre-collision momentum change with trunk flexion, the post-collision momentum is almost constant regardless of the trunk flexion variation.
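
A minimal numerical sketch of the quantities discussed above, using synthetic force-plate and COM-velocity signals rather than the study's measurements: impulse as the time integral of the ground reaction force, and mechanical work on the COM as the integral of force times velocity over the double support window.

```python
import numpy as np

# Sketch with synthetic data: impulse and mechanical work on the COM during
# the double support phase, from ground reaction force and COM velocity.
dt = 0.001                                              # force plate sampling period, s
t = np.arange(0, 0.15, dt)                              # double support window (illustrative)
grf_vertical = 800 + 200 * np.sin(np.pi * t / 0.15)     # synthetic vertical GRF, N
com_vertical_velocity = 0.05 * np.cos(np.pi * t / 0.15) # synthetic COM velocity, m/s

impulse = np.trapz(grf_vertical, dx=dt)                              # N*s
work_on_com = np.trapz(grf_vertical * com_vertical_velocity, dx=dt)  # J

print(f"vertical impulse: {impulse:.1f} N*s, work on COM: {work_on_com:.2f} J")
```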

Keywords: Loaded gait, collision, impulse, gravity, heel strike, push-off, gait analysis.

12162 Improvement of the Q-System Using the Rock Engineering System: A Case Study of Water Conveyor Tunnel of Azad Dam

Authors: S. Golmohammadi, M. Noorian Bidgoli

Abstract:

Because the status and mechanical parameters of discontinuities in the rock mass are included in the calculations, various rock engineering classification methods are often used as a starting point for the design of different types of structures. The Q-system is one of the most frequently used methods for stability analysis and determination of support systems of underground structures in rock, including tunnels. This method requires six main parameters of the rock mass, namely the Rock Quality Designation (RQD), joint set number (Jn), joint roughness number (Jr), joint alteration number (Ja), joint water parameter (Jw) and Stress Reduction Factor (SRF). In order to achieve a reasonable and optimal design, identifying the parameters that govern the stability of such structures is therefore one of the most important goals and necessary actions in rock engineering. It is consequently necessary to study the relationships between the parameters of a system, how they interact with each other and, ultimately, the whole system. In this research, an attempt has been made to determine the most effective parameters (key parameters) among the six rock mass parameters of the Q-system using the Rock Engineering System (RES) method, in order to improve the relationships between the parameters in the calculation of the Q value. The RES is, in fact, a method by which the degree of cause and effect of a system's parameters can be determined by constructing an interaction matrix. In this research, the geomechanical data collected from the water conveyor tunnel of Azad Dam were used to build the interaction matrix of the Q-system. For this purpose, instead of the conventional coding methods, which are always accompanied by defects such as uncertainty, the Q-system interaction matrix was coded using a technique that is essentially a statistical analysis of the data, determining the correlation coefficients between them. In this way, the effect of each parameter on the system is evaluated with greater certainty. The results of this study show that the resulting interaction matrix provides a reasonable estimate of the effective parameters in the Q-system. Among the six parameters of the Q-system, SRF and Jr have the maximum and minimum effect on the system, respectively, while RQD and Jw are the most and least affected by the system, respectively. Therefore, by developing this method, a more accurate rock mass classification relation can be obtained by weighting the required parameters in the Q-system.
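
The sketch below illustrates, with placeholder data, the two calculations the abstract refers to: the standard Q value, Q = (RQD/Jn)(Jr/Ja)(Jw/SRF), and a RES-style interaction matrix coded from correlation coefficients, from which cause and effect scores are obtained as row and column sums. The parameter values and random records are assumptions for illustration only.

```python
import numpy as np

# Sketch, assuming hypothetical field records: compute the Q value and code a
# RES-style interaction matrix from absolute correlation coefficients, then
# rank parameters by their cause (row sum) and effect (column sum) scores.
params = ["RQD", "Jn", "Jr", "Ja", "Jw", "SRF"]
data = np.random.default_rng(0).normal(size=(50, 6))   # placeholder geomechanical records

def q_value(rqd, jn, jr, ja, jw, srf):
    return (rqd / jn) * (jr / ja) * (jw / srf)

corr = np.abs(np.corrcoef(data, rowvar=False))  # statistical coding of interactions
np.fill_diagonal(corr, 0.0)                     # no self-interaction on the diagonal
cause = corr.sum(axis=1)                        # effect of each parameter on the system
effect = corr.sum(axis=0)                       # effect of the system on each parameter

for name, c, e in zip(params, cause, effect):
    print(f"{name}: cause={c:.2f}, effect={e:.2f}")
print("Q example:", round(q_value(75, 9, 1.5, 2, 1, 2.5), 2))
```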

Keywords: Q-system, Rock Engineering System, statistical analysis, rock mass, tunnel.

12161 Simultaneous Term Structure Estimation of Hazard and Loss Given Default with a Statistical Model using Credit Rating and Financial Information

Authors: Tomohiro Ando, Satoshi Yamashita

Abstract:

The objective of this study is to propose a statistical modeling method which enables simultaneous term structure estimation of the risk-free interest rate, hazard and loss given default, incorporating characteristics of the bond-issuing company such as credit rating and financial information. A reduced-form model is used for this purpose. Statistical techniques such as spline estimation and the Bayesian information criterion are employed for parameter estimation and model selection. An empirical analysis is conducted using Japanese bond market data, and its results confirm the usefulness of the proposed method.
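
A minimal sketch, assuming synthetic hazard-rate observations, of the spline-plus-information-criterion idea mentioned above: candidate smoothing levels are fitted with scipy's UnivariateSpline and one is selected by a BIC-type score. This is not the paper's reduced-form model, only the model-selection mechanics.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Sketch on synthetic data: fit spline term structures of the hazard rate with
# different smoothing levels and select one by a BIC-type criterion.
rng = np.random.default_rng(1)
maturity = np.linspace(0.5, 10, 40)                       # years
hazard_obs = 0.02 + 0.01 * np.log1p(maturity) + rng.normal(0, 0.002, maturity.size)

best = None
for s in (1e-4, 5e-4, 1e-3, 5e-3):                        # candidate smoothing factors
    spl = UnivariateSpline(maturity, hazard_obs, s=s)
    rss = float(np.sum((spl(maturity) - hazard_obs) ** 2))
    k = len(spl.get_coeffs())                             # rough effective parameter count
    n = maturity.size
    bic = n * np.log(rss / n) + k * np.log(n)
    if best is None or bic < best[0]:
        best = (bic, s)

print(f"selected smoothing s={best[1]}, BIC={best[0]:.1f}")
```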

Keywords: Empirical Bayes, Hazard term structure, Loss given default.

12160 Understanding Factors Influencing E-Government Implementation in Saudi Arabia from an Organizational Perspective

Authors: M. Alassim, M. Alfayad, E. Abbott-Halpin

Abstract:

The purpose of this paper is to explore the organizational factors influencing the implementation of the e-government project within the public sector in Saudi Arabia. This project (also known as the Yesser programme) was established in Saudi Arabia in 2005 to manage the e-government transformation process. The aims of the project are to provide a collaborative environment for government organizations to implement e-government and to increase effectiveness and efficiency within the public sector. This paper sheds light on the organizational factors that have delayed implementation and achievement of the government's vision and plans for Yesser. A qualitative approach was employed to understand those factors, with the required data collected through a series of interviews with government officials. The analysis of the data uncovered seven organizational factors that must be addressed to advance implementation of the e-government project in Saudi Arabia and other similar states.

Keywords: E-government, e-transformation, ICT, Saudi Arabia, Yesser.

12159 Comparative Quantitative Study on Learning Outcomes of Major Study Groups of an Information and Communication Technology Bachelor Educational Program

Authors: Kari Björn, Mikael Soini

Abstract:

Higher education system reforms are discussed, especially the 2014 reform of the Finnish system of Universities of Applied Sciences. The new steering model is based on major legislative changes, output-oriented funding and open information. The governmental steering reform, especially the financial model, and the resulting institutional-level responses, such as curriculum reforms, are discussed with a particular focus on engineering programs. The paper is motivated by the management need to establish objective, steering-related performance indicators and to apply them consistently across all educational programs. The close relationship to the governmental steering and funding model implies that internally derived indicators can be applied directly. Metropolia University of Applied Sciences (MUAS), the case institution, is briefly introduced, focusing on engineering education in Information and Communications Technology (ICT) and its related programs. The reform forced consolidation of previously separate smaller programs into fewer units of student application. Under the new curriculum, ICT students have a common first year before they apply for a Major. A framework of parallel and longitudinal comparisons is introduced and used across Majors on two campuses. The new externally introduced performance criteria are applied internally to the ICT Majors using data from before and after the program merger. A comparative performance of the Majors after completion of the joint first year is established, including previously omitted Majors for completeness of the analysis. Some new research questions resulting from the transfer of Majors between campuses and from quota setting are discussed. The practical orientation identifies best practices to share and targets needing the most attention for improvement. This level of analysis is directly applicable at the student group and teaching team level, where corrective actions are possible once needs are identified. The analysis is quantitative, and the nature of the corrective actions is not discussed. Causal relationships and factor analysis are omitted because the campuses, their staff and various details of pedagogical implementation still contain too many undetermined factors for our limited data; such qualitative analysis is left for further research, which must, however, be guided by the relevance of the observations.

Keywords: Engineering education, integrated curriculum, learning outcomes, performance measurement.

12158 Application-Specific Instruction Sets Processor with Implicit Registers to Improve Register Bandwidth

Authors: Ginhsuan Li, Chiuyun Hung, Desheng Chen, Yiwen Wang

Abstract:

Application-Specific Instruction-set Processors (ASIPs) have become an important design choice for embedded systems due to their runtime flexibility, which cannot be provided by custom ASIC solutions. One major bottleneck in maximizing ASIP performance is the limited data bandwidth between the General Purpose Register File (GPRF) and the Application-Specific Instructions (ASIs). This paper presents Implicit Registers (IRs) to provide the desired data bandwidth. An ASI input/output model is proposed to formulate the overhead of the additional data transfers between the GPRF and the IRs, and an IR allocation algorithm is then used to achieve better performance by minimizing the number of extra data-transfer instructions. The experimental results show a speedup of up to 3.33x compared to the results without IRs.

Keywords: Application-Specific Instruction-set Processors, data bandwidth, configurable processor, implicit register.

12157 The Cost Structure of Intermodal Transportation: The Chilean Case

Authors: Mabel A. Leva

Abstract:

This study defines a methodology for computing unit costs for freight transportation modes. The main objective was to gather relevant cost data to support the formulation and evaluation of railway, road, pipeline and port projects. This article concentrates on the following steps: compilation and analysis of relevant modal cost studies; methodological adjustments to make cost figures comparable between studies; definition of the typology and scope of the transportation modes; and analysis and validation of cost values for the relevant freight transportation modes in Chile. In defining the methodology for comparing costs between the different transportation modes, it was necessary to consider that the relevant cost depends on who performs the comparison. Thus, for the transportation user (e.g. an exporter) the pertinent costs are the mode tariffs, whereas from the operator's perspective (e.g. a rail manager) the pertinent costs are the operating costs of each mode.

Keywords: Intermodal costs, Logistics, Transportation costs.

12156 Reduced Dynamic Time Warping for Handwriting Recognition Based on Multidimensional Time Series of a Novel Pen Device

Authors: Muzaffar Bashir, Jürgen Kempf

Abstract:

The purpose of this paper is to present a dynamic time warping technique which significantly reduces the data processing time and memory size of multi-dimensional time series sampled by the biometric smart pen device BiSP. The acquisition device is a novel ballpoint pen equipped with a diversity of sensors for monitoring the kinematics and dynamics of handwriting movement. The DTW algorithm has been applied to time series analysis of five different sensor channels providing pressure, acceleration and tilt data of the pen generated during handwriting on a paper pad. However, the standard DTW has processing time and memory space problems which limit its practical use for online handwriting recognition. To address this problem, DTW is applied to the sum of the five sensor signals after an adequate down-sampling of the data. Preliminary results have shown that processing time and memory size can be reduced significantly without deterioration of performance in single-character and word recognition. Furthermore, excellent recognition accuracy was achieved, which is mainly due to the reduced dynamic time warping (RDTW) technique and the novel BiSP pen device.
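
The sketch below shows the reduction idea as described (summing the five sensor channels and down-sampling before a standard dynamic-programming DTW); the pen signals are random placeholders, not BiSP recordings.

```python
import numpy as np

def dtw_distance(a, b):
    """Standard dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def reduce_series(channels, factor=4):
    """RDTW-style reduction as described: sum the sensor channels, then down-sample."""
    summed = np.sum(np.asarray(channels), axis=0)
    return summed[::factor]

# Hypothetical 5-channel pen signals (pressure, accelerations, tilts) for two samples.
rng = np.random.default_rng(0)
sample_a = rng.normal(size=(5, 400))
sample_b = rng.normal(size=(5, 380))
print(dtw_distance(reduce_series(sample_a), reduce_series(sample_b)))
```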

Keywords: Biometric character recognition, biometric person authentication, biometric smart pen BiSP, dynamic time warping DTW, online-handwriting recognition, multidimensional time series.

12155 Dynamics of Marital Status and Information Search through Consumer Generated Media: An Exploratory Study

Authors: Shivakumar Krishnamurti, Ruchi Agarwal

Abstract:

The study examines the influence of marital status on consumers of products and services who use blogs as a source of information. A pre-designed questionnaire was used to collect the primary data (experiences) from the respondents. Data were collected from one hundred and eighty-seven respondents residing in and around the Emirates of Sharjah and Dubai in the United Arab Emirates. The collected data were analyzed with the help of statistical tools such as averages, percentages, factor analysis, Student's t-test and structural equation modelling. The objectives of the study are to understand how married and unmarried (single) consumers of products and services are motivated to use blogs as a source of information, to find out whether consumers share their views and experiences with other bloggers irrespective of their marital status, and to learn the respondents' future intentions towards blogging. The study revealed the following: the majority of respondents are motivated to blog because they are willing to receive comments on what they post about services, because of the convenience of blogs for searching for information about services and products, and because blogging helps them share information on the symptoms of a disease or disorder that someone may experience and on ready-to-cook mix products; respondents are also keen to spend more time blogging in the future.

Keywords: Blog, consumer, information, marital status.

12154 Secure Socket Layer in the Network and Web Security

Authors: Roza Dastres, Mohsen Soori

Abstract:

In order to exchange information electronically between network users on the web, different software packages, such as Outlook, are available. The traffic of users on a site, or even between the floors of a building, can be decreased by applying secure and reliable data-sharing software. It is essential to provide a fast, secure and reliable network in data-sharing systems in order to create advanced communication systems for network users. In the present research work, different encoding methods and algorithms in data-sharing systems are studied in order to increase the security of these systems by preventing hackers from accessing the transferred data. To increase security in networks, the possibility of textual conversation between customers of a local network is studied, and the application of encryption and decryption algorithms is examined as a means of preventing hackers from infiltrating. As a result, a reliable and secure communication system between members of a network can be provided, preventing additional traffic in the website environment and increasing the speed, accuracy and security of data-sharing network and web systems.
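
As a small, generic illustration of transport-layer encryption (not the specific system studied), the following sketch opens a TLS-protected connection with Python's standard ssl module; the host name is a placeholder.

```python
import socket
import ssl

# Minimal sketch: a TLS-protected client connection, illustrating how transport
# encryption keeps exchanged text unreadable to an eavesdropper.
host, port = "example.com", 443
context = ssl.create_default_context()          # verifies the server certificate chain

with socket.create_connection((host, port)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
        print("negotiated protocol:", tls_sock.version())   # e.g. 'TLSv1.3'
        tls_sock.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
        print(tls_sock.recv(128))
```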

Keywords: Secure Socket Layer, Security of networks.

12153 LCA/CFD Studies of Artisanal Brick Manufacture in Mexico

Authors: H. A. Lopez-Aguilar, E. A. Huerta-Reynoso, J. A. Gomez, J. A. Duarte-Moller, A. Perez-Hernandez

Abstract:

The environmental performance of artisanal brick manufacture in Mexico was studied using Life Cycle Assessment (LCA) methodology and Computational Fluid Dynamics (CFD) analysis. The main objective of this paper is to evaluate the environmental impact of artisanal brick manufacture. The cradle-to-gate LCA approach was complemented with CFD analysis to carry out an Environmental Impact Assessment (EIA). The life cycle includes the stages of extraction, baking and transportation to the gate. The functional unit of this study was the production of a single brick in Chihuahua, Mexico, and the impact categories studied were carcinogens, respiratory organics and inorganics, climate change, radiation, ozone layer depletion, ecotoxicity, acidification/eutrophication, land use, mineral use and fossil fuels. Laboratory techniques for fuel characterization, in situ gas measurements and AP-42 emission factors were employed to calculate the gas emissions for the inventory data. The results revealed that the categories with the greatest impacts are ecotoxicity and carcinogens. The CFD analysis is helpful in predicting thermal diffusion and contaminant dispersion from a defined source. The LCA-CFD synergy complemented the EIA and allowed us to identify a thermal efficiency problem within the system.
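
A toy sketch of the emission-factor arithmetic used to build such an inventory: emissions are the product of an activity rate and an AP-42-style factor, then normalized to the functional unit. All numbers are placeholders, not the study's measurements.

```python
# Sketch with placeholder values: inventory emissions from fuel use and an
# AP-42-style emission factor, expressed per functional unit (one brick).
fuel_burned_kg = 950.0          # fuel consumed per kiln batch (illustrative)
emission_factor_g_per_kg = 3.2  # pollutant emitted per kg of fuel (illustrative)
bricks_per_batch = 8000

emissions_g = fuel_burned_kg * emission_factor_g_per_kg
per_brick_g = emissions_g / bricks_per_batch
print(f"{per_brick_g:.3f} g of pollutant per brick (functional unit)")
```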

Keywords: LCA, CFD, brick, artisanal.

12152 Identifying Corruption in Legislation using Risk Analysis Methods

Authors: Chvalkovska, J., Jansky, P., Mejstrik, M.

Abstract:

The objective of this article is to discuss the potential of economic analysis as a tool for the identification and evaluation of corruption in legislative acts. We propose that corruption be perceived as a risk variable within the legislative process. We therefore find it appropriate to employ risk analysis methods, used in various fields of economics, for the evaluation of corruption in legislation. Furthermore, we propose the incorporation of these methods into the so-called corruption impact assessment (CIA), a general framework for the detection of corruption in legislative acts. The application of the risk analysis methods is demonstrated on examples of the implementation of the proposed CIA in the Czech Republic.

Keywords: Corruption, corruption impact assessment (CIA), legislative, legislative process, risk analysis, Czech Republic.

12151 Efficacy of Polyfluoroalkyl Substances Filtration with Low-Cost Organic Fiber Filter

Authors: Gautham Das, Edward Morrone, Erik Treble, Clinton Binder

Abstract:

The purpose of this study was to evaluate the efficacy of a low-cost filter with regard to per- and polyfluoroalkyl substances (PFAS). PFAS are commonly used man-made chemicals that can be found in a variety of household and industrial products and have deleterious effects on humans. The filter consists of a combination of low-cost materials that can be procured locally. Water testing results for four different PFAS contaminants indicated the following: for perfluorooctane sulfonic acid (PFOS), for which the Agency for Toxic Substances and Disease Registry (ATSDR) regulation is 7 ppt, the initial concentration was 15 ppt and the final concentration was 3.9 ppt; for perfluorononanoic acid (PFNA), for which the ATSDR regulation is 10.5 ppt, the initial concentration was 15 ppt and the final concentration was 3.9 ppt; for perfluorooctanoic acid (PFOA), for which the ATSDR regulation is 11 ppt, the initial concentration was 15 ppt and the final concentration was 3.9 ppt; and for perfluorohexane sulfonic acid (PFHxS), for which the ATSDR regulation is 70 ppt, the initial concentration was 15 ppt and the final concentration was 3.9 ppt. The results indicated a 74% reduction in PFAS concentration in the filtered samples. Regression analysis of the statistical data showed a validity of 0.9 for the sample data. Initial tests show that the efficiency of the proposed filter could be far greater if it were tested at a larger scale. Further testing is highly recommended to validate the data for an innovative solution to a ubiquitous problem.
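
The short sketch below reproduces the percent-removal arithmetic from the concentrations quoted above and checks each final concentration against the corresponding ATSDR value; it is illustrative only.

```python
# Sketch using the concentrations reported above: percent removal per analyte
# and a check against the corresponding ATSDR screening value.
samples = {               # analyte: (ATSDR value ppt, initial ppt, final ppt)
    "PFOS": (7.0, 15.0, 3.9),
    "PFNA": (10.5, 15.0, 3.9),
    "PFOA": (11.0, 15.0, 3.9),
    "PFHxS": (70.0, 15.0, 3.9),
}
for analyte, (limit, before, after) in samples.items():
    removal = 100 * (before - after) / before
    print(f"{analyte}: {removal:.0f}% removal, below limit: {after < limit}")
```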

Keywords: PFAS, PFOS, PFOA, PFHxS, low-cost filter.

12150 The Performance Improvement of Automatic Modulation Recognition Using Simple Feature Manipulation, Analysis of the HOS, and Voted Decision

Authors: Heroe Wijanto, Sugihartono, Suhartono Tjondronegoro, Kuspriyanto

Abstract:

The use of high-order statistics (HOS) analysis is expected to provide many candidate features that can be selected for pattern recognition. More candidate features can be extracted by applying a simple manipulation through a specific mathematical function prior to the HOS analysis. A feature extraction method using HOS analysis combined with a difference-to-the-Nth-power manipulation has been examined for Automatic Modulation Recognition (AMR), performing scheme recognition of three digital modulation signals, i.e. QPSK, 16QAM and 64QAM, over an AWGN transmission channel. Simulation results are reported for HOS analysis up to order 12 and the difference-to-the-Nth-power manipulation up to N = 4. The accuracy rate of AMR obtained with the Simple Decision method is 90% at SNR > 10 dB, while with the Voted Decision method it is 96% at SNR > 2 dB.
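
A rough sketch of the feature-extraction idea, assuming a synthetic QPSK-like signal: a difference-to-the-Nth-power manipulation followed by higher-order statistics (plain central moments are used here as stand-ins for the cumulant-based HOS in the paper).

```python
import numpy as np
from scipy.stats import moment

# Sketch of the feature-extraction idea: apply a difference-to-the-Nth-power
# manipulation to the baseband samples, then collect higher-order statistics
# (here plain central moments) as candidate features. Signal is synthetic.
rng = np.random.default_rng(0)
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=2000)   # QPSK-like
received = symbols + 0.1 * (rng.normal(size=2000) + 1j * rng.normal(size=2000))

def features(x, n_power=2, max_order=8):
    manipulated = np.diff(x) ** n_power           # difference to the Nth power
    mag = np.abs(manipulated)
    return [moment(mag, moment=k) for k in range(2, max_order + 1)]

print(np.round(features(received), 4))
```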

Keywords: modulation, automatic modulation recognition, feature analysis, feature manipulation.

12149 Accurate HLA Typing at High-Digit Resolution from NGS Data

Authors: Yazhi Huang, Jing Yang, Dingge Ying, Yan Zhang, Vorasuk Shotelersuk, Nattiya Hirankarn, Pak Chung Sham, Yu Lung Lau, Wanling Yang

Abstract:

Human leukocyte antigen (HLA) typing from next generation sequencing (NGS) data has the potential for applications in clinical laboratories and population genetic studies. Here we introduce a novel technique for HLA typing from NGS data based on read-mapping using a comprehensive reference panel containing all known HLA alleles and de novo assembly of the gene-specific short reads. An accurate HLA typing at high-digit resolution was achieved when it was tested on publicly available NGS data, outperforming other newly-developed tools such as HLAminer and PHLAT.

Keywords: Human leukocyte antigens, next generation sequencing, whole exome sequencing, HLA typing.

12148 Grid Independence Study of Flow Past a Square Cylinder Using the Multi-Relaxation-Time Lattice Boltzmann Method

Authors: Shams-Ul-Islam, Hamid Rahman, Waqas Sarwar Abbasi

Abstract:

Numerical calculations of the flow around a square cylinder are presented using the multi-relaxation-time lattice Boltzmann method at a Reynolds number of 150. The effects of upstream location, downstream location and blockage are investigated systematically. A detailed analysis is given in terms of time traces of the drag and lift coefficients, power spectra of the lift coefficient, vorticity contour visualizations and phase diagrams. A number of physical quantities (the mean drag coefficient, Strouhal number and root-mean-square values of the drag and lift coefficients) are calculated and compared with well-resolved experimental data and numerical results available in the open literature. The results show that the upstream extent, downstream extent and height of the computational domain should be at least 7.5, 37.5 and 12 cylinder diameters, respectively.
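
A minimal sketch of how a Strouhal number can be read off the lift-coefficient spectrum, using a synthetic lift signal rather than the lattice Boltzmann output: St = f_peak * D / U, with the peak taken from the FFT power spectrum.

```python
import numpy as np

# Sketch with a synthetic lift signal: estimate the Strouhal number from the
# dominant peak of the lift-coefficient power spectrum.
dt, D, U = 0.01, 1.0, 1.0                       # time step, cylinder size, inflow speed
t = np.arange(0, 200, dt)
cl = 0.4 * np.sin(2 * np.pi * 0.16 * t) + 0.01 * np.random.default_rng(0).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(cl - cl.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=dt)
f_peak = freqs[np.argmax(spectrum[1:]) + 1]      # skip the zero-frequency bin
print(f"Strouhal number: {f_peak * D / U:.3f}")
```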

Keywords: Grid independence, Multi-relaxation-time lattice Boltzmann method, Physical quantities, Square cylinder, Vorticity contour visualizations.

12147 PhilSHORE: Development of a WebGIS-Based Marine Spatial Planning Tool for Tidal Current Energy Resource Assessment and Site Suitability Analysis

Authors: Ma. Rosario Concepcion O. Ang, Luis Caezar Ian K. Panganiban, Charmyne B. Mamador, Oliver Dan G. De Luna, Michael D. Bausas, Joselito P. Cruz

Abstract:

PhilSHORE is a multi-site, multi-device and multi-criteria decision support tool designed to support the development of tidal current energy in the Philippines. Its platform is based on Geographic Information Systems (GIS), which allow for the collection, storage, processing, analysis and display of geospatial data. Combining GIS tools with open-source web development applications, PhilSHORE becomes a webGIS-based marine spatial planning tool. To date, PhilSHORE displays output maps and graphs of power and energy density, site suitability and site-device analysis. It gives stakeholders and the public easy access to the results of tidal current energy resource assessments and site suitability analyses. The results of the initial development show that PhilSHORE is a promising decision support tool for ocean renewable energy (ORE) project development.

Keywords: GIS, Site Suitability Analysis, Tidal Current Energy Resource Assessment, WebGIS.

12146 Consideration a Novel Manner for Data Sending Quality in Heterogeneous Radio Networks

Authors: Mohammadreza Amini, Omid Moradtalab, Ebadollah Zohrevandi

Abstract:

In real-time networks, a large number of application programs rely on video data and heterogeneous data transmission techniques. The aim of this research is to present a method for assuring end-to-end quality of service at the application layer when sending video data over wireless heterogeneous networks. The method tries to improve video transmission over such networks by using techniques at the link and application layers. The offered method shows a considerable improvement in the quality observed by the user. In addition, other characteristics, such as a reduction of the data load that needs to be resent and a limit on the connection period to the time required for resending data, help the offered method to be used in wireless devices that have limited energy. The presented method and the achieved improvement are simulated and presented using the NS-2 software.

Keywords: Heterogeneous wireless networks, adaptation mechanism, multi-level, handoff, stop mechanism, graceful degradation, application layer.

12145 An Efficient 3D Animation Data Reduction Using Frame Removal

Authors: Jinsuk Yang, Choongjae Joo, Kyoungsu Oh

Abstract:

Existing methods, in which the animation data of all frames are stored and reproduced as vertex animation, cannot be used in mobile device environments because they consume large amounts of memory. 3D animation data reduction methods aimed at solving this problem have therefore been studied extensively, and we propose a new method as follows. First, we find and remove the frames in which motion changes are small and store only the animation data of the remaining frames (those involving large motion changes). When playing the animation, the removed frame areas are reconstructed by interpolating the remaining frames. Our key contribution is to calculate the accelerations of the joints in individual frames and the standard deviations of these accelerations, using the joint locations of the relevant 3D model, in order to find and delete the frames in which motion changes are small. Our method can reduce data sizes by approximately 50% or more while providing quality that is not much lower than that of the original animations. Therefore, our method is expected to be useful in mobile device environments or other environments in which memory is limited.
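
The sketch below illustrates the selection criterion described above on random joint trajectories: joint accelerations are approximated by second differences of joint positions, and frames whose mean acceleration magnitude falls below a fraction of its standard deviation are dropped (to be reconstructed later by interpolation). The threshold scale is an assumption, not the paper's value.

```python
import numpy as np

# Sketch of the frame-removal criterion: second differences of joint positions
# approximate joint accelerations; low-motion frames are dropped.
def select_keyframes(joint_positions, threshold_scale=0.5):
    # joint_positions: array of shape (frames, joints, 3)
    accel = np.diff(joint_positions, n=2, axis=0)              # per-frame accelerations
    accel_mag = np.linalg.norm(accel, axis=2).mean(axis=1)     # mean over joints
    threshold = threshold_scale * accel_mag.std()
    keep = np.ones(joint_positions.shape[0], dtype=bool)
    keep[2:] = accel_mag >= threshold                          # drop low-motion frames
    keep[0] = keep[-1] = True                                  # always keep the end points
    return keep

frames = np.cumsum(np.random.default_rng(0).normal(0, 0.01, size=(120, 20, 3)), axis=0)
mask = select_keyframes(frames)
print(f"kept {mask.sum()} of {len(mask)} frames")
```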

Keywords: Data Reduction, Interpolation, Vertex Animation, 3D Animation.

12144 Diagnosis of Intermittent High Vibration Peaks in Industrial Gas Turbine Using Advanced Vibrations Analysis

Authors: Abubakar Rashid, Muhammad Saad, Faheem Ahmed

Abstract:

This paper provides a comprehensive study of the diagnosis of intermittent high vibrations on an industrial gas turbine using detailed vibration analysis, followed by their rectification. Engro Polymer & Chemicals Limited, a chlor-vinyl complex located in Pakistan, has a captive combined cycle power plant with two 28 MW gas turbines (Hitachi) and one 15 MW steam turbine. In 2018, the organization faced an issue of high vibrations on one of the gas turbines. These high vibration peaks appeared intermittently on both the compressor's drive-end (DE) bearing and the turbine's non-drive-end (NDE) bearing. The amplitude of the high vibration peaks was 150-170% of the baseline values on the DE bearing and 200-300% on the NDE bearing. In one of these episodes, the gas turbine tripped on the "High Vibrations Trip" logic actuated at 155 µm. Limited instrumentation is available on the machine, which is monitored with a GE Bently Nevada 3300 system with two proximity probes installed at the turbine NDE, compressor DE, and generator DE and NDE bearings. The machine's transient ramp-up and steady-state data were collected using ADRE SXP and DSPI 408. Since only one keyphasor is installed on the turbine high-speed shaft, a derived keyphasor was configured in ADRE to obtain the low-speed shaft rpm required for data analysis. By analyzing the Bode plots, shaft centerline plot, polar plot and orbit plots, rubbing was evident on the turbine's NDE, along with increased clearance of the turbine's NDE radial bearing. The subject bearing was then inspected, and heavy deposits of carbonized coke were found on the labyrinth seals of the bearing housing, with clear rubbing marks on the shaft and housing covering 20-25 degrees of the inner radius of the labyrinth seals. The collected coke sample was tested in the laboratory and found to be residue of the lube oil in the bearing housing. After detailed inspection and cleaning of the shaft journal area and bearing housing, a new radial bearing was installed. Before assembling the bearing housing, the bearing cooling and sealing air lines were also cleaned, since inadequate flow of cooling and sealing air can accelerate coke formation in the bearing housing. The machine was then brought back online and data were collected again using ADRE SXP and DSPI 408 for health analysis. The vibrations were found to be in the acceptable zone as per ISO standard 7919-3, while all other parameters were also within the vendor-defined range. As a lesson learned from this case, a revised operating and maintenance regime has also been proposed to enhance the machine's reliability.

Keywords: ADRE, bearing, gas turbine, GE Bently Nevada, Hitachi, vibration.

12143 A Study of the Role of Perceived Risk and User Characteristics in Internet Purchase Intention

Authors: Ali Hajiha, Farhad Ghaffari, Nooshin Gholamali Tehrani

Abstract:

This study investigates the empirical relationships between risk preference, internet preference and internet knowledge, which are known as user characteristics, as well as perceived risk, and internet purchase intention. In order to test the relationships between the variables of the model, 174 questionnaires were collected from students with previous online shopping experience. For the purpose of data analysis, confirmatory factor analysis (CFA) and structural equation modelling (SEM) were used. The test results show that perceived risk affects internet purchase intention, and that an increase or decrease in perceived risk influences purchase intention when the customer shops on the internet. Other factors, such as internet preference, internet knowledge and risk preference, also affect internet purchase intention.

Keywords: Perceived risk, internet preference, internet knowledge, risk preference, internet purchase intention.

12142 Diagnosis of the Heart Rhythm Disorders by Using Hybrid Classifiers

Authors: Sule Yucelbas, Gulay Tezel, Cuneyt Yucelbas, Seral Ozsen

Abstract:

In this study, an attempt was made to identify some heart rhythm disorders from electrocardiography (ECG) data taken from the MIT-BIH arrhythmia database by extracting the required features and presenting them to artificial neural network (ANN), artificial immune system (AIS), artificial immune system based artificial neural network (AIS-ANN) and particle swarm optimization based artificial neural network (PSO-ANN) classifier systems. The main purpose of this study is to evaluate the performance of the hybrid AIS-ANN and PSO-ANN classifiers with respect to ANN and AIS. For this purpose, the normal sinus rhythm (NSR), atrial premature contraction (APC), sinus arrhythmia (SA), ventricular trigeminy (VTI), ventricular tachycardia (VTK) and atrial fibrillation (AF) data were extracted for each of the RR intervals. These data were then arranged in pairs (NSR-APC, NSR-SA, NSR-VTI, NSR-VTK and NSR-AF); the discrete wavelet transform was applied to each of the two groups of data in every pair, and after data reduction two different data sets, with 9 and 27 features, were obtained from each of them. Afterwards, the data were first mixed randomly within themselves, and then the 4-fold cross-validation method was applied to create the training and testing data. The training and testing accuracy rates and the training times are compared with each other.

As a result, the performances of the hybrid classification systems, AIS-ANN and PSO-ANN, were seen to be close to the performance of the ANN system, and the results of the hybrid systems were much better than those of AIS. However, ANN had a much shorter training time than the other systems; in terms of training time, ANN was followed by PSO-ANN, AIS-ANN and AIS, respectively. The features extracted from the data also affected the classification results significantly.
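
A compact sketch of the general pipeline, assuming synthetic RR-interval windows instead of MIT-BIH records: wavelet decomposition per window, a few summary statistics of the coefficients as reduced features, and an ANN scored with 4-fold cross-validation. The wavelet, feature set and network size are illustrative choices.

```python
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Sketch: DWT per window, simple coefficient statistics as reduced features,
# then an ANN evaluated with 4-fold cross-validation on placeholder labels.
rng = np.random.default_rng(0)
windows = rng.normal(size=(200, 128))           # placeholder signal segments
labels = rng.integers(0, 2, size=200)           # placeholder pair labels (e.g. NSR vs APC)

def dwt_features(signal, wavelet="db4", level=3):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([f(c) for c in coeffs for f in (np.mean, np.std, np.max)])

X = np.vstack([dwt_features(w) for w in windows])
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
print("4-fold accuracy:", cross_val_score(clf, X, labels, cv=4).mean())
```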

Keywords: AIS, ANN, ECG, hybrid classifiers, PSO.

12141 Validation of Visibility Data from Road Weather Information Systems by Comparing Three Data Resources: Case Study in Ohio

Authors: Fan Ye

Abstract:

Adverse weather conditions, particularly those with low visibility, are critical to the driving task. However, the direct relationship between visibility distance and traffic flow or roadway safety is uncertain due to the limited availability of visibility data. The recent growth in the deployment of Road Weather Information Systems (RWIS) makes segment-specific visibility information available, which can be integrated with other Intelligent Transportation Systems, such as automated warning systems and variable speed limits, to improve mobility and safety. Before applying RWIS visibility measurements in traffic studies and operations, it is critical to validate the data. Therefore, an attempt was made in this paper to examine the validity and viability of RWIS visibility data by comparing visibility measurements among RWIS, airport weather stations and the weather information recorded by police in crash reports, based on Ohio data. The results indicated that RWIS visibility measurements were significantly different from airport visibility data in Ohio, but no conclusion regarding the reliability of RWIS visibility could be drawn, given that no verified ground truth was available for the comparisons. It is suggested that more objective methods are needed to validate RWIS visibility measurements, such as continuous in-field measurements during various weather events using calibrated visibility sensors.
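
One simple way to formalize such a comparison, sketched below with synthetic readings, is a paired test on co-located, time-matched RWIS and airport visibility values; the assumed offset between the two sources is invented for illustration.

```python
import numpy as np
from scipy.stats import ttest_rel

# Sketch with synthetic values: paired comparison of visibility readings from
# co-located RWIS and airport stations at the same timestamps.
rng = np.random.default_rng(0)
airport_vis_mi = rng.uniform(0.25, 10.0, size=60)                  # statute miles
rwis_vis_mi = airport_vis_mi * rng.normal(0.85, 0.15, size=60)     # assumed systematic offset

t_stat, p_value = ttest_rel(rwis_vis_mi, airport_vis_mi)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```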

Keywords: Low visibility, RWIS, traffic safety, visibility.

12140 Hybrid Recommender Systems using Social Network Analysis

Authors: Kyoung-Jae Kim, Hyunchul Ahn

Abstract:

This study proposes a novel hybrid social network analysis and collaborative filtering approach to enhance the performance of recommender systems. The proposed model selects subgroups of users in an Internet community through social network analysis (SNA) and then performs clustering analysis using the information about these subgroups. Finally, it makes recommendations using cluster-indexing collaborative filtering (CF) based on the clustering results. This study uses the cores of the subgroups as initial seeds for a conventional clustering algorithm. The model chooses the five cores with the highest degree centrality from the SNA and then performs clustering analysis using these cores as initial centroids (cluster centers). The model then amplifies the impact of friends in the social network in the process of cluster-indexing CF.
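
The seeding idea can be sketched as follows on toy data: the five users with the highest degree centrality in the friendship graph are taken as cores, their rating vectors become the initial centroids of k-means, and the resulting clusters index the collaborative filtering step. The graph, ratings and cluster count are placeholders.

```python
import numpy as np
import networkx as nx
from sklearn.cluster import KMeans

# Sketch on toy data: top-5 degree-centrality users seed the clustering that
# later indexes the collaborative filtering step.
rng = np.random.default_rng(0)
n_users, n_items = 60, 30
ratings = rng.integers(0, 6, size=(n_users, n_items)).astype(float)   # toy rating matrix
G = nx.gnp_random_graph(n_users, 0.08, seed=0)                         # toy friendship network

centrality = nx.degree_centrality(G)
cores = sorted(centrality, key=centrality.get, reverse=True)[:5]       # top-5 core users
init_centroids = ratings[cores]

km = KMeans(n_clusters=5, init=init_centroids, n_init=1, random_state=0).fit(ratings)
print("cluster sizes:", np.bincount(km.labels_))
```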

Keywords: Social network analysis, Recommender systems, Collaborative filtering, Customer relationship management

12139 Implementation of Neural Network Based Electricity Load Forecasting

Authors: Myint Myint Yi, Khin Sandar Linn, Marlar Kyaw

Abstract:

This paper proposes a novel model for short-term load forecasting (STLF) in the electricity market. The prior electricity demand data are treated as a time series. The model is composed of several neural networks whose input data are processed using a wavelet technique, and it is implemented as a simulation program written in MATLAB. The load data are decomposed into several wavelet coefficient series using the wavelet transform technique known as the Non-decimated Wavelet Transform (NWT). The reason for using this technique is the possibility of extracting hidden patterns from the time series data. The wavelet coefficient series are used to train the neural networks (NNs) and serve as the inputs to the NNs for electricity load prediction. The Scaled Conjugate Gradient (SCG) algorithm is used as the learning algorithm for the NNs. To obtain the final forecast, the outputs from the NNs are recombined using the same wavelet technique. The model was evaluated with the electricity load data of the Electronic Engineering Department of Mandalay Technological University in Myanmar. The simulation results showed that the model is capable of producing a reasonable forecasting accuracy for STLF.
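
A minimal sketch of the decompose-forecast-recombine idea, assuming a synthetic load series: a stationary (non-decimated) wavelet transform, one small network per coefficient series trained on lagged values, and reconstruction of the one-step-ahead forecast with the inverse transform. The wavelet, lag length and network size are illustrative, not the paper's settings.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

# Sketch on a synthetic load series: SWT decomposition, one network per
# coefficient series, recombination of the shifted forecasts with the inverse SWT.
rng = np.random.default_rng(0)
t = np.arange(256)
load = 100 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 2, t.size)   # hourly-like load

coeffs = pywt.swt(load, "db2", level=2)            # [(cA2, cD2), (cA1, cD1)]
lags = 24

def one_step_forecast(series):
    X = np.array([series[i - lags:i] for i in range(lags, len(series))])
    y = series[lags:]
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0).fit(X, y)
    return model.predict(series[-lags:].reshape(1, -1))[0]

forecast_coeffs = [(np.append(cA[1:], one_step_forecast(cA)),
                    np.append(cD[1:], one_step_forecast(cD))) for cA, cD in coeffs]
forecast_series = pywt.iswt(forecast_coeffs, "db2")
print(f"next-step load forecast: {forecast_series[-1]:.1f}")
```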

Keywords: Neural network, load forecast, time series, wavelet transform.

12136 Evaluating Spectral Relationships between Signals by Removing the Contribution of a Common, Periodic Source: A Partial Coherence-Based Approach

Authors: Antonio Mauricio F. L. Miranda de Sá

Abstract:

Partial coherence between two signals, with the contribution of a periodic, deterministic signal removed, is proposed for evaluating the interrelationship in multivariate systems. The estimator expression was derived and shown to be independent of such a periodic signal. Simulations were used both to obtain its critical values, which were found to be the same as those for Gaussian signals, and to evaluate the technique. An illustration with electroencephalographic (EEG) signals during photic stimulation is also provided. The application of the proposed technique to both simulated and real EEG data indicates that it is very specific in removing the contribution of periodic sources. The estimate's independence from the periodic signal may widen the application of partial coherence in signal analysis, since it could be used together with ordinary coherence to test for contamination of signals by a common, periodic noise source.
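
For reference, the standard form of partial coherence between x and y after removing a third signal z (here the periodic, deterministic source) can be written as below; the paper's estimator is derived for this setting, though its exact expression is not reproduced here.

```latex
% Standard partial coherence between x and y after removing a reference z
% (the periodic, deterministic source); S_{ab}(f) denote the (cross-)spectra.
\kappa^2_{xy\cdot z}(f) =
  \frac{\left| S_{xy}(f) - \frac{S_{xz}(f)\, S_{zy}(f)}{S_{zz}(f)} \right|^2}
       {\left( S_{xx}(f) - \frac{|S_{xz}(f)|^2}{S_{zz}(f)} \right)
        \left( S_{yy}(f) - \frac{|S_{yz}(f)|^2}{S_{zz}(f)} \right)}
```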

Keywords: Partial coherence, periodic input, spectral analysis, statistical signal processing.

12137 A Comparison of Marginal and Joint Generalized Quasi-likelihood Estimating Equations Based On the Com-Poisson GLM: Application to Car Breakdowns Data

Authors: N. Mamode Khan, V. Jowaheer

Abstract:

In this paper, we apply and compare two generalized estimating equation approaches for the analysis of car breakdown data in Mauritius. The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of explanatory factors has been a challenging problem. In this paper, we aim to estimate the effects of various factors on the number of breakdowns experienced by a passenger car, based on a study performed in Mauritius over one year. We note that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using a Com-Poisson regression model. We use two types of quasi-likelihood estimation approach to estimate the parameters of the model: the marginal and the joint generalized quasi-likelihood estimating equation approaches. The under-dispersion parameter is estimated to be around 2.14, justifying the appropriateness of the Com-Poisson distribution for modelling the under-dispersed count responses recorded in this study.

Keywords: Breakdowns, under-dispersion, com-poisson, generalized linear model, marginal quasi-likelihood estimation, joint quasi-likelihood estimation.

12136 A Comparative Analysis of Zotero and Mendeley Reference Management Software

Authors: Sujit K. Basak

Abstract:

This paper presents a comparison of the reference management software packages Zotero and Mendeley; the results were drawn by comparing the two packages. The novelty of this paper lies in its comparative analysis of the software, which shows that Mendeley can import more information from Google Scholar for researchers. This finding can help researchers decide which reference management software to use.

Keywords: Analysis, comparative analysis, Zotero, Mendeley, researchers.

12135 Analyzing Periurban Fringe with Rough Set

Authors: Benedetto Manganelli, Beniamino Murgante

Abstract:

The distinction among urban, periurban and rural areas represents a classical example of uncertainty in land classification. Satellite images, geostatistical analysis and all kinds of spatial data are very useful in urban sprawl studies, but it is important to define precise rules for combining large amounts of data to build complex knowledge about the territory. Rough set theory may be a useful method to employ in this field. It represents a different mathematical approach to uncertainty, capturing indiscernibility: two different phenomena can be indiscernible in some contexts and classified in the same way when the available information about them is combined. This approach has been applied in a case study, comparing the results achieved with the Map Algebra technique and with a spatial rough set approach. The study area, Potenza Province, is particularly suitable for the application of this theory because it includes 100 municipalities with different numbers of inhabitants and morphological features.
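
The indiscernibility idea can be made concrete with a toy example: municipalities sharing the same attribute values form indiscernibility classes, and the lower and upper approximations of an "urban" set are built from the classes fully or partly contained in it. The attribute values and labels below are hypothetical.

```python
from collections import defaultdict

# Sketch of the core rough set idea on toy municipalities: indiscernibility
# classes under chosen attributes, and lower/upper approximations of the set
# labelled "urban".
municipalities = {
    "A": ("high_density", "served_by_bus"), "B": ("high_density", "served_by_bus"),
    "C": ("low_density", "served_by_bus"),  "D": ("low_density", "served_by_bus"),
    "E": ("low_density", "no_bus"),
}
urban = {"A", "B", "C"}                     # target set to approximate

classes = defaultdict(set)                  # indiscernibility classes
for name, attrs in municipalities.items():
    classes[attrs].add(name)

lower = set().union(*(c for c in classes.values() if c <= urban))
upper = set().union(*(c for c in classes.values() if c & urban))
print("lower approximation:", sorted(lower))   # certainly urban
print("upper approximation:", sorted(upper))   # possibly urban
```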

Keywords: Land Classification, Map Algebra, Periurban Fringe, Rough Set, Urban Planning, Urban Sprawl.
