Search results for: paradigms of real analysis
9849 An Ontology Abstract Machine
Authors: Leong Lee, Jennifer Leopold, Julia Albath, Alton Coalter
Abstract:
As more people from non-technical backgrounds become directly involved with large-scale ontology development, the focal point of ontology research has shifted from more theoretical ontology issues to problems associated with the actual use of ontologies in real-world, large-scale collaborative applications. Recently, the National Science Foundation funded a large collaborative ontology development project for which a new formal ontology model, the Ontology Abstract Machine (OAM), was developed to satisfy some unique functional and data representation requirements. This paper introduces the OAM model and the related algorithms that enable maintenance of an ontology that supports node-based user access. The successful software implementation of the OAM model and its subsequent acceptance by a large research community prove its validity and its real-world application value.
Keywords: Ontology, Abstract Machine, Ontology Editor, Web-based Ontology Management System.
9848 Multiple Approaches for Ultrasonic Cavitation Monitoring of Oxygen-Loaded Nanodroplets
Authors: Simone Galati, Adriano Troia
Abstract:
Ultrasound (US) is widely used in the medical field for a variety of diagnostic techniques but, in recent years, it has also attracted great interest for therapeutic aims. Regarding drug delivery, the use of US as an activation source provides better spatial confinement of the delivery and limits undesired side effects. However, at present there is no complete characterization at a fundamental level of the different signals produced by sono-activated nanocarriers. Therefore, the aim of this study is to obtain a metrological characterization of the cavitation phenomena induced by US through three parallel investigation approaches. US was focused into a channel of a customized phantom in which a solution with oxygen-loaded nanodroplets (OLNDs) was led to flow, and the cavitation activity was monitored. Both quantitative and qualitative real-time analyses were performed, giving information about the dynamics of bubble formation, oscillation and final implosion with respect to the working acoustic pressure and the type of nanodroplets, compared with pure water. From this analysis a possible interpretation of the observed results is proposed.
Keywords: Cavitation, Drug Delivery, Nanodroplets, Ultrasound.
9847 Non-Standard Monetary Policy Measures and Their Consequences
Authors: Aleksandra Nocoń (Szunke)
Abstract:
The study is a review of the literature concerning the consequences of non-standard monetary policy, which central banks use during unconventional periods when banking sector stability is threatened. In particular, attention was paid to the effects of non-standard monetary policy tools on financial markets. However, the empirical evidence about their effects and real consequences for financial markets is still inconclusive. The main aim of the study is to survey the consequences of standard and non-standard monetary policy instruments implemented during the global financial crisis in the United States, the United Kingdom and the euro area, with particular attention to the results for the stabilization of global financial markets. The study consists mainly of an empirical review indicating the impact of the implementation of these tools on financial markets. The following research methods were used in the study: literature studies, including domestic and foreign literature, cause-and-effect analysis and statistical analysis.
Keywords: Asset purchase facility, consequences of monetary policy instruments, non-standard monetary policy, Quantitative Easing.
9846 The Effect of Combining Real Experimentation With Virtual Experimentation on Students' Success
Authors: I. Oral, E. Bozkurt, H. Guzel
Abstract:
The purpose of this study was to investigate the effect of combining Real Experimentation (RE) with Virtual Experimentation (VE) on students' conceptual understanding of the photoelectric effect. To achieve this, a pre-post comparison study design was used that involved 46 undergraduate students. Two groups were set up for this study. Participants in the control group used RE to learn the photoelectric effect, whereas participants in the experimental group used RE in the first part of the curriculum and VE in another part. An achievement test was given to the groups before and after the application as a pre-test and post-test. The independent-samples t-test, one-way ANOVA and Tukey HSD test were used for testing the data obtained from the study. According to the results of the analyses, the experimental group was found to be more successful than the control group.
Keywords: Computer Based Teaching, Java, Physics Education, Virtual Laboratory.
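As a quick, hypothetical illustration of the independent-samples t-test used above to compare the two groups, the sketch below runs the test on synthetic achievement scores (the group sizes match the study's 46 students, but the score values are invented; this is not the study's data).

```python
# Hypothetical sketch: independent-samples t-test on synthetic achievement scores.
# The scores below are invented for illustration; they are not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(loc=60, scale=10, size=23)       # RE-only group (synthetic)
experimental = rng.normal(loc=68, scale=10, size=23)  # RE + VE group (synthetic)

t_stat, p_value = stats.ttest_ind(experimental, control, equal_var=True)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Difference between group means is significant at the 5% level.")
```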
9845 A New History Based Method to Handle the Recurring Concept Shifts in Data Streams
Authors: Hossein Morshedlou, Ahmad Abdollahzade Barforoush
Abstract:
Recent developments in storage technology and networking architectures have made it possible for broad areas of applications to rely on data streams for quick response and accurate decision making. Data streams are generated from real-world events, so it is logical that associations among the occurrences of these events also exist among the concepts of a data stream. Extraction of these hidden associations can be useful for the prediction of subsequent concepts in concept-shifting data streams. In this paper we present a new method for learning associations among the concepts of a data stream and predicting what the next concept will be. Knowing the next concept, an informed update of the data model becomes possible. The results of the conducted experiments show that the proposed method is suitable for the classification of concept-shifting data streams.
Keywords: Data Stream, Classification, Concept Shift, History.
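To make the idea of learning associations among recurring concepts concrete, a minimal sketch is given below that keeps a history of observed concept labels, counts the transitions between them, and predicts the most frequent successor of the current concept. This is only a hypothetical, first-order illustration of the general idea, not the authors' method.

```python
# Hypothetical sketch: predict the next concept from a history of concept labels
# by counting observed transitions (a first-order, frequency-based predictor).
from collections import defaultdict

class ConceptHistory:
    def __init__(self):
        self.transitions = defaultdict(lambda: defaultdict(int))
        self.previous = None

    def observe(self, concept):
        # Record a transition from the previously seen concept to the current one.
        if self.previous is not None:
            self.transitions[self.previous][concept] += 1
        self.previous = concept

    def predict_next(self):
        # Return the most frequent successor of the current concept, if any.
        successors = self.transitions.get(self.previous)
        if not successors:
            return None
        return max(successors, key=successors.get)

history = ConceptHistory()
for concept in ["A", "B", "A", "B", "A", "C", "A", "B"]:  # toy stream of concept labels
    history.observe(concept)
print(history.predict_next())  # after "B", the most frequent successor seen so far is "A"
```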
9844 Grocery Customer Behavior Analysis using RFID-based Shopping Paths Data
Authors: In-Chul Jung, Young S. Kwon
Abstract:
Knowing about customer behavior in a grocery store has been a long-standing issue in the retailing industry. The advent of RFID has made it easier to collect movement data for an individual shopper's behavior. Most of the previous studies used traditional statistical clustering techniques to find the major characteristics of customer behavior, especially the shopping path. However, due to various spatial constraints in the store, standard clustering methods are not feasible because movement data such as the shopping path should be adjusted in advance of the analysis, which is time-consuming and causes data distortion. To alleviate this problem, we propose a new approach to spatial pattern clustering based on the longest common subsequence. Experimental results using real data obtained from a grocery store confirm the good performance of the proposed method in finding the hot spots, dead spots and major path patterns of customer movements.
Keywords: customer path, shopping behavior, exploratory analysis, LCS, RFID.
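As a minimal sketch of the longest-common-subsequence (LCS) measure on which the proposed spatial pattern clustering is based, the function below computes the LCS length of two shopping paths encoded as sequences of store zones; the zone names are invented for illustration.

```python
# Minimal sketch: LCS length between two shopping paths (sequences of store zones).
# Zone labels are hypothetical; a longer common subsequence means more similar paths.
def lcs_length(path_a, path_b):
    m, n = len(path_a), len(path_b)
    # dp[i][j] = LCS length of path_a[:i] and path_b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if path_a[i - 1] == path_b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

path_1 = ["entrance", "produce", "dairy", "bakery", "checkout"]
path_2 = ["entrance", "bakery", "dairy", "checkout"]
print(lcs_length(path_1, path_2))  # 3, e.g. ("entrance", "dairy", "checkout")
```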
9843 A Generator from Cascade Markov Model for Packet Loss and Subsequent Bit Error Description
Authors: Jaroslav Polec, Viliam Hirner, Michal Martinovič, Kvetoslava Kotuliaková
Abstract:
In this paper we present a novel error model for packet loss and subsequent bit error description. The proposed model simulates the error performance of a wireless communication link. The model is designed as two independent Markov chains, where the first one is used for packet generation and the second one generates correctly and incorrectly transmitted bits for the received packets from the first chain. Statistical analyses of real communication on the wireless link are used to determine the model's parameters. Using the obtained parameters and the implementation of the generator, we collected generated traffic. The results generated by the proposed model are compared with the real data collection.
Keywords: Wireless channel, error model, Markov chain, Elliot model, Gilbert model, generator, IEEE 802.11.
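For intuition, the sketch below implements a simple two-state (Gilbert-Elliott style) Markov packet-loss generator of the kind the abstract builds on; the transition probabilities and per-state loss rates are made-up placeholders, not parameters fitted from the authors' IEEE 802.11 measurements.

```python
# Hypothetical sketch: two-state (good/bad) Markov packet-loss generator.
# Transition probabilities and loss rates are illustrative placeholders only.
import random

def generate_packet_losses(n_packets, p_good_to_bad=0.05, p_bad_to_good=0.30,
                           loss_in_good=0.01, loss_in_bad=0.50, seed=42):
    rng = random.Random(seed)
    state = "good"
    losses = []
    for _ in range(n_packets):
        # Decide whether the current packet is lost, given the channel state.
        loss_prob = loss_in_good if state == "good" else loss_in_bad
        losses.append(rng.random() < loss_prob)
        # Move the channel to its next state.
        if state == "good" and rng.random() < p_good_to_bad:
            state = "bad"
        elif state == "bad" and rng.random() < p_bad_to_good:
            state = "good"
    return losses

losses = generate_packet_losses(10_000)
print(f"simulated packet loss rate: {sum(losses) / len(losses):.3f}")
```

A second, analogous chain would then generate correct and erroneous bits within each received packet, as described in the abstract.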
9842 Evaluation of Easy-to-Use Energy Building Design Tools for Solar Access Analysis in Urban Contexts: Comparison of Friendly Simulation Design Tools for Architectural Practice in the Early Design Stage
Abstract:
The current building sector is focused on the reduction of energy requirements, on renewable energy generation and on the regeneration of existing urban areas. These targets need to be addressed with a systemic approach, considering several aspects simultaneously such as climate conditions, lighting conditions, solar radiation, PV potential, etc. Solar access analysis is an already known method to analyze solar potential, but in recent years simulation tools have provided more effective opportunities to perform this type of analysis, particularly in the early design stage. Nowadays, the study of solar access depends on how easily and rapidly simulation tools can be used during the design process. This study presents a comparison of three simulation tools from the point of view of the user, with the aim of highlighting differences in the ease of use of these tools. Using a real urban context as a case study, three tools (Ecotect, Townscope and Heliodon) are tested by building models, performing simulations and examining the capabilities and output results of solar access analysis. The evaluation of the ease of use of these tools is based on a set of detected parameters and features, such as the types of simulation, requirements of input data, types of results, etc. As a result, a framework is provided in which the features and capabilities of each tool are shown. This framework shows the differences among these tools in terms of functions, features and capabilities. The aim of this study is to support users and to improve the integration of solar access simulation tools into the design process.
Keywords: Solar access analysis, energy building design tools, urban planning, solar potential.
9841 A Case Study on Appearance Based Feature Extraction Techniques and Their Susceptibility to Image Degradations for the Task of Face Recognition
Authors: Vitomir Struc, Nikola Pavesic
Abstract:
Over the past decades, automatic face recognition has become a highly active research area, mainly due to the countless application possibilities in both the private and the public sector. Numerous algorithms have been proposed in the literature to cope with the problem of face recognition; nevertheless, a group of methods commonly referred to as appearance based have emerged as the dominant solution to the face recognition problem. Many comparative studies concerned with the performance of appearance based methods have already been presented in the literature, not rarely with inconclusive and often with contradictory results. No consensus has been reached within the scientific community regarding the relative ranking of the efficiency of appearance based methods for the face recognition task, let alone regarding their susceptibility to appearance changes induced by various environmental factors. To tackle these open issues, this paper assesses the performance of the three dominant appearance based methods: principal component analysis, linear discriminant analysis and independent component analysis, and compares them on equal footing (i.e., with the same preprocessing procedure, with optimized parameters for the best possible performance, etc.) in face verification experiments on the publicly available XM2VTS database. In addition to the comparative analysis on the XM2VTS database, ten degraded versions of the database are also employed in the experiments to evaluate the susceptibility of the appearance based methods to various image degradations which can occur in "real-life" operating conditions. Our experimental results suggest that linear discriminant analysis ensures the most consistent verification rates across the tested databases.
Keywords: Biometrics, face recognition, appearance based methods, image degradations, the XM2VTS database.
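To give a flavour of what an appearance-based method looks like in code, the sketch below projects vectorized images onto a PCA ("eigenfaces") subspace and matches a probe by nearest neighbour. Random vectors stand in for preprocessed face images, so this is only an illustration of the family of methods compared in the paper, not the evaluated implementations.

```python
# Illustrative sketch of an appearance-based pipeline: PCA projection + nearest neighbour.
# Random vectors stand in for vectorized, preprocessed face images.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
gallery = rng.normal(size=(50, 32 * 32))   # 50 "face images", flattened to 1024-D
labels = np.arange(50)                     # one identity per gallery image
probe = gallery[7] + 0.1 * rng.normal(size=32 * 32)  # noisy copy of identity 7

pca = PCA(n_components=20).fit(gallery)    # learn the appearance subspace
gallery_feats = pca.transform(gallery)
probe_feat = pca.transform(probe.reshape(1, -1))

distances = np.linalg.norm(gallery_feats - probe_feat, axis=1)
print("matched identity:", labels[int(np.argmin(distances))])  # expected: 7
```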
9840 Minimizing of Target Localization Error using Multi-robot System and Particle Filters
Authors: Jana Puchyova
Abstract:
In recent years the number of applications of multi-robot systems (MRS) has been growing in various areas. However, their design is often difficult in practice, because algorithms are proposed for a theoretical background and do not consider errors and noise in real conditions, so they are not usable in a real environment. These errors are also clearly visible in the task of target localization, when robots try to find and estimate the position of a target with their sensors. Localization of a target is possible with one robot as well, but, as was examined, target finding and localization with a group of mobile robots can estimate the target position more accurately and faster. The accuracy of the target position estimation is achieved by the cooperation of the MRS and particle filtering. The advantage of using an MRS with particle filtering was tested on the task of fixed target localization by a group of mobile robots.
Keywords: Multi-robot system, particle filter, position estimation, target localization.
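The sketch below shows a bare-bones particle filter estimating a fixed 2-D target position from noisy range measurements taken at several known robot positions; the robot positions, noise level and target location are invented, and the resampling scheme is the simplest possible, so this is a generic sketch of the technique rather than the authors' implementation.

```python
# Hypothetical sketch: particle filter estimating a fixed 2-D target position
# from noisy range measurements taken at known robot positions (all values made up).
import numpy as np

rng = np.random.default_rng(3)
true_target = np.array([4.0, -2.0])                       # unknown to the filter
robots = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0]])  # known robot positions
sigma = 1.0                                               # range measurement noise (std)

particles = rng.uniform(-10, 10, size=(2000, 2))          # initial guesses over the area
for _ in range(10):                                       # repeated measurement rounds
    weights = np.ones(len(particles))
    for robot in robots:
        z = np.linalg.norm(true_target - robot) + rng.normal(0, sigma)   # noisy range
        predicted = np.linalg.norm(particles - robot, axis=1)
        weights *= np.exp(-0.5 * ((z - predicted) / sigma) ** 2)  # Gaussian likelihood
    weights /= weights.sum()
    # Resample in proportion to the weights and add a little jitter to keep diversity.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    particles = particles[idx] + rng.normal(0, 0.1, size=particles.shape)

print("estimated target position:", np.round(particles.mean(axis=0), 2))  # near (4, -2)
```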
9839 Production Planning and Scheduling and SME
Authors: M. Heck, H. Vettiger
Abstract:
Small and medium-sized enterprises (SMEs) are the backbone of central Europe's economies and make a significant contribution to the gross domestic product. Production planning and scheduling (PPS) is still a crucial element in the manufacturing industries of the 21st century, even though this area of research is more than a century old. The topic of PPS is well researched, especially in the context of large enterprises in the manufacturing industry. However, the implementation of PPS methodologies within SMEs is mostly unobserved. This work analyzes how PPS is implemented in SMEs, with a geographical focus on Switzerland and its vicinity. Because their resources are restricted compared to large enterprises, SMEs have to face different challenges. The real problem areas of selected enterprises with regard to PPS are identified and evaluated. For the identified real-life problem areas of SMEs, clear and detailed recommendations are created, covering concepts, best practices and the efficient usage of PPS. Furthermore, the economic and entrepreneurial value for companies is outlined, as well as why the implementation of the introduced recommendations is advised.
Keywords: Central Europe, PPS, Production Planning, SME.
9838 Simulation of Utility Accrual Scheduling and Recovery Algorithm in Multiprocessor Environment
Authors: A. Idawaty, O. Mohamed, A. Z. Zuriati
Abstract:
This paper presents the development of an event-based Discrete Event Simulation (DES) for a recovery algorithm known as Backward Recovery Global Preemptive Utility Accrual Scheduling (BR_GPUAS). This algorithm implements the Backward Recovery (BR) mechanism as a fault recovery solution under the existing Time/Utility Function/Utility Accrual (TUF/UA) scheduling domain for a multiprocessor environment. The BR mechanism attempts to take the faulty tasks back to their initial safe state and then proceeds to re-execute the affected section of the faulty tasks to enable recovery. Considering that faults may occur in the components of any system, a fault tolerance system that can nullify the erroneous effect needs to be developed. The current TUF/UA scheduling algorithm uses the abortion recovery mechanism and simply aborts the erroneous task as its fault recovery solution. None of the existing algorithms in the TUF/UA scheduling domain for the multiprocessor scheduling environment have considered the transient fault and implemented the BR mechanism as a fault recovery mechanism to nullify the erroneous effect and solve the recovery problem in this domain. The developed BR_GPUAS simulator has derived the set of parameters, events and performance metrics according to a detailed analysis of the base model. Simulation results revealed that the BR_GPUAS algorithm can save almost 20-30% of the accumulated utility, making it reliable and efficient for real-time applications in the multiprocessor scheduling environment.
Keywords: Time Utility Function/ Utility Accrual (TUF/UA) scheduling, Real-time system (RTS), Backward Recovery, Multiprocessor, Discrete Event Simulation (DES).
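As a generic illustration of how an event-based discrete event simulation core is organized (not the BR_GPUAS simulator itself), the sketch below keeps a time-ordered event queue and processes events in timestamp order; the event names and handler are hypothetical.

```python
# Generic sketch of an event-based discrete event simulation core:
# events are (time, name, payload) tuples processed in timestamp order.
import heapq

class DiscreteEventSimulator:
    def __init__(self):
        self.clock = 0.0
        self.queue = []          # min-heap ordered by event time
        self.counter = 0         # tie-breaker so equal-time events stay orderable

    def schedule(self, delay, name, payload=None):
        heapq.heappush(self.queue, (self.clock + delay, self.counter, name, payload))
        self.counter += 1

    def run(self, handler, until=float("inf")):
        while self.queue and self.queue[0][0] <= until:
            self.clock, _, name, payload = heapq.heappop(self.queue)
            handler(self, name, payload)

def handler(sim, name, payload):
    # Hypothetical handler: a task arrival schedules its completion later.
    print(f"t={sim.clock:5.1f}  {name}  {payload}")
    if name == "task_arrival":
        sim.schedule(3.0, "task_completion", payload)

sim = DiscreteEventSimulator()
sim.schedule(1.0, "task_arrival", {"task": "T1"})
sim.schedule(2.5, "task_arrival", {"task": "T2"})
sim.run(handler, until=10.0)
```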
9837 Multicriteria Decision Analysis for Development Ranking of Balkan Countries
Authors: C. Ardil
Abstract:
In this research, the developmental integration of the Balkan Peninsula countries into the European Union represents the strategic economic development objective of the countries in the region. In order to objectively analyze the level of economic development competition among the Balkan Peninsula countries, the mathematical compromise programming technique of multicriteria evaluation is used in this ranking problem. The primary aim of this research is to explain the role and significance of the multicriteria evaluation method using a real example of compromise solutions. Using the mathematical compromise programming technique, twelve countries of the Balkan Peninsula are economically evaluated and mutually compared. The economic development evaluation of the countries is performed according to five evaluation criteria forming the basis for economic development evaluation. The multiattribute model is solved using the mathematical compromise programming technique to produce different Pareto solutions. The results obtained by the multicriteria evaluation make it possible to identify and evaluate the most prominent economic development indicators for each country separately. Finally, in this way, the proposed method has proved to be a successful model for the evaluation of the economic development competition of the Balkan Peninsula countries.
Keywords: Balkan peninsula countries, standard deviation, multicriteria decision making, mathematical compromise programming, multicriteria analysis, multicriteria decision analysis.
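Compromise programming ranks alternatives by their distance to an ideal point across the criteria. The sketch below computes the weighted Lp distance to the ideal for a few made-up alternatives; the data, weights and p value are illustrative placeholders, not the study's twelve countries or five indicators.

```python
# Illustrative sketch of compromise programming: rank alternatives by weighted
# Lp distance to the ideal point (all data below are made-up placeholders).
import numpy as np

scores = np.array([          # rows: alternatives, columns: benefit criteria
    [2.1, 0.8, 65.0],
    [3.4, 0.5, 72.0],
    [1.7, 0.9, 58.0],
])
weights = np.array([0.5, 0.3, 0.2])
p = 2                         # distance metric parameter (p = 1, 2 or infinity are common)

ideal = scores.max(axis=0)    # best value of each benefit criterion
worst = scores.min(axis=0)
normalized_gap = (ideal - scores) / (ideal - worst)   # 0 = ideal, 1 = worst
distances = (np.sum((weights * normalized_gap) ** p, axis=1)) ** (1.0 / p)

ranking = np.argsort(distances)          # smaller distance = closer to the ideal
print("ranking (best first):", ranking, "distances:", np.round(distances, 3))
```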
9836 A Fuzzy Mathematical Model for Order Acceptance and Scheduling Problem
Authors: E. Koyuncu
Abstract:
The problem of Order Acceptance and Scheduling (OAS) is defined as a joint decision of which orders to accept for processing and how to schedule them. Any linear programming model representing a real-world situation involves parameters that are defined by the decision maker in an uncertain way or by means of linguistic statements. Fuzzy data can be used to incorporate such vagueness of the real-life situation. In this study, a fuzzy mathematical model is proposed for a single-machine OAS problem, where the orders are defined by their fuzzy due dates, fuzzy processing times, and fuzzy sequence-dependent setup times. The signed distance method, one of the fuzzy ranking methods, is used to handle the fuzzy constraints in the model.
Keywords: Fuzzy mathematical programming, fuzzy ranking, order acceptance, single machine scheduling.
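For reference, the signed distance of a triangular fuzzy number Ã = (a, b, c) to the crisp origin is commonly computed as d(Ã, 0) = (a + 2b + c)/4, which reduces fuzzy quantities such as due dates or processing times to a single crisp value for ranking. The sketch below applies this formula to made-up fuzzy processing times; it illustrates only the ranking step, not the full fuzzy OAS model.

```python
# Sketch: rank triangular fuzzy numbers (a, b, c) by their signed distance to 0,
# d = (a + 2b + c) / 4.  The fuzzy processing times below are made-up examples.
def signed_distance(tfn):
    a, b, c = tfn
    return (a + 2 * b + c) / 4.0

fuzzy_processing_times = {
    "order_1": (4.0, 5.0, 7.0),
    "order_2": (3.0, 6.0, 6.5),
    "order_3": (5.0, 5.5, 6.0),
}
for order, tfn in sorted(fuzzy_processing_times.items(),
                         key=lambda item: signed_distance(item[1])):
    print(order, "->", signed_distance(tfn))
```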
9835 Calculus-based Runtime Verification
Authors: Xuan Qi, Changzhi Zhao
Abstract:
In this paper, a uniform calculus-based approach is provided for synthesizing monitors that check, at runtime, correctness properties specified by a large variety of logics, including future and past time logics, interval logics, state machines and parameterized temporal logics. We present a calculus mechanism to synthesize monitors from the logical specification for the incremental analysis of execution traces during test and real runs. The monitor detects both good and bad prefixes of a particular kind, namely those that are informative for the property under investigation. We elaborate the procedure of deriving monitors by the calculus.
Keywords: calculus, Eagle logic, monitor synthesis, runtime verification.
9834 Real Time Monitoring of Long Slender Shaft by Distributed-Lumped Modeling Techniques
Authors: Sina Babadi, K. M. Ebrahimi
Abstract:
The aim of this paper is to determine the stress levels at the end of a long slender shaft, such as a drilling assembly used in the oil or gas industry, using a mathematical model in real time. The torsional deflection experienced by this type of drilling shaft (a hollow shaft about 4 km in length and 20 cm in diameter, with a wall thickness of 1 cm) can only be determined using a distributed modeling technique. The main objective of this project is to calculate the angular velocity and torque at the end of the shaft by the TLM method and to analyze the behavior of the system through its transient response. The obtained results are compared with the lumped modeling technique; the importance of these results becomes evident only after this comparison. The two models have different transient responses, and because of the length of the shaft the transient response is very important in this project.
Keywords: Distributed Lumped modeling, Lumped modeling, Drill string, Angular Velocity, Torque.
9833 High Performance Electrocardiogram Steganography Based on Fast Discrete Cosine Transform
Authors: Liang-Ta Cheng, Ching-Yu Yang
Abstract:
Based on the fast discrete cosine transform (FDCT), the authors present a high-capacity and high perceived quality data hiding method for the electrocardiogram (ECG) signal. By using a simple adjusting policy on the 1-dimensional (1-D) DCT coefficients, a large volume of secret message can be effectively embedded in an ECG host signal and successfully extracted at the intended receiver. Simulations confirmed that the resulting perceived quality is good, while the hiding capability of the proposed method significantly outperforms that of existing techniques. In addition, our proposed method has a certain degree of robustness. Since the computational complexity is low, it is feasible for our method to be employed in real-time applications.
Keywords: Data hiding, ECG steganography, fast discrete cosine transform, 1-D DCT bundle, real-time applications.
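The sketch below illustrates the general idea of hiding bits by adjusting 1-D DCT coefficients, using a simple quantization (parity) rule on a few coefficients of a synthetic signal; the embedding policy, coefficient positions and step size are invented for illustration and are not the authors' scheme.

```python
# Hypothetical sketch: embed bits in a 1-D signal by quantizing selected DCT
# coefficients to even/odd multiples of a step (not the paper's exact policy).
import numpy as np
from scipy.fft import dct, idct

def embed(signal, bits, start=20, step=0.5):
    coeffs = dct(signal, norm="ortho")
    for k, bit in enumerate(bits):
        q = int(np.round(coeffs[start + k] / step))
        if q % 2 != bit:                 # force the quantized index parity to the bit
            q += 1
        coeffs[start + k] = q * step
    return idct(coeffs, norm="ortho")

def extract(stego, n_bits, start=20, step=0.5):
    coeffs = dct(stego, norm="ortho")
    return [int(np.round(coeffs[start + k] / step)) % 2 for k in range(n_bits)]

host = np.sin(np.linspace(0, 8 * np.pi, 512))     # synthetic stand-in for an ECG segment
message = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(host, message)
print(extract(stego, len(message)) == message)    # True
print("max sample distortion:", np.abs(stego - host).max())
```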
9832 Attribute Analysis of Quick Response Code Payment Users Using Discriminant Non-negative Matrix Factorization
Authors: Hironori Karachi, Haruka Yamashita
Abstract:
Recently, quick response (QR) code payment systems have been getting popular. Many companies introduce new QR code payment services, and the services compete with each other to increase their number of users. To increase the number of users, we should grasp the differences in the demographic information, usage information, and value of users between services. In this study, we conduct an analysis of real-world data provided by Nomura Research Institute, including the demographic data of users and information on users' usage of two services: LINE Pay and PayPay. For analyzing such data and interpreting their features, Non-negative Matrix Factorization (NMF) is widely used; however, in the case of the target data, there is a problem of missing data. EM-algorithm NMF (EMNMF) is used to complete unknown values for understanding the features of the given data presented in matrix shape. Moreover, for comparing the results of the NMF analysis of two matrices, Discriminant NMF (DNMF) shows the differences in user features between two matrices. In this study, we combine EMNMF and DNMF and analyze the target data. As the interpretation, we show the differences in the features of users between LINE Pay and PayPay.
Keywords: Data science, non-negative matrix factorization, missing data, quality of services.
9831 Using Information Theory to Observe Natural Intelligence and Artificial Intelligence
Authors: Lipeng Zhang, Limei Li, Yanming Pearl Zhang
Abstract:
This paper takes a philosophical view as an axiom and reveals the relationship between information theory and Natural Intelligence and Artificial Intelligence under real-world conditions. The paper also derives the relationship between natural intelligence and nature. According to the communication principle of information theory, Natural Intelligence can be divided into a real part and a virtual part. Based on the information theory principle that information does not increase, a restriction mechanism for the creativity of Natural Intelligence is derived. The restriction mechanism of creativity reveals the limits of natural intelligence and artificial intelligence. The paper provides a new angle from which to observe natural intelligence and artificial intelligence.
Keywords: Natural intelligence, artificial intelligence, creativity, information theory.
9830 Fuzzy Gauge Capability (Cg and Cgk) through Buckley Approach
Authors: Seyed Habib A. Rahmati, Mohsen Sadegh Amalnick
Abstract:
Different terms of Statistical Process Control (SPC) have been sketched in the fuzzy environment. However, Measurement System Analysis (MSA), a main branch of SPC, is rarely investigated in the fuzzy area. This procedure assesses the suitability of the data to be used in later stages or decisions of the SPC. Therefore, this research focuses on some important measures of MSA and introduces these measures in the fuzzy environment through a new method. In this method, which works based on the Buckley approach, the imprecision and vagueness inherent in real-world measurement are considered simultaneously. To do so, fuzzy versions of the gauge capability indices (Cg and Cgk) are introduced. The method is also clearly explained through an example.
Keywords: SPC, MSA, gauge capability, Cg, Cgk.
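For context, the crisp (non-fuzzy) gauge capability indices that the fuzzy versions build on are commonly defined for a Type 1 gauge study as Cg = (0.2 T)/(6 s_g) and Cgk = (0.1 T − |x̄_g − x_m|)/(3 s_g), where T is the tolerance, x̄_g and s_g are the mean and standard deviation of repeated measurements of a reference part, and x_m is the reference value; the 20% / 6s constants vary between conventions. The sketch below computes these crisp indices on made-up data; the fuzzy (Buckley) extension is not shown.

```python
# Sketch: crisp gauge capability indices Cg and Cgk from a Type 1 gauge study
# (conventions for the 20% / 6*s constants vary; the data below are made up).
import statistics

def gauge_capability(measurements, reference_value, tolerance, k=0.2, spread=6.0):
    mean = statistics.mean(measurements)
    s_g = statistics.stdev(measurements)          # gauge repeatability (std dev)
    cg = (k * tolerance) / (spread * s_g)
    cgk = ((k / 2.0) * tolerance - abs(mean - reference_value)) / ((spread / 2.0) * s_g)
    return cg, cgk

# 25 hypothetical repeated measurements of a reference part with value 10.00 mm
measurements = [10.01, 9.99, 10.00, 10.02, 9.98, 10.01, 10.00, 9.99, 10.01, 10.00,
                10.02, 9.98, 10.00, 10.01, 9.99, 10.00, 10.01, 10.00, 9.99, 10.02,
                10.00, 9.99, 10.01, 10.00, 10.00]
cg, cgk = gauge_capability(measurements, reference_value=10.00, tolerance=0.5)
print(f"Cg = {cg:.2f}, Cgk = {cgk:.2f}")   # values >= 1.33 are usually considered capable
```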
9829 ANFIS Approach for Locating Faults in Underground Cables
Authors: Magdy B. Eteiba, Wael Ismael Wahba, Shimaa Barakat
Abstract:
This paper presents a fault identification, classification and fault location estimation method based on the Discrete Wavelet Transform and an Adaptive Network Fuzzy Inference System (ANFIS) for medium voltage cables in the distribution system.
Different faults and locations are simulated by ATP/EMTP, and then certain selected features of the wavelet-transformed signals are used as input for training the ANFIS. An accurate fault classifier and locator algorithm was then designed, trained and tested using current samples only. The results obtained from the ANFIS output were compared with the real output. From the results, it was found that the percentage error between the ANFIS output and the real output is less than three percent. Hence, it can be concluded that the proposed technique is able to offer high accuracy in both fault classification and fault location.
Keywords: ANFIS, Fault location, Underground Cable, Wavelet Transform.
9828 DHT-LMS Algorithm for Sensorineural Loss Patients
Authors: Sunitha S. L., V. Udayashankara
Abstract:
Hearing impairment is the number one chronic disability affecting many people in the world. Background noise is particularly damaging to speech intelligibility for people with hearing loss, especially for sensorineural loss patients. Several investigations on speech intelligibility have demonstrated that sensorineural loss patients need a 5-15 dB higher SNR than normal hearing subjects. This paper describes the Discrete Hartley Transform power-normalized Least Mean Square algorithm (DHT-LMS) to improve the SNR and to reduce the convergence time of the Least Mean Square (LMS) algorithm for sensorineural loss patients. The DHT transforms n real numbers to n real numbers and has the convenient property of being its own inverse. It can be effectively used for noise cancellation with less convergence time. The simulated results show superior characteristics, improving the SNR by at least 9 dB for an input SNR of zero dB and giving a faster convergence rate (eigenvalue ratio 12) compared to the time-domain method and DFT-LMS.
Keywords: Hearing Impairment, DHT-LMS, Convergence rate, SNR improvement.
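As background, the sketch below implements the baseline time-domain LMS adaptive noise canceller that transform-domain variants such as DHT-LMS improve upon; the signals, filter length and step size are synthetic placeholders, and the Hartley-domain power normalization step is not included.

```python
# Baseline sketch: time-domain LMS adaptive noise canceller on synthetic signals
# (the DHT-domain power normalization of DHT-LMS is not shown here).
import numpy as np

rng = np.random.default_rng(7)
n = 4000
clean = np.sin(2 * np.pi * 5 * np.arange(n) / 1000.0)    # stand-in for the desired signal
noise = rng.normal(size=n)                                # reference noise source
noise_path = np.convolve(noise, [0.6, 0.3, 0.1])[:n]      # noise as it reaches the primary mic
primary = clean + noise_path                              # noisy observation

taps, mu = 8, 0.01
w = np.zeros(taps)
output = np.zeros(n)
for i in range(taps - 1, n):
    x = noise[i - taps + 1:i + 1][::-1]   # most recent reference samples, newest first
    error = primary[i] - w @ x            # error ~ cleaned signal once weights converge
    w += 2 * mu * error * x               # LMS weight update
    output[i] = error

print("noise power before:", round(float(np.mean((primary - clean) ** 2)), 3))
print("noise power after: ", round(float(np.mean((output[n // 2:] - clean[n // 2:]) ** 2)), 3))
```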
9827 'Performance-Based' Seismic Methodology and Its Application in Seismic Design of Reinforced Concrete Structures
Authors: Jelena R. Pejović, Nina N. Serdar
Abstract:
This paper presents an analysis of the “Performance-Based” seismic design method, in order to overcome the perceived disadvantages and limitations of the existing force-based seismic design approach in engineering practice. Bearing in mind the specificity of the earthquake as a load and the fact that the seismic resistance of a structure depends solely on its behaviour in the nonlinear range, the traditional seismic design approach based on force and linear analysis is not adequate. The “Performance-Based” seismic design method is based on nonlinear analysis and can be used in everyday engineering practice. This paper presents the application of this method to an eight-story reinforced concrete building with a combined structural system (a reinforced concrete frame system in one direction and a reinforced concrete ductile wall system in the other direction). Nonlinear time-history analysis is performed on a spatial model of the structure using the program Perform 3D, where the structure is exposed to forty real earthquake records. For the considered building, a large number of results were obtained. It was concluded that using this method we can evaluate, with a high degree of reliability, the structural behavior under earthquakes. Significant differences were obtained in the response of the structure to the various earthquake records. The analysis also showed that the frame structural system did not perform well under earthquake records on soils such as sand and gravel, while the ductile wall system behaved satisfactorily on different types of soil.
Keywords: Ductile wall, frame system, nonlinear time-history analysis, performance-based methodology, RC building.
9826 The Effect of User Comments on Traffic Application Usage
Authors: I. Gokasar, G. Bakioglu
Abstract:
With the unprecedented rate of technological improvement, people have started to solve their problems with the help of technological tools. According to the application stores and websites on which people evaluate and comment on traffic apps, there are more than 100 traffic applications with different features for different purposes of usage, ranging from features for public transit modes to features for private cars. This study focuses on the top 30 traffic applications, which were chosen with respect to their download counts. All data about the traffic applications were obtained from the related websites. The purpose of this study is to analyze traffic applications in terms of their categorical attributes with the help of a regression model. The analysis results suggest that negative interpretations (e.g., being deficient) do not lead to lower star ratings of the applications. However, those negative interpretations result in a smaller increase in the star rating. In addition, women give higher star ratings than men in the evaluation of traffic applications.
Keywords: Traffic app, real-time information, traffic congestion, regression analysis, dummy variables.
9825 Implementation of a Low-Cost Instrumentation for an Open Cycle Wind Tunnel to Evaluate Pressure Coefficient
Authors: Cristian P. Topa, Esteban A. Valencia, Victor H. Hidalgo, Marco A. Martinez
Abstract:
Wind tunnel experiments for aerodynamic profiles offer numerous advantages, such as clean steady laminar flow, controlled environmental conditions, streamline visualization, and real data acquisition. However, the experimental instrumentation is usually expensive, and hence each test implies an increase in design cost. The aim of this work is to select and implement a low-cost static pressure data acquisition system for a NACA 2412 airfoil in an open cycle wind tunnel. This work compares the wind tunnel experiment with a Computational Fluid Dynamics (CFD) simulation and a parametric analysis. The experiment was evaluated at a Reynolds number of 1.65e5, with angles increasing from -5° to 15°. The comparison between the approaches shows good accuracy between the experiment and CFD, while the parametric analysis results differ widely from the other methods, which is consistent with the lower accuracy of the parametric approach due to its simplicity.
Keywords: Wind tunnel, low cost instrumentation, experimental testing, CFD simulation.
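For reference, the pressure coefficient evaluated from such static pressure measurements is Cp = (p − p∞)/(½ ρ V∞²). The sketch below applies this formula to made-up tap readings; the free-stream values and pressures are placeholders, not the paper's measurements.

```python
# Sketch: pressure coefficient Cp = (p - p_inf) / (0.5 * rho * V_inf**2)
# applied to hypothetical static-pressure tap readings (all values made up).
rho = 1.225          # air density [kg/m^3] at sea level
v_inf = 12.0         # free-stream velocity [m/s]
p_inf = 101325.0     # free-stream static pressure [Pa]
q_inf = 0.5 * rho * v_inf ** 2   # dynamic pressure [Pa]

tap_pressures = [101290.0, 101305.0, 101330.0, 101350.0]   # example tap readings [Pa]
cp_values = [(p - p_inf) / q_inf for p in tap_pressures]
for x, cp in zip([0.05, 0.25, 0.50, 0.75], cp_values):      # hypothetical x/c positions
    print(f"x/c = {x:.2f}  Cp = {cp:+.2f}")
```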
9824 Analysis of Urban Population Using Twitter Distribution Data: Case Study of Makassar City, Indonesia
Authors: Yuyun Wabula, B. J. Dewancker
Abstract:
In the past decade, social networking apps have been growing very rapidly. Geolocation data is one of the important features of social media, as it can attach the user's location coordinates in the real world. This paper proposes the use of geolocation data from the Twitter social media application to gain knowledge about urban dynamics, especially human mobility behavior. It aims to explore the relation between Twitter geolocation and the presence of people in urban areas. Firstly, the study analyzes the spread of people in a particular area within the city using Twitter social media data. Secondly, we match and categorize the existing places based on the same individuals visiting them. Then, we combine the Twitter data from the tracking result and the questionnaire data to capture the Twitter user profile. To do that, we used distribution frequency analysis to learn the visitors' percentages. To validate the hypothesis, we compare the results with the local population statistics and the land use mapping released by the city planning department of the Makassar local government. The results show that there is a correlation between the Twitter geolocation and the questionnaire data. Thus, integrating the Twitter data and the survey data can reveal the profile of the social media users.
Keywords: Geolocation, Twitter, distribution analysis, human mobility.
9823 Uncertainty Multiple Criteria Decision Making Analysis for Stealth Combat Aircraft Selection
Authors: C. Ardil
Abstract:
Fuzzy set theory and its extensions (intuitionistic fuzzy sets, picture fuzzy sets, and neutrosophic sets) have been widely used to address imprecision and uncertainty in complex decision-making. However, they may struggle with inherent indeterminacy and inconsistency in real-world situations. This study introduces uncertainty sets as a promising alternative, offering a structured framework for incorporating both types of uncertainty into decision-making processes. This work explores the theoretical foundations and applications of uncertainty sets. A novel decision-making algorithm based on uncertainty set-based proximity measures is developed and demonstrated through a practical application: selecting the most suitable stealth combat aircraft.
The results highlight the effectiveness of uncertainty sets in ranking alternatives under uncertainty. Uncertainty sets offer several advantages, including structured uncertainty representation, robust ranking mechanisms, and enhanced decision-making capabilities due to their ability to account for ambiguity. Future research directions are also outlined, including comparative analysis with existing MCDM methods under uncertainty, sensitivity analysis to assess the robustness of rankings, and broader application to various MCDM problems with diverse complexities. By exploring these avenues, uncertainty sets can be further established as a valuable tool for navigating uncertainty in complex decision-making scenarios.
Keywords: Uncertainty set, stealth combat aircraft selection, multiple criteria decision-making analysis, MCDM, uncertainty proximity analysis.
9822 Power Transformers Insulation Material Investigations: Partial Discharge
Authors: Jalal M. Abdallah
Abstract:
There is a great problem in testing and investigating the reliability of different types of transformer insulation materials. It can be summarized as how to create and simulate the real conditions of a working transformer and test its insulation materials for partial discharge (PD), typically as in the working mode. Many tests may give untrue results, as the physical behavior of the insulation material under test differs from that in its working condition. In this work, the real working conditions were simulated, and a large number of specimens were tested. The first stage of the investigation begins with choosing samples of different types of insulation materials (papers, pressboards, etc.). In the second stage, the samples were dried in ovens at 105 °C and 0.01 bar for 48 hours and then impregnated with dried, gas-free oil (water content less than 6 ppm) at 105 °C and 0.01 bar for 48 hours, after which the specimens were cooled at room pressure and temperature for 24 hours. The third stage is investigating PD for the samples using an ICM PD measuring device. After that, a continuous test on oil-impregnated insulation materials (paper, pressboards) was developed, and the phase-resolved partial discharge patterns of the PD signals were measured. The importance of this work lies in providing the industrial sector with trusted, highly accurate measurement results based on realistically simulated working conditions. All the PD patterns (results) are associated with discharges produced under well-controlled laboratory conditions, and they are compared with previous results from other laboratories. In addition, the influence of different temperature conditions on the partial discharge activities was studied.
Keywords: Transformers, insulation materials, voids, partial discharge (PD).
9821 Instant Location Detection of Objects Moving at High Speed in C-OTDR Monitoring Systems
Authors: Andrey V. Timofeev
Abstract:
A practical and efficient approach is suggested to estimate the instantaneous bounds of high-speed objects in C-OTDR monitoring systems. In the case of super-dynamic objects (trains, cars) it is difficult to obtain an adequate estimate of the instantaneous object localization because of the estimation lag. In other words, reliable estimation of the coordinates of a monitored object requires taking some time to collect observation data by means of the C-OTDR system, and only when the required sample volume has been collected can the final decision be issued. But this is contrary to the requirements of many real applications. For example, in rail traffic management systems we need to obtain the localization data of dynamic objects in real time. The way to solve this problem is to use a set of statistically independent parameters of the C-OTDR signals to obtain the most reliable solution in real time. Parameters of this type can be called "signaling parameters" (SP). There are several SPs which carry information about the instantaneous localization of dynamic objects for each of the C-OTDR channels. The problem is that some of these parameters are very sensitive to the dynamics of seismoacoustic emission sources but are non-stable, while, on the other hand, an SP that is very stable usually becomes insensitive. This report describes a method for co-processing the SPs which is designed to obtain the most effective dynamic object localization estimates within the C-OTDR monitoring system framework.
Keywords: C-OTDR-system, co-processing of signaling parameters, high-speed objects localization, multichannel monitoring systems.
9820 Fast Wavelet Image Denoising Based on Local Variance and Edge Analysis
Authors: Gaoyong Luo
Abstract:
The approach based on the wavelet transform has been widely used for image denoising due to its multi-resolution nature, its ability to produce high levels of noise reduction and the low level of distortion introduced. However, by removing noise, high frequency components belonging to edges are also removed, which leads to blurring of the signal features. This paper proposes a new method of image noise reduction based on local variance and edge analysis. The analysis is performed by dividing an image into 32 x 32 pixel blocks and transforming the data into the wavelet domain. A fast lifting wavelet spatial-frequency decomposition and reconstruction is developed, with the advantages of being computationally efficient and minimizing boundary effects. The adaptive thresholding by local variance estimation and edge strength measurement can effectively reduce image noise while preserving the features of the original image corresponding to the boundaries of the objects. Experimental results demonstrate that the method performs well for images contaminated by natural and artificial noise, and is suitable to be adapted for different classes of images and types of noise. The proposed algorithm provides a potential solution with parallel computation for real-time or embedded system applications.
Keywords: Edge strength, Fast lifting wavelet, Image denoising, Local variance.
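As a generic illustration of wavelet-domain thresholding (not the paper's local-variance and edge-adaptive rule), the sketch below decomposes a noisy synthetic image with the Haar wavelet, soft-thresholds the detail coefficients with a single global threshold, and reconstructs, using the PyWavelets package.

```python
# Generic sketch: wavelet soft-thresholding denoising of a synthetic image
# (a single global threshold, not the paper's local-variance / edge-adaptive rule).
import numpy as np
import pywt

rng = np.random.default_rng(5)
x, y = np.meshgrid(np.linspace(0, 1, 128), np.linspace(0, 1, 128))
image = (x > 0.5).astype(float) + 0.5 * np.sin(8 * np.pi * y)   # synthetic test image
noisy = image + 0.2 * rng.normal(size=image.shape)

coeffs = pywt.wavedec2(noisy, wavelet="haar", level=3)
sigma = 0.2                                           # noise level assumed known here
threshold = sigma * np.sqrt(2 * np.log(noisy.size))   # universal threshold
denoised_coeffs = [coeffs[0]]                         # keep the approximation untouched
for detail_level in coeffs[1:]:
    denoised_coeffs.append(tuple(pywt.threshold(d, threshold, mode="soft")
                                 for d in detail_level))
denoised = pywt.waverec2(denoised_coeffs, wavelet="haar")

print("noisy MSE:   ", round(float(np.mean((noisy - image) ** 2)), 4))
print("denoised MSE:", round(float(np.mean((denoised - image) ** 2)), 4))
```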