Search results for: mileage based reliability
10879 Lexicon-Based Sentiment Analysis for Stock Movement Prediction
Authors: Zane Turner, Kevin Labille, Susan Gauch
Abstract:
Sentiment analysis is a broad and expanding field that aims to extract and classify opinions from textual data. Lexicon-based approaches are based on the use of a sentiment lexicon, i.e., a list of words each mapped to a sentiment score, to rate the sentiment of a text chunk. Our work focuses on predicting stock price change using a sentiment lexicon built from financial conference call logs. We present a method to generate a sentiment lexicon based upon an existing probabilistic approach. By using a domain-specific lexicon, we outperform traditional techniques and demonstrate that domain-specific sentiment lexicons provide higher accuracy than generic sentiment lexicons when predicting stock price change.
Keywords: Lexicon, sentiment analysis, stock movement prediction, computational finance.
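A minimal sketch of the lexicon-based scoring step described in the abstract above, assuming a toy lexicon, a simple tokenizer, and a sign-of-score decision rule; these are illustrative placeholders, not the authors' domain-specific lexicon or trading signal.

    # Minimal sketch of lexicon-based sentiment scoring for a text chunk.
    # The lexicon entries and the decision rule are illustrative assumptions,
    # not the domain-specific lexicon built by the authors.
    import re

    lexicon = {"growth": 0.8, "strong": 0.6, "decline": -0.7, "impairment": -0.9, "headwind": -0.5}

    def sentiment_score(text, lexicon):
        """Average the sentiment of the lexicon words found in the text."""
        tokens = re.findall(r"[a-z]+", text.lower())
        scores = [lexicon[t] for t in tokens if t in lexicon]
        return sum(scores) / len(scores) if scores else 0.0

    call_log = "We expect strong growth despite a modest headwind in Q3."
    score = sentiment_score(call_log, lexicon)
    # A positive score would be read as predicting an upward price move, negative as downward.
    print("sentiment:", score, "-> predicted movement:", "up" if score > 0 else "down")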
10878 A Fast Adaptive Content-based Retrieval System of Satellite Images Database using Relevance Feedback
Authors: Hanan Mahmoud Ezzat Mahmoud, Alaa Abd El Fatah Hefnawy
Abstract:
In this paper, we present a system for content-based retrieval from a large database of classified satellite images, based on user relevance feedback (RF). In our proposed system, we divide each satellite image scene into small subimages, which are stored in the database. A modified radial basis function neural network plays an important role in clustering the subimages of the database according to the Euclidean distance between the query feature vector and the feature vectors of the other subimages. The advantage of using the RF technique in such queries is demonstrated by analyzing the database retrieval results.
Keywords: Content-based image retrieval, large image database, RBF neural network, relevance feedback.
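A brief sketch of the distance computation underlying the retrieval step described above: ranking stored subimage feature vectors by Euclidean distance to the query feature vector. The feature vectors are random placeholders; the modified RBF clustering and the relevance feedback loop of the paper are not reproduced here.

    # Sketch: rank database subimages by Euclidean distance to a query feature vector.
    # Feature vectors are random placeholders; the modified RBF clustering and the
    # relevance feedback loop of the paper are omitted.
    import numpy as np

    rng = np.random.default_rng(0)
    database = rng.random((1000, 32))   # 1000 stored subimage feature vectors, 32-D
    query = rng.random(32)              # feature vector of the query subimage

    distances = np.linalg.norm(database - query, axis=1)  # Euclidean distances
    top_k = np.argsort(distances)[:10]                     # indices of the 10 closest subimages
    print("closest subimages:", top_k)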
10877 Mathematical Model of Depletion of Forestry Resource: Effect of Synthetic Based Industries
Authors: Manisha Chaudhary, Joydip Dhar, Govind Prasad Sahu
Abstract:
A mathematical model is proposed considering the forest biomass density B(t), the density of wood-based industries W(t), and the density of synthetic industries S(t). It is assumed that the forest biomass grows logistically in the absence of wood-based industries, while the depletion of forestry biomass is due to the presence of wood-based industries. The growth of wood-based industries depends on B(t), while S(t) grows at a constant rate, independent of B(t). Further, there is competition between W(t) and S(t) according to market demand. The proposed model has four ecologically feasible steady states, namely E1: the forest-biomass-free and wood-industries-free equilibrium; E2: the wood-industries-free equilibrium; and two coexisting equilibria E∗1, E∗2. The behavior of the system near all feasible equilibria is analyzed using the stability theory of differential equations. In the proposed model, the natural depletion rate h1 is a crucial parameter, and the system exhibits Hopf bifurcation about the non-trivial equilibrium with respect to h1. The analytical results are verified using numerical simulation.
Keywords: Mathematical model, competition between wood-based and synthetic industries, Hopf bifurcation, stability analysis.
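A hedged numerical sketch of how a three-compartment model of this kind could be simulated; the interaction terms and coefficients below are plausible placeholders consistent with the verbal description in the abstract, not the exact functional forms or parameter values of the paper.

    # Hedged sketch: numerical simulation of a forest-biomass / wood-industry /
    # synthetic-industry system. The interaction terms and coefficients are
    # illustrative placeholders, not the authors' exact model.
    import numpy as np
    from scipy.integrate import solve_ivp

    def model(t, y, r=1.0, K=10.0, a=0.3, b=0.2, h1=0.1, s0=0.05, c=0.1):
        B, W, S = y
        dB = r * B * (1 - B / K) - a * B * W   # logistic growth minus depletion by wood industries
        dW = b * B * W - h1 * W - c * W * S    # growth driven by B, natural depletion h1, competition with S
        dS = s0 - c * W * S                    # constant growth rate, competition with W
        return [dB, dW, dS]

    sol = solve_ivp(model, (0, 200), [5.0, 1.0, 0.5])
    print("final state (B, W, S):", sol.y[:, -1])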
10876 Automatic Voice Classification System Based on Traditional Korean Medicine
Authors: Jaehwan Kang, Haejung Lee
Abstract:
This paper introduces an automatic voice classification system for the diagnosis of individual constitution based on Sasang Constitutional Medicine (SCM) in Traditional Korean Medicine (TKM). For the development of this algorithm, we used the voices of 309 female speakers and extracted a total of 134 speech features from voice data consisting of 5 sustained vowels and one sentence. The classification system, based on a rule-based algorithm derived from a nonparametric statistical method, produces 3 types of decisions: reserved, positive, and negative. In conclusion, 71.5% of the voice data were diagnosed by this system, of which 47.7% were correct positive decisions and 69.7% were correct negative decisions.
Keywords: Voice classifier, Sasang Constitutional Medicine, Traditional Korean Medicine, SCM, TKM.
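A small sketch of a three-way (reserved / positive / negative) rule-based decision of the kind described above, assuming a single speech feature compared against nonparametric percentile cut-offs; the feature and thresholds are illustrative, not the 134 features or the actual rules of the system.

    # Sketch of a three-way rule-based decision (reserved / positive / negative)
    # from a single speech feature and percentile cut-offs. The feature and the
    # thresholds are illustrative, not the system's 134 features or actual rules.
    import numpy as np

    rng = np.random.default_rng(1)
    reference = rng.normal(200.0, 20.0, 300)           # e.g. mean pitch (Hz) of reference speakers
    low, high = np.percentile(reference, [25, 75])     # nonparametric cut-offs

    def classify(feature_value):
        if feature_value > high:
            return "positive"
        if feature_value < low:
            return "negative"
        return "reserved"   # too close to the cut-offs to decide

    print(classify(230.0), classify(195.0), classify(160.0))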
10875 Task-Based Language Teaching: A Paradigm Shift in ESL/EFL Teaching and Learning: A Case Study-Based Approach
Authors: Zehra Sultan
Abstract:
The study is based on the Task-Based Language Teaching (TBLT) approach, which is found to be very effective in the EFL/ESL classroom. This approach engages learners in acquiring authentic language skills by interacting with the real world through a sequence of pedagogical tasks. The use of technology enhances the effectiveness of this approach. This study throws light on the historical background of TBLT and its efficacy in the EFL/ESL classroom. In addition, it describes the implementation of this approach in the General Foundation Program (GFP) of Muscat College, Oman, and furnishes the list of pedagogical tasks embedded in the language curriculum of the GFP, which are aligned with the College's graduate attributes. Moreover, the study discusses the challenges pertaining to this approach from the point of view of teachers, students, and classroom application. Finally, the operational success of this methodology is gauged through the formative assessments of the GFP, as is apparent in the students' progress.
Keywords: Task-based language teaching, authentic language, communicative approach, real world activities, ESL/EFL activities.
10874 Lexicon-Based Sentiment Analysis for Stock Movement Prediction
Authors: Zane Turner, Kevin Labille, Susan Gauch
Abstract:
Sentiment analysis is a broad and expanding field that aims to extract and classify opinions from textual data. Lexicon-based approaches are based on the use of a sentiment lexicon, i.e., a list of words each mapped to a sentiment score, to rate the sentiment of a text chunk. Our work focuses on predicting stock price change using a sentiment lexicon built from financial conference call logs. We introduce a method to generate a sentiment lexicon based upon an existing probabilistic approach. By using a domain-specific lexicon, we outperform traditional techniques and demonstrate that domain-specific sentiment lexicons provide higher accuracy than generic sentiment lexicons when predicting stock price change.
Keywords: Computational finance, sentiment analysis, sentiment lexicon, stock movement prediction.
10873 Effect of Progressive Type-I Right Censoring on Bayesian Statistical Inference of Simple Step–Stress Acceleration Life Testing Plan under Weibull Life Distribution
Authors: Saleem Z. Ramadan
Abstract:
This paper discusses the effects of using progressive Type-I right censoring on the design of simple step-stress accelerated life testing using a Bayesian approach for Weibull life products under the assumption of the cumulative exposure model. The optimization criterion used in this paper is to minimize the expected pre-posterior variance of the Pth percentile time to failure. The model variables are the stress changing time and the stress value for the first step. A comparison between conventional and progressive Type-I right censoring is provided. The results show that progressive Type-I right censoring reduces the cost of testing at the expense of test precision when the sample size is small. Moreover, the results show that using strong priors or a large sample size reduces the sensitivity of the test precision to the censoring proportion. Hence, progressive Type-I right censoring is recommended in these cases, as it reduces the cost of the test without greatly affecting its precision. The results also show that the choice of direct or indirect priors affects the precision of the test.
Keywords: Reliability, Accelerated life testing, Cumulative exposure model, Bayesian estimation, Progressive Type-I censoring, Weibull distribution.
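A short sketch of the cumulative exposure model assumed above, for a simple (two-level) step-stress test with Weibull lifetimes sharing a common shape parameter: before the stress-change time the first-level CDF applies, and after it the second-level CDF is evaluated at a time shifted by the equivalent exposure. The scale values and stress-change time below are illustrative, not the paper's optimized plan.

    # Sketch of the cumulative exposure model for a simple (two-level) step-stress
    # test with Weibull lifetimes sharing a common shape beta. The scale values and
    # stress-change time are illustrative, not the paper's optimized plan.
    import numpy as np

    def weibull_cdf(t, beta, theta):
        return 1.0 - np.exp(-(t / theta) ** beta)

    def cumulative_exposure_cdf(t, tau, beta, theta1, theta2):
        """F(t) = F1(t) for t < tau; F(t) = F2(t - tau + s), where F2(s) = F1(tau)."""
        s = tau * theta2 / theta1                  # equivalent exposure time at the second stress level
        t = np.asarray(t, dtype=float)
        shifted = np.maximum(t - tau + s, 0.0)     # avoid negative arguments in the unused branch
        return np.where(t < tau,
                        weibull_cdf(t, beta, theta1),
                        weibull_cdf(shifted, beta, theta2))

    times = np.linspace(0, 500, 6)
    print(cumulative_exposure_cdf(times, tau=150.0, beta=1.5, theta1=400.0, theta2=120.0))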
10872 An UML Statechart Diagram-Based MM-Path Generation Approach for Object-Oriented Integration Testing
Authors: Ruilian Zhao, Ling Lin
Abstract:
MM-Path, an acronym for Method/Message Path, describes the dynamic interactions between methods in object-oriented systems. This paper discusses the classification of MM-Paths based on the characteristics of object-oriented software. We categorize them according to the generation reasons, the effect scope, and the composition of the MM-Path. A formalized representation of MM-Path is also proposed, which considers the influence of state on the response method sequences of messages. Moreover, an automatic MM-Path generation approach based on the UML Statechart diagram is presented, which resolves the difficulties in identifying and generating MM-Paths. As a result, it provides a solid foundation for further research on test case generation based on MM-Paths.
Keywords: MM-Path, Message Sequence, Object-Oriented Integration Testing, Response Method Sequence, UML Statechart Diagram.
10871 Prediction of Unsteady Forced Convection over Square Cylinder in the Presence of Nanofluid by Using ANN
Authors: Ajoy Kumar Das, Prasenjit Dey
Abstract:
Heat transfer due to forced convection of a copper-water nanofluid has been predicted by an Artificial Neural Network (ANN). The nanofluid is formed by mixing copper nanoparticles in water; the volume fractions considered here range from 0% to 15%, and the Reynolds number is kept constant at 100. The back propagation algorithm is used to train the network. The present ANN is trained with input and output data obtained from numerical simulations performed in the finite-volume-based commercial Computational Fluid Dynamics (CFD) software Ansys Fluent. The numerical simulation results are compared with the back-propagation-based ANN results. It is found that the forced convection heat transfer of the water-based nanofluid can be predicted correctly by the ANN. It is also observed that the back propagation ANN can predict the heat transfer characteristics of the nanofluid very quickly compared to the standard CFD method.
Keywords: Forced convection, square cylinder, nanofluid, neural network.
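A compact sketch of the prediction setup described above: training a small back-propagation network to map the nanoparticle volume fraction to a heat transfer quantity. The synthetic training pairs stand in for the Ansys Fluent results, and scikit-learn's MLPRegressor stands in for the authors' network; both are assumptions made for illustration.

    # Sketch: train a small back-propagation network to predict a heat transfer
    # quantity (e.g. an average Nusselt number) from nanoparticle volume fraction.
    # The synthetic training pairs stand in for the CFD (Ansys Fluent) results,
    # and scikit-learn's MLPRegressor stands in for the authors' network.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    phi = np.linspace(0.0, 0.15, 16).reshape(-1, 1)        # volume fractions 0% .. 15%
    nu = 4.0 + 8.0 * phi.ravel() + 0.05 * np.random.default_rng(0).normal(size=16)  # placeholder "CFD" data

    net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
    net.fit(phi, nu)                                        # back-propagation training

    print("predicted value at 10% volume fraction:", net.predict([[0.10]])[0])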
10870 A Review on Light Shafts Rendering for Indoor Scenes
Authors: Hatam H. Ali, Mohd Shahrizal Sunar, Hoshang Kolivand, Mohd Azhar Bin M. Arsad
Abstract:
Rendering light shafts is one of the important topics in computer gaming and interactive applications. The methods and models used to generate light shafts play a crucial role in making a scene more realistic in computer graphics. This article discusses the image-based and geometric-based shadows that contribute to generating volumetric shadows and light shafts, based on ray tracing, radiosity, and ray marching techniques. The main aim of this study is to provide researchers with background on the progress of light scattering methods so that they can determine the technique best suited to their goals. It is also hoped that our classification helps researchers find solutions to the shortcomings of each method.
Keywords: Shafts of light, realistic images, image-based, geometric-based.
10869 Feature-Based Machining using Macro
Authors: M. Razak, A. Jusoh, A. Zakaria
Abstract:
This paper presents ongoing research work on the implementation of feature-based machining via macro programming. Repetitive machining features such as holes, slots, and pockets can readily be encapsulated in macros. Each macro consists of methods describing how to machine the shape defined by the feature. The macro programming technique comprises a main program and subprograms. The main program allows the user to select several subprograms that contain features and to define their important parameters. With macros, complex machining routines can be implemented easily and no post processor is required. A case study on machining a part comprising planar face, hole, and pocket features using the macro programming technique was carried out. It is envisaged that the macro programming technique can be extended to other feature-based machining fields such as the newly developed STEP-NC domain.
Keywords: Feature-based machining, CNC, macro, STEP-NC.
10868 Probability-Based Damage Detection of Structures Using Model Updating with Enhanced Ideal Gas Molecular Movement Algorithm
Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee
Abstract:
Model updating methods have received increasing attention for damage detection in structures based on measured modal parameters. Therefore, a probability-based damage detection (PBDD) procedure built on a model updating procedure is presented in this paper, in which a one-stage model-based damage identification technique based on the dynamic features of a structure is investigated. The presented framework uses a finite element updating method with a Monte Carlo simulation that considers the uncertainty caused by measurement noise. Enhanced ideal gas molecular movement (EIGMM) is used as the main algorithm for model updating. Ideal gas molecular movement (IGMM) is a multiagent algorithm inspired by the movement of ideal gas molecules, which disperse rapidly in all directions and cover the whole available space, a behavior embedded in the high speed of the molecules and in their collisions with each other and with the surrounding barriers. In the IGMM algorithm, to reach optimal solutions, an initial population of gas molecules is randomly generated, and the governing equations for the velocity of the gas molecules and the collisions between them are utilized. In this paper, an enhanced version of IGMM, which removes unchanged variables after a specified number of iterations, is developed. The proposed method is applied to two numerical examples in the field of structural damage detection. The results show that the proposed method performs well and is competitive in the PBDD of structures.
Keywords: Enhanced ideal gas molecular movement, ideal gas molecular movement, model updating method, probability-based damage detection, uncertainty quantification.
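A sketch of the enhancement idea named above, removing variables that have stopped changing after a specified number of iterations, shown here on a generic random-perturbation search. This is an illustration of that single idea only; it is not the IGMM velocity and collision equations, which are not detailed in the abstract.

    # Sketch of the "enhancement" idea: after a specified number of iterations,
    # variables whose best value has not changed are frozen and removed from
    # further search. The underlying optimizer here is a plain random-perturbation
    # search, not the actual IGMM update equations.
    import numpy as np

    def objective(x):                       # placeholder damage-identification objective
        return np.sum((x - 0.3) ** 2)

    rng = np.random.default_rng(0)
    x = rng.random(8)                       # e.g. candidate stiffness-reduction factors per element
    active = np.ones(8, dtype=bool)         # variables still being searched
    previous = x.copy()

    for it in range(1, 201):
        trial = x.copy()
        trial[active] += rng.normal(0.0, 0.05, active.sum())
        if objective(trial) < objective(x):
            x = trial
        if it % 50 == 0:                    # every 50 iterations, freeze unchanged variables
            unchanged = np.abs(x - previous) < 1e-3
            active &= ~unchanged
            previous = x.copy()

    print("identified parameters:", np.round(x, 3), "still active:", active.sum())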
10867 Knowledge Representation Based On Interval Type-2 CFCM Clustering
Authors: Myung-Won Lee, Keun-Chang Kwak
Abstract:
This paper is concerned with knowledge representation and the extraction of fuzzy if-then rules using Interval Type-2 Context-based Fuzzy C-Means clustering (IT2-CFCM) with the aid of fuzzy granulation. The proposed clustering algorithm is based on information granulation in the form of Interval Type-2 Fuzzy C-Means (IT2-FCM) clustering and estimates the cluster centers by preserving the homogeneity between the clustered patterns from the IT2 contexts produced in the output space. Furthermore, we can obtain an automatic knowledge representation in the design of Radial Basis Function Networks (RBFN), the Linguistic Model (LM), and Adaptive Neuro-Fuzzy Networks (ANFN) from numerical input-output data pairs. We focus on the design of an ANFN in this paper. The experimental results on an energy-performance estimation problem reveal that the proposed method provides a good knowledge representation and good performance in comparison with previous works.
Keywords: IT2-FCM, IT2-CFCM, context-based fuzzy clustering, adaptive neuro-fuzzy network, knowledge representation.
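A short sketch of the standard (type-1) Fuzzy C-Means update that the IT2-CFCM method builds on; the interval type-2 footprint of uncertainty and the context weights produced in the output space are not reproduced here, and the data are toy values.

    # Sketch of the standard (type-1) Fuzzy C-Means update that IT2-CFCM builds on;
    # the interval type-2 extension and the output-space context weights are omitted.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])  # toy data
    c, m = 2, 2.0
    centers = X[rng.choice(len(X), c, replace=False)]

    for _ in range(30):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12          # (n, c) distances
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)   # memberships
        um = u ** m
        centers = (um.T @ X) / um.sum(axis=0)[:, None]                                   # weighted centers

    print("cluster centers:")
    print(np.round(centers, 2))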
10866 Validation of Automotive Centrals Using Hardware in the Loop-Body Control Unit and Lights
Authors: Marley Rosa Luciano, Rodney Rezende Saldanha
Abstract:
The race for electrification and the need for innovation to attract customers have led the automotive industry to do something different with vehicles. New emission control challenges and the availability of efficient technology are the pillars of this development. The growing demand to upgrade industrial manufacturing systems creates actions that directly impact vehicle production. With this comes the search for new prototyping methods and virtual tools for testing and validating components and vehicle systems. The demand for Electronic Control Units (ECUs) is increasing due to the availability of intelligence and safety in today's vehicles, directly affecting their development, performance, and functional testing. In order to keep up with global changes, the automotive industry uses different virtual environments to produce, verify, and validate its vehicles and the test prototypes used during development. Therefore, in this paper, integration and validation were performed using the Hardware in the Loop (HIL) test platform, focusing on the ECU Body Control Module (BCM). A brief commentary then reviews other test platforms, such as the Plywood Buck (PWB), and examines the reliability, flexibility, installation time, and cost of the three test platforms, Software in the Loop (SIL), Model in the Loop (MIL), and HIL, to review their benefits, challenges, and issues in use, and to provide information for optimizing the use of each platform and test medium.
Keywords: Automotive, Electronic Central Unit, xIL, Hardware in the loop.
10865 Estimating Word Translation Probabilities for Thai – English Machine Translation using EM Algorithm
Authors: Chutchada Nusai, Yoshimi Suzuki, Haruaki Yamazaki
Abstract:
Selecting the word translation from a set of target language words, one that conveys the correct sense of the source word and produces more fluent target language output, is one of the core problems in machine translation. In this paper we compare three methods of estimating word translation probabilities for selecting the translation word in Thai-English machine translation: (1) a method based on the frequency of word translations, (2) a method based on the collocation of word translations, and (3) a method based on the Expectation Maximization (EM) algorithm. For evaluation we used Thai-English parallel sentences generated by NECTEC. The method based on the EM algorithm performs best in comparison to the other methods and gives satisfying results.
Keywords: Machine translation, EM algorithm.
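A compact sketch of EM estimation of word translation probabilities in the style of IBM Model 1, run on a toy parallel corpus; the Thai-English NECTEC data and the authors' exact formulation are of course not reproduced here.

    # Sketch of EM estimation of word translation probabilities t(e|f) in the style
    # of IBM Model 1 on a toy parallel corpus; the NECTEC Thai-English data and the
    # paper's exact formulation are not reproduced here.
    from collections import defaultdict

    corpus = [(["maison", "bleue"], ["blue", "house"]),
              (["maison"], ["house"]),
              (["fleur", "bleue"], ["blue", "flower"])]

    e_vocab = {e for _, es in corpus for e in es}
    t = defaultdict(lambda: 1.0 / len(e_vocab))          # uniform initialization of t(e|f)

    for _ in range(10):                                  # EM iterations
        count = defaultdict(float)
        total = defaultdict(float)
        for fs, es in corpus:
            for e in es:                                 # E-step: expected alignment counts
                norm = sum(t[(e, f)] for f in fs)
                for f in fs:
                    c = t[(e, f)] / norm
                    count[(e, f)] += c
                    total[f] += c
        for (e, f), c in count.items():                  # M-step: re-normalize
            t[(e, f)] = c / total[f]

    print("t(house|maison) =", round(t[("house", "maison")], 3))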
10864 A SWOT Analysis on Institutional Environments of University of the Punjab
Authors: Saghir Ahmad, Abid Hussain Ch., Atif Khalil, Misbah Malik
Abstract:
The major purpose of the study was to identify the strengths, weaknesses, opportunities, and threats of the institutional environments of the University of the Punjab, Lahore. The target population of the study was teachers of the University of the Punjab, Lahore. A sample of 235 teachers (155 males, 80 females) was selected through a multistage stratified sampling technique. A questionnaire regarding the SWOT analysis (strengths, weaknesses, opportunities, and threats) of the university's institutional environments was used to collect the required data. The questionnaire consisted of two parts: the first part comprised demographic information (faculty, department, gender, teacher rank), while the second part included statements regarding the SWOT analysis. The reliability index (Cronbach's alpha) of the questionnaire was 0.87, which is statistically acceptable. Analysis of the data indicated a significant difference in the opinions of respondents. Teachers of Islamic Studies and Law differed in their opinions regarding the strengths and opportunities of the institutional environment, and this was supported by the findings of the study. There was a significant difference in the opinions of male and female teachers regarding the strengths and opportunities of the university, but no significant difference regarding its weaknesses and threats.
Keywords: Institutional environments, SWOT analysis, teachers, University of the Punjab.
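As a side note on the reliability index reported above, Cronbach's alpha can be computed directly from item responses; the sketch below uses made-up Likert-scale data, not the study's questionnaire.

    # Sketch of the Cronbach's alpha computation behind the reported reliability
    # index (0.87). The Likert-scale responses below are made-up placeholders,
    # not the study's questionnaire data.
    import numpy as np

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(235, 1))                                           # 235 respondents
    items = np.clip(np.rint(3 + latent + rng.normal(0, 0.7, (235, 20))), 1, 5)   # 20 Likert items

    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    alpha = k / (k - 1) * (1 - item_var / total_var)
    print("Cronbach's alpha:", round(alpha, 2))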
10863 Web-Based Architecture of a System for Design Assessment of Night Vision Devices
Authors: Daniela I. Borissova, Ivan C. Mustakerov, Evgeni D. Bantutov
Abstract:
Nowadays, night vision devices are widely used for both military and civil applications. The variety of night vision applications requires a variety of night vision device designs. A web-based architecture of a software system for design assessment of night vision devices before production is developed. The proposed architecture of the web-based system is based on the application of a mathematical model for the design of night vision devices. An algorithm with two components, one for iterative design and one for intelligent design, is developed and integrated into the system architecture. The iterative component suggests compatible module combinations to choose from. The intelligent component provides compatible combinations of modules satisfying given user requirements on device parameters. The proposed web-based architecture of a system for design assessment of night vision devices is tested via a prototype of the system. The testing showed the applicability of both the iterative and the intelligent components of the algorithm.
Keywords: Night vision devices, design modeling, software architecture, web-based system.
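A toy sketch of the intelligent component described above: enumerating module combinations and keeping those that are compatible and satisfy user requirements on device parameters. The module names, parameters, compatibility rule, and requirements are invented for illustration and are not taken from the paper's mathematical model.

    # Toy sketch of the "intelligent design" component: enumerate module
    # combinations and keep those compatible and satisfying user requirements.
    # Module names, parameters and the compatibility rule are invented here.
    from itertools import product

    objectives = [{"name": "O1", "f_number": 1.2}, {"name": "O2", "f_number": 1.5}]
    tubes = [{"name": "T1", "gain": 30000}, {"name": "T2", "gain": 50000}]
    oculars = [{"name": "E1", "magnification": 1.0}, {"name": "E2", "magnification": 3.0}]

    def compatible(obj, tube, ocular):
        return obj["f_number"] <= 1.5          # placeholder compatibility rule

    def satisfies(obj, tube, ocular, req):
        return tube["gain"] >= req["min_gain"] and ocular["magnification"] >= req["min_mag"]

    requirements = {"min_gain": 40000, "min_mag": 1.0}
    designs = [(o["name"], t["name"], e["name"])
               for o, t, e in product(objectives, tubes, oculars)
               if compatible(o, t, e) and satisfies(o, t, e, requirements)]
    print("candidate designs:", designs)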
10862 Impact of Liquidity Crunch on Interbank Network
Authors: I. Lucas, N. Schomberg, F-A. Couturier
Abstract:
Most empirical studies have analyzed how liquidity risks faced by individual institutions turn into systemic risk. The recent banking crisis has highlighted the importance of grasping and controlling systemic risk, and the willingness of central banks to ease their monetary policies to save defaulting or illiquid banks. This last point suggests that banks may pay less attention to liquidity risk, which, in turn, can become an important new channel of loss. Financial regulation focuses on the most important and "systemic" banks in the global network. However, to quantify the expected loss associated with liquidity risk, it is worth analyzing the sensitivity to this channel of the various elements of the global bank network. A small bank is not considered potentially systemic; however, the interaction of many small banks together can become a systemic element. This paper analyzes the impact of the interaction of medium and small banks on a set of banks considered the core of the network. The proposed method uses the structure of an agent-based model in a two-class environment. In the first class, data from the actual balance sheets of 22 large and systemic banks (such as BNP Paribas or Barclays) are collected. In the second, to model the network as closely as possible to the actual interbank market, 578 fictitious banks smaller than those in the first class are split into two groups of small and medium banks. All banks are active on the European interbank network and have deposit and market activity. A simulation of 12 three-month periods, representing a medium-term interval of three years, is projected. In each period, there is a set of behavioral descriptions: repayment of matured loans, liquidation of deposits, income from securities, collection of new deposits, new demands for credit, and securities sales. The last two actions are part of the refunding process developed in this paper. To strengthen the reliability of the proposed model, random parameter dynamics are managed with stochastic equations, the rate variations being generated by the Vasicek model. The Central Bank is considered the lender of last resort, which allows banks to borrow at the repo rate, and some conditions for ejecting banks from the system are introduced.
A liquidity crunch due to an exogenous crisis is simulated in the first class, and the loss impact on the other bank classes is analyzed through aggregate values representing the aggregate of loans and/or the aggregate of borrowing between classes. It is mainly shown that the three groups of the European interbank network do not have the same response, and that intermediate banks are the most sensitive to liquidity risk.
Keywords: Systemic Risk, Financial Contagion, Liquidity Risk, Interbank Market, Network Model.
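A brief sketch of the Vasicek dynamics used above to generate rate variations, dr = a(b - r)dt + sigma dW, simulated here with an Euler scheme; the mean-reversion speed, long-run level, and volatility are illustrative values, not the paper's calibration.

    # Sketch of the Vasicek model used to generate the rate dynamics:
    # dr = a*(b - r)*dt + sigma*dW, simulated with an Euler scheme.
    # The mean-reversion speed, long-run level and volatility are illustrative.
    import numpy as np

    a, b, sigma = 0.5, 0.03, 0.01      # mean-reversion speed, long-run rate, volatility
    dt, n_steps = 1.0 / 4.0, 12        # quarterly steps over the 12 three-month periods
    rng = np.random.default_rng(0)

    r = np.empty(n_steps + 1)
    r[0] = 0.02                        # initial rate
    for k in range(n_steps):
        r[k + 1] = r[k] + a * (b - r[k]) * dt + sigma * np.sqrt(dt) * rng.normal()

    print("simulated rate path:", np.round(r, 4))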
10861 Review and Comparison of Associative Classification Data Mining Approaches
Authors: Suzan Wedyan
Abstract:
Associative classification (AC) is a data mining approach that combines association rule mining and classification to build classification models (classifiers). AC has attracted significant attention from researchers mainly because it derives accurate classifiers that contain simple yet effective rules. In the last decade, a number of associative classification algorithms have been proposed, such as Classification Based on Association (CBA), Classification based on Multiple Association Rules (CMAR), Class based Associative Classification (CACA), and Classification based on Predictive Association Rules (CPAR). This paper surveys the major AC algorithms and compares the steps and methods performed in each algorithm, including rule learning, rule sorting, rule pruning, classifier building, and class prediction.
Keywords: Associative Classification, Classification, Data Mining, Learning, Rule Ranking, Rule Pruning, Prediction.
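A minimal sketch of two of the shared steps named above, rule sorting (by confidence, then support) and class prediction by the first matching rule, in the style of CBA; the rules themselves are toy examples, and rule learning and pruning are omitted.

    # Minimal sketch of rule sorting (by confidence, then support) and class
    # prediction by the first matching rule, in the style of CBA. The rules are
    # toy examples; rule learning and pruning are omitted.
    rules = [
        {"items": {"outlook=sunny", "humidity=high"}, "cls": "no",  "conf": 0.90, "supp": 0.20},
        {"items": {"outlook=overcast"},               "cls": "yes", "conf": 1.00, "supp": 0.25},
        {"items": {"windy=false"},                    "cls": "yes", "conf": 0.70, "supp": 0.40},
    ]

    rules.sort(key=lambda r: (r["conf"], r["supp"]), reverse=True)   # rule sorting

    def predict(instance, rules, default="yes"):
        for r in rules:                                              # first matching rule wins
            if r["items"] <= instance:
                return r["cls"]
        return default                                               # default class

    print(predict({"outlook=sunny", "humidity=high", "windy=true"}, rules))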
10860 A Novel Non-Uniformity Correction Algorithm Based On Non-Linear Fit
Authors: Yang Weiping, Zhang Zhilong, Zhang Yan, Chen Zengping
Abstract:
Infrared focal plane array (IRFPA) sensors, due to their high sensitivity, high frame rate, and simple structure, have become the most prominently used detectors in military applications. However, they suffer from a common problem called fixed pattern noise (FPN), which severely degrades image quality and limits infrared imaging applications. Therefore, it is necessary to perform non-uniformity correction (NUC) on IR images. Non-uniformity correction algorithms are classified into two main categories: calibration-based and scene-based algorithms. Both have shortcomings, hence a novel non-uniformity correction algorithm based on non-linear fit is proposed, which combines the advantages of the two categories. Experimental results show that the proposed algorithm achieves a good NUC effect with a lower non-uniformity ratio.
Keywords: Non-uniformity correction, non-linear fit, two-point correction, temporal Kalman filter.
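A simplified sketch of per-pixel non-linear (here quadratic) fitting between each detector element's raw response and a reference response over several calibration levels; the data are synthetic, and the paper's actual combination of calibration-based and scene-based steps (and the temporal Kalman filter) is not reproduced.

    # Simplified sketch of per-pixel non-linear-fit correction: fit a quadratic
    # between each detector element's raw response and a reference response over
    # several calibration levels. The data are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(0)
    levels = np.array([20.0, 50.0, 80.0, 110.0])                # reference radiance levels
    gain = 1.0 + 0.1 * rng.normal(size=(8, 8))                  # per-pixel gain non-uniformity
    offset = 2.0 * rng.normal(size=(8, 8))                      # per-pixel offset non-uniformity

    def detector_response(level):                               # mildly non-linear synthetic response
        return gain * level + offset + 0.0002 * level ** 2

    raw = np.stack([detector_response(L) for L in levels])      # (4, 8, 8) calibration frames

    coeffs = np.empty((8, 8, 3))
    for i in range(8):
        for j in range(8):                                      # quadratic fit raw -> reference, per pixel
            coeffs[i, j] = np.polyfit(raw[:, i, j], levels, deg=2)

    frame = detector_response(65.0)                             # uncorrected frame at an unseen level
    a2, a1, a0 = coeffs[..., 0], coeffs[..., 1], coeffs[..., 2]
    corrected = a2 * frame ** 2 + a1 * frame + a0               # apply per-pixel polynomials
    print("residual non-uniformity (std):", round(float(corrected.std()), 4))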
10859 Layout Based Spam Filtering
Authors: Claudiu N. Musat
Abstract:
Due to the constant increase in the volume of information available to applications in fields ranging from medical diagnosis to web search engines, accurate support of similarity becomes an important task. This is also the case for spam filtering techniques, where the similarities between known and incoming messages are the foundation of the spam/not-spam decision. We present a novel approach to filtering based solely on layout, whose goal is not only to correctly identify spam but also to warn about major emerging threats. We propose a mathematical formulation of the email message layout and, based on it, we elaborate an algorithm to separate different types of emails and find the new, numerically relevant spam types.
Keywords: Clustering, layout, k-means, spam.
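A small sketch of clustering messages by layout features with k-means, as the keywords above suggest; the layout feature vectors (link count, image count, text-block count, HTML-to-text ratio) are invented placeholders, not the paper's mathematical formulation of layout.

    # Small sketch of clustering e-mails by layout features with k-means.
    # The layout features used here are invented placeholders.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    ham = rng.normal([2, 0, 6, 0.2], 0.5, (100, 4))     # few links/images, mostly text
    spam = rng.normal([15, 5, 2, 0.9], 0.5, (40, 4))    # link/image heavy layouts
    X = np.vstack([ham, spam])

    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
    sizes = np.bincount(km.labels_)
    print("cluster sizes:", sizes)     # a small, dense new cluster may flag an emerging spam type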
10858 A Study on Performance-Based Design Analysis for Vertical Extension of Apartment Units
Authors: Minsun Kim, Ki-Sun Choi, Hyun-Jee Lee, Young-Chan You
Abstract:
There is no reinforcement example for the renovation by vertical and horizontal extension of existing shear-wall-type apartment structures in Korea. Such shear wall structures are rare overseas, while Korea has many shear wall apartment units. Recently, in Korea, a few researchers have been trying to confirm the possibility of vertical extension of existing buildings with shear walls. This study evaluates the possibility of such renovation by applying performance-based seismic design to existing buildings with shear walls in the analysis phase of the structure. In addition, force-based seismic design, as used by general structural engineers in Korea, is carried out to compare the amount of reinforcement of the walls, which are the main component of the wall structure. As a result, we suggest that performance-based design offers more economical advantages than force-based seismic design.
Keywords: Vertical extension, performance-based design, renovation, shear wall structure, structural analysis.
10857 A Purpose Based Usage Access Control Model
Abstract:
As privacy becomes a major concern for consumers and enterprises, much research has focused on privacy-protecting technology in recent years. In this paper, we present a comprehensive approach for usage access control based on the notion of purpose. In our model, purpose information associated with a given data element specifies the intended use of the subjects and objects in the usage access control model. A key feature of our model is that, when access is requested, the access purpose is checked against the intended purposes for the data item. We propose an approach to represent purpose information to support access control based on purpose information. Our proposed solution relies on usage access control (UAC) models as well as on components based on the notion of purpose information used in subjects and objects. Finally, comparisons with related works are analyzed.
Keywords: Purpose, privacy, access control, authorization.
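A minimal sketch of the central check described above: comparing the stated access purpose of a request against the intended purposes attached to a data item. The item names and allowed/prohibited purpose sets are illustrative; the purpose hierarchy and full UAC policy components of the model are not reproduced.

    # Minimal sketch: the access purpose of a request is compared against the
    # intended purposes attached to the data item. Item names and purpose sets
    # are illustrative placeholders.
    data_items = {
        "customer_email": {"allowed_purposes": {"billing", "support"},
                           "prohibited_purposes": {"marketing"}},
    }

    def access_allowed(item, access_purpose):
        meta = data_items[item]
        return (access_purpose in meta["allowed_purposes"]
                and access_purpose not in meta["prohibited_purposes"])

    print(access_allowed("customer_email", "support"))    # True
    print(access_allowed("customer_email", "marketing"))  # False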
10856 Designing an Online Case-Based Library for Technology Integration in Teacher Education
Authors: Mustafa Tevfik Hebebci, Sirin Kucuk, Ismail Celik, A. Oguz Akturk, Ismail Sahin, Fetah Eren
Abstract:
The purpose of this paper is to introduce an interactive online case-study library website developed in a national project. The design goal of the website is to provide an interactive, enhanced, case-based, online educational resource for educators within the scope of the national project. The ADDIE instructional design model was used in the development of the website for the interactive case-based library. The library is developed on a web-based platform, which is important in terms of the manageability, accessibility, and updateability of data. Users are able to sort the displayed case studies by their titles, dates, ratings, view counts, etc. A usability test and expert opinion were used to evaluate the website. This website is a tool to integrate technology into education. It is believed that this website will be beneficial for pre-service and in-service teachers in terms of their professional development.
Keywords: ADDIE, Case-based library, Design, Technology, Integration.
10855 Numerical Analysis of the Performance of the DU91-W2-250 Airfoil for Straight-Bladed Vertical-Axis Wind Turbine Application
Authors: M. Raciti Castelli, G. Grandi, E. Benini
Abstract:
This paper presents a numerical analysis of the performance of a three-bladed Darrieus vertical-axis wind turbine based on the DU91-W2-250 airfoil. A complete campaign of 2-D simulations, based on unsteady RANS calculations and covering several values of the tip speed ratio, has been carried out to obtain the rotor torque and power curves. Rotor performance has been compared with the results of a previous work based on the NACA 0021 airfoil. Both the power coefficient and the torque coefficient have been determined as functions of the tip speed ratio. The flow field around the rotor blades has also been analyzed. As a final result, the performance of the DU-airfoil-based rotor appears to be lower than that of the rotor based on the NACA 0021 blade section. This behavior could be due to the better stall characteristics of the NACA profile, the separation zone at the trailing edge being more extended for the DU airfoil.
Keywords: CFD, vertical axis wind turbine, DU91-W2-250, NACA 0021.
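For reference, the power coefficient and torque coefficient mentioned above are linked through the tip speed ratio, Cp = lambda * Cq; the short check below illustrates this with purely illustrative rotor and flow values, not the paper's turbine data.

    # The rotor power coefficient and torque coefficient are linked through the tip
    # speed ratio: Cp = lambda * Cq. The numbers below are illustrative only.
    rho, A, R, V = 1.225, 1.236, 0.515, 9.0     # air density, swept area, radius, wind speed (illustrative)
    lam = 2.5                                   # tip speed ratio
    torque = 1.2                                # rotor torque in N*m (illustrative)

    cq = torque / (0.5 * rho * A * V ** 2 * R)  # torque coefficient
    omega = lam * V / R                         # rotor angular speed
    cp = torque * omega / (0.5 * rho * A * V ** 3)
    print(round(cp, 4), round(lam * cq, 4))     # the two values coincide: Cp = lambda * Cq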
10854 Multi-Objective Optimization for Performance-based Seismic Retrofit using Connection Upgrade
Authors: Dong-Chul Lee, Byung-Kwan Oh, Se-Woon Choi, Hyo-Sun Park
Abstract:
Unanticipated brittle fractures of connections in steel moment resisting frames (SMRF) occurred in the 1994 Northridge earthquake. Since then, research on the vulnerability of connections in existing SMRFs and on the rehabilitation of those buildings has been conducted. This paper suggests a performance-based optimal seismic retrofit technique using connection upgrade. For the optimal design, a multi-objective genetic algorithm (NSGA-II) is used. One of the two objective functions is to minimize the initial cost, and the other is to minimize the lifetime seismic damage cost. The optimization proposed in this paper is performed while satisfying the specified performance objectives based on FEMA 356. Nonlinear static analysis is performed for the structural seismic performance evaluation. A numerical example of the SAC benchmark SMRF is provided using the performance-based optimal seismic retrofit technique proposed in this paper.
Keywords: Connection upgrade, performance-based seismic design, seismic retrofit, multi-objective optimization.
10853 Optimization of Supersonic Ejector via Sequence-Adapted Micro-Genetic Algorithm
Authors: Kolar Jan, Dvorak Vaclav
Abstract:
In this study, an optimization of a supersonic air-to-air ejector is carried out by a recently developed single-objective genetic algorithm based on the adaptation of a sequence of individuals. The adaptation of the sequence is based on the shape-based distance between individuals and an embedded micro-genetic algorithm. The optimal sequence found defines the succession of CFD-based objective calculations within each generation of the regular micro-genetic algorithm. A spring-based deformation mutates the computational grid, starting from the initial individual, via the adapted population in the optimized sequence. The selection of a generation's initial individual is knowledge-based. A direct comparison of the newly defined and the standard micro-genetic algorithm is carried out for the supersonic air-to-air ejector. The only objective is to minimize the loss of total stagnation pressure in the ejector. The result is that the sequence-adapted micro-genetic algorithm can provide results comparable to the standard algorithm but with a significantly lower number of overall CFD iteration steps.
Keywords: Grid deformation, micro-genetic algorithm, shape-based sequence, supersonic ejector.
10852 A new Adaptive Approach for Histogram based Mouth Segmentation
Authors: Axel Panning, Robert Niese, Ayoub Al-Hamadi, Bernd Michaelis
Abstract:
The segmentation of mouth and lips is a fundamental problem in facial image analysis. In this paper we propose a method for lip segmentation based on the rg-color histogram. Statistical analysis shows that using the rg-color space is optimal for the purpose of a purely color-based segmentation. Initially, a rough adaptive threshold selects a histogram region that assures that all pixels in that region are skin pixels. Based on those pixels we build a Gaussian model which represents the skin pixel distribution and is utilized to obtain a refined, optimal threshold. We do not incorporate shape or edge information. In experiments we show the performance of our lip pixel segmentation method compared to the ground truth of our dataset and to a conventional watershed algorithm.
Keywords: Feature extraction, segmentation, image processing, application.
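A brief sketch of the rg-color conversion and of a Gaussian skin model used to separate non-skin (lip) pixels from skin; the sample pixel values and the final threshold are illustrative placeholders for the paper's adaptive histogram-based selection.

    # Brief sketch of the rg-color conversion and a Gaussian skin model used to
    # flag non-skin (lip) pixels; sample pixels and the threshold are placeholders.
    import numpy as np

    def to_rg(rgb):
        """Normalized rg chromaticity: r = R/(R+G+B), g = G/(R+G+B)."""
        rgb = np.asarray(rgb, dtype=float)
        s = rgb.sum(axis=-1, keepdims=True) + 1e-9
        return (rgb / s)[..., :2]

    rng = np.random.default_rng(0)
    skin_pixels = rng.normal([180, 120, 100], 10, (500, 3))       # pixels selected by the rough threshold
    mu = to_rg(skin_pixels).mean(axis=0)                          # Gaussian skin model in rg space
    cov = np.cov(to_rg(skin_pixels), rowvar=False)
    inv_cov = np.linalg.inv(cov)

    def mahalanobis2(rg):                                         # squared distance to the skin model
        d = rg - mu
        return np.einsum("...i,ij,...j->...", d, inv_cov, d)

    lip_candidate = to_rg([150, 70, 80])
    print("skin" if mahalanobis2(lip_candidate) < 9.0 else "lip") # refined threshold (illustrative)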
10851 On Mobile Checkpointing using Index and Time Together
Authors: Awadhesh Kumar Singh
Abstract:
Checkpointing is one of the techniques commonly used to provide fault tolerance in distributed systems, so that the system can operate even if one or more components have failed. However, mobile computing systems are constrained by low bandwidth, mobility, lack of stable storage, frequent disconnections, and limited battery life. Hence, checkpointing protocols with fewer synchronization messages and fewer checkpoints are preferred in a mobile environment. There are two different, although not orthogonal, approaches to checkpointing mobile computing systems, namely time-based and index-based. Our protocol is a fusion of these two approaches, though not the first of its kind. In the present exposition, an index-based checkpointing protocol has been developed which uses time to indirectly coordinate the creation of consistent global checkpoints for mobile computing systems. The proposed algorithm is non-blocking, adaptive, and does not use any control messages. Compared to other contemporary checkpointing algorithms, it is computationally more efficient because it takes fewer checkpoints and does not need to compute dependency relationships. A brief account of important and relevant works in both fields, time-based and index-based, has also been included in the presentation.
Keywords: Checkpointing, forced checkpoint, mobile computing, recovery, time-coordinated.
10850 A Medical Images Based Retrieval System using Soft Computing Techniques
Authors: Pardeep Singh, Sanjay Sharma
Abstract:
Content-Based Image Retrieval (CBIR) has been one of the most vivid research areas in the field of computer vision over the last 10 years. Many programs and tools have been developed to formulate and execute queries based on visual or audio content and to help browse large multimedia repositories. Still, no general breakthrough has been achieved with respect to large, varied databases with documents of differing sorts and with varying characteristics. Many questions with respect to speed, semantic descriptors, or objective image interpretations remain unanswered. In the medical field, images, and especially digital images, are produced in ever-increasing quantities and used for diagnostics and therapy. In several articles, content-based access to medical images for supporting clinical decision making has been proposed, as it would ease the management of clinical data, and scenarios for the integration of content-based access methods into Picture Archiving and Communication Systems (PACS) have been created. This paper gives an overview of soft computing techniques. New research directions are being defined that can prove to be useful. Still, there are very few systems that seem to be used in clinical practice. It should also be stated that the goal is not, in general, to replace text-based retrieval methods as they exist at the moment.
Keywords: CBIR, GA, rough sets, CBMIR.