Search results for: Sinc Numerical Methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5996

2396 A Kernel Classifier using Linearised Bregman Iteration

Authors: K. A. D. N. K Wimalawarne

Abstract:

In this paper we introduce a novel kernel classifier based on an iterative shrinkage algorithm developed for compressive sensing. We adopt Bregman iteration with soft and hard shrinkage functions and a generalized hinge loss to solve the l1-norm minimization problem for classification. Our experiments on face recognition and digit classification, with SVM as the benchmark, show that our method achieves an error rate close to that of SVM but does not outperform it. We also found that soft shrinkage gives higher accuracy, and in some situations more sparseness, than hard shrinkage.
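
The two shrinkage operators named above can be sketched as follows (an illustrative sketch of the standard operators, not the authors' classifier code):

```python
# Soft and hard shrinkage operators, as commonly used inside iterative
# shrinkage / Bregman iteration schemes for l1-regularised problems.

def soft_shrink(x, t):
    """Soft shrinkage: shrinks x toward zero by threshold t."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def hard_shrink(x, t):
    """Hard shrinkage: zeroes x if |x| <= t, otherwise keeps it unchanged."""
    return x if abs(x) > t else 0.0
```

Soft shrinkage biases surviving coefficients toward zero, which is one reason it can behave differently from hard shrinkage in accuracy and sparsity.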

Keywords: Compressive sensing, Bregman iteration, Generalised hinge loss, sparse, kernels, shrinkage functions

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1378
2395 Change Detection and Non Stationary Signals Tracking by Adaptive Filtering

Authors: Mounira Rouaïnia, Noureddine Doghmane

Abstract:

In this paper we consider the problem of change detection and non-stationary signal tracking. Using parametric estimation of signals based on least-squares lattice adaptive filters, we consider statistical parametric methods for change detection that use likelihood ratios and hypothesis tests. In order to track signal dynamics, we introduce a compensation procedure in the adaptive estimation. This improves the adaptive estimation performance and speeds up its convergence after a change is detected.
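
As a point of orientation only (the paper's own detector is built on least-squares lattice filters), a likelihood-ratio change detector of the kind referred to can be sketched as the classic CUSUM test for a mean shift:

```python
# CUSUM change detection for a mean shift mu0 -> mu1 in Gaussian noise of
# variance sigma2: the per-sample log-likelihood ratio is accumulated,
# clipped at zero, and a change is declared when it crosses threshold h.

def cusum_detect(samples, mu0, mu1, sigma2, h):
    """Return the index of the first detection, or None if no change found."""
    g = 0.0
    for k, x in enumerate(samples):
        # log-likelihood ratio increment of "mean = mu1" vs "mean = mu0"
        s = (mu1 - mu0) / sigma2 * (x - (mu0 + mu1) / 2.0)
        g = max(0.0, g + s)
        if g > h:
            return k
    return None
```

The detection delay after the true change point reflects the usual trade-off set by the threshold h.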

Keywords: Change detection, hypothesis test, likelihood ratio, least square lattice adaptive filters.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1634
2394 Biological Data Integration using SOA

Authors: Noura Meshaan Al-Otaibi, Amin Yousef Noaman

Abstract:

Nowadays scientific data is inevitably digital and stored in a wide variety of formats in heterogeneous systems. Scientists need to access an integrated view of remote or local heterogeneous data sources with advanced tools for data access, analysis, and visualization. This research suggests the use of Service Oriented Architecture (SOA) to integrate biological data from different data sources. This work shows that SOA can solve the problems facing the integration process and allow biologists to access biological data in an easier way. There are several methods to implement SOA, but web services are the most popular one. The Microsoft .Net Framework was used to implement the proposed architecture.

Keywords: Bioinformatics, Biological data, Data Integration, SOA and Web Services.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2474
2393 An Approach for Optimization of Functions and Reducing the Value of the Product by Using Virtual Models

Authors: A. Bocevska, G. Todorov, T. Neshkov

Abstract:

A newly developed approach to Functional Cost Analysis (FCA), based on virtual prototyping (VP) models in a CAD/CAE environment, is presented; it is applicable and necessary in developing new products. It is an instrument for improving the value of the product while maintaining costs, and/or reducing the costs of the product without reducing value. Five broad classes of VP methods are identified. Efficient use of prototypes in FCA is a vital activity that can make the difference between successful and unsuccessful entry of new products into the competitive world market. Successful realization of this approach is illustrated for a specific example using a press joint power tool.

Keywords: CAD/CAE environment, Functional Cost Analysis (FCA), Virtual prototyping (VP) models.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1333
2392 Ranking - Convex Risk Minimization

Authors: Wojciech Rejchel

Abstract:

The problem of ranking (rank regression) has become popular in the machine learning community. This theory relates to problems in which one has to predict (guess) the order between objects on the basis of vectors describing their observed features. In many ranking algorithms a convex loss function is used instead of the 0-1 loss, which makes these procedures computationally efficient. Hence, convex risk minimizers and their statistical properties are investigated in this paper. Fast rates of convergence are obtained under conditions similar to those from classification theory. The methods used in this paper come from the theory of U-processes as well as empirical processes.
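
The convex surrogate idea can be illustrated with the pairwise hinge loss, a common convex stand-in for the 0-1 ranking loss (a generic sketch, not the paper's exact risk functional):

```python
# Pairwise hinge loss for ranking: for every pair of objects with
# labels[i] > labels[j], the scores should satisfy scores[i] > scores[j] + 1;
# violations are penalised linearly, giving a convex surrogate of the
# 0-1 mis-ranking loss.

def pairwise_hinge_loss(scores, labels):
    loss, pairs = 0.0, 0
    n = len(scores)
    for i in range(n):
        for j in range(n):
            if labels[i] > labels[j]:
                pairs += 1
                loss += max(0.0, 1.0 - (scores[i] - scores[j]))
    return loss / pairs if pairs else 0.0
```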

Keywords: Convex loss function, empirical risk minimization, empirical process, U-process, boosting, Euclidean family.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1414
2391 Optimization of Unweighted Minimum Vertex Cover

Authors: S. Balaji, V. Swaminathan, K. Kannan

Abstract:

The Minimum Vertex Cover (MVC) problem is a classic NP-complete graph optimization problem. In this paper a competent algorithm, called the Vertex Support Algorithm (VSA), is designed to find the smallest vertex cover of a graph. The VSA is tested on a large number of random graphs and DIMACS benchmark graphs, and a comparative study with other existing methods has been carried out. Extensive simulation results show that the VSA can yield better solutions than other algorithms found in the literature for solving the minimum vertex cover problem.
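
For orientation, a simple max-degree greedy heuristic for vertex cover is sketched below; note this baseline is not the authors' VSA, whose selection rule is based on vertex support:

```python
# Max-degree greedy heuristic for vertex cover: repeatedly pick the vertex
# incident to the most uncovered edges, add it to the cover, and drop the
# edges it covers.

def greedy_vertex_cover(edges):
    uncovered = set(frozenset(e) for e in edges)
    cover = set()
    while uncovered:
        deg = {}
        for e in uncovered:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        v = max(deg, key=deg.get)          # most uncovered edges touched
        cover.add(v)
        uncovered = {e for e in uncovered if v not in e}
    return cover
```

Like VSA, this runs in polynomial time but offers no optimality guarantee; benchmark graphs such as DIMACS are the usual way to compare such heuristics.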

Keywords: vertex cover, vertex support, approximation algorithms, NP-complete problem.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2490
2390 Quantitative Analysis of PCA, ICA, LDA and SVM in Face Recognition

Authors: Liton Jude Rozario, Mohammad Reduanul Haque, Md. Ziarul Islam, Mohammad Shorif Uddin

Abstract:

Face recognition is a technique to automatically identify or verify individuals. It receives great attention in identification, authentication, security, and many more applications. Diverse methods have been proposed for this purpose, and many comparative studies have been performed; however, researchers have not reached a unified conclusion. In this paper, we report an extensive quantitative accuracy analysis of four of the most widely used face recognition algorithms: Principal Component Analysis (PCA), Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA) and Support Vector Machine (SVM), using the AT&T, Sheffield and Bangladeshi people face databases under diverse conditions such as illumination, alignment and pose variations.

Keywords: PCA, ICA, LDA, SVM, face recognition, noise.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2431
2389 Structural and Electrochemical Characterization of Columnar-Structured Mn-Doped Bi26Mo10O69-d Electrolytes

Authors: Maria V. Morozova, Zoya A. Mikhaylovskaya, Elena S. Buyanova, Sofia A. Petrova, Ksenia V. Arishina, Robert G. Zaharov

Abstract:

The present work is devoted to the investigation of two series of doped bismuth molybdates: Bi26-2xMn2xMo10O69-d and Bi26Mo10-2yMn2yO69-d. The complex oxides were synthesized by conventional solid-state technology and by the co-precipitation method. The products were identified by powder diffraction. The powders and ceramic samples were examined by means of densitometry, laser diffraction, and electron microscopy. The porosity of the ceramic materials was estimated using the hydrostatic method. Electrical conductivity measurements were carried out using the impedance spectroscopy method.

Keywords: Bismuth molybdate, columnar structures, impedance spectroscopy, oxygen ionic conductors.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1612
2388 A Study and Implementation of On-line Learning Diagnosis and Inquiry System

Authors: YuLung Wu

Abstract:

In a Knowledge Structure Graph, each course unit represents a phase of learning activities. Both learning portfolios and Knowledge Structure Graphs contain learning information about students and let teachers know which content causes difficulties and failures. This study proposes a "Dual Mode On-line Learning Diagnosis System" that integrates two search methods: learning portfolios and knowledge structure. Teachers can operate the proposed system and obtain information on specific students without any computer science background, find failing students in advance, and provide remedial learning resources.

Keywords: Knowledge Structure Graph, On-line Learning Diagnosis

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1465
2387 Performance Analysis of MATLAB Solvers in the Case of a Quadratic Programming Generation Scheduling Optimization Problem

Authors: Dávid Csercsik, Péter Kádár

Abstract:

In the case of the proposed method, the problem is parallelized by considering multiple possible mode of operation profiles, which determine the range in which the generators operate in each period. For each of these profiles, the optimization is carried out independently, and the best resulting dispatch is chosen. For each such profile, the resulting problem is a quadratic programming (QP) problem with a potentially negative definite Q quadratic term, and constraints depending on the actual operation profile. In this paper we analyze the performance of available MATLAB optimization methods and solvers for the corresponding QP.
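
As a minimal illustration of the objective being solved (unconstrained, positive definite case only; the paper's instances may be negative definite and constrained, which is why dedicated MATLAB solvers are benchmarked), gradient descent on 0.5·x'Qx + c'x can be sketched as:

```python
# Plain gradient descent on the convex QP  minimise 0.5*x'Qx + c'x,
# whose gradient is Qx + c. This only converges for positive definite Q
# and a small enough step; constrained or indefinite QPs need a real
# solver such as MATLAB's quadprog.

def qp_gradient_descent(Q, c, x0, step=0.1, iters=500):
    x = list(x0)
    n = len(x)
    for _ in range(iters):
        g = [sum(Q[i][j] * x[j] for j in range(n)) + c[i] for i in range(n)]
        x = [x[i] - step * g[i] for i in range(n)]
    return x
```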

Keywords: Economic dispatch, optimization, quadratic programming, MATLAB.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 948
2386 A Family of Minimal Residual Based Algorithm for Adaptive Filtering

Authors: Noor Atinah Ahmad

Abstract:

The Minimal Residual (MR) method is modified for adaptive filtering applications. Three forms of MR-based algorithm are presented: i) the low-complexity SPCG, ii) MREDSI, and iii) MREDSII. The low-complexity SPCG is a reduced-complexity version of a previously proposed SPCG algorithm. The approximations introduced reduce the algorithm to an LMS-type algorithm, but maintain the superior convergence of the SPCG algorithm. Both MREDSI and MREDSII are MR-based methods with Euclidean directions of search. The choice of Euclidean directions is shown via simulation to give better misadjustment than their gradient-search counterparts.
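
Since the low-complexity variant reduces to an LMS-type algorithm, the LMS baseline is worth sketching (this is the generic LMS update, not the MR/SPCG algorithms themselves):

```python
# Least-mean-squares (LMS) adaptive filter: the weights are nudged along
# the instantaneous gradient of the squared prediction error.

def lms(x, d, taps, mu):
    """x: input samples, d: desired samples; returns the final weights."""
    w = [0.0] * taps
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]        # most recent sample first
        y = sum(wi * ui for wi, ui in zip(w, u))
        e = d[n] - y                           # instantaneous error
        w = [wi + mu * e * ui for wi, ui in zip(w, u)]
    return w
```

In a noiseless system-identification setting with a persistently exciting input, these weights converge to the true filter taps.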

Keywords: Adaptive filtering, adaptive least squares, minimal residual method.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1442
2385 Effective Context Lossless Image Coding Approach Based on Adaptive Prediction

Authors: Grzegorz Ulacha, Ryszard Stasiński

Abstract:

In this paper an effective context-based lossless coding technique is presented. Three principal and a few auxiliary contexts are defined. The predictor adaptation technique is an improved CoBALP algorithm, denoted CoBALP+. A cumulated predictor error combining 8 bias estimators is calculated. It is shown experimentally that the new technique is indeed time-effective: it outperforms well-known methods of reasonable time complexity and is inferior only to extremely computationally complex ones.

Keywords: Adaptive prediction, context coding, image lossless coding, prediction error bias correction.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1351
2384 TBOR: Tree Based Opportunistic Routing for Mobile Ad Hoc Networks

Authors: Y. Harold Robinson, M. Rajaram, E. Golden Julie, S. Balaji

Abstract:

A mobile ad hoc network (MANET) is a wireless communication network in which nodes that are not within direct transmission range establish their communication via the help of other nodes that forward data. Routing protocols in MANETs are usually categorized as proactive or reactive. Tree Based Opportunistic Routing (TBOR) finds a multipath link based on the maximum probability of throughput. The simulation results show that the presented method performs very well compared to existing methods in terms of throughput, delay and routing overhead.
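
The BFS building block mentioned in the keywords can be sketched as follows (a generic shortest-hop tree; TBOR's throughput-probability metric is not reproduced here):

```python
# Breadth-first search from a source node, returning the parent map of
# the resulting shortest-hop tree; tree-based routing schemes grow such
# structures before choosing among candidate forwarders.
from collections import deque

def bfs_tree(adj, src):
    """adj: node -> list of neighbours; returns {node: parent} for the tree."""
    parent = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj.get(u, ()):
            if v not in parent:
                parent[v] = u
                q.append(v)
    return parent
```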

Keywords: Mobile ad hoc networks, opportunistic data forwarding, proactive source routing, BFS.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1223
2383 Multi-agent Data Fusion Architecture for Intelligent Web Information Retrieval

Authors: Amin Milani Fard, Mohsen Kahani, Reza Ghaemi, Hamid Tabatabaee

Abstract:

In this paper we propose a multi-agent architecture for web information retrieval using fuzzy logic based result fusion mechanism. The model is designed in JADE framework and takes advantage of JXTA agent communication method to allow agent communication through firewalls and network address translators. This approach enables developers to build and deploy P2P applications through a unified medium to manage agent-based document retrieval from multiple sources.

Keywords: Information retrieval systems, list fusion methods, document score, multi-agent systems.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1600
2382 Tests and Measurements of Image Acquisition Characteristics for Image Sensors

Authors: Seongsoo Lee, Jong-Bae Lee, Wookkang Lee, Duyen Hai Pham

Abstract:

In image sensors, the acquired image often differs from the real image in luminance or chrominance due to fabrication defects or nonlinear characteristics, which often lead to pixel defects or sensor failure. Therefore, the image acquisition characteristics of image sensors should be measured and tested before they are mounted on the target product. In this paper, standardized test and measurement methods for image sensors are introduced. A standard light source is applied to the image sensor under test, and the characteristics of the acquired image are compared with ideal values.
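
One standard figure of merit for such a comparison is the PSNR between the acquired and ideal images; a minimal sketch (the exact metrics prescribed by the standardized tests are not reproduced here):

```python
# Peak signal-to-noise ratio (PSNR) between an acquired image and an
# ideal reference, both given as flat pixel sequences; higher is closer
# to the reference, infinite for a perfect match.
import math

def psnr(acquired, ideal, peak=255.0):
    mse = sum((a - b) ** 2 for a, b in zip(acquired, ideal)) / len(ideal)
    if mse == 0:
        return float("inf")
    return 10.0 * math.log10(peak * peak / mse)
```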

Keywords: Image Sensor, Image Acquisition Characteristics, Defect, Failure, Standard, Test, Measurement.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1689
2381 A Parallel Quadtree Approach for Image Compression using Wavelets

Authors: Hamed Vahdat Nejad, Hossein Deldari

Abstract:

Wavelet transforms are multiresolution decompositions that can be used to analyze signals and images. Image compression is one of the major applications of wavelet transforms in image processing, and is considered one of the most powerful methods, providing a high compression ratio. However, its implementation is very time-consuming. On the other hand, parallel computing technologies offer an efficient route to image compression using wavelets. In this paper, we propose a parallel wavelet compression algorithm based on quadtrees. We implement the algorithm using MatlabMPI (a parallel, message-passing version of Matlab), compute its isoefficiency function, and show that it is scalable. Our experimental results also confirm the efficiency of the algorithm.
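
The underlying wavelet step can be illustrated with a serial, one-dimensional Haar transform (a sketch only; the paper's algorithm is 2-D, quadtree-partitioned and parallelized with MatlabMPI):

```python
# One level of the 1-D Haar wavelet transform: the signal is split into
# pairwise averages (coarse approximation) and pairwise half-differences
# (detail coefficients); the inverse recombines them exactly.

def haar_step(x):
    """One forward Haar level; len(x) must be even."""
    avg = [(x[2 * i] + x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]
    diff = [(x[2 * i] - x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]
    return avg + diff

def haar_inverse(y):
    h = len(y) // 2
    out = []
    for a, d in zip(y[:h], y[h:]):
        out += [a + d, a - d]
    return out
```

Compression comes from quantizing or discarding the small detail coefficients, which is where the quadtree partitioning of the 2-D coefficient field enters.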

Keywords: Image compression, MPI, Parallel computing, Wavelets.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2024
2380 Enhancing the Quality of Learning by Using an Innovative Approach for Teaching Energy in Secondary Schools

Authors: Adriana Alexandru, Ovidiu Bica, Eleonora Tudora, Cristina Simona Alecu, Cristina-Adriana Alexandru, Ioan Covalcic

Abstract:

This paper presents the results of the authors in designing, experimenting with, assessing and transferring an innovative approach to energy education in secondary schools, aimed at enhancing the quality of learning in terms of didactic curricula and pedagogic methods. The training is delivered online to youngsters via e-Books and portals specially designed for this purpose, or by learning by doing via interactive games. An online educational methodology is available to teachers.

Keywords: Education, eLearning, Energy Efficiency, Internet Methodology, Renewable Energy Sources.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1665
2379 UK GAAP and IFRS Standards: Similarities and Differences

Authors: Feddaoui Amina

Abstract:

This paper aims to help researchers and international companies understand the differences and similarities between IFRS (International Financial Reporting Standards) and UK GAAP (UK accounting principles), and the accounting changes between the standard setting of the International Accounting Standards Board and the Accounting Standards Board in the United Kingdom. We use statistical methods to calculate the frequencies of similarities and differences between the UK standards and IFRS standards, according to the PricewaterhouseCoopers report of 2005, and a one-sample test to confirm or refute our hypothesis. In conclusion, we found that the gap between UK GAAP and IFRS is small.

Keywords: Accounting, UK GAAP, IFRS, similarities, differences.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 3598
2378 The Study of Cost Accounting in S Company Based On TDABC

Authors: Heng Ma

Abstract:

Third-party warehousing logistics plays an important role in the development of external logistics. At present, third-party logistics in our country is still a new industry and its accounting system has not yet been established: the current financial accounting of third-party warehousing logistics mainly follows the traditional way of thinking, is only able to provide the total cost of the entire enterprise during the accounting period, and cannot reflect indirect operating cost information. In order to solve the problem of cost information distortion in the third-party logistics industry and improve the level of logistics cost management, this paper combines theoretical research and case analysis to reflect cost allocation by building a third-party logistics costing model using Time-Driven Activity-Based Costing (TDABC), and takes S company as an example to account for and control warehousing logistics costs.

Based on the idea that "products consume activities and activities consume resources", TDABC makes time the main cost driver and uses time equations to assign resources to cost objects. In S company, the objects are three warehouses engaged in warehousing and transportation services (the second warehouse is a transport point). Each of the three warehouses comprises five departments, Business Unit, Production Unit, Settlement Center, Security Department and Equipment Division, and the activities in these departments are classified into in-out of storage forecasting, in-out of storage or transit, and safekeeping work. By computing the capacity cost rate and building the time equations, the paper calculates the final operation cost so as to reveal the real cost.

The numerical analysis results show that TDABC can accurately reflect the cost allocation to service customers and reveal the spare capacity cost of the resource centers, verifying the feasibility and validity of TDABC in third-party logistics cost accounting. It encourages enterprises to focus on customer relationship management and to reduce idle cost, strengthening the cost management of third-party logistics enterprises.
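
The two TDABC quantities the abstract relies on, the capacity cost rate and a time equation, can be illustrated with a toy example (hypothetical numbers, not S company's data):

```python
# Time-Driven Activity-Based Costing in two steps:
# 1) capacity cost rate = cost of supplied resources / practical capacity;
# 2) a time equation estimates the minutes an activity consumes, and the
#    rate converts those minutes into an assigned cost.

def capacity_cost_rate(total_resource_cost, practical_capacity_minutes):
    return total_resource_cost / practical_capacity_minutes

def activity_cost(rate, base_minutes, extra_minutes_per_item, items):
    # time equation: minutes consumed = base + per-item increment * items
    return rate * (base_minutes + extra_minutes_per_item * items)
```

For example, a department costing 56,000 per period with 70,000 practical minutes has a rate of 0.8 per minute; an order taking 5 base minutes plus 2 minutes per item, for 10 items, is assigned 20.0. Unassigned minutes times the rate is the spare (idle) capacity cost the abstract mentions.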

Keywords: Third-party logistics enterprises, TDABC, cost management, S company.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2435
2377 Bounds on Reliability of Parallel Computer Interconnection Systems

Authors: Ranjan Kumar Dash, Chita Ranjan Tripathy

Abstract:

The evaluation of the residual reliability of large parallel computer interconnection systems is not practicable with existing methods. Under such conditions, one must resort to approximation techniques which provide upper and lower bounds on this reliability. In this context, a new approximation method for providing bounds on residual reliability is proposed here. The proposed method is well supported by two algorithms for simulation purposes. The bounds on the residual reliability of three different categories of interconnection topologies are efficiently found using the proposed method.

Keywords: Parallel computer network, reliability, probabilistic graph, interconnection networks.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1343
2376 Airliner-UAV Flight Formation in Climb Regime

Authors: Pavel Zikmund, Robert Popela

Abstract:

Extreme formation is a theoretical concept of self-sustained flight in which a big airliner is followed by a small UAV glider flying in the airliner's wake vortex. The paper presents the results of a climb analysis whose goal is to lift the gliding UAV to the airliner's cruise altitude. The wake vortex models, the UAV's drag polar and basic parameters, and the airliner's climb profile are introduced first. Afterwards, the flight performance of the UAV in a wake vortex is evaluated by analytical methods, and the time history of the optimal distance between the airliner and the UAV during a climb is determined. The results are encouraging; therefore, the UAV drag margin available for electricity generation is figured out for different vortex models.

Keywords: Flight in formation, self-sustained flight, UAV, wake vortex.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1990
2375 Modified Vector Quantization Method for Image Compression

Authors: K. Somasundaram, S. Domnic

Abstract:

A low-bit-rate still image compression scheme that compresses the indices of Vector Quantization (VQ) and generates a residual codebook is proposed. The indices of VQ are compressed by exploiting the correlation among image blocks, which reduces the bits per index. A residual codebook, similar to the VQ codebook, is generated to represent the distortion produced by VQ. Using this residual codebook, the distortion in the reconstructed image is removed, thereby increasing image quality. Our scheme combines these two methods. Experimental results on the standard image Lena show that our scheme can give a reconstructed image with a PSNR of 31.6 dB at 0.396 bits per pixel, and is also faster than existing VQ variants.
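
The VQ baseline that the residual-codebook scheme builds on can be sketched as nearest-codeword encoding (a generic sketch; the index-correlation and residual coding steps are not reproduced):

```python
# Baseline vector quantization: each image block is encoded as the index
# of its nearest codeword (squared Euclidean distance), and the decoder
# simply looks the codeword back up. A residual codebook would then
# quantise block - codeword with a second, similar codebook.

def vq_encode(blocks, codebook):
    def dist2(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    return [min(range(len(codebook)), key=lambda k: dist2(b, codebook[k]))
            for b in blocks]

def vq_decode(indices, codebook):
    return [codebook[k] for k in indices]
```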

Keywords: Image compression, Vector Quantization, Residual Codebook.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1439
2374 Induced Bone Tissue Temperature in Drilling Procedures: A Comparative Laboratory Study with and without Lubrication

Authors: L. Roseiro, C. Veiga, V. Maranha, A. Neto, N. Laraqi, A. Baïri, N. Alilat

Abstract:

In orthopedic surgery there are various situations in which the surgeon needs to cut or drill bone. In this type of procedure the generated friction leads to a localized increase in temperature, which may lead to bone necrosis. Recognizing the importance of studying this phenomenon, an experimental evaluation of the temperatures developed during bone drilling has been carried out, together with an assessment of the influence of additional lubrication during drilling. The obtained results are presented and discussed and suggest an advantage in using additional lubrication as a way to minimize the appearance of bone tissue necrosis during bone drilling procedures.

Keywords: Bone Necrosis, Bone Drilling, Thermography.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2055
2373 WAF: an Interface Web Agent Framework

Authors: Xizhi Li, Qinming He

Abstract:

A trend in the agent community and in enterprises is a shift from closed to open architectures composed of a large number of autonomous agents. One implication is that the interface agent framework is becoming more important in multi-agent systems (MAS), so that systems constructed for different application domains can share a common understanding of human computer interface (HCI) methods, as well as human-agent and agent-agent interfaces. However, interface agent frameworks usually receive less attention than other aspects of MAS. In this paper, we propose an interface web agent framework based on our former project, WAF, and a distributed HCI template. A group of new functionalities and implications is discussed, such as web agent presentation, off-line agent reference, and a reconfigurable activation map of agents. Their enabling techniques and current standards (e.g. an existing ontological framework) are also suggested and shown by examples from our own implementation in WAF.

Keywords: HCI, Interface agent, MAS.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1657
2372 Monte Carlo and Biophysics Analysis in a Criminal Trial

Authors: Luca Indovina, Carmela Coppola, Carlo Altucci, Riccardo Barberi, Rocco Romano

Abstract:

In this paper a real court case, held in Italy at the Court of Nola, is considered, in which a correct physical description, conducted with both Monte Carlo and biophysical analyses, would have been sufficient to arrive at conclusions confirmed by documentary evidence. It is an example of how forensic physics can be useful in confirming documentary evidence in order to reach hardly questionable conclusions. This was a libel trial in which the defendant, Mr. DS (Defendant for Slander), had falsely accused one of his neighbors, Mr. OP (Offended Person), of having caused him some damages. The damages would have been caused by an external plaster piece that would have detached from the neighbor's property and hit Mr. DS while he was in his garden, more than a meter away from the facade of the building from which the plaster piece would have detached. In the trial, Mr. DS claimed to have suffered a scratch on his forehead, but he never showed the plaster that had hit him, nor was he able to tell from where the plaster would have arrived. Furthermore, Mr. DS presented a medical certificate with a diagnosis of contusion of the cerebral cortex. On the contrary, the images of Mr. OP's security cameras do not show any movement in the garden of Mr. DS in a long interval of time (about 2 hours) around the time of the alleged accident, nor do they show anyone entering or leaving the house of Mr. DS in the same interval. Biophysical analysis shows that the diagnosis of the medical certificate and the wound declared by the defendant, already in conflict with each other, are both incompatible with the fall of external plaster pieces too small to be found. The wind was at level 1 of the Beaufort scale, that is, unable to raise even dust (level 4 of the Beaufort scale).
Therefore, the motion of the plaster pieces can be described as projectile motion, whereas collisions with the building cornice can be treated using Newton's law of restitution. Numerous Monte Carlo simulations show that the pieces of plaster could not have reached even the garden of Mr. DS, let alone a distance of over 1.30 meters. The results agree with the documentary evidence (the images of Mr. OP's security cameras) that Mr. DS could not have been hit by plaster pieces coming from Mr. OP's property.
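
The projectile-motion Monte Carlo idea can be sketched as follows (illustrative parameters, not the values used in the trial; air drag and cornice collisions are omitted):

```python
# Drag-free projectile motion of a fragment leaving a cornice of height h0,
# with launch speed and angle drawn at random; the Monte Carlo estimate of
# interest is how far such fragments can possibly land from the facade.
import math, random

def landing_distance(v, angle_rad, h0, g=9.81):
    """Horizontal range of a projectile launched from height h0."""
    vx = v * math.cos(angle_rad)
    vy = v * math.sin(angle_rad)
    # flight time: positive root of h0 + vy*t - g*t^2/2 = 0
    t = (vy + math.sqrt(vy * vy + 2.0 * g * h0)) / g
    return vx * t

def monte_carlo_max_range(trials, v_max, h0, seed=0):
    rng = random.Random(seed)
    return max(landing_distance(rng.uniform(0.0, v_max),
                                rng.uniform(0.0, math.pi / 2.0), h0)
               for _ in range(trials))
```

Bounding the launch speed by what a light wind could impart then bounds the maximum reachable distance, which is the logic behind the trial analysis.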

Keywords: Biophysical analysis, Monte Carlo simulations, Newton’s law of restitution, projectile motion.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 614
2371 Extraction of Squalene from Lebanese Olive Oil

Authors: Henri El Zakhem, Christina Romanos, Charlie Bakhos, Hassan Chahal, Jessica Koura

Abstract:

Squalene is a valuable component of the oil, composed of 30 carbon atoms, and is mainly used in cosmetic materials. The main concern of this article is to study the squalene content of Lebanese olive oil and to compare it with results for foreign oils. To our knowledge, extraction of squalene from Lebanese olive oil has not been conducted before. Three different techniques were studied, and experiments were performed on three brands of olive oil: Al Wadi Al Akhdar, Virgo Bio and Boulos. The techniques performed are fractional crystallization, Soxhlet extraction and esterification. Comparing the results, it is found that Lebanese olive oil contains squalene, and the Soxhlet method is the most effective of the three, extracting about 6.5E-04 grams of squalene per gram of olive oil.

Keywords: Squalene, extraction, crystallization, Soxhlet.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2306
2370 Evolutionary Query Optimization for Heterogeneous Distributed Database Systems

Authors: Reza Ghaemi, Amin Milani Fard, Hamid Tabatabaee, Mahdi Sadeghizadeh

Abstract:

Due to new distributed database applications such as huge deductive database systems, search complexity is constantly increasing and we need better algorithms to speed up traditional relational database queries. An optimal dynamic programming method for such high-dimensional queries has the big disadvantage of its exponential order, and thus we are interested in semi-optimal but faster approaches. In this work we present a multi-agent based mechanism to meet this demand and also compare the results with some commonly used query optimization algorithms.

Keywords: Information retrieval systems, list fusion methods, document score, multi-agent systems.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 3424
2369 Safety Climate Assessment and Its Impact on the Productivity of Construction Enterprises

Authors: Krzysztof J. Czarnocki, F. Silveira, E. Czarnocka, K. Szaniawska

Abstract:

Research background: Problems related to occupational health and a decreasing level of safety occur commonly in the construction industry, where scaffold use is an important safety factor. All scaffolds used in construction, renovation, and demolition shall be erected, dismantled and maintained in accordance with safety procedures. The increasing demand for new construction projects is unfortunately still linked to a high level of occupational accidents. Therefore, it is crucial to implement concrete actions for dealing with scaffolds and risk assessment in the construction industry; how the assessment is done, and how reliable it is, is critical for both construction workers and the regulatory framework. Unfortunately, professionals, who tend to rely heavily on their own experience and knowledge when taking decisions regarding risk assessment, may show a lack of reliability in checking the results of the decisions taken.

Purpose of the article: The aim was to identify crucial parameters that could be modeled with a Risk Assessment Model (RAM) to improve both the productivity and development potential of building enterprises and their safety climate. The developed RAM could be of benefit for predicting high-risk construction activities and thus preventing accidents, based on a set of historical accident data.

Methodology/Methods: A RAM has been developed for assessing risk levels at various construction process stages, with various work trades impacting different spheres of enterprise activity. This project includes research carried out by teams of researchers on over 60 construction sites in Poland and Portugal, under which over 450 individual research cycles were carried out. The conducted trials included variable conditions of employee exposure to harmful physical and chemical factors, variable levels of employee stress, and differences in staff behaviors and habits. A genetic modeling tool has been used to develop the RAM.

Findings and value added: Common types of trades, accidents, and accident causes have been explored, in addition to suitable risk assessment methods and criteria. We have found that the initial worker stress level is a more direct predictor of the unsafe chain leading to an accident than the workload, the concentration of harmful factors at the workplace, or even training frequency and management involvement.

Keywords: Civil engineering, occupational health, productivity, safety climate.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1116
2368 Institutional Aspects of Information Security in Russian Economy

Authors: Mingaleva Zhanna, Kapuskina Tatiana

Abstract:

The article touches upon questions of information security in the Russian economy. It covers the theoretical bases of information security and the causes of its development. The theory is supported by an analysis of business activities and the main tendencies of information security development. Perm region has been chosen as the basis for the analysis, being the fastest-developing region that uses information security methods in managing its economy. As a result of the study, the authors have formulated their own vision of the problem of information security in various branches of the economy and stated the prospects of information security development and its growing role in the Russian economy.

Keywords: security of business, management of information security, institutional analyses.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1319
2367 A General Regression Test Selection Technique

Authors: Walid S. Abd El-hamid, Sherif S. El-etriby, Mohiy M. Hadhoud

Abstract:

This paper presents a new methodology for selecting test cases from regression test suites. The selection strategy is based on analyzing the dynamic behavior of applications written in any programming language; methods based on dynamic analysis are safer and more efficient. We design a technique that combines the code-based and model-based techniques to allow comparison of the object-oriented behavior of an application written in any programming language. We have developed a prototype tool that detects changes and selects test cases from the test suite.

Keywords: Regression testing, model based testing, dynamic behavior.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1979