Search results for: Algorithms decision tree
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3107

977 Business Rules for Data Warehouse

Authors: Rajeev Kaula

Abstract:

Business rules and data warehouses are concepts and technologies that impact a wide variety of organizational tasks. In general, each area has evolved independently, impacting application development and decision-making. Generating knowledge from a data warehouse is a complex process. This paper outlines an approach to ease the import of information and knowledge from a data warehouse star schema through an inference class of business rules. The paper uses the Oracle database to illustrate the working of the concepts. The star schema structure and the business rules are stored within a relational database. The approach is explained through a prototype in Oracle's PL/SQL Server Pages.

Keywords: Business Rules, Data warehouse, PL/SQL Server Pages, Relational model, Web Application.

976 Statistical Evaluation of Nonlinear Distortion using the Multi-Canonical Monte Carlo Method and the Split Step Fourier Method

Authors: Ioannis Neokosmidis, Nikos Gkekas, Thomas Kamalakis, Thomas Sphicopoulos

Abstract:

In high-power dense wavelength division multiplexed (WDM) systems with low chromatic dispersion, four-wave mixing (FWM) can prove to be a major source of noise. The Multi-Canonical Monte Carlo Method (MCMC) and the Split-Step Fourier Method (SSFM) are combined to accurately evaluate the probability density function of the decision variable of a receiver limited by FWM. The combination of the two methods leads to more accurate results and offers the possibility of adding other optical noises such as Amplified Spontaneous Emission (ASE) noise.
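
As a rough, hedged illustration of the split-step idea only (the paper's MCMC sampling, receiver model and parameter values are not reproduced here), a minimal Python sketch of scalar SSFM propagation over one fiber span, with assumed dispersion and nonlinearity parameters:

```python
import numpy as np

def ssfm_propagate(A, dt, length, steps, beta2=-21e-27, gamma=1.3e-3, alpha=0.0):
    """Propagate a complex baseband field A(t) over a fiber span with the
    symmetric split-step Fourier method: dispersion applied in the frequency
    domain, Kerr nonlinearity in the time domain. Parameter values are illustrative."""
    h = length / steps
    w = 2 * np.pi * np.fft.fftfreq(A.size, d=dt)              # angular-frequency grid
    half_linear = np.exp((1j * beta2 / 2 * w**2 - alpha / 2) * h / 2)
    for _ in range(steps):
        A = np.fft.ifft(half_linear * np.fft.fft(A))          # half linear step
        A = A * np.exp(1j * gamma * np.abs(A)**2 * h)         # full nonlinear step
        A = np.fft.ifft(half_linear * np.fft.fft(A))          # half linear step
    return A

# Example: a 1 mW Gaussian pulse over a 10 km span in 100 steps
t = np.linspace(-50e-12, 50e-12, 1024)
A0 = (np.sqrt(1e-3) * np.exp(-t**2 / (2 * (5e-12)**2))).astype(complex)
A1 = ssfm_propagate(A0, dt=t[1] - t[0], length=10e3, steps=100)
```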

Keywords: Monte Carlo, Nonlinear optics, optical crosstalk, Wavelength-division Multiplexing (WDM).

975 Automatic Method for Exudates and Hemorrhages Detection from Fundus Retinal Images

Authors: A. Biran, P. Sobhe Bidari, K. Raahemifar

Abstract:

Diabetic Retinopathy (DR) is an eye disease that leads to blindness. The earliest signs of DR are the appearance of red and yellow lesions on the retina called hemorrhages and exudates. Early diagnosis of DR prevents blindness; hence, many automated algorithms have been proposed to extract hemorrhages and exudates. In this paper, an automated algorithm is presented to extract hemorrhages and exudates separately from retinal fundus images using different image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering and thresholding. Since the optic disc is the same color as the exudates, it is first localized and detected. The presented method has been tested on fundus images from the Structured Analysis of the Retina (STARE) and Digital Retinal Images for Vessel Extraction (DRIVE) databases using MATLAB code. The results show that this method is perfectly capable of detecting hard exudates and the highly probable soft exudates. It is also capable of detecting hemorrhages and distinguishing them from blood vessels.
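
As a loose illustration of the kind of pipeline described (not the authors' exact parameters or processing order), a minimal OpenCV/Python sketch that enhances the green channel with CLAHE, localizes a circular optic-disc candidate with the circular Hough transform, and thresholds bright exudate-like regions; all thresholds and radii below are placeholder assumptions:

```python
import cv2
import numpy as np

def exudate_candidates(fundus_bgr):
    green = fundus_bgr[:, :, 1]                                 # green channel gives the best lesion contrast
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(green)                               # contrast-limited adaptive histogram equalization

    # Optic-disc candidate via the circular Hough transform (radius range is a guess)
    blurred = cv2.medianBlur(enhanced, 5)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=200,
                               param1=100, param2=30, minRadius=40, maxRadius=90)

    # Bright (exudate-like) regions via global thresholding (placeholder threshold)
    _, bright = cv2.threshold(enhanced, 220, 255, cv2.THRESH_BINARY)

    # Mask out the optic disc so it is not reported as an exudate
    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            cv2.circle(bright, (x, y), int(r * 1.2), 0, thickness=-1)
    return bright
```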

Keywords: Diabetic retinopathy, fundus, CHT, exudates, hemorrhages.

974 Improved Weighted Matching for Speaker Recognition

Authors: Ozan Mut, Mehmet Göktürk

Abstract:

Matching algorithms are of significant importance in speaker recognition. As a last step in speaker recognition, feature vectors of the unknown utterance are compared to the feature vectors of the modeled speakers. A similarity score is found for every model in the speaker database. Depending on the type of speaker recognition, these scores are used to determine the identity of the speaker of the unknown speech samples. For speaker verification, the similarity score is tested against a predefined threshold and either an acceptance or a rejection result is obtained. In the case of speaker identification, the result depends on whether the identification is open set or closed set. In closed set identification, the model that yields the best similarity score is accepted. In open set identification, the best score is tested against a threshold, so there is one additional possible output: that the speaker is not one of the registered speakers in the existing database. This paper focuses on closed set speaker identification using a modified version of a well-known matching algorithm. The new matching algorithm showed better performance on the YOHO international speaker recognition database.
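
As a generic sketch of the closed-set versus open-set decision step described above (the modified weighted-matching scoring itself is not reproduced), assuming one similarity score per enrolled speaker model has already been computed:

```python
import numpy as np

def identify_speaker(scores, speaker_ids, open_set=False, threshold=None):
    """Closed set: accept the model with the best (highest) similarity score.
    Open set: the best score must also exceed a threshold, otherwise the
    utterance is rejected as coming from an unregistered speaker."""
    scores = np.asarray(scores, dtype=float)
    best = int(np.argmax(scores))
    if open_set:
        if threshold is None:
            raise ValueError("open-set identification needs a decision threshold")
        if scores[best] < threshold:
            return None                                    # not a registered speaker
    return speaker_ids[best]

print(identify_speaker([0.62, 0.81, 0.55], ["spk1", "spk2", "spk3"]))             # closed set -> spk2
print(identify_speaker([0.40, 0.45, 0.30], ["spk1", "spk2", "spk3"], True, 0.6))  # open set  -> None
```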

Keywords: Automatic Speaker Recognition, Voice Recognition, Pattern Recognition, Digital Audio Signal Processing.

973 Evolution of Fuzzy Neural Networks Using an Evolution Strategy with Fuzzy Genotype Values

Authors: Hidehiko Okada

Abstract:

Evolution strategy (ES) is a well-known instance of evolutionary algorithms, and there have been many studies on ES. In this paper, the author proposes an extended ES for solving fuzzy-valued optimization problems. In the proposed ES, genotype values are not real numbers but fuzzy numbers. Evolutionary processes in the ES are extended so that it can handle genotype instances with fuzzy numbers. In this study, the proposed method is experimentally applied to the evolution of neural networks with fuzzy weights and biases. Results reveal that fuzzy neural networks evolved using the proposed ES with fuzzy genotype values can model hidden target fuzzy functions even though no training data are explicitly provided. Next, the proposed method is evaluated in terms of variations in specifying fuzzy numbers as genotype values. One of the most widely adopted fuzzy numbers is a symmetric triangular one that can be specified by its lower and upper bounds (LU) or its center and width (CW). Experimental results revealed that the LU model contributed better to the fuzzy ES than the CW model, which indicates that the LU model should be adopted in future applications of the proposed method.
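
As a hedged sketch of how a symmetric triangular fuzzy genotype value could be handled under the LU (lower/upper bound) coding discussed above, with an assumed Gaussian mutation rather than the paper's exact operator:

```python
import random

def mutate_lu(lower, upper, sigma=0.1):
    """Perturb both bounds of a symmetric triangular fuzzy number coded as
    (lower, upper), then re-order so that lower <= upper still holds."""
    new_lower = lower + random.gauss(0.0, sigma)
    new_upper = upper + random.gauss(0.0, sigma)
    if new_lower > new_upper:
        new_lower, new_upper = new_upper, new_lower
    return new_lower, new_upper

def lu_to_cw(lower, upper):
    """Equivalent center/width (CW) coding of the same triangular fuzzy number."""
    return (lower + upper) / 2.0, (upper - lower) / 2.0

print(mutate_lu(-0.5, 0.8))
print(lu_to_cw(-0.5, 0.8))   # -> (0.15, 0.65)
```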

Keywords: Evolutionary algorithm, evolution strategy, fuzzy number, feedforward neural network, neuroevolution.

972 Portfolio Simulation in GSM Cellular Telecommunication Industry for Company's Decision and Policies Making

Authors: M. Dachyar, Yudavedito

Abstract:

The rising growth of the GSM cellular phone industry has tightened the level of competition between providers devising strategies to enhance market share in Indonesia. Tsel, as one of those companies, has to determine the proper strategy to sustain as well as improve its market share without reducing its operational income level. A portfolio simulation model is designed with a dynamic system approach. The result of this research is a recommendation to the company to optimize its technological policies, services, and promotions. The tariff policies and the signal quality should not be the main focus because this company already has a large number of customers and a good infrastructural condition.

Keywords: Telecommunication industry, simulation, dynamic system, portfolio, quality services.

971 Investigating Student Behavior in Adopting Online Formative Assessment Feedback

Authors: Peter Clutterbuck, Terry Rowlands, Owen Seamons

Abstract:

In this paper we describe one critical research program within a complex, ongoing multi-year project (2010 to 2014 inclusive) with the overall goal of improving the learning outcomes for first-year undergraduate commerce/business students within an Information Systems (IS) subject with very large enrolment. The single research program described in this paper is the analysis of student attitudes and decision making in relation to the availability of formative assessment feedback via Web-based real-time conferencing and document exchange software (Adobe Connect). The formative assessment feedback between teaching staff and students is in respect of an authentic problem-based, team-completed assignment. The analysis of student attitudes and decision making is investigated via first a qualitative and then a quantitative application of the Theory of Planned Behavior (TPB) with two statistically significant and separate trial samples of the enrolled students. The initial qualitative TPB investigation revealed that perceived self-efficacy, improved time-management, and lecturer-student relationship building were the major factors in shaping an overall favorable student attitude to online feedback, whilst some students expressed valid concerns with perceived control limitations identified within the online feedback protocols. The subsequent quantitative TPB investigation then confirmed that attitude towards usage, subjective norms surrounding usage, and perceived behavioral control of usage were all significant in shaping student intention to use the online feedback protocol, with these three variables explaining 63 percent of the variance in the behavioral intention to use the online feedback protocol. The identification in this research of perceived behavioral control as a significant determinant in student usage of a specific technology component within a virtual learning environment (VLE) suggests that VLEs could now be viewed not as a single, atomic entity, but as a spectrum of technology offerings ranging from the mature and simple (e.g., email, Web downloads) to the cutting-edge and challenging (e.g., Web conferencing and real-time document exchange). That is, not all VLEs should be considered the same. The results of this research suggest that tertiary students have the technological sophistication to assess a VLE in this more selective manner.

Keywords: Formative assessment feedback, virtual learning environment, theory of planned behavior, perceived behavioral control.

970 Developing New Algorithm and Its Application on Optimal Control of Pumps in Water Distribution Network

Authors: R. Rajabpour, N. Talebbeydokhti, M. H. Ahmadi

Abstract:

In recent years, new techniques for solving complex problems in engineering have been proposed. One of these techniques is the JPSO algorithm. By making innovative changes to the jump mechanism of JPSO, it is possible to construct a graph-based variant called G-JPSO. In this paper, the new algorithm was evaluated on the Fletcher-Powell optimal control problem and on the optimal control of pumps in a water distribution network. Optimal pump control comprises an optimal operating timetable (on/off status) for each pump over the desired time interval. The maximum number of on/off switches for each pump is imposed on the objective function as an additional constraint. To determine the optimal operation of the pumps, a model-based optimization-simulation algorithm was developed based on the G-JPSO and JPSO algorithms. The results of the proposed algorithm compared well with those of the ant colony, genetic and JPSO algorithms. This shows the robustness of the proposed algorithm in finding near-optimum solutions at reasonable computational cost.
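
As a minimal illustration of the schedule representation and the switching constraint mentioned above (a binary on/off status per pump per time interval, with a cap on the number of status changes folded into the objective); the penalty handling below is an assumption, not the authors' exact objective:

```python
def count_switches(schedule):
    """Number of on/off status changes in a binary pump schedule, e.g. [0,1,1,0,1] -> 3."""
    return sum(1 for a, b in zip(schedule, schedule[1:]) if a != b)

def penalized_cost(energy_cost, schedules, max_switches, penalty=1e6):
    """Add a large penalty to the operating cost whenever any pump exceeds the
    allowed number of on/off switches over the planning horizon."""
    violations = sum(max(0, count_switches(s) - max_switches) for s in schedules)
    return energy_cost + penalty * violations

# Example: two pumps over six intervals, at most two switches allowed each
schedules = [[0, 1, 1, 1, 0, 0], [1, 0, 1, 0, 1, 0]]
print(count_switches(schedules[0]))                       # -> 2
print(penalized_cost(1200.0, schedules, max_switches=2))  # second pump violates the cap
```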

Keywords: G-JPSO, operation, optimization, pumping station, water distribution networks.

969 Investigating Simple Multipath Compensation for Frequency Modulated Signals at Lower Frequencies

Authors: Lusungu Ndovi

Abstract:

Radio propagation from point to point is affected by the physical channel in many ways. A signal arriving at a destination travels through a number of different paths, which are referred to as multipaths. Research in this area of wireless communications has progressed well over the years, taking different angles of focus: some researchers focus on ways of reducing or eluding multipath effects, whilst others focus on mitigating those effects through compensation schemes. Baseband processing is seen as one field of signal processing that is cardinal to the advancement of software defined radio technology. This has led to wide research into carrying out certain algorithms at baseband. This paper considers compensating for multipath for frequency modulated signals. The compensation process is carried out at radio frequency (RF) and at quadrature baseband (QBB) and the results are compared. Simulations are carried out using MATLAB so as to show the benefits of working at lower QBB frequencies rather than at RF.

Keywords: Quadrature baseband, Radio frequency, Multipath Compensation, Frequency modulation, Signal Processing.

968 A Matrix Evaluation Model for Sustainability Assessment of Manufacturing Technologies

Authors: Q. Z. Yang, B. H. Chua, B. Song

Abstract:

Technology assessment is a vital part of the decision process in manufacturing, particularly for decisions on the selection of new sustainable manufacturing processes. To assess these processes, a matrix approach is introduced and sustainability assessment models are developed. Case studies show that the matrix-based approach provides a flexible and practical way for sustainability evaluation of new manufacturing technologies such as those used in surface coating. The technology assessment of coating processes reveals that, compared with powder coating, sol-gel coating can deliver better technical, economic and environmental sustainability with respect to the selected sustainability evaluation criteria for a decorative coating application of car wheels.

Keywords: Evaluation matrix, sustainable manufacturing, surface coating, technology assessment

967 A Fuzzy Logic Based Navigation of a Mobile Robot

Authors: Anis Fatmi, Amur Al Yahmadi, Lazhar Khriji, Nouri Masmoudi

Abstract:

One of the long-standing challenges in mobile robotics is the ability to navigate autonomously, avoiding modeled and unmodeled obstacles, especially in crowded and unpredictably changing environments. A successful way of structuring the navigation task in order to deal with this problem is within behavior-based navigation approaches. In this study, issues of individual behavior design and action coordination of the behaviors are addressed using fuzzy logic. A layered approach is employed in this work, in which a supervision layer, based on the context, makes a decision as to which behavior(s) to process (activate) rather than processing all behaviors and then blending the appropriate ones; as a result, time and computational resources are saved.
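
As a simplified sketch of fuzzy behavior evaluation of the kind described (a two-rule obstacle-avoidance behavior); the membership breakpoints and rule base below are assumptions, not the paper's design:

```python
def ramp(x, a, b):
    """Piecewise-linear membership rising from 0 at a to 1 at b (clamped)."""
    return 0.0 if x <= a else 1.0 if x >= b else (x - a) / (b - a)

def avoid_obstacle(front, left, right):
    """Tiny two-rule fuzzy behavior: if an obstacle is near in front, steer toward
    the freer side. AND is min; the crisp command is a weighted mean in [-1, 1]
    (positive = turn right). Membership breakpoints (in metres) are assumptions."""
    near_front = 1.0 - ramp(front, 0.5, 1.5)        # 'near' fades out between 0.5 m and 1.5 m
    freer_right = ramp(right - left, 0.0, 2.0)
    freer_left = ramp(left - right, 0.0, 2.0)
    w_right = min(near_front, freer_right)          # rule 1: near front AND right freer -> turn right
    w_left = min(near_front, freer_left)            # rule 2: near front AND left freer  -> turn left
    total = w_right + w_left
    return 0.0 if total == 0.0 else (w_right - w_left) / total

print(avoid_obstacle(front=0.4, left=0.5, right=2.5))   # -> 1.0 (sharp right turn)
```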

Keywords: Behavior based navigation, context based coordination, fuzzy logic, mobile robots.

966 Agent-based Simulation for Blood Glucose Control in Diabetic Patients

Authors: Sh. Yasini, M. B. Naghibi-Sistani, A. Karimpour

Abstract:

This paper employs a new approach to regulate the blood glucose level of a type I diabetic patient under intensive insulin treatment. The closed-loop control scheme incorporates expert knowledge about treatment by using reinforcement learning theory to maintain the normoglycemic average of 80 mg/dl and the normal condition for free plasma insulin concentration in a severe initial state. The insulin delivery rate is obtained off-line by using the Q-learning algorithm, without requiring an explicit model of the environment dynamics. The implementation of the insulin delivery rate, therefore, requires simple function evaluation and minimal online computations. Controller performance is assessed in terms of its ability to reject the effect of meal disturbance and to overcome the variability in the glucose-insulin dynamics from patient to patient. Computer simulations are used to evaluate the effectiveness of the proposed technique and to show its superiority in controlling hyperglycemia over other existing algorithms.
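
As a generic illustration of the tabular Q-learning update used for this kind of off-line policy computation; the state/action discretization, reward and parameter values below are placeholders, not the paper's clinical model:

```python
import numpy as np

n_states, n_actions = 100, 10        # e.g. discretized glucose levels x insulin delivery rates (assumed)
alpha, gamma = 0.1, 0.95             # learning rate and discount factor
Q = np.zeros((n_states, n_actions))

def q_update(state, action, reward, next_state):
    """One Q-learning step: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    td_target = reward + gamma * np.max(Q[next_state])
    Q[state, action] += alpha * (td_target - Q[state, action])

def greedy_insulin_action(state):
    """After off-line learning, the on-line controller is a simple table lookup."""
    return int(np.argmax(Q[state]))

q_update(state=3, action=2, reward=-1.0, next_state=4)
print(greedy_insulin_action(3))
```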

Keywords: Insulin Delivery rate, Q-learning algorithm, Reinforcement learning, Type I diabetes.

965 Using Perspective Schemata to Model the ETL Process

Authors: Valeria M. Pequeno, Joao Carlos G. M. Pires

Abstract:

Data Warehouses (DWs) are repositories which contain the unified history of an enterprise for decision support. The data must be Extracted from information sources, Transformed and integrated to be Loaded (ETL) into the DW, using ETL tools. These tools focus on data movement, where the models are only used as a means to this aim. From a conceptual viewpoint, the authors want to innovate the ETL process in two ways: 1) to make compatibility between models clear in a declarative fashion, using correspondence assertions, and 2) to identify the instances of different sources that represent the same entity in the real world. This paper presents an overview of the proposed framework to model the ETL process, which is based on the use of a reference model and perspective schemata. This approach provides the designer with a better understanding of the semantics associated with the ETL process.

Keywords: conceptual data model, correspondence assertions, data warehouse, data integration, ETL process, object relational database.

964 Multi-Criteria Nautical Ports Capacity and Services Planning

Authors: N. Perko, N. Kavran, M. Bukljaš, I. Berbić

Abstract:

This paper presents the results of research on a proposed methodology for nautical port capacity planning based on a multi-criteria approach with defined criteria, applied in the Adriatic Sea region. The purpose was to analyze the determinants, namely the characteristics of infrastructure and services of allocated nautical port capacity, which are crucial for the successful operation of nautical ports, especially during the COVID-19 pandemic. Defining priorities for short-term and long-term planning is essential not only for the development of nautical tourism but also for developing the maritime system; unfortunately, this is not always carried out. Evaluation of resource use should follow from a detailed analysis of all aspects of those resources, bearing in mind that nautical tourism should use resources in a sustainable manner and generates effects in both the tourism and maritime sectors. Consequently, the multiplier effect of nautical tourism, which should be defined and quantified in detail, should make it one of the major competitive products on the Croatian Adriatic and in the Mediterranean. Research on nautical tourism is necessary to quantify these effects and to develop the required planning system. In the future, the greatest threat to the long-term sustainable development of nautical tourism may be its further uncontrolled, unlimited and undirected development, especially under the pressure of demand for new moorings in the Mediterranean that markedly exceeds supply. The results of this research are applicable to nautical port management and to decision makers involved in maritime transport system development. This paper presents the research and the obtained result: a developed methodology for nautical port capacity planning based on multi-criteria decision-making. The proposed multi-criteria capacity planning approach includes four criteria (spatial-transport, cost-infrastructure, ecological and organizational criteria, and additional services). The importance of the criteria and sub-criteria is evaluated and forms the basis for a sensitivity analysis of that importance. Based on the analysis of the identified and quantified importance of the criteria and sub-criteria, together with the sensitivity analysis and the analysis of changes in that quantified importance, scientific and applicable results are presented. The obtained results are of practical use to nautical port management in planning capacity increases and further development, and in adapting existing nautical ports. The research is applicable and replicable in other seas, and the results are especially important and useful within the challenging maritime development framework of the COVID-19 pandemic.

Keywords: Adriatic Sea, capacity, infrastructures, maritime system, methodology, nautical ports, nautical tourism, service.

963 CPU Architecture Based on Static Hardware Scheduler Engine and Multiple Pipeline Registers

Authors: Ionel Zagan, Vasile Gheorghita Gaitan

Abstract:

The development of CPUs, and of the real-time systems based on them, has made it possible to use time at increasingly fine resolutions. Together with scheduling methods and algorithms, time organization has been improved so as to respond positively to the need for optimization and to the way in which the CPU is used. This presentation contains both a detailed theoretical description and the results obtained from research on improving the performance of the nMPRA (Multi Pipeline Register Architecture) processor by implementing specific functions in hardware. The proposed CPU architecture has been developed, simulated and validated using the FPGA Virtex-7 circuit, via an SoC project. Although the nMPRA processor hardware structure with five pipeline stages is very complex, the present paper presents and analyzes the tests dedicated to the implementation of the CPU and of the on-chip memory for instructions and data. In order to practically implement and test the entire SoC project, various tests have been performed to verify the drivers for peripherals and the boot module, named Bootloader.

Keywords: Hardware scheduler, nMPRA processor, real-time systems, scheduling methods.

962 An Output Oriented Super-Efficiency Model for Considering Time Lag Effect

Authors: Yanshuang Zhang, Byungho Jeong

Abstract:

There exists some time lag between the consumption of inputs and the production of outputs. This time lag effect should be considered in calculating the efficiency of decision making units (DMUs). Recently, a couple of DEA models were developed for considering the time lag effect in the efficiency evaluation of research activities. However, these models cannot discriminate between efficient DMUs because of the nature of the basic DEA model, in which efficiency scores are limited to 1. This problem can be resolved by a super-efficiency model. However, a super-efficiency model sometimes causes an infeasibility problem. This paper suggests an output-oriented super-efficiency model for efficiency evaluation under the consideration of the time lag effect. A case example using a long-term research project is given to compare the suggested model with the MpO model.
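
For reference, a standard output-oriented, constant-returns-to-scale super-efficiency formulation in the Andersen-Petersen sense (the paper's time-lag extension and the MpO model are not reproduced), where DMU k is excluded from its own reference set:

```latex
\begin{align*}
\max_{\phi_k,\;\lambda} \quad & \phi_k \\
\text{s.t.} \quad & \sum_{j \neq k} \lambda_j x_{ij} \le x_{ik}, && i = 1, \dots, m, \\
                  & \sum_{j \neq k} \lambda_j y_{rj} \ge \phi_k \, y_{rk}, && r = 1, \dots, s, \\
                  & \lambda_j \ge 0, && j \neq k .
\end{align*}
```

Because DMU k cannot appear in its own reference set, efficient DMUs obtain a value of φ_k below 1 (a super-efficiency score above 1 when reported as 1/φ_k), so efficient units are no longer all tied at a score of 1 and can be ranked.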

Keywords: DEA, Super-efficiency, Time Lag.

961 Multi Objective Micro Genetic Algorithm for Combine and Reroute Problem

Authors: Soottipoom Yaowiwat, Manoj Lohatepanont, Proadpran Punyabukkana

Abstract:

Several approaches, such as linear programming, network modeling, greedy heuristics and decision support systems, are well-known approaches for solving the irregular airline operations problem. This paper presents an alternative approach based on a Multi Objective Micro Genetic Algorithm. The aim of this research is to introduce the concept of the Multi Objective Micro Genetic Algorithm as a tool to solve the irregular airline operation, combine and reroute problem. The experimental results indicated that the model could obtain optimal solutions within a few seconds.

Keywords: Irregular Airline Operation, Combine and Reroute Routine, Genetic Algorithm, Micro Genetic Algorithm, Multi Objective Optimization, Evolutionary Algorithm.

960 A Heuristic Based Conceptual Framework for Product Innovation

Authors: Amalia Suzianti

Abstract:

This research elaborates decision models for product innovation in the early phases, focusing on one of the most widely implemented methods in marketing research, conjoint analysis, and the related conjoint-based models, with special focus on heuristic programming techniques for the development of optimal product innovation. The concept, potential, requirements and limitations of conjoint analysis and its conjoint-based heuristic successors are analysed, and the development of a conceptual framework around the Genetic Algorithm (GA), as one of the most widely implemented heuristic methods for developing product innovations, is discussed.

Keywords: Product Innovation, Conjoint Analysis, Heuristic Model, Genetic Algorithm

959 CFD Simulations to Examine Natural Ventilation of a Work Area in a Public Building

Authors: An-Shik Yang, Chiang-Ho Cheng, Jen-Hao Wu, Yu-Hsuan Juan

Abstract:

Natural ventilation has played an important role in many low-energy building designs. It has also been recognized as an essential means of persistently bringing fresh, cool air from the outside into a building. This study carried out computational fluid dynamics (CFD)-based simulations to examine the natural ventilation development of a work area in a public building. The simulated results can be useful for better understanding the indoor microclimate and the interaction of wind with buildings. Besides, this CFD simulation procedure can serve as an effective analysis tool to characterize the airing performance and thereby optimize the building ventilation, supporting architects, planners and other decision makers in improving the natural ventilation design of public buildings.

Keywords: CFD simulations, Natural ventilation, Microclimate, Wind environment.

958 Dissecting Big Trajectory Data to Analyse Road Network Travel Efficiency

Authors: Rania Alshikhe, Vinita Jindal

Abstract:

Digital innovation has played a crucial role in managing smart transportation. For this, big trajectory data collected from traveling vehicles such as taxis, through installed global positioning system (GPS)-enabled devices, can be utilized. It offers an unprecedented opportunity to trace the movements of vehicles at fine spatiotemporal granularity. This paper aims to explore big trajectory data to measure the travel efficiency of road networks using the proposed statistical travel efficiency measure (STEM) across an entire city. Further, it identifies the causes of low travel efficiency with the proposed least square approximation network-based causality exploration (LANCE). Finally, the resulting data analysis reveals the causes of low travel efficiency, along with the road segments that need to be optimized to improve the traffic conditions and thus minimize the average travel time from a given point A to point B in the road network. The obtained results show that our proposed approach outperforms the baseline algorithms for measuring the travel efficiency of the road network.

Keywords: GPS trajectory, road network, taxi trips, digital map, big data, STEM, LANCE

957 UDCA: An Energy Efficient Clustering Algorithm for Wireless Sensor Network

Authors: Boregowda S. B., Hemanth Kumar A. R., Babu N. V., Puttamadappa C., and H. S. Mruthyunjaya

Abstract:

In the past few years, the use of wireless sensor networks (WSNs) has increased in applications such as intrusion detection, forest fire detection, disaster management and the battlefield. Sensor nodes are generally battery-operated, low-cost devices. The key challenge in the design and operation of WSNs is to prolong the network lifetime by reducing the energy consumption among sensor nodes. Node clustering is one of the most promising techniques for energy conservation. This paper presents a novel clustering algorithm which maximizes the network lifetime by reducing the amount of communication among sensor nodes. The approach also includes a new distributed cluster formation technique that enables self-organization of a large number of nodes, an algorithm for maintaining a constant number of clusters by prior selection of cluster heads, and rotation of the cluster head role to evenly distribute the energy load among all sensor nodes.

Keywords: Clustering algorithms, Cluster head, Energy consumption, Sensor nodes, and Wireless sensor networks.

956 A Fast Silhouette Detection Algorithm for Shadow Volumes in Augmented Reality

Authors: Hoshang Kolivand, Mahyar Kolivand, Mohd Shahrizal Sunar, Mohd Azhar M. Arsad

Abstract:

Real-time shadow generation in virtual environments and Augmented Reality (AR) has been a hot topic over the last three decades. Shadow generation in AR involves a large amount of calculation and needs a fast algorithm to overcome this issue and to be capable of implementation in any real-time renderer. In this paper, a silhouette detection algorithm is presented to generate shadows for AR systems. The Δ+ algorithm is presented, based on extending the edges of occluders to recognize which edges are silhouettes in the case of real-time rendering. An accurate comparison between the proposed algorithm and current silhouette detection algorithms is made to show the reduction in calculation achieved by the presented algorithm. The algorithm is tested in both virtual environments and AR systems. We think that this algorithm has the potential to be a fundamental algorithm for shadow generation in all complex environments.
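
As a baseline sketch of the classic silhouette test that such algorithms build on (an edge shared by one light-facing and one back-facing triangle is a silhouette edge); the Δ+ edge-extension optimization itself is not reproduced, and the mesh below is a toy assumption:

```python
import numpy as np

def silhouette_edges(vertices, triangles, light_dir):
    """Classic test: an edge is a silhouette edge if exactly one of its two
    adjacent triangles faces a directional light (normal . light > 0)."""
    facing = {}
    edge_faces = {}
    for t, (a, b, c) in enumerate(triangles):
        v0, v1, v2 = (np.asarray(vertices[i], dtype=float) for i in (a, b, c))
        normal = np.cross(v1 - v0, v2 - v0)
        facing[t] = float(np.dot(normal, light_dir)) > 0.0
        for e in ((a, b), (b, c), (c, a)):
            edge_faces.setdefault(tuple(sorted(e)), []).append(t)
    return [e for e, fs in edge_faces.items()
            if len(fs) == 2 and facing[fs[0]] != facing[fs[1]]]

# Toy example: two triangles sharing edge (1, 2), folded so only one faces the light
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 1)]
tris = [(0, 1, 2), (1, 2, 3)]
print(silhouette_edges(verts, tris, light_dir=(0, 0, 1)))   # -> [(1, 2)]
```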

Keywords: Silhouette detection, shadow volumes, real-time shadows, rendering, augmented reality.

955 A Hidden Markov Model for Modeling Pavement Deterioration under Incomplete Monitoring Data

Authors: Nam Lethanh, Bryan T. Adey

Abstract:

In this paper, the potential use of an exponential hidden Markov model to model a hidden pavement deterioration process, i.e., one that is not directly measurable, is investigated. It is assumed that the evolution of the physical condition, which is the hidden process, and the evolution of the values of pavement distress indicators can be adequately described using discrete condition states and modeled as Markov processes. It is also assumed that condition data can be collected by visual inspections over time and represented continuously using an exponential distribution. The advantage of using such a model in the decision-making process is illustrated through an empirical study using real-world data.
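
As a compact, hedged illustration of the likelihood computation in such a model (the forward algorithm with an exponential emission density per hidden condition state); the transition matrix and rate parameters below are made-up placeholders, not values estimated in the paper:

```python
import numpy as np

def forward_likelihood(obs, pi, A, rates):
    """Likelihood of a sequence of continuous distress observations under an HMM
    whose emission in hidden state j is Exponential(rates[j]),
    i.e. b_j(o) = rates[j] * exp(-rates[j] * o)."""
    pi, A, rates = map(np.asarray, (pi, A, rates))
    emis = lambda o: rates * np.exp(-rates * o)
    alpha = pi * emis(obs[0])                 # initialization
    for o in obs[1:]:
        alpha = (alpha @ A) * emis(o)         # induction step of the forward algorithm
    return alpha.sum()                        # P(observations | model)

# Two hidden condition states (good, deteriorated); illustrative parameters only
pi = [0.9, 0.1]
A = [[0.85, 0.15],                            # row-stochastic transition matrix
     [0.00, 1.00]]                            # deterioration assumed irreversible
rates = [2.0, 0.5]                            # mean observed distress is 1/rate in each state
print(forward_likelihood([0.2, 0.6, 1.4], pi, A, rates))
```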

Keywords: Deterioration modeling, Exponential distribution, Hidden Markov model, Pavement management

954 On Mobile Checkpointing using Index and Time Together

Authors: Awadhesh Kumar Singh

Abstract:

Checkpointing is one of the commonly used techniques to provide fault tolerance in distributed systems so that the system can operate even if one or more components have failed. However, mobile computing systems are constrained by low bandwidth, mobility, lack of stable storage, frequent disconnections and limited battery life. Hence, checkpointing protocols having a smaller number of synchronization messages and fewer checkpoints are preferred in mobile environments. There are two different, although not orthogonal, approaches to checkpointing mobile computing systems, namely time-based and index-based. Our protocol is a fusion of these two approaches, though not the first of its kind. In the present exposition, an index-based checkpointing protocol has been developed which uses time to indirectly coordinate the creation of consistent global checkpoints for mobile computing systems. The proposed algorithm is non-blocking, adaptive, and does not use any control message. Compared to other contemporary checkpointing algorithms, it is computationally more efficient because it takes a smaller number of checkpoints and does not need to compute dependency relationships. A brief account of important and relevant works in both fields, time-based and index-based, has also been included in the presentation.

Keywords: Checkpointing, forced checkpoint, mobile computing, recovery, time-coordinated.

953 When Explanations “Cause” Error: A Look at Representations and Compressions

Authors: Michael Lissack

Abstract:

We depend upon explanation in order to “make sense” of our world. And making sense is all the more important when dealing with change. But what happens if our explanations are wrong? This question is examined with respect to two types of explanatory model. Models based on labels and categories we shall refer to as “representations.” More complex models involving stories, multiple algorithms, rules of thumb, questions and ambiguity we shall refer to as “compressions.” Both compressions and representations are reductions. But representations are far more reductive than compressions. Representations can be treated as a set of defined meanings; coherence with regard to a representation is the degree of fidelity between the item in question and the definition of the representation, of the label. By contrast, compressions contain enough degrees of freedom and ambiguity to allow us to make internal predictions so that we may determine our potential actions in the possibility space. Compressions are explanatory via mechanism. Representations are explanatory via category. Managers often confuse their evocation of a representation (category inclusion) with the creation of a context of compression (description of mechanism). When this type of explanatory error occurs, more errors follow. In the drive for efficiency, such substitutions are all too often proclaimed, at the manager's peril.

Keywords: Coherence, Emergence, Reduction, Model

952 Benchmarking of Pentesting Tools

Authors: Esteban Alejandro Armas Vega, Ana Lucila Sandoval Orozco, Luis Javier García Villalba

Abstract:

The benchmarking of tools for dynamic analysis of vulnerabilities in web applications is something that is done periodically, because these tools update their knowledge bases and search algorithms from time to time in order to improve their accuracy. Unfortunately, the vast majority of these evaluations are made by software enthusiasts who publish their results on blogs or on non-academic websites, and always with the same evaluation methodology. Similarly, most academics who have carried out this type of analysis from a scientific approach make their analysis within the same methodology as the empirical authors. This paper is based on the interest in finding answers to questions that many users of this type of tool have been asking over the years, such as whether the tool truly tests and evaluates every vulnerability that it claims to, or whether the tool really delivers a complete report of all the vulnerabilities tested and exploited. These kinds of questions have also motivated previous work, but without real answers. The aim of this paper is to show results that truly answer, at least for the tested tools, all those unanswered questions. All the results have been obtained by changing the common benchmarking model used in all those previous works.

Keywords: Cybersecurity, IDS, security, web scanners, web vulnerabilities.

951 Modified Naïve Bayes Based Prediction Modeling for Crop Yield Prediction

Authors: Kefaya Qaddoum

Abstract:

Most greenhouse growers desire a determined amount of yield in order to accurately meet market requirements. The purpose of this paper is to model a simple but often satisfactory supervised classification method. The original naive Bayes has a serious weakness, namely the retention of redundant predictors. In this paper, a regularization technique is used to obtain a computationally efficient classifier based on naive Bayes. The suggested construction, using an L1 penalty, is capable of clearing out redundant predictors, where a modification of the LARS algorithm is devised to solve this problem, making this method applicable to a wide range of data. In the experimental section, a study was conducted to examine the effect of redundant and irrelevant predictors and to test the method on the WSG data set for tomato yields, where there are many more predictors than data points and the urgent need to predict weekly yield is the goal of this approach. Finally, the modified approach is compared with several naive Bayes variants and other classification algorithms (SVM and kNN), and is shown to be fairly good.
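
As a hedged approximation of the idea only (L1-based pruning of predictors followed by a naive Bayes fit), using scikit-learn's LassoLars and GaussianNB as stand-ins rather than the paper's modified classifier, on synthetic data with far more predictors than samples:

```python
import numpy as np
from sklearn.linear_model import LassoLars
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))                     # many more predictors than samples
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)      # only two informative predictors

# Step 1: L1 (LARS-based) selection discards most redundant/irrelevant predictors
selector = LassoLars(alpha=0.05).fit(X, y)
keep = np.flatnonzero(selector.coef_)
print("kept predictors:", keep)

# Step 2: naive Bayes trained on the surviving predictors only
clf = GaussianNB().fit(X[:, keep], y)
print("training accuracy:", clf.score(X[:, keep], y))
```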

Keywords: Tomato yields prediction, naive Bayes, redundancy

950 Multichannel Scheme under Max-Min Fairness Environment for Cognitive Radio Networks

Authors: Hans R. Márquez, Cesar Hernández, Ingrid Páez

Abstract:

This paper develops a multiple channel assignment model which makes it possible to take advantage of spectrum opportunities in cognitive radio networks in the most efficient way. The developed scheme allows several assignments of available, frequency-adjacent channels, which require a larger bandwidth, under an equality environment. The hybrid assignment model is composed of two algorithms: one that ranks and selects available frequency channels, and another in charge of establishing max-min fairness so as not to restrict the spectrum opportunities of the other secondary users, who also wish to transmit. Measurements were made of average bandwidth, average delay, and fairness for several channel assignments. The results were evaluated with experimental spectrum occupancy data captured from the GSM frequency band. The developed model shows evidence of improvement in spectrum opportunity use and a wider average transmission bandwidth for each secondary user, while maintaining equality criteria in channel assignment.
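
As a simplified sketch of the max-min fairness idea in the assignment step (each available channel goes to the secondary user currently holding the least bandwidth); the ranking algorithm and the handling of frequency-adjacent channels in the actual model are not reproduced:

```python
def max_min_assign(users, channel_bandwidths):
    """Greedy max-min style assignment: each available channel (with its bandwidth
    in kHz) goes to the secondary user currently holding the least total bandwidth,
    so no user's spectrum opportunities are starved by the others."""
    allocation = {u: [] for u in users}
    total = {u: 0.0 for u in users}
    for ch, bw in sorted(channel_bandwidths.items(), key=lambda kv: -kv[1]):
        poorest = min(users, key=lambda u: total[u])
        allocation[poorest].append(ch)
        total[poorest] += bw
    return allocation, total

users = ["SU1", "SU2", "SU3"]
channels = {"ch1": 200.0, "ch2": 200.0, "ch3": 400.0, "ch4": 200.0, "ch5": 200.0}
alloc, totals = max_min_assign(users, channels)
print(alloc)
print(totals)   # totals end up as balanced as the discrete channel widths allow
```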

Keywords: Bandwidth, fairness, multichannel, secondary users.

949 Evaluation of Model Evaluation Criterion for Software Development Effort Estimation

Authors: S. K. Pillai, M. K. Jeyakumar

Abstract:

Estimation of model parameters is necessary to predict the behavior of a system. Model parameters are estimated using optimization criteria. Most algorithms use historical data to estimate model parameters. The known target values (actual) and the output produced by the model are compared. The differences between the two form the basis for estimating the parameters. In order to compare different models developed using the same data, different criteria are used. The data obtained for short-scale projects are used here. We consider the software effort estimation problem using a radial basis function network. The accuracy comparison is made using various existing criteria for one and two predictors. Then, we propose a new criterion based on linear least squares for evaluation and compare the results for one and two predictors. We have considered another data set and evaluated prediction accuracy using the new criterion. The new criterion is easy to comprehend compared to a single statistic. Although software effort estimation is considered, this method is applicable to any modeling and prediction task.

Keywords: Software effort estimation, accuracy, Radial Basis Function, linear least squares.

948 Analysis of Testing and Operational Software Reliability in SRGM based on NHPP

Authors: S. Thirumurugan, D. R. Prince Williams

Abstract:

Software reliability is one of the key factors in the software development process. Software reliability is estimated using reliability models based on the Non-Homogeneous Poisson Process (NHPP). In most of the literature, software reliability is predicted only in the testing phase, which leads to wrong decision making. In this paper, two software reliability concepts, the testing and operational phases, are studied in detail. Using the S-shaped Software Reliability Growth Model (SRGM) and the exponential SRGM, the testing and operational reliability values are obtained. Finally, the two reliability values are compared and the optimal release time is investigated.
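
For concreteness, the standard NHPP mean value functions behind the two models named above, the exponential (Goel-Okumoto) SRGM and the delayed S-shaped SRGM, together with the usual conditional reliability over a mission time x after test time t; here a is the expected total number of faults and b the fault detection rate (textbook forms, not the parameter values fitted in the paper):

```latex
m_{\mathrm{exp}}(t) = a\left(1 - e^{-bt}\right), \qquad
m_{\mathrm{S}}(t) = a\left(1 - (1 + bt)\,e^{-bt}\right), \qquad
R(x \mid t) = \exp\!\left[-\left(m(t + x) - m(t)\right)\right].
```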

Keywords: Error Detection Rate, Estimation of Parameters, Instantaneous Failure Rate, Mean Value Function, Non Homogenous Poisson Process (NHPP), Software Reliability.
