Search results for: low complexity
341 Analysis of the Omnichannel Delivery Network with Application to Last Mile Delivery
Authors: Colette Malyack, Pius Egbelu
Abstract:
Business-to-Customer (B2C) delivery options have improved to meet increased demand in recent years. The change in end users has forced logistics networks to focus on the customer service and sentiment that would previously have been the concern of the originating company or organization. This has put increased pressure on logistics companies to extend traditional B2B networks into a B2C solution while accommodating additional costs, roadblocks, and customer sentiment; the result has been the creation of the omnichannel delivery network, which encompasses a number of traditional and modern methods of package delivery. In this paper, the many solutions within the omnichannel delivery network are defined and discussed. The analysis shows that the omnichannel delivery network can be applied to reduce the complexity of package delivery and to provide customers with more options. Applied correctly, the result is a reduction in cost to the logistics company over time, even with an initial increase in cost to obtain the technology.
Keywords: Network planning, last mile delivery, LMD, omnichannel delivery network, omnichannel logistics.
340 The Rank-scaled Mutation Rate for Genetic Algorithms
Authors: Mike Sewell, Jagath Samarabandu, Ranga Rodrigo, Kenneth McIsaac
Abstract:
A novel method of individual-level adaptive mutation rate control, called the rank-scaled mutation rate for genetic algorithms, is introduced. The rank-scaled mutation rate controlled genetic algorithm varies the mutation parameters based on the rank of each individual within the population, so that the distribution of fitness across the population is taken into consideration when forming the new mutation rates. The best fit individuals mutate at the lowest rate and the least fit mutate at the highest rate. The complexity of the algorithm is of the order of an individual adaptation scheme and is lower than that of a self-adaptation scheme. The proposed algorithm is tested on two common problems, namely, numerical optimization of a function and the traveling salesman problem. The results show that the proposed algorithm outperforms both the fixed and deterministic mutation rate schemes. It is best suited for problems with several local optimum solutions that do not demand excessive mutation rates.
Keywords: Genetic algorithms, mutation rate control, adaptive mutation.
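To make the rank-to-rate mapping concrete, a minimal sketch follows: each individual receives a mutation probability according to its fitness rank. The linear interpolation between p_min and p_max, the bit-flip mutation, and the one-max fitness are illustrative assumptions, not the authors' exact formulation.

```python
import random

def rank_scaled_mutation_rates(population, fitness, p_min=0.001, p_max=0.1):
    """Assign a per-individual mutation rate from its fitness rank.

    The fittest individual receives p_min, the least fit receives p_max;
    intermediate ranks are interpolated linearly (an assumed scaling).
    """
    ranked = sorted(population, key=fitness, reverse=True)  # best first
    n = len(ranked)
    rates = {}
    for rank, individual in enumerate(ranked):
        t = rank / (n - 1) if n > 1 else 0.0
        rates[id(individual)] = p_min + t * (p_max - p_min)
    return rates

def mutate(bitstring, rate):
    """Flip each bit independently with the individual's own rate."""
    return [b ^ (random.random() < rate) for b in bitstring]

# Usage on a toy one-max population of bitstrings.
pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(10)]
rates = rank_scaled_mutation_rates(pop, fitness=sum)
pop = [mutate(ind, rates[id(ind)]) for ind in pop]
```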
339 A Minimum Spanning Tree-Based Method for Initializing the K-Means Clustering Algorithm
Authors: J. Yang, Y. Ma, X. Zhang, S. Li, Y. Zhang
Abstract:
The traditional k-means algorithm has been widely used as a simple and efficient clustering method. However, the algorithm often converges to local minima because it is sensitive to the initial cluster centers. In this paper, an algorithm for selecting initial cluster centers on the basis of the minimum spanning tree (MST) is presented. The set of vertices in the MST with the same degree is regarded as a whole, which is used to find the skeleton data points. Furthermore, a distance measure between the skeleton data points that takes both degree and Euclidean distance into consideration is presented. Finally, the MST-based initialization method for the k-means algorithm is presented, and the corresponding time complexity is analyzed as well. The presented algorithm is tested on five data sets from the UCI Machine Learning Repository. The experimental results illustrate the effectiveness of the presented algorithm compared to three existing initialization methods.
Keywords: Degree, initial cluster center, k-means, minimum spanning tree.
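A hedged sketch of the MST-guided seeding idea is given below. The MST and pairwise distances come from SciPy; treating the highest-degree vertices as skeleton points and the degree-weighted spread criterion are simplified assumptions, not the paper's exact distance measure.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial import distance_matrix

def mst_initial_centers(X, k):
    """Pick k initial k-means centers guided by the degrees of the MST over X."""
    D = distance_matrix(X, X)
    mst = minimum_spanning_tree(D).toarray()
    adjacency = (mst + mst.T) > 0                 # undirected edges of the MST
    degree = adjacency.sum(axis=1)

    # Treat the highest-degree vertices as "skeleton" candidates (assumption).
    skeleton = np.argsort(degree)[::-1]

    # Greedily keep candidates that are far, in a degree-weighted sense,
    # from the centers already chosen -- a simplified stand-in for the
    # paper's combined degree/Euclidean distance measure.
    centers = [skeleton[0]]
    for idx in skeleton[1:]:
        if len(centers) == k:
            break
        combined = min(D[idx, c] * (1 + degree[c]) for c in centers)
        if combined > np.median(D):
            centers.append(idx)
    # Fall back to remaining skeleton points if the threshold was too strict.
    for idx in skeleton:
        if len(centers) == k:
            break
        if idx not in centers:
            centers.append(idx)
    return X[np.array(centers[:k])]

X = np.random.rand(200, 2)
init = mst_initial_centers(X, k=3)   # feed as initial centers to k-means
```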
338 Impact of the Existence of One-Way Functions on the Conceptual Difficulties of Quantum Measurements
Authors: Arkady Bolotin
Abstract:
One-way functions are functions that are easy to compute but hard to invert. Their existence is an open conjecture; it would imply the existence of intractable problems (i.e., NP problems which are not in the P complexity class). If true, the existence of one-way functions would have an impact on the theoretical framework of physics, in particular quantum mechanics. This aspect of one-way functions has never been shown before. In the present work, we put forward the following. We can calculate the microscopic state (say, the particle spin in the z direction) of a macroscopic system (a measuring apparatus registering the particle z-spin) from the system's macroscopic state (the apparatus output); let us call this association the function F. The question is: can we compute the function F in the inverse direction? In other words, can we compute the macroscopic state of the system from its microscopic state (the preimage F^-1)? In the paper, we assume that the function F is a one-way function. The assumption implies that at the macroscopic level the Schrödinger equation becomes unfeasible to compute. This unfeasibility acts as a limit on the validity of the linear Schrödinger equation.
Keywords: One-way functions, P versus NP problem, quantum measurements.
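The argument hinges on the standard cryptographic notion of a one-way function; the block below restates the textbook definition for reference, and is not a formulation specific to the paper.

```latex
% Standard definition of a one-way function (textbook form, not the paper's notation):
% f is easy to compute but hard to invert on average for any probabilistic
% polynomial-time (PPT) adversary A.
\[
\begin{aligned}
&\text{(easy to compute)} && f(x) \text{ is computable in time } \mathrm{poly}(|x|),\\
&\text{(hard to invert)}  && \forall \text{ PPT } \mathcal{A}:\;
  \Pr_{x \leftarrow \{0,1\}^{n}}\!\left[ f\!\left(\mathcal{A}\big(f(x), 1^{n}\big)\right) = f(x) \right]
  \le \mathrm{negl}(n).
\end{aligned}
\]
```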
337 Multidimensional Performance Management
Authors: David Wiese
Abstract:
In order to maximize the efficiency of an information management platform and to assist in decision making, the collection, storage and analysis of performance-relevant data has become of fundamental importance. This paper addresses the merits and drawbacks of the OLAP paradigm for efficiently navigating large volumes of performance measurement data hierarchically. System managers or database administrators navigate through adequately (re)structured measurement data aiming to detect performance bottlenecks, identify causes of performance problems, or assess the impact of configuration changes on the system and its representative metrics. Of particular importance is finding the root cause of an imminent problem threatening the availability and performance of an information system. By leveraging OLAP techniques, in contrast to traditional static reporting, this can be accomplished within a moderate amount of time and with little processing complexity. It is shown how OLAP techniques can help improve the understandability and manageability of measurement data and, hence, improve the whole performance analysis process.
Keywords: Data Warehousing, OLAP, Multidimensional Navigation, Performance Diagnosis, Performance Management, Performance Tuning.
336 Mass Customization in Supply Chain Management Environment: A Review
Authors: Nirjhar Roy, V. R. Komma, Jitendra Kumar
Abstract:
In supply chain management, the customer is the most significant component, and mass customization is closely related to customers because it is the capability of an industry or organization to deliver highly customized products and services to its customers with flexibility and integration, providing such a variety of products that nearly everyone can find what they want. Today, companies and markets all over the world face situations in which, on the one hand, customers demand that their orders be completed as quickly as possible while, on the other hand, they require highly customized products and services. By applying mass customization, some companies face unwanted cost and complexity. They are now realizing that they should examine thoroughly what kind of customization would be best suited for their companies. In this paper, the authors review some approaches and principles affecting supply chain management that can be adopted and used by companies to quickly meet customer orders at reduced cost, with a minimum amount of inventory and maximum efficiency.
Keywords: Mass customization, supply chain management.
335 Digital Individual Benefit Statement: The Use of a Triangulation Methodology to Design a Digital Platform for Switzerland
Authors: Catherine Equey Balzli
Abstract:
Old-age retirement pensions are an important concern among the Swiss, but estimating one's income after retirement is difficult due to the Swiss insurance system's complexity. This project's aim is to prepare for developing a digital platform that will allow individuals to plan for retirement in a simplified manner. The main objective of the platform will be to give individuals the tools to check that their savings and retirement benefits will allow them to continue the lifestyle to which they are accustomed once they are retired. The research results, obtained from qualitative (focus group) and quantitative (survey) methodologies, recommend the scope and functionalities for the digital platform to be developed. A main outcome is the need to limit the platform's scope to the old-age pension only (excluding survivors' or disability pensions, for instance). Furthermore, an outcome regarding the functionalities is the proposition of scenarios such as early retirement, changes to income, or modifications to personal status. The development of the digital platform will be a subsequent project.
Keywords: Benefit statement, digital platform, retirement financial planning, social insurances.
334 Signal Driven Sampling and Filtering: A Promising Approach for Time Varying Signals Processing
Authors: Saeed Mian Qaisar, Laurent Fesquet, Marc Renaudin
Abstract:
Mobile systems are powered by batteries, so reducing system power consumption is key to increasing their autonomy. It is known that such systems mostly deal with time-varying signals. Thus, we aim to achieve power efficiency by smartly adapting the system's processing activity in accordance with the local characteristics of the input signal. This is done by completely rethinking the processing chain and adopting signal-driven sampling and processing. In this context, a signal-driven filtering technique based on level-crossing sampling is devised. It adapts the sampling frequency and the filter order by analysing the local variations of the input signal, thereby correlating the processing activity with the signal variations. This leads to a drastic computational gain for the proposed technique compared to the classical one.
Keywords: Level-crossing sampling, activity selection, adaptive rate filtering, computational complexity.
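The core idea of level-crossing sampling, in which a sample is taken only when the signal crosses one of a fixed set of amplitude levels so that quiet stretches produce few samples, can be sketched as below. The level grid, the uniformly sampled reference signal, and the test waveform are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def level_crossing_sample(t, x, levels):
    """Keep only the instants where x crosses one of the given levels.

    t, x   : a finely (uniformly) sampled reference signal
    levels : the quantization levels of the level-crossing sampler (assumed grid)
    """
    kept_t, kept_x = [], []
    for i in range(1, len(x)):
        lo, hi = sorted((x[i - 1], x[i]))
        # A sample is produced whenever a level lies between consecutive points.
        if np.any((levels > lo) & (levels <= hi)):
            kept_t.append(t[i])
            kept_x.append(x[i])
    return np.array(kept_t), np.array(kept_x)

# A bursty test signal: the quiet segment yields few samples, the burst many,
# so downstream filtering activity follows the signal's local variations.
t = np.linspace(0, 1, 2000)
x = np.where(t < 0.5, 0.05 * np.sin(2 * np.pi * 3 * t), np.sin(2 * np.pi * 40 * t))
levels = np.linspace(-1, 1, 17)
ts, xs = level_crossing_sample(t, x, levels)
```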
333 Markov Chain Monte Carlo Model Composition Search Strategy for Quantitative Trait Loci in a Bayesian Hierarchical Model
Authors: Susan J. Simmons, Fang Fang, Qijun Fang, Karl Ricanek
Abstract:
Quantitative trait loci (QTL) experiments have yielded important biological and biochemical information necessary for understanding the relationship between genetic markers and quantitative traits. For many years, most QTL algorithms allowed only one observation per genotype. Recently, there has been an increasing demand for QTL algorithms that can accommodate more than one observation per genotypic distribution. The Bayesian hierarchical model is very flexible and can easily incorporate this information into the model. Herein a methodology is presented that uses a Bayesian hierarchical model to capture the complexity of the data. Furthermore, the Markov chain Monte Carlo model composition (MC3) algorithm is used to search for and identify important markers. An extensive simulation study illustrates that the method captures the true QTL, even under non-normal noise and with up to six QTL.
Keywords: Bayesian hierarchical model, Markov chain Monte Carlo model composition, quantitative trait loci.
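MC3 explores the space of marker subsets with single add/drop proposals accepted by a Metropolis ratio. The sketch below shows that search skeleton with a generic log_score callback standing in for the paper's Bayesian hierarchical posterior, which is an assumption since the actual scoring model is not reproduced here.

```python
import numpy as np

def mc3_search(n_markers, log_score, n_iter=5000, rng=None):
    """Markov chain Monte Carlo model composition over marker subsets.

    log_score(model) must return the log marginal posterior (or any proxy,
    e.g. a negative BIC) of a boolean inclusion vector -- a stand-in for the
    Bayesian hierarchical model of the paper.
    """
    rng = rng or np.random.default_rng()
    model = np.zeros(n_markers, dtype=bool)
    current = log_score(model)
    visits = {}
    for _ in range(n_iter):
        proposal = model.copy()
        j = rng.integers(n_markers)
        proposal[j] = ~proposal[j]                  # add or drop one marker
        new = log_score(proposal)
        if np.log(rng.random()) < new - current:    # Metropolis acceptance
            model, current = proposal, new
        key = tuple(np.flatnonzero(model))
        visits[key] = visits.get(key, 0) + 1
    # Models visited most often are the most promising marker sets.
    return sorted(visits.items(), key=lambda kv: kv[1], reverse=True)
```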
332 Similarity Detection in Collaborative Development of Object-Oriented Formal Specifications
Authors: Fathi Taibi, Fouad Mohammed Abbou, Md. Jahangir Alam
Abstract:
The complexity of today's software systems makes collaborative development necessary to accomplish tasks. Frameworks are necessary to allow developers to perform their tasks independently yet collaboratively. Similarity detection is one of the major issues to consider when developing such frameworks. It allows developers to mine existing repositories when developing their own views of a software artifact, and it is necessary for identifying the correspondences between the views so that they can be merged and their consistency checked. Due to the importance of the requirements specification stage in software development, this paper proposes a framework for collaborative development of Object-Oriented formal specifications along with a similarity detection approach to support the creation, merging and consistency checking of specifications. The paper also explores the impact of using additional concepts on improving the matching results. Finally, the proposed approach is empirically evaluated.
Keywords: Collaborative development, formal methods, Object-Oriented, similarity detection.
331 Design of Digital IIR Filters with the Advantages of Model Order Reduction Technique
Authors: K.Ramesh, A.Nirmalkumar, G.Gurusamy
Abstract:
In this paper, a new model order reduction approach is introduced at the design stage of linear phase digital IIR filters. The complexity of a system can be reduced by adopting the model order reduction method in its design. A mixed method of model order reduction is proposed for linear IIR filters. The proposed method employs the advantages of the factor division technique to derive the reduced-order denominator polynomial, and the reduced-order numerator is obtained based on the resultant denominator polynomial. The order reduction technique is used to reduce the delay units at the design stage of the IIR filter. The validity of the proposed method is illustrated with a design example in the frequency domain, and stability is also examined with the help of a Nyquist plot.
Keywords: Error index (J), factor division method, IIR filter, Nyquist plot, order reduction.
330 A Hybrid Approach for Quantification of Novelty in Rule Discovery
Authors: Vasudha Bhatnagar, Ahmed Sultan Al-Hegami, Naveen Kumar
Abstract:
Rule discovery is an important technique for mining knowledge from large databases. The use of objective measures for discovering interesting rules leads to another data mining problem, although of reduced complexity. Data mining researchers have studied subjective measures of interestingness to reduce the volume of discovered rules and ultimately improve the overall efficiency of the KDD process. In this paper we study the novelty of the discovered rules as a subjective measure of interestingness. We propose a hybrid approach that uses objective and subjective measures to quantify the novelty of the discovered rules in terms of their deviations from the known rules. We analyze the types of deviation that can arise between two rules and categorize the discovered rules according to a user-specified threshold. We implement the proposed framework and experiment with some public datasets. The experimental results are quite promising.
Keywords: Knowledge Discovery in Databases (KDD), Data Mining, Rule Discovery, Interestingness, Subjective Measures, Novelty Measure.
329 Linear Quadratic Gaussian/Loop Transfer Recover Control Flight Control on a Nonlinear Model
Authors: T. Sanches, K. Bousson
Abstract:
As part of the development of a 4D autopilot system for unmanned aerial vehicles (UAVs), i.e. a time-dependent robust trajectory generation and control algorithm, this work addresses the problem of optimal path control based on flight sensor data output that may be unreliable due to noise in data acquisition and/or transmission under certain circumstances. Although several filtering methods, such as the Kalman-Bucy filter or Linear Quadratic Gaussian/Loop Transfer Recovery (LQG/LTR) control, are available, the sheer complexity of the control system, together with the robustness and reliability required of such a system on a UAV for airworthiness-certifiable autonomous flight, required the development of a proper robust filter for a nonlinear system, as a way of further mitigating error propagation to the control system and improving its performance. As such, a nonlinear algorithm based upon the LQG/LTR, validated through computational simulation testing, is proposed in this paper.
Keywords: Autonomous flight, LQG/LTR, nonlinear state estimator, robust flight control and stability.
328 Creating Maintenance Cost Model for University Buildings
Authors: AbdulLateef A. Olanrewaju, Arazi Idrus, Mohd F. Khamidi
Abstract:
Maintenance costs incurred on buildings differ. The difference can result from the type, function, age, health index, size, form, height, location and complexity of the building. These factors contribute to the difficulty of developing a deterministic maintenance cost model. This paper reports preliminary findings on the creation of building maintenance cost distributions for universities in Malaysia. The study is triggered by the need to provide guidance on maintenance cost distributions for decision making. For this purpose, a questionnaire survey was conducted to investigate the distribution of maintenance costs in the universities. Altogether, responses were received from twenty universities comprising both privately and publicly owned institutions. The research found that engineering services, roofing and finishes were the elements contributing the largest share of the maintenance costs. Furthermore, the study indicates the significance of the maintenance cost distribution as a decision-making tool for maintenance management.
Keywords: Performance matrix, university buildings, cost model, Malaysia.
327 Multilayer Neural Network and Fuzzy Logic Based Software Quality Prediction
Authors: Sadaf Sahar, Usman Qamar, Sadaf Ayaz
Abstract:
In the software development lifecycle, quality prediction techniques are of prime importance for minimizing future design errors and expensive maintenance. Many techniques have been proposed by various researchers, but with the increasing complexity of the software lifecycle model, it is crucial to develop a flexible system which can cater for the factors that ultimately have an impact on the quality of the end product. These factors include properties of the software development process and the product along with its operating conditions. In this paper, a neural network (perceptron) based software quality prediction technique is proposed. Using this technique, stakeholders can predict the quality of the resulting software during the early phases of the lifecycle, saving the time and resources otherwise spent on eliminating design errors and on costly maintenance. The technique can be brought into practical use through successful training.
Keywords: Software quality, fuzzy logic, perceptron, prediction.
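A minimal sketch of a perceptron acting as a quality predictor is given below. The metric features, labels, and the plain perceptron update rule are illustrative assumptions; the fuzzy logic component of the proposed technique is not reproduced here.

```python
import numpy as np

# Illustrative training data: rows are modules described by assumed metrics
# (cyclomatic complexity, size in KLOC, churn); label 1 = defect-prone.
X = np.array([[12, 0.8, 30], [3, 0.2, 5], [25, 1.5, 60], [6, 0.4, 8]], float)
y = np.array([1, 0, 1, 0])

# Standardize features, then train a single perceptron with the classic rule.
X = (X - X.mean(axis=0)) / X.std(axis=0)
w, b, lr = np.zeros(X.shape[1]), 0.0, 0.1
for _ in range(100):
    for xi, yi in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        w += lr * (yi - pred) * xi          # update only on misclassification
        b += lr * (yi - pred)

predict = lambda x: int(x @ w + b > 0)      # 1 = likely defect-prone module
```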
326 Efficient Copy-Move Forgery Detection for Digital Images
Authors: Somayeh Sadeghi, Hamid A. Jalab, Sajjad Dadkhah
Abstract:
Due to the availability of powerful image processing software and the improvement of human computer knowledge, it has become easy to tamper with images. The manipulation of digital images in fields such as courts of law and medical imaging creates serious problems nowadays. Copy-move forgery is one of the most common types of forgery, in which some part of the image is copied and pasted onto another part of the same image to cover an important scene. In this paper, a copy-move forgery detection method based on the Fourier transform is proposed. Firstly, the image is divided into blocks of the same size and the Fourier transform is performed on each block. Similarity of the Fourier transforms between different blocks provides an indication of the copy-move operation. The experimental results prove that the proposed method runs in reasonable time and works well for grayscale and colour images. The computational complexity is reduced by using the Fourier transform in this method.
Keywords: Copy-move forgery, digital forensics, image forgery.
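A hedged sketch of the block-FFT matching idea follows: divide the image into fixed-size blocks, describe each by a few FFT-magnitude coefficients, sort the descriptors, and flag near-identical neighbours as copy-move candidates. The block size, feature length, and threshold are assumptions, not the paper's settings.

```python
import numpy as np

def copy_move_candidates(img, block=16, n_feat=16, tol=1e-3):
    """Return pairs of block positions whose FFT-magnitude features match."""
    h, w = img.shape
    feats, positions = [], []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            patch = img[y:y + block, x:x + block]
            mag = np.abs(np.fft.fft2(patch)).ravel()[:n_feat]   # low-order coefficients
            norm = np.linalg.norm(mag)
            feats.append(mag / norm if norm else mag)
            positions.append((y, x))
    feats = np.array(feats)
    order = np.lexsort(feats.T[::-1])            # lexicographic sort of features
    pairs = []
    for a, b in zip(order[:-1], order[1:]):      # near-duplicates end up adjacent
        if np.linalg.norm(feats[a] - feats[b]) < tol:
            pairs.append((positions[a], positions[b]))
    return pairs

img = np.random.rand(128, 128)
img[64:96, 64:96] = img[0:32, 0:32]              # simulate a copy-move forgery
print(copy_move_candidates(img))
```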
325 Software Evolution Based Sequence Diagrams Merging
Authors: Zine-Eddine Bouras, Abdelouaheb Talai
Abstract:
The need to merge software artifacts seems inherent to modern software development. Distributing development over several teams and breaking tasks into smaller, more manageable pieces are effective means of dealing with this kind of complexity. In each case, the separately developed artifacts need to be assembled as efficiently as possible into a consistent whole in which the parts still function as described. In addition, the earlier changes are introduced into the life cycle, the easier their management by designers. Interaction-based specifications such as UML sequence diagrams have been found effective in this regard. As a result, sequence diagrams can be used not only for capturing system behaviors but also for merging changes in order to create a new version. The objective of this paper is to suggest a new approach to the problem of software merging at the level of sequence diagrams by using the concept of dependence analysis, which captures, formally, all mappings and differences between elements of sequence diagrams and serves as a key concept in creating a new version of a sequence diagram.
Keywords: System behaviors, sequence diagram merging, dependence analysis, sequence diagram slicing.
324 Outer-Brace Stress Concentration Factors of Offshore Two-Planar Tubular DKT-Joints
Authors: Mohammad Ali Lotfollahi-Yaghin, Hamid Ahmadi
Abstract:
In the present paper, a set of parametric FE stress analyses is carried out for two-planar welded tubular DKT-joints under two different axial load cases. The analysis results are used to present general remarks on the effect of geometrical parameters on the stress concentration factors (SCFs) at the inner saddle, outer saddle, toe, and heel positions on the main (outer) brace. Then a new set of SCF parametric equations is developed through nonlinear regression analysis for the fatigue design of two-planar DKT-joints. An assessment study of these equations is conducted against the experimental data, and the satisfaction of the criteria regarding the acceptance of parametric equations is checked. Significant effort has been devoted by researchers to the study of SCFs in various uniplanar tubular connections. Nevertheless, for multi-planar joints, which cover the majority of practical applications, very few investigations have been reported due to the complexity and high cost involved.
Keywords: Offshore jacket structure, parametric equation, stress concentration factor (SCF), two-planar tubular KT-joint.
323 Mathematical Approach for Large Deformation Analysis of the Stiffened Coupled Shear Walls
Authors: M. J. Fadaee, H. Saffari, H. Khosravi
Abstract:
Shear walls are used in most tall buildings to carry lateral loads. When openings for doors or windows must be provided in the shear walls, a special type of shear wall called a "coupled shear wall" is used, which in some cases is stiffened by specific beams and is then called a "stiffened coupled shear wall". In this paper, a mathematical method for the geometrically nonlinear analysis of stiffened coupled shear walls is presented. Then, a suitable formulation for determining the critical load of stiffened coupled shear walls under gravity force is proposed. The governing differential equations for the equilibrium and deformation of the stiffened coupled shear walls are obtained by setting up the equilibrium equations and the moment-curvature relationships for each wall. Because of the complexity of the differential equation, the energy method is adopted for the approximate solution of the equations.
Keywords: Buckling load, differential equation, energy method, geometrically nonlinear analysis, mathematical method, stiffened coupled shear walls.
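For orientation, the moment-curvature relationship and the energy-method estimate of a critical gravity load can be stated in their generic textbook form as below; this is an illustrative statement of the ingredients the abstract names, not the authors' coupled-wall formulation.

```latex
% Generic moment-curvature relation and Rayleigh-quotient (energy method)
% estimate of the critical gravity load for an assumed deflected shape y(x)
% of a wall of height H (textbook form, illustrative only).
\[
M(x) = EI\,\kappa(x) = -EI\,\frac{d^{2}y}{dx^{2}}, \qquad
P_{cr} \approx \frac{\displaystyle\int_{0}^{H} EI\,\big(y''(x)\big)^{2}\,dx}
                    {\displaystyle\int_{0}^{H} \big(y'(x)\big)^{2}\,dx}.
\]
```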
322 Physical Verification Flow on Multiple Foundries
Authors: R. Abdul Wahab, R. Mohd Fuad Tengku Aziz, N. Othman, S. Saleh, N. Razali, M. Al Baqir Zinal Abidin, M. Hanif Md Nasir
Abstract:
This paper discusses how we optimize the physical verification flow in our IC Design Department, which handles rule decks from multiple foundries. Our ultimate goal is to achieve a faster time to tape-out and avoid schedule delays. Currently, physical verification runtimes and memory usage have drastically increased with the growing number of design rules, design complexity, and the size of the chips to be verified. To manage design violations, we use a number of solutions to reduce the number of violations that physical verification engineers need to check. The most important functions in physical verification are DRC (design rule check), LVS (layout vs. schematic), and XRC (extraction). Since we tape out designs to multiple foundries, we need a flow that improves the overall turnaround time and ease of use of the physical verification process. The demand for fast turnaround time is even more critical since physical design is the last stage before sending the layout to the foundries.
Keywords: Physical verification, DRC, LVS, XRC, flow, foundry, runset.
321 Simulation of Agri-Food Supply Chains
Authors: Sherine Beshara, Khaled S. El-Kilany, Noha M. Galal
Abstract:
Supply chain management has become more challenging with the emerging trends of globalization and sustainability. Lately, research related to perishable product supply chains, in particular agricultural food products, has emerged. This is attributed to the additional complexity of managing this type of supply chain, given the recently increased concern for public health, food quality, food safety, demand and price variability, and the limited lifetime of these products. Inventory management for agri-food supply chains is of vital importance due to product perishability and customers' striving for quality. This paper concentrates on developing a simulation model of a real-life case study of a two-echelon production-distribution system for agri-food products. The objective is to improve a set of performance measures by developing a simulation model that helps in evaluating and analysing the performance of these supply chains. Simulation results showed that it can help in improving overall system performance.
Keywords: Agri-food supply chains, inventory model, modelling and simulation, supply chain.
320 Phytoadaptation in Desert Soil Prediction Using Fuzzy Logic Modeling
Authors: S. Bouharati, F. Allag, M. Belmahdi, M. Bounechada
Abstract:
In the context of forecasting the ecological effects of desertification, the purpose of this study is to develop a predictive model of the growth and adaptation of species under arid environmental and bioclimatic conditions. The impact of climate change and desertification is the result of the combined effects of the magnitude and frequency of these phenomena. Because the data involved in the phytopathogenic process and bacterial growth in arid soil occur in an uncertain environment owing to their complexity, a suitable methodology for the analysis of these variables is necessary, and the basic principles of fuzzy logic are perfectly suited to this process. As input variables, we consider the physical parameters, soil type, bacterial nature, and plant species concerned. The output variable is the adaptability of the species, expressed by its growth rate or extinction. In conclusion, we present possible strategies for adaptation, with or without shifting plantation areas, and the nature of adequate vegetation.
Keywords: Climate changes, dry soil, Phytopathogenicity, Predictive model, Fuzzy logic.
319 The Estimation of Human Vital Signs Complexity
Authors: L. Bikulciene, E. Venskaityte, G. Jarusevicius
Abstract:
Nonstationary and nonlinear signals generated by living complex systems defy traditional mechanistic approaches, which are based on homeostasis. Our previous studies have shown that evaluating the interactions of physiological signals with special analysis methods is suitable for the observation of physiological processes. We demonstrate the possibility of using a deep physiological model, based on the interpretation of changes in the human body's functional states, combined with an analytical method based on matrix theory for the analysis of physiological signals, applied to high-risk cardiac patients. It is shown that the evaluation of cardiac signal interactions reveals functional changes, peculiar to each individual, at the onset of the hemodynamic restoration procedure. Therefore, we suggest that the assessment of alterations in the functional state of the body after patients undergo surgery can be complemented by data obtained from the suggested approach of evaluating the interactions of functional variables.
Keywords: Cardiac diseases, Complex systems theory, ECG analysis, matrix analysis.
318 Estimation of Skew Angle in Binary Document Images Using Hough Transform
Authors: Nandini N., Srikanta Murthy K., G. Hemantha Kumar
Abstract:
This paper presents two novel techniques for skew estimation of binary document images. These algorithms are based on connected component analysis and the Hough transform, and both focus on reducing the amount of input data provided to the Hough transform. In the first method, referred to as the word centroid approach, the centroids of selected words are used for skew detection. In the second method, referred to as the dilate-and-thin approach, the selected characters are blocked and dilated to obtain word blocks, and thinning is then applied. The final image fed to the Hough transform contains the thinned coordinates of the word blocks in the image. The methods have been successful in reducing the computational complexity of Hough transform based skew estimation algorithms. Promising experimental results are also provided to prove the effectiveness of the proposed methods.
Keywords: Dilation, document processing, Hough transform, optical character recognition, skew estimation, thinning.
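A hedged sketch of the word-centroid idea follows: only the component centroids, not every foreground pixel, are voted through a Hough-style angle search. Component labelling via SciPy, the angle range, and the peakiness score are assumptions, not the authors' exact procedure.

```python
import numpy as np
from scipy import ndimage

def estimate_skew(binary_img, angles=np.deg2rad(np.arange(-15, 15.1, 0.1))):
    """Estimate document skew (degrees) from component centroids."""
    labels, n = ndimage.label(binary_img)        # connected components (words/characters)
    cents = np.array(ndimage.center_of_mass(binary_img, labels, range(1, n + 1)))
    ys, xs = cents[:, 0], cents[:, 1]
    best_angle, best_score = 0.0, -np.inf
    for theta in angles:
        # Hough-style accumulation: project centroids onto the normal of a
        # candidate text-line direction; aligned rows give a peaky histogram.
        rho = ys * np.cos(theta) - xs * np.sin(theta)
        hist, _ = np.histogram(rho, bins=binary_img.shape[0])
        score = np.sum(hist ** 2)
        if score > best_score:
            best_angle, best_score = theta, score
    return np.rad2deg(best_angle)
```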
317 Curriculum Based Measurement and Precision Teaching in Writing Empowerment Enhancement: Results from an Italian Learning Center
Authors: I. Pelizzoni, C. Cavallini, I. Salvaderi, F. Cavallini
Abstract:
We present the improvement in writing skills obtained by 94 participants (aged between six and ten years) with special educational needs through a writing enhancement program based on fluency principles. The study was planned and conducted with a single-subject experimental design for each of the participants, in order to confirm the results reported in the literature. These results were obtained using the precision teaching (PT) methodology to increase the number of written graphemes per minute, measured in pre- and post-tests by curriculum-based measurement (CBM). Results indicated an increase in the number of written graphemes for all participants. The average overall duration of the intervention was 144 minutes over five months of treatment. These considerations are analyzed taking into account the complexity of implementing measurement systems in real operational contexts (an Italian learning center) and important aspects of the replicability and cost-effectiveness of such interventions.
Keywords: Precision teaching, writing skills, CBM, Italian Learning Center.
316 A Block Cipher for Resource-Constrained IoT Devices
Authors: Muhammad Rana, Quazi Mamun, Rafiqul Islam
Abstract:
In the Internet of Things (IoT), many devices are connected and accumulate a sheer volume of data. These Internet-driven raw data need to be transferred securely to the end-users via dependable networks. Consequently, the challenges of IoT security in various IoT domains are paramount. Cryptography is applied to secure the networks for authentication, confidentiality, data integrity and access control. However, due to the resource-constrained nature of IoT devices, conventional ciphers may not be suitable for all IoT networks. This paper designs a robust and effective lightweight cipher to secure the IoT environment and meet the resource-constrained nature of IoT devices. We propose a symmetric, block-cipher based lightweight cryptographic algorithm. The proposed algorithm increases the complexity of the block cipher while maintaining the lowest computational requirements possible. It efficiently constructs the key register updating technique, reduces the number of encryption rounds, and adds a layer between the encryption and decryption processes.
Keywords: Internet of Things, IoT, cryptography, block cipher, s-box, key management, IoT security.
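To illustrate the s-box/permutation/key-mixing structure that such lightweight block ciphers build on, a toy substitution-permutation round is sketched below. The 4-bit s-box (borrowed from the well-known PRESENT design), the bit permutation, and the fixed key list are illustrative assumptions and are not the cipher proposed in the paper.

```python
# A toy 16-bit substitution-permutation round -- illustrative only, NOT the
# proposed cipher (its s-box, key schedule and round structure differ).
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]
PERM = [0, 4, 8, 12, 1, 5, 9, 13, 2, 6, 10, 14, 3, 7, 11, 15]   # bit permutation

def round_fn(state, round_key):
    state ^= round_key                                           # key mixing
    nibbles = [(state >> (4 * i)) & 0xF for i in range(4)]
    state = sum(SBOX[n] << (4 * i) for i, n in enumerate(nibbles))   # substitution
    bits = [(state >> i) & 1 for i in range(16)]
    return sum(bits[PERM[i]] << i for i in range(16))            # permutation

def encrypt(block, keys):
    for k in keys:
        block = round_fn(block, k)
    return block

ciphertext = encrypt(0xBEEF, keys=[0x1A2B, 0x3C4D, 0x5E6F])
```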
315 A New Quantile Based Fuzzy Time Series Forecasting Model
Authors: Tahseen A. Jilani, Aqil S. Burney, C. Ardil
Abstract:
Time series models have been used to make predictions of academic enrollments, weather, road accident casualties, stock prices, etc. Based on the concepts of quantile regression models, we have developed a simple time-variant quantile based fuzzy time series forecasting method. The proposed method bases the forecast on a prediction of the future trend of the data. In place of the actual quantiles of the data at each point, we have converted the statistical concept into a fuzzy concept by using fuzzy quantiles derived from an ensemble of fuzzy membership functions. We have given a fuzzy metric to use the trend forecast and calculate the future value. The proposed model is applied to TAIFEX forecasting. It is shown that the proposed method performs best among the compared models with respect to model complexity and forecasting accuracy.
Keywords: Quantile regression, fuzzy time series, fuzzy logical relationship groups, heuristic trend prediction.
314 Unscented Transformation for Estimating the Lyapunov Exponents of Chaotic Time Series Corrupted by Random Noise
Authors: K. Kamalanand, P. Mannar Jawahar
Abstract:
Many systems in the natural world exhibit chaos or nonlinear behavior, the complexity of which is so great that they appear to be random. Identification of chaos in experimental data is essential for characterizing the system and for analyzing the predictability of the data under analysis. The Lyapunov exponents provide a quantitative measure of the sensitivity to initial conditions and are the most useful dynamical diagnostic for chaotic systems. However, it is difficult to accurately estimate the Lyapunov exponents of chaotic signals that are corrupted by random noise. In this work, a method for estimating Lyapunov exponents from noisy time series using the unscented transformation is proposed. The proposed methodology was validated using time series obtained from known chaotic maps. In this paper, the objective of the work, the proposed methodology and the validation results are discussed in detail.
Keywords: Lyapunov exponents, unscented transformation, chaos theory, neural networks.
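As a noise-free baseline for such validation on known maps (this is not the unscented-transform estimator of the paper), the largest Lyapunov exponent of the logistic map can be computed directly from the average log-derivative along an orbit, as in the sketch below.

```python
import numpy as np

def lyapunov_logistic(r=4.0, x0=0.4, n=100_000, burn_in=1_000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x).

    lambda = (1/n) * sum of log|f'(x_i)| over the orbit, with f'(x) = r*(1 - 2x).
    """
    x, total = x0, 0.0
    for i in range(n + burn_in):
        x = r * x * (1.0 - x)
        if i >= burn_in:
            total += np.log(abs(r * (1.0 - 2.0 * x)))
    return total / n

print(lyapunov_logistic())   # approx ln(2) ~ 0.693 for r = 4, confirming chaos
```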
313 Approximate Bounded Knowledge Extraction Using Type-I Fuzzy Logic
Authors: Syed Muhammad Aqil Burney, Tahseen Ahmed Jilani, C. Ardil
Abstract:
Using a neural network, we try to model the unknown function f for given input-output data pairs. The connection strength of each neuron is updated through learning. Repeated simulations of a crisp neural network produce different values of the weight factors, which are directly affected by changes to different parameters. We propose the idea that, for each neuron in the network, we can obtain quasi-fuzzy weight sets (QFWS) using repeated simulation of the crisp neural network. Such fuzzy weight functions may be applied where we have multivariate crisp input that needs to be adjusted after iterative learning, as in claim amount distribution analysis. As real data are subject to noise and uncertainty, QFWS may be helpful in the simplification of such complex problems. Secondly, these QFWS provide a good initial solution for training fuzzy neural networks with reduced computational complexity.
Keywords: Crisp neural networks, fuzzy systems, extraction of logical rules, quasi-fuzzy numbers.
312 A Framework to Support Reuse in Object-Oriented Software Development
Authors: Fathi Taibi
Abstract:
Reusability is a desired quality attribute in software products. Generally, it can be achieved by adopting development methods that promote it and by achieving the software qualities that have been linked with high reusability proneness. With the exponential growth in mobile application development, software reuse has become an integral part of a substantial number of projects. Similarly, software reuse has become widely practiced in start-up companies. However, this has led to new emerging problems: firstly, the reused code does not meet the required quality, and secondly, the reuse intentions are dubious. This work aims to propose a framework to support reuse in Object-Oriented (OO) software development. The framework comprises a process that uses a proposed reusability assessment metric and a formal foundation to specify the elements of the reused code and the relationships between them. The framework is empirically evaluated using a wide range of open-source projects and mobile applications. The results are analyzed to help understand the reusability proneness of OO software and the possible means to improve it.
Keywords: Software reusability, software metrics, object-oriented software, modularity, low complexity, understandability.