Search results for: Complexity index


1067 Evaluating Performance of an Anomaly Detection Module with Artificial Neural Network Implementation

Authors: Edward Guillén, Jhordany Rodriguez, Rafael Páez

Abstract:

Anomaly detection techniques focus on two main components: data extraction and selection, and the analysis performed over the obtained data. The goal of this paper is to analyze the influence that each of these components has on system performance by evaluating detection over network scenarios with different setups. The independent variables are the number of system inputs, the way the inputs are encoded, and the complexity of the analysis techniques. For the analysis, several artificial neural network approaches are implemented with different numbers of layers. The obtained results show the influence that each of these variables has on system performance.

Keywords: Network Intrusion Detection, Machine learning, Artificial Neural Network.

1066 Public Key Cryptosystem based on Number Theoretic Transforms

Authors: C. Porkodi, R. Arumuganathan

Abstract:

In this paper a Public Key Cryptosystem is proposed using the number theoretic transforms (NTT) over a ring of integers modulo a composite number. The key agreement is similar to the ElGamal public key algorithm. The security of the system is based on the solution of multivariate linear congruence equations and the discrete logarithm problem. In the proposed cryptosystem only a fixed number of multiplications is carried out (constant complexity), and hence encryption and decryption can be done easily. At the same time, it is very difficult to attack the cryptosystem, since the cipher text is a sequence of interrelated integers. The system also provides authentication. The proposed algorithm is illustrated with a numerical example using Mathematica version 5.0.

Keywords: Cryptography, decryption, discrete logarithm problem, encryption, Integer Factorization problem, Key agreement, Number Theoretic Transform.

1065 Angles of Arrival Estimation with Unitary Partial Propagator

Authors: Youssef Khmou, Said Safi

Abstract:

In this paper, we investigate the effect of a real-valued transformation of the spectral matrix of the received data on the Angles of Arrival estimation problem. The unitary transformation of the Partial Propagator (UPP) for narrowband sources is proposed and applied to a Uniform Linear Array (ULA).

Monte Carlo simulations demonstrate the performance of the UPP spectrum compared with the Forward Backward Partial Propagator (FBPP) and the Unitary Propagator (UP). The results show that when some of the sources are fully correlated and closer than the Rayleigh angular resolution limit of the broadside array, the UPP method outperforms the FBPP in both spatial resolution and complexity.

Keywords: DOA, Uniform Linear Array, Narrowband, Propagator, Real valued transformation, Subspace, Unitary Operator.

1064 Restoration of Noisy Document Images with an Efficient Bi-Level Adaptive Thresholding

Authors: Abhijit Mitra

Abstract:

An effective approach for extracting document images from a noisy background is introduced. The entire scheme is divided into three sub-techniques: the initial preprocessing operations for noise cluster tightening, the introduction of a new thresholding method that maximizes the ratio of the standard deviation of the combined effect on the image to the sum of the weighted classes, and finally the image restoration phase, in which the image is binarized using the proposed optimum threshold level. The proposed method is found to be efficient compared to the existing schemes in terms of computational complexity as well as speed, with better noise rejection.
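
One way to read the thresholding criterion above is to search for the gray level that maximizes the ratio of the overall standard deviation to the weighted sum of the two class standard deviations. The sketch below implements that reading on a synthetic page; it is an interpretation offered for illustration, not the authors' exact derivation.

```python
import numpy as np

def bilevel_threshold(gray):
    """Pick the level that maximises std(whole image) / weighted sum of the
    class standard deviations (one reading of the abstract's criterion)."""
    vals = gray.ravel().astype(float)
    sigma_total = vals.std()
    best_t, best_ratio = None, -np.inf
    for t in range(int(vals.min()) + 1, int(vals.max()) + 1):
        fg, bg = vals[vals >= t], vals[vals < t]
        if fg.size == 0 or bg.size == 0:
            continue
        denom = (fg.size * fg.std() + bg.size * bg.std()) / vals.size
        if denom > 0 and sigma_total / denom > best_ratio:
            best_ratio, best_t = sigma_total / denom, t
    return best_t

# Toy "document": dark text (about 40) on a bright noisy page (about 200).
rng = np.random.default_rng(0)
page = rng.normal(200, 10, (64, 64))
page[20:30, 10:50] = rng.normal(40, 10, (10, 40))
t = bilevel_threshold(page.clip(0, 255).astype(int))
binary = page >= t          # restoration step: binarise with the chosen level
```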

Keywords: Document image extraction, Preprocessing, Ratio of standard deviations, Bi-level adaptive thresholding.

1063 Defining a Framework for Holistic Life Cycle Assessment of Building Components

Authors: Naomi Grigoryan, Alexandros Loutsioli Daskalakis, Anna Elisse Uy, Yihe Huang, Aude Laurent (Webanck)

Abstract:

In response to the building and construction sectors accounting for a third of all energy demand and emissions, the European Union has placed new laws and regulations in the construction sector that emphasize material circularity, energy efficiency, biodiversity, and social impact. Existing design tools assess sustainability in early-stage design for products or buildings; however, there is no standardized methodology for measuring the circularity performance of building components. Existing assessment methods for building components focus primarily on carbon footprint but lack the comprehensive analysis required to design for circularity. The research conducted in this paper covers the parameters needed to assess sustainability in the design process of architectural products such as doors, windows, and facades. It maps a framework for a tool that assists designers with real-time sustainability metrics. Considering the life cycle of building components such as façades, windows, and doors involves the life cycle stages applied to product design and many of the methods used in the life cycle analysis of buildings. The current industry standards of sustainability assessment for metal building components follow cradle-to-grave life cycle assessment (LCA), track Global Warming Potential (GWP), and document the parameters used for an Environmental Product Declaration (EPD). Expanding on the Material Circularity Indicator (MCI) with additional indicators such as the Water Circularity Index (WCI), the Energy Circularity Index (ECI), the Social Circularity Index (SCI), Life Cycle Economic Value (EV), and calculating biodiversity risk and uncertainty, the assessment methodology of an architectural product's impact can be targeted more specifically based on product requirements, performance, and lifespan. Broadening the scope of LCA calculation for products to incorporate aspects of building design allows product designers to account for the disassembly of architectural components. For example, the MCI for architectural products such as windows and facades is typically low due to the impact of glass, as 70% of glass ends up in landfills due to damage in the disassembly process. The low MCI can be combatted by expanding beyond cradle-to-grave assessment and focusing the design process on disassembly, recycling, and repurposing with the help of real-time assessment tools. Design for Disassembly and Urban Mining have been integrated within the construction field on small scales as project-based exercises, not addressing the entire supply chain of architectural products. By adopting more comprehensive sustainability metrics and incorporating uncertainty calculations, building components can be assessed more accurately with decarbonization and disassembly in mind, addressing the large-scale commercial markets within construction, some of the most significant contributors to climate change.

Keywords: Architectural products, early-stage design, life cycle assessment, material circularity indicator.

1062 Health Assessment of Electronic Products using Mahalanobis Distance and Projection Pursuit Analysis

Authors: Sachin Kumar, Vasilis Sotiris, Michael Pecht

Abstract:

With increasing complexity in electronic systems, there is a need for system-level anomaly detection and fault isolation. In this paper, anomaly detection based on vector similarity to a training set is carried out through two approaches: one that preserves the original information, Mahalanobis Distance (MD), and one that compresses the data into its principal components, Projection Pursuit Analysis. These methods have been used to detect deviations in system performance from normal operation and for critical parameter isolation in multivariate environments. The study evaluates the detection capability of each approach on a set of test data with known faults against a baseline set of data representative of "healthy" systems.
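
As a rough illustration of the MD branch, the sketch below scores test samples by their Mahalanobis distance from a "healthy" training set and flags those beyond a fixed cutoff. The four-parameter data and the cutoff of 3.0 are assumptions for the example, not the systems studied in the paper.

```python
import numpy as np

def mahalanobis_scores(train, test):
    """Distance of each test sample from the training ("healthy") distribution;
    large values indicate anomalous system behaviour."""
    mu = train.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(train, rowvar=False))
    diff = test - mu
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))

rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, (500, 4))          # baseline parameter vectors
faulty = rng.normal(3.0, 1.0, (5, 4))             # samples with a known fault
print(mahalanobis_scores(healthy, faulty) > 3.0)  # simple fixed-threshold flag
```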

Keywords: Mahalanobis distance, Principal components, Projection pursuit, Health assessment, Anomaly.

1061 Two Different Solutions for Gigabit Ethernet Transmission over POF

Authors: Stefano Straullu, Silvio Abrate, Antonino Nespola, Paolo Savio, Roberto Gaudino

Abstract:

Two completely different approaches to Gigabit Ethernet compliant stream transmission over 50 m of 1 mm PMMA SI-POF have been experimentally demonstrated and are compared in this paper. The first solution is based on transmission with a commercial RC-LED and a careful optimization of the physical layer architecture, realized during the POF-PLUS EU Project. The second solution exploits the performance of an edge-emitting laser at the transmitter side in order to avoid any electrical equalization at the receiver side.

Keywords: Gigabit Ethernet, Home Networking, Step-Index Polymer Optical Fiber (SI-POF)

1060 Some Characteristics of Systolic Arrays

Authors: Halil Snopce, Ilir Spahiu

Abstract:

In this paper, a possible optimization of some linear algebra problems is investigated; these problems can be solved by parallel processing using special arrays called systolic arrays. Some special types of transformations are used for designing these arrays, and their characteristics are shown. The main focus is on discussing the advantages of these arrays in the parallel computation of matrix products, with a special approach to the design of a systolic array for matrix multiplication. Multiplication of large matrices requires a lot of computational time, and its complexity is O(n³). Many algorithms (both sequential and parallel) have been developed with the purpose of minimizing the calculation time, and systolic arrays are well suited for this purpose. In this paper we show that using an appropriate transformation leads to more optimal arrays for carrying out calculations of this type.
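
The O(n³) multiply-accumulate structure maps naturally onto a 2D systolic array. The toy simulation below shows the classic schedule in which the product a[i,k]*b[k,j] reaches processing element PE(i, j) at time t = i + j + k, so the whole product finishes in 3n - 2 array steps; it is a sketch of the standard design, not the authors' transformation-based construction.

```python
import numpy as np

def systolic_matmul(A, B):
    """Simulate the schedule of an n x n systolic array computing C = A @ B.

    With the usual input skewing, the product A[i, k] * B[k, j] arrives at
    PE(i, j) at time t = i + j + k, so the array finishes in 3n - 2 steps
    instead of the n**3 sequential multiply-accumulates.
    """
    n = A.shape[0]
    C = np.zeros((n, n))
    for t in range(3 * n - 2):            # global clock of the array
        for i in range(n):
            for j in range(n):
                k = t - i - j             # operand pair arriving at PE(i, j)
                if 0 <= k < n:
                    C[i, j] += A[i, k] * B[k, j]
    return C

A = np.arange(9.0).reshape(3, 3)
B = np.eye(3)
assert np.allclose(systolic_matmul(A, B), A @ B)
```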

Keywords: Data dependences, matrix multiplication, systolic array, transformation matrix.

1059 An Investigation on the Variation of Software Development Productivity

Authors: Zhizhong Jiang, Peter Naudé, Craig Comstock

Abstract:

The productivity of software development is one of the major concerns for project managers. Given the increasing complexity of the software being developed and the concomitant rise in typical project size, productivity has not consistently improved. By analyzing the latest release of the ISBSG data repository, with 4106 projects, we report on the factors found to significantly influence productivity and present an original model for the estimation of productivity during project design. We further illustrate that software development productivity experienced irregular variations between the years 1995 and 2005. Considering the factors significant to productivity, we found that its variations are primarily caused by variations in the average team size for development and by the unbalanced use of the less productive development language 3GL.

Keywords: Development Platform, Function Point, Language, Productivity, Software Engineering, Team Size.

1058 The Simulation and Experimental Investigation to Study the Strain Distribution Pattern during the Closed Die Forging Process

Authors: D. B. Gohil

Abstract:

Closed die forging is a very complex process, and measurement of the actual forces for real material is difficult and time consuming. Hence, the modelling technique takes advantage of carrying out the experimentation with a proper model material which needs smaller forces and relatively low temperature. The results of experiments on the model material may then be correlated with the actual material by using the theory of similarity. There are several methods available to resolve the complexity involved in the closed die forging process. The Finite Element Method (FEM) and Finite Difference Method (FDM) are relatively difficult as compared to the slab method. The slab method is very popular and widely used by people working on the shop floor because it is relatively easy to apply and reasonably accurate for most of the common forging load requirement computations.

Keywords: Experimentation, forging, process modeling, strain distribution.

1057 Kirchhoff’s Depth Migration over Heterogeneous Velocity Models with Ray Tracing Modeling Approach

Authors: Alok Kumar Routa, Priya Ranjan Mohanty

Abstract:

Complex seismic signatures, which are difficult to interpret, are generated due to the complexity of the subsurface. In the present study, an attempt has been made to model the complex subsurface using the ray tracing modeling technique. In addition, for the imaging of these geological features, Kirchhoff's prestack depth migration is applied over the synthetic common shot gather dataset. It is found that Kirchhoff's migration technique, combined with the ray tracing modeling concept, has the flexibility to image various complex geologies and gives satisfactory results, with proper delineation of the reflectors at their respective true depth positions. The entire work has been carried out in the MATLAB environment.

Keywords: Kirchhoff’s migration, Prestack depth migration, Ray tracing modeling, Velocity model.

1056 Calculus Logarithmic Function for Image Encryption

Authors: Adil AL-Rammahi

Abstract:

When we want to make data secure against various attacks and to ensure data integrity, we must encrypt the data before it is transmitted or stored. This paper introduces a new effective and lossless image encryption algorithm using a natural logarithmic function. The new algorithm encrypts an image through a three-stage process. In the first stage, a reference natural logarithmic function is generated as the foundation for the encrypted image. The image numeral matrix is then decomposed into five integer numbers, and the numbers' positions are transformed into matrices. The method is useful for efficiently encrypting a variety of digital images, such as binary images, gray images, and RGB images, without any quality loss. The principles of the presented scheme could be applied to provide complexity, and thus security, for a variety of data systems such as images and others.

Keywords: Linear Systems, Image Encryption, Calculus.

1055 Optimized Detection in Multi-Antenna System using Particle Swarm Algorithm

Authors: A. A. Khan, M. Naeem, S. Bashir, S. I. Shah

Abstract:

In this paper we propose a Particle Swarm heuristic-optimized Multi-Antenna (MA) detection system. Efficient MA system detection is performed using a robust stochastic evolutionary computation algorithm based on the movement and intelligence of swarms. This iterative particle swarm optimized (PSO) detector significantly reduces the computational complexity of the conventional Maximum Likelihood (ML) detection technique. The simulation results achieved with the proposed MA-PSO detection algorithm show near-optimal performance when compared with the ML-MA receiver. The performance of the proposed detector is convincingly better for higher order modulation schemes and large numbers of antennas, where the conventional ML detector becomes impractical.
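
For illustration, the sketch below runs a generic particle swarm search over candidate BPSK symbol vectors to minimise the ML metric ||y - Hx||². The BPSK mapping, swarm coefficients and iteration count are assumptions made for the sketch, not the detector parameters used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_mimo_detect(H, y, n_particles=30, n_iter=60):
    """Generic PSO search for the BPSK vector x minimising ||y - H x||^2
    (an illustrative sketch, not the paper's detector)."""
    n_tx = H.shape[1]

    def cost(p):
        x = np.where(p >= 0, 1.0, -1.0)          # map position to BPSK symbols
        return np.linalg.norm(y - H @ x) ** 2

    pos = rng.uniform(-1, 1, (n_particles, n_tx))
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        costs = np.array([cost(p) for p in pos])
        better = costs < pbest_cost
        pbest[better], pbest_cost[better] = pos[better], costs[better]
        gbest = pbest[pbest_cost.argmin()].copy()
    return np.where(gbest >= 0, 1.0, -1.0)

# Toy 4x4 channel: the swarm usually recovers the transmitted BPSK vector.
H = rng.standard_normal((4, 4))
x_true = rng.choice([-1.0, 1.0], 4)
y = H @ x_true + 0.05 * rng.standard_normal(4)
print(pso_mimo_detect(H, y), x_true)
```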

Keywords: Multi Antenna (MA), Multi-input Multi-output (MIMO), Particle Swarm Optimization (PSO), ML detection.

1054 Performance Analysis of IDMA Scheme Using Quasi-Cyclic Low Density Parity Check Codes

Authors: Anurag Saxena, Alkesh Agrawal, Dinesh Kumar

Abstract:

The fourth generation (4G) of mobile communication systems was developed to accommodate the required quality of service and data rates. This project focuses on a multiple access technique proposed for 4G communication systems and attempts to demonstrate IDMA (Interleave Division Multiple Access) technology. The basic principle of IDMA is that the interleaver is different for each user, whereas CDMA employs different signatures. IDMA inherits many advantages of CDMA, such as robustness against fading, easy cell planning and dynamic channel sharing, while increasing the spectral efficiency and reducing the receiver complexity. In this work, the performance of IDMA is analyzed using a QC-LDPC coding scheme, compared with LDPC coding, and finally the BER is calculated and plotted in MATLAB.
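
The per-user interleaver idea can be sketched in a few lines: each user applies its own random chip permutation after a low-rate repetition code, and the receiver inverts that permutation. The repetition factor and block length below are arbitrary assumptions, not the system parameters simulated in this work.

```python
import numpy as np

rng = np.random.default_rng(1)

def user_interleaver(block_len):
    """IDMA separates users by giving each its own chip interleaver (a random
    permutation here), where CDMA would assign a spreading signature instead."""
    return rng.permutation(block_len)

def idma_transmit(bits, interleaver, rep=4):
    chips = np.repeat(2 * bits - 1, rep)     # simple repetition "coding"
    return chips[interleaver]                # user-specific chip interleaving

def idma_deinterleave(chips, interleaver):
    out = np.empty_like(chips)
    out[interleaver] = chips                 # invert the permutation
    return out

bits = rng.integers(0, 2, 8)
pi = user_interleaver(8 * 4)
tx = idma_transmit(bits, pi)
assert np.array_equal(idma_deinterleave(tx, pi), np.repeat(2 * bits - 1, 4))
```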

Keywords: 4G, QC-LDPC, CDMA, IDMA.

1053 Verifying the Supremacy of Volume Modulated Arc Therapy Over Intensity Modulated Radiation Therapy: Pelvis Malignancies’ Perspective

Authors: M. Umar Farooq, T. Ahmad Afridi, M. Zia-Ul-Islam Arsalan, U. Hussain Haider, S. Ullah

Abstract:

Cancer, a leading fatal disease worldwide, can be treated with various techniques, including radiation therapy, which involves the use of ionizing radiation to target cancer cells. On the basis of source placement, radiation therapy is of two types, i.e., Brachytherapy and External Beam Radiotherapy (EBRT). EBRT has evolved from 2-D conventional therapy to 3-D Conformal Radiotherapy (3D-CRT) and then Intensity-Modulated Radiotherapy (IMRT). IMRT improves dose conformity and sparing of organs at risk. Volumetric Modulated Arc Therapy (VMAT) is a modern technique that delivers treatment in arcs with rotation of the gantry. In this report, a dosimetric comparison was performed between IMRT and VMAT. The study was conducted in the Radiotherapy Department of the Institute of Nuclear Medicine and Oncology Lahore (INMOL). Ten patients with prostate carcinoma were selected to compare the methods. Simulation of these patients was done with the help of a CT simulator. All target volumes and organs were delineated by the oncologists. Then suitable fields/arcs were applied to cover the volumes effectively. This was followed by the optimization of plans for both techniques for every patient. Finally, a comparison of evaluation parameters, e.g., Conformity Index (CI), volume coverage, Homogeneity Index (HI), organ doses, and MUs (Monitor Units), was performed. We obtained better target conformity indices from VMAT (CI = 1.16) than IMRT (CI = 1.24). VMAT was better in organ sparing too. Also, VMAT required fewer MUs (733 MUs) as compared to IMRT (2149 MUs). From this study, it is concluded that VMAT is a better treatment technique than IMRT. This technique will enhance treatment efficiency as it takes less time to obtain the required results. Also, a much lower scatter dose will be delivered to the patient.
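
For reference, the two plan-quality indices named above can be computed from a dose grid and a target mask as sketched below. The RTOG-style conformity index and the (D2% - D98%)/D50% homogeneity index are common definitions assumed here for illustration; the abstract does not state which formulas were actually used.

```python
import numpy as np

def conformity_index(dose, target_mask, rx_dose):
    """RTOG-style CI: volume enclosed by the prescription isodose divided by
    the target volume (assumed definition, not necessarily the paper's)."""
    return np.count_nonzero(dose >= rx_dose) / np.count_nonzero(target_mask)

def homogeneity_index(dose, target_mask):
    """ICRU-83-style HI: (D2% - D98%) / D50% over the target voxels."""
    d = dose[target_mask]
    d2, d98, d50 = np.percentile(d, 98), np.percentile(d, 2), np.percentile(d, 50)
    return (d2 - d98) / d50

# Toy dose grid: about 60 Gy inside a spherical target, with a small spill outside.
rng = np.random.default_rng(0)
grid = np.indices((40, 40, 40))
target = ((grid - 20) ** 2).sum(axis=0) <= 8 ** 2
dose = np.where(target, rng.normal(60.0, 1.0, target.shape), 20.0)
dose[18:22, 18:22, 28:32] = 60.5          # hot region just outside the target
print(conformity_index(dose, target, 57.0), homogeneity_index(dose, target))
```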

Keywords: 2-D Conventional Radiotherapy, 3-D Conformal Radiotherapy, Intensity Modulated Radiotherapy, Prostate Carcinoma, Radiotherapy, Volumetric Modulated Arc Therapy.

1052 ISTER (Immune System - Tumor Efficiency Rate): An Important Key for Planning in Radiotherapic Facilities

Authors: O. Sotolongo-Grau, D. Rodriguez-Perez, J. A. Santos-Miranda, M. M. Desco, O. Sotolongo-Costa, J. C. Antoranz

Abstract:

The use of the oncologic index ISTER allows for more effective planning of the radiotherapy facilities in hospitals. Any change in the radiotherapy treatment due to unexpected stops may be accommodated by recalculating the doses for the new treatment duration while keeping the optimal prognosis. The results obtained from a simulation model on millions of patients allow the definition of optimal success probability algorithms.

Keywords: Mathematical model, radiation oncology, dynamical systems applications.

1051 A P-SPACE Algorithm for Groebner Bases Computation in Boolean Rings

Authors: Quoc-Nam Tran

Abstract:

The theory of Groebner Bases, which has recently been honored with the ACM Paris Kanellakis Theory and Practice Award, has become a crucial building block of computer algebra and is widely used in science, engineering, and computer science. It is well known that Groebner bases computation is EXP-SPACE in a general setting. In this paper, we give an algorithm to show that Groebner bases computation is P-SPACE in Boolean rings. We also show that, with this discovery, the Groebner bases method can theoretically be as efficient as other methods for the automated verification of hardware and software. Additionally, Groebner bases have many useful and interesting properties, including the ability to efficiently convert the bases between different orders of variables, which makes Groebner bases a promising method in automated verification.

Keywords: Algorithm, Complexity, Groebner basis, Applications of Computer Science.

1050 Equivalence Class Subset Algorithm

Authors: Jeffrey L. Duffany

Abstract:

The equivalence class subset algorithm is a powerful tool for solving a wide variety of constraint satisfaction problems and is based on the use of a decision function which has a very high but not perfect accuracy. Perfect accuracy is not required in the decision function, as even a suboptimal solution contains valuable information that can be used to help find an optimal solution. In the hardest problems, the decision function can break down, leading to a suboptimal solution in which there are more equivalence classes than necessary and which can be viewed as a mixture of good decisions and bad decisions. By choosing a subset of the decisions made in reaching a suboptimal solution, an iterative technique can lead to an optimal solution through a series of steadily improving suboptimal solutions. The goal is to reach an optimal solution as quickly as possible. Various techniques for choosing the decision subset are evaluated.

Keywords: NP-complete, complexity, algorithm.

1049 Web Log Mining by an Improved AprioriAll Algorithm

Authors: Wang Tong, He Pi-lian

Abstract:

This paper sets forth the possibility and importance of applying Data Mining to Web log mining and points out some problems in conventional search engines. It then offers an improved algorithm based on the original AprioriAll algorithm, which has been widely used in Web log mining. The new algorithm adds the property of the User ID during every step of producing the candidate set and every step of scanning the database, by which to decide whether an item in the candidate set should be put into the large set that will be used to produce the next candidate set. At the same time, in order to reduce the number of database scans, the new algorithm, by using the property of the Apriori algorithm, limits the size of the candidate set in time whenever it is produced. Test results show that the improved algorithm has lower time and space complexity, suppresses noise better, and fits the capacity of memory.
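
A minimal sketch of the User ID idea is shown below: candidate page pairs are generated from frequent single pages, and a candidate's support is counted only when both pages occur, in order, inside the same user's session. The toy sessions and the single AprioriAll-style iteration are assumptions for illustration, not the full improved algorithm.

```python
def is_subsequence(cand, seq):
    it = iter(seq)
    return all(page in it for page in cand)

def frequent_pairs(sessions, min_support):
    """One AprioriAll-style iteration over per-user Web-log sessions: length-2
    candidates come from frequent single pages, and support is only counted
    inside a single user's session, mirroring the User ID check."""
    pages = {p for s in sessions.values() for p in s}
    f1 = {p for p in pages
          if sum(p in s for s in sessions.values()) >= min_support}
    result = {}
    for a in f1:
        for b in f1:
            if a != b:
                support = sum(is_subsequence((a, b), s)
                              for s in sessions.values())
                if support >= min_support:
                    result[(a, b)] = support
    return result

sessions = {                       # hypothetical sessions keyed by User ID
    "u1": ["home", "search", "item", "cart"],
    "u2": ["home", "item", "cart"],
    "u3": ["home", "search", "item"],
}
print(frequent_pairs(sessions, min_support=2))
```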

Keywords: Candidate Sets Pruning, Data Mining, Improved Algorithm, Noise Restrain, Web Log.

1048 Extraction of Knowledge Complexity in 3G Killer Application Construction for Telecommunications National Strategy

Authors: Muhammad Suryanegara, Dendi Wijayatullah, Dadang Gunawan

Abstract:

We review a knowledge extractor model for constructing 3G Killer Applications. The success of 3G is essential for the Government, as it has become part of the Telecommunications National Strategy. The 3G wireless technologies may reach a larger area and increase the country's ICT penetration. In order to understand future customers' needs, the operators require the proper information (knowledge) lying inside. Our work approaches future customers as a complex system in which complex knowledge may expose regular behavior. The hidden information from 3G future customers is revealed by using fractal-based questionnaires. Afterward, further statistical analysis is used to match the results with the operator's strategic plan. The development of 3G applications also considers their saturation time and further improvement of the application.

Keywords: 3G Killer Applications, Knowledge, Telecommunications Strategy.

1047 Dignity and Suffering: Reading of Human Rights in Untouchable by Anand

Authors: Norah A. Elgibreen

Abstract:

Cultural stories are political. They register cultural phenomena and their relations with the world and society in terms of their existence, function, and characteristics by using different contexts. This paper provides a new way of rethinking the relationship between fiction and politics. It discusses the theme of human rights and shows the relevance between art and politics by studying civil society through a literary framework. Reasons to establish a relationship between fiction and politics include the relevant themes and universal issues shared by the two disciplines; both disciplines are sets of views and ideas formulated by the human mind to explain political or cultural phenomena. Other reasons are the complexity and depth of the author's vision and the need to explain the violations of human rights in a more active structure which can relate to emotional and social existence.

Keywords: dignity, human rights, politics and literature, Untouchable.

1046 A Novel Design Methodology for a 1.5 KW DC/DC Converter in EV and Hybrid EV Applications

Authors: Farhan Beg

Abstract:

This paper presents a method for the efficient implementation of a unidirectional or bidirectional DC/DC converter. The DC/DC converter is used essentially for energy exchange between the low voltage service battery and the high voltage battery commonly found in electric vehicle applications. In these applications, apart from cost, efficiency of design is an important characteristic. A useful way to reduce the size of electronic equipment in electric vehicles is proposed in this paper. The technique simplifies the mechanical complexity and maximizes the energy usage using the latest converter control techniques. Moreover, a bidirectional battery charger for hybrid electric vehicles is also implemented in this paper. Several simulations of the test system have been carried out in the Matlab/Simulink environment. The results exemplify the robustness of the proposed design methodology in the case of a 1.5 kW DC-DC converter.

Keywords: DC-DC converters, Electric Vehicles, Direct Current Control.

1045 Government of Ghana’s Budget: An Assessment of Its Compliance with Fundamental Budgeting Principles

Authors: Mohammed Sani Abdulai

Abstract:

Public sector budgeting, all over the world, is underpinned by some universally accepted principles of sound budget management such as budget unity, universality, annuality, and a balanced budget. These traditional principles, though fundamental, have, in recent years, been augmented by the more modern principles of budgeting within fiscal objectives, alignment with medium-term strategic plans, as well as the observance of such related concepts as transparency, openness and accessibility. In this paper, we have endeavored to shed light, from literature and practice, on the meaning and purposes of such fundamental budgeting principles. We have also assessed the extent to which the Government of Ghana’s budget complies with the four traditional principles of budget unity, universality, annuality, and a balanced budget and three of the ten modern principles of budgetary governance of the Organisation for Economic Co-operation and Development (OECD). We did so by using a qualitative method of review and analysis of existing documents and the performance assessment reports on Ghana’s Public Financial Management (PFM) measured using such frameworks as the Public Expenditure and Financial Accountability (PEFA), the Open Budget Survey (OBS) and its Index (OBI), the reports and action plans of Open Government Partnership (OGP) and the Global Initiative for Fiscal Transparency (GIFT). Other performance assessment reports that were relied on included, but were not limited to, the Joint Evaluation Report of PFM in Ghana, 2001-2010, and the Joint Evaluation of Budget Support to Ghana, 2005-2015. We have, through this paper, brought to the fore the lessons that could be learned on how those budgetary principles undergird the Government of Ghana’s budget formulation, execution, accounting, control, and oversight. These lessons include, but are not limited to, the need for both scholars and practitioners in the PFM space to be aware of the impact of those principles on public sector budgeting.

Keywords: Annuality, Balanced Budget, Budget Unity, Budgetary Principles, OECD’s Principles on Budgetary Governance, Open Budget Index, Public Expenditure and Financial Accountability, Universality.

1044 Conditioning Process of Fresh Activated Sludge

Authors: Salam K Al-Dawery, Mustafa S Nasser

Abstract:

The effects of polyelectrolytes (with cationic and anionic charges) and coagulants on fresh activated sludge have been investigated at different concentrations and pH values in a comparative fashion. The results from the experiments indicate that the cationic polyelectrolytes have a significant influence on the sludge characteristics, the degree of flocculation, and water quality indicators such as turbidity and SVI. The results show that the cationic CPAM-80 is the most effective polyelectrolyte used with respect to turbidity and SVI, despite the variations in the feed properties of the fresh activated sludge.

Keywords: Coagulant, Polyelectrolyte, Settling volume index, Turbidity.

1043 A Practical Distributed String Matching Algorithm Architecture and Implementation

Authors: Bi Kun, Gu Nai-jie, Tu Kun, Liu Xiao-hu, Liu Gang

Abstract:

Traditional parallel single string matching algorithms are usually based on the PRAM computation model and concentrate on cost-optimal design and theoretical speed. Based on the distributed string matching algorithm proposed by CHEN, a practical distributed string matching algorithm architecture is proposed in this paper. An improved single string matching algorithm based on a variant of the Boyer-Moore algorithm is also presented. We implement our algorithm on the above architecture, and the experiments prove that it is practical and efficient on distributed memory machines. Its computational complexity is O(n/p + m), where n is the length of the text, m is the length of the pattern, and p is the number of processors.
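
The O(n/p + m) figure follows from splitting the text into p blocks that overlap by m - 1 characters so that matches straddling block boundaries are not lost, with each processor scanning one block. The sketch below illustrates that partitioning with a naive per-block scan standing in for the variant Boyer-Moore searcher; the chunking details are assumptions, not the paper's architecture.

```python
def find_all(chunk, pattern, offset):
    """Report all occurrences of pattern in chunk (a naive scan stands in
    for the variant Boyer-Moore searcher used in the paper)."""
    m = len(pattern)
    return [offset + i for i in range(len(chunk) - m + 1)
            if chunk[i:i + m] == pattern]

def distributed_match(text, pattern, p):
    """Split text into p blocks with an (m-1)-character overlap; each block
    is O(n/p + m) work for one processor."""
    n, m = len(text), len(pattern)
    block = -(-n // p)                        # ceil(n / p)
    matches = []
    for proc in range(p):
        start = proc * block
        end = min(n, start + block + m - 1)   # overlap into the next block
        matches.extend(i for i in find_all(text[start:end], pattern, start)
                       if i < start + block)  # each match reported only once
    return sorted(matches)

text = "abracadabra" * 3
assert distributed_match(text, "abra", 4) == [i for i in range(len(text))
                                              if text.startswith("abra", i)]
```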

Keywords: Boyer-Moore algorithm, distributed algorithm, parallel string matching, string matching.

1042 Low Complexity Regular LDPC codes for Magnetic Storage Devices

Authors: Gabofetswe Malema, Michael Liebelt

Abstract:

LDPC codes could be used in magnetic storage devices because of their better decoding performance compared to other error correction codes. However, their hardware implementation results in large and complex decoders, which is one of the main obstacles to incorporating the decoders in magnetic storage devices. We construct small, high girth and rate, column-weight 2 codes from cage graphs. Though these codes have lower performance compared to higher column-weight codes, they are easier to implement, which makes them more suitable for applications such as magnetic recording. Cages are the smallest known regular distance graphs, which give us the smallest known column-weight 2 codes for a given size, girth and rate of the code.

Keywords: Structured LDPC codes, cage graphs.

1041 Effects of Four Dietary Oils on Cholesterol and Fatty Acid Composition of Egg Yolk in Layers

Authors: A. F. Agboola, B. R. O. Omidiwura, A. Oyeyemi, E. A. Iyayi, A. S. Adelani

Abstract:

Dietary cholesterol has elicited the most public interest as it relates to coronary heart disease. Thus, humans have been paying more attention to health, thereby reducing consumption of cholesterol-enriched food. The egg is considered one of the major sources of human dietary cholesterol. However, an alternative way to reduce the potential cholesterolemic effect of eggs is to modify the fatty acid composition of the yolk. The effects of palm oil (PO), soybean oil (SO), sesame seed oil (SSO) and fish oil (FO) supplementation in the diets of layers on egg yolk fatty acid, cholesterol, egg production and egg quality parameters were evaluated in a 42-day feeding trial. One hundred and five Isa Brown laying hens of 34 weeks of age were randomly distributed into seven groups of five replicates and three birds per replicate in a completely randomized design. Seven corn-soybean basal diets (BD) were formulated: BD+No oil (T1), BD+1.5% PO (T2), BD+1.5% SO (T3), BD+1.5% SSO (T4), BD+1.5% FO (T5), BD+0.75% SO+0.75% FO (T6) and BD+0.75% SSO+0.75% FO (T7). Five eggs were randomly sampled at day 42 from each replicate to assay for the cholesterol, fatty acid profile of egg yolk and egg quality assessment. Results showed that there were no significant (P>0.05) differences observed in production performance, egg cholesterol and egg quality parameters except for yolk height, albumen height, yolk index, egg shape index, Haugh unit, and yolk colour. There were no significant differences (P>0.05) observed in total cholesterol, high density lipoprotein and low density lipoprotein levels of egg yolk across the treatments. However, diets had an effect (P<0.05) on TAG (triacylglycerol) and VLDL (very low density lipoprotein) of the egg yolk. The highest TAG (603.78 mg/dl) and VLDL values (120.76 mg/dl) were recorded in eggs of hens on T4 (1.5% sesame seed oil) and were similar to those on T3 (1.5% soybean oil), T5 (1.5% fish oil) and T6 (0.75% soybean oil + 0.75% fish oil). However, results revealed significant (P<0.05) variations in the eggs’ total polyunsaturated fatty acid (PUFA) content. In conclusion, it is suggested that dietary oils could be included in layers’ diets to produce designer eggs low in cholesterol and high in PUFA, especially omega-3 fatty acids.

Keywords: Dietary oils, Egg cholesterol, Egg fatty acid profile, Egg quality parameters.

1040 Optimization of Quantization in Higher Order Modulations for LDPC-Coded Systems

Authors: M.Sushanth Babu, P.Krishna, U.Venu, M.Ranjith

Abstract:

In this paper, we evaluate the choice of suitable quantization characteristics for both the decoder messages and the received samples in Low Density Parity Check (LDPC) coded systems using M-QAM (Quadrature Amplitude Modulation) schemes. The analysis involves the demapper block that provides the initial likelihood values for the decoder, relating its quantization strategy to that of the decoder. A mapping strategy refers to the grouping of bits within a codeword, where each m-bit group is used to select a 2^m-ary signal in accordance with the signal labels. We further evaluate the system with mapping strategies such as Consecutive-Bit (CB) and Bit-Reliability (BR). A new demapper version, based on approximate expressions, is also presented to yield a low-complexity hardware implementation.

Keywords: Low Density Parity Check, Mapping, Demapping, Quantization, Quadrature Amplitude Modulation.

1039 Application of Genetic Algorithms to Feature Subset Selection in a Farsi OCR

Authors: M. Soryani, N. Rafat

Abstract:

Dealing with hundreds of features in character recognition systems is not unusual. This large number of features increases the computational workload of the recognition process. There have been many methods which try to remove unnecessary or redundant features and reduce feature dimensionality. Moreover, because of the characteristics of the Farsi script, it is not possible to apply algorithms developed for other languages to Farsi directly. In this paper, some methods for feature subset selection using genetic algorithms are applied to a Farsi optical character recognition (OCR) system. Experimental results show that the application of genetic algorithms (GA) to feature subset selection in a Farsi OCR results in lower computational complexity and an enhanced recognition rate.
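
As an illustration of GA-based feature subset selection, the sketch below evolves binary feature masks with tournament selection, one-point crossover and bit-flip mutation. The toy "relevance" fitness stands in for the Farsi OCR recognition rate, and all GA parameters are assumptions rather than the values used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(mask, relevance, penalty=0.02):
    """Toy fitness: summed relevance of the selected features minus a size
    penalty (stands in for the OCR recognition rate the paper optimises)."""
    return relevance[mask == 1].sum() - penalty * mask.sum()

def ga_feature_select(relevance, pop_size=40, generations=60, p_mut=0.02):
    n = len(relevance)
    pop = rng.integers(0, 2, (pop_size, n))
    for _ in range(generations):
        scores = np.array([fitness(ind, relevance) for ind in pop])
        # binary tournament selection
        picks = [max(rng.integers(0, pop_size, 2), key=lambda i: scores[i])
                 for _ in range(pop_size)]
        parents = pop[picks]
        # one-point crossover between consecutive parents
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            cut = int(rng.integers(1, n))
            children[i, cut:] = parents[i + 1, cut:]
            children[i + 1, cut:] = parents[i, cut:]
        # bit-flip mutation
        flips = rng.random(children.shape) < p_mut
        children[flips] ^= 1
        pop = children
    scores = np.array([fitness(ind, relevance) for ind in pop])
    return pop[scores.argmax()]

relevance = rng.random(100)          # 100 hypothetical Farsi OCR features
best_mask = ga_feature_select(relevance)
print(best_mask.sum(), "features kept")
```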

Keywords: Feature Subset Selection, Genetic Algorithms, Optical Character Recognition.

1038 Pilot Directional Protection Scheme Using Wireless Communication

Authors: Nitish Sharma, G. G. Karady

Abstract:

This paper presents a scheme for the protection of loop systems from all types of faults using the direction of the fault current. The presence of distributed generation in today’s systems increases the complexity of fault detection, as the power flow is bidirectional. Hence, a protection scheme specific to this purpose needs to be developed. This paper shows a fast protection scheme using communication, which can be fiber optic or wireless. In this paper, the possibility of wireless communication for protection is studied to exchange the information between the relays. The negative sequence and positive sequence directional elements are used to determine the direction of the fault current. A PSCAD simulation is presented and validated using commercial SEL relays.
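
For context, the directional decision rests on the symmetrical-component transform of the phase quantities. The sketch below computes the sequence components and adds a hedged negative-sequence directional test in which "forward" is declared when the angle of -V2/I2 falls near an assumed line impedance angle; the relay characteristics and settings used in the paper are not reproduced here.

```python
import numpy as np

A = np.exp(2j * np.pi / 3)               # 120-degree rotation operator "a"

def sequence_components(ia, ib, ic):
    """Standard symmetrical-component transform of the three phase phasors."""
    pos = (ia + A * ib + A**2 * ic) / 3   # positive sequence
    neg = (ia + A**2 * ib + A * ic) / 3   # negative sequence
    zero = (ia + ib + ic) / 3             # zero sequence
    return pos, neg, zero

def negative_seq_direction(v2, i2, line_angle_deg=85.0):
    """Declare "forward" when the angle of -V2/I2 lies near the assumed
    impedance angle (hypothetical characteristic, not the relay settings)."""
    angle = np.degrees(np.angle(-v2 / i2))
    return "forward" if abs(angle - line_angle_deg) < 90 else "reverse"

# Unbalanced currents from a single-phase fault give a negative-sequence component.
i1, i2, i0 = sequence_components(10 * np.exp(-1j * np.radians(80)), 1.0, 1.0)
v2 = -i2 * 5 * np.exp(1j * np.radians(85))   # V2 = -I2 * Zsource for a forward fault
print(negative_seq_direction(v2, i2))        # -> forward
```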

Keywords: Smart grid protection, pilot protection, power system simulation, wireless communication.
