Search results for: entity's size.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1877

1637 A NXM Version of 5X5 Playfair Cipher for any Natural Language (Urdu as Special Case)

Authors: Muhammad Salam, Nasir Rashid, Shah Khalid, Muhammad Raees Khan

Abstract:

In this paper, a modified NXM version of the traditional 5X5 Playfair cipher is introduced, which enables the user to encrypt a message in any natural language by choosing a matrix size appropriate to the size of that language's alphabet. The 5X5 matrix can store only the 26 characters of the English alphabet and cannot store the characters of a language with more than 26 characters. To overcome this limitation, the NXM matrix is introduced. A special case of the Urdu language is discussed, where # is used to complete an odd pair and * is used to separate repeated letters.
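
A minimal sketch of the generalized digraph substitution on an N x M grid is given below, following the abstract's conventions of '#' for padding an odd-length message and '*' for separating repeated letters. The alphabet, key and grid dimensions are illustrative assumptions, not the paper's Urdu construction.

```python
# Minimal sketch of an N x M Playfair-style digraph cipher. '#' pads an odd
# message and '*' separates repeated letters, as in the abstract. The alphabet,
# key and grid dimensions below are illustrative assumptions.

def build_grid(key, alphabet, rows, cols):
    """Fill a rows x cols grid with the key first, then the rest of the alphabet."""
    seen, cells = set(), []
    for ch in key + alphabet:
        if ch not in seen:
            seen.add(ch)
            cells.append(ch)
    assert len(cells) == rows * cols, "alphabet size must match grid size"
    return [cells[r * cols:(r + 1) * cols] for r in range(rows)]

def prepare(msg):
    out = []
    for ch in msg:
        if out and out[-1] == ch and len(out) % 2 == 1:
            out.append('*')          # break up a repeated letter inside a pair
        out.append(ch)
    if len(out) % 2 == 1:
        out.append('#')              # complete the final odd pair
    return out

def encrypt_pair(a, b, grid, rows, cols):
    pos = {grid[r][c]: (r, c) for r in range(rows) for c in range(cols)}
    (ra, ca), (rb, cb) = pos[a], pos[b]
    if ra == rb:                                 # same row: shift right
        return grid[ra][(ca + 1) % cols], grid[rb][(cb + 1) % cols]
    if ca == cb:                                 # same column: shift down
        return grid[(ra + 1) % rows][ca], grid[(rb + 1) % rows][cb]
    return grid[ra][cb], grid[rb][ca]            # rectangle rule: swap columns

def encrypt(msg, key, alphabet, rows, cols):
    grid = build_grid(key, alphabet, rows, cols)
    text = prepare(msg)
    pairs = [encrypt_pair(text[i], text[i + 1], grid, rows, cols)
             for i in range(0, len(text), 2)]
    return ''.join(a + b for a, b in pairs)

# Illustrative 4 x 7 grid over a 28-symbol alphabet (26 letters + '#' + '*').
ALPHABET = "abcdefghijklmnopqrstuvwxyz#*"
print(encrypt("balloon", "secret", ALPHABET, rows=4, cols=7))
```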

Keywords: cryptography, decryption, encryption, playfair cipher, traditional cipher.

PDF Downloads: 2117
1636 Automatic Visualization Pipeline Formation for Medical Datasets on Grid Computing Environment

Authors: Aboamama Atahar Ahmed, Muhammad Shafie Abd Latiff, Kamalrulnizam Abu Bakar, Zainul Ahmad Rajion

Abstract:

Distance visualization of large datasets often takes the direction of remote viewing and zooming of stored static images. However, the continuous growth in the size of datasets and visualization operations leads to insufficient performance on traditional desktop computers. Additionally, visualization techniques such as isosurface extraction depend on the resources available on the running machine and on the size of the datasets. Moreover, the continuous demand for computing power and the continuous increase in dataset size result in an urgent need for a grid computing infrastructure. However, some issues arise in current grids, such as the resources available at client machines being insufficient to process large datasets. On top of that, different output devices and different network bandwidths between the visualization pipeline components often produce output suitable for one machine but not for another. In this paper we investigate how grid services can be used to support remote visualization of large datasets and to break the constraint of physical co-location of resources by applying grid computing technologies. We present our grid-enabled architecture for visualizing large medical datasets (circa 5 million polygons) in remote interactive visualization on clients with modest resources.
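
As a point of reference for the kind of pipeline being distributed, the sketch below builds a local VTK isosurface pipeline in Python with a decimation stage for thin clients. The file name, iso-value and reduction factor are assumptions for illustration; the paper's grid-service orchestration is not reproduced.

```python
# Minimal local VTK isosurface sketch (Python bindings assumed installed).
# File name, iso-value and reduction factor are illustrative; the grid-service
# distribution of the pipeline described in the paper is not reproduced here.
import vtk

reader = vtk.vtkStructuredPointsReader()
reader.SetFileName("ct_head.vtk")          # hypothetical medical dataset

contour = vtk.vtkContourFilter()           # isosurface extraction
contour.SetInputConnection(reader.GetOutputPort())
contour.SetValue(0, 500)                   # illustrative iso-value

decimate = vtk.vtkDecimatePro()            # shrink polygon count for thin clients
decimate.SetInputConnection(contour.GetOutputPort())
decimate.SetTargetReduction(0.9)           # keep roughly 10% of the triangles

mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(decimate.GetOutputPort())

actor = vtk.vtkActor()
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)
window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)
window.Render()
interactor.Start()
```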

Keywords: Visualization, Grid computing, Medical datasets, visualization techniques, thin clients, Globus toolkit, VTK.

PDF Downloads: 1719
1635 Limestone Briquette Production and Characterization

Authors: André C. Silva, Mariana R. Barros, Elenice M. S. Silva, Douglas Y. Marinho, Diego F. Lopes, Débora N. Sousa, Raphael S. Tomáz

Abstract:

Modern agriculture requires productivity, efficiency and quality. Therefore, agricultural limestone must be applied to provide adequate amounts of calcium and magnesium carbonates in order to correct soil acidity. During limestone processing, fine particles (with average size under 400#) are generated. These particles have no economic value in the agricultural and metallurgical sectors because of their size. When limestone is used for agricultural purposes, these fine particles are easily transported by wind, generating air pollution. Therefore, briquetting, a mineral processing technique, was used to mitigate this problem, resulting in an agglomerated product suitable for agricultural use. Briquetting uses compressive pressure to agglomerate fine particles. It can be aided by agglutination agents, allowing adjustments in the shape, size and mechanical parameters of the mass. Briquettes can generate extra profit for the mineral industry, as a distinct product for agriculture, and can reduce the environmental liabilities of storing or disposing of the fine particles. The produced limestone briquettes were subjected to shatter and water-action resistance tests. The results show that after six minutes completely submerged in water the briquettes had fully disintegrated, a highly favorable result considering their use for soil acidity correction.

Keywords: Agglomeration, briquetting, limestone, agriculture.

PDF Downloads: 1532
1634 Effects of Particle Size Distribution of Binders on the Performance of Slag-Limestone Ternary Cement

Authors: Zhuomin Zou, Thijs Van Landeghem, Elke Gruyaert

Abstract:

Using supplementary cementitious materials, such as ground granulated blast-furnace slag (GGBFS) and limestone, to replace Portland cement (PC) is a promising method to reduce the carbon emissions from cement production. To efficiently use GGBFS and limestone, it is necessary to carefully select the particle size distribution (PSD) of the binders. This study investigated the effects of the PSD of binders on the performance of slag-limestone ternary cement. Based on the PSD parameters of the binders, three types of ternary cements with a similar overall PSD were designed: No. 1, fine GGBFS, medium PC, and coarse limestone; No. 2, fine limestone, medium PC, and coarse GGBFS; No. 3, fine PC, medium GGBFS, and coarse limestone. The binder contents in the ternary cements were 50% PC, 40% slag, and 10% limestone. The mortar performance of the three ternary cements was investigated in terms of flow table value, strength at 28 days, carbonation resistance, and non-steady-state chloride migration resistance at 28 days. Results show that the ternary cement with fine limestone (No. 2) has the weakest performance among the three ternary cements. The ternary cement with fine slag (No. 1) shows overall comparable performance to the ternary cement with fine PC (No. 3). Moreover, the chloride migration coefficient of the ternary cement with fine slag (No. 1) is significantly lower than those of the other two ternary cements.

Keywords: Limestone, particle size distribution, slag, ternary cement.

PDF Downloads: 273
1633 Improved MARS Ciphering Using a Metamorphic-Enhanced Function

Authors: Moataz M. Naguib, Hatem Khater, A. Baith Mohamed

Abstract:

MARS is a shared-key (symmetric) block cipher supporting a 128-bit block size and a variable key size of between 128 and 448 bits. MARS has a cryptographic core of several rounds designed to take advantage of powerful results for improving the security/performance tradeoff over existing ciphers. In this work, a new function, called the metamorphic function, is added to improve the ciphering process. This function uses XOR, rotate, invert and no-operation logical operations before and after the encryption process. The aim of these operations is to improve the MARS ciphering process and to create a high confusion criterion for the ciphertext.
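
The exact construction of the metamorphic function is not given in the abstract, so the sketch below is only a generic illustration of the idea: key bits select one of XOR, rotate, invert or no-operation per byte, applied before and after an (omitted) block-cipher stage. All operation choices and the key are assumptions.

```python
# Minimal sketch of a metamorphic byte transform (not the paper's exact design):
# two key bits select one of XOR, rotate-left, invert or no-operation per byte.

def rotl8(b, n):
    return ((b << n) | (b >> (8 - n))) & 0xFF

def rotr8(b, n):
    return ((b >> n) | (b << (8 - n))) & 0xFF

def metamorphic(data: bytes, key: bytes, decrypt=False) -> bytes:
    out = bytearray()
    for i, b in enumerate(data):
        k = key[i % len(key)]
        op = k & 0x03                       # 2 key bits choose the operation
        if op == 0:                         # XOR with the key byte (self-inverse)
            b ^= k
        elif op == 1:                       # rotate (reverse direction on decrypt)
            b = rotr8(b, 3) if decrypt else rotl8(b, 3)
        elif op == 2:                       # bitwise inversion (self-inverse)
            b = (~b) & 0xFF
        # op == 3: no-operation
        out.append(b)
    return bytes(out)

# Applied before and after a block cipher stage; the inner cipher is omitted here.
key = b"\x13\x37\xc0\xde"
c = metamorphic(b"plaintext block!", key)
assert metamorphic(c, key, decrypt=True) == b"plaintext block!"
```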

Keywords: AES, MARS, Metamorphic, Cryptography, Block Cipher.

PDF Downloads: 1998
1632 Automatic Text Summarization

Authors: Mohamed Abdel Fattah, Fuji Ren

Abstract:

This work proposes an approach to automatic text summarization. The approach is a trainable summarizer, which takes into account several features, including sentence position, positive keywords, negative keywords, sentence centrality, sentence resemblance to the title, sentence inclusion of named entities, sentence inclusion of numerical data, relative sentence length, the bushy path of the sentence and the aggregated similarity of each sentence, to generate summaries. First, we investigate the effect of each sentence feature on the summarization task. Then we use a score function over all features to train genetic algorithm (GA) and mathematical regression (MR) models and obtain a suitable combination of feature weights. The performance of the proposed approach is measured at several compression rates on a data corpus composed of 100 English religious articles. The results of the proposed approach are promising.
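
A minimal sketch of the weighted-sum scoring step is shown below. The feature values and weights are hypothetical and assumed to be pre-normalised; the GA / regression training that would produce the weights is not shown.

```python
# Minimal sketch of feature-weighted sentence ranking for extractive summaries.
# Feature values are assumed pre-computed and normalised to [0, 1]; the weights
# would come from the GA / regression training described in the abstract.

def summarize(sentences, features, weights, compression=0.2):
    """Return the top-scoring sentences, keeping `compression` of the original."""
    scores = []
    for idx, sent in enumerate(sentences):
        score = sum(w * features[idx][name] for name, w in weights.items())
        scores.append((score, idx, sent))
    keep = max(1, int(len(sentences) * compression))
    top = sorted(scores, reverse=True)[:keep]
    return [sent for _, _, sent in sorted(top, key=lambda t: t[1])]  # original order

sentences = ["First sentence.", "A middle one.", "The last sentence."]
features = [                                   # hypothetical normalised features
    {"position": 1.0, "title_sim": 0.4, "length": 0.6},
    {"position": 0.5, "title_sim": 0.1, "length": 0.3},
    {"position": 0.8, "title_sim": 0.7, "length": 0.5},
]
weights = {"position": 0.5, "title_sim": 0.3, "length": 0.2}   # e.g. learned by GA
print(summarize(sentences, features, weights, compression=0.34))
```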

Keywords: Automatic Summarization, Genetic Algorithm, Mathematical Regression, Text Features.

PDF Downloads: 2279
1631 Dynamics of Phytoplankton Blooms in the Baltic Sea – Numerical Simulations

Authors: L. Dzierzbicka-Głowacka, M. Janecki

Abstract:

The dynamics of phytoplankton blooms in the Baltic Sea have been analyzed by applying the numerical ecosystem model 3D CEMBS. The model consists of the hydrodynamic model (POP, version 2.1) and the ice model (CICE, version 4.0), which are forced by the atmospheric data model (DATM7). The 3D model has an ecosystem module, activated in 2012 in operational mode. The ecosystem model consists of 11 main variables: the biomass of small-size phytoplankton, large-size phytoplankton and cyanobacteria, zooplankton biomass, dissolved and molecular detritus, dissolved oxygen concentration, and the concentrations of nutrients, including nitrates, ammonia, phosphates and silicates. The 3D-CEMBS model is an effective tool for solving problems related to the dynamics of phytoplankton blooms in the Baltic Sea.

Keywords: Ecosystem model, phytoplankton, Baltic Sea

PDF Downloads: 2633
1630 Performance Analysis of Wireless Ad-Hoc Network Based on EDCA IEEE802.11e

Authors: Shah Ahsanuzzaman Md. Tariq, Fabrizio Granelli

Abstract:

IEEE 802.11e is the enhanced version of the IEEE 802.11 MAC dedicated to providing Quality of Service in wireless networks. It supports QoS through service differentiation and prioritization mechanisms. Data traffic receives different priorities based on its QoS requirements. Fundamentally, applications are divided into four Access Categories (AC). Each AC has its own buffer queue and behaves as an independent backoff entity. Every frame with a specific data traffic priority is assigned to one of these access categories. IEEE 802.11e EDCA (Enhanced Distributed Channel Access) is designed to enhance the IEEE 802.11 DCF (Distributed Coordination Function) mechanisms by providing a distributed access method that can support service differentiation among different classes of traffic. The performance of the IEEE 802.11e MAC layer with different ACs is evaluated to understand the actual benefits deriving from the MAC enhancements.
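
To make the per-AC differentiation concrete, the sketch below draws one contention delay per access category from typical EDCA parameters. The AIFSN/CWmin/CWmax numbers and slot timings are the commonly quoted defaults for a legacy DSSS PHY and should be treated as illustrative assumptions, not values taken from this paper.

```python
# Illustrative EDCA access-category parameters (commonly quoted defaults for a
# legacy DSSS PHY); treat the numbers as assumptions, not values from this paper.
import random

EDCA = {                     # AC:      (AIFSN, CWmin, CWmax)
    "AC_VO": (2, 7, 15),     # voice       - highest priority
    "AC_VI": (2, 15, 31),    # video
    "AC_BE": (3, 31, 1023),  # best effort
    "AC_BK": (7, 31, 1023),  # background  - lowest priority
}

SLOT, SIFS = 20e-6, 10e-6    # legacy DSSS timing in seconds (assumed)

def backoff_delay(ac):
    """Draw one contention delay for an access category (first attempt)."""
    aifsn, cwmin, _ = EDCA[ac]
    aifs = SIFS + aifsn * SLOT
    slots = random.randint(0, cwmin)      # uniform draw from [0, CWmin]
    return aifs + slots * SLOT

random.seed(1)
for ac in EDCA:
    print(ac, f"{backoff_delay(ac) * 1e6:.0f} us")
```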

Keywords: 802.11e, fairness, enhanced distributed channel access, access categories, Quality of Service.

PDF Downloads: 1905
1629 Large Eddy Simulation of Compartment Fire with Gas Combustible

Authors: Mliki Bouchmel, Abbassi Mohamed Ammar, Kamel Geudri, Chrigui Mouldi, Omri Ahmed

Abstract:

The objective of this work is to use the Fire Dynamics Simulator (FDS) to investigate the behavior of a small-scale kerosene fire. FDS is a Computational Fluid Dynamics (CFD) tool developed specifically for fire applications. Throughout its development, FDS has been used to solve practical problems in fire protection engineering and, at the same time, to study fundamental fire dynamics and combustion. Predictions are based on Large Eddy Simulation (LES) with a Smagorinsky turbulence model: LES directly computes the large-scale eddies while the sub-grid scale dissipative processes are modeled. This is the default turbulence model and was used in this study. The validation of the numerical predictions is done by direct comparison of the computed combustion variables with experimental measurements. The effect of mesh size on the temperature evolution is investigated and an optimum grid size is suggested. The effect of opening width is also investigated. Temperature distributions and species flows are presented for different operating conditions. The effect of the composition of the fuel used on atmospheric pollution is also a focus of this work. Good predictions are obtained when the size of the computational cells within the fire compartment is less than 1/10th of the characteristic fire diameter.
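
The D*/10 cell-size criterion mentioned above can be checked with the standard characteristic fire diameter expression, D* = (Q / (rho_inf * c_p * T_inf * sqrt(g)))^(2/5). The heat release rate used below is an illustrative assumption, not a value from the paper.

```python
# Worked example of the cell-size criterion: characteristic fire diameter D*
# (standard expression used in FDS practice) and the resulting D*/10 grid size.
# The heat release rate below is illustrative, not a value from the paper.

def characteristic_fire_diameter(q_kw, rho=1.204, cp=1.005, t_inf=293.0, g=9.81):
    """D* = (Q / (rho * cp * T_inf * sqrt(g)))**(2/5), with Q in kW."""
    return (q_kw / (rho * cp * t_inf * g ** 0.5)) ** 0.4

q = 500.0                                   # kW, assumed heat release rate
d_star = characteristic_fire_diameter(q)
print(f"D* = {d_star:.2f} m, recommended cell size <= {d_star / 10:.3f} m")
```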

Keywords: Large eddy simulation, Radiation, Turbulence, combustion, pollution.

PDF Downloads: 2139
1628 Preparation of Porous Carbon Particles using a Spray-Drying Method with Colloidal Template

Authors: Yutaka Kisakibaru, Asep Bayu Dani Nandiyanto, Ratna Balgis, Takashi Ogi, Kikuo Okuyama

Abstract:

Spherical porous carbon particles with controllable porosity and a mean size of 2.5 µm have been prepared using a spray-drying method with an organic colloidal particle template. As the precursor, a mixed solution of carbon nanopowder and polystyrene (PS) template particles was used. The results showed that particles with a good porous structure could be obtained. The pore size and shape (spherical) were identical to those of the initial template, offering a potential route for further developments. Control of the particle porosity was also possible and is reported in this paper; this control could be achieved by means of the PS concentration.

Keywords: Porous structure particle, Carbon nanoparticles, Catalyst, Spray-drying method.

PDF Downloads: 2045
1627 The Dividend Payments for General Claim Size Distributions under Interest Rate

Authors: Li-Li Li, Jinghai Feng, Lixin Song

Abstract:

This paper evaluates the dividend payments for general claim size distributions in the presence of a dividend barrier. The surplus of a company is modeled using the classical risk process perturbed by diffusion and, in addition, is assumed to accrue interest at a constant rate. After presenting the integro-differential equation with initial conditions that the dividend payments satisfy, the paper derives a useful expression for the dividend payments by employing the theory of Volterra equations. Furthermore, the optimal value of the dividend barrier is found. Finally, numerical examples illustrate the optimality of the optimal dividend barrier and the effects of the parameters on the dividend payments.

Keywords: Dividend payout, Integro-differential equation, Jump-diffusion model, Volterra equation.

PDF Downloads: 1759
1626 A Fuzzy Classifier with Evolutionary Design of Ellipsoidal Decision Regions

Authors: Leehter Yao, Kuei-Song Weng, Cherng-Dir Huang

Abstract:

A fuzzy classifier using multiple ellipsoids to approximate decision regions for classification is designed in this paper. The Gustafson-Kessel algorithm (GKA), with an adaptive distance norm based on the covariance matrices of the prototype data points, is adopted to learn the ellipsoids. GKA is able to adapt the distance norm to the underlying distribution of the prototype data points, except that the sizes of the ellipsoids need to be determined a priori. To overcome GKA's inability to determine an appropriate ellipsoid size, the genetic algorithm (GA) is applied to learn the ellipsoid sizes. With GA combined with GKA, it is shown in this paper that the proposed method outperforms the benchmark algorithms as well as other algorithms in the field.
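
The sketch below illustrates what an ellipsoidal decision region amounts to: a prototype center, a covariance matrix and a size parameter, with classification by smallest relative Mahalanobis distance. The centers, covariances and sizes are hypothetical; the GK clustering and the GA tuning loop described in the abstract are not reproduced.

```python
# Minimal sketch of ellipsoidal decision regions: a point is assigned to the
# ellipsoid it fits best in Mahalanobis distance relative to a size parameter.
# Centers, covariances and sizes here are hypothetical (in the paper they would
# come from GKA plus the GA-tuned sizes).
import numpy as np

class Ellipsoid:
    def __init__(self, center, cov, size, label):
        self.center = np.asarray(center, dtype=float)
        self.inv_cov = np.linalg.inv(np.asarray(cov, dtype=float))
        self.size = size          # radius in Mahalanobis units (GA-tuned)
        self.label = label

    def mahalanobis(self, x):
        d = np.asarray(x, dtype=float) - self.center
        return float(np.sqrt(d @ self.inv_cov @ d))

def classify(x, ellipsoids):
    """Assign x to the ellipsoid it fits best (smallest relative distance)."""
    best = min(ellipsoids, key=lambda e: e.mahalanobis(x) / e.size)
    return best.label

regions = [
    Ellipsoid(center=[0, 0], cov=[[1.0, 0.3], [0.3, 0.5]], size=2.0, label="A"),
    Ellipsoid(center=[4, 3], cov=[[0.8, -0.2], [-0.2, 1.2]], size=1.5, label="B"),
]
print(classify([3.5, 2.5], regions))   # -> "B"
```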

Keywords: Ellipsoids, genetic algorithm, classification, fuzzy c-means (FCM).

PDF Downloads: 1655
1625 Optimum Stratification of a Skewed Population

Authors: D.K. Rao, M.G.M. Khan, K.G. Reddy

Abstract:

The focus of this paper is to develop a technique for solving the combined problem of determining the Optimum Strata Boundaries (OSB) and the Optimum Sample Size (OSS) of each stratum when the population under study is skewed and the study variable has a Pareto frequency distribution. The problem of determining the OSB is formulated as a Mathematical Programming Problem (MPP), which is then solved by a dynamic programming technique. A numerical example is presented to illustrate the computational details of the proposed method. The proposed technique is useful for obtaining the OSB and OSS of a Pareto-type skewed population, minimizing the variance of the estimate of the population mean.
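
A minimal sketch of boundary determination by dynamic programming is given below. It discretizes the boundaries at the order statistics of a sample and minimizes the sum of W_h * S_h (the usual Neyman-allocation criterion for the variance of the stratified mean); the paper's exact MPP formulation is not reproduced, and the Pareto sample is illustrative.

```python
# Minimal sketch: choose strata boundaries by dynamic programming over the order
# statistics, minimising sum_h W_h * S_h (Neyman-allocation criterion). This is
# an illustration only; the paper's exact MPP formulation is not reproduced.
import numpy as np

def optimum_strata(x, n_strata):
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)

    def cost(i, j):                      # stratum made of x[i:j]
        seg = x[i:j]
        return (len(seg) / n) * seg.std()

    INF = float("inf")
    best = [[INF] * (n + 1) for _ in range(n_strata + 1)]
    back = [[0] * (n + 1) for _ in range(n_strata + 1)]
    best[0][0] = 0.0
    for h in range(1, n_strata + 1):
        for j in range(h, n + 1):
            for i in range(h - 1, j):
                c = best[h - 1][i] + cost(i, j)
                if c < best[h][j]:
                    best[h][j], back[h][j] = c, i

    bounds, j = [], n                    # recover the cut points
    for h in range(n_strata, 0, -1):
        i = back[h][j]
        if h > 1:
            bounds.append(x[i])
        j = i
    return sorted(bounds), best[n_strata][n]

rng = np.random.default_rng(0)
sample = (rng.pareto(2.0, size=200) + 1.0) * 10.0   # skewed, Pareto-type data
print(optimum_strata(sample, n_strata=4))
```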

Keywords: Stratified sampling, Optimum strata boundaries, Optimum sample size, Pareto distribution, Mathematical programming problem, Dynamic programming technique.

PDF Downloads: 4013
1624 Mechanical Properties and Chloride Diffusion of Ceramic Waste Aggregate Mortar Containing Ground Granulated Blast–Furnace Slag

Authors: H. Higashiyama, M. Sappakittipakorn, M. Mizukoshi, O. Takahashi

Abstract:

Ceramic Waste Aggregates (CWAs) were made from electric porcelain insulator waste supplied by an electric power company, which was crushed and ground to fine aggregate sizes. In this study, to develop the CWA mortar as an eco-efficient material, ground granulated blast-furnace slag (GGBS) was incorporated as a Supplementary Cementitious Material (SCM). The water-to-binder ratio (W/B) of the CWA mortars was varied at 0.4, 0.5, and 0.6. The cement of the CWA mortar was replaced by GGBS at 20 and 40% by volume (about 18 and 37% by weight). Mechanical properties, namely compressive and splitting tensile strengths and elastic modulus, were evaluated at the ages of 7, 28, and 91 days. Moreover, a chloride ingress test was carried out on the CWA mortars in a 5.0% NaCl solution for 48 weeks. The chloride diffusion was assessed using an electron probe microanalysis (EPMA). To relate the apparent chloride diffusion coefficient to the pore size, a pore size distribution test was also performed by mercury intrusion porosimetry at the same time as the EPMA. The compressive strength of the CWA mortars with GGBS was higher than that without GGBS at the ages of 28 and 91 days. The resistance to chloride ingress of the CWA mortar improved in proportion to the GGBS replacement level.

Keywords: Ceramic waste aggregate, Chloride diffusion, GGBS, Pore size distribution.

PDF Downloads: 1954
1623 An efficient Activity Network Reduction Algorithm based on the Label Correcting Tracing Algorithm

Authors: Weng Ming Chu

Abstract:

When faced with stochastic networks whose activities have uncertain durations, securing the network completion time becomes problematic, not only because the probability density function of the duration differs from node to node, but also because of the interdependence of network paths. As evidenced by Adlakha & Kulkarni [1], many methods and algorithms have been put forward in an attempt to resolve this issue, but most have encountered the same large-size network problem. Therefore, in this research, we focus on network reduction through a combined series/parallel mechanism. Our suggested algorithm, named the Activity Network Reduction Algorithm (ANRA), can efficiently transform a large-size network into an S/P Irreducible Network (SPIN). SPIN can enhance stochastic network analysis, as well as serve as a judgment of symmetry for graph theory.
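
The series/parallel combination rules that such a reduction repeatedly applies are simple to state: durations of activities in series add, and parallel paths complete when the slowest finishes. The Monte Carlo sketch below illustrates only these two rules on sampled durations; ANRA itself, and the detection of irreducible (SPIN) structures, are not reproduced, and the distributions are assumptions.

```python
# Minimal sketch of the series/parallel combination rules behind network
# reduction: series durations add, parallel paths take the maximum. Sampled
# (Monte Carlo) durations; ANRA and SPIN detection are not reproduced.
import numpy as np

rng = np.random.default_rng(42)
SAMPLES = 10_000

def activity(mean, sd):
    """Sampled duration of a single activity (normal, truncated at zero)."""
    return np.clip(rng.normal(mean, sd, SAMPLES), 0.0, None)

def series(*paths):
    return sum(paths)                 # element-wise: durations in series add up

def parallel(*paths):
    return np.maximum.reduce(paths)   # completion waits for the slowest path

# A -> (B | C) -> D : B and C run in parallel between A and D.
total = series(activity(3, 1),
               parallel(activity(5, 2), activity(6, 1)),
               activity(2, 0.5))
print(f"mean completion {total.mean():.2f}, P(finish <= 14) = {(total <= 14).mean():.2f}")
```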

Keywords: Series/Parallel network, Stochastic network, Network reduction, Interdictive Graph, Complexity Index.

PDF Downloads: 1339
1622 Injunctions, Disjunctions, Remnants: The Reverse of Unity

Authors: Igor Guatelli

Abstract:

The universe of aesthetic perception entails impasses about sensitive divergences that each text or visual object may be subjected to. If approached through intertextuality that is not based on the misleading notion of kinships or similarities a priori admissible, the possibility of anachronistic, heterogeneous - and non-diachronic - assemblies can enhance the emergence of interval movements, intermediate, and conflicting, conducive to a method of reading, interpreting, and assigning meaning that escapes the rigid antinomies of the mere being and non-being of things. In negative, they operate in a relationship built by the lack of an adjusted meaning set by their positive existences, with no remainders; the generated interval becomes the remnant of each of them; it is the opening that obscures the stable positions of each one. Without the negative of absence, of that which is always missing or must be missing in a text, concept, or image made positive by history, nothing is perceived beyond what has been already given. Pairings or binary oppositions cannot lead only to functional syntheses; on the contrary, methodological disturbances accumulated by the approximation of signs and entities can initiate a process of becoming as an opening to an unforeseen other, transformation until a moment when the difficulties of [re]conciliation become the mainstay of a future of that sign/entity, not envisioned a priori. A counter-history can emerge from these unprecedented, misadjusted approaches, beginnings of unassigned injunctions and disjunctions, in short, difficult alliances that open cracks in a supposedly cohesive history, chained in its apparent linearity with no remains, understood as a categorical historical imperative. Interstices are minority fields that, because of their opening, are capable of causing opacity in that which, apparently, presents itself with irreducible clarity. Resulting from an incomplete and maladjusted [at the least dual] marriage between the signs/entities that originate them, this interval may destabilize and cause disorder in these entities and their own meanings. The interstitials offer a hyphenated relationship: a simultaneous union and separation, a spacing between the entity’s identity and its otherness or, alterity. One and the other may no longer be seen without the crack or fissure that now separates them, uniting, by a space-time lapse. Ontological, semantic shifts are caused by this fissure, an absence between one and the other, one with and against the other. Based on an improbable approximation between some conceptual and semantic shifts within the design production of architect Rem Koolhaas and the textual production of the philosopher Jacques Derrida, this article questions the notion of unity, coherence, affinity, and complementarity in the process of construction of thought from these ontological, epistemological, and semiological fissures that rattle the signs/entities and their stable meanings. Fissures in a thought that is considered coherent, cohesive, formatted are the negativity that constitutes the interstices that allow us to move towards what still remains as non-identity, which allows us to begin another story.

Keywords: Clearing, interstice, negative, remnant, spectrum.

PDF Downloads: 376
1621 Studies of Rule Induction by STRIM from the Decision Table with Contaminated Attribute Values from Missing Data and Noise — In the Case of Critical Dataset Size —

Authors: Tetsuro Saeki, Yuichi Kato, Shoutarou Mizuno

Abstract:

STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induct if-then rules from a decision table, which is considered a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance and by comparison with conventional methods. However, scope for further development remains before STRIM can be applied to the analysis of real-world datasets. The first requirement is to determine the size of the dataset needed for inducting true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity for rule induction from datasets whose attribute values are contaminated by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with the rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived in the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data.
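
To see why a critical dataset size exists at all, the generic sketch below tests whether a candidate if-then rule's accuracy significantly exceeds the class prior with a one-sided binomial test (using SciPy): the same accuracy that is significant on many covered rows fails to reach significance on a handful. This is not STRIM's exact statistic; all numbers are illustrative.

```python
# Generic sketch (not STRIM's exact statistic): is a candidate rule's accuracy
# on the decision table significantly above the class prior? Too few covered
# rows cannot reach significance, hence a critical dataset size.
from scipy.stats import binomtest

def rule_is_significant(n_covered, n_correct, class_prior, alpha=0.01):
    """One-sided binomial test of P(decision | condition) > class prior."""
    if n_covered == 0:
        return False
    result = binomtest(n_correct, n_covered, p=class_prior, alternative="greater")
    return result.pvalue < alpha

# A rule covering 40 rows and getting 32 right, against a 1/3 prior:
print(rule_is_significant(n_covered=40, n_correct=32, class_prior=1/3))   # True
# A similar accuracy on only 6 covered rows is not statistically significant:
print(rule_is_significant(n_covered=6, n_correct=5, class_prior=1/3))     # False
```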

Keywords: Rule induction, decision table, missing data, noise.

PDF Downloads: 1418
1620 Soil Mass Loss Reduction during Rainfalls by Reinforcing the Slopes with the Surficial Confinement

Authors: Ramli Nazir, Hossein Moayedi

Abstract:

Soil confinement systems serve as effective solutions for any erosion control project. Various confinement systems, namely triangular, circular and rectangular, with sizes of 50, 100, and 150 mm and a depth of 10 mm, were embedded in soil samples at a slope angle of 60°. The observed soil mass losses for the confined soil systems were much smaller than those from the unconfined system. The size of the confinement and the rainfall intensity have a direct effect on the soil mass loss. The triangular and rectangular confinement systems showed the lowest and highest soil mass losses, respectively. The slopes also failed much faster in the unconfined system than in the confined systems.

Keywords: Erosion control, Soil confinement, Soil erosion, Slope stability.

PDF Downloads: 1821
1619 Practical Design Procedures of 3D Reinforced Concrete Shear Wall-Frame Structure Based on Structural Optimization Method

Authors: H. Nikzad, S. Yoshitomi

Abstract:

This study investigates and develops a structural optimization method and examines the effect of size constraints on the practical design of a reinforced concrete (RC) building structure with shear walls. The cross-sections of beams and columns and the thickness of the shear wall are considered as design variables. The objective function to be minimized is the total cost of the structure, using a simple and efficient automated structural optimization methodology implemented in MATLAB. With a modification of the mathematical formulation, the result is compared with the optimal solution without size constraints. The most suitable combination of section sizes is selected for the final design, based on linear static analysis. The findings of this study show that defining a higher upper bound for the sectional sizes significantly affects the optimal solution, and that defining size constraints plays a vital role in finding a global and practical solution during the optimization procedure. The results confirm the ability and efficiency of the proposed method in finding optimal solutions for 3D RC shear wall-frame structures.

Keywords: Structural optimization, linear static analysis, ETABS, MATLAB, RC shear wall-frame structures.

PDF Downloads: 1229
1618 Comparative Analysis of Diversity and Similarity Indices with Special Relevance to Vegetations around Sewage Drains

Authors: Ekta Singh

Abstract:

Indices summarizing community structure are used to evaluate fundamental community ecology, species interactions, biogeographical factors, and environmental stress. Some of these indices are insensitive to gross community changes induced by pollution contaminants. Diversity indices and similarity indices are reviewed considering their ecological application, both theoretical and practical. For some useful indices, empirical equations are given to calculate the expected maximum value of the index, to which the observed values can be related at any combination of sample sizes at the experimental sites. This paper examines the effects of sample size and diversity on the expected values of diversity indices and similarity indices, using various formulae. It is shown that all indices are strongly affected by sample size and diversity. In some indices this influence is greater than in others, and an attempt has been made to deal with these influences.
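
For reference, the sketch below computes the standard Shannon and Simpson diversity indices and the Jaccard and Sorensen similarity indices of the kind reviewed above. The species counts are hypothetical, and the paper's empirical expected-maximum equations are not reproduced.

```python
# Standard diversity and similarity indices (the paper's empirical
# expected-maximum equations are not reproduced). Counts are hypothetical.
import math

def shannon(counts):
    """Shannon diversity H' = -sum p_i ln p_i."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def simpson(counts):
    """Simpson's index of diversity, 1 - sum p_i^2."""
    n = sum(counts)
    return 1.0 - sum((c / n) ** 2 for c in counts)

def jaccard(a, b):
    """Jaccard similarity of two species sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def sorensen(a, b):
    """Sorensen similarity 2c / (A + B)."""
    a, b = set(a), set(b)
    return 2 * len(a & b) / (len(a) + len(b))

site1 = {"typha": 40, "cyperus": 25, "eichhornia": 10}     # hypothetical counts
site2 = {"typha": 5, "cyperus": 30, "phragmites": 20}
print(f"H'(site1) = {shannon(site1.values()):.3f}, D(site1) = {simpson(site1.values()):.3f}")
print(f"Jaccard = {jaccard(site1, site2):.2f}, Sorensen = {sorensen(site1, site2):.2f}")
```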

Keywords: Biogeographical factors, Diversity Indices, Ecology and Similarity Indices

PDF Downloads: 2928
1617 Study of Efficiency and Capability LZW++ Technique in Data Compression

Authors: Yusof. Mohd Kamir, Mat Deris. Mohd Sufian, Abidin. Ahmad Faisal Amri

Abstract:

The purpose of this paper is to show the efficiency and capability of LZW++ in data compression. The LZW++ technique is an enhancement of the existing LZW technique; a modification of the existing LZW is needed to produce the LZW++ technique. LZW reads one character at a time, whereas the LZW++ technique reads three characters at a time. This paper focuses on data compression and tests the efficiency and capability of LZW++ on different data formats such as doc, pdf and text files. Several experiments have been done with different types of data formats. The results show that the LZW++ technique is better than the existing LZW technique in terms of compressed file size.
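
For context, the sketch below is the baseline single-character LZW compressor that LZW++ builds on; the three-characters-per-step modification described in the abstract is not specified in detail here and is not reproduced.

```python
# Baseline LZW compressor (the variant LZW++ modifies). The classic version
# below consumes one character per step; the three-character reading of LZW++
# described in the abstract is not reproduced here.

def lzw_compress(data: str):
    dictionary = {chr(i): i for i in range(256)}   # start with single characters
    next_code = 256
    current, output = "", []
    for ch in data:
        candidate = current + ch
        if candidate in dictionary:
            current = candidate                    # keep extending the match
        else:
            output.append(dictionary[current])     # emit the longest known match
            dictionary[candidate] = next_code      # learn the new phrase
            next_code += 1
            current = ch
    if current:
        output.append(dictionary[current])
    return output

codes = lzw_compress("TOBEORNOTTOBEORTOBEORNOT")
print(len(codes), "codes instead of 24 characters:", codes)
```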

Keywords: Data Compression, Huffman Encoding, LZW, LZW++, RLL, Size.

PDF Downloads: 2043
1616 Ant Colony Optimization for Optimal Distributed Generation in Distribution Systems

Authors: I. A. Farhat

Abstract:

The problem of optimal planning of multiple sources of distributed generation (DG) in distribution networks is treated in this paper using an improved Ant Colony Optimization algorithm (ACO). The objective is to determine the optimal DG size and location in order to minimize the network real power losses. Considering multiple DG sources, both size and location are simultaneously optimized in a single run of the proposed ACO algorithm. The various practical constraints of the problem are taken into consideration in the problem formulation and the algorithm implementation. A radial power flow algorithm for distribution networks is adopted and applied to satisfy these constraints. To validate the proposed technique and demonstrate its effectiveness, the well-known 69-bus standard feeder test system is employed.
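
The sketch below shows the bare ACO loop for picking a single (bus, size) pair: pheromone-biased construction, evaporation and reinforcement. The radial power flow and the paper's constraints are replaced by a placeholder loss function, and every parameter is an assumption.

```python
# Minimal ACO sketch for one DG (bus, size) choice. The radial power flow and
# constraints are replaced by a placeholder loss function; all parameters are
# illustrative assumptions, not the paper's implementation.
import random

BUSES = list(range(1, 70))                 # 69-bus feeder, bus indices only
SIZES = [0.25, 0.5, 1.0, 1.5, 2.0]         # candidate DG sizes in MW (assumed)

def power_loss(bus, size):                 # placeholder for a power-flow run
    return (abs(bus - 61) + 1) * (size - 1.0) ** 2 + abs(bus - 61) * 0.01 + 0.05

def aco(iterations=50, ants=20, rho=0.1, q=1.0):
    tau = {(b, s): 1.0 for b in BUSES for s in SIZES}      # pheromone trails
    best, best_cost = None, float("inf")
    for _ in range(iterations):
        solutions = []
        for _ in range(ants):
            keys = list(tau)
            weights = [tau[k] for k in keys]
            choice = random.choices(keys, weights=weights, k=1)[0]
            solutions.append((power_loss(*choice), choice))
        for cost, choice in solutions:
            if cost < best_cost:
                best, best_cost = choice, cost
        for k in tau:                                       # evaporation
            tau[k] *= (1.0 - rho)
        for cost, choice in solutions:                      # reinforcement
            tau[choice] += q / (1.0 + cost)
    return best, best_cost

random.seed(3)
print(aco())    # best (bus, size) found for this placeholder loss function
```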

Keywords: Ant Colony Optimization (ACO), Distributed Generation (DG).

PDF Downloads: 3227
1615 Ultrasound Assisted Extraction and Microwave Assisted Extraction of Carotenoids from Melon Shells

Authors: A. Brinda Lakshmi, J. Lakshmi Priya

Abstract:

Cantaloupes (muskmelon and watermelon) contain biologically active molecules such as carotenoids, which are natural pigments used as food colorants and afford health benefits. β-carotene is the major source of carotenoids present in muskmelon and watermelon shell. Carotenoids were extracted using microwave-assisted extraction (MAE) and ultrasound-assisted extraction (UAE) utilising organic lipophilic solvents such as acetone, methanol, and hexane. The extraction conditions, namely feed-to-solvent ratio, microwave power, ultrasound frequency, temperature and particle size, were varied and optimized. It was found that the yield of carotenoids was higher using UAE than MAE, and that muskmelon gave the highest yield of carotenoids when ethanol was used as the solvent with a particle size of 0.5 mm.

Keywords: Carotenoids, extraction, muskmelon shell, watermelon shell.

PDF Downloads: 929
1614 Decision Algorithm for Smart Airbag Deployment Safety Issues

Authors: Aini Hussain, M A Hannan, Azah Mohamed, Hilmi Sanusi, Burhanuddin Yeop Majlis

Abstract:

Airbag deployment has been known to cause deaths, incidental injuries and broken bones due to deployment at low crash severity and wrong deployment decisions. Therefore, the authorities and industries have been looking for more innovative and intelligent products for future enhancement of vehicle safety systems (VSSs). Although VSS technologies have advanced considerably, they still face challenges such as how to avoid unnecessary and untimely airbag deployments that can be hazardous and fatal. Currently, most existing airbag systems deploy without regard to occupant size and position. As such, this paper focuses on occupant and crash sensing performance in frontal collisions for the new breed of so-called smart airbag systems. It provides a thorough discussion of occupancy detection, occupant size classification, occupant off-position detection to determine a safe distance zone for airbag deployment, crash-severity analysis, and airbag decision algorithms via computer modeling. The proposed system model consists of three main modules, namely occupant sensing, crash severity analysis and decision fusion. The occupant sensing module utilizes a weight sensor to determine occupancy, classify the occupant size, and detect the occupant off-position condition in order to compute a safe distance for airbag deployment. The crash severity analysis module generates information relevant to the airbag deployment decision. Outputs from these two modules are fused in the decision module for correct and efficient airbag deployment action. The computer modeling work is carried out using the Simulink, Stateflow, SimMechanics and Virtual Reality toolboxes.
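
A minimal sketch of the decision-fusion idea is shown below: occupant class (from weight), off-position distance and crash severity combine into a suppress / depowered / full-power action. All thresholds, category names and the severity encoding are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of the decision-fusion stage: occupant class, off-position
# distance and crash severity combine into a deployment action. Thresholds and
# category names are illustrative assumptions, not the paper's values.

def classify_occupant(weight_kg):
    if weight_kg < 10:
        return "empty_or_child_seat"
    if weight_kg < 35:
        return "child"
    return "adult"

def airbag_decision(weight_kg, distance_cm, crash_severity):
    """crash_severity: 0 = none, 1 = low, 2 = high (from the crash module)."""
    occupant = classify_occupant(weight_kg)
    if crash_severity == 0 or occupant == "empty_or_child_seat":
        return "suppress"
    if distance_cm < 20:                 # occupant out of position: too close
        return "suppress"
    if occupant == "child" or crash_severity == 1:
        return "deploy_low_power"        # depowered stage for small occupants
    return "deploy_full_power"

print(airbag_decision(weight_kg=75, distance_cm=45, crash_severity=2))  # full power
print(airbag_decision(weight_kg=75, distance_cm=15, crash_severity=2))  # suppress
print(airbag_decision(weight_kg=28, distance_cm=40, crash_severity=2))  # low power
```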

Keywords: Crash severity analysis, occupant size classification, smart airbag, vehicle safety system.

PDF Downloads: 4061
1613 An Improved Cuckoo Search Algorithm for Voltage Stability Enhancement in Power Transmission Networks

Authors: Reza Sirjani, Nobosse Tafem Bolan

Abstract:

Many optimization techniques available in the literature have been developed in order to solve the problem of voltage stability enhancement in power systems. However, there are a number of drawbacks in the use of previous techniques aimed at determining the optimal location and size of reactive compensators in a network. In this paper, an Improved Cuckoo Search algorithm is applied as an appropriate optimization algorithm to determine the optimum location and size of a Static Var Compensator (SVC) in a transmission network. The main objectives are voltage stability improvement and total cost minimization. The results of the presented technique are then compared with other available optimization techniques.
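
For readers unfamiliar with the metaheuristic, the sketch below is a bare cuckoo search with Levy flights (Mantegna's step-length rule) applied to a generic objective; the SVC siting/sizing objective, the power-system model and the paper's specific improvements are not reproduced, and all parameters are assumptions.

```python
# Minimal cuckoo search with Levy flights on a generic objective. The SVC
# objective, the power-system model and the paper's improvements are not
# reproduced; all parameters below are illustrative assumptions.
import math
import random

def levy_step(beta=1.5):
    """Mantegna's algorithm for a Levy-distributed step length."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u, v = random.gauss(0, sigma), random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(objective, dim, bounds, n_nests=15, pa=0.25, iters=200):
    lo, hi = bounds
    nests = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_nests)]
    fitness = [objective(x) for x in nests]
    for _ in range(iters):
        # generate a cuckoo via a Levy flight around a random nest
        i = random.randrange(n_nests)
        new = [min(hi, max(lo, x + 0.01 * levy_step())) for x in nests[i]]
        j = random.randrange(n_nests)
        if objective(new) < fitness[j]:
            nests[j], fitness[j] = new, objective(new)
        # abandon a fraction pa of the worst nests
        worst = sorted(range(n_nests), key=lambda k: fitness[k])[-int(pa * n_nests):]
        for k in worst:
            nests[k] = [random.uniform(lo, hi) for _ in range(dim)]
            fitness[k] = objective(nests[k])
    best = min(range(n_nests), key=lambda k: fitness[k])
    return nests[best], fitness[best]

random.seed(7)
sphere = lambda x: sum(v * v for v in x)          # placeholder objective
print(cuckoo_search(sphere, dim=3, bounds=(-5.0, 5.0)))
```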

Keywords: Cuckoo search algorithm, optimization, power system, var compensators, voltage stability.

PDF Downloads: 1304
1612 Affective Approach to Selected Ingmar Bergman Films

Authors: Grzegorz Zinkiewicz

Abstract:

The paper explores affective potential implicit in Bergman’s movies. This is done by the use of affect theory and the concept of affect in terms of paradigmatic and syntagmatic relations, from both diachronic and synchronic perspective. Since its inception in the early 2000s, affect theory has been applied to a number of academic fields. In Film Studies, it offers new avenues for discovering deeper, hidden layers of a given film. The aim is to show that the form and content of the films by Ingmar Bergman are determined by their inner affects that function independently of the viewer and, to an extent, are autonomous entities that can be analysed in separation from the auteur and actual characters. The paper discovers layers in Ingmar Bergman films and focuses on aspects that are often marginalised or studied from other viewpoints such as the connection between the content and visual side. As a result, a revaluation of Bergman films is possible that is more consistent with his original interpretations and comments included in his lectures, interviews and autobiography.

Keywords: Affect theory, experimental cinema, Ingmar Bergman, film as autonomous entity.

PDF Downloads: 741
1611 Wet Polymeric Precipitation Synthesis for Monophasic Tricalcium Phosphate

Authors: I. Grigoraviciute-Puroniene, K. Tsuru, E. Garskaite, Z. Stankeviciute, A. Beganskiene, K. Ishikawa, A. Kareiva

Abstract:

Tricalcium phosphate (β-Ca3(PO4)2, β-TCP) powders were synthesized using a wet polymeric precipitation method for the first time, to the best of our knowledge. The results of X-ray diffraction analysis showed the formation of an almost single Ca-deficient hydroxyapatite (CDHA) phase of poor crystallinity already at room temperature. With increasing calcination temperature up to 800 °C, crystalline β-TCP was obtained as the main phase. It was demonstrated that infrared spectroscopy is a very effective method for characterizing the formation of β-TCP. The SEM results showed that the β-TCP solids were homogeneous, with a narrow particle size distribution. The β-TCP powders consisted of spherical particles varying in size from 100 to 300 nm. The fabricated β-TCP specimens were implanted into the bones of rats and maintained for 1-2 months.

Keywords: β-TCP, bone regeneration, wet chemical processing, polymeric precipitation.

PDF Downloads: 1007
1610 Structure and Magnetic Properties of Nanocomposite Fe2O3/TiO2 Catalysts Fabricated by Heterogeneous Precipitation

Authors: Jana P. Vejpravova, Daniel Niznansky, Vaclav Vales, Barbara Bittova, Vaclav Tyrpekl, Stanislav Danis, Vaclav Holy, Stephen Doyle

Abstract:

The aim of our work is to study the phase composition, particle size and magnetic response of Fe2O3/TiO2 nanocomposites with respect to the final annealing temperature. These nanomaterials are considered smart catalysts, separable from a liquid/gaseous phase by an applied magnetic field. The starting product was obtained by an ecologically acceptable route based on heterogeneous precipitation of TiO2 on modified γ-Fe2O3 nanocrystals dispersed in water. The precursor was subsequently annealed in air at temperatures ranging from 200 °C to 900 °C. The samples were investigated by synchrotron X-ray powder diffraction (S-PXRD), magnetic measurements and Mössbauer spectroscopy. As evidenced by S-PXRD and Mössbauer spectroscopy, increasing the annealing temperature causes the phase composition to evolve from anatase/maghemite to rutile/hematite; finally, above 700 °C, pseudobrookite (Fe2TiO5) also forms. The apparent particle size of the various Fe2O3/TiO2 phases has been determined from the high-quality S-PXRD data using two different approaches: Rietveld refinement and the Debye method. The magnetic response of the samples is discussed considering the phase composition and the particle size.

Keywords: X-ray diffraction, profile analysis, Mössbauer spectroscopy, magnetic properties, TiO2, Fe2O3, Fe2TiO5

PDF Downloads: 2069
1609 Service-Oriented Architecture for Object- Centric Information Fusion

Authors: Jeffrey A. Dunne, Kevin Ligozio

Abstract:

In many applications there is a broad variety of information relevant to a focal "object" of interest, and the fusion of such heterogeneous data types is desirable for classification and categorization. While these various data types can sometimes be treated as orthogonal (such as the hull number, superstructure color, and speed of an oil tanker), there are instances where the inference and the correlation between quantities can provide improved fusion capabilities (such as the height, weight, and gender of a person). A service-oriented architecture has been designed and prototyped to support the fusion of information for such "object-centric" situations. It is modular, scalable, and flexible, and designed to support new data sources, fusion algorithms, and computational resources without affecting existing services. The architecture is designed to simplify the incorporation of legacy systems, support exact and probabilistic entity disambiguation, recognize and utilize multiple types of uncertainties, and minimize network bandwidth requirements.

Keywords: Data fusion, distributed computing, service-oriented architecture, SOA

PDF Downloads: 1430
1608 Normalization Discriminant Independent Component Analysis

Authors: Liew Yee Ping, Pang Ying Han, Lau Siong Hoe, Ooi Shih Yin, Housam Khalifa Bashier Babiker

Abstract:

In face recognition, feature extraction techniques attempt to find an appropriate representation of the data. However, when the feature dimension is larger than the sample size, performance degrades. Hence, we propose a method called Normalization Discriminant Independent Component Analysis (NDICA). The input data are regularized to obtain the most reliable features and then processed using Independent Component Analysis (ICA). The proposed method is evaluated on three face databases: Olivetti Research Ltd (ORL), Face Recognition Technology (FERET) and Face Recognition Grand Challenge (FRGC). NDICA shows its effectiveness compared with other unsupervised and supervised techniques.
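
The normalise-then-ICA idea can be illustrated with the generic scikit-learn sketch below: standardise the small-sample data, reduce dimension (here by PCA whitening, used as a stand-in regularising step) to avoid the dimension-greater-than-sample-size problem, then extract independent components. NDICA's exact regularisation is not reproduced, and the data are toy values.

```python
# Generic sketch of normalise-then-ICA (NDICA's exact regularisation is not
# reproduced): standardise, reduce dimension with whitened PCA to sidestep the
# feature-dimension > sample-size problem, then run FastICA.
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 1024))        # 40 face samples, 1024 pixel features (toy data)

X_std = StandardScaler().fit_transform(X)                        # zero mean, unit variance
X_red = PCA(n_components=20, whiten=True).fit_transform(X_std)   # dimension-reducing step
ica = FastICA(n_components=10, random_state=0, max_iter=1000)
features = ica.fit_transform(X_red)                              # independent components

print(features.shape)                                            # (40, 10)
```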

Keywords: Face recognition, small sample size, regularization, independent component analysis.

PDF Downloads: 1922