Search results for: Connected component labeling
388 Design and Construction of an Impulse Current Generator for Lightning Strike Experiments
Authors: Kamran Yousefpour, Mojtaba Rostaghi-Chalaki, Jason Warden, David Wallace, Chanyeop Park
Abstract:
There has been a rising trend in using impulse current generators to investigate the lightning strike protection of materials, including aluminum and composites, in structures such as wind turbine blades and aircraft bodies. The focus of this research is to present an impulse current generator built in the High Voltage Lab at Mississippi State University. The generator is capable of producing components A and D of natural lightning discharges in accordance with the Society of Automotive Engineers (SAE) standard, which is widely used in the aerospace industry. The generator can supply lightning impulse energy up to 400 kJ and can produce impulse currents with magnitudes greater than 200 kA. The electrical circuit and physical components of the improved impulse current generator are described, and several lightning strike waveforms with different amplitudes are presented for comparison with the standard waveform. The results of this study contribute to the fundamental understanding of the functionality of impulse current generators and present an impulse current generator developed at the High Voltage Lab of Mississippi State University.
Keywords: impulse current generator, lightning, society of automotive engineers, capacitor
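As a rough illustration of the double-exponential form commonly used to idealise such impulse test waveforms, a minimal Python sketch follows. The parameter values are illustrative only and are not taken from the paper; any real comparison should use the constants given in the SAE standard itself.

```python
import numpy as np

def double_exponential(t, i0, alpha, beta):
    """Idealised double-exponential impulse current i(t) = I0*(exp(-alpha*t) - exp(-beta*t)),
    the usual analytic form for lightning test waveforms."""
    return i0 * (np.exp(-alpha * t) - np.exp(-beta * t))

# Illustrative parameter set only, chosen to give a ~200 kA peak comparable to a
# component-A-like pulse; the authoritative values are those of the SAE standard.
t = np.linspace(0.0, 500e-6, 5001)                       # 0-500 microseconds
i = double_exponential(t, i0=218.8e3, alpha=11354.0, beta=647265.0)
print(f"peak ~ {i.max()/1e3:.0f} kA at t ~ {t[np.argmax(i)]*1e6:.1f} us")
```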
387 Thermal Insulating Silicate Materials Suitable for Thermal Insulation and Rehabilitation Structures
Authors: J. Hroudova, M. Sedlmajer, J. Zach
Abstract:
The insulation of building structures is often closely connected with the problem of moisture remediation. In the case of historic buildings, or when only part of the structural envelope is being redeveloped, it is not possible to apply classical external thermal insulation composite systems. In such cases the most effective solution is thermal insulation plasters with high porosity and controlled capillary properties, which improve the thermal properties of the construction, keep it diffusion-open towards the external environment, and provide capillary behaviour that prevents liquid moisture and its salts from penetrating towards the outer surface of the structure. With respect to the current trend of reducing the energy consumption of building structures and reducing CO2 production, it is necessary to develop capillary-active materials characterized by low density and low thermal conductivity while maintaining good mechanical properties. The aim of researchers at the Faculty of Civil Engineering, Brno University of Technology, is the development and study of the hygrothermal behaviour of optimal materials for the thermal insulation and rehabilitation of building structures, with the possible use of alternative, less energy demanding binders in comparison with the conventional, frequently used binder, cement. The paper describes the evaluation of research activities aimed at the development of thermal insulation and repair materials using lightweight aggregate and alternative binders such as metakaolin and finely ground fly ash.
Keywords: Thermal insulating plasters, rehabilitation materials, thermal conductivity, lightweight aggregate, alternative binders.
386 Solid Waste Management through Mushroom Cultivation – An Eco Friendly Approach
Authors: Mary Josephine
Abstract:
Waste from one process can be the input of another sector, reducing environmental pollution. Today, more and more solid waste is generated, but only a very small amount of it is recycled, so the threat that environmental pressure poses to public health is serious. The methods usually considered for the treatment of solid waste are biogas tanks or processing into animal feed and fertilizer; however, these have not performed well. An alternative approach is growing mushrooms on waste residues. This is regarded as an environmentally friendly solution with potential economic benefit. Substrate producers do their best to produce quality substrate at low cost. Among other methods, this can be achieved by employing biologically degradable wastes as the resource material component of the substrate. Mushroom growing is a significant tool for the restoration, replenishment and remediation of Earth's overburdened ecosphere. One of the rational methods of waste utilization involves locally available wastes. The present study aims to find the yield of mushrooms grown on locally available waste obtained for free, and to conserve our environment by recycling wastes.
Keywords: Biodegradable, environment, mushroom, remediation.
385 A Semi-Fragile Signature based Scheme for Ownership Identification and Color Image Authentication
Authors: M. Hamad Hassan, S.A.M. Gilani
Abstract:
In this paper, a novel scheme is proposed for ownership identification and authentication of color images, deploying cryptography and digital watermarking as the underlying technologies. The former is used to compute the content-based hash and the latter to embed the watermark. The host image, on behalf of which rightful ownership will be claimed, is first transformed from RGB to the YST color space, designed exclusively for watermarking-based applications. Geometrically YS ⊥ T, and the T channel corresponds to the chrominance component of the color image, making it suitable for embedding the watermark. The T channel is divided into 4×4 non-overlapping blocks. The block size is important for enhanced localization, security and low computation. Each block, along with the ownership information, is then processed by SHA160, a one-way hash function, to compute the content-based hash, which is always unique and resistant against birthday attacks, instead of using MD5, which may give rise to collisions, i.e. H(m) = H(m'). The watermark payload varies from block to block and is computed from the variance factor α. The quality of the watermarked images is quite high, both subjectively and objectively. Our scheme is blind, computationally fast, and exactly locates the tampered region.
Keywords: Hash Collision, LSB, MD5, PSNR, SHA160.
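A minimal Python sketch of the block-wise content-hash idea described above is shown below, assuming hashlib.sha1 (a 160-bit digest) as a stand-in for the paper's "SHA160"; the variance-based payload computation and the actual embedding step are not reproduced.

```python
import hashlib
import numpy as np

def block_content_hashes(t_channel, owner_info, block=4):
    """Compute a 160-bit content hash for each non-overlapping 4x4 block of the
    T channel together with the ownership information (a simplified sketch)."""
    h, w = t_channel.shape
    hashes = {}
    for y in range(0, h - h % block, block):
        for x in range(0, w - w % block, block):
            blk = np.ascontiguousarray(t_channel[y:y + block, x:x + block])
            digest = hashlib.sha1(blk.tobytes() + owner_info.encode()).digest()
            hashes[(y, x)] = digest          # 20 bytes = 160 bits per block
    return hashes
```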
384 Random Projections for Dimensionality Reduction in ICA
Authors: Sabrina Gaito, Andrea Greppi, Giuliano Grossi
Abstract:
In this paper we present a technique to speed up ICA, based on the idea of reducing the dimensionality of the data set while preserving the quality of the results. In particular, we refer to the FastICA algorithm, which uses the kurtosis as the statistical property to be maximized. By performing a particular Johnson-Lindenstrauss-like projection of the data set, we find the minimum dimensionality reduction rate ρ, defined as the ratio between the size k of the reduced space and the original one d, which guarantees a narrow confidence interval for such an estimator with a high confidence level. The derived dimensionality reduction rate depends on a system control parameter β that is easily computed a priori from the observations only. Extensive simulations have been carried out on different sets of real-world signals. They show that the achievable dimensionality reduction is in fact very high, that it preserves the quality of the decomposition, and that it impressively speeds up FastICA. On the other hand, a set of signals for which the estimated reduction rate is greater than 1 exhibits poor decomposition results if reduced, thus validating the reliability of the parameter β. We are confident that our method will lead to a better approach to real-time applications.
Keywords: Independent Component Analysis, FastICA algorithm, Higher-order statistics, Johnson-Lindenstrauss lemma.
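A minimal sketch of the overall pipeline (random projection followed by FastICA) is given below; the paper's rule for choosing k from the control parameter β is not reproduced, so k is simply supplied by the caller.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.random_projection import GaussianRandomProjection

def rp_fastica(X, k, random_state=0):
    """Project the d-dimensional observations onto a k-dimensional
    Johnson-Lindenstrauss-style random subspace (rho = k/d), then run FastICA."""
    rp = GaussianRandomProjection(n_components=k, random_state=random_state)
    X_red = rp.fit_transform(X)                 # shape (n_samples, k)
    ica = FastICA(n_components=k, random_state=random_state)
    sources = ica.fit_transform(X_red)          # estimated independent components
    return sources, rp, ica

# Toy usage: X has shape (n_samples, d); reduce to k = 10 dimensions.
X = np.random.default_rng(0).standard_normal((2000, 50))
S, _, _ = rp_fastica(X, k=10)
```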
383 Structural Reliability of Existing Structures: A Case Study
Authors: Z. Sakka, I. Assakkaf, T. Al-Yaqoub, J. Parol
Abstract:
A reliability-based methodology for the assessment and evaluation of reinforced concrete (R/C) structural elements of concrete structures is presented herein. The results of the reliability analysis and assessment for the R/C structural elements were verified against the results obtained through deterministic methods. The outcomes of the reliability-based analysis were compared with currently adopted safety limits, expressed as reliability indices β according to international standards and codes. The methodology is based on probabilistic analysis, using reliability concepts and the statistics of the main random variables relevant to the subject matter, which enter the performance-function equation(s) associated with the structural elements under study. These techniques yield the reliability index β, a reliability measure that can be used to assess and evaluate the safety, human risk, and functionality of the structural component. The methods can also produce revised partial safety factors for specified target reliability indices, which can be used for redesigning the R/C elements of the building and can assist in considering other remedial actions to improve the safety and functionality of the member.
Keywords: Concrete Structures, FORM, Monte Carlo Simulation, Structural Reliability.
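A crude Monte Carlo counterpart to FORM, sketched below, illustrates how a reliability index β is obtained from a limit-state function; the limit state, distributions and parameters are hypothetical and are not the paper's case-study data.

```python
import numpy as np
from scipy.stats import lognorm, norm

def reliability_index(limit_state, dists, n=200_000, seed=0):
    """Monte Carlo estimate of Pf = P[g(X) < 0] and beta = -Phi^{-1}(Pf)."""
    rng = np.random.default_rng(seed)
    x = np.column_stack([d.rvs(size=n, random_state=rng) for d in dists])
    pf = np.mean(limit_state(x) < 0.0)
    return pf, -norm.ppf(pf)

# Hypothetical flexural limit state g = R - S (resistance minus load effect).
g = lambda x: x[:, 0] - x[:, 1]
dists = [lognorm(s=0.10, scale=180.0),      # R: lognormal, median 180 kN*m (illustrative)
         norm(loc=120.0, scale=18.0)]       # S: normal load effect (illustrative)
pf, beta = reliability_index(g, dists)
print(f"Pf = {pf:.2e}, beta = {beta:.2f}")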
382 Evaluation of Thrombolytic Activity of Zingiber cassumunar Roxb. and Thai Herbal Prasaplai Formula
Authors: Warachate Khobjai, Suriyan Sukati, Khemjira Jarmkom, Pattaranut Eakwaropas, Surachai Techaoei
Abstract:
The purpose of this study was to investigate the in vitro thrombolytic activity of Zingiber cassumunar Roxb. and Prasaplai, a Thai herbal formulation containing Z. cassumunar Roxb. Herbs were extracted with boiling water and concentrated by lyophilization. To observe their thrombolytic potential, an in vitro clot lysis method was applied, in which streptokinase and sterile distilled water were used as positive and negative controls, respectively. Crude aqueous extracts of Z. cassumunar Roxb. and the Prasaplai formula showed significant thrombolytic activity, with clot lysis of 17.90% and 25.21%, respectively, compared to the negative control, water (5.16%), while the standard streptokinase showed 64.78% clot lysis. These findings suggest that Z. cassumunar Roxb. exhibits moderate thrombolytic activity and could play an important role in the thrombolytic properties of the Prasaplai formula. However, further study should be done to observe the in vivo clot-dissolving potential and to isolate the active component(s) of these extracts.
Keywords: Aqueous extract, prasaplai formula, thrombolytic activity, Zingiber cassumunar Roxb.
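The percentage clot lysis values quoted above are conventionally obtained from clot weights before and after treatment; a hedged statement of the usual in vitro calculation (the paper's exact protocol is not given in the abstract) is:

```latex
\%\,\text{clot lysis} \;=\; \frac{W_{\text{clot, before}} - W_{\text{clot, after}}}{W_{\text{clot, before}}} \times 100
```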
381 Wavelet Enhanced CCA for Minimization of Ocular and Muscle Artifacts in EEG
Authors: B. S. Raghavendra, D. Narayana Dutt
Abstract:
Electroencephalogram (EEG) recordings are often contaminated with ocular and muscle artifacts. In this paper, canonical correlation analysis (CCA) is used as a blind source separation (BSS) technique (BSS-CCA) to decompose the artifact-contaminated EEG into component signals. We combine the BSS-CCA technique with a wavelet filtering approach to minimize both ocular and muscle artifacts simultaneously, and refer to the proposed method as wavelet enhanced BSS-CCA. In this approach, after careful visual inspection, the muscle artifact components are discarded and the ocular artifact components are subjected to wavelet filtering to retain high-frequency cerebral information, after which the clean EEG is reconstructed. The performance of the proposed wavelet enhanced BSS-CCA method is tested on real EEG recordings contaminated with ocular and muscle artifacts, with the power spectral density used as a quantitative measure. Our results suggest that the proposed hybrid approach minimizes ocular and muscle artifacts effectively, while minimally affecting the underlying cerebral activity in the EEG recordings.
Keywords: Blind source separation, Canonical correlation analysis, Electroencephalogram, Muscle artifact, Ocular artifact, Power spectrum, Wavelet threshold.
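A minimal sketch of the two building blocks, assuming the common formulation of BSS-CCA as canonical correlation between the data and a one-sample-delayed copy, is given below; the paper's component-selection step and its exact wavelet filtering rule are simplified. Cleaned EEG would then be rebuilt from the retained and filtered sources via the estimated mixing coefficients.

```python
import numpy as np
import pywt
from sklearn.cross_decomposition import CCA

def bss_cca(X):
    """BSS by CCA: canonical variates between the EEG and a one-sample-delayed
    copy give maximally autocorrelated components. X: (n_channels, n_samples)."""
    Y, Yd = X[:, :-1].T, X[:, 1:].T              # data and delayed copy
    cca = CCA(n_components=X.shape[0], max_iter=1000)
    S, _ = cca.fit_transform(Y, Yd)              # canonical variates = sources
    A, *_ = np.linalg.lstsq(S, Y, rcond=None)    # Y ~= S @ A (mixing coefficients)
    return S.T, A

def keep_high_frequency(source, wavelet="sym4", level=5):
    """Drop the low-frequency (ocular-dominated) approximation coefficients of a
    component and keep the higher-frequency details; a simplified stand-in for
    the paper's wavelet filtering step."""
    coeffs = pywt.wavedec(source, wavelet, level=level)
    coeffs[0] = np.zeros_like(coeffs[0])
    return pywt.waverec(coeffs, wavelet)[:source.size]
```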
380 Zero Carbon & Low Energy Housing; Comparative Analysis of Two Persian Vernacular Architectural Solutions to Increase Energy Efficiency
Authors: N. Poorang
Abstract:
In order to respond to human needs, all regional, social and economic factors are drawn upon to achieve residents' comfort and an ideal architecture. There is no doubt that thermal comfort has to satisfy people not only in their daily and physical activities but also by creating a pleasant space for mental activities and relaxation. This comfort costs energy and increases greenhouse gas emissions.
Reducing energy use in buildings is a critical component of meeting carbon reduction commitments. Hence, housing design represents a major opportunity to cut energy use and CO2 emissions.
In terms of energy efficiency, it is vital to propose and research modern design methods for buildings; however, vernacular architecture techniques are proven, empirically established practices that also have to be considered. This research compares two architectural solutions proposed by Persian vernacular architecture to achieve energy efficiency in hot areas.
The aim of this research is to analyze two forms of traditional Persian architecture in different locations in order to develop systematic research and sustainable technologies for adaptation to contemporary living standards.
Keywords: Comparative Analysis, Persian Vernacular Architecture, Sustainable architecture.
379 Software Reliability Prediction Model Analysis
Authors: L. Mirtskhulava, M. Khunjgurua, N. Lomineishvili, K. Bakuria
Abstract:
Software reliability prediction provides a great opportunity to measure the software failure rate at any point throughout system test. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. In this article we focus on a software reliability model, assuming that there is time redundancy, the value of which (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that not only irreversible failures may occur in the system, but also failures that can be treated as self-repairing failures, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of the transmission of an instruction sequence, which consists of a random number of basic blocks. We consider the system software unreliable; the time between adjacent failures has an exponential distribution.
Keywords: Exponential distribution, conditional mean time to failure, distribution function, mathematical model, software reliability.
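For reference, the exponential law assumed above for times between adjacent failures has the standard forms below; as a general probabilistic fact (not the paper's specific model, which also includes time redundancy and self-repairing failures), the sum of n independent such times follows an Erlang distribution.

```latex
F(t) = 1 - e^{-\lambda t}, \qquad f(t) = \lambda e^{-\lambda t}, \qquad \mathrm{MTTF} = \frac{1}{\lambda},
\qquad
F_n(t) = 1 - e^{-\lambda t}\sum_{k=0}^{n-1}\frac{(\lambda t)^k}{k!}
```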
378 Outsourcing the Front End of Innovation
Abstract:
The paper presents a new method for efficient innovation process management. Even though innovation management methods, tools and knowledge are well established and documented in the literature, most companies still do not manage innovation efficiently. Especially in SMEs, the front end of innovation - problem identification, idea creation and selection - is often not performed optimally. Our eMIPS methodology represents a sort of "umbrella methodology" - a well-defined set of procedures which can be dynamically adapted to the concrete case in a company. In daily practice, various methods (e.g. for problem identification and idea creation) can be applied, depending on the company's needs. It is based on the proactive involvement of the company's employees, supported by the appropriate methodology and external experts. The presented phases are performed via a mixture of face-to-face activities (workshops) and online (eLearning) activities taking place in the eLearning Moodle environment and using other e-communication channels. One part of the outcome is an identified set of opportunities and concrete solutions ready for implementation. The other, also very important, result concerns the innovation competences gained by the participating employees with respect to concrete tools and methods for idea management. In addition, the employees gain strong experience in dynamic, efficient and solution-oriented management of the invention process. eMIPS also represents a way of establishing or improving the innovation culture in the organization. The first results in a pilot company showed excellent outcomes regarding the motivation of participants as well as the results achieved.
Keywords: Creativity, distance learning, front end, innovation, problem.
377 On the Development of a Homogenized Earthquake Catalogue for Northern Algeria
Authors: I. Grigoratos, R. Monteiro
Abstract:
Regions with a significant percentage of non-seismically designed buildings and limited urban planning are particularly vulnerable to natural hazards. In this context, the project 'Improved Tools for Disaster Risk Mitigation in Algeria' (ITERATE) aims at seismic risk mitigation in Algeria. Past earthquakes in northern Algeria caused extensive damage, e.g. the El Asnam 1980 moment magnitude (Mw) 7.1 and Boumerdes 2003 Mw 6.8 earthquakes. This paper addresses a number of proposed developments and considerations made towards a further improvement of the seismic hazard component. Specifically, an updated earthquake catalog (up to the year 2018) is compiled, and new conversion equations to moment magnitude are introduced. Furthermore, a network-based method for the estimation of the spatial and temporal distribution of the minimum magnitude of completeness is applied. We found relatively large values of Mc, due to the sparse network, and a nonlinear trend between Mw and body-wave (mb) or local magnitude (ML), which are the most common scales reported in the region. Lastly, the resulting b-value of the Gutenberg-Richter distribution is sensitive to the declustering method.
Keywords: Conversion equation, magnitude of completeness, seismic events, seismic hazard.
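For readers unfamiliar with the b-value mentioned above, the textbook Aki/Utsu maximum-likelihood estimator for events above the magnitude of completeness Mc is sketched below; this is the standard formula, not necessarily the exact procedure used in the paper.

```python
import numpy as np

def b_value_mle(magnitudes, mc, dm=0.1):
    """Aki/Utsu maximum-likelihood Gutenberg-Richter b-value for events with
    M >= Mc, with the usual half-bin correction for a catalogue binned at dm.
    Returns (b, first-order standard error per Aki, 1965)."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= mc]
    b = np.log10(np.e) / (m.mean() - (mc - dm / 2.0))
    return b, b / np.sqrt(m.size)
```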
376 Lattice Boltzmann Simulation of Binary Mixture Diffusion Using Modern Graphics Processors
Authors: Mohammad Amin Safi, Mahmud Ashrafizaadeh, Amir Ali Ashrafizaadeh
Abstract:
A highly optimized implementation of binary mixture diffusion with no initial bulk velocity on graphics processors is presented. The lattice Boltzmann model is employed for simulating the binary diffusion of oxygen and nitrogen into each other with different initial concentration distributions. Simulations have been performed using the latest proposed lattice Boltzmann model that satisfies both the indifferentiability principle and the H-theorem for multi-component gas mixtures. Contemporary numerical optimization techniques, such as memory alignment and increasing the multiprocessor occupancy, are exploited along with some novel optimization strategies to enhance the computational performance on graphics processors using the C for CUDA programming language. A speedup of more than two orders of magnitude over single-core processors is achieved on a variety of Graphical Processing Unit (GPU) devices, ranging from conventional graphics cards to advanced, high-end GPUs, while the numerical results are in excellent agreement with the available analytical and numerical data in the literature.
Keywords: Lattice Boltzmann model, Graphical processing unit, Binary mixture diffusion, 2D flow simulations, Optimized algorithm.
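To illustrate the collide-and-stream pattern that such solvers iterate, a minimal single-scalar D2Q9 diffusion sketch in plain numpy is given below; it is not the multi-component model of the paper and contains none of the CUDA optimisations discussed above.

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights (cs^2 = 1/3)
E = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
W = np.array([4/9, 1/9, 1/9, 1/9, 1/9, 1/36, 1/36, 1/36, 1/36])

def lbm_diffusion(c0, tau=0.8, steps=2000):
    """Scalar D2Q9 lattice Boltzmann diffusion with zero bulk velocity
    (diffusivity D = (tau - 0.5)/3) on a periodic grid."""
    f = W[:, None, None] * c0[None, :, :]            # equilibrium initialisation
    for _ in range(steps):
        c = f.sum(axis=0)                            # concentration (0th moment)
        feq = W[:, None, None] * c[None, :, :]       # zero-velocity equilibrium
        f += (feq - f) / tau                         # BGK collision
        for i in range(9):                           # streaming with periodic wrap
            f[i] = np.roll(f[i], shift=(E[i, 1], E[i, 0]), axis=(0, 1))
    return f.sum(axis=0)

# Example: a concentration step profile relaxing by diffusion.
c0 = np.zeros((64, 128)); c0[:, :64] = 1.0
c_final = lbm_diffusion(c0)
```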
375 Assessing Extension of Meeting System Performance in Information Technology in Defense and Aerospace Project
Authors: Hakan Gürkan, Ahmet Denker
Abstract:
The Ministry of Defense (MoD) spends hundreds of millions of dollars on software to support its infrastructure, operate its weapons and provide command, control, communications, computing, intelligence, surveillance, and reconnaissance (C4ISR) functions. These and other advanced systems share a common critical component: information technology. The defense and aerospace environment is continuously striving to keep up with increasingly sophisticated Information Technology (IT) in order to remain effective in today's dynamic and unpredictable threat environment. This makes IT one of the largest and fastest growing expenses of defense. Hundreds of millions of dollars are spent each year on IT projects, but too many of those millions are wasted on costly mistakes: systems that do not work properly, new components that are not compatible with old ones, trendy new applications that do not really satisfy defense needs, or losses through poorly managed contracts. This paper investigates and compiles effective strategies that aim to end the exasperation with low returns and the high cost of information technology acquisition for defense; it tries to show how to maximize value while reducing time and expenditure.
Keywords: Iterative process, Acquisition management, Project management, Software economics, Requirement analysis.
374 Evaluation of Stent Performances using FEA considering a Realistic Balloon Expansion
Authors: Won-Pil Park, Seung-Kwan Cho, Jai-Young Ko, Anders Kristensson, S.T.S. Al-Hassani, Han-Sung Kim, Dohyung Lim
Abstract:
Most previous studies rarely considered the effects of transient non-uniform balloon expansion when evaluating the properties and behaviors of stents during stent expansion, nor did they determine parameters to maximize the performance driven by mechanical characteristics. Therefore, in order to fully understand the mechanical characteristics and behaviors of a stent, it is necessary to consider realistic modeling of transient non-uniform balloon-stent expansion. The aim of the study is to propose design parameters capable of improving the ability of vascular stents through a comparative study of seven commercial stents using finite element analyses of a realistic transient non-uniform balloon-stent expansion process. In this study, seven representative commercialized stents were evaluated by finite element (FE) analysis in terms of criteria based on the itemized list of the Food and Drug Administration (FDA) and European Standards (prEN). The results indicate that using stents composed of open unit cells connected by bend-shaped link structures, and controlling the geometrical and morphological features of the unit cell strut or the link structure at the distal ends of the stent, may improve the mechanical characteristics of the stent. This study provides a better method for realistic transient non-uniform balloon-stent expansion by investigating the characteristics, behaviors, and parameters capable of improving the ability of vascular stents.
Keywords: Finite element analysis, Mechanical characteristic, Transient non-uniform balloon-stent expansion, Vascular stent.
373 Reliability Assessment for Tie Line Capacity Assistance of Power Systems Based On Multi-Agent System
Authors: Nadheer A. Shalash, Abu Zaharin Bin Ahmad
Abstract:
Technological developments and industrial innovations are currently closely related to interconnected system assistance and distribution networks. This is important in order to enable an electrical load to continue receiving power in the event of its disconnection from the main power grid. This paper presents a method for the reliability assessment of interconnected power systems based on a multi-agent system. The multi-agent system consists of four agents. The first is the generator agent, which connects the generator to the grid depending on the state of the reserve margin and the load demand. The second is the load agent, located at the load. The third is the so-called reserve margin agent, which limits the reserve margin to between 0 and 25%, depending on the load and the generator unit size. Finally, the reliability calculation agent computes the expected energy not supplied (EENS), the loss of load expectation (LOLE) and the effect of tie line capacity in order to determine the risk levels. The Roy Billinton Test System (RBTS) is used to evaluate the reliability indices by means of the developed JADE package. The estimated reliability results for the interconnected power systems are presented in this paper. The overall reliability of the power system can be improved, so that the system copes better with increasing demand and the generation units are operated in relation to the reliability indices.
Keywords: Reliability indices, Load expectation, Reserve margin, Daily load, Probability, Multi-agent system.
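A state-sampling Monte Carlo sketch of two of the indices named above (LOLE and EENS) is shown below; the data are illustrative rather than the RBTS, and tie-line assistance and the agent logic of the paper are not modelled.

```python
import numpy as np

def lole_eens(unit_caps_mw, unit_fors, hourly_load_mw, n_trials=20_000, seed=0):
    """Estimate LOLE (h/yr) and EENS (MWh/yr): each trial samples an up/down
    state for every unit from its forced outage rate (FOR) and compares the
    available capacity against the hourly load curve."""
    rng = np.random.default_rng(seed)
    caps = np.asarray(unit_caps_mw, dtype=float)
    fors = np.asarray(unit_fors, dtype=float)
    load = np.asarray(hourly_load_mw, dtype=float)
    lole = eens = 0.0
    for _ in range(n_trials):
        up = rng.random(caps.size) > fors            # unit availability sample
        deficit = np.clip(load - caps[up].sum(), 0.0, None)
        lole += np.count_nonzero(deficit)            # hours with load loss
        eens += deficit.sum()                        # energy not supplied (MWh)
    return lole / n_trials, eens / n_trials

# Illustrative (non-RBTS) data: five units and a smooth 8760-hour load curve.
load = 130 + 30 * np.sin(np.linspace(0, 2 * np.pi, 8760))
print(lole_eens([40, 40, 40, 30, 30], [0.03, 0.03, 0.03, 0.05, 0.05], load))
```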
372 Single Phase 13-Level D-STATCOM Inverter with Distributed System
Authors: R. Kamalakannan, N. Ravi Kumar
Abstract:
Global energy consumption is increasing persistently, and distributed power generation through renewable energy is essential. To meet consumers' power requirements without voltage fluctuations and losses, the modeling and design of a multilevel inverter with Flexible AC Transmission System (FACTS) capability is presented. The presented inverter uses a 13-level cascaded H-bridge topology of Insulated Gate Bipolar Transistors (IGBTs) together with an inbuilt Distributed Static Synchronous Compensator (DSTATCOM). The DSTATCOM device provides power factor stability control at local feeder lines, and the inverter eliminates Total Harmonic Distortion (THD). The 13-level inverter utilizes 52 switches; each H-bridge is fed by a separate DC source, and the Pulse Width Modulation (PWM) technique is used for switching the IGBTs. The control strategy implemented for the inverter transmits active power to the grid and maintains a stable power factor, achieving steady-state power transmission. A significant outcome of this work is the improved output voltage quality and steady-state power transmission with low THD. Simulation of the inverter with DSTATCOM is performed in the MATLAB/Simulink environment. A scaled prototype of the proposed inverter was built and its results were validated against the simulated results.
Keywords: FACTS devices, Distributed Static Synchronous Compensator, DSTATCOM, total harmonics elimination, modular multilevel converter.
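To illustrate why a higher level count lowers the THD, a rough Python sketch is given below, assuming the usual 2N+1 relation (13 levels from six cascaded bridges with equal DC sources) and simple nearest-level quantisation; the paper's PWM scheme, switch arrangement and DSTATCOM control are not modelled.

```python
import numpy as np

def staircase_thd(n_bridges=6, f0=50.0, fs=100_000.0, cycles=10, vdc=1.0):
    """Approximate a (2*n_bridges+1)-level staircase of a sine by nearest-level
    quantisation and estimate its THD from the FFT."""
    n = int(fs * cycles / f0)
    t = np.arange(n) / fs
    v = vdc * np.round(n_bridges * np.sin(2 * np.pi * f0 * t))   # staircase output
    spec = np.abs(np.fft.rfft(v)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    fund = spec[np.argmin(np.abs(freqs - f0))]                   # fundamental bin
    harm = [spec[np.argmin(np.abs(freqs - k * f0))] for k in range(2, 50)]
    return np.sqrt(np.sum(np.square(harm))) / fund

print(f"estimated THD for 13 levels: {100 * staircase_thd():.1f} %")
```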
371 Performance of Derna Steam Power Plant at Varying Super-Heater Operating Conditions Based on Exergy
Authors: Idris Elfeituri
Abstract:
In the current study, an energy and exergy analysis of a 65 MW steam power plant was carried out. The study investigated the effect of variations in the overall conductance of the super-heater on the performance of an existing steam power plant located in Derna, Libya. The performance of the power plant was estimated by a mathematical model which considers the off-design operating conditions of each component. A fully interactive computer program based on the mass, energy and exergy balance equations has been developed. The maximum exergy destruction was found in the steam generation unit. A 50% reduction in the design value of the overall conductance of the super-heater decreases the net electrical power generated by at least 13 MW and the overall plant exergy efficiency by at least 6.4%, while increasing the total exergy destruction by at least 14 MW. The results show that the super-heater design and operating conditions play an important role in the thermodynamic performance and fuel utilization of the power plant. Moreover, these considerations are very useful when deciding whether to replace or renovate the super-heater of the power plant.
Keywords: Exergy, super-heater, fouling, steam power plant, off-design.
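For reference, the standard relations behind component-level exergy accounting of this kind (a reminder of textbook definitions, not the paper's specific plant model) are the specific flow exergy, the control-volume exergy destruction, and the overall exergy efficiency:

```latex
\psi = (h - h_0) - T_0\,(s - s_0), \qquad
\dot{E}x_{\mathrm{dest}} = T_0\,\dot{S}_{\mathrm{gen}}
  = \sum_{\mathrm{in}} \dot{m}\,\psi - \sum_{\mathrm{out}} \dot{m}\,\psi
    + \sum_{k}\Bigl(1 - \frac{T_0}{T_k}\Bigr)\dot{Q}_k - \dot{W}, \qquad
\eta_{\mathrm{ex}} = \frac{\dot{W}_{\mathrm{net}}}{\dot{m}_{\mathrm{fuel}}\, e_{\mathrm{fuel}}}
```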
370 Finite Element Application to Estimate Inservice Material Properties using Miniature Specimen
Authors: G. Partheepan, D.K. Sehgal, R.K. Pandey
Abstract:
This paper presents a method for determining uniaxial tensile properties such as Young's modulus, yield strength and the flow behaviour of a material in a virtually non-destructive manner. To achieve this, a new dumb-bell shaped miniature specimen has been designed, which avoids the removal of large material samples from the in-service component for the evaluation of current material properties. The proposed miniature specimen also has an advantage in finite element modelling with respect to computational time and memory space. Test fixtures have been developed to enable tension tests on the miniature specimen in a testing machine. The studies have been conducted on a chromium (H11) steel and an aluminum alloy (AR66). The output of the miniature test, viz. the load-elongation diagram, is obtained and the finite element simulation of the test is carried out using a 2D plane stress analysis. The results are compared with the experimental results. It is observed that the results of the finite element simulation corroborate well with the miniature test results. The approach appears to have the potential to predict the mechanical properties of materials, which could be used in the remaining life estimation of various in-service structures.
Keywords: ABAQUS, finite element, miniature test, tensile properties.
369 Design of a Service-Enabled Dependable Integration Environment
Authors: Fuyang Peng, Donghong Li
Abstract:
The aim of information systems integration is to make all data sources, applications and business flows integrated into the new environment, so that unwanted redundancies are reduced and bottlenecks and mismatches are eliminated. Two issues have to be dealt with to meet such requirements: the software architecture that supports resource integration, and the adaptor development tool that helps the integration and migration of legacy applications. In this paper, a service-enabled dependable integration environment (SDIE) is presented, which has two key components, i.e., a dependable service integration platform and a legacy application integration tool. For the dependable service integration platform, the service integration bus, the service management framework, the dependable engine for service composition, and the service registry and discovery components are described. For the legacy application integration tool, its basic organization, functionalities and the dependability measures taken are presented. Due to its service-oriented integration model, the lightweight extensible container, the service component combination-oriented p-lattice structure, and other features, SDIE has advantages in openness, flexibility, performance-price ratio and feature support over commercial products, and is better than most open-source integration software in functionality, performance and dependability support.
Keywords: Application integration, dependability, legacy, SOA.
368 Web Search Engine Based Naming Procedure for Independent Topic
Authors: Takahiro Nishigaki, Takashi Onoda
Abstract:
In recent years, the amount of document data has been increasing with the spread of the Internet. Many methods have been studied for extracting topics from large document collections. We proposed Independent Topic Analysis (ITA) to extract topics that are independent of each other from large document data such as newspaper articles. ITA extracts the independent topics from the document data by using Independent Component Analysis. A topic extracted by ITA is represented by a set of words. However, such a set of words can be quite different from the topic the user imagines. For example, the top five words with the highest independence for one topic are: Topic1 = {"scor", "game", "lead", "quarter", "rebound"}. This Topic 1 can be considered to represent the topic "SPORTS", but the topic name "SPORTS" has to be attached by the user; ITA cannot name topics. Therefore, in this research, we propose a method that uses a web search engine to obtain topic names that are easy for people to understand from the word sets produced by Independent Topic Analysis. In particular, we search for the set of topical words, and the title of the homepage returned by the search is taken as the topic name. We also apply the proposed method to real data and verify its effectiveness.
Keywords: Independent topic analysis, topic extraction, topic naming, web search engine.
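A minimal sketch of the naming step described above is given below; `search_fn` is a hypothetical placeholder, since no specific search engine API is named in the abstract, and any real engine or scraper would have to be wrapped to the assumed shape.

```python
import re

def name_topic(topic_words, search_fn):
    """Query a web search engine with the topic's word set and use the title of
    the first returned page as the topic name. `search_fn` is a placeholder:
    it must take a query string and return a list of results, each a dict with
    at least a 'title' field."""
    query = " ".join(topic_words)
    results = search_fn(query)
    if not results:
        return query                     # fall back to the raw word set
    return re.sub(r"\s+", " ", results[0]["title"]).strip()

# Example with the word set quoted above (a concrete search_fn must be supplied):
# name_topic(["scor", "game", "lead", "quarter", "rebound"], search_fn=my_engine)
```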
367 A New Approach for Prioritization of Failure Modes in Design FMEA using ANOVA
Authors: Sellappan Narayanagounder, Karuppusami Gurusami
Abstract:
The traditional Failure Mode and Effects Analysis (FMEA) uses the Risk Priority Number (RPN) to evaluate the risk level of a component or process. The RPN index is determined by calculating the product of the severity, occurrence and detection indexes. The most critically debated disadvantage of this approach is that various sets of these three indexes may produce an identical RPN value. This paper seeks to address the drawbacks of traditional FMEA and to propose a new approach to overcome these shortcomings. The Risk Priority Code (RPC) is used to prioritize failure modes when two or more failure modes have the same RPN. A new method is proposed to prioritize failure modes when there is disagreement in the ranking scales for severity, occurrence and detection. An Analysis of Variance (ANOVA) is used to compare the means of the RPN values. The SPSS (Statistical Package for the Social Sciences) statistical analysis package is used to analyze the data. The results presented are based on two case studies. It is found that the proposed methodology resolves the limitations of the traditional FMEA approach.
Keywords: Failure mode and effects analysis, Risk priority code, Critical failure mode, Analysis of variance.
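A small Python sketch of the two quantities mentioned above (the RPN product and a one-way ANOVA comparison of RPN means) follows; the ratings are hypothetical and are not the paper's case-study data.

```python
import numpy as np
from scipy import stats

def rpn(severity, occurrence, detection):
    """Traditional FMEA risk priority number: RPN = S x O x D."""
    return np.asarray(severity) * np.asarray(occurrence) * np.asarray(detection)

# Hypothetical ratings of the same failure modes by two assessment teams.
rpn_team_a = rpn([7, 5, 9, 3, 6], [4, 6, 2, 8, 5], [3, 5, 7, 2, 4])
rpn_team_b = rpn([6, 5, 9, 4, 7], [5, 6, 3, 7, 4], [4, 4, 7, 3, 5])

# One-way ANOVA comparing the mean RPN values of the two groups.
f_stat, p_value = stats.f_oneway(rpn_team_a, rpn_team_b)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```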
366 A Structural Support Vector Machine Approach for Biometric Recognition
Authors: Vishal Awasthi, Atul Kumar Agnihotri
Abstract:
The face is a non-intrusive, strong biometric for distinguishing genuine faces from dummy faces produced by different artificial means. Face recognition is extremely important in the contexts of computer vision, psychology, surveillance, pattern recognition, neural networks and content-based video processing. The availability of a widespread face database is crucial for testing the performance of face recognition algorithms. The openly available face databases include face images with a wide range of poses, illumination, gestures and face occlusions, but there is no dummy face database accessible in the public domain. This paper presents a face detection algorithm based on image segmentation in terms of distance from a fixed point and on template matching methods. The proposed work uses the most appropriate number of nodal points, resulting in the most appropriate outcomes in terms of face recognition and detection. The time taken to identify and extract distinctive facial features is improved to the range of 90 to 110 s, with an increase in efficiency of 3%.
Keywords: Face recognition, Principal Component Analysis, PCA, Linear Discriminant Analysis, LDA, Improved Support Vector Machine, iSVM, elastic bunch mapping technique.
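As a point of reference for the PCA and SVM components listed in the keywords, a generic eigenface-plus-SVM baseline on a public face set is sketched below; it stands in for the paper's improved SVM (iSVM) and its dummy-face database, neither of which is publicly available, and the parameters are illustrative.

```python
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Load a public face set and split it into training and held-out test images.
faces = fetch_olivetti_faces()
X_tr, X_te, y_tr, y_te = train_test_split(
    faces.data, faces.target, test_size=0.25, stratify=faces.target, random_state=0)

# Eigenface projection (PCA) followed by an RBF-kernel SVM classifier.
model = make_pipeline(PCA(n_components=100, whiten=True, random_state=0),
                      SVC(kernel="rbf", C=10.0, gamma="scale"))
model.fit(X_tr, y_tr)
print("hold-out accuracy:", model.score(X_te, y_te))
```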
365 Development of an Intelligent Decision Support System for Smart Viticulture
Authors: C. M. Balaceanu, G. Suciu, C. S. Bosoc, O. Orza, C. Fernandez, Z. Viniczay
Abstract:
The Internet of Things (IoT) represents the best option for smart vineyard applications, even if it is necessary to integrate the technologies required for the development. This article is based on the research and the results obtained in the DISAVIT project. For smart agriculture, the project aims to provide a trustworthy, intelligent, integrated vineyard management solution based on the IoT. To achieve interoperability through the use of multiprotocol technology (the future of connected wireless IoT), it is necessary to adopt an agnostic approach, providing a reliable environment that addresses cyber security, IoT-based threats and traceability through a blockchain-based design, while also creating a concept suitable for long-term implementations (modular, scalable). The points described above represent the main innovative technical aspects of this project. The DISAVIT project studies and promotes the incorporation of better management tools based on objective, data-based decisions, which are necessary for an agriculture adapted to, and more resistant to, climate change. It also exploits the opportunities generated by the digital services market for smart agriculture management stakeholders. The project's final result aims to improve decision-making, performance and the viticultural infrastructure, and to increase real-time data accuracy and interoperability. Innovative aspects such as end-to-end solutions, adaptability, scalability, security and traceability place our product in a favorable position over competitors, since none of the solutions on the market meets all of these requirements in a single product.
Keywords: Blockchain, IoT, smart agriculture, vineyard.
364 Context Aware Lightweight Energy Efficient Framework
Authors: D. Sathan, A. Meetoo, R. K. Subramaniam
Abstract:
Context awareness is a capability whereby mobile computing devices can sense their physical environment and adapt their behavior accordingly. The term context-awareness, in ubiquitous computing, was introduced by Schilit in 1994 and has become one of the most exciting concepts in early 21st-century computing, fueled by recent developments in pervasive computing (i.e. mobile and ubiquitous computing). These include computing devices worn by users, embedded devices, smart appliances, sensors surrounding users and a variety of wireless networking technologies. Context-aware applications use context information to adapt interfaces, tailor the set of application-relevant data, increase the precision of information retrieval, discover services, make the user interaction implicit, or build smart environments. For example, a context-aware mobile phone will know that the user is currently in a meeting room and reject any unimportant calls. One of the major challenges in providing users with context-aware services lies in continuously monitoring their contexts based on numerous sensors connected to the context-aware system through wireless communication. A number of context-aware frameworks based on sensors have been proposed, but many of them have neglected the fact that monitoring with sensors imposes heavy workloads on ubiquitous devices with limited computing power and battery. In this paper, we present CALEEF, a lightweight and energy efficient context-aware framework for resource-limited ubiquitous devices.
Keywords: Context-Aware, Energy-Efficient, Lightweight, Ubiquitous Devices.
363 Hydrogen Rich Fuel Gas Production from 2-Propanol Using Pt/Al2O3 and Ni/Al2O3 Catalysts in Supercritical Water
Authors: Yağmur Karakuş, Fatih Aynacı, Ekin Kıpçak, Mesut Akgün
Abstract:
Hydrogen is an important chemical in many industries, and it is expected to become one of the major fuels for energy generation in the future. Unfortunately, hydrogen does not exist in its elemental form in nature and therefore has to be produced from hydrocarbons, hydrogen-containing compounds or water. Above its critical point (374.8 °C and 22.1 MPa), water has a lower density and viscosity, and a higher heat capacity, than ambient water. Mass transfer in supercritical water (SCW) is enhanced due to its increased diffusivity and transport ability. The reduced dielectric constant makes supercritical water a better solvent for organic compounds and gases. Hence, due to the aforementioned desirable properties, there is a growing interest in studies regarding the gasification of biomass or model biomass solutions in supercritical water. In this study, hydrogen and biofuel production by the catalytic gasification of 2-propanol in supercritical water was investigated. Pt/Al2O3 and Ni/Al2O3 were the catalysts used in the gasification reactions. All of the experiments were performed under a constant pressure of 25 MPa. The effects of five reaction temperatures (400, 450, 500, 550 and 600 °C) and five reaction times (10, 15, 20, 25 and 30 s) on the gasification yield and the flammable component content were investigated.
Keywords: 2-Propanol, Gasification, Ni/Al2O3, Pt/Al2O3, Supercritical water.
362 Working Capital Management, Firms' Performance and Market Valuation in Nigeria
Authors: Sunday. E. Ogundipe, Abiola Idowu, Lawrencia. O. Ogundipe
Abstract:
This study examines the impact of working capital management on firms' performance and the market value of firms in Nigeria. A sample of fifty-four non-financial quoted firms listed on the Nigerian Stock Exchange was used for this study. Data were collected from the annual reports of the sampled firms for the period 1995-2009. The results show a significant negative relationship between the cash conversion cycle and both market valuation and firm performance. They also show that the debt ratio is positively related to market valuation and negatively related to firm performance. The findings confirm that there is a significant relationship between market valuation, profitability and the working capital components, in line with previous studies. This means that Nigerian firms should ensure adequate management of working capital, especially the cash conversion cycle components of accounts receivable, accounts payable and inventories, as efficient working capital management is expected to contribute positively to the firms' market value.
Keywords: Cash conversion cycle, Firms' performance, Market valuation, Working capital management.
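For reference, the cash conversion cycle referred to above is conventionally computed from its three components as shown below (standard textbook definitions; the paper's exact variable construction may differ):

```latex
\mathrm{CCC} = \mathrm{DIO} + \mathrm{DSO} - \mathrm{DPO}, \qquad
\mathrm{DIO} = 365 \times \frac{\text{Inventory}}{\text{Cost of goods sold}}, \quad
\mathrm{DSO} = 365 \times \frac{\text{Accounts receivable}}{\text{Sales}}, \quad
\mathrm{DPO} = 365 \times \frac{\text{Accounts payable}}{\text{Cost of goods sold}}
```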
361 Roller Guide Design and Manufacturing for Spatial Cylindrical Cams
Authors: Yuan L. Lai, Jui P. Hung, Jian H. Chen
Abstract:
This paper is aimed at developing a computer-aided design and manufacturing system for spatial cylindrical cams. In the proposed system, a milling tool with a diameter smaller than that of the roller, instead of the standard cutter used in the traditional machining process, is used to generate the tool path for spatial cams. To verify the feasibility of the proposed method, multi-axis machining simulation software was further used to simulate the practical milling operation of spatial cams. It was observed from the computer simulation that the tool paths of the small-sized cutter were within the motion range of a standard cutter, with no occurrence of overcutting. Examination of a finished cam component clearly verifies the accuracy of the tool path generated for the small-sized milling tool. It is believed that the use of a small-sized cutter for the machining of spatial cylindrical cams can generate better surface morphology with higher accuracy. An improvement in efficiency and cost for the manufacturing of spatial cylindrical cams can be expected through the proposed method.
Keywords: Cylindrical cams, Computer-aided manufacturing, Tool path.
360 Assessment of Time-Lapse in Visible and Thermal Face Recognition
Authors: Sajad Farokhi, Siti Mariyam Shamsuddin, Jan Flusser, Usman Ullah Sheikh
Abstract:
Although face recognition seems an easy task for humans, automatic face recognition is a much more challenging task due to variations in time, illumination and pose. In this paper, the influence of time-lapse on visible and thermal images is examined. Orthogonal moment invariants are used as feature extractors to analyze the effect of time-lapse on thermal and visible images, and the results are compared with conventional Principal Component Analysis (PCA). A new triangle square ratio criterion is employed instead of the Euclidean distance to enhance the performance of the nearest-neighbor classifier. The results of this study indicate that ideal feature vectors with high discrimination power can be obtained due to the global characteristic of the orthogonal moment invariants. Moreover, the effect of time-lapse is reduced and the accuracy of face recognition is enhanced considerably in comparison with PCA. Furthermore, our experimental results based on moment invariants and the triangle square ratio criterion show that the proposed approach achieves on average a 13.6% higher recognition rate than PCA.
Keywords: Infrared face recognition, Time-lapse, Zernike moment invariants.
359 A Fragile Watermarking Scheme for Color Image Authentication
Authors: M. Hamad Hassan, S.A.M. Gilani
Abstract:
In this paper, a fragile watermarking scheme is proposed for the authentication of a specified object in a color image. The color image is first transformed from RGB to the YST color space, which is suitable for watermarking color media. The T channel corresponds to the chrominance component of the color image and YS ⊥ T; it is therefore selected for embedding the watermark. The T channel is first divided into 2×2 non-overlapping blocks and the two LSBs are set to zero. The object that is to be authenticated is also divided into 2×2 non-overlapping blocks, and each block's intensity mean is computed, followed by eight-bit encoding. The generated watermark is then embedded into the LSBs of randomly selected 2×2 blocks of the T channel using a 2D torus automorphism. The selection of the block size is paramount for exact localization and recovery of the work. The proposed scheme is blind, efficient and secure, with the ability to detect and locate even minor tampering applied to the image, with full recovery of the original work. The quality of the watermarked media is quite high, both subjectively and objectively. The technique is suitable for the class of images with formats such as GIF, TIF or bitmap.
Keywords: Image Authentication, LSBs, PSNR, 2D-Torus Automorphism, YST Color Space.
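A minimal sketch of two of the ingredients described above is given below: the 8-bit block-mean payload and a 2D torus automorphism used to scatter block coordinates. The cat-map matrix and key handling shown here are illustrative assumptions, since the paper's exact map is not given in the abstract, and the LSB embedding step itself is not shown.

```python
import numpy as np

def block_means_8bit(obj, block=2):
    """Watermark payload: the 8-bit mean intensity of each non-overlapping
    2x2 block of the object to be authenticated."""
    h, w = obj.shape
    m = obj[:h - h % block, :w - w % block].reshape(
        h // block, block, w // block, block).mean(axis=(1, 3))
    return np.clip(np.rint(m), 0, 255).astype(np.uint8)

def torus_automorphism(x, y, n, k=2, iterations=1):
    """A 2D torus automorphism in Arnold-cat-map form (illustrative choice of
    matrix), mapping block coordinates (x, y) on an n x n grid to new positions."""
    for _ in range(iterations):
        x, y = (x + y) % n, (x + k * y) % n
    return x, y
```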