Search results for: code division multiple access (CDMA)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3072

792 Optical Road Monitoring of the Future Smart Roads – Preliminary Results

Authors: Maria Jokela, Matti Kutila, Jukka Laitinen, Florian Ahlers, Nicolas Hautière, Tobias Schendzielorz

Abstract:

It has been shown that in most accidents the driver is responsible, due to being distracted or misjudging the situation. In order to solve such problems, research has been dedicated to developing driver assistance systems that are able to monitor the traffic situation around the vehicle. This paper presents methods for recognizing several circumstances on a road. The methods use both in-vehicle warning systems and the roadside infrastructure. Preliminary evaluation results for fog and ice-on-road detection are presented. The ice detection results are based on data recorded on a test track dedicated to tyre friction testing. The results indicate that, with the right setup, ice detection could reach a detection performance of 70%, which is a good foundation for implementation. However, the full benefit of the presented cooperative system is achieved by fusing the outputs of multiple data sources, which is the key point of discussion in this publication.

Keywords: Smart roads, traffic monitoring, traffic scene detection.

791 Analyzing the Factors that Cause Parallel Performance Degradation in Parallel Graph-Based Computations Using Graph500

Authors: Mustafa Elfituri, Jonathan Cook

Abstract:

Recently, graph-based computations have become more important in large-scale scientific computing, as they provide a methodology to model many types of relations between independent objects. They are actively used in fields as varied as biology, social networks, cybersecurity, and computer networks. At the same time, graph problems have properties such as irregularity and poor locality that make their performance differ from that of regular applications. Therefore, parallelizing graph algorithms is a hard and challenging task. Initial evidence is that standard computer architectures do not perform very well on graph algorithms, and little is known about exactly what causes this. The Graph500 benchmark is a representative application for parallel graph-based computations, which have highly irregular data access and are driven more by traversing connected data than by computation. In this paper, we present results from analyzing the performance of various example implementations of Graph500, including a shared memory (OpenMP) version, a distributed (MPI) version, and a hybrid version. We measured and analyzed all the factors that affect its performance in order to identify possible changes that would improve it. Results are discussed in relation to the factors that contribute to performance degradation.
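For readers unfamiliar with the kernel being benchmarked, the following is a minimal sketch (in Python, purely illustrative; the Graph500 reference implementations are in C with OpenMP/MPI) of a level-synchronous breadth-first search of the kind Graph500 times. The adjacency-dictionary input format is a hypothetical stand-in; the point is that the data-dependent indexing into the neighbor lists is what produces the irregular, cache-unfriendly access pattern discussed above.

```python
# Minimal sketch (not the Graph500 reference code): a level-synchronous BFS kernel.
# The indirect, data-dependent indexing into `adj` is what makes the access pattern
# irregular and cache-unfriendly.
from collections import deque

def bfs_levels(adj, root):
    """adj: dict mapping vertex -> list of neighbours (hypothetical input format)."""
    level = {root: 0}
    frontier = deque([root])
    while frontier:
        v = frontier.popleft()
        for w in adj[v]:          # data-dependent, scattered memory accesses
            if w not in level:    # poor locality: w is effectively random
                level[w] = level[v] + 1
                frontier.append(w)
    return level

# Tiny usage example
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}
print(bfs_levels(adj, 0))   # {0: 0, 1: 1, 2: 1, 3: 2}
```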

Keywords: Graph computation, Graph500 benchmark, parallel architectures, parallel programming, workload characterization.

790 Re-Optimization MVPP Using Common Subexpression for Materialized View Selection

Authors: Boontita Suchyukorn, Raweewan Auepanwiriyakul

Abstract:

A data warehouse is a repository of information integrated from source data. Information stored in a data warehouse is materialized in order to provide better performance for answering queries. Deciding which views are appropriate to materialize is an important problem. In order to meet this requirement, constructing a search space close to optimal is a necessary task, as it provides an effective basis for selecting the views to be materialized. In this paper we propose an approach to re-optimize the Multiple View Processing Plan (MVPP) by using global common subexpressions. Merged queries whose query processing cost is not close to optimal are rewritten. The experiments show that our approach helps to improve the total query processing cost of the MVPP, and that the sum of the query processing cost and the materialized view maintenance cost is also reduced after views are selected for materialization.
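As a rough illustration of the objective being optimized — the sum of query processing cost and materialized view maintenance cost — the sketch below evaluates that total for a set of candidate views with hypothetical costs and applies a naive greedy selection. It is only a minimal sketch of the cost trade-off; the paper's MVPP re-optimization with global common subexpressions is considerably more involved.

```python
# Hypothetical sketch of the selection objective used in MVPP-style view selection:
# minimize total cost = sum(query processing costs) + sum(maintenance costs of
# materialized views). The numbers and the greedy strategy are illustrative only.

views = {
    # view: (query cost if NOT materialized, query cost if materialized, maintenance cost)
    "v1": (100.0, 10.0, 20.0),
    "v2": (80.0, 15.0, 70.0),
    "v3": (60.0, 5.0, 10.0),
}

def total_cost(materialized):
    q = sum((m if v in materialized else n) for v, (n, m, _) in views.items())
    maint = sum(views[v][2] for v in materialized)
    return q + maint

# Greedy: materialize a view only if it lowers the total cost.
chosen = set()
for v in views:
    if total_cost(chosen | {v}) < total_cost(chosen):
        chosen.add(v)

print(chosen, total_cost(chosen))
```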

Keywords: Data Warehouse, materialized views, query rewriting, common subexpressions.

789 PhilSHORE: Development of a WebGIS-Based Marine Spatial Planning Tool for Tidal Current Energy Resource Assessment and Site Suitability Analysis

Authors: Ma. Rosario Concepcion O. Ang, Luis Caezar Ian K. Panganiban, Charmyne B. Mamador, Oliver Dan G. De Luna, Michael D. Bausas, Joselito P. Cruz

Abstract:

PhilSHORE is a multi-site, multi-device and multi-criteria decision support tool designed to support the development of tidal current energy in the Philippines. Its platform is based on Geographic Information Systems (GIS), which allows for the collection, storage, processing, analysis and display of geospatial data. Combining GIS tools with open-source web development applications, PhilSHORE becomes a webGIS-based marine spatial planning tool. To date, PhilSHORE displays output maps and graphs of power and energy density, site suitability and site-device analysis. It gives stakeholders and the public easy access to the results of tidal current energy resource assessments and site suitability analyses. Results of the initial development show that PhilSHORE is a promising decision support tool for ORE project developments.

Keywords: GIS, Site Suitability Analysis, Tidal Current Energy Resource Assessment, WebGIS.

788 Sustainability Policies and Corporate Social Responsibility (CSR): Ergonomics Contribution Regarding Work in Companies

Authors: I. Bolis, S. N. Morioka, L. I. Sznelwar

Abstract:

The growing importance of sustainability in corporate policies represents a great opportunity for workers to gain more consideration, with great benefits to their well-being. Sustainable work is believed to be one which improves the organization's performance and fosters professional development as well as workers' health. In a multiple case study based on document research, information was sought about work activities and their sustainability or corporate social responsibility (CSR) policies, as disseminated by corporations. All the companies devoted attention to work activities and delivered a good amount of information about them. Nevertheless, the information presented was generic; all the actions developed were top-down, and there was no information about the impact of changes aimed at sustainability on the workers' activities. It was found that the companies seemed to be at an early stage. In the future, they need to show more commitment through concrete goals: they must be aware that workers contribute directly to the corporations' sustainability. This would allow room for Ergonomics and Work Psychodynamics to be incorporated and to be useful for both companies and society, so as to promote and ensure work sustainability.

Keywords: Sustainability, ergonomics, work psychodynamics, multinational companies.

787 Comparative Studies of Support Vector Regression between Reproducing Kernel and Gaussian Kernel

Authors: Wei Zhang, Su-Yan Tang, Yi-Fan Zhu, Wei-Ping Wang

Abstract:

Support vector regression (SVR) has been regarded as a state-of-the-art method for approximation and regression. The importance of the kernel function, the so-called admissible support vector kernel (SV kernel) in SVR, has motivated many studies on its composition. The Gaussian kernel (RBF) is regarded as the “best” choice of SV kernel for non-experts in SVR, although there is no evidence to support this claim beyond its superior performance in some practical applications. It is well known that the reproducing kernel (R.K.) is also an SV kernel, and it possesses many important properties, e.g., positive definiteness, the reproducing property, and the ability to compose complex R.K.s from simpler ones. However, there are a limited number of R.K.s with explicit forms and consequently few quantitative comparison studies in practice. In this paper, two R.K.s, i.e., SV kernels, composed from the sum and product of a translation-invariant kernel in a Sobolev space are proposed. An exploratory study of the performance of SVR based on the general R.K. is presented through a systematic comparison to that of the RBF kernel, using multiple criteria and synthetic problems. The results show that the R.K. is an equivalent or even better SV kernel than the RBF for problems with more input variables (more than 5, and especially more than 10) and higher nonlinearity.
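For concreteness, the sketch below compares SVR with the Gaussian (RBF) kernel against SVR with one simple translation-invariant reproducing kernel, k(x, y) = exp(-||x − y||₁), which (up to scaling) is the reproducing kernel of a first-order Sobolev-type space. This only illustrates the comparison setup on a synthetic problem with more than 5 input variables; it is not the specific R.K.s constructed in the paper, and scikit-learn, the data generator and the parameter values are assumptions.

```python
# Hedged sketch: SVR with the RBF kernel vs. SVR with a simple translation-invariant
# reproducing kernel k(x, y) = exp(-||x - y||_1) on a synthetic regression problem.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(300, 6))            # more than 5 input variables, as in the study
y = np.sin(3 * X).sum(axis=1) + 0.1 * rng.standard_normal(300)
Xtr, Xte, ytr, yte = X[:200], X[200:], y[:200], y[200:]

def sobolev_like_kernel(A, B):
    # Gram matrix of exp(-||a - b||_1), computed pairwise (what a callable
    # kernel in scikit-learn is expected to return).
    d = np.abs(A[:, None, :] - B[None, :, :]).sum(axis=2)
    return np.exp(-d)

for name, model in [("RBF", SVR(kernel="rbf", C=10.0, gamma="scale")),
                    ("R.K.", SVR(kernel=sobolev_like_kernel, C=10.0))]:
    model.fit(Xtr, ytr)
    print(name, mean_squared_error(yte, model.predict(Xte)))
```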

Keywords: admissible support vector kernel, reproducing kernel, reproducing kernel Hilbert space, support vector regression.

786 Kerma Profile Measurements in CT Chest Scans – A Comparison of Methodologies

Authors: Bruno B. Oliveira, Arnaldo P. Mourão, Teógenes A. da Silva

Abstract:

The Brazilian legislation has only established diagnostic reference levels (DRLs) in terms of the Multiple Scan Average Dose (MSAD) as a quality control parameter for computed tomography (CT) scanners. Compliance with DRLs can be verified by measuring the Computed Tomography Kerma Index (Ca,100) with a pencil ionization chamber, or by obtaining the kerma distribution in CT scans with radiochromic films or rod-shaped lithium fluoride thermoluminescent dosimeters (TLD-100). TL dosimeters were used to record kerma profiles and to determine MSAD values of a GE Bright Speed CT scanner. Measurements were done with radiochromic films and TL dosimeters distributed in cylinders positioned in the center and in the four peripheral bores of a standard polymethylmethacrylate (PMMA) body CT dosimetry phantom. Irradiations were done using a protocol for an adult chest. The maximum values were found at the midpoint of the longitudinal axis. The MSAD values obtained with the three dosimetric techniques were compared.
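For reference, the standard definitions behind the quantities mentioned above (N acquired slices of nominal thickness T per rotation, table increment I between scans) are, in the usual notation — a reminder of the textbook relations, not values from this study:

$$C_{a,100} = \frac{1}{NT}\int_{-50\,\mathrm{mm}}^{+50\,\mathrm{mm}} K(z)\,\mathrm{d}z,
\qquad
\mathrm{MSAD} = \frac{1}{I}\int_{-I/2}^{+I/2} D_{N,I}(z)\,\mathrm{d}z \approx \frac{NT}{I}\,\mathrm{CTDI},$$

where $K(z)$ is the air-kerma profile along the scan axis and $D_{N,I}(z)$ is the multiple-scan dose profile.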

Keywords: Kerma profile, CT, MSAD, patient dosimetry

785 A Self Adaptive Genetic Based Algorithm for the Identification and Elimination of Bad Data

Authors: A. A. Hossam-Eldin, E. N. Abdallah, M. S. El-Nozahy

Abstract:

The identification and elimination of bad measurements is one of the basic functions of a robust state estimator, as bad data corrupt the results of state estimation based on the popular weighted least squares method. However, this is a difficult problem to handle, especially when dealing with multiple errors of the interacting, conforming type. In this paper, a self-adaptive genetic-based algorithm is proposed. The algorithm utilizes the results of the classical linearized normal residuals approach to tune the genetic operators; thus, instead of making a randomized search throughout the whole search space, it performs a directed search, and the optimum solution is obtained at very early stages (a maximum of 5 generations). The algorithm utilizes an accumulating database of already computed cases to reduce the computational burden to a minimum. Tests are conducted with reference to the standard IEEE test systems. Test results are very promising.
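For context, the classical linearized normalized residual test that the algorithm uses to tune its genetic operators is, in standard weighted-least-squares state-estimation notation (a textbook reminder, not the paper's specific tuning rule):

$$\mathbf{r} = \mathbf{z} - \mathbf{h}(\hat{\mathbf{x}}), \qquad
\Omega = R - H\left(H^{T}R^{-1}H\right)^{-1}H^{T}, \qquad
r_i^{N} = \frac{|r_i|}{\sqrt{\Omega_{ii}}},$$

where $R$ is the measurement error covariance, $H$ the measurement Jacobian, and a measurement is suspected as bad data when $r_i^{N}$ exceeds a detection threshold (commonly around 3).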

Keywords: Bad Data, Genetic Algorithms, Linearized Normal residuals, Observability, Power System State Estimation.

784 Issues in the User Interface Design of a Content Rich Vocational Training Application for Digitally Illiterate Users

Authors: Jamie Otelsberg, Nagarajan Akshay, Rao R. Bhavani

Abstract:

This paper discusses our preliminary experiences in the design of a user interface of a computerized content-rich vocational training courseware meant for users with little or no computer experience. In targeting a growing population with limited access to skills training of any sort, we faced numerous challenges, including language and cultural differences, resource limits, gender boundaries and, in many cases, the simple lack of trainee motivation. With the size of the unskilled population increasing much more rapidly than the numbers of sufficiently skilled teachers, there is little choice but to develop teaching techniques that will take advantage of emerging computer-based training technologies. However, in striving to serve populations with minimal computer literacy, one must carefully design the user interface to accommodate their cultural, social, educational, motivational and other differences. Our work, which uses computer based and haptic simulation technologies to deliver training to these populations, has provided some useful insights on potential user interface design approaches.

Keywords: User interface design, digitally illiterate, vocational training, navigation issues, computer human interaction, human factors.

783 Barriers to the Uptake of Technology in the Quantity Surveying Industry

Authors: Mnisi Blessing, Christopher Amoah

Abstract:

The usage of modern technology is widespread in industrialised nations, but the building sector in developing countries still struggles to adopt it. The study aims to identify the barriers to technology usage in quantity surveying firms. Quantity surveyors were interviewed via Microsoft Teams due to the dispersed nature of the participants; where an interview was not possible, the interview guide was emailed to the participants to fill in. In all, 12 of the 25 participants contacted were interviewed. The data received were analysed using the content analysis process. The study's findings demonstrate that quantity surveyors have access to a wide range of technology that significantly enhances their project activities. However, quantity surveying companies are hesitant to use technology for several reasons, including the cost and maintenance associated with it. Other obstacles include a lack of knowledge, poor market acceptance, legal obstacles, and budgetary constraints. Despite the advantages associated with modern technology applications, quantity surveying firms are not using them, which may ultimately affect their work output. Therefore, firms need to re-examine the obstacles inhibiting their adoption of technology in the work process in order to enhance their production. The study reveals the main hindrances to technology usage, which may help firms institute measures to address them.

Keywords: Technology usage barriers, technology implementation, technology acceptance, quantity surveying.

782 Efficient Feature Fusion for Noise Iris in Unconstrained Environment

Authors: Yao-Hong Tsai

Abstract:

This paper presents an efficient fusion algorithm for iris images to generate stable features for recognition in unconstrained environments. Recently, iris recognition systems have focused on real scenarios in our daily life without the subject's cooperation. Under large variations in the environment, the objective of this paper is to combine information from multiple images of the same iris. The result of image fusion is a new image which is more stable for further iris recognition than each of the original noisy iris images. A wavelet-based approach for multi-resolution image fusion is applied in the fusion process. The detection of the iris is based on the AdaBoost algorithm, and a local binary pattern (LBP) histogram is then applied to texture classification with a weighting scheme. Experiments showed that the features generated by the proposed fusion algorithm can improve the performance of an iris recognition verification system.
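A minimal sketch of a wavelet-based fusion step of this kind is shown below, assuming the PyWavelets package and a common fusion rule (average the approximation coefficients, keep the larger-magnitude detail coefficients); it is not necessarily the exact scheme or wavelet used in the paper.

```python
# Sketch of wavelet-based fusion of two noisy iris images of the same eye.
import numpy as np
import pywt

def fuse_pair(img_a, img_b, wavelet="haar"):
    cA1, (cH1, cV1, cD1) = pywt.dwt2(img_a, wavelet)
    cA2, (cH2, cV2, cD2) = pywt.dwt2(img_b, wavelet)
    cA = 0.5 * (cA1 + cA2)                                       # average low-frequency content
    pick = lambda x, y: np.where(np.abs(x) >= np.abs(y), x, y)   # keep the stronger details
    fused = (cA, (pick(cH1, cH2), pick(cV1, cV2), pick(cD1, cD2)))
    return pywt.idwt2(fused, wavelet)

a = np.random.rand(64, 64)   # stand-ins for two registered iris images
b = np.random.rand(64, 64)
print(fuse_pair(a, b).shape)  # (64, 64)
```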

Keywords: Image fusion, iris recognition, local binary pattern, wavelet.

781 The Design of the Multi-Agent Classification System (MACS)

Authors: Mohamed R. Mhereeg

Abstract:

The paper discusses the design of a .NET Windows Service based agent system called MACS (Multi-Agent Classification System). MACS is a system that aims to accurately classify spreadsheet developers' competency over a network. It is designed to automatically and autonomously monitor spreadsheet users and gather their development activities based on the utilization of software multi-agent technology (MAS). This is accomplished in a way that enables management to efficiently tailor precise training activities for future spreadsheet development. The monitoring agents of MACS are intended to be distributed over the WWW in order to support the monitoring and classification of multiple developers. The Prometheus methodology is used for the design of the agents of MACS. Prometheus has been used for this phase of the system design because it was developed specifically for specifying and designing agent-oriented systems. Additionally, Prometheus also specifies the communication needed between the agents in order for them to coordinate and achieve their delegated tasks.

Keywords: Classification, Design, MACS, MAS, Prometheus.

780 Engineered Cement Composite Materials Characterization for Tunneling Applications

Authors: S. Boughanem, D. A. Jesson, M. J. Mulheron, P. A. Smith, C. Eddie, S. Psomas, M. Rimes

Abstract:

Cements, which are intrinsically brittle materials, can exhibit a degree of pseudo-ductility when reinforced with a sufficient volume fraction of a fibrous phase. This class of materials, called Engineered Cement Composites (ECC), has the potential to be used in future tunneling applications where a level of pseudo-ductility is required to avoid brittle failures. However, uncertainties remain regarding mechanical performance. Previous work has focused on comparatively thin specimens; for future civil engineering applications, it is imperative that the behavior in tension of thicker specimens is understood. In the present work, specimens containing cement powder and admixtures have been manufactured following two different processes and tested in tension. Multiple matrix cracking has been observed during tensile testing, leading to a “strain-hardening” behavior and confirming the possible suitability of ECC materials when used as thick sections (greater than 50 mm) in tunneling applications.

Keywords: Cement composite, polymeric fibers, pseudo-ductility, test geometry.

779 Scaling Strategy of a New Experimental Rig for Wheel-Rail Contact

Authors: Meysam Naeimi, Zili Li, Rolf Dollevoet

Abstract:

A new small-scale test rig has been developed for rolling contact fatigue (RCF) investigations of wheel–rail material. This paper presents the scaling strategy of the rig based on dimensional analysis and mechanical modelling. The new experimental rig is a spinning frame structure with multiple wheel components running over a fixed rail-track ring, capable of simulating continuous wheel–rail contact at laboratory scale. The paper describes the dimensional design of the rig in order to derive its overall scaling strategy and to determine the specifications of its key elements. Finite element (FE) modelling is used to simulate the mechanical behavior of the rig for two sample scale factors, 1/5 and 1/7. The results of the FE models are compared with the actual railway system to assess the effectiveness of the chosen scales. The mechanical properties of the components and the variables of the system are finally determined through the design process.

Keywords: New test rig, rolling contact fatigue, rail, small scale.

778 Co-tier and Co-channel Interference Avoidance Algorithm for Femtocell Networks

Authors: S. Padmapriya, M. Tamilarasi

Abstract:

Femtocells are regarded as a milestone for next-generation cellular networks. As femtocells are deployed in an unplanned manner, there is a chance of assigning the same resource to neighboring femtocells. This scenario may induce co-channel interference and may seriously affect the service quality of neighboring femtocells. In addition, the dominant transmit power of a femtocell will induce co-tier interference to neighboring femtocells. Thus, to jointly handle co-tier and co-channel interference, we propose an interference-free power and resource block allocation (IFPRBA) algorithm for closely located, closed-access femtocells. Based on the neighbor list, the inter-femto-base-station distance and the uplink noise power, the IFPRBA algorithm assigns non-interfering power and resources to femtocells. The IFPRBA algorithm also guarantees quality of service to femtousers based on knowledge of the resource requirement, the connection type, and the tolerable delay budget. Simulation results show that the interference power experienced with the IFPRBA algorithm is below the tolerable interference power, and hence the overall service success ratio, PRB efficiency and network throughput are higher than with the conventional resource allocation framework for femtocells (RAFF) algorithm.

Keywords: Co-channel interference, co-tier interference, femtocells, guaranteed QoS, power optimization, resource assignment.

777 Anonymous Editing Prevention Technique Using Gradient Method for High-Quality Video

Authors: Jiwon Lee, Chanho Jung, Si-Hwan Jang, Kyung-Ill Kim, Sanghyun Joo, Wook-Ho Son

Abstract:

Since advances in digital imaging technologies have led to the development of high-quality digital devices, there are many illegal copies of copyrighted video content on the Internet, and unauthorized editing occurs frequently. Thus, we propose an editing prevention technique for high-quality (HQ) video that can prevent these illegally edited copies from spreading. The proposed technique applies spatial and temporal gradient methods to improve fidelity and detection performance. The scheme also duplicates the embedded signal temporally to alleviate the signal reduction caused by geometric and signal-processing distortions. Experimental results show that the proposed scheme achieves better performance than previously proposed schemes while maintaining high fidelity. The proposed scheme can be used for unauthorized-access prevention in visual communication or in traitor-tracking applications, which need a fast detection process to prevent illegally edited video content from spreading.

Keywords: Editing prevention technique, gradient method, high-quality video, luminance change, visual communication.

776 Real Time Speed Estimation of Vehicles

Authors: Azhar Hussain, Kashif Shahzad, Chunming Tang

Abstract:

This paper presents a novel approach to real-time speed estimation of multiple traffic vehicles using fuzzy logic and image processing techniques with a proper arrangement of camera parameters. The described algorithm consists of several important steps. First, the background is estimated by computing the median over a time window of specific frames. Second, the foreground is extracted using a fuzzy similarity approach (FSA) between the estimated background pixels and the pixels of the current frame, which contains both foreground and background. Third, the traffic lanes are divided into two parts, one for each direction of travel, for parallel processing. Finally, the speeds of the vehicles are estimated by a Maximum a Posteriori Probability (MAP) estimator. The true ground speed is determined using infrared sensors for three different vehicles, and the results are compared to those of the proposed algorithm, which achieves an accuracy of ±0.74 km/h.
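A minimal sketch of the first two steps is given below, with a plain absolute-difference threshold standing in for the fuzzy similarity approach (FSA); the frame sizes, window length and threshold are illustrative assumptions.

```python
# Sketch: background estimation by temporal median, then foreground extraction
# by differencing (the paper uses a fuzzy similarity measure instead of a threshold).
import numpy as np

def estimate_background(frames):
    # frames: array of shape (n_frames, height, width), grayscale
    return np.median(frames, axis=0)

def foreground_mask(frame, background, threshold=25.0):
    # Simple absolute-difference test standing in for the FSA.
    return np.abs(frame.astype(float) - background) > threshold

frames = np.random.randint(0, 256, size=(30, 120, 160)).astype(float)  # dummy frame stack
bg = estimate_background(frames)
mask = foreground_mask(frames[-1], bg)
print(bg.shape, mask.mean())
```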

Keywords: Defuzzification, Fuzzy similarity approach, lane cropping, Maximum a Posteriori Probability (MAP) estimator, Speed estimation.

775 A Decision Support System Based on Leprosy Scales

Authors: Dennys Robson Girardi, Hugo Bulegon, Claudia Maria Moro Barra

Abstract:

Leprosy is an infectious disease caused by Mycobacterium leprae. The disease generally compromises the neural fibers, leading to the development of disability. Disabilities are changes that limit the daily activities or social life of a normal individual. In leprosy, the study of disability considers functional limitation (physical disabilities), limitation of activity, and social participation, which are measured respectively by the EHF, SALSA and PARTICIPATION SCALE instruments. The objective of this work is to propose an on-line monitoring system for leprosy patients based on information from the EHF, SALSA and PARTICIPATION SCALE. The proposed system is expected to be applied in monitoring the patient during treatment and after the healing therapy of the disease. The correlations that the system establishes between the scales produce a variety of information, presenting the state of the patient and any changes or reductions in disability. The system provides reports with information from each of the scales and the relationships that exist between them. This way, health professionals with access to the patient information can intervene with techniques for the prevention of disability. Through the automated scales, the system shows the patient's level and allows the patient, or the person responsible, to take preventive measures. With an online system, it is possible to take the assessments and monitor patients from anywhere.

Keywords: Leprosy, Medical Informatics, Decision Support System, Disability.

774 Load Transfer Mechanism Based Unified Strut-and-Tie Modeling for Design of Concrete Beams

Authors: Ahmed, M., Yasser A., Mahmoud H., Ahmed, A., Abdulla M. S., Nazar, S.

Abstract:

The Strut-and-Tie Model (STM) for the design of concrete beams, comprising struts, ties and nodes as its basic tools, is conceptually simple, but its realization for complex concrete structures is not straightforward and depends on the flow of internal forces in the structure. The STM technique has won wide acceptance for deep-member and shear design. It is a unified approach that considers all load effects (bending, axial, shear, and torsion) simultaneously, rather than being applicable to shear loading only. The present study portrays strut-and-tie modeling based on load transfer mechanisms as a unified method for the analysis, design and detailing of deep and slender concrete beams. Three shear span-to-effective depth ratios (a/d) are recommended for the modeling of STM elements corresponding to the dominant load paths. The study also discusses the research work conducted on the effective stress of concrete, tie end anchorage, and transverse reinforcement demand under different load transfer mechanisms. It is also highlighted that, to make the STM a versatile tool for the design of beams over all shear spans, the effective stress of concrete, the transverse reinforcement demand, the inclination angle of the strut, and the anchorage requirements of the tie bars need to be correlated with the load transfer mechanism. National code provisions should be modified and updated to apply to the generalized design of deep and slender concrete members using the load-transfer-mechanism-based STM technique. Examples available in the literature are reanalyzed with the refined, load-transfer-mechanism-based STM, and the results are compared. It is concluded from the results that the proposed approach yields the true reinforcement demand, depending on the dominant force transfer action in the concrete beam.

Keywords: Deep member, Load transfer mechanism, Strut-and-Tie Model, Strut, Truss.

773 A Case Study on Performance of Isolated Bridges under Near-Fault Ground Motion

Authors: Daniele Losanno, H. A. Hadad, Giorgio Serino

Abstract:

This paper presents a numerical investigation of the seismic performance of a benchmark bridge with different optimal isolation systems under near-fault ground motion. Usually, very large displacements make seismic isolation an unfeasible solution due to boundary conditions, especially in the case of existing bridges or high-risk seismic regions. Hence, near-fault ground motions are most likely to affect either structures with a long natural period, like isolated structures, or structures sensitive to velocity content, such as viscously damped structures. The work is aimed at analyzing the seismic performance of a three-span continuous bridge designed with different isolation systems having different levels of damping. The case study was analyzed in different configurations, including: (a) simply supported, (b) isolated with lead rubber bearings (LRBs), (c) isolated with rubber isolators and 10% classical damping (HDLRBs), and (d) isolated with rubber isolators and a 70% supplemental damping ratio. Case (d) represents an alternative control strategy that combines the effect of seismic isolation with additional supplemental damping, trying to take advantage of both solutions. The bridge is modeled in SAP2000 and solved by time-history direct-integration analyses under a set of six recorded near-fault ground motions. In addition, a set of analyses under the seismic action provided by the Italian code is also conducted, in order to evaluate the effectiveness of the suggested optimal control strategies under far-field seismic action. The results of the analysis demonstrate that an isolated bridge equipped with HDLRBs and a total equivalent damping ratio of 70% represents a very effective design solution for both mitigation of the displacement demand at the isolation level and base shear reduction in the piers, also in the case of near-fault ground motion.

Keywords: Isolated bridges, optimal design, near-fault motion, supplemental damping.

772 Dynamic Economic Dispatch Using Glowworm Swarm Optimization Technique

Authors: K. C. Meher, R. K. Swain, C. K. Chanda

Abstract:

This paper gives an intuition regarding the glowworm swarm optimization (GSO) technique for solving dynamic economic dispatch (DED) problems of thermal generating units. The objective of the problem is to schedule the optimal power generation of dedicated thermal units over a specific time band. The glowworm swarm optimization technique enables a swarm of agents to split into subgroups, exhibit simultaneous taxis towards each other, and rendezvous at multiple optima (not necessarily equal) of a given multimodal function. The feasibility of the GSO method has been tested on a ten-unit test system where the power balance constraints, operating limits, valve-point effects, and ramp rate limits are taken into account. The results obtained by the proposed technique are compared with those of other heuristic techniques. The results show that the GSO technique is capable of producing better results.
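For readers unfamiliar with GSO, the standard update rules of the technique (as introduced by Krishnanand and Ghose; quoted here only as a reminder, with the paper's own parameter settings and DED objective left to the paper) are:

$$\ell_i(t+1) = (1-\rho)\,\ell_i(t) + \gamma\,J\!\left(x_i(t+1)\right), \qquad
p_{ij}(t) = \frac{\ell_j(t)-\ell_i(t)}{\sum_{k\in N_i(t)}\left(\ell_k(t)-\ell_i(t)\right)},$$

$$x_i(t+1) = x_i(t) + s\,\frac{x_j(t)-x_i(t)}{\lVert x_j(t)-x_i(t)\rVert}, \qquad
r_d^i(t+1) = \min\!\left\{r_s,\;\max\!\left\{0,\;r_d^i(t)+\beta\,\big(n_t-|N_i(t)|\big)\right\}\right\},$$

where $\ell_i$ is the luciferin level of glowworm $i$, $J$ the objective function, $N_i(t)$ the set of brighter neighbors within the adaptive range $r_d^i(t)$, and $\rho, \gamma, s, \beta, r_s, n_t$ are algorithm parameters.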

Keywords: Dynamic economic dispatch, Glowworm swarm optimization, Luciferin, Valve-point loading effect, Ramp rate limits.

771 Multicast Optimization Techniques using Best Effort Genetic Algorithms

Authors: Dinesh Kumar, Y. S. Brar, V. K. Banga

Abstract:

Multicast network technology has pervaded our lives; it underpins many of the networking techniques and routing devices we use. Multicast data delivery offers many applications to the user, such as high-speed voice and high-speed data services, areas presently dominated by conventional networking, cable systems and digital subscriber line (DSL) technologies, and multicast broadcast has advantages over other routing techniques. Quality of Service (QoS) guarantees are usually required in most multicast applications. We address bandwidth-delay constrained optimization using a multi-objective model and a routing approach based on a genetic algorithm that optimizes multiple QoS parameters simultaneously. The proposed approach yields non-dominated routes and exhibits the high efficiency of the GA; its improvement and high degree of optimization have been verified. We have also correlated the results of the multicast GA with broadband wireless in order to minimize the delay in the path.

Keywords: GA (Genetic Algorithms), Quality of Service, MOGA, Steiner Tree.

770 Electroremediation of Cu-Contaminated Soil

Authors: Darius Jay R. Bongay, Roberto L. Ngo

Abstract:

This study investigated the removal efficiency of electrokinetic remediation of copper-contaminated soil at different combinations of enhancement reagents used as anolyte and catholyte. Sodium hydroxide (at 0.1, 0.5, and 1.0 M concentrations) and distilled water were used as the anolyte, while lactic acid (at 0.01, 0.1, and 0.5 M concentrations), ammonium citrate (also at 0.01, 0.1, and 0.5 M concentrations) and distilled water were used as the catholyte. A continuous voltage (1.0 VDC/cm) was applied for 240 hours in each experiment. The copper content of the catholyte was determined at the end of the 240-hour period. Optimization was carried out with a Response Surface Methodology – Optimal Design, including an F-test and a multiple comparison method, to determine which anolyte–catholyte pair was the most significant for removal efficiency. 1.0 M NaOH was found to be the most significant anolyte, while lactic acid was established as the most significant type of catholyte for the most successful electrokinetic experiments. Concentrations of lactic acid should be in the range of 0.1 M to 0.5 M to achieve maximum percent removal values.

Keywords: Electrokinetic remediation, copper contamination, heavy metal contamination, soil remediation

769 Evaluation of Energy-Aware QoS Routing Protocol for Ad Hoc Wireless Sensor Networks

Authors: M. K. Jeya Kumar

Abstract:

Many advanced routing protocols for wireless sensor networks have been implemented for the effective routing of data. Energy awareness is an essential design issue, and almost all of these routing protocols are considered energy efficient, with the ultimate objective of maximizing the whole network lifetime. However, the introduction of video and imaging sensors has posed additional challenges. Transmission of video and imaging data requires both energy- and QoS-aware routing in order to ensure efficient usage of the sensors and effective access to the gathered measurements. In this paper, the performance of the energy-aware QoS routing protocol is analyzed with respect to different performance metrics, such as the average lifetime of a node, the average delay per packet and the network throughput. The parameters considered in this study are end-to-end delay, real-time data generation/capture rates, packet drop probability and buffer size. The network throughput for real-time and non-real-time data has also been analyzed. The simulation was done in the NS2 simulation environment, and the simulation results were analyzed with respect to the different metrics.

Keywords: Cluster nodes, end-to-end delay, QoS routing, routing protocols, sensor networks, least-cost-path.

768 A Biometric Template Security Approach to Fingerprints Based on Polynomial Transformations

Authors: Ramon Santana

Abstract:

The use of biometric identifiers in the field of information security, access control to resources, and authentication in ATMs and banking, among others, is of great concern because of the safety of biometric data. Eight vulnerabilities have been detected in the general architecture of a biometric system; six of them allow the minutiae template to be obtained in plain text. The main consequence of obtaining minutiae templates is the loss of the biometric identifier for life. To mitigate these vulnerabilities, several models to protect minutiae templates have been proposed; however, vulnerabilities in the cryptographic security of these models still allow biometric data to be obtained in plain text. In order to increase the cryptographic security and the ease of reversibility, a minutiae template protection model is proposed. The model aims to provide cryptographic protection and to facilitate the reversibility of the data using two levels of security. The first level is the data transformation level, which generates data invariant to rotation and translation; furthermore, this transformation is irreversible. The second level is the evaluation level, where the encryption key is generated and the data are evaluated using a defined evaluation function. The model is aimed at mitigating the known vulnerabilities of previously proposed models, basing its security on the impossibility of polynomial reconstruction.

Keywords: Fingerprint, template protection, bio-cryptography, minutiae protection.

767 Influence of Ambiguity Cluster on Quality Improvement in Image Compression

Authors: Safaa Al-Ali, Ahmad Shahin, Fadi Chakik

Abstract:

Image coding based on clustering provides immediate access to targeted features of interest in a high-quality decoded image. This approach is useful for intelligent devices as well as for multimedia content-based description standards. The result of image clustering cannot be precise in some positions, especially for pixels carrying edge information, which produce ambiguity among the clusters. Even with a good enhancement operator based on PDEs, the quality of the decoded image will depend heavily on the clustering process. In this paper, we introduce an ambiguity cluster in image coding to represent pixels with vagueness properties. The presence of such a cluster allows some details inherent to edges, as well as uncertain pixels, to be preserved. It is also very useful during the decoding phase, in which an anisotropic diffusion operator, such as Perona–Malik, enhances the quality of the restored image. This work also offers a comparative study to demonstrate the effectiveness of a fuzzy clustering technique in detecting the ambiguity cluster without losing much of the essential image information. Several experiments have been carried out to demonstrate the usefulness of the ambiguity concept in image compression. The coding results and the performance of the proposed algorithms are discussed in terms of the peak signal-to-noise ratio and the quantity of ambiguous pixels.
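One simple way to realize an ambiguity cluster is sketched below under assumptions of our own (fixed cluster centres, a fuzzifier m = 2 and a near-tie threshold): compute fuzzy c-means memberships for each pixel and flag pixels whose two largest memberships are nearly equal. The paper's actual procedure may differ.

```python
# Sketch: fuzzy c-means memberships per pixel, with an "ambiguity" flag for pixels
# whose two best cluster memberships are nearly tied (illustrative threshold).
import numpy as np

def fcm_memberships(pixels, centres, m=2.0):
    # pixels: (n, 1) intensities; centres: (c,) cluster centres
    d = np.abs(pixels - centres[None, :]) + 1e-12          # (n, c) distances
    ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
    return 1.0 / ratio.sum(axis=2)                         # standard FCM membership formula

pixels = np.random.rand(1000, 1)                           # stand-in for image intensities
centres = np.array([0.2, 0.5, 0.8])
u = fcm_memberships(pixels, centres)
top2 = np.sort(u, axis=1)[:, -2:]
ambiguous = (top2[:, 1] - top2[:, 0]) < 0.1                # near-tie between best clusters
print(u.shape, ambiguous.mean())
```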

Keywords: Ambiguity Cluster, Anisotropic Diffusion, Fuzzy Clustering, Image Compression.

766 Modes of Collapse of Compress–Expand Member under Axial Loading

Authors: Shigeyuki Haruyama, Aidil Khaidir Bin Muhamad, Ken Kaminishi, Dai-Heng Chen

Abstract:

In this paper, a study of the modes of collapse of compress-expand members is presented. The compress-expand member is a compact assembly of multiple combined cylinders, proposed as an energy absorber. Previous studies on the compress-expand member have clarified its energy absorption efficiency, proposed an approximate equation to describe its deformation characteristics, and highlighted the improvement that it brings. However, for the member to be practical, the actual range of geometrical dimensions over which it maintains its applicability must be investigated. In this study, using virtual materials that comply with the bilinear hardening law, finite element method (FEM) analyses of the collapse modes of the compress-expand member have been conducted. Deformation maps that plot the member's collapse modes with respect to its geometric and material parameters are then presented in order to determine the dimensional range of each collapse mode.

Keywords: Axial collapse, compress-expand member, tubular member, finite element method, modes of collapse, thin-walled cylindrical tube.

765 A Straightforward Approach for Determining the Weights of Decision Makers Based on Angle Cosine and Projection Method

Authors: Qiang Yang, Ping-An Du

Abstract:

Group decision making with multiple attributes has attracted intensive attention in the decision analysis area. This paper assumes that, in a group setting, the contributions of the decision makers (DMs) to the decision process are not equal, owing to their different knowledge and experience. The aim of this paper is to develop a novel approach to determine the weights of DMs in group decision making problems. In this paper, the weights of the DMs are determined in the group decision environment via the angle cosine and the projection method. First of all, the average of all individual decisions is defined as the ideal decision. After that, we define the weight of each decision maker (DM) by aggregating the angle cosine and the projection between the individual decision and the ideal decision with an associated direction indicator μ. Using the weights of the DMs, all individual decisions are aggregated into a collective decision. Further, the preference order of the alternatives is ranked in accordance with the overall row values of the collective decision. Finally, an example from a chemical company is provided to illustrate the developed approach.
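A minimal numerical sketch of this weighting idea is given below: each DM's decision matrix is flattened to a vector, the ideal decision is their average, and each DM is scored by the angle cosine and the projection of their decision onto the ideal one. Treating the direction indicator μ as a simple 0.5/0.5 mixing weight is an illustrative assumption, not the paper's exact aggregation.

```python
# Sketch of DM weighting via angle cosine and projection onto the ideal (average) decision.
import numpy as np

def dm_weights(decisions, mu=0.5):
    # decisions: array of shape (n_dms, n_alternatives, n_attributes)
    ideal = decisions.mean(axis=0)                          # ideal decision = average of all DMs
    flat_ideal = ideal.ravel()
    scores = []
    for D in decisions:
        d = D.ravel()
        cos = d @ flat_ideal / (np.linalg.norm(d) * np.linalg.norm(flat_ideal))
        proj = np.linalg.norm(d) * cos                      # projection of D onto the ideal decision
        scores.append(mu * cos + (1.0 - mu) * proj)         # illustrative combination via mu
    scores = np.array(scores)
    return scores / scores.sum()                            # normalized DM weights

decisions = np.random.rand(4, 5, 3)                         # 4 DMs, 5 alternatives, 3 attributes
w = dm_weights(decisions)
print(w, w.sum())
```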

Keywords: Angle cosine, ideal decision, projection method, weights of decision makers.

764 Phishing Attacks Facilitated by Open-Source Intelligence

Authors: Urva Maryam

Abstract:

Private data are more often breached by clever social engineering than by exploiting technical vulnerabilities in systems. Complete information security requires good data safety practices to go along with technical solutions. Hackers often begin their operation by simply sending spoofed emails or fraudulent URLs to their targets and tricking them into providing sensitive information such as passwords or bank account details; this technique is called phishing. Phishing attacks can be launched against email addresses, open ports and unsecured web browsers. This study uses a quantitative research method, executing phishing experiments on participants to test their response to phishing emails. The experiments were run on the Kali Linux distribution, which comes bundled with multiple open-source intelligence (OSINT) tools that were used in the study. The aim of this research is to see how successfully phishing attacks can be launched using OSINT and to test people's response to spoofed emails.

Keywords: OSINT, phishing, spear phishing, email spoofing, theHarvester, Maltego.

763 Transmitter Macrodiversity in Multihopping- SFN Based Algorithm for Improved Node Reachability and Robust Routing

Authors: Magnus Eriksson, Arif Mahmud

Abstract:

A novel idea presented in this paper is to combine multihop routing with single-frequency networks (SFNs) for a broadcasting scenario. An SFN is a set of multiple nodes that transmit the same data simultaneously, resulting in transmitter macrodiversity. Two of the most important performance factors of multihop networks, node reachability and routing robustness, are analyzed. Simulation results show that our proposed SFN-D routing algorithm improves node reachability by 37 percentage points compared to non-SFN multihop routing. It shows a diversity gain of 3.7 dB, meaning that 3.7 dB lower transmission power is required for the same reachability. Even better results are possible for larger networks. If an important node becomes inactive, this algorithm can find new routes that a non-SFN scheme would not be able to find. Thus, two of the major problems in multihopping are addressed: achieving robust routing as well as improving node reachability or reducing transmission power.
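As a quick sanity check on the quoted figure, converting the 3.7 dB diversity gain to a linear power ratio gives

$$10^{-3.7/10} \approx 0.43,$$

i.e. roughly 43% of the original transmit power suffices for the same node reachability.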

Keywords: OFDM, single-frequency networks (SFN), DSFN, MANET, multihop routing, transmitter macrodiversity, broadcasting.
