Search results for: quality approaches
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4080

2850 Comparative Study of Different Enhancement Techniques for Computed Tomography Images

Authors: C. G. Jinimole, A. Harsha

Abstract:

One of the key problems in the analysis of Computed Tomography (CT) images is the poor contrast of the images. Image enhancement can be used to improve the visual clarity and quality of the images or to provide a better transformed representation for further processing. Contrast enhancement is one of the accepted methods of image enhancement in various applications in the medical field. It helps to visualize and extract details of brain infarctions, tumors, and cancers from the CT image. This paper presents a comparative study of five contrast enhancement techniques suitable for CT images: Power Law Transformation, Logarithmic Transformation, Histogram Equalization, Contrast Stretching, and Laplacian Transformation. All these techniques are compared with each other to find out which provides the best contrast enhancement of the CT image. For the comparison, the parameters Peak Signal to Noise Ratio (PSNR) and Mean Square Error (MSE) are used. Logarithmic Transformation provided the clearest and best-quality image of all the techniques studied and achieved the highest PSNR. The comparison concludes with the most suitable approach for future research, especially for mapping abnormalities in CT images resulting from brain injuries.
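
As a hedged illustration of the two comparison metrics named above (not the authors' code or data), the following sketch computes MSE and PSNR between a synthetic 8-bit image and its log-transformed version, using the standard definitions MSE = mean((I − K)²) and PSNR = 10·log10(MAX² / MSE).

```python
import numpy as np

def mse(original, enhanced):
    """Mean Square Error between two images of the same shape."""
    diff = original.astype(np.float64) - enhanced.astype(np.float64)
    return float(np.mean(diff ** 2))

def psnr(original, enhanced, max_value=255.0):
    """Peak Signal to Noise Ratio in dB for 8-bit images."""
    error = mse(original, enhanced)
    if error == 0:
        return float("inf")
    return 10.0 * np.log10(max_value ** 2 / error)

# Toy example with a synthetic CT-like slice and a log-transformed version.
rng = np.random.default_rng(0)
ct_slice = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
c = 255.0 / np.log(1.0 + 255.0)                      # scaling constant
log_enhanced = (c * np.log1p(ct_slice.astype(np.float64))).astype(np.uint8)

print(f"MSE  = {mse(ct_slice, log_enhanced):.2f}")
print(f"PSNR = {psnr(ct_slice, log_enhanced):.2f} dB")
```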

Keywords: Computed tomography, enhancement techniques, increasing contrast, PSNR and MSE.

2849 Comparative Survey of Object Serialization Techniques and the Programming Supports

Authors: Kazuaki Maeda

Abstract:

This paper compares six approaches to object serialization from qualitative and quantitative perspectives: built-in serialization in Java, IDL, XStream, Protocol Buffers, Apache Avro, and MessagePack. Using each approach, a common example object is serialized to a file and the size of the file is measured. The qualitative comparison examines whether a schema definition is required, whether a schema compiler is required, whether the serialization is ASCII-based or binary, and which programming languages are supported. It is clear that there is no single best solution; each solution works well in the context for which it was developed.
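
The quantitative part of the comparison, serializing one common example to a file and measuring the file size, can be sketched as follows. This is a hedged Python analogue using json (text-based, schema-free) and pickle (binary), not the six approaches compared in the paper.

```python
import json
import os
import pickle
import tempfile

# A common example object, serialized with a text-based and a binary approach.
record = {
    "id": 42,
    "name": "example",
    "scores": [3.14, 2.71, 1.41],
    "tags": ["alpha", "beta"],
}

with tempfile.TemporaryDirectory() as tmp:
    json_path = os.path.join(tmp, "record.json")
    pickle_path = os.path.join(tmp, "record.pkl")

    with open(json_path, "w") as f:        # ASCII/text-based, no schema
        json.dump(record, f)
    with open(pickle_path, "wb") as f:     # binary, Python-specific
        pickle.dump(record, f)

    for path in (json_path, pickle_path):
        print(f"{os.path.basename(path)}: {os.path.getsize(path)} bytes")
```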

Keywords: structured data, serialization, programming

2848 Reliability Levels of Reinforced Concrete Bridges Obtained by Mixing Approaches

Authors: Adrián D. García-Soto, Alejandro Hernández-Martínez, Jesús G. Valdés-Vázquez, Reyna A. Vizguerra-Alvarez

Abstract:

Reinforced concrete bridges designed by code are intended to achieve target reliability levels adequate for the geographical environment where the code is applicable. Several methods can be used to estimate such reliability levels. Many of them require the establishment of an explicit limit state function (LSF). When such an LSF is not available as a closed-form expression, simulation techniques are often employed. Simulation methods are computationally intensive and time consuming. Note that if the reliability of real bridges designed by code is of interest, numerical schemes, the finite element method (FEM) or computational mechanics could be required. In these cases, it can be quite difficult (or impossible) to establish a closed form of the LSF, and simulation techniques may be necessary to compute reliability levels. To overcome the need for a large number of simulations when no explicit LSF is available, the point estimate method (PEM) can be considered as an alternative. It has the advantage that only the probabilistic moments of the random variables are required. However, in the PEM, the resulting moments of the LSF must be fitted to a probability density function (PDF). In the present study, a very simple alternative is employed that allows the assessment of reliability levels when no explicit LSF is available and without the need for extensive simulations. The alternative includes the use of the PEM, and its applicability is shown by assessing reliability levels of reinforced concrete bridges in Mexico when a numerical scheme is required. Comparisons with results obtained using the Monte Carlo simulation (MCS) technique are included. To overcome the problem of fitting the probabilistic moments from the PEM to a PDF, a well-known distribution is employed. The approach mixes the PEM with another classic reliability method, the first-order reliability method (FORM). The results in the present study are in good agreement with those computed with the MCS. Therefore, mixing the reliability methods is a very valuable option for determining reliability levels when no closed form of the LSF is available, or when numerical schemes, the FEM or computational mechanics are employed.
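
As a hedged, generic illustration of the Monte Carlo side of such comparisons (not the bridge models, nor the PEM/FORM mixing used by the authors), the sketch below estimates the failure probability and reliability index for a simple explicit limit state function g = R − S with normal resistance and load, and checks the estimate against the closed-form result.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simple explicit limit state function g = R - S (failure when g < 0).
mu_R, sigma_R = 150.0, 15.0    # resistance moments (assumed values)
mu_S, sigma_S = 100.0, 20.0    # load effect moments (assumed values)

# Monte Carlo simulation (MCS)
n = 1_000_000
R = rng.normal(mu_R, sigma_R, n)
S = rng.normal(mu_S, sigma_S, n)
pf_mcs = np.mean(R - S < 0.0)
beta_mcs = -norm.ppf(pf_mcs)

# Closed-form solution for comparison (g is normal here)
beta_exact = (mu_R - mu_S) / np.hypot(sigma_R, sigma_S)
pf_exact = norm.cdf(-beta_exact)

print(f"MCS:   pf = {pf_mcs:.2e}, beta = {beta_mcs:.3f}")
print(f"Exact: pf = {pf_exact:.2e}, beta = {beta_exact:.3f}")
```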

Keywords: Structural reliability, reinforced concrete bridges, mixing approaches, point estimate method, Monte Carlo simulation.

2847 Wireless Building Monitoring and Control System

Authors: J.-P. Skön, M. Johansson, O. Kauhanen, M. Raatikainen, K. Leiviskä, M. Kolehmainen

Abstract:

The building sector is the largest energy consumer and CO2 emitter in the European Union (EU), and therefore the active reduction of energy consumption and the elimination of energy wastage are among the main goals in this sector. Healthy housing and energy efficiency are affected by many factors, which poses challenges to the monitoring, control and research of indoor air quality (IAQ) and energy consumption, especially in old buildings. These challenges include, for example, measurement and equipment costs. Additionally, the measurement results are difficult to interpret, and their use in ventilation control is also limited when the energy efficiency of housing is taken into account at the same time. The main goal of this study is to develop a cost-effective building monitoring and control system, especially for old buildings. The starting point of the development process is a wireless system; otherwise, the installation costs become too high. As the main result, this paper describes the concept of a wireless building monitoring and control system. The first prototype of the system has been installed in 10 residential buildings and in 10 school buildings located in the City of Kuopio, Finland.

Keywords: Energy efficiency, Indoor air quality, Monitoring system, Building automation

2846 Annoyance Caused by Air Pollution: A Comparative Study of Two Industrialized Regions

Authors: Milena M. Melo, Jane M. Santos, Severine Frere, Valderio A. Reisen, Neyval C. Reis Jr., Maria de Fátima S. Leite

Abstract:

Although many studies have shown the impact of air pollution on physical health, comparatively little is known about human behavioral responses and annoyance impacts. Annoyance caused by air pollution is a public health problem because it can be an ambient stressor, causing stress and disease, and can affect quality of life. The objective of this work is to evaluate the annoyance caused by air pollution in two different industrialized urban areas, Dunkirk (France) and Vitoria (Brazil), whose populations often report feeling annoyed by dust. Surveys were conducted, and the collected data were analyzed statistically. The results show that sociodemographic variables, the importance attached to air quality, perceived industrial risk, perceived air pollution and the occurrence of health problems play important roles in perceived annoyance. These results show the existence of a common problem in geographically distant areas and allow stakeholders to develop prevention strategies.

Keywords: Air pollution, annoyance, industrial risks, perception of pollution, public health, settled dust.

2845 Bridging Quantitative and Qualitative of Glaucoma Detection

Authors: Noor Elaiza Abdul Khalid, Noorhayati Mohamed Noor, Zamalia Mahmud, Saadiah Yahya, and Norharyati Md Ariff

Abstract:

Glaucoma diagnosis involves extracting three features of the fundus image: the optic cup, the optic disc and the vasculature. Present manual diagnosis is expensive, tedious and time consuming. A number of studies have been conducted to automate this process. However, the variability between the diagnostic capability of an automated system and that of an ophthalmologist has yet to be established. This paper discusses the efficiency and variability between ophthalmologists' opinions and a digital thresholding technique. The efficiency and variability measures are based on image quality grading: poor, satisfactory or good. The images are separated into four channels: gray, red, green and blue. Three ophthalmologists graded the images based on image quality. The images were then thresholded using multithresholding and graded in the same way as by the ophthalmologists, and the two sets of grades were compared. The results show only a small variability between the ophthalmologists' results and the digital thresholding.
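
A minimal sketch of the channel separation and multithresholding steps described above, using a synthetic image and hypothetical threshold values rather than actual fundus data or the thresholds used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic RGB "fundus" image; real data would be loaded from file.
rgb = rng.integers(0, 256, size=(128, 128, 3)).astype(np.uint8)

# Channel separation: red, green, blue and a gray (luminance) channel.
red, green, blue = rgb[..., 0], rgb[..., 1], rgb[..., 2]
gray = (0.299 * red + 0.587 * green + 0.114 * blue).astype(np.uint8)

def multithreshold(channel, thresholds):
    """Assign each pixel to a class according to a sorted list of thresholds."""
    return np.digitize(channel, bins=np.asarray(thresholds))

# Hypothetical thresholds separating dark, intermediate and bright regions.
classes = multithreshold(green, thresholds=[90, 170])
for label, name in enumerate(["dark region", "intermediate", "bright region"]):
    share = np.mean(classes == label)
    print(f"{name}: {share:.1%} of pixels")
```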

Keywords: Digital Fundus Image, Glaucoma Detection, Multithresholding, Segmentation.

2844 A Developmental Survey of Local Stereo Matching Algorithms

Authors: André Smith, Amr Abdel-Dayem

Abstract:

This paper presents an overview of the history and development of stereo matching algorithms. Details from their inception up to relatively recent techniques are described, noting challenges that have been surmounted across these past decades. Different components of these algorithms are explored, though the focus is directed towards local matching techniques. While global approaches have existed for some time and have demonstrated greater accuracy than their local counterparts, they are generally quite slow. Many strides have been made more recently, allowing local methods to catch up in terms of accuracy without sacrificing overall performance.

Keywords: Developmental survey, local stereo matching, stereo correspondence.

2843 On Modified Numerical Schemes in Vortex Element Method for 2D Flow Simulation Around Airfoils

Authors: Ilia Marchevsky, Victoriya Moreva

Abstract:

The problem of incompressible steady flow simulation around an airfoil is discussed. For some of the simplest airfoils (circular, elliptical and Zhukovsky airfoils), the exact solution is known from complex analysis, which allows the intensity of the vortex layer simulating the airfoil to be computed. Some modifications of the vortex element method are proposed and test computations are carried out. It is shown that these approaches are much more effective than the classical numerical scheme.
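
As a hedged illustration of the kind of exact reference solution mentioned above (not the authors' code), the sketch below evaluates the complex potential w(z) = U(z + a²/z) for uniform flow past a circular cylinder and verifies that the surface speed matches the classical vortex-layer intensity γ(θ) = 2U·sin θ (up to sign convention).

```python
import numpy as np

U_inf = 1.0   # free-stream speed (assumed value)
a = 1.0       # cylinder radius (assumed value)

theta = np.linspace(0.0, 2.0 * np.pi, 361)
z = a * np.exp(1j * theta)                  # points on the cylinder surface

# Complex velocity dw/dz for w(z) = U (z + a^2 / z)
dwdz = U_inf * (1.0 - a**2 / z**2)

surface_speed = np.abs(dwdz)                # tangential speed on the body
gamma_exact = 2.0 * U_inf * np.sin(theta)   # classical result: gamma = 2 U sin(theta)

# The surface speed should match |gamma| everywhere on the cylinder.
assert np.allclose(surface_speed, np.abs(gamma_exact), atol=1e-12)
print(f"max vortex layer intensity |gamma| = {np.abs(gamma_exact).max():.3f}")
```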

Keywords: Vortex element method, vortex layer, integral equation, ill-conditioned matrix.

2842 Distribution and Source of PAHs in Surface Sediments of Canon River Mouth, Taiwan

Authors: Chiu-Wen Chen, Chih-Feng Chen, Cheng-Di Dong

Abstract:

Surface sediment samples were collected from the Canon River mouth, Taiwan, and analyzed for polycyclic aromatic hydrocarbons (PAHs). Total PAH concentrations varied from 337 to 1,252 ng/g dry weight, with a mean concentration of 827 ng/g dry weight. The spatial distribution of PAHs reveals that the PAH concentration is relatively high in the river mouth region and gradually diminishes toward the harbor region. Diagnostic ratios indicated that petroleum combustion is a possible source of the PAHs in the Canon River mouth. The toxic equivalent concentrations (TEQcarc) of PAHs varied from 47 to 112 ng TEQ/g dry weight, with higher total TEQcarc values found in the river mouth region. Compared with the US Sediment Quality Guidelines (SQGs), the observed PAH levels at the Canon River mouth were below the effects range low (ERL) and would probably not exert adverse biological effects.
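
As a hedged illustration of how a carcinogenic toxic equivalent concentration of this kind is typically obtained (the congener concentrations and TEF values below are assumptions, not data or the exact TEF scheme from the study), each measured concentration is multiplied by a toxic equivalency factor relative to benzo[a]pyrene and the products are summed:

```python
# Hypothetical congener concentrations (ng/g dry weight) and illustrative
# TEFs relative to benzo[a]pyrene; both are assumptions for demonstration only.
concentrations = {
    "benz[a]anthracene": 40.0,
    "chrysene": 55.0,
    "benzo[b]fluoranthene": 30.0,
    "benzo[k]fluoranthene": 20.0,
    "benzo[a]pyrene": 35.0,
    "dibenz[a,h]anthracene": 5.0,
    "indeno[1,2,3-cd]pyrene": 25.0,
}
tef = {
    "benz[a]anthracene": 0.1,
    "chrysene": 0.01,
    "benzo[b]fluoranthene": 0.1,
    "benzo[k]fluoranthene": 0.1,
    "benzo[a]pyrene": 1.0,
    "dibenz[a,h]anthracene": 1.0,
    "indeno[1,2,3-cd]pyrene": 0.1,
}

# TEQcarc = sum over congeners of (concentration x TEF)
teq_carc = sum(concentrations[c] * tef[c] for c in concentrations)
print(f"TEQcarc = {teq_carc:.1f} ng TEQ/g dry weight")
```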

Keywords: PAHs, sediment, river mouth, sediment quality guidelines (SQGs), toxic equivalent (TEQcarc)

2841 Project Risk Management Techniques in Resource Allocation, Scheduling and Planning

Authors: Hossein Amoozad Khalili, Anahita Maleki

Abstract:

Normally, business changes are made in order to change a level of activity in some way, whether it is sales, cash flow, productivity, or the product portfolio. When attempts are made to make such changes, too often the business reverts to the old levels of activity as soon as management attention is diverted. Risk management is a field of growing interest to project managers as well as in general business and organizational management. Several approaches are used to manage risk in projects, and this paper briefly outlines some of those that may be encountered, with an indication of their strengths and weaknesses.

Keywords: Risk Management, Project Management, Scheduling, Planning

2840 Identifying Mitigation Plans in Reducing Usability Risk Using Delphi Method

Authors: Jayaletchumi T. Sambantha Moorthy, Suhaimi bin Ibrahim, Mohd Naz’ri Mahrin

Abstract:

Most quality models define usability as a significant factor that leads to improved product acceptability, increased user satisfaction, improved product reliability, and financial benefits for companies. Usability is also the factor that best balances the technical and human aspects of a software product, which is an important aspect of defining quality during the software development process. A usability risk consists of risk factors that could impact the usability of a software product, thereby contributing to negative user experiences and possibly causing software product failure. Hence, it is important to mitigate and reduce usability risks in the software development process itself; by managing possible usability risks during development, the failure of a software product can be reduced. Therefore, this research uses the Delphi method to identify mitigation plans for reducing potential usability risks. The Delphi method was conducted with seven experts from the fields of risk management and software development.

Keywords: Usability, Usability Risk, Risk Management, Risk Mitigation, Delphi Method.

2839 Factorial Structure and Psychometric Validation of Ecotourism Experiential Value Construct: Insights from Taman Negara National Park, Malaysia

Authors: Rosidah Musa, Rezian-na Muhammad Kassim

Abstract:

The purpose of this research is to disentangle and validate the underlying factorial structure of the Ecotourism Experiential Value (EEV) measurement scale and subsequently investigate its psychometric properties. The analysis was based on a sample of 225 eco-tourists, collected in the vicinity of Taman Negara National Park (TNNP) via an interviewer-administered questionnaire. Exploratory factor analysis (EFA) was performed to determine the factorial structure of EEV. Subsequently, to confirm and validate the factorial structure and assess the psychometric properties of EEV, confirmatory factor analysis (CFA) was executed. In addition, to establish the nomological validity of EEV, a structural model was developed to examine the effect of EEV on Total Eco-tourist Experience Quality (TEEQ). The analysis reveals that EEV is a second-order construct with a six-factor structure and that its scale adequately meets the psychometric criteria, thus permitting confident interpretation of the results. The findings have important implications for future research directions and for the management of ecotourism destinations.

Keywords: ecotourism, experiential value, experience quality, national park,

2838 Bounded Rational Heterogeneous Agents in Artificial Stock Markets: Literature Review and Research Direction

Authors: Talal Alsulaiman, Khaldoun Khashanah

Abstract:

In this paper, we provide a literature survey on artificial stock markets (ASM). The paper begins by exploring the complexity of the stock market and the need for ASMs, which aim to investigate the link between individual behaviors (micro level) and financial market dynamics (macro level). The variety of patterns at the macro level is a function of the ASM's complexity. The financial market system is a complex system in which the relationship between the micro and macro levels cannot be captured analytically. Computational approaches, such as simulation, are expected to capture this connection, and agent-based simulation is a simulation technique commonly used to build ASMs. The paper proceeds by discussing the components of an ASM. We consider the role of behavioral finance (BF) alongside the traditional risk-aversion assumption in the construction of agents' attributes. The influence of social networks on the development of agent interactions is also addressed; network topologies such as small-world, distance-based, and scale-free networks may be utilized to outline economic collaborations. In addition, the primary methods for developing agents' learning and adaptive abilities are summarized, including genetic algorithms, genetic programming, artificial neural networks and reinforcement learning. The most common statistical properties (stylized facts) of stocks, which are used for calibration and validation of ASMs, are also discussed. We then review the major related previous studies and categorize the approaches they use. Finally, research directions and potential research questions are discussed; these may focus on the macro level, by analyzing market dynamics, or on the micro level, by investigating the wealth distributions of the agents.
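
As a hedged, minimal sketch of the micro-to-macro link discussed above (a toy model, not one drawn from the surveyed studies), the following agent-based loop lets hypothetical fundamentalist and noise-trader agents submit demands, with the market price reacting to aggregate excess demand:

```python
import numpy as np

rng = np.random.default_rng(0)

n_agents = 100
fundamental_value = 100.0
price = 100.0
alpha = 0.05          # price-impact coefficient (assumed)
prices = [price]

# Half the agents trade toward a fundamental value, half trade on noise.
is_fundamentalist = rng.random(n_agents) < 0.5

for t in range(250):
    # Each agent submits a demand (+ buy, - sell): the micro level.
    noise = rng.normal(0.0, 1.0, n_agents)
    demand = np.where(
        is_fundamentalist,
        fundamental_value - price + noise,   # mean-reverting demand
        rng.normal(0.0, 2.0, n_agents),      # pure noise trading
    )
    # Macro-level price reacts to aggregate excess demand.
    price += alpha * demand.mean()
    prices.append(price)

returns = np.diff(np.log(prices))
print(f"final price {prices[-1]:.2f}, log-return std {returns.std():.4f}")
```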

Keywords: Artificial stock markets, agent based simulation, bounded rationality, behavioral finance, artificial neural network, interaction, scale-free networks.

2837 Control and Navigation with Knowledge Bases

Authors: Miloš Šeda, Tomáš Březina

Abstract:

In this paper, we focus on the use of knowledge bases in two different application areas: control of systems with unknown or strongly nonlinear models (i.e., systems hardly controllable by classical methods), and robot motion planning in eight directions. The first area deals with fuzzy logic, and the paper presents approaches for setting and aggregating the rules of a knowledge base. The second concentrates on a case-based reasoning strategy for finding a path in a planar scene with obstacles.
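
A minimal sketch of the fuzzification, rule aggregation and defuzzification pipeline referred to above, with assumed triangular membership functions and a three-rule Mamdani-style rule base (an illustration, not the controller developed in the paper):

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

# Universe of discourse for the controller output (assumed range).
u = np.linspace(-1.0, 1.0, 201)

def fuzzy_control(error):
    # Fuzzification: membership of the error in three linguistic sets.
    neg  = tri(error, -2.0, -1.0, 0.0)
    zero = tri(error, -1.0,  0.0, 1.0)
    pos  = tri(error,  0.0,  1.0, 2.0)

    # Rule base (Mamdani, min implication): IF error is NEG THEN output is POS, etc.
    out_pos  = np.minimum(neg,  tri(u,  0.0,  1.0,  2.0))
    out_zero = np.minimum(zero, tri(u, -1.0,  0.0,  1.0))
    out_neg  = np.minimum(pos,  tri(u, -2.0, -1.0,  0.0))

    # Aggregation (max) and centroid defuzzification.
    agg = np.maximum.reduce([out_pos, out_zero, out_neg])
    return float(np.sum(agg * u) / (np.sum(agg) + 1e-12))

print(fuzzy_control(0.6))   # a positive error yields a negative corrective action
```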

Keywords: fuzzy controller, fuzzification, rule base, inference, defuzzification, genetic algorithm, neural network, case-based reasoning

2836 In-flight Meals, Passengers' Level of Satisfaction and Re-flying Intention

Authors: Mohd Zahari, M. S., Salleh, N. K., Kamaruddin, M. S. Y., Kutut, M. Z.

Abstract:

Service quality has become a centerpiece for airline companies in vying with one another and keeping their image in the minds of passengers. Many airlines have pushed service quality through service personalization, both on the ground and on board, especially with a view to retaining satisfied passengers and attracting new ones. In-flight meal/food service is another important aspect of airline operations and is now seen as part of the marketing strategies for attracting business and leisure travelers. This study reports the outcomes of an investigation of in-flight meal/food attributes in relation to passengers' level of satisfaction and re-flying intention. Taste, freshness, the appearance of the in-flight meals/food served and menu choices are important to airline passengers, especially on long-haul flights. Food not only contributes to the prediction of airline passengers' levels of satisfaction but, alongside other factors, also slightly influences passengers' re-flying intention. Airline companies should therefore not ignore this element but should take the opportunity to create more attractive and acceptable in-flight meals/food, along with other measures, as marketing tools for attracting passengers to fly with them again.

Keywords: In-flight meal, passengers, satisfaction, re-flying intention

2835 European Radical Right Parties as Actors in Securitization of Migration

Authors: Mehmet Gökay Özerim

Abstract:

This study reveals that anti-immigrant policies in Europe result from a process of securitization, and that, within this process, radical right parties have been formulating discourses and approaches through a construction process by using some common security themes. These security themes can be classified as national security, economic security, cultural security and internal security. The frequency with which radical right parties use these themes may vary according to the specific historical, social and cultural characteristics of a particular country.

Keywords: European Union, International Migration, Radical Right Parties, Securitization.

2834 Ecological Risk Assessment of Polycyclic Aromatic Hydrocarbons in the Northwest of the Persian Gulf

Authors: Ghazaleh Monazami Tehrani, Reza Khani Jazani, Rosli Hashim, Ahmad Savari, Belin Tavakoly Sany, Parastoo Parivar, Zhamak Monazami Tehrani

Abstract:

This study investigated the presence of polycyclic aromatic hydrocarbons (PAHs) in the sediments of Musa Bay (around the PETZONE coastal area) from February 2010 to June 2010. Concentrations of PAHs recorded in the Musa Bay sediments ranged from 537.89 to 26,659.06 ng/g dry weight, with a mean value of 3,990.74 ng/g. The highest concentration of PAHs was observed at station 4, located near the aromatic outlet of the Imam Khomeini petrochemical company (station 4: BI-PC aromatic effluent outlet), where the concentration exceeded the NOAA sediment quality guideline value (ERL = 4,022 ng/g dry weight). The mean PAH concentration in the study area as a whole still met the NOAA sediment quality guideline value (ERL = 4,022 ng/g dry weight); however, according to the PELq factor, slightly adverse biological effects are associated with exposure to the PAH levels in the study area (0.1 < PELq = 0.24 < 0.5).

Keywords: Musa Bay, PAHs, PETZONE, NOAA, PELq.

2833 Web Pages Aesthetic Evaluation Using Low-Level Visual Features

Authors: Maryam Mirdehghani, S. Amirhassan Monadjemi

Abstract:

Web sites are rapidly becoming the preferred medium for daily tasks such as information search, company presentation, shopping, and so on. At the same time, we live in a period in which visual appearance plays an increasingly important role in our daily life. In spite of designers' efforts to develop web sites that are both user-friendly and attractive, it is difficult to ensure the outcome's aesthetic quality, since visual appearance is a matter of individual perception and opinion. In this study, we attempt to develop an automatic system for the aesthetic evaluation of web pages, which are the building blocks of web sites. Based on image processing techniques and artificial neural networks, the proposed method categorizes an input web page according to its visual appearance and aesthetic quality. The employed features are multiscale/multidirectional textural and perceptual color properties of the web pages, fed to a perceptron ANN trained as the evaluator. The method was tested on university web sites, and the results suggest that it performs well in web page aesthetic evaluation tasks, with around 90% correct categorization.
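
A hedged sketch of the evaluation idea, feeding page-level feature vectors to a perceptron classifier; the features and labels below are synthetic placeholders, not the multiscale/multidirectional textural and color features used in the paper:

```python
import numpy as np
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical feature matrix: each row stands in for a web page described by a
# few textural and color statistics (placeholder values only).
n_pages = 200
features = rng.normal(size=(n_pages, 8))
# Synthetic "aesthetic" label derived from a made-up rule, for illustration only.
labels = (features[:, 0] + 0.5 * features[:, 3] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=0)

clf = Perceptron(max_iter=1000, tol=1e-3, random_state=0)
clf.fit(X_train, y_train)
print(f"categorization accuracy on held-out pages: {clf.score(X_test, y_test):.2f}")
```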

Keywords: Web Page Design, Web Page Aesthetic, Color Spaces, Texture, Neural Networks

2832 Seawater Desalination for Production of Highly Pure Water Using a Hydrophobic PTFE Membrane and Direct Contact Membrane Distillation (DCMD)

Authors: Ahmad Kayvani Fard, Yehia Manawi

Abstract:

Qatar's primary source of fresh water is seawater desalination. Among the major processes commercially available, the most common large-scale techniques are Multi-Stage Flash distillation (MSF), Multi-Effect Distillation (MED), and Reverse Osmosis (RO). Although commonly used, these three processes are highly expensive owing to high energy input requirements and high operating costs related to maintenance and the stress induced on the systems in harsh alkaline media. Besides cost, the environmental footprint of these desalination techniques is significant: damage to the marine ecosystem, large land use, and the discharge of tons of greenhouse gases with a large carbon footprint. Among the less energy-consuming techniques based on membrane separation that are being sought to reduce both the carbon footprint and operating costs is membrane distillation (MD). Having emerged in the 1960s, MD is an alternative technology for water desalination that has attracted growing attention since the 1980s. The MD process involves the evaporation of a hot feed, typically below the boiling point of brine at standard conditions, by creating a water vapor pressure difference across a porous, hydrophobic membrane. The main advantages of MD compared to other commercially available technologies (MSF and MED), and especially RO, are the reduction of membrane and module stress due to the absence of trans-membrane pressure, less impact of contaminant fouling on the distillate because only water vapor is transferred, the ability to utilize low-grade or waste heat from the oil and gas industries to heat the feed to the required temperature difference across the membrane, superior water quality, and relatively lower capital and operating cost. To achieve the objective of this study, a state-of-the-art flat-sheet cross-flow DCMD bench-scale unit was designed, commissioned, and tested. The objective of this study is to analyze the characteristics and morphology of a membrane suitable for DCMD through SEM imaging and contact angle measurement, and to study the water quality of the distillate produced by the DCMD bench-scale unit. Comparison with available literature data is undertaken where appropriate, and laboratory data are used to compare the DCMD distillate quality with that of other desalination techniques and standards. SEM analysis showed that the PTFE membrane used in the study has a contact angle of 127° and a highly porous surface supported by a less porous PP membrane with a larger pore size. A study of the effect of feed salinity and temperature on distillate water quality, based on ICP and IC analysis, showed that for any salinity and feed temperature (up to 70 °C) the electrical conductivity of the distillate is less than 5 μS/cm with 99.99% salt rejection. DCMD thus proved to be a feasible and effective process capable of consistently producing high-quality distillate from feed solutions of very high salinity (i.e., 100,000 mg/L TDS), with a substantial quality advantage over other desalination methods such as RO and MSF.

Keywords: Membrane Distillation, Waste Heat, Seawater Desalination, Membrane, Freshwater, Direct Contact Membrane Distillation

2831 Dynamic Metrics for Polymorphism in Object Oriented Systems

Authors: Parvinder Singh Sandhu, Gurdev Singh

Abstract:

Measurement is the process by which numbers or symbols are assigned to attributes of entities in the real world in such a way as to describe them according to clearly defined rules. Software metrics are instruments for measuring the various aspects of a software product. These metrics are used throughout a software project to assist in estimation, quality control, productivity assessment, and project control. Object-oriented software metrics focus on measurements applied to classes and other characteristics. These measurements inform the software engineer about the behavior of the software and about how changes can be made to reduce complexity and improve the continuing capability of the software. Object-oriented software metrics can be classified into two types: static and dynamic. Static metrics are concerned with measurement by static analysis of the software, whereas dynamic metrics measure aspects of the software at run time. Most previous work has focused on static metrics. Some work has also been done on the dynamic nature of software measurement, but research in this area demands more work. In this paper, we give a set of dynamic metrics specifically for polymorphism in object-oriented systems.
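
As a hedged illustration of what a run-time (dynamic) measurement of polymorphism can look like (a hypothetical counter, not the metric suite proposed in the paper), the sketch below records which overriding implementation actually handles each polymorphic call:

```python
from collections import Counter

# Run-time counter of which concrete implementation handles each polymorphic
# call; this simple count is an illustrative metric, not the paper's definition.
dispatch_counts = Counter()

def count_dispatch(method):
    def wrapper(self, *args, **kwargs):
        dispatch_counts[(type(self).__name__, method.__name__)] += 1
        return method(self, *args, **kwargs)
    return wrapper

class Shape:
    def area(self):
        raise NotImplementedError

class Circle(Shape):
    def __init__(self, r): self.r = r
    @count_dispatch
    def area(self): return 3.14159 * self.r ** 2

class Square(Shape):
    def __init__(self, s): self.s = s
    @count_dispatch
    def area(self): return self.s ** 2

shapes = [Circle(1.0), Square(2.0), Circle(3.0)]
total_area = sum(s.area() for s in shapes)   # polymorphic calls resolved at run time

# A simple dynamic metric: how often each overriding implementation was bound.
print(total_area)
print(dispatch_counts)   # Counter({('Circle', 'area'): 2, ('Square', 'area'): 1})
```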

Keywords: Metrics, Software, Quality, Object oriented system, Polymorphism.

2830 Convex Restrictions for Outage Constrained MU-MISO Downlink under Imperfect Channel State Information

Authors: A. Preetha Priyadharshini, S. B. M. Priya

Abstract:

In this paper, we consider the MU-MISO downlink scenario under imperfect channel state information (CSI). The main issue under imperfect CSI is to keep the rate outage probability of each user below a given threshold level. Such rate outage constraints present significant analytical challenges. Many probabilistic methods are used to solve the transmit optimization problem under imperfect CSI. Here, a decomposition-based large deviation inequality and a Bernstein-type inequality are used as convex restriction methods to solve the optimization problem under imperfect CSI. These methods achieve improved output quality and lower complexity, and they provide a safe, tractable approximation of the original rate outage constraints. Based on these implementations, performance has been evaluated in terms of feasible rate and average transmission power. The simulation results show that both methods offer significantly improved outage quality and lower computational complexity.
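
A hedged sketch of the rate outage constraint itself, checked by Monte Carlo under an assumed Gaussian CSI error model and a fixed beamformer (a verification step, not the convex restriction methods themselves):

```python
import numpy as np

rng = np.random.default_rng(1)

Nt = 4                 # transmit antennas (assumed)
sigma_e2 = 0.1         # CSI error variance (assumed)
noise_power = 1.0
tx_power = 4.0         # transmit power budget (assumed)
target_rate = 4.0      # outage threshold in bits/s/Hz (assumed)

# Estimated channel and a maximum-ratio beamformer built from the estimate.
h_hat = (rng.standard_normal(Nt) + 1j * rng.standard_normal(Nt)) / np.sqrt(2)
w = np.sqrt(tx_power) * h_hat / np.linalg.norm(h_hat)

# Monte Carlo estimate of Pr{ log2(1 + |h^H w|^2 / noise) < target_rate }
n_trials = 100_000
err = rng.standard_normal((n_trials, Nt)) + 1j * rng.standard_normal((n_trials, Nt))
err *= np.sqrt(sigma_e2 / 2.0)
h = h_hat + err                               # true channel = estimate + CSI error
rate = np.log2(1.0 + np.abs(h @ w.conj()) ** 2 / noise_power)
outage_probability = np.mean(rate < target_rate)
print(f"estimated rate outage probability: {outage_probability:.4f}")
```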

Keywords: Imperfect channel state information, outage probability, multiuser multi-input single-output.

2829 Use of Semantic Networks as Learning Material and Evaluation of the Approach by Students

Authors: Philippe A. Martin

Abstract:

This article first summarizes the reasons why current approaches supporting Open Learning and Distance Education need to be complemented by tools permitting lecturers, researchers and students to cooperatively organize the semantic content of learning-related materials (courses, discussions, etc.) into a fine-grained shared semantic network. This first part of the article also briefly describes the approach adopted to permit such collaborative work. Then, examples of such semantic networks are presented. Finally, an evaluation of the approach by students is provided and analyzed.

Keywords: knowledge sharing, knowledge evaluation, e-learning

2828 An Innovative Green Cooling Approach Using Peltier Chip in Milling Operation for Surface Roughness Improvement

Authors: Md. Anayet U. Patwari, Mohammad Ahsan Habib, Md. Tanzib Ehsan, Md Golam Ahnaf, Md. S. I. Chowdhury

Abstract:

Surface roughness is one of the key quality parameters of a finished product. During any machining operation, high temperatures are generated at the tool-chip interface, impairing the surface quality and dimensional accuracy of products. Cutting fluids are generally applied during machining to reduce the temperature at the tool-chip interface. However, the use of cutting fluids gives rise to problems such as waste disposal, pollution, high cost, and human health hazards. Researchers nowadays are opting for dry machining and other cooling techniques to minimize the use of coolants during machining while keeping the surface roughness of products within desirable limits. In this paper, a concept of using the Peltier cooling effect during an aluminium milling operation is presented and adopted with the aim of improving the surface roughness of the machined surface. Experimental evidence shows that the Peltier cooling effect provides better surface roughness of the machined surface compared to dry machining.

Keywords: Aluminium, surface roughness, Peltier cooling effect, milling operation.

2827 Flocculation on the Treatment of Olive Oil Mill Wastewater: Pretreatment

Authors: G. Hodaifa, J. A. Páez, C. Agabo, E. Ramos, J. C. Gutiérrez, A. Rosal

Abstract:

Currently, the continuous two-phase decanter process used for olive oil production is the most internationally widespread. The wastewaters generated by this industry (OMW) are a real environmental problem because of their high organic load. Among the treatments proposed for these wastewaters, advanced oxidation technologies (Fenton, ozone, photo-Fenton, etc.) are the most favourable, but their direct application is somewhat expensive. Therefore, the application of a preliminary stage based on a flocculation-sedimentation operation is of high importance. In this research, five commercial flocculants (three cationic and two anionic) have been used to achieve the separation of phases (clarified liquid and sludge). For each flocculant, different concentrations (0-1000 mg/L) have been studied. In these experiments, the sludge volume formed and the final water quality were determined. The final removal percentages of total phenols (11.3-25.1%), COD (5.6-20.4%), total carbon (2.3-26.5%), total organic carbon (1.50-23.8%), total nitrogen (1.45-24.8%), and turbidity (27.9-61.4%) were determined. The variation in the electrical conductivity reduction percentage (1-8%) was also determined. Finally, the flocculants with the highest removal percentages were identified (QG2001 and Flocudex CS49).

Keywords: Flocculants, flocculation, olive oil mill wastewater, water quality.

2826 Two Approaches to Code Mobility in an Agent-based E-commerce System

Authors: Costin Badica, Maria Ganzha, Marcin Paprzycki

Abstract:

Recently, a model multi-agent e-commerce system based on mobile buyer agents and the transfer of strategy modules was proposed. In this paper, a different approach to code mobility is introduced, in which agent mobility is replaced by local agent creation supplemented by code mobility similar to that of the original proposal. UML diagrams of the agents involved in the new approach to mobility and the augmented system activity diagram are presented and discussed.

Keywords: Agent system, agent mobility, code mobility, e-commerce, UML formalization.

2825 LTE Performance Analysis in the City of Bogota Northern Zone for Two Different Mobile Broadband Operators over Qualipoc

Authors: Víctor D. Rodríguez, Edith P. Estupiñán, Juan C. Martínez

Abstract:

The evolution of mobile broadband technologies has allowed download rates to increase for users of current services. The evaluation of technical parameters at the link level is of vital importance to validate the quality and veracity of the connection, thus avoiding large losses of data, time and productivity. Some of these failures may occur between the eNodeB (Evolved Node B) and the user equipment (UE), so the link between the end device and the base station should be observed. LTE (Long Term Evolution) is considered one of the IP-oriented mobile broadband technologies that work stably for data and, for devices with that feature, VoIP (Voice over IP). This research presents a technical analysis of the connection and channeling processes between the UE and the eNodeB using the TAC (Tracking Area Code) variables, together with an analysis of performance variables (throughput, Signal to Interference and Noise Ratio (SINR)). Three measurement scenarios were proposed in the city of Bogotá using QualiPoc, in which two operators were evaluated (Operator 1 and Operator 2). Once the data were obtained, an analysis of the variables determined that the data obtained in the transmission modes vary depending on the parameters BLER (Block Error Rate), performance and SNR (Signal-to-Noise Ratio). For both operators, differences in transmission modes were detected, and this is reflected in the quality of the signal. In addition, because the two operators work on different frequencies, it can be seen that Operator 1, despite having spectrum in Band 7 (2600 MHz) together with Operator 2, is reassigning traffic to a lower band, AWS (1700 MHz); the difference in signal quality with respect to the data connection established by Operator 2, and the difference found in the transmission modes determined by the eNodeB for Operator 1, are remarkable.

Keywords: BLER, LTE, Network, Qualipoc, SNR.

2824 Evaluating Complexity – Ethical Challenges in Computational Design Processes

Authors: J. Partanen

Abstract:

Complexity, as a theoretical background, has made it easier to understand and explain the features and dynamic behavior of various complex systems. As this common theoretical background has confirmed, borrowing design terminology from the natural sciences has helped to control and understand urban complexity. Phenomena like self-organization, evolution and adaptation are appropriate for describing the formerly inaccessible characteristics of the complex environment in unpredictable bottom-up systems. Increased computing capacity has been a key element in capturing the chaotic nature of these systems. A paradigm shift in urban planning and architectural design has forced us to give up the illusion of total control over the urban environment, and consequently to seek novel methods for steering its development. New methods using dynamic modeling have offered a real option for a more thorough understanding of complexity and urban processes. At best, new approaches may renew design processes so that we get a better grip on the complex world via more flexible processes, support urban environmental diversity and respond to our needs beyond basic welfare by liberating ourselves from standardized minimalism. A complex system and its features are as such beyond human ethics. Self-organization or evolution is neither good nor bad; their mechanisms are by nature devoid of reason. They are common in urban dynamics in both natural processes and gas. They are features of a complex system and cannot be prevented, yet their dynamics can be studied and supported. The paradigm of complexity and new design approaches has been criticized for a lack of humanity and morality, but the ethical implications of scientific or computational design processes have not been much discussed. It is important to distinguish the (unexciting) ethics of the theory and tools from the ethics of computer-aided processes based on ethical decisions. Urban planning and architecture cannot be based on the survival of the fittest; however, the natural dynamics of the system cannot be impeded on the grounds of being "non-human". In this paper, the ethical challenges of using dynamic models are contemplated in light of a few examples of new architecture, dynamic urban models and the literature. It is suggested that ethical challenges in computational design processes could be reframed under the concepts of responsibility and transparency.

Keywords: urban planning, architecture, dynamic modeling, ethics, complexity theory.

2823 Availability Strategy of Medical Information for Telemedicine Services

Authors: Rozo D. Juan Felipe, Ramírez L. Leonardo Juan, Puerta A. Gabriel Alberto

Abstract:

Telemedicine services require correct computing resource management to guarantee productivity and efficiency for medical and non-medical staff. The aim of this study was to examine web management strategies to ensure the availability of resources and services in telemedicine, so as to provide an accessible strategy for medical information management. In addition, to evaluate the quality-of-service parameters, the following were measured: delay, throughput, jitter, latency, available bandwidth, and the percentage of access and denial of service, based on a web management performance map with profile permissions and database management. Across 24 different test scenarios, the results show 100% availability of medical information, in relation to the access of medical staff to web services, and a quality of service (QoS) of 99% owing to network delay and computer network performance. The findings of this study suggest that the proposed web management strategy is an ideal solution to guarantee the availability, reliability, and accessibility of medical information. Finally, the strategy offers seven user profiles used at a telemedicine center in Bogotá, Colombia, keeping the QoS parameters suitable for telemedicine services.

Keywords: Availability, medical information, QoS, strategy, telemedicine.

2822 The Dependence of the Liquid Application on the Coverage of the Sprayed Objects in Terms of the Characteristics of the Sprayed Object during Spraying

Authors: Beata Cieniawska, Deta Łuczycka, Katarzyna Dereń

Abstract:

When assessing the quality of a spraying procedure, three indicators are used: the unevenness of the distribution of the sprayed liquid, the degree of coverage of the sprayed surfaces, and the deposition of the sprayed liquid. However, there is a lack of information on the relationship between these quality parameters. Therefore, research was carried out at the Institute of Agricultural Engineering of Wrocław University of Environmental and Life Sciences. The aim of the study was to determine the relationship between the degree of coverage of sprayed surfaces and the deposition of liquid with respect to the parametric characteristics of the protected plant, using selected single- and double-stream nozzles. Experiments were conducted under laboratory conditions. The nozzle carrier acted as an independent self-propelled sprayer, whereas the parametric characteristics of the plants were determined using artificial plants as the ratio of the vertical projection area to the horizontal projection area. The results and their analysis showed strong and very strong correlations between the analyzed parameters in terms of the characteristics of the sprayed object.

Keywords: Degree of coverage, deposition of liquid, nozzle, spraying.

2821 Genetic Programming Based Data Projections for Classification Tasks

Authors: César Estébanez, Ricardo Aler, José M. Valls

Abstract:

In this paper, we present a GP-based method for automatically evolving projections so that data can be more easily classified in the projected spaces. At the same time, our approach can reduce dimensionality by constructing more relevant attributes. The fitness of each projection measures how easy it is to classify the dataset after applying the projection; this is quickly computed by a simple linear perceptron. We have tested our approach in three domains. The experiments show that it obtains good results compared to other machine learning approaches, while reducing dimensionality in many cases.
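
A hedged sketch of the fitness evaluation step only: a linear projection (here a random matrix standing in for an evolved GP individual) is applied to the data, and the cross-validated accuracy of a linear perceptron on the projected data serves as its fitness. The dataset and projection are illustrative assumptions, not those used in the paper.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import Perceptron
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)

def projection_fitness(P, X, y):
    """Fitness of a projection matrix P: cross-validated accuracy of a linear
    perceptron trained on the projected data."""
    Xp = X @ P
    clf = Perceptron(max_iter=1000, tol=1e-3, random_state=0)
    return cross_val_score(clf, Xp, y, cv=5).mean()

# A random 4 -> 2 linear projection stands in for an evolved GP individual.
P_random = rng.normal(size=(X.shape[1], 2))
print(f"fitness of random projection: {projection_fitness(P_random, X, y):.3f}")
print(f"fitness in the original space: "
      f"{projection_fitness(np.eye(X.shape[1]), X, y):.3f}")
```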

Keywords: Classification, genetic programming, projections.
