Search results for: Relative Bandwidth Service Differentiation (RBSD)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2112

1152 Thyroids Dose Evaluation and Calculation of Backscatter Factors for Co-60 Irradiations

Authors: D. Kısınma, A. B. Tugrul

Abstract:

The aim of this study is the evaluation of absorbed doses for thyroids using neck phantoms. For this purpose, an irradiation set-up with different phantoms was arranged. Three different materials were used as phantom materials: water, paraffin and wood. For each material, phantoms of three different dimensions were prepared to simulate different ages and human races. A Co-60 gamma source was used for irradiation, and the experimental procedure was applied rigorously with narrow-beam geometry. As the result of the experiments, the relative radiation doses were evaluated for therapeutic applications for thyroids, backscatter factors were calculated, and it was shown that water, paraffin and wood are appropriate phantom materials, given the convergent values of their backscatter factors.

Keywords: Co-60, Dosimetry, phantom, thyroids.

1151 Developing a Structured and Strategically Focused Performance Assessment System

Authors: Isabel Duarte de Almeida, João Vilas-Boas, Ana Abrantes Cabral

Abstract:

The number and adequacy of Performance Indicators (PIs) for organisational purposes are core to the success of organisations and a major concern to the sponsor of this research. This assignment developed a procedure to improve a firm’s performance assessment system by identifying two key PIs out of 28 initial ones, and by setting criteria and their relative importance to validate and rank the adequacy and the right number of operational metrics. The Analytical Hierarchy Process was used with a synthesis method to treat data coming from the management inquiries. Although organisational alignment has been achieved, business processes should also be targeted and PIs continuously revised.

Keywords: Strategic performance assessment systems, Key Performance Indicators (KPIs), Analytical Hierarchy Process (AHP).

1150 A Secure Auditing Framework for Load Balancing in Cloud Environment

Authors: R. Geetha, T. Padmavathy

Abstract:

A security audit is an important aspect to be considered by cloud service customers. It is basically a certification process to audit the controls that deliver the security requirements. Security audits are conducted by trained and qualified staff belonging to an independent auditing organization, and they must be carried out against a standard set of security controls. Proper checks must be made that the cloud user has adequate reporting and logging facilities in the customer's system, thereby ensuring an appropriate business and operational flow of data through the cloud service. We propose a cloud-based secure auditing framework that enables a trusted authority to securely store its secret data with semi-trusted cloud service providers and to selectively share that data with a wide range of data recipients, reducing the key management complexity for data owners and data recipients. Unlike previous cloud-based data frameworks, data owners upload their secret data to the cloud using static and dynamic auditing schemes. A further feature is that if a data recipient needs to download an individual record, the recipient sends a request to the authority; the data owner holds the access control and, if it decides to share the requested record, accepts the recipient's request. Once the request is approved, the recipient downloads the record, and the record transfer time and download time (with dates) are monitored by the auditor. In addition to the deduplication concept, reducing cloud memory usage through dynamic document distribution is also proposed.

Keywords: Cloud computing, cloud storage auditing, data integrity, key exposure.

1149 Application of Formyl-TIPPCu (II) for Temperature and Light Sensing

Authors: Dil Nawaz Khan, M. H. Sayyad, Muhammad Yaseen, Munawar Ali Munawar, Mukhtar Ali

Abstract:

The effect of temperature and light on a thin film of the organic semiconductor formyl-TIPPCu(II), deposited on a glass substrate with preliminarily evaporated gold electrodes, was investigated. The electrical capacitance and resistance of the fabricated device were evaluated under the effect of temperature and light. The relative capacitance of the fabricated sensor increased by a factor of 4.3 on raising the temperature from 27 to 187 °C, while under illumination up to 25000 lx the capacitance of the Au/formyl-TIPPCu(II)/Au photocapacitive sensor increased continuously by a factor of 13.2 compared with dark conditions.

Keywords: formyl-TIPPCu(II), Organic semiconductor, Photocapacitance, Polarizability.

1148 The Legal Procedure of Attestation of Public Servants

Authors: Armen Yezekyan

Abstract:

The main purpose of this research is to comprehensively explore and identify the problems of attestation of public servants and to propose solutions for these issues through a deep analysis of the laws and the legal theoretical literature. For the detailed analysis of the above-mentioned problems, several research methods are used, the implementation of which aims to ensure the objectivity and clarity of the scientific research and its results.

Keywords: Attestation, attestation commission, competition commission, public servant, public service, testing.

1147 Multimodal Biometric Authentication Using Choquet Integral and Genetic Algorithm

Authors: Anouar Ben Khalifa, Sami Gazzah, Najoua Essoukri BenAmara

Abstract:

The Choquet integral is a tool for information fusion that is very effective when the fuzzy measures associated with it are well chosen. In this paper, we propose a new approach for calculating the fuzzy measures associated with the Choquet integral in a context of data fusion in multimodal biometrics. The proposed approach is based on genetic algorithms. It has been validated on two databases: the first contains synthetic scores, and the second contains biometric data relating to the face, fingerprint and palmprint. The results achieved attest to the robustness of the proposed approach.
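
For reference, the discrete Choquet integral of a score vector with respect to a fuzzy measure μ (the object the genetic algorithm tunes here) has the standard form

\[
C_\mu(x_1,\dots,x_n) \;=\; \sum_{i=1}^{n} \bigl(x_{(i)} - x_{(i-1)}\bigr)\,\mu\bigl(A_{(i)}\bigr),
\qquad x_{(0)} = 0,\quad A_{(i)} = \{(i),\dots,(n)\},
\]

where \(x_{(1)} \le \dots \le x_{(n)}\) is the sorted score vector and μ is a monotone set function with \(\mu(\emptyset)=0\) and μ of the full index set equal to 1.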

Keywords: Multimodal biometrics, data fusion, Choquet integral, fuzzy measures, genetic algorithm.

1146 MinRoot and CMesh: Interconnection Architectures for Network-on-Chip Systems

Authors: Mohammad Ali Jabraeil Jamali, Ahmad Khademzadeh

Abstract:

The success of an electronic system in a System-on-Chip is highly dependent on the efficiency of its interconnection network, which is constructed from routers and channels (the routers move data across the channels between nodes). Since neither classical bus-based nor point-to-point architectures can provide scalable solutions and satisfy the tight power and performance requirements of future applications, the Network-on-Chip (NoC) approach has recently been proposed as a promising solution. Indeed, in contrast to the traditional solutions, the NoC approach can provide large bandwidth with moderate area overhead. The selected topology of the component interconnects plays a prime role in the performance of a NoC architecture, as do the routing and switching techniques that can be used. In this paper, we present two generic NoC architectures that can be customized to the specific communication needs of an application in order to reduce the area with minimal degradation of the latency of the system. An experimental study is performed to compare these structures with basic NoC topologies represented by the 2D mesh, the Butterfly Fat Tree (BFT) and SPIN. It is shown that the Cluster Mesh (CMesh) and MinRoot schemes achieve significant improvements in network latency and energy consumption with only negligible area overhead and complexity over existing architectures. In fact, compared with the basic NoC topologies, the CMesh and MinRoot schemes provide substantial savings in area as well, because they require fewer routers. The simulation results show that the CMesh and MinRoot networks outperform MESH, BFT and SPIN in the main performance metrics.

Keywords: MinRoot, CMesh, NoC, Topology, Performance Evaluation

1145 Septic B-Spline Collocation Method for Numerical Solution of the Kuramoto-Sivashinsky Equation

Authors: M. Zarebnia, R. Parvaz

Abstract:

In this paper, the Kuramoto-Sivashinsky equation is solved numerically by a collocation method. The solution is approximated as a linear combination of septic B-spline functions. Applying the von Neumann stability analysis technique, we show that the method is unconditionally stable. The method is applied to some test examples, and the numerical results have been compared with the exact solutions. The global relative error and the L∞ error norm of the solutions show the computational efficiency of the method.
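
For context, a commonly used form of the equation being solved and of the reported error measures is given below (the exact coefficients and normalisation used by the authors may differ):

\[
u_t + u\,u_x + u_{xx} + u_{xxxx} = 0,
\qquad
L_\infty = \max_j \bigl|u_j^{\mathrm{exact}} - u_j^{\mathrm{num}}\bigr|,
\qquad
E_{\mathrm{rel}} = \sqrt{\frac{\sum_j \bigl(u_j^{\mathrm{exact}} - u_j^{\mathrm{num}}\bigr)^2}{\sum_j \bigl(u_j^{\mathrm{exact}}\bigr)^2}},
\]

where the index j runs over the collocation (grid) points.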

Keywords: Kuramoto-Sivashinsky equation, Septic B-spline, Collocation method, Finite difference.

1144 Characterization of HD-V2 Gafchromic Film for Measurement of Spatial Dose Distribution from Alpha Particle of 5.5 MeV

Authors: A. Aydarous, M. El Ghazaly

Abstract:

The purpose of this study was to investigate the response of the newly released Gafchromic HD-V2 film for alpha particle of 5.5 MeV. Gafchromic HD-V2 was exposed to alpha particles of energy 5 MeV from 241Am for different durations. Then the films were scanned with a flatbed scanner. The dose response curve up to 2200 Gy has been achieved. The film’s reproducibility and sensitivity were evaluated. The results obtained show that the net optical density increases almost exponentially with the increase in the exposure time, and it becomes saturated after prolonged exposure times. The red channel shows the highest sensitivity, with a value of 4 x 10-3 Gy-1 at netOD of 0.4. The inter-film reproducibility was measured and the relative uncertainty found was 1.7 %, 2.1 % and 2.3 % for grey, red and green channels, respectively.
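
As a point of reference (the standard radiochromic-film convention, not spelled out in the abstract), the net optical density is usually computed from the scanner pixel values of the film before and after exposure,

\[
\mathrm{netOD} = \log_{10}\!\left(\frac{PV_{\mathrm{unexposed}}}{PV_{\mathrm{exposed}}}\right),
\]

so that the sensitivity quoted above (in Gy⁻¹) corresponds to the slope of this netOD-dose curve at the stated operating point.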

Keywords: Alpha dosimetry, 241Am, Gafchromic film.

1143 Cascade Kalman Filter Configuration for Low Cost IMU/GPS Integration in Car Navigation Like Robot

Authors: Othman Maklouf, Abdurazag Ghila, Ahmed Abdulla

Abstract:

This paper introduces a low-cost INS/GPS algorithm for land vehicle navigation applications. The data fusion is performed with an extended Kalman filter in a cascade configuration mode, and a loosely coupled configuration is considered. In order to perform numerical simulations, MATLAB software has been developed. The results obtained in this work demonstrate that a low-cost INS/GPS navigation system is partially capable of meeting the performance requirements for land vehicle navigation. The relative effectiveness of the Kalman filter implementation in the integrated GPS/INS navigation algorithm is highlighted. The paper also provides experimental results from a field test carried out with a car.
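
For orientation, the discrete-time extended Kalman filter cycle on which such a loosely coupled INS/GPS fusion relies is recalled below in its generic form (not with the authors' specific state vector):

\[
\hat{x}_{k|k-1} = f\!\left(\hat{x}_{k-1|k-1}, u_k\right), \qquad
P_{k|k-1} = F_k P_{k-1|k-1} F_k^{\mathsf T} + Q_k,
\]
\[
K_k = P_{k|k-1} H_k^{\mathsf T}\!\left(H_k P_{k|k-1} H_k^{\mathsf T} + R_k\right)^{-1},
\]
\[
\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k\!\left(z_k - h\!\left(\hat{x}_{k|k-1}\right)\right), \qquad
P_{k|k} = \left(I - K_k H_k\right) P_{k|k-1},
\]

where the prediction step is driven by the IMU mechanisation (input u_k) and, in the loosely coupled case, the GPS position/velocity solution serves as the measurement z_k.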

Keywords: GPS, INS, IMU, Kalman filter.

1142 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings

Authors: G. Candel, D. Naccache

Abstract:

t-SNE is an embedding method that the data science community has widely used. It helps with two main tasks: displaying results by coloring items according to their class or feature value, and, in a forensic setting, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are its structure preservation property and its answer to the crowding problem, whereby all neighbors in high-dimensional space cannot be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation, where the area of a cluster is proportional to its size in number of items, and relationships between clusters are materialized by closeness in the embedding. The algorithm is non-parametric: the transformation from the high- to the low-dimensional space is described but not learned, so two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together; however, this process is costly, as the complexity of t-SNE is quadratic, and it would be infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of the data. While this approach is highly scalable, points could be mapped to exactly the same position, making them indistinguishable, and this type of model would be unable to adapt to new outliers or to concept drift. This paper presents a methodology to reuse an embedding to create a new one in which cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once with the newly obtained embedding. The successive embeddings can be used to study the impact of one variable on the dataset distribution or to monitor changes over time. The method has the same complexity as t-SNE per embedding, and the memory requirement is only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity is reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing the birth, evolution and death of clusters to be observed. The proposed approach facilitates the identification of significant trends and changes, which empowers the monitoring of the dynamics of high-dimensional datasets.
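
The paper's two-cost optimization is not reproduced here, but a minimal Python sketch of the general idea is given below, under the assumption that coherence across successive embeddings can be approximated by initializing each new chunk's t-SNE from the positions of its nearest neighbors in a previously embedded reference subset (data, sizes and parameters are hypothetical):

```python
# Minimal sketch (not the authors' exact method): keep successive t-SNE
# embeddings coherent by initializing each new chunk from the positions of
# its nearest neighbors in a previously embedded reference subset.
import numpy as np
from sklearn.manifold import TSNE
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X_ref = rng.normal(size=(500, 30))          # reference subset (synthetic data)
X_new = rng.normal(size=(500, 30)) + 0.1    # later chunk to embed coherently

# 1. Embed the reference subset once.
Y_ref = TSNE(n_components=2, init="pca", random_state=0).fit_transform(X_ref)

# 2. Initialize the new chunk from the embedding of its nearest reference points.
nn = NearestNeighbors(n_neighbors=5).fit(X_ref)
_, idx = nn.kneighbors(X_new)
Y_init = Y_ref[idx].mean(axis=1)            # average position of the 5 neighbors

# 3. Re-run t-SNE on the new chunk only, starting from the coherent layout.
Y_new = TSNE(n_components=2, init=Y_init, random_state=0).fit_transform(X_new)
print(Y_new.shape)  # (500, 2); cluster positions roughly follow the reference
```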

Keywords: Concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning.

1141 Performance Analysis of Chrominance Red and Chrominance Blue in JPEG

Authors: Mamta Garg

Abstract:

While compressing text files is useful, compressing still image files is almost a necessity. A typical image takes up much more storage than a typical text message, and without compression images would be extremely clumsy to store and distribute. The amount of information required to store pictures on modern computers is quite large in relation to the amount of bandwidth commonly available to transmit them over the Internet and in applications. Image compression addresses the problem of reducing the amount of data required to represent a digital image. The performance of any image compression method can be evaluated by measuring the root mean square error and the peak signal-to-noise ratio. The method of image compression analyzed in this paper is based on the lossy JPEG image compression technique, the most popular compression technique for color images. JPEG compression is able to greatly reduce file size with minimal image degradation by throwing away the least "important" information. In JPEG, both color components are downsampled simultaneously, but in this paper we compare the results when the compression is done by downsampling a single chroma component. We demonstrate that a higher compression ratio is achieved when the chrominance blue component is downsampled than when the chrominance red component is downsampled, whereas the peak signal-to-noise ratio is higher when the chrominance red component is downsampled than when the chrominance blue component is downsampled. In particular, we use hats.jpg as a demonstration of JPEG compression using a low-pass filter and show that the image is compressed with barely any visual differences with both methods.
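
A minimal Python sketch of the comparison described above is shown below, assuming a simple 2×2 decimation of a single chroma channel followed by nearest-neighbour upsampling (a simplification of the full JPEG pipeline, which also applies the DCT and quantization):

```python
# Sketch (illustrative, not the paper's exact pipeline): downsample only one
# chroma channel (Cb or Cr) by 2x, upsample it back, and compare PSNR against
# the original RGB image.
import numpy as np
from PIL import Image

def psnr(a, b):
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

def subsample_one_chroma(img_rgb, channel):        # channel: 1 = Cb, 2 = Cr
    ycbcr = np.asarray(img_rgb.convert("YCbCr"), dtype=np.uint8).copy()
    c = ycbcr[:, :, channel]
    low = c[::2, ::2]                               # 2x2 decimation
    up = np.repeat(np.repeat(low, 2, axis=0), 2, axis=1)[:c.shape[0], :c.shape[1]]
    ycbcr[:, :, channel] = up
    return Image.fromarray(ycbcr, mode="YCbCr").convert("RGB")

img = Image.open("hats.jpg").convert("RGB")         # test image named in the abstract
orig = np.asarray(img)
for name, ch in [("Cb only", 1), ("Cr only", 2)]:
    rec = np.asarray(subsample_one_chroma(img, ch))
    print(f"{name:8s} PSNR = {psnr(orig, rec):.2f} dB")
```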

Keywords: JPEG, Discrete Cosine Transform, Quantization, Color Space Conversion, Image Compression, Peak Signal to Noise Ratio & Compression Ratio.

1140 Optimal Image Representation for Linear Canonical Transform Multiplexing

Authors: Navdeep Goel, Salvador Gabarda

Abstract:

Digital images are widely used in computer applications. Storing or transmitting uncompressed images requires considerable storage capacity and transmission bandwidth. Image compression is a means to perform transmission or storage of visual data in the most economical way. This paper explains how images can be encoded to be transmitted over a multiplexing time-frequency domain channel. Multiplexing involves packing together signals whose representations are compact in the working domain. In order to optimize transmission resources, each 4 × 4 pixel block of the image is transformed, by a suitable polynomial approximation, into a minimal number of coefficients. Using fewer than 4 × 4 coefficients per block spares a significant amount of transmitted information, but some information is lost. Different approximations for the image transformation have been evaluated: polynomial representation (Vandermonde matrix), least squares with gradient descent, 1-D Chebyshev polynomials, 2-D Chebyshev polynomials and singular value decomposition (SVD). Results have been compared in terms of the nominal compression rate (NCR), compression ratio (CR) and peak signal-to-noise ratio (PSNR), in order to minimize the error function defined as the difference between the original pixel gray levels and the approximated polynomial output. The polynomial coefficients have later been encoded and handled to generate chirps at a target rate of about two chirps per 4 × 4 pixel block, and then submitted to a transmission multiplexing operation in the time-frequency domain.
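
As an illustration of the block-approximation step (the polynomial basis and the block values below are hypothetical, not taken from the paper), a degree-2 bivariate polynomial fitted by least squares through a Vandermonde-style design matrix replaces the 16 pixel values of a 4 × 4 block with 6 coefficients:

```python
# Hypothetical illustration: approximate a 4x4 pixel block with a small 2-D
# polynomial fitted by least squares, keeping only 6 coefficients
# (1, x, y, x^2, x*y, y^2) instead of the 16 pixel values.
import numpy as np

block = np.array([[ 52,  55,  61,  66],
                  [ 70,  61,  64,  73],
                  [ 63,  59,  55,  90],
                  [109,  85,  69,  72]], dtype=np.float64)

x, y = np.meshgrid(np.arange(4), np.arange(4))
x, y = x.ravel(), y.ravel()
# Design (Vandermonde-style) matrix for a degree-2 bivariate polynomial.
A = np.column_stack([np.ones(16), x, y, x**2, x*y, y**2])

coeffs, *_ = np.linalg.lstsq(A, block.ravel(), rcond=None)  # 6 coefficients
approx = (A @ coeffs).reshape(4, 4)

mse = np.mean((block - approx) ** 2)
psnr = 10 * np.log10(255.0**2 / mse)
print(f"kept {coeffs.size}/16 coefficients, PSNR = {psnr:.1f} dB")
```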

Keywords: Chirp signals, Image multiplexing, Image transformation, Linear canonical transform, Polynomial approximation.

1139 Inner Quality Parameters of Rapeseed (Brassica napus) Populations in Different Sowing Technology Models

Authors: É. Vincze

Abstract:

Demand for plant oils has increased enormously, due on the one hand to changes in human nutrition habits and on the other to the increasing raw-material demand of some industrial sectors, as well as to the growth of biofuel production. Besides the determining importance of sunflower in Hungary, the production area and, in part, the average yield of rapeseed have increased among the oil crops produced. The available variety/hybrid palette has changed and been extended significantly during the past decade. It is agreed that rapeseed production demands professionalism and local experience. Technological elements are successive; high yields cannot be produced without a system-based approach. The aim of the present work was to carry out a complex study of one of the most critical production technology elements of rapeseed production, namely sowing technology. Several sowing technology elements are studied in this research project: the biological basis (the hybrid Arkaso is studied in this regard), the sowing time (sowing time treatments were set so that they represent the wide period used in industrial practice: early, optimal and late sowing), and the plant density (the reactions of sparse, optimal and overly dense populations were modelled). The multifactorial experimental system enables the single and complex evaluation of rapeseed sowing technology elements, as well as their modelling using the experimental result data. Yield quality and quantity were also determined in the present experiment, together with the interactions between these factors. The experiment was set up in four replications at the Látókép Plant Production Research Site of the University of Debrecen. Two different sowing times were applied in the first experimental year (2014) and three in the second (2015). Three different plant densities were set in both years: 200, 350 and 500 thousand plants ha⁻¹. Uniform nutrient supply and a row spacing of 45 cm were applied, with winter wheat as the pre-crop. Plant physiological measurements were executed in the populations of the Arkaso rapeseed hybrid: relative chlorophyll content (SPAD) analysis and leaf area index (LAI) measurement. Relative chlorophyll content (SPAD) and leaf area index (LAI) were monitored at 7 different measurement times.

Keywords: Inner quality, plant density, rapeseed, sowing time.

1138 Development of a Performance Measurement System for Forwarders

Authors: K. Schmidt, Z. Miodrag, C. Geiger

Abstract:

Performance measurement is still a difficult task for forwarding companies. This is caused, on the one hand, by missing resources and, on the other hand, by missing tools. The research project "Management Information System for Logistics Service Providers" aims at closing the gap between the needed and the available solutions. The core of the project is the development

Keywords: Forwarder, Logistics, Management Information, Performance Measurement.

1137 An ICA Algorithm for Separation of Convolutive Mixture of Speech Signals

Authors: Rajkishore Prasad, Hiroshi Saruwatari, Kiyohiro Shikano

Abstract:

This paper describes an Independent Component Analysis (ICA) based fixed-point algorithm for the blind separation of a convolutive mixture of speech picked up by a linear microphone array. The proposed algorithm extracts independent sources by non-Gaussianizing the Time-Frequency Series of Speech (TFSS) in a deflationary way. The degree of non-Gaussianization is measured by negentropy. The relative performances of the algorithm under random initialization and Null Beamformer (NBF) based initialization are studied. It has been found that an NBF-based initial value gives speedy convergence as well as better separation performance.
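
For context, negentropy and its usual fixed-point (FastICA-style) approximation through a nonquadratic contrast function G are recalled below as a general reference (not necessarily the exact contrast used by the authors):

\[
J(y) = H\!\left(y_{\mathrm{gauss}}\right) - H(y)
\;\approx\;
c\,\bigl[\mathbb{E}\{G(y)\} - \mathbb{E}\{G(\nu)\}\bigr]^{2},
\qquad \nu \sim \mathcal{N}(0,1),
\]

where H denotes differential entropy, \(y_{\mathrm{gauss}}\) is a Gaussian variable with the same variance as y, c > 0 is a constant, and a common choice is \(G(u) = \log\cosh u\).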

Keywords: Blind signal separation, independent component analysis, negentropy, convolutive mixture.

1136 Energy Efficient Resource Allocation in Distributed Computing Systems

Authors: Samee Ullah Khan, C. Ardil

Abstract:

The problem of mapping tasks onto a computational grid with the aim of minimizing the power consumption and the makespan, subject to deadline and architectural constraints, is considered in this paper. To solve this problem, we propose a solution from cooperative game theory based on the concept of the Nash Bargaining Solution. The proposed game theoretical technique is compared against several traditional techniques. The experimental results show that when the deadline constraints are tight, the proposed technique achieves superior performance and reports competitive performance relative to the optimal solution.
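
As a reminder of the underlying concept (stated here in its general form, not with the paper's specific utility model), the Nash Bargaining Solution selects the feasible allocation that maximizes the product of the players' utility gains over their disagreement points:

\[
x^{\ast} = \arg\max_{x \in S,\; u_i(x) \ge d_i}\; \prod_{i=1}^{N} \bigl(u_i(x) - d_i\bigr),
\]

where S is the feasible set and \(d_i\) is the payoff player i receives if no agreement is reached; in this setting the players can be thought of as the machines (or task agents) whose utilities trade off energy consumption against makespan.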

Keywords: Energy efficient algorithms, resource allocation, resource management, cooperative game theory.

1135 Load Forecasting in Microgrid Systems with R and Cortana Intelligence Suite

Authors: F. Lazzeri, I. Reiter

Abstract:

Energy production optimization has traditionally been very important for utilities in order to improve resource consumption. However, load forecasting is a challenging task, as there is a large number of relevant variables that must be considered, and several strategies have been used to deal with this complex problem. This is especially true in microgrids, where many elements have to adjust their performance depending on future generation and consumption conditions. The goal of this paper is to present a solution for short-term load forecasting in microgrids, based on three machine learning experiments developed in R and on web services built and deployed with different components of Cortana Intelligence Suite: Azure Machine Learning, a fully managed cloud service that makes it easy to build, deploy, and share predictive analytics solutions; SQL Database, a Microsoft database service for app developers; and Power BI, a suite of business analytics tools to analyze data and share insights. Our results show that the Boosted Decision Tree and Fast Forest Quantile regression methods can be very useful for predicting hourly short-term consumption in microgrids; moreover, we found that for these types of forecasting models, weather data (temperature, wind, humidity and dew point) can play a crucial role in improving the accuracy of the forecasting solution. Data cleaning and feature engineering methods performed in R and different types of machine learning algorithms (Boosted Decision Tree, Fast Forest Quantile and ARIMA) will be presented, and the results and performance metrics discussed.
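
The paper's models were built in R and Azure Machine Learning; purely as an illustration of the kind of feature set described (lagged load plus temperature, wind, humidity and dew point), a rough scikit-learn stand-in with hypothetical file and column names might look like this:

```python
# Rough stand-in (Python/scikit-learn, not the paper's R + Azure ML pipeline):
# hourly load forecasting with a gradient-boosted tree using lagged load and
# weather features. File and column names below are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_percentage_error

df = pd.read_csv("microgrid_load.csv", parse_dates=["timestamp"])
df["hour"] = df["timestamp"].dt.hour
df["dayofweek"] = df["timestamp"].dt.dayofweek
df["load_lag24"] = df["load_kw"].shift(24)          # same hour, previous day
df = df.dropna()

features = ["hour", "dayofweek", "load_lag24",
            "temperature", "wind", "humidity", "dew_point"]
train, test = df.iloc[:-168], df.iloc[-168:]        # hold out the last week

model = GradientBoostingRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(train[features], train["load_kw"])
pred = model.predict(test[features])
print("MAPE:", mean_absolute_percentage_error(test["load_kw"], pred))
```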

Keywords: Time-series, features engineering methods for forecasting, energy demand forecasting, Azure machine learning.

1134 Clean Sky 2 – Project PALACE: Aeration’s Experimental Sound Velocity Investigations for High-Speed Gerotor Simulations

Authors: Benoît Mary, Thibaut Gras, Gaëtan Fagot, Yvon Goth, Ilyes Mnassri-Cetim

Abstract:

A Gerotor pump is composed of an external and an internal gear with conjugate cycloidal profiles. From the suction to the delivery port, the fluid is transported inside cavities formed by the teeth and driven by the shaft. From a geometric and conceptual point of view, it is worth noting that the internal gear has one tooth fewer than the external one. Simcenter Amesim v.16 includes a new submodel (THCDGP0) for modelling the hydraulic behavior of Gerotor pumps. This submodel considers leakages between tooth tips using Poiseuille and Couette flow contributions. From the 3D CAD model of the studied pump, the "CAD import" tool extracts the main geometrical characteristics, and the submodel THCDGP0 computes the evolution of each cavity volume and its relative position with respect to the suction or delivery areas. This module, based on international publications, gives robust results up to 6 000 rpm for pressures greater than atmospheric. For higher rotational speeds or lower pressures, oil aeration and cavitation effects are significant and strongly degrade the pump’s performance. The liquid used in hydraulic systems always contains some gas, which is dissolved in the liquid at high pressure and tends to be released in free form (i.e. undissolved, as bubbles) when the pressure drops. In addition to gas release and dissolution, the liquid itself may vaporize due to cavitation. To model the relative density of the equivalent fluid, a modified Henry’s law is applied in Simcenter Amesim v.16 to predict the fraction of undissolved gas or vapor. Three parietal pressure sensors have been set up upstream of the pump to estimate the speed of sound in the oil. Analytical models have been compared with the experimental sound speed to estimate the occluded gas content. The Simcenter Amesim v.16 model was fed with the results of these analyses, which successfully improved the simulation results up to 14 000 rpm. This work provides a sound foundation for designing the next Gerotor pump generation, reaching a high rotation range of more than 25 000 rpm. The results of this improved module will be compared with tests on this new pump demonstrator.
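
The abstract does not name the analytical sound-speed model; a classical choice for a bubbly (aerated) liquid, given here only as an illustrative reference, is Wood's equation:

\[
\frac{1}{\rho_m c_m^{2}} = \frac{\alpha}{\rho_g c_g^{2}} + \frac{1-\alpha}{\rho_l c_l^{2}},
\qquad
\rho_m = \alpha \rho_g + (1-\alpha)\rho_l,
\]

where α is the volume fraction of undissolved gas and the subscripts g, l and m denote the gas, the liquid and the mixture; even a small α lowers \(c_m\) dramatically, which is why the occluded gas content is so critical at high speed.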

Keywords: Gerotor pump, high speed, simulations, aeronautic, aeration, cavitation.

1133 Impact of Fair Share and its Configurations on Parallel Job Scheduling Algorithms

Authors: Sangsuree Vasupongayya

Abstract:

To provide a better understanding of the fair share policies supported by current production schedulers and of their impact on scheduling performance, a relative fair share policy supported by four well-known production job schedulers is evaluated in this study. The experimental results show that fair share indeed prevents heavy-demand users from dominating the system resources. However, the detailed per-user performance analysis shows that some types of users may suffer unfairness under fair share, possibly due to the priority mechanisms used by the current production schedulers. These users typically are not heavy-demand users, but they have a mixture of jobs that do not spread out.

Keywords: Fair share, Parallel job scheduler, Backfill, Measures

1132 Identifying Project Delay Factors in the Australian Construction Industry

Authors: Syed Sohaib Bin Hasib, Hiyam Al-Kilidar

Abstract:

Meeting project deadlines is a major challenge for most construction projects. In this study, the perceptions of contractors, clients, and consultants are compared relative to a list of factors derived from a review of the extant literature on project delay. 59 causes of project delay, categorized into 8 groups, were identified from the literature. A survey was devised to obtain insights into, and a ranking of, these factors from clients, consultants and contractors in the Australian construction industry. The findings showed that project delays in the Australian construction industry are mainly the result of skill shortages, interference in execution, and poor coordination and communication between the project stakeholders.

Keywords: Construction, delay factors, time delay, Australian construction industry.

1131 Feasibility Study of a BLDC Motor with Integrated Drive Circuit

Authors: Jun-Hyuk Choi, Joon Sung Park, Jung-Moo Seo, In-Soung Jung

Abstract:

A brushless DC motor with an integrated drive circuit for an air management system is presented. Using a magnetic equivalent circuit model, a basic design of the motor is determined, and specific configurations are inspected by means of finite element analysis. In order to reduce the unbalanced magnetic force in the axial direction, the induced forces between the stator core and the permanent magnet are calculated with respect to their relative positions. The BLDC motor and drive are developed for high efficiency and high power density. The vibration modes and the eccentricity of the rotor are also considered at the rated and maximum rotational speeds. Through the experimental results, the validity of the simulations is confirmed.

Keywords: blower, BLDC, inverter

1130 Systems with Queueing and their Simulation

Authors: Miloš Šeda, Pavel Ošmera, Jindřich Petrucha

Abstract:

In queueing theory, it is assumed that customer arrivals correspond to a Poisson process and that the service time has an exponential distribution. Under these assumptions, the behaviour of the queueing system can be described by means of Markov chains, and it is possible to derive the characteristics of the system. In this paper, these theoretical approaches are presented for several types of systems, and it is also shown how to compute the characteristics in situations where these assumptions are not satisfied.
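
For the simplest such system, the M/M/1 queue with Poisson arrivals at rate λ and exponential service at rate μ (λ < μ), the standard characteristics referred to above are

\[
\rho = \frac{\lambda}{\mu},\qquad
L = \frac{\rho}{1-\rho},\qquad
L_q = \frac{\rho^{2}}{1-\rho},\qquad
W = \frac{1}{\mu-\lambda},\qquad
W_q = \frac{\rho}{\mu-\lambda},
\]

where L and \(L_q\) are the mean numbers of customers in the system and in the queue, and W and \(W_q\) the corresponding mean times; Little's law, L = λW, links the two sets.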

Keywords: Queueing theory, Poisson process, Markov chains.

1129 Study of the Effect of Different Ozone Doses on Sugar Content in Tomatoes at Different Stages of Ripening

Authors: Milad. A. Shalluf

Abstract:

The determination of sugars in foods is very significant. Their relative amounts can in fact affect the chemical and sensorial quality of the matrix (e.g., sweetness, pH, total acidity, microbial stability, global acceptability) and can provide information on the food useful for optimizing several selected technological processes. Three stages of ripeness (green, yellow and red) of tomatoes (Lycopersicon esculentum cv. Elegance) at different harvest dates were evaluated. Fruit from all harvests were exposed to different ozone doses (0.25, 0.50 and 1 mg O3/g tomatoes) or to clean air for 5 days at 15 ± 2 °C and 90-95% relative humidity. The fruits were then submitted for extraction and analysis one day after the end of the exposure at each stage. The concentrations of glucose and fructose increased in the tomatoes that were subjected to the ozone treatments.

Keywords: Post-harvest Treatment, Controlled Atmosphere Storage, Ozone, Tomatoes, Glucose, Fructose

1128 Atoms in Molecules, Another Method for Analyzing Dibenzoylmethane

Authors: S. Heydarian

Abstract:

Proton transfer and hydrogen bonding are two aspects of the chemistry of hydrogen that respectively govern the behaviour and the structure of many molecules, both simple and complex. All the theoretical enol and keto conformations of 1,3-diphenyl-1,3-propanedione, known as dibenzoylmethane (DBM), have been investigated by means of the atoms in molecules (AIM) theory. It was found that the most stable conformers are those stabilized by hydrogen bridges. The aim of the present paper is a thorough conformational analysis of DBM (with special attention to the chelated cis-enol conformers) in order to obtain detailed information on the geometrical parameters, relative stabilities and rotational motion of the phenyl groups. It is also important to estimate the barrier height for proton transfer and the hydrogen bond strength, which are the main factors governing conformational stability.

Keywords: Acetylacetone, Atoms in molecules, Dibenzoylmethane, Intramolecular hydrogen bond, Resonance conjugation

1127 Protection Plan of Medium Voltage Distribution Network in Tunisia

Authors: S. Chebbi, A. Meddeb

Abstract:

Distribution networks are often exposed to harmful incidents which can halt the electricity supply to the customer. In this context, we studied a real case of a critical zone of the Tunisian network, which is currently characterized by the dysfunction of its protection plan. In this paper, we were interested in the harmonization of the protection plan settings in order to ensure perfect selectivity and better continuity of service over the whole network.

Keywords: Distribution network Gabes-Tunisia, NEPLAN©DACH, protection plan settings, selectivity.

1126 Beneficial Use of Coal Combustion By-products in the Rehabilitation of Failed Asphalt Pavements

Authors: Tarunjit S. Butalia, William E. Wolfe

Abstract:

This study demonstrates the use of Class F fly ash in combination with lime or lime kiln dust in the full depth reclamation (FDR) of asphalt pavements. FDR, in the context of this paper, is a process of pulverizing a predetermined amount of a flexible pavement that is structurally deficient, blending it with chemical additives and water, and compacting it in place to construct a new stabilized base course. Test sections of two structurally deficient asphalt pavements were reclaimed using Class F fly ash in combination with lime and lime kiln dust. In addition, control sections were constructed using cement, cement and emulsion, lime kiln dust and emulsion, and mill-and-fill. The service performance and structural behavior of the FDR pavement test sections were monitored to determine how the fly ash sections compared with other, more traditional pavement rehabilitation techniques. Service performance and structural behavior were determined with the use of sensors embedded in the road and Falling Weight Deflectometer (FWD) tests. Monitoring results of the FWD tests conducted up to 2 years after reclamation show that the cement, fly ash + LKD, and fly ash + lime sections exhibited two-year resilient modulus values comparable to open-graded cement-stabilized aggregates (more than 750 ksi). The cement treatment resulted in a significant increase in resilient modulus within 3 weeks of construction; beyond this curing time, the stiffness increase was slow. The fly ash + LKD and fly ash + lime test sections, on the other hand, showed a slower short-term increase in stiffness, but their average resilient modulus values at two years after construction were in excess of 800 ksi. Additional longer-term testing data will be available from the ongoing pavement performance and environmental condition data collection at the two pavement sites.

Keywords: Coal fly ash, full depth reclamation, FWD, pavement rehabilitation

1125 Experimental and Computational Analysis of Hygrothermal Performance of an Interior Thermal Insulation System

Authors: Z. Pavlík, J. Kočí, M. Pavlíková, R. Černý

Abstract:

A combined experimental and computational analysis of the hygrothermal performance of an interior thermal insulation system applied to a brick wall is presented in this paper. In the experimental part, the functionality of the insulation system is tested under simulated climatic difference conditions using a semi-scale device. The measured temperature and relative humidity profiles are used for the calibration of the computer code HEMOT, which is finally applied to a long-term hygrothermal analysis of the investigated structure.

Keywords: Additional thermal insulation, hygrothermal analysis, semi-scale testing, long-term computational analysis

1124 The Dialectic between Effectiveness and Humanity in the Era of Open Knowledge from the Perspective of Pedagogy

Authors: Sophia Ming Lee Wen, Chao-Ching Kuo, Yu-Line Hu, Yu-Lung Ho, Chih-Cheng Huang, Yi-Hwa Lee

Abstract:

Teaching and learning should involve social issues in which effectiveness and humanity are given due consideration as a guideline for sharing and co-creating knowledge. A qualitative method was used, after a pioneer study, to confirm pre-service teachers’ awareness of open knowledge. There are 17 in-service teacher candidates sampled from 181 schools in Taiwan. Two questions are to be resolved: a) how teachers changed their educational ideas, in particular their attitudes, to meet the needs of knowledge sharing and co-creativity; and b) how they acknowledged the necessity of working out an appropriate balance between educational efficiency and the nature of education for high-performance management. The interviews investigated teachers’ attitudes toward sharing and co-creating knowledge. The results show two facts in Taiwan: A) individuals who are able to express themselves will be capable of taking part in an open learning environment; and B) teachers must lead the way in inspiring high performance and improving students’ capacity through knowledge sharing and co-creation, according to the student-centered philosophy. The data collected from the interviews showed that the teachers were well aware of changing their teaching methods and made some improvements to balance educational efficiency and the nature of education. Almost all teachers acknowledge that ICT is helpful for motivating learning enthusiasm. Further, teaching integrated with ICT saves teachers’ time and energy in teaching preparation and promotes effectiveness. Teachers are willing to co-create knowledge with students, though using information is not easy due to a lack of operating skills for the website and ICT. Some teachers are against co-creating knowledge in an informational setting, since they hold that it is not feasible given the knowledge gap between teachers and students. Technology can easily mislead teachers and students toward the goal of instrumental rationality, which makes pedagogy dysfunctional and inhumane; however, high-quality teaching should strike a dialectical balance between effectiveness and humanity.

Keywords: Open knowledge, dialectic between effectiveness and humanity, pedagogy, critical thinking.

1123 Identification and Classification of Plastic Resins using Near Infrared Reflectance Spectroscopy

Authors: Hamed Masoumi, Seyed Mohsen Safavi, Zahra Khani

Abstract:

In this paper, an automated system is presented for the identification and separation of plastic resins based on near infrared (NIR) reflectance spectroscopy. For identification and separation among resins, a "Two-Filter" identification method is proposed that is capable of distinguishing among polyethylene terephthalate (PET), high density polyethylene (HDPE), polyvinyl chloride (PVC), polypropylene (PP) and polystyrene (PS). By surveying the effects of parameters such as surface contamination, sample thickness, and the presence of labels and caps, it became obvious that the "Two-Filter" method has a high efficiency in the identification of resins. It is shown that accurate identification and separation of the five major resins can be obtained by calculating the relative reflectance at two wavelengths in the NIR region.
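
A toy Python sketch of the two-wavelength idea is given below; the filter wavelengths, decision bands and synthetic spectrum are hypothetical placeholders (the abstract does not disclose them), so the sketch only illustrates the structure of the classification, not the actual calibration:

```python
# Illustrative sketch only: classify resins from the relative NIR reflectance
# at two wavelengths ("Two-Filter" idea). Wavelengths and decision thresholds
# below are hypothetical placeholders, not the paper's values.
import numpy as np

WL1, WL2 = 1660, 1720            # hypothetical filter wavelengths in nm

def relative_reflectance(spectrum_nm, spectrum_r, wl1=WL1, wl2=WL2):
    """Ratio of reflectance measured through the two filters."""
    r1 = np.interp(wl1, spectrum_nm, spectrum_r)
    r2 = np.interp(wl2, spectrum_nm, spectrum_r)
    return r1 / r2

def classify(ratio):
    # Hypothetical decision bands for PET, PVC, PS, PP and HDPE.
    bands = [("PET", 0.60, 0.75), ("PVC", 0.75, 0.90), ("PS", 0.90, 1.05),
             ("PP", 1.05, 1.20), ("HDPE", 1.20, 1.40)]
    for name, lo, hi in bands:
        if lo <= ratio < hi:
            return name
    return "unknown"

# Example with a synthetic spectrum (flat reflectance with a dip near WL1).
nm = np.linspace(1500, 1900, 401)
r = 0.8 - 0.15 * np.exp(-((nm - WL1) / 20.0) ** 2)
print(classify(relative_reflectance(nm, r)))
```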

Keywords: Identification, Near Infrared, Plastic, Separation, Spectroscopy
