Search results for: level of service
2180 On the Way to the European Research Area: Programmes of the European Union as Factor of the Innovation Development the Scientific Organization in Ukraine
Authors: Yuri Nikitin, Veronika Rukas
Abstract:
Within the framework of the FP7 project "START", cooperation with European research centres has had a positive impact on raising the level of innovation research at the Institute for Superhard Materials of the National Academy of Sciences (ISM NAS) of Ukraine and on introducing its innovations into the economies of Europe and Ukraine. This, in turn, speeds up the path of Ukrainian science towards the European Research Area through the creation in Ukraine of scientific organizations of an innovative type.
Keywords: Programs of the EU, innovative scientific results, innovation competence of the staff, commercialization in the industry of Europe and Ukraine.
2179 Heuristics Analysis for Distributed Scheduling using MONARC Simulation Tool
Authors: Florin Pop
Abstract:
Simulation is a very powerful method for high-performance and high-quality design in distributed systems, and currently perhaps the only practical one, considering the heterogeneity, complexity and cost of distributed systems. In Grid environments, for example, it is hard and even impossible to perform scheduler performance evaluation in a repeatable and controllable manner, as resources and users are distributed across multiple organizations with their own policies. In addition, Grid test-beds are limited, and creating an adequately sized test-bed is expensive and time consuming. Scalability, reliability and fault tolerance become important requirements for distributed systems in order to support distributed computation. A distributed system with such characteristics is called dependable. Large environments, like Clouds, offer unique advantages, such as low cost and dependability, and satisfy QoS for all users. Resource management in large environments calls for performant scheduling algorithms guided by QoS constraints. This paper presents the performance evaluation of scheduling heuristics guided by different optimization criteria. The algorithms for distributed scheduling are analyzed in order to satisfy user constraints while at the same time considering the independent capabilities of resources. This analysis acts as a profiling step for algorithm calibration. The performance evaluation is based on simulation. The simulator is MONARC, a powerful tool for large-scale distributed systems simulation. The novelty of this paper consists in synthetic analysis results that offer guidelines for scheduler service configuration and support empirically based decisions. The results could be used in decisions regarding optimizations to existing Grid DAG scheduling and for selecting the proper algorithm for DAG scheduling in various practical situations.
Keywords: Scheduling, Simulation, Performance Evaluation, QoS, Distributed Systems, MONARC
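The specific heuristics evaluated are not spelled out in this abstract; as an illustration of the kind of independent-task heuristic such a simulation study typically profiles, the following minimal sketch implements the classic min-min rule on an expected-time-to-compute (ETC) matrix. The function names, the random ETC data and the makespan criterion are assumptions for illustration, not details taken from the paper or from MONARC.

```python
import numpy as np

def min_min_schedule(etc):
    """Min-min heuristic for independent tasks.

    etc[i, j] = expected execution time of task i on resource j.
    Returns (assignment, makespan): assignment[i] is the resource chosen
    for task i, makespan is the largest resource completion time.
    """
    n_tasks, n_res = etc.shape
    ready = np.zeros(n_res)              # current completion time per resource
    assignment = np.full(n_tasks, -1)
    unscheduled = set(range(n_tasks))

    while unscheduled:
        best = None                      # (completion_time, task, resource)
        for t in unscheduled:
            completion = ready + etc[t]  # completion time of task t on each resource
            r = int(np.argmin(completion))
            cand = (completion[r], t, r)
            if best is None or cand < best:
                best = cand
        ct, t, r = best
        assignment[t] = r                # commit the task with the minimum of minima
        ready[r] = ct
        unscheduled.remove(t)

    return assignment, ready.max()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    etc = rng.uniform(1.0, 10.0, size=(20, 4))   # 20 tasks, 4 heterogeneous resources
    plan, makespan = min_min_schedule(etc)
    print("makespan:", round(makespan, 2))
```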
2178 Game-Theory-Based on Downlink Spectrum Allocation in Two-Tier Networks
Authors: Yu Zhang, Ye Tian, Fang Ye, Yixuan Kang
Abstract:
The capacity of conventional cellular networks has reached its upper bound, and this can be well handled by introducing low-cost, easy-to-deploy femtocells. The spectrum interference issue becomes more critical as value-added multimedia services grow increasingly in two-tier cellular networks. Spectrum allocation is one of the effective methods in interference mitigation technology. This paper proposes a game-theory-based OFDMA downlink spectrum allocation scheme aiming at reducing co-channel interference in two-tier femtocell networks. The framework is formulated as a non-cooperative game, wherein the femto base stations are players and the available frequency channels are strategies. The scheme takes full account of competitive behavior and fairness among stations. In addition, the utility function essentially reflects interference from the standpoint of channels. This work focuses on co-channel interference and puts forward a negative-logarithm interference function on the distance weight ratio, aimed at suppressing co-channel interference within the same network layer. This scenario is more suitable for actual network deployment, and the system possesses high robustness. According to the proposed mechanism, interference exists only when players employ the same channel for data communication. This paper focuses on implementing spectrum allocation in a distributed fashion. Numerical results show that the signal to interference and noise ratio can be obviously improved through the spectrum allocation scheme and that the users' downlink quality of service can be satisfied. Besides, the simulation results show that the average spectrum efficiency in the cellular network can be significantly improved.
Keywords: Femtocell networks, game theory, interference mitigation, spectrum allocation.
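The negative-logarithm utility proposed in the paper is not reproduced here; the sketch below only illustrates the general mechanism of a non-cooperative channel-selection game, with femto base stations as players iterating best responses against a simplified inverse-square-distance co-channel interference measure (an assumed stand-in, not the paper's utility function).

```python
import numpy as np

def best_response_channel_allocation(pos, n_channels, n_rounds=20, seed=0):
    """Toy best-response dynamics for downlink channel selection.

    pos: (N, 2) coordinates of femto base stations (players).
    Each player repeatedly switches to the channel that minimises the
    co-channel interference it receives, modelled here simply as the sum of
    1/d^2 over the other stations using the same channel.
    """
    rng = np.random.default_rng(seed)
    n = len(pos)
    ch = rng.integers(0, n_channels, size=n)          # random initial strategies
    d2 = ((pos[:, None, :] - pos[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)                      # a station does not interfere with itself

    for _ in range(n_rounds):
        changed = False
        for i in range(n):
            # Interference player i would receive on each candidate channel.
            interf = [np.sum((ch == c) / d2[i]) for c in range(n_channels)]
            best = int(np.argmin(interf))
            if best != ch[i]:
                ch[i], changed = best, True
        if not changed:                               # pure-strategy equilibrium reached
            break
    return ch

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    stations = rng.uniform(0, 100, size=(12, 2))      # 12 femtocells in a 100 m square
    print("channel assignment:", best_response_channel_allocation(stations, n_channels=4))
```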
2177 The Characterisation of TLC NAND Flash Memory, Leading to a Definable Endurance/Retention Trade-Off
Authors: Sorcha Bennett, Joe Sullivan
Abstract:
Triple-Level Cell (TLC) NAND Flash memory at, and below, 20 nm (nanometre) is still largely unexplored by researchers, and with the ever more commonplace presence of Flash in consumer and enterprise applications there is a need for such gaps in knowledge to be filled. At the time of writing, there was little published data or literature on TLC, and more specifically on reliability testing, with a further emphasis on both endurance and retention. This paper will give an introduction to NAND Flash memory, followed by an overview of the relevant current research on the reliability of Flash memory, along with the planned future work which will provide results to help characterise the reliability of TLC memory.
Keywords: TLC NAND flash memory, reliability, endurance, retention, trade-off, raw flash, patterns.
2176 A Virtual Simulation Environment for a Design and Verification of a GPGPU
Authors: Kwang Y. Lee, Tae R. Park, Jae C. Kwak, Yong S. Koo
Abstract:
When a small H/W IP is designed, we can develop an appropriate verification environment by observing the simulated signal waves, or by using serial test vectors for the fixed output. In the case of the design and verification of a massive parallel processor with multiple IPs, it is difficult to build a verification system with the existing common verification environment and to verify each partial IP. A TestDrive verification environment can build an easy and reliable verification system that produces highly intuitive results by applying Modelsim and SystemVerilog's DPI. It shows many advantages; for example, a high-level design of a GPGPU processor can be migrated to an FPGA board immediately.
Keywords: Virtual Simulation, Verification, IP Design, GPGPU
2175 Stereotype Student Model for an Adaptive e-Learning System
Authors: Ani Grubišić, Slavomir Stankov, Branko Žitko
Abstract:
This paper describes a concept of a stereotype student model in an adaptive knowledge acquisition e-learning system. The defined knowledge stereotypes are based on the student's proficiency level and on Bloom's knowledge taxonomy. The teacher module is responsible for the whole adaptivity process: the automatic generation of courseware elements, their dynamic selection and sorting, as well as their adaptive presentation using templates for statements and questions. The adaptation of courseware is realized according to the student's knowledge stereotype.
Keywords: Adaptive e-learning systems, adaptive courseware, stereotypes, Bloom's knowledge taxonomy.
2174 Achieving Design-Stage Elemental Cost Planning Accuracy: Case Study of New Zealand
Authors: Johnson Adafin, James O. B. Rotimi, Suzanne Wilkinson, Abimbola O. Windapo
Abstract:
An aspect of client expenditure management that requires attention is the level of accuracy achievable in design-stage elemental cost planning. This has been a major concern for construction clients and practitioners in New Zealand (NZ). Pre-tender estimating inaccuracies are significantly influenced by the level of risk information available to estimators. Proper cost planning activities should ensure the production of a project’s likely construction costs (initial and final), and subsequent cost control activities should prevent the unpleasant consequences of cost overruns, disputes and project abandonment. If risks were properly identified and priced at the design stage, the observed variance between design-stage elemental cost plans (ECPs) and final tender sums (FTS) (initial contract sums) could be reduced. This study investigates the variations between design-stage ECPs and FTS of construction projects, with a view to identifying the risk factors that are responsible for the observed variance. Data were sourced through interviews, and risk factors were identified by using thematic analysis. Access was obtained to project files from the records of study participants (consultant quantity surveyors), and document analysis was employed in complementing the responses from the interviews. Study findings revealed discrepancies between ECPs and FTS in the range of -14% to +16%. It is opined in this study that the identified risk factors were responsible for the variability observed. The values obtained from the analysis would enable greater accuracy in the forecast of FTS by quantity surveyors. Further, whilst inherent risks in construction project developments are observed globally, these findings have important ramifications for construction projects by expanding existing knowledge on what is needed for reasonable budgetary performance and successful delivery of construction projects. The findings contribute significantly by providing quantitative confirmation to justify the theoretical conclusions generated in the literature from around the world. This therefore adds to and consolidates existing knowledge.
Keywords: Accuracy, design-stage, elemental cost plan, final tender sum, New Zealand.
2173 The Flashnews as a Commercial Session of Political Marketing: The Content Analysis of the Embedded Political Narratives in Non-Political Media Products
Authors: Zsolt Szabolcsi
Abstract:
Political communication in Hungary has undergone a significant change in the 2010s. One element of the transformation is the Flashnews. This media product was launched in March 2015 and since then 40-50 blocks have been broadcast daily on 5 channels. Flashnews blocks are condensed news sessions containing a summary of political narratives. A block starts with the introduction of the narrator, then usually four news topics are presented and, finally, the narrator concludes the block. The block lasts only one minute and, therefore, it provides a brief glimpse into the main narratives of political communication at the time. Beyond its rapid pace, what makes its avoidance difficult is that these blocks are always in the first position in the commercial break of a non-political media product. Although it is only one minute long, its significance is high. The content of the Flashnews reflects the main governmental narratives and, therefore, the Flashnews is part of the agenda-setting capacity of political communication. It reaches media consumers who have limited knowledge of and interest in politics, and whose use of media products is not politically related. For this audience, the Flashnews pops up in the same way as commercials. Due to its structure and appearance, the impact of the Flashnews seems to be similar to that of commercials embedded into the break of media products. It activates existing knowledge constructs, builds up associational links and maintains their presence in a way that the recipient is not aware of the phenomenon. The research aims to examine the extent to which the Flashnews and the main news narratives are identical in their content. This aim is realized through content analysis of the two news products, by examining the Flashnews and the evening news during main sport events from 2016 to 2018. The initial hypothesis of the research is that the Flashnews is a contribution to the news management technique for an effective articulation of political narratives in public service media channels.
Keywords: Flashnews, political communication, political marketing, news management.
2172 Real Time Control Learning Game - Speed Race by Learning at the Wheel - Development of Data Acquisition System
Authors: Konstantinos Kalovrektis, Chryssanthi Palazi
Abstract:
Schools today face ever-increasing demands in their attempts to ensure that students are well equipped to enter the workforce and navigate a complex world. Research indicates that computer technology can help support learning and the implementation of various experiments or learning games, and that it is especially useful in developing the higher-order skills of critical thinking, observation, comprehension, implementation, comparison, analysis and active attention to activities such as research, field work, simulations and scientific inquiry. ICT in education supports the learning procedure by enabling it to be more flexible and effective, creating a rich and attractive training environment and equipping the students with knowledge and potential useful for the competitive social environment in which they live. This paper presents the design, the development, and the results of the evaluation analysis of an interactive educational game which uses real electric toy vehicles (the material) on a toy race track. When the game starts, each student selects a specific toy vehicle. Then the students answer questionnaires on the computer. The vehicles' speed is related to the percentage of right answers in a multiple choice questionnaire (the software). Every question has its own weight depending on the difficulty level of the questionnaire. Via the developed software, each right or wrong answer in the questionnaire increases or decreases the real-time speed of the student's toy vehicle. Moreover, the rate of speed increase or decrease depends on the difficulty level of each question. The aim of the work is to attract the students' interest in a learning process and also to improve their scores. The developed real-time game was tested using independent populations of students of the age groups 8-10, 11-14 and 15-18 years. Standard educational and statistical analysis tools were used for the evaluation analysis of the game. Results reveal that students using the developed real-time control game scored much higher (60%) than students using a traditional simulation game on the same questionnaire. Results further indicate that the students' interest in repeating the developed real-time control game was far higher (70%) than the interest of students using a traditional simulation game.
Keywords: Real time game, sensor, learning games, LabVIEW
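A minimal sketch of the described score-to-speed mechanism is given below; the step sizes, difficulty weights and speed bounds are illustrative assumptions, not the values used in the developed LabVIEW data acquisition system.

```python
def update_speed(speed, correct, difficulty, v_min=0.0, v_max=100.0):
    """Adjust the toy vehicle's speed after one answered question.

    correct:    True/False for the student's answer.
    difficulty: 1 (easy) .. 3 (hard); harder questions change the speed more.
    Returns the new speed, clamped to [v_min, v_max] (illustrative percentages
    of full motor power).
    """
    step = 5.0 * difficulty                 # harder questions weigh more
    speed += step if correct else -step
    return max(v_min, min(v_max, speed))

if __name__ == "__main__":
    speed = 50.0
    for correct, level in [(True, 1), (True, 3), (False, 2), (True, 2)]:
        speed = update_speed(speed, correct, level)
        print(f"correct={correct}, difficulty={level} -> speed {speed:.0f}%")
```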
2171 C-LNRD: A Cross-Layered Neighbor Route Discovery for Effective Packet Communication in Wireless Sensor Network
Authors: K. Kalaikumar, E. Baburaj
Abstract:
One of the problems to be addressed in wireless sensor networks is the set of issues related to cross-layer communication. A cross-layer architecture shares information across the layers, ensuring Quality of Service (QoS). With this shared information, the MAC protocol adapts effective functionality maintenance, such as route selection, to the changing sensor network environment. However, time slot assignment and the neighbour route selection time duration for the cross layer have not been addressed. The time-varying physical layer communication over the cross layer causes a high traffic load in the sensor network. Although the traffic load was reduced using a cross-layer optimization procedure, the computational cost is high. To improve communication efficacy in the sensor network, a self-determined time slot based Cross-Layered Neighbour Route Discovery (C-LNRD) method is presented in this paper. In the presented work, the initial process is to discover the route in the sensor network using Dynamic Source Routing based Medium Access Control (MAC) sub-layers. This process considers MAC layer operation with dynamic route neighbour table discovery. Then, the discovered route path for packet communication employs a Broad Route Distributed Time Slot Assignment method on the Cross-Layered Sensor Network system. Broad Route means time slotting over the varying lengths of the route paths. During packet communication in this sensor network, the transmission of packets is adjusted over different times with varying ranges for controlling the traffic rate. Finally, a Rayleigh fading model is developed in C-LNRD to identify the performance of the sensor network communication structure. The main task of Rayleigh fading is to measure the power level of each communication under the MAC sub-layer. The minimized power level helps to easily reduce the computational cost of packet communication in the sensor network. Experiments are conducted on factors such as the power level during packet communication, neighbour route discovery time, and information (i.e., packet) propagation speed.
Keywords: Medium access control, neighbour route discovery, wireless sensor network, Rayleigh fading, distributed time slot assignment
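The Rayleigh fading step used to gauge per-link power levels can be sketched as follows: a complex Gaussian channel gain yields a Rayleigh-distributed envelope, and averaging |h|^2 scaled by transmit power and a simple path-loss term approximates the mean received power. The path-loss exponent and parameter values are assumptions for illustration, not the model settings of C-LNRD.

```python
import numpy as np

def rayleigh_received_power(p_tx_dbm, distance_m, n_samples=10000, path_loss_exp=3.0, seed=0):
    """Average received power over a Rayleigh-fading link (illustrative model).

    The channel gain h is complex Gaussian with unit average power, so |h| is
    Rayleigh distributed.  Received power = Ptx * |h|^2 / d^alpha, averaged
    over n_samples independent fading realisations.
    """
    rng = np.random.default_rng(seed)
    h = (rng.normal(size=n_samples) + 1j * rng.normal(size=n_samples)) / np.sqrt(2)
    p_tx_mw = 10 ** (p_tx_dbm / 10)
    p_rx_mw = p_tx_mw * np.abs(h) ** 2 / distance_m ** path_loss_exp
    return 10 * np.log10(p_rx_mw.mean())             # average received power in dBm

if __name__ == "__main__":
    for d in (10, 30, 60):
        print(f"d = {d:3d} m -> mean received power {rayleigh_received_power(0.0, d):6.1f} dBm")
```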
2170 Use of Nanoclay in Various Modified Polyolefins
Authors: Michael Tupý, Alice Tesaříková-Svobodová, Dagmar Měřínská, Vít Petránek
Abstract:
Polyethylene (PE), polypropylene (PP), poly(ethylene-vinyl acetate) (EVA) and PE-ionomer nanocomposite samples were prepared by mixing each polymer with the organophilized montmorillonite fillers Cloisite 93A and Dellite 67G. The amount of each modified montmorillonite (MMT) was fixed at 5% (w/w). A twin-screw kneader was used for compounding the polymer matrix and the chosen nanofillers. The level of MMT exfoliation was studied by transmission electron microscopy (TEM) observations. The mechanical properties of the prepared materials were evaluated by dynamic mechanical analysis at 30°C and by the measurement of tensile properties (stress and strain at break).
Keywords: Polyethylene, Polypropylene, Poly(ethylene-vinyl acetate), Clay, Nanocomposite, Montmorillonite.
2169 Assets Integrity Management in Oil and Gas Production Facilities Through Corrosion Mitigation and Inspection Strategy: A Case Study of Sarir Oilfield
Authors: Iftikhar Ahmad, Youssef Elkezza
Abstract:
The Sarir oilfield is located in North Africa. It has oil and gas production facilities. The assets of the Sarir oilfield can be divided into the following five categories: (i) well bores and wellheads; (ii) vessels such as separators, desalters, and gas processing facilities; (iii) pipelines, including all flow lines, trunk lines, and shipping lines; (iv) storage tanks; (v) other assets such as turbines and compressors. The nature of the petroleum industry recognizes the potential human, environmental and financial consequences that can result from failing to maintain the integrity of wellheads, vessels, tanks, pipelines, and other assets. The importance of effective asset integrity management increases as the industry infrastructure continues to age. The primary objective of assets integrity management (AIM) is to maintain assets in a fit-for-service condition while extending their remaining life in the most reliable, safe, and cost-effective manner. Corrosion management is one of the important aspects of successful asset integrity management. It covers corrosion mitigation, monitoring, inspection, and risk evaluation. External corrosion on pipelines, well bores, buried assets, and the bottoms of tanks is controlled with a combination of coatings and cathodic protection, while the external corrosion on surface equipment, wellheads, and storage tanks is controlled by coatings. The periodic cleaning of the pipeline by pigging helps in the prevention of internal corrosion. Further, internal corrosion of pipelines is prevented by chemical treatment and controlled operations. This paper describes the integrity management system used in the Sarir oilfield for its oil and gas production facilities, based on standard practices of corrosion mitigation and inspection.
Keywords: Assets integrity management, corrosion prevention in oilfield assets, corrosion management in oilfield, corrosion prevention and inspection activities.
2168 On Constructing Approximate Convex Hull
Authors: M. Zahid Hossain, M. Ashraful Amin
Abstract:
The algorithms for the convex hull have been extensively studied in the literature, principally because of their wide range of applications in different areas. This article presents an efficient algorithm to construct an approximate convex hull from a set of n points in the plane in O(n + k) time, where k is the approximation error control parameter. The proposed algorithm is suitable for applications that prefer to reduce the computation time in exchange for some loss of accuracy, such as animation and interaction in computer graphics, where rapid and real-time graphics rendering is indispensable.
Keywords: Convex hull, Approximation algorithm, Computational geometry, Linear time.
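The abstract states only the O(n + k) bound; a common way to reach an approximate hull in that spirit is the strip-based scheme of Bentley, Faust and Preparata: split the x-range into k strips, keep the lowest and highest point of each strip (plus the x-extremes), and take the exact hull of that small sample. The sketch below follows that idea and is offered as an illustration under those assumptions, not as the authors' algorithm.

```python
import numpy as np

def cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Exact hull (Andrew's monotone chain) of a small point set, CCW order."""
    pts = sorted(set(map(tuple, points)))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def approx_hull(points, k=32):
    """Approximate hull: bin by x into k strips, keep per-strip y-extremes."""
    pts = np.asarray(points, dtype=float)
    x_min, x_max = pts[:, 0].min(), pts[:, 0].max()
    strips = np.minimum(((pts[:, 0] - x_min) / (x_max - x_min + 1e-12) * k).astype(int), k - 1)
    sample = {tuple(pts[pts[:, 0].argmin()]), tuple(pts[pts[:, 0].argmax()])}
    for s in range(k):
        idx = np.where(strips == s)[0]
        if idx.size:
            sample.add(tuple(pts[idx[pts[idx, 1].argmin()]]))   # lowest point in strip
            sample.add(tuple(pts[idx[pts[idx, 1].argmax()]]))   # highest point in strip
    return convex_hull(list(sample))                            # exact hull of the small sample

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cloud = rng.normal(size=(100000, 2))
    print("approximate hull has", len(approx_hull(cloud, k=64)), "vertices")
```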
2167 Analysis of Trend and Variability of Rainfall in the Mid-Mahanadi River Basin of Eastern India
Authors: Rabindra K. Panda, Gurjeet Singh
Abstract:
The major objective of this study was to analyze the trend and variability of rainfall in the middle Mahanadi river basin located in eastern India. The trend of variation of extreme rainfall events has a predominant effect on agricultural water management and on extreme hydrological events such as floods and droughts. The Mahanadi river basin is one of the major river basins of India, having an area of 141,589 km2 and divided into three regions: the upper, middle and delta regions. The middle region of the Mahanadi river basin has an area of 48,700 km2 and is mostly dominated by agricultural land, where agriculture is mostly rainfed. The study region has five agro-climatic zones, namely: East and South Eastern Coastal Plain, North Eastern Ghat, Western Undulating Zone, Western Central Table Land and Mid Central Table Land, which were numbered as zones 1 to 5 respectively for convenience in reporting. In the present study, an analysis of the variability and trends of annual, seasonal, and monthly rainfall was carried out using the daily rainfall data collected from the Indian Meteorological Department (IMD) for 35 years (1979-2013) for the 5 agro-climatic zones. The long-term variability of rainfall was investigated by evaluating the mean, standard deviation and coefficient of variation. The long-term trend of rainfall was analyzed using the Mann-Kendall test on monthly, seasonal and annual time scales. It was found that there is a decreasing trend in the rainfall during the winter and pre-monsoon seasons for zones 2, 3 and 4, whereas in the monsoon (rainy) season there is an increasing trend for zones 1, 4 and 5 with a level of significance ranging between 90% and 95%. On the other hand, the mean annual rainfall has an increasing trend at the 99% significance level. The estimated seasonality index showed that the rainfall distribution is asymmetric and distributed over a 3-4 month period. The study will help to understand the spatio-temporal variation of rainfall and to determine the correlation between the current rainfall trend and the climate change scenario of the study region for multifarious use.
Keywords: Eastern India, long-term variability and trends, Mann-Kendall test, seasonality index, spatio-temporal variation.
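The Mann-Kendall statistic used for the trend analysis can be computed directly from a series; a minimal sketch (without the tie correction that the full test applies) is given below, run on synthetic rainfall values rather than the IMD data.

```python
import numpy as np
from math import erfc, sqrt

def mann_kendall(x):
    """Mann-Kendall trend test (minimal sketch, no tie correction).

    Returns (S, Z, p): the MK statistic, the standardised statistic and the
    two-sided p-value under the no-trend null hypothesis.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = int(np.sum([np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n)]))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / sqrt(var_s)
    elif s < 0:
        z = (s + 1) / sqrt(var_s)
    else:
        z = 0.0
    p = erfc(abs(z) / sqrt(2))          # two-sided p-value from the normal distribution
    return s, z, p

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    years = np.arange(1979, 2014)                        # 35 seasons, as in the study period
    rainfall = 1200 + 3.0 * (years - years[0]) + rng.normal(0, 60, len(years))
    s, z, p = mann_kendall(rainfall)
    print(f"S = {s}, Z = {z:.2f}, p = {p:.4f}  ->", "trend" if p < 0.05 else "no trend")
```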
2166 Distribution Voltage Regulation Under Three-Phase Fault by Using D-STATCOM
Authors: Chaiyut Sumpavakup, Thanatchai Kulworawanichpong
Abstract:
This paper presents the voltage regulation scheme of a D-STATCOM under three-phase faults. It consists of the voltage detection and voltage regulation schemes in the 0dq reference frame. The proposed control strategy uses a proportional controller in which the proportional gain, kp, is appropriately adjusted by using genetic algorithms. To verify its use, a simplified 4-bus test system is simulated by assuming a three-phase fault at bus 4. As a result, the D-STATCOM can restore the load voltage to the desired level within 1.8 ms. This confirms that the proposed voltage regulation scheme performs well under three-phase fault events.
Keywords: D-STATCOM, proportional controller, genetic algorithms.
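A toy version of the tuning step, selecting the proportional gain kp with a small genetic algorithm on a first-order voltage-recovery model, is sketched below. The plant model, fitness measure (integral of absolute error) and GA settings are illustrative assumptions, not the 4-bus test system or GA configuration used in the paper.

```python
import numpy as np

def voltage_error(kp, v_ref=1.0, v_fault=0.6, tau=0.005, dt=1e-4, t_end=0.05):
    """Integral of |v_ref - v| for a toy first-order plant driven by a P controller."""
    v, err = v_fault, 0.0
    for _ in range(int(t_end / dt)):
        u = kp * (v_ref - v)                  # proportional control action
        v += dt * (u - (v - v_fault)) / tau   # crude first-order recovery dynamics
        err += abs(v_ref - v) * dt
    return err

def ga_tune_kp(pop_size=20, generations=30, bounds=(0.1, 50.0), seed=0):
    """Minimal real-coded GA: tournament selection, blend crossover, Gaussian mutation."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(*bounds, pop_size)
    for _ in range(generations):
        fit = np.array([voltage_error(k) for k in pop])
        new = [pop[fit.argmin()]]                            # elitism: keep the best kp
        while len(new) < pop_size:
            a, b = rng.integers(0, pop_size, 2)
            p1 = pop[a] if fit[a] < fit[b] else pop[b]       # tournament selection
            a, b = rng.integers(0, pop_size, 2)
            p2 = pop[a] if fit[a] < fit[b] else pop[b]
            w = rng.uniform()
            child = w * p1 + (1 - w) * p2                    # blend crossover
            child += rng.normal(0, 0.5)                      # Gaussian mutation
            new.append(np.clip(child, *bounds))
        pop = np.array(new)
    fit = np.array([voltage_error(k) for k in pop])
    return pop[fit.argmin()]

if __name__ == "__main__":
    print("GA-selected kp =", round(float(ga_tune_kp()), 2))
```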
2165 The Impact of Temporal Impairment on Quality of Experience (QoE) in Video Streaming: A No Reference (NR) Subjective and Objective Study
Authors: Muhammad Arslan Usman, Muhammad Rehan Usman, Soo Young Shin
Abstract:
Live video streaming is one of the most widely used services among end users, yet it is a big challenge for network operators in terms of quality. The only way to provide an excellent Quality of Experience (QoE) to the end users is continuous monitoring of the live video stream. For this purpose, there are several objective algorithms available that monitor the quality of the video in a live stream. Subjective tests play a very important role in fine-tuning the results of objective algorithms. As human perception is considered to be the most reliable source for assessing the quality of a video stream, subjective tests are conducted in order to develop more reliable objective algorithms. Temporal impairments in a live video stream can have a negative impact on the end users. In this paper, we have conducted subjective evaluation tests on a set of video sequences containing the temporal impairment known as frame freezing. Frame freezing is considered a transmission error as well as a hardware error, which can result in the loss of video frames on the receiving side of a transmission system. In our subjective tests, we have performed tests on videos that contain a single freezing event and also on videos that contain multiple freezing events. We have recorded our subjective test results for all the videos in order to give a comparison of the available No Reference (NR) objective algorithms. Finally, we have shown the performance of the no-reference algorithms used for the objective evaluation of the videos and suggested the algorithm that works better. The outcome of this study shows the importance of QoE and its effect on human perception. The results of the subjective evaluation can serve the purpose of validating objective algorithms.
Keywords: Objective evaluation, subjective evaluation, quality of experience (QoE), video quality assessment (VQA).
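A no-reference check for the frame-freezing impairment studied here can be as simple as flagging runs of (near-)identical consecutive frames; the sketch below works on raw grayscale frame arrays and uses an assumed difference threshold, so it is only a simplified stand-in for the NR algorithms compared in the paper.

```python
import numpy as np

def detect_freezes(frames, diff_threshold=0.5, min_run=2):
    """Return (start_index, length) of frame-freeze events in a video.

    frames: iterable of 2-D numpy arrays (grayscale frames).
    A freeze is a run of at least `min_run` consecutive frame pairs whose mean
    absolute difference is below `diff_threshold`.
    """
    events, run_start, run_len = [], None, 0
    prev = None
    for i, f in enumerate(frames):
        if prev is not None and np.mean(np.abs(f.astype(float) - prev.astype(float))) < diff_threshold:
            run_start = i - 1 if run_len == 0 else run_start
            run_len += 1
        else:
            if run_len >= min_run:
                events.append((run_start, run_len + 1))   # +1 counts the first frozen frame
            run_len, run_start = 0, None
        prev = f
    if run_len >= min_run:
        events.append((run_start, run_len + 1))
    return events

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    video = [rng.integers(0, 256, (48, 64), dtype=np.uint8) for _ in range(30)]
    video[10:14] = [video[10]] * 4                        # inject a single 4-frame freeze
    print("freeze events (start, length):", detect_freezes(video))
```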
2164 Self-Organizing Maps in Evolutionary Approach Meant for Dimensioning Routes to the Demand
Authors: J.-C. Créput, A. Koukam, A. Hajjam
Abstract:
We present a non-standard Euclidean vehicle routing problem adding a level of clustering, and we revisit the use of self-organizing maps as a tool which naturally handles such problems. We present how they can be used as a main operator in an evolutionary algorithm to address the two conflicting objectives of minimizing route length and the distance from customers to bus stops, and to deal with capacity constraints. We apply the approach to a real-life case of combined clustering and vehicle routing for the transportation of the 780 employees of an enterprise. Based upon a geographic information system, we discuss the influence of road infrastructures on the solutions generated.
Keywords: Evolutionary algorithm, self-organizing map, clustering and vehicle routing.
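As a reminder of how a self-organizing map can act as a routing operator, the sketch below runs a plain elastic-ring SOM on a set of customer locations so that the ring unfolds into a short tour. The clustering level, capacity constraints and evolutionary loop of the paper are not reproduced; all parameter values are assumptions.

```python
import numpy as np

def som_tour(cities, n_neurons=None, n_iter=4000, seed=0):
    """Elastic-ring SOM for a single-vehicle tour (illustration only)."""
    rng = np.random.default_rng(seed)
    cities = np.asarray(cities, dtype=float)
    m = n_neurons or 3 * len(cities)
    ring = rng.uniform(cities.min(0), cities.max(0), (m, 2))   # random initial ring
    radius, lr = m / 4.0, 0.8
    for _ in range(n_iter):
        city = cities[rng.integers(len(cities))]
        winner = np.argmin(((ring - city) ** 2).sum(1))
        dist = np.minimum(np.abs(np.arange(m) - winner), m - np.abs(np.arange(m) - winner))
        h = np.exp(-(dist ** 2) / (2 * radius ** 2))           # ring-neighbourhood function
        ring += lr * h[:, None] * (city - ring)                # pull neurons toward the city
        radius *= 0.999                                        # decay neighbourhood and learning rate
        lr *= 0.9997
    order = np.argsort([np.argmin(((ring - c) ** 2).sum(1)) for c in cities])
    return order                                               # visiting order of the cities

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    customers = rng.uniform(0, 100, (30, 2))
    tour = som_tour(customers)
    length = sum(np.linalg.norm(customers[tour[i]] - customers[tour[(i + 1) % len(tour)]])
                 for i in range(len(tour)))
    print("SOM tour length:", round(float(length), 1))
```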
2163 Utilization of Industrial Byproducts in Concrete Applications by Adopting Grey Taguchi Method for Optimization
Authors: V. K. Bansal, M. Kumar, P. P. Bansal, A. Batish
Abstract:
This paper presents the results of an experimental investigation carried out to evaluate the effects of the partial replacement of cement and fine aggregate with industrial waste by-products on concrete strength properties. The Grey Taguchi approach has been used to optimize the mix proportions for the desired properties. In this research work, a ternary combination of industrial waste by-products has been used. The experiments have been designed using Taguchi's L9 orthogonal array with four factors having three levels each. The cement was partially replaced by ladle furnace slag (LFS), fly ash (FA) and copper slag (CS) at the 10%, 25% and 40% levels, and the fine aggregate (sand) was partially replaced with electric arc furnace slag (EAFS), iron slag (IS) and glass powder (GP) at the 20%, 30% and 40% levels. Three water to binder ratios, fixed at 0.40, 0.44 and 0.48, were used, and the curing age was fixed at 7, 28 and 90 days. Thus, a series of nine experiments was conducted on the specimens for water to binder ratios of 0.40, 0.44 and 0.48 at 7, 28 and 90 days of the water curing regime. It is evident from the investigations that the Grey Taguchi approach for optimization helps in identifying the factors affecting the final outcomes, i.e. the compressive strength and split tensile strength of concrete. For the materials and the range of parameters used in this research, the present study has established optimum mixes in terms of strength properties. The best possible levels of mix proportions were determined for maximization of the compressive and splitting tensile strength. To verify the results, the optimal mix was produced and tested. This mixture results in higher compressive strength and split tensile strength than the other mixes. The compressive strength and split tensile strength of the optimal mixtures are also compared with those of the control concrete mixtures. The results show that the compressive strength and split tensile strength of concrete made with partial replacement of cement and fine aggregate are higher than those of the control concrete at all ages and w/c ratios. Based on the overall observations, it can be recommended that industrial waste by-products in ternary combinations can effectively be utilized as partial replacements of cement and fine aggregates in all concrete applications.
Keywords: Analysis of variance, ANOVA, compressive strength, concrete, grey Taguchi method, industrial by-products, split tensile strength.
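The optimisation step can be illustrated with the core of the grey relational method: normalise each response (larger-the-better here), compute grey relational coefficients against the ideal sequence, average them into a grade per trial, and read the best level of each factor from the mean grades. The L9 layout is standard, but the strength values below are synthetic placeholders, not the experimental data of the study.

```python
import numpy as np

def grey_relational_grades(responses, zeta=0.5):
    """Grey relational grades for larger-the-better responses.

    responses: (n_trials, n_responses) array, e.g. compressive and split
    tensile strength for each L9 trial.
    """
    r = np.asarray(responses, dtype=float)
    norm = (r - r.min(0)) / (r.max(0) - r.min(0))          # larger-the-better normalisation
    delta = 1.0 - norm                                     # deviation from the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)                              # grade = mean coefficient per trial

if __name__ == "__main__":
    # Synthetic L9 example: columns = [compressive strength, split tensile strength].
    data = np.array([[32.1, 3.1], [35.4, 3.4], [30.8, 2.9],
                     [38.2, 3.6], [33.5, 3.2], [36.0, 3.3],
                     [34.7, 3.5], [31.9, 3.0], [37.1, 3.7]])
    # Standard L9(3^4) layout: factor levels 0..2 for 4 factors, one row per trial.
    l9 = np.array([[0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
                   [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
                   [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0]])
    grades = grey_relational_grades(data)
    for f in range(4):
        means = [grades[l9[:, f] == lvl].mean() for lvl in range(3)]
        print(f"factor {f}: best level = {int(np.argmax(means))} (mean grades {np.round(means, 3)})")
```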
2162 Financial Regulations in the Process of Global Financial Crisis and Macroeconomics Impact of Basel III
Authors: M. Okan Tasar
Abstract:
Basel III (or the Third Basel Accord) is a global regulatory standard on bank capital adequacy, stress testing and market liquidity risk agreed upon by the members of the Basel Committee on Banking Supervision in 2010-2011, and scheduled to be introduced from 2013 until 2018. Basel III is a comprehensive set of reform measures. These measures aim to: (1) improve the banking sector's ability to absorb shocks arising from financial and economic stress, whatever the source; (2) improve risk management and governance; (3) strengthen banks' transparency and disclosures. Similarly, the reforms target: (1) bank-level, or micro-prudential, regulation, which will help raise the resilience of individual banking institutions in periods of stress; (2) macro-prudential regulation of system-wide risks that can build up across the banking sector, as well as the pro-cyclical implication of these risks over time. These two approaches to supervision are complementary, as greater resilience at the individual bank level reduces the risk of system-wide shocks. Concerning the macroeconomic impact of Basel III, the OECD estimates that the medium-term impact of Basel III implementation on GDP growth is in the range of -0.05 percent to -0.15 percent per year. On the other hand, economic output is mainly affected by an increase in bank lending spreads, as banks pass a rise in bank funding costs, due to higher capital requirements, on to their customers. Consequently, the estimated effects on GDP growth assume no active response from monetary policy. The impact of Basel III on economic output could be offset by a reduction (or delayed increase) in monetary policy rates of about 30 to 80 basis points. The aim of this paper is to create a framework based on the recent regulations in order to prevent financial crises. Thus, the experience of overcoming the global financial crisis will contribute to dealing with financial crises that may occur in future periods. In the first part of the paper, the effects of the global crisis on the banking system are examined along with the concept of financial regulation. In the second part, financial regulations and especially Basel III are analyzed. The last section of the paper explores the possible macroeconomic consequences of Basel III.
Keywords: Banking Systems, Basel III, Financial regulation, Global Financial Crisis.
2161 Measuring Business and Information Technology Value in BPR: An Empirical Study in the Japanese Enterprises
Authors: Michiko Miyamoto, Shuhei Kudo, Kayo Iizuka
Abstract:
This paper presents an analysis of the relationship between business and information technology (IT) in business process reengineering (BPR). Firm-level data collected from 258 Japanese firms have been analyzed using structural equation modeling. This analysis was aimed at illuminating the success factors for achieving effective BPR. The analysis focused on management factors (including organizational factors) and the management methods implemented (e.g., balanced scorecard, internal control, etc.). These results would contribute to achieving effective BPR by showing the effective tasks and environment to be focused on.
Keywords: BPR, SEM, IS Success Model, user satisfaction
2160 Life Cycle Assessment of Residential Buildings: A Case Study in Canada
Authors: Venkatesh Kumar, Kasun Hewage, Rehan Sadiq
Abstract:
Residential buildings consume significant amounts of energy and produce large amounts of emissions and waste. However, there is a substantial potential for energy savings in this sector, which needs to be evaluated over the life cycle of residential buildings. The Life Cycle Assessment (LCA) methodology has been employed to study the primary energy use and associated environmental impacts of the different phases (i.e., product, construction, use, end of life, and beyond building life) of residential buildings. Four different alternatives of residential buildings in Vancouver (BC, Canada) with a 50-year lifespan have been evaluated, including a High Rise Apartment (HRA), Low Rise Apartment (LRA), Single family Attached House (SAH), and Single family Detached House (SDH). The life cycle performance of the buildings is evaluated for embodied energy, embodied environmental impacts, operational energy, operational environmental impacts, total life-cycle energy, and total life-cycle environmental impacts. The estimation of operational energy and the LCA are performed using DesignBuilder software and Athena Impact Estimator software, respectively. The study results revealed that, over the life span of the buildings, the relationship between energy use and environmental impacts is identical. LRA is found to be the best alternative in terms of embodied energy use and embodied environmental impacts, while HRA showed the best life-cycle performance in terms of minimum energy use and environmental impacts. A sensitivity analysis has also been carried out to study the influence of building service lifespans of 50, 75, and 100 years on the relative significance of embodied energy and total life-cycle energy. The life-cycle energy requirements for SDH are found to be a significant component among the four types of residential buildings. Overall, the results disclose that the primary operation of these buildings accounts for 90% of the total life-cycle energy, which far outweighs minor differences in embodied effects between the buildings.
Keywords: Building simulation, environmental impacts, life cycle assessment, life cycle energy analysis, residential buildings.
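The aggregation behind these comparisons is worth making explicit: total life-cycle energy is the embodied energy plus the annual operational energy accumulated over the service life, which is why the operational share dominates for long lifespans. The figures in the sketch are placeholders, not results from the study.

```python
def life_cycle_energy(embodied_gj, annual_operational_gj, years=50):
    """Total life-cycle energy and the operational share over the service life."""
    total = embodied_gj + annual_operational_gj * years
    return total, annual_operational_gj * years / total

if __name__ == "__main__":
    # Placeholder values per dwelling (GJ), not figures from the study.
    alternatives = {"HRA": (250, 60), "LRA": (220, 70), "SAH": (300, 85), "SDH": (350, 100)}
    for name, (embodied, annual) in alternatives.items():
        total, op_share = life_cycle_energy(embodied, annual, years=50)
        print(f"{name}: total {total:5.0f} GJ over 50 yr, operational share {op_share:.0%}")
```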
2159 Transmission Line Congestion Management Using Hybrid Fish-Bee Algorithm with Unified Power Flow Controller
Authors: P. Valsalal, S. Thangalakshmi
Abstract:
There is a widespread changeover in the electrical power industry, universally, from the old-style monopolistic structure towards a horizontally distributed competitive structure to meet the demand of rising consumption. When the transmission lines of a deregulated system are incapable of accommodating the entire service needs, the lines are overloaded or congested. The coordinating body between customer and power producer, designated the Independent System Operator (ISO), must lessen the congestion without violating transmission line restrictions. Among the existing approaches for congestion management, the frequently used approaches are rescheduling the generation and curbing load. There is a limit to rescheduling the generators, and further loads may not be served with the prevailing resources unless more private power producers are added to the system at considerably higher cost. Hence, congestion is relaxed by appropriate Flexible AC Transmission Systems (FACTS) devices, which boost the existing transfer capacity of transmission lines. The FACTS device, namely the Unified Power Flow Controller (UPFC), is preferred, and its correct placement is vital: it should be positioned in the most congested line. Hence, the weak line is identified by using a power flow performance index with the new objective function and the proposed hybrid fish-bee algorithm. Further, locating the UPFC in the appropriate line reduces the branch loading and minimizes the voltage deviation. The power transfer capacity of the lines is determined with and without the UPFC in the identified congested line of the IEEE 30-bus system, and the simulated results are compared with prevailing algorithms. It is observed that the transfer capacity of the existing line is increased with the presented algorithm, thus alleviating the congestion.
Keywords: Available line transfer capability, congestion management, FACTS device, hybrid fish-bee algorithm, ISO, UPFC.
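The power flow performance index used to pick the weak (most congested) line is commonly formed from the ratio of line flow to line limit; a minimal sketch of that standard index is given below. The flows, limits and exponent are assumed values, and the hybrid fish-bee optimisation itself is not reproduced.

```python
import numpy as np

def loading_performance_index(p_flow, p_limit, exponent=2):
    """Active-power performance index PI = sum over lines of (P_l / P_l_max)^(2n)."""
    ratio = np.asarray(p_flow, dtype=float) / np.asarray(p_limit, dtype=float)
    return float(np.sum(ratio ** (2 * exponent)))

def most_congested_line(p_flow, p_limit):
    """Index of the line with the highest loading ratio (candidate UPFC location)."""
    ratio = np.asarray(p_flow, dtype=float) / np.asarray(p_limit, dtype=float)
    return int(np.argmax(ratio)), float(ratio.max())

if __name__ == "__main__":
    flows  = [45.0, 88.0, 130.0, 60.0, 75.0]    # MW flowing on each line (illustrative)
    limits = [65.0, 90.0, 125.0, 80.0, 120.0]   # MW thermal limits (illustrative)
    print("system PI :", round(loading_performance_index(flows, limits), 3))
    line, load = most_congested_line(flows, limits)
    print(f"weak line : #{line} at {load:.0%} of its limit")
```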
2158 Extracting Attributes for Twitter Hashtag Communities
Authors: Ashwaq Alsulami, Jianhua Shao
Abstract:
Various organisations often need to understand discussions on social media, such as what the trending topics are and the characteristics of the people engaged in the discussion. A number of approaches have been proposed to extract attributes that would characterise a discussion group. However, these approaches are largely based on supervised learning, and as such they require a large amount of labelled data. We propose an approach in this paper that does not require labelled data, but relies on lexical sources to detect meaningful attributes for online discussion groups. Our findings show an acceptable level of accuracy in detecting attributes for Twitter discussion groups.
Keywords: Attributed community, attribute detection, community, social network.
2157 The Sign in the Communication Process
Authors: S. Pesina, T. Solonchak
Abstract:
In the process of information transmission (concept verbalization) we deal mostly with the substance (contents), and only then pay attention to the form. Recalling events from the remote past, we often cannot exactly reproduce the specific words we heard or pronounced, nor the syntactic structures. We remember events, feelings, images; we recall the general contents of the discourse. The thought gets a specific language form only during the concept verbalization phase. With minimal time for pondering, and depending on the level of language competence, the grammatical and syntactic shaping often occurs automatically with the use of familiar models and stereotypes. This means that the language form adapts itself to the consciousness, and not vice versa.
Keywords: Lexical eidos, phenomenology, noema, polysemantic word, semantic core.
2156 Software Architecture and Support for Patient Tracking Systems in Critical Scenarios
Authors: Gianluca Cornetta, Abdellah Touhafi, David J. Santos, Jose Manuel Vazquez
Abstract:
In this work, a new platform for mobile-health systems is presented. The system's target application is providing decision support to rescue corps or military medical personnel in combat areas. The software architecture relies on a distributed client-server system that manages a hierarchy of wireless ad-hoc networks in which several different types of client operate. Each client is characterized by different hardware and software requirements. The lower hierarchy levels rely on a network of completely custom devices that store clinical information and patient status; these are designed to form an ad-hoc network operating in the 2.4 GHz ISM band and complying with the IEEE 802.15.4 standard (ZigBee). Medical personnel may interact with such devices, which are called MICs (Medical Information Carriers), by means of a PDA (Personal Digital Assistant) or an MDA (Medical Digital Assistant), and transmit the information stored in their local databases as well as issue service requests to the upper hierarchy levels by using the IEEE 802.11a/b/g standard (WiFi). The server acts as a repository that stores both medical evacuation forms and associated events (e.g., a teleconsulting request). All the actors participating in the diagnostic or evacuation process may access this repository asynchronously and update its content or generate new events. The designed system aims to optimise and improve information spreading and flow among all the system components, with the aim of improving both diagnostic quality and the evacuation process.
Keywords: IEEE 802.15.4 (ZigBee), IEEE 802.11a/b/g (WiFi), distributed client-server systems, embedded databases, issue trackers, ad-hoc networks.
2155 Biaxial Testing of Fabrics - A Comparison of Various Testing Methodologies
Authors: O.B. Ozipek, E. Bozdag, E. Sunbuloglu, A. Abdullahoglu, E. Belen, E. Celikkanat
Abstract:
In the textile industry, besides conventional textile products, technical textile goods into which external functional properties have been incorporated are being developed for the technical textile industry. In particular, such products produced with weaving technology are widely preferred in areas such as the sports, geology, medical, automotive, construction and marine sectors. These textile products are exposed to various stresses and large deformations under typical conditions of use. At this point, sufficient and reliable data cannot be obtained with uniaxial tensile tests for the determination of the mechanical properties of such products, due to the mainly biaxial stress state. Therefore, the most preferred method is a biaxial tensile test method and analysis. These tests and analyses are applied to fabrics with different functional features in order to establish the several characteristics and mechanical properties of the product. Planar biaxial tensile, cylindrical inflation and bulge tests are generally required for textile products that are used in the automotive, sailing and sports areas and in the construction industry, to minimize accidents throughout their service life. Airbags, seat belts and car tires in the automotive sector are also subject to the same biaxial stress states, and can be characterized by the same types of experiments. In this study, the various biaxial test methods are compared in accordance with the related research literature. Results with discussions are elaborated, mainly focusing on the design of a biaxial test apparatus to obtain applicable experimental data for developing a finite element model. Sample experimental results on a prototype system are presented.
Keywords: Biaxial Stress, Bulge Test, Cylindrical Inflation, Fabric Testing, Planar Tension.
2154 Exploring the Applicability of a Rapid Health Assessment in India
Authors: Claudia Carbajal, Jija Dutt, Smriti Pahwa, Sumukhi Vaid, Karishma Vats
Abstract:
ASER Centre, the research and assessment arm of Pratham Education Foundation, sees measurement as the first stage of action. ASER uses primary research to push and give empirical foundations to policy discussions at a multitude of levels. At the household level, common citizens use a simple assessment (a floor-level test) to measure learning across rural India. This paper presents evidence on the applicability of the ASER approach to the health sector. A citizen-led assessment was designed and executed that collected information from young mothers with children up to a year of age. The pilot assessments were rolled out in two different models: paid surveyors and student volunteers. The survey covered three geographic areas: 1,239 children in the Jaipur District of Rajasthan, 2,086 in the Rae Bareli District of Uttar Pradesh, and 593 children in the Bhuj Block in Gujarat. The survey tool was designed to study knowledge of health-related issues, daily practices followed by young mothers and access to relevant services and programs. It provides insights on behaviors related to infant and young child feeding practices, child and maternal nutrition and supplementation, water and sanitation, and health services. Moreover, the survey studies the reasons behind behaviors, giving policy-makers actionable pathways to improve the implementation of social sector programs. Although data on health outcomes are available, this approach could provide a rapid annual assessment of health issues with indicators that are easy to understand and act upon, so that measurements do not become an exclusive domain of experts. The results give many insights into early childhood health behaviors and challenges. Around 98% of children are breastfed, and approximately half are not exclusively breastfed (for the first 6 months). Government-established diet diversity guidelines are met for less than 1 out of 10 children. Although most households are satisfied with the quality of drinking water, most tested households had contaminated water.
Keywords: Citizen-led assessment, infant and young child feeding, maternal nutrition, rapid health assessment, supplementation, water and sanitation.
2153 Distributional Impacts of Changes in Value Added Tax Rates in the Czech Republic
Authors: Ondřej Bayer
Abstract:
The paper evaluates the ongoing reform of VAT in the Czech Republic in terms of its impacts on individual households. The main objective is to analyse the impact of the given changes on individual households. The adopted method is based on data related to household consumption by individual household quintiles; the obtained data are subjected to micro-simulation analysis. The results are discussed in terms of vertical tax justice. The results of the analysis reveal that VAT behaves regressively, and that a mere consolidation of the rates at a higher level only increases the regressivity of this tax in the Czech Republic.
Keywords: Consolidation of rates, household quintiles, tax impact, VAT.
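The micro-simulation logic, applying statutory VAT rates to quintile consumption baskets and relating the resulting tax to income, can be sketched as follows; the incomes, spending splits and the 15%/21% rates are illustrative placeholders rather than the Czech household data used in the paper.

```python
def vat_burden(consumption_by_rate, income):
    """Effective VAT burden for one household quintile.

    consumption_by_rate: {vat_rate: spending including VAT} for that quintile.
    Returns (vat_paid, vat_paid / income).
    """
    vat = sum(spend * rate / (1 + rate) for rate, spend in consumption_by_rate.items())
    return vat, vat / income

if __name__ == "__main__":
    # Illustrative quintiles: (net income, spending at a reduced 15% rate, spending at a standard 21% rate).
    quintiles = {"Q1": (180_000, 90_000, 60_000),
                 "Q3": (320_000, 120_000, 130_000),
                 "Q5": (650_000, 150_000, 260_000)}
    for name, (income, reduced, standard) in quintiles.items():
        vat, share = vat_burden({0.15: reduced, 0.21: standard}, income)
        print(f"{name}: VAT paid {vat:8.0f} CZK, {share:.1%} of income")
```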
2152 The Use of S Curves in Technology Forecasting and its Application On 3D TV Technology
Authors: Gizem Intepe, Tufan Koc
Abstract:
S-curves are commonly used in technology forecasting. They show the path of product performance in relation to time or investment in R&D. The S-curve is a useful tool to describe the inflection points and the limit of improvement of a technology. Companies use this information as a basis for their innovation strategies. However, inadequate use and some limitations of this technique lead to problems in decision making. In this paper, technology forecasting and its importance for company-level strategies will first be discussed. Secondly, the S-curve and its place among other forecasting techniques will be introduced. Thirdly, its use in technology forecasting will be discussed based on its advantages, disadvantages and limitations. Finally, an application of the S-curve to 3D TV technology using patent data will be presented and the results will be discussed.
Keywords: Patent analysis, technological forecasting, S-curves, 3D TV
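The S-curve itself is usually a logistic (Pearl) curve fitted to cumulative performance or patent counts; the sketch below fits one with scipy and reads off the saturation limit and the inflection point. The patent counts are synthetic, not the 3D TV data analysed in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, k, t0):
    """Pearl/logistic S-curve: L / (1 + exp(-k (t - t0)))."""
    return L / (1.0 + np.exp(-k * (t - t0)))

if __name__ == "__main__":
    # Synthetic cumulative patent counts per year (illustrative, not real 3D TV data).
    years = np.arange(2000, 2014)
    counts = np.array([5, 9, 16, 30, 55, 95, 160, 250, 360, 470, 560, 620, 655, 675], dtype=float)

    (L, k, t0), _ = curve_fit(logistic, years, counts, p0=[counts.max() * 1.2, 0.5, years.mean()])
    print(f"saturation limit ~{L:.0f} patents, inflection (fastest growth) around {t0:.1f}")
    print("2015 forecast:", round(float(logistic(2015, L, k, t0))))
```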
2151 Comparison of Frequency Estimation Methods for Reflected Signals in Mobile Platforms
Authors: Kathrin Reinhold
Abstract:
Precise frequency estimation methods for pulse-shaped echoes are a prerequisite for determining the relative velocity between sensor and reflector. Signal frequencies are analysed using three different methods: the Fourier Transform, the Chirp Z-Transform and the MUSIC algorithm. Simulations of echoes are performed varying both the noise level and the number of reflecting points. The superposition of echoes with a random initial phase is found to influence the precision of frequency estimation severely for the FFT and MUSIC. The standard deviation of the frequency estimate using the FFT is larger than for MUSIC. However, MUSIC is more noise-sensitive. The distorting effect of superpositions is less pronounced in experimental data.
Keywords: Frequency estimation, pulse-echo-method, superposition, echoes.
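A minimal reproduction of such a comparison, FFT peak picking versus a MUSIC pseudospectrum on a noisy single tone, is sketched below; the signal model and parameters are assumptions, and the Chirp Z-Transform variant is omitted for brevity.

```python
import numpy as np

def fft_estimate(x, fs, zero_pad=8):
    """Frequency of the strongest spectral peak via a zero-padded, windowed FFT."""
    n = len(x) * zero_pad
    spec = np.abs(np.fft.rfft(x * np.hanning(len(x)), n))
    freqs = np.fft.rfftfreq(n, 1 / fs)
    return freqs[np.argmax(spec)]

def music_estimate(x, fs, n_sources=2, m=40, grid=None):
    """Single-tone MUSIC estimate from the sample covariance of overlapping snapshots.

    A real tone occupies a 2-dimensional signal subspace (its +f and -f
    components), hence n_sources=2 by default.
    """
    snaps = np.array([x[i:i + m] for i in range(len(x) - m)])
    r = snaps.conj().T @ snaps / len(snaps)                 # sample covariance matrix (m x m)
    _, vecs = np.linalg.eigh(r)
    noise = vecs[:, : m - n_sources]                        # noise subspace (smallest eigenvalues)
    grid = np.linspace(0, fs / 2, 4000) if grid is None else grid
    k = np.arange(m)
    steering = np.exp(2j * np.pi * np.outer(k, grid) / fs)  # steering vector for each test frequency
    p = 1.0 / np.sum(np.abs(noise.conj().T @ steering) ** 2, axis=0)
    return grid[np.argmax(p)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fs, f_true, n = 10_000.0, 1234.0, 512
    t = np.arange(n) / fs
    echo = np.cos(2 * np.pi * f_true * t) + 0.3 * rng.normal(size=n)   # noisy pulse echo
    print("FFT estimate  :", round(fft_estimate(echo, fs), 1), "Hz")
    print("MUSIC estimate:", round(music_estimate(echo, fs), 1), "Hz")
```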