Search results for: Organization Level.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3872

2192 An Overview of Some High Order and Multi-Level Finite Difference Schemes in Computational Aeroacoustics

Authors: Appanah Rao Appadu, Muhammad Zaid Dauhoo

Abstract:

In this paper, we have combined some spatial derivatives with the optimised time derivative proposed by Tam and Webb in order to approximate the linear advection equation, which is given by ∂u/∂t + ∂f/∂x = 0. These spatial derivatives are as follows: a standard 7-point 6th-order central difference scheme (ST7), a standard 9-point 8th-order central difference scheme (ST9) and optimised schemes designed by Tam and Webb, Lockard et al., Zingg et al., Zhuang and Chen, and Bogey and Bailly. Thus, these seven different spatial derivatives have been coupled with the optimised time derivative to obtain seven different finite-difference schemes to approximate the linear advection equation. We have analysed the variation of the modified wavenumber and the group velocity, both with respect to the exact wavenumber, for each spatial derivative. The problems considered are the 1-D propagation of a Boxcar function, the propagation of an initial disturbance consisting of a sine and a Gaussian function, and the propagation of a Gaussian profile. It is known that the choice of the CFL number affects the quality of results in terms of dissipation and dispersion characteristics. Based on the numerical experiments performed and the numerical methods used to approximate the linear advection equation, it is observed in this work that the quality of results depends on the choice of the CFL number, even for optimised numerical methods. The errors in the numerical results have been quantified into dispersion and dissipation using a technique devised by Takacs. Also, the quantity Exponential Error for Low Dispersion and Low Dissipation (eeldld) has been computed from the numerical results. Moreover, based on this work, it has been found that the quantity eeldld can be used as a measure of the total error. In particular, the total error is a minimum when eeldld is a minimum.
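
As a minimal illustration of the kind of scheme compared in this abstract, the sketch below advances the linear advection equation ∂u/∂t + β∂u/∂x = 0 with the standard 7-point 6th-order central difference (ST7) in space. The optimised Tam-Webb time discretisation is not reproduced here; classical RK4 is used as a stand-in, and the grid, initial profile and CFL number r = kβ/h are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): 1-D linear advection u_t + beta*u_x = 0
# discretised with the standard 7-point 6th-order central difference (ST7) in space.
# Classical RK4 stands in for the optimised time discretisation so the effect of the
# CFL number r = k*beta/h can be explored.
import numpy as np

def ddx_st7(u, h):
    """6th-order central difference of u on a periodic grid with spacing h."""
    c = np.array([-1.0, 9.0, -45.0, 0.0, 45.0, -9.0, 1.0]) / (60.0 * h)
    return sum(ci * np.roll(u, -s) for ci, s in zip(c, range(-3, 4)))

def advect(u0, beta=1.0, h=0.01, r=0.5, steps=100):
    """March the initial profile forward in time with RK4; r is the CFL number."""
    k = r * h / beta                      # time step implied by the CFL number
    u = u0.copy()
    rhs = lambda v: -beta * ddx_st7(v, h)
    for _ in range(steps):
        k1 = rhs(u); k2 = rhs(u + 0.5*k*k1); k3 = rhs(u + 0.5*k*k2); k4 = rhs(u + k*k3)
        u += k * (k1 + 2*k2 + 2*k3 + k4) / 6.0
    return u

x = np.arange(0.0, 1.0, 0.01)
u0 = np.exp(-200.0 * (x - 0.25) ** 2)     # Gaussian profile test case
u_num = advect(u0, beta=1.0, h=0.01, r=0.5, steps=100)
```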

Keywords: Optimised time derivative, dissipation, dispersion, CFL number. Nomenclature: k: time step; h: spatial step; β: advection velocity; r: CFL/Courant number, r = kβ/h; w: exact wavenumber; θ = wh; n: time level; RPE: relative phase error per unit time step; AFM: modulus of the amplification factor.

2191 A Novel Steganographic Method for Gray-Level Images

Authors: Ahmad T. Al-Taani, Abdullah M. AL-Issa

Abstract:

In this work, we propose a novel steganographic method for hiding information within the spatial domain of a gray-scale image. The proposed approach works by dividing the cover image into blocks of equal size and then embedding the message in the edge of each block, depending on the number of ones in the left four bits of the pixel. The approach was tested on a database consisting of 100 different images. Experimental results, compared with other methods, show that the proposed approach hides a larger amount of information and produces stego-images of good visual quality to the human eye.
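
The abstract specifies the embedding rule only in outline, so the following is a hedged sketch of a block-based LSB embedding in the same spirit: the cover is split into equal blocks and a message bit is written into the least significant bit of a pixel selected by the count of ones in its left four bits. The block size, selection threshold and pixel position are assumptions, not the authors' exact method.

```python
# Illustrative sketch only: a simplified block-based LSB embedding in a gray-scale
# cover image. The pixel-selection rule below is an assumption for demonstration.
import numpy as np

def ones_in_left_nibble(pixel):
    """Count the 1-bits in the four most significant bits of an 8-bit pixel."""
    return bin(int(pixel) >> 4).count("1")

def embed(cover, bits, block=8):
    """Hide a bit string in the LSBs of the first pixel of qualifying blocks."""
    stego = cover.copy()
    it = iter(bits)
    h, w = cover.shape
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            px = stego[r, c]
            if ones_in_left_nibble(px) >= 2:       # assumed selection criterion
                try:
                    bit = next(it)
                except StopIteration:
                    return stego
                stego[r, c] = (px & 0xFE) | bit    # overwrite the least significant bit
    return stego

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
message = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(cover, message)
```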

Keywords: Data Embedding, Cryptography, Watermarking, Steganography, Least Significant Bit, Information Hiding.

2190 Comparing Test Equating by Item Response Theory and Raw Score Methods with Small Sample Sizes on a Study of the ARTé: Mecenas Learning Game

Authors: Steven W. Carruthers

Abstract:

The purpose of the present research is to equate two test forms as part of a study to evaluate the educational effectiveness of the ARTé: Mecenas art history learning game. The researcher applied Item Response Theory (IRT) procedures to calculate item, test, and mean-sigma equating parameters. With the sample size n=134, test parameters indicated “good” model fit but low Test Information Functions and more acute than expected equating parameters. Therefore, the researcher applied equipercentile equating and linear equating to raw scores and compared the equated form parameters and effect sizes from each method. Item scaling in IRT enables the researcher to select a subset of well-discriminating items. The mean-sigma step produces a mean-slope adjustment from the anchor items, which was used to scale the score on the new form (Form R) to the reference form (Form Q) scale. In equipercentile equating, scores are adjusted to align the proportion of scores in each quintile segment. Linear equating produces a mean-slope adjustment, which was applied to all core items on the new form. The study followed a quasi-experimental design with purposeful sampling of students enrolled in a college-level art history course (n=134) and a counterbalancing design to distribute both forms on the pre- and post-tests. The Experimental Group (n=82) was asked to play ARTé: Mecenas online and complete Level 4 of the game within a two-week period; 37 participants completed Level 4. Over the same period, the Control Group (n=52) did not play the game. The researcher examined between-group differences in post-test scores on test Form Q and Form R by full-factorial two-way ANOVA. The raw score analysis indicated a 1.29% direct effect of form, which was statistically non-significant but may be practically significant. The researcher repeated the between-group differences analysis with all three equating methods. For the IRT mean-sigma adjusted scores, form had a direct effect of 8.39%. Mean-sigma equating with a small sample may have resulted in inaccurate equating parameters. Equipercentile equating aligned test means and standard deviations, but the resultant skewness and kurtosis worsened compared to the raw score parameters. Form had a 3.18% direct effect. Linear equating produced the lowest form effect, approaching 0%. Using linearly equated scores, the researcher conducted an ANCOVA to examine the effect size in terms of prior knowledge. The between-group effect size for the Control Group versus the Experimental Group participants who completed the game was 14.39%, with a 4.77% effect size attributed to the pre-test score. Playing and completing the game increased art history knowledge, and individuals with low prior knowledge tended to gain more from pre-test to post-test. Ultimately, researchers should approach test equating based on their theoretical stance on Classical Test Theory and IRT and the respective assumptions. Regardless of the approach or method, test equating requires a representative sample of sufficient size. With small sample sizes, the application of a range of equating approaches can expose item and test features for review, inform interpretation, and identify paths for improving instruments for future study.
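
For readers unfamiliar with the two score transformations mentioned, the sketch below shows the basic arithmetic of linear equating of raw scores and of the mean-sigma scaling constants derived from anchor-item difficulties. The data are hypothetical, and the study's IRT calibration step is not reproduced.

```python
# Minimal illustration (not the study's code) of linear equating of raw scores and of
# the mean-sigma scaling constants computed from anchor-item difficulty parameters.
import numpy as np

def linear_equate(x_new, scores_new, scores_ref):
    """Map a Form R raw score onto the Form Q scale by matching mean and SD."""
    a = np.std(scores_ref, ddof=1) / np.std(scores_new, ddof=1)
    b = np.mean(scores_ref) - a * np.mean(scores_new)
    return a * x_new + b

def mean_sigma_constants(b_anchor_new, b_anchor_ref):
    """Slope/intercept that place new-form IRT b-parameters on the reference scale."""
    A = np.std(b_anchor_ref, ddof=1) / np.std(b_anchor_new, ddof=1)
    B = np.mean(b_anchor_ref) - A * np.mean(b_anchor_new)
    return A, B

# Hypothetical data: raw scores on the two forms and anchor-item difficulties.
form_r = np.array([12, 15, 18, 20, 22, 25])
form_q = np.array([14, 16, 19, 21, 24, 27])
print(linear_equate(18, form_r, form_q))
print(mean_sigma_constants(np.array([-0.6, 0.1, 0.8]), np.array([-0.4, 0.3, 1.0])))
```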

Keywords: Effectiveness, equipercentile equating, IRT, learning games, linear equating, mean-sigma equating.

2189 Evolved Strokes in Non Photo–Realistic Rendering

Authors: Ashkan Izadi, Vic Ciesielski

Abstract:

We describe work with an evolutionary computing algorithm for non-photorealistic rendering of a target image. The renderings are produced by genetic programming. We have used two different types of coloured strokes: "empty triangle" and "filled triangle". We compare the empty and filled triangular strokes to find which one generates more aesthetically pleasing images. We found that the filled triangular strokes achieve better fitness and generate more aesthetic images than the empty triangular strokes.

Keywords: Artificial intelligence, Evolutionary programming, Genetic programming, Non-photorealistic rendering.

2188 Factors Affecting Media Literacy of Early Teenagers

Authors: Khajornjit Bunnag

Abstract:

The purposes of this research are: 1) to study the media literacy of early teenagers, and 2) to study the interaction between gender and timing of media exposure that affects the media literacy of teenagers. The sample of the study included 400 young people aged between 11 and 17 who were living in Bangkok. The data were collected using questionnaires, and a two-way ANOVA was used to analyse them. The results revealed that gender and timing of media exposure affected the media literacy of early teenagers with statistical significance at the 0.05 level.
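
A minimal sketch of the kind of two-way ANOVA reported here (gender x timing of media exposure on a media-literacy score) is given below; the column names and data are hypothetical stand-ins for the questionnaire results.

```python
# Hedged sketch of a gender x timing two-way ANOVA; the dataset is invented.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "literacy": [3.1, 2.8, 3.6, 3.9, 2.5, 3.3, 3.0, 3.7],
    "gender":   ["M", "M", "F", "F", "M", "F", "M", "F"],
    "timing":   ["low", "high", "low", "high", "low", "high", "high", "low"],
})

model = ols("literacy ~ C(gender) * C(timing)", data=df).fit()
table = sm.stats.anova_lm(model, typ=2)   # main effects and interaction, alpha = 0.05
print(table)
```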

Keywords: Gender, Media Literacy, Teenager, Timing of Media Exposure.

2187 Computer-Assisted Management of Building Climate and Microgrid with Model Predictive Control

Authors: Vinko Lešić, Mario Vašak, Anita Martinčević, Marko Gulin, Antonio Starčić, Hrvoje Novak

Abstract:

Accounting for 40% of total world energy consumption, building systems are developing into technically complex, large energy consumers suitable for the application of sophisticated power management approaches that can greatly increase energy efficiency and even make them active energy market participants. A centralized control system for building heating and cooling, managed by economically optimal model predictive control, shows promising results, with an estimated 30% increase in energy efficiency. The research focuses on implementing such a method in a case study performed on two floors of our faculty building, with wireless sensor data acquisition, remotely controlled heating/cooling units and a central climate controller. Building walls are mathematically modeled with their corresponding material types, surface shapes and sizes. The models are then exploited to predict thermal characteristics and changes in different building zones. Exterior influences such as environmental conditions and weather forecasts, occupant behavior and comfort demands are all taken into account in deriving the price-optimal climate control. Finally, a DC microgrid with photovoltaics, a wind turbine, a supercapacitor, batteries and fuel cell stacks is added to make the building a unit capable of active participation in a price-varying energy market. The computational burden of applying model predictive control to such a complex system is relaxed through a hierarchical decomposition of the microgrid and climate control, where the former is designed as the higher hierarchical level with pre-calculated price-optimal power flow control, and the latter as the lower-level control responsible for ensuring thermal comfort and exploiting the optimal supply conditions enabled by microgrid energy flow management. Such an approach is expected to enable more complex building subsystems to be included in order to further increase energy efficiency.
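
The sketch below is a heavily reduced illustration of the price-optimal climate MPC idea: a single zone with a first-order thermal model, a time-varying price profile and comfort bounds, solved with cvxpy. The actual multi-zone building model, microgrid layer and hierarchical decomposition are not reproduced, and all numbers are assumptions.

```python
# Very reduced sketch of price-optimal climate control: one thermal zone, an assumed
# discrete-time model T+ = a*T + b*u + (1-a)*T_out, assumed prices and comfort bounds.
import numpy as np
import cvxpy as cp

N = 24                                                       # prediction horizon (hours)
a, b = 0.9, 0.5                                              # assumed zone dynamics
T_out = 5.0 + 5.0 * np.sin(np.linspace(0, 2 * np.pi, N))     # outdoor forecast (assumed)
price = 0.10 + 0.05 * (np.arange(N) % 12 > 5)                # assumed price profile (EUR/kWh)

u = cp.Variable(N)                                           # heating power
T = cp.Variable(N + 1)                                       # zone temperature trajectory

constraints = [T[0] == 19.0, u >= 0, u <= 10]
for k in range(N):
    constraints += [T[k + 1] == a * T[k] + b * u[k] + (1 - a) * T_out[k]]
constraints += [T[1:] >= 20.0, T[1:] <= 24.0]                # comfort band

problem = cp.Problem(cp.Minimize(price @ u), constraints)    # minimise energy cost
problem.solve()
print(problem.value, u.value[:3])
```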

Keywords: Energy-efficient buildings, Hierarchical model predictive control, Microgrid power flow optimization, Price-optimal building climate control.

2186 Assessment Methods for Surgical Skill

Authors: Siti Nor Zawani Ahmmad, Eileen Su Lee Ming, Yeong Che Fai, Fauzan Khairi bin Che Harun

Abstract:

Increasingly sophisticated technologies can now provide assistance for surgeons to improve surgical performance through various training programs. Equally important to learning skills is the assessment method, as it determines the learning and technical proficiency of a trainee. A consistent and rigorous assessment system will ensure that trainees acquire the specified level of competency prior to certification. This paper reviews the methods currently in use for the assessment of surgical skill, as well as some modern techniques using computer-based measurements and virtual reality systems for more quantitative measurement.

Keywords: assessment, surgical skill, checklist, global rating, virtual reality

2185 Identification of Printed Punjabi Words and English Numerals Using Gabor Features

Authors: Rajneesh Rani, Renu Dhir, G. S. Lehal

Abstract:

Script identification is one of the challenging steps in the development of optical character recognition systems for bilingual or multilingual documents. In this paper, an attempt is made to identify English numerals at the word level in Punjabi documents by using Gabor features. A support vector machine (SVM) classifier with five-fold cross-validation is used to classify the word images. The results obtained are quite encouraging: average accuracy with RBF, polynomial and linear kernel functions is greater than 99%.
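
A brief sketch of the classification stage is given below: an SVM with RBF, polynomial and linear kernels evaluated by five-fold cross-validation, as in the abstract. The Gabor feature matrix and labels are placeholders; the feature-extraction step itself is not shown.

```python
# Sketch of the classification stage only; X stands in for precomputed Gabor features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 48))          # placeholder for 48-D Gabor feature vectors
y = rng.integers(0, 2, size=200)        # 0 = Punjabi word, 1 = English numeral

for kernel in ("rbf", "poly", "linear"):
    scores = cross_val_score(SVC(kernel=kernel), X, y, cv=5)
    print(kernel, scores.mean())
```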

Keywords: Script identification, Gabor features, support vector machines.

2184 Accreditation and Quality Assurance of Nigerian Universities: The Management Imperative

Authors: F. O Anugom

Abstract:

The general functions of a university include, among other things, teaching, research and community service. Universities are recognized as the apex of learning, accumulating and imparting knowledge and skills of all kinds to students to enable them to be productive, earn their living and make optimum contributions to national development. This is equivalent to the production of human capital in the form of high-level manpower needed to administer the educational society, be useful to the society and manage the economy. Quality has become a matter of major importance for university education in Nigeria. Accreditation is the systematic review of educational programs to ensure that acceptable standards of education, scholarship and infrastructure are being maintained. Accreditation ensures that institutions maintain quality. The process is designed to determine whether or not an institution has met or exceeded the published standards for accreditation, and whether it is achieving its mission and stated purposes. Ensuring quality assurance in the accreditation process falls to university management, which justified the need for this study. This study examined accreditation and quality assurance: the management imperative. Three research questions and three hypotheses guided the study. The design was a correlation survey with a population of 2,893 university administrators, out of which 578 heads of department and deans of faculties were sampled. The instrument for data collection was the Programme Accreditation Exercise scale, which showed high reliability. The research questions were answered with Pearson's r statistic, and t-tests were used to test the hypotheses. It was found, among other things, that the quality of accredited programmes depends on the level of funding of universities in Nigeria. It was also indicated that the quality of programme accreditation and the physical facilities of universities in Nigeria are strongly related. It was further revealed that programme accreditation is positively related to staffing in Nigerian universities. Based on the findings of the study, the researcher recommends that academic administrators should be included in the team of those who ensure quality programmes in the universities. Private-sector partnerships should be encouraged to fund programmes so as to ensure programme quality in the universities. Independent agencies should be engaged to monitor the activities of accreditation teams to avoid bias.

Keywords: Accreditation, quality assurance, NUC, physical facilities, staffing.

2183 The Characterisation of TLC NAND Flash Memory, Leading to a Definable Endurance/Retention Trade-Off

Authors: Sorcha Bennett, Joe Sullivan

Abstract:

Triple-Level Cell (TLC) NAND Flash memory at, and below, 20nm (nanometer) is still largely unexplored by researchers, and with the ever more commonplace existence of Flash in consumer and enterprise applications there is a need for such gaps in knowledge to be filled. At the time of writing, there was little published data or literature on TLC, and more specifically reliability testing, with a further emphasis on both endurance and retention. This paper will give an introduction to NAND Flash memory, followed by an overview of the relevant current research on the reliability of Flash memory, along with the planned future work which will provide results to help characterise the reliability of TLC memory.

Keywords: TLC NAND flash memory, reliability, endurance, retention, trade-off, raw flash, patterns.

2182 A Virtual Simulation Environment for a Design and Verification of a GPGPU

Authors: Kwang Y. Lee, Tae R. Park, Jae C. Kwak, Yong S. Koo

Abstract:

When a small H/W IP is designed, we can develop an appropriate verification environment by observing the simulated signal waves, or by using serial test vectors for the fixed output. In the case of the design and verification of a massively parallel processor with multiple IPs, it is difficult to build a verification system with an existing common verification environment and to verify each partial IP. A TestDrive verification environment can build an easy and reliable verification system that produces highly intuitive results by applying Modelsim and SystemVerilog's DPI. It shows many advantages; for example, a high-level design of a GPGPU processor can be migrated to an FPGA board immediately.

Keywords: Virtual Simulation, Verification, IP Design, GPGPU

2181 Stereotype Student Model for an Adaptive e-Learning System

Authors: Ani Grubišić, Slavomir Stankov, Branko Žitko

Abstract:

This paper describes the concept of a stereotype student model in an adaptive knowledge-acquisition e-learning system. The defined knowledge stereotypes are based on the student's proficiency level and on Bloom's knowledge taxonomy. The teacher module is responsible for the whole adaptivity process: the automatic generation of courseware elements, their dynamic selection and sorting, as well as their adaptive presentation using templates for statements and questions. The adaptation of the courseware is realized according to the student's knowledge stereotype.

Keywords: Adaptive e-learning systems, adaptive courseware, stereotypes, Bloom's knowledge taxonomy.

2180 Achieving Design-Stage Elemental Cost Planning Accuracy: Case Study of New Zealand

Authors: Johnson Adafin, James O. B. Rotimi, Suzanne Wilkinson, Abimbola O. Windapo

Abstract:

An aspect of client expenditure management that requires attention is the level of accuracy achievable in design-stage elemental cost planning. This has been a major concern for construction clients and practitioners in New Zealand (NZ). Pre-tender estimating inaccuracies are significantly influenced by the level of risk information available to estimators. Proper cost planning activities should ensure the production of a project’s likely construction costs (initial and final), and subsequent cost control activities should prevent unpleasant consequences of cost overruns, disputes and project abandonment. If risks were properly identified and priced at the design stage, observed variance between design-stage elemental cost plans (ECPs) and final tender sums (FTS) (initial contract sums) could be reduced. This study investigates the variations between design-stage ECPs and FTS of construction projects, with a view to identifying risk factors that are responsible for the observed variance. Data were sourced through interviews, and risk factors were identified by using thematic analysis. Access was obtained to project files from the records of study participants (consultant quantity surveyors), and document analysis was employed in complementing the responses from the interviews. Study findings revealed the discrepancies between ECPs and FTS in the region of -14% and +16%. It is opined in this study that the identified risk factors were responsible for the variability observed. The values obtained from the analysis would enable greater accuracy in the forecast of FTS by Quantity Surveyors. Further, whilst inherent risks in construction project developments are observed globally, these findings have important ramifications for construction projects by expanding existing knowledge on what is needed for reasonable budgetary performance and successful delivery of construction projects. The findings contribute significantly to the study by providing quantitative confirmation to justify the theoretical conclusions generated in the literature from around the world. This therefore adds to and consolidates existing knowledge.

Keywords: Accuracy, design-stage, elemental cost plan, final tender sum, New Zealand.

2179 Real Time Control Learning Game - Speed Race by Learning at the Wheel - Development of Data Acquisition System

Authors: Konstantinos Kalovrektis, Chryssanthi Palazi

Abstract:

Schools today face ever-increasing demands in their attempts to ensure that students are well equipped to enter the workforce and navigate a complex world. Research indicates that computer technology can help support learning, the implementation of various experiments or learning games, and that it is especially useful in developing the higher-order skills of critical thinking, observation, comprehension, implementation, comparison, analysis and active attention to activities such as research, field work, simulations and scientific inquiry. ICT in education supports the learning process by making it more flexible and effective, creating a rich and attractive training environment, and equipping students with knowledge and potential useful for the competitive social environment in which they live. This paper presents the design, the development, and the results of the evaluation analysis of an interactive educational game which uses real electric toy vehicles on a toy race track. When the game starts, each student selects a specific toy vehicle. The students then answer questionnaires on the computer. The vehicles' speed is related to the percentage of right answers in a multiple-choice questionnaire (software), and every question has its own weight depending on the difficulty level of the questionnaire. Via the developed software, each right or wrong answer in the questionnaire increases or decreases the real-time speed of the student's toy vehicle, and the rate of increase or decrease depends on the difficulty level of each question. The aim of the work is to attract students' interest in the learning process and also to improve their scores. The developed real-time game was tested using independent populations of students in the age groups 8-10, 11-14 and 15-18 years. Standard educational and statistical analysis tools were used for the evaluation analysis of the game. Results reveal that students using the developed real-time control game scored much higher (60%) than students using a traditional simulation game on the same questionnaire. Results further indicate that students' interest in repeating the developed real-time control game was far higher (70%) than the interest of students using a traditional simulation game.
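
As an illustration of the scoring-to-speed coupling described above, the sketch below maps each answered question to a speed change weighted by its difficulty level; the specific weights, step size and speed limits are assumptions, since the abstract does not give them.

```python
# Illustrative sketch of the answer-to-speed mapping; all numeric values are assumed.
def update_speed(speed, correct, difficulty, base_step=5.0, v_min=0.0, v_max=100.0):
    """Return the new speed (% of full scale) after one answered question."""
    weights = {1: 0.5, 2: 1.0, 3: 1.5}            # assumed weighting per difficulty level
    delta = base_step * weights[difficulty]
    speed += delta if correct else -delta
    return max(v_min, min(v_max, speed))

speed = 50.0
for correct, level in [(True, 1), (True, 3), (False, 2)]:
    speed = update_speed(speed, correct, level)
    print(speed)
```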

Keywords: Real time game, sensor, learning games, LabVIEW

2178 C-LNRD: A Cross-Layered Neighbor Route Discovery for Effective Packet Communication in Wireless Sensor Network

Authors: K. Kalaikumar, E. Baburaj

Abstract:

One of the problems to be addressed in wireless sensor networks is the set of issues related to cross-layer communication. A cross-layer architecture shares information across layers, ensuring Quality of Service (QoS). With this shared information, the MAC protocol adapts its functionality, such as route selection, to the changing sensor network environment. However, time-slot assignment and the timing of neighbour route selection have not been addressed for the cross layer. Time-varying physical-layer communication over the cross layer causes a high traffic load in the sensor network. Although the traffic load can be reduced using a cross-layer optimization procedure, the computational cost is high. To improve communication efficacy in the sensor network, a self-determined time-slot-based Cross-Layered Neighbour Route Discovery (C-LNRD) method is presented in this paper. In the presented work, the initial step is to discover routes in the sensor network using Dynamic Source Routing based Medium Access Control (MAC) sublayers. This process considers MAC-layer operation with dynamic route neighbour table discovery. Then, the discovered route path for packet communication employs the Broad Route Distributed Time Slot Assignment method on the cross-layered sensor network system; Broad Route means time slotting over route paths of varying length. During packet communication in this sensor network, packet transmissions are adjusted over different times and varying ranges to control the traffic rate. Finally, a Rayleigh fading model is developed in C-LNRD to characterise the performance of the sensor network communication structure; its main task is to measure the power level of each communication under the MAC sublayer. The minimized power level helps to reduce the computational cost of packet communication in the sensor network. Experiments are conducted on factors such as power level in packet communication, neighbour route discovery time, and information (i.e., packet) propagation speed.
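
A small sketch of the Rayleigh-fading power measurement mentioned above is given below: the channel envelope is drawn as the magnitude of a complex Gaussian, and the received power of each transmission is the squared envelope scaled by the transmit power. The parameters are illustrative and not taken from the paper.

```python
# Toy Rayleigh-fading power model; parameters are illustrative assumptions.
import numpy as np

def rayleigh_rx_power(tx_power_mw, n_packets, sigma=1.0, seed=1):
    """Received power (mW) of n_packets transmissions over a flat Rayleigh channel."""
    rng = np.random.default_rng(seed)
    h = rng.normal(0, sigma, n_packets) + 1j * rng.normal(0, sigma, n_packets)
    return tx_power_mw * np.abs(h) ** 2            # squared fading envelope x Tx power

p_rx = rayleigh_rx_power(tx_power_mw=1.0, n_packets=1000)
print(p_rx.mean())                                 # average received power level
```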

Keywords: Medium access control, neighbour route discovery, wireless sensor network, Rayleigh fading, distributed time slot assignment

2177 Use of Nanoclay in Various Modified Polyolefins

Authors: Michael Tupý, Alice Tesaříková-Svobodová, Dagmar Měřínská, Vít Petránek

Abstract:

Polyethylene (PE), polypropylene (PP), ethylene-vinyl acetate copolymer (EVA) and PE-ionomer nanocomposite samples were prepared by mixing the polymer with the organophilized montmorillonite fillers Cloisite 93A and Dellite 67G. The amount of each modified montmorillonite (MMT) was fixed at 5% (w/w). A twin-screw kneader was used for compounding the polymer matrix and the chosen nanofillers. The level of MMT exfoliation was studied by transmission electron microscopy (TEM). The mechanical properties of the prepared materials were evaluated by dynamic mechanical analysis at 30°C and by the measurement of tensile properties (stress and strain at break).

Keywords: Polyethylene, Polypropylene, Polyethylene (vinyl acetate), Clay, Nanocomposite, Montmorillonite.

2176 On Constructing Approximate Convex Hull

Authors: M. Zahid Hossain, M. Ashraful Amin

Abstract:

Algorithms for constructing convex hulls have been extensively studied in the literature, principally because of their wide range of applications in different areas. This article presents an efficient algorithm to construct an approximate convex hull from a set of n points in the plane in O(n + k) time, where k is the approximation error control parameter. The proposed algorithm is suitable for applications that prefer to reduce computation time in exchange for accuracy, such as animation and interaction in computer graphics, where rapid and real-time rendering is indispensable.
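
The abstract does not detail the algorithm, so the sketch below shows one standard way to obtain an approximate hull in O(n + k) time, in the spirit of the Bentley-Faust-Preparata method: the x-range is split into k strips, only the vertical extremes of each strip are kept, and an exact hull is built on that reduced set. It is illustrative, not necessarily the authors' algorithm.

```python
# Strip-based approximate convex hull (Bentley-Faust-Preparata style), for illustration.
import random

def approx_hull(points, k=32):
    """Approximate hull: O(n) strip reduction + exact monotone-chain hull on ~2k points."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    xs = [p[0] for p in points]
    x_min, x_max = min(xs), max(xs)
    width = (x_max - x_min) / k or 1.0
    lo, hi = {}, {}
    for x, y in points:                            # O(n): keep only strip extremes
        i = min(int((x - x_min) / width), k - 1)
        if i not in lo or y < lo[i][1]:
            lo[i] = (x, y)
        if i not in hi or y > hi[i][1]:
            hi[i] = (x, y)
    cand = sorted(set(lo.values()) | set(hi.values()))

    def chain(pts):                                # Andrew's monotone chain half-hull
        out = []
        for p in pts:
            while len(out) >= 2 and cross(out[-2], out[-1], p) <= 0:
                out.pop()
            out.append(p)
        return out

    lower, upper = chain(cand), chain(cand[::-1])
    return lower[:-1] + upper[:-1]

pts = [(random.random(), random.random()) for _ in range(10000)]
print(len(approx_hull(pts, k=64)))
```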

Keywords: Convex hull, Approximation algorithm, Computational geometry, Linear time.

2175 Analysis of Trend and Variability of Rainfall in the Mid-Mahanadi River Basin of Eastern India

Authors: Rabindra K. Panda, Gurjeet Singh

Abstract:

The major objective of this study was to analyze the trend and variability of rainfall in the middle Mahanadi river basin located in eastern India. The trend of variation of extreme rainfall events has a predominant effect on agricultural water management and on extreme hydrological events such as floods and droughts. The Mahanadi river basin is one of the major river basins of India, with an area of 141,589 km2, and is divided into three regions: the upper, middle and delta regions. The middle region of the Mahanadi river basin has an area of 48,700 km2 and is dominated by agricultural land, where agriculture is mostly rainfed. The study region has five agro-climatic zones, namely the East and South Eastern Coastal Plain, North Eastern Ghat, Western Undulating Zone, Western Central Table Land and Mid Central Table Land, numbered as zones 1 to 5 respectively for convenience in reporting. In the present study, an analysis of the variability and trends of annual, seasonal, and monthly rainfall was carried out using daily rainfall data collected from the Indian Meteorological Department (IMD) for 35 years (1979-2013) for the five agro-climatic zones. The long-term variability of rainfall was investigated by evaluating the mean, standard deviation and coefficient of variation. The long-term trend of rainfall was analyzed using the Mann-Kendall test on monthly, seasonal and annual time scales. It was found that there is a decreasing trend in rainfall during the winter and pre-monsoon seasons for zones 2, 3 and 4, whereas in the monsoon (rainy) season there is an increasing trend for zones 1, 4 and 5, with a level of significance ranging between 90 and 95%. On the other hand, the mean annual rainfall has an increasing trend at the 99% significance level. The estimated seasonality index showed that the rainfall distribution is asymmetric and concentrated over a 3-4 month period. The study will help to understand the spatio-temporal variation of rainfall and to determine the correlation between the current rainfall trend and the climate change scenario of the study region for multifarious uses.
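
A compact sketch of the Mann-Kendall trend test (without tie corrections) and of the coefficient of variation used to describe variability is shown below; the rainfall series is a placeholder, not the IMD data.

```python
# Mann-Kendall trend test sketch (no tie correction) plus coefficient of variation.
import math
import statistics

def mann_kendall(series):
    """Return the S statistic, normal-approximation Z and a two-sided p-value."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1) for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = 0.0 if s == 0 else (s - math.copysign(1, s)) / math.sqrt(var_s)
    p = math.erfc(abs(z) / math.sqrt(2))
    return s, z, p

rain = [1100, 1180, 990, 1250, 1300, 1210, 1420, 1380, 1500, 1460]   # mm/year, assumed
s, z, p = mann_kendall(rain)
cv = 100 * statistics.stdev(rain) / statistics.mean(rain)             # coefficient of variation (%)
print(s, z, p, cv)
```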

Keywords: Eastern India, long-term variability and trends, Mann-Kendall test, seasonality index, spatio-temporal variation.

2174 Distribution Voltage Regulation Under Three- Phase Fault by Using D-STATCOM

Authors: Chaiyut Sumpavakup, Thanatchai Kulworawanichpong

Abstract:

This paper presents a voltage regulation scheme for a D-STATCOM under three-phase faults. It consists of voltage detection and voltage regulation schemes in the 0dq reference frame. The proposed control strategy uses a proportional controller whose gain, kp, is appropriately adjusted by using genetic algorithms. To verify its use, a simplified 4-bus test system is simulated by assuming a three-phase fault at bus 4. As a result, the D-STATCOM can restore the load voltage to the desired level within 1.8 ms. This confirms that the proposed voltage regulation scheme performs well under three-phase fault events.
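
The sketch below is a toy version of the proportional voltage-regulation loop described above: the voltage error scaled by kp sets the reactive-current command, and an assumed linear bus-voltage sensitivity stands in for the 4-bus network model. In the paper, kp is tuned by a genetic algorithm; here it is simply fixed.

```python
# Toy proportional voltage-regulation loop; the bus model and gain are illustrative.
def regulate(v_fault_pu, kp, steps=60, sensitivity=0.02, v_ref=1.0):
    """Return the voltage trajectory (p.u.) while the controller acts on the sag."""
    v, iq, trace = v_fault_pu, 0.0, []
    for _ in range(steps):
        iq += kp * (v_ref - v)                 # proportional reactive-current command
        v = v_fault_pu + sensitivity * iq      # assumed linear bus-voltage response
        trace.append(v)
    return trace

trace = regulate(v_fault_pu=0.7, kp=8.0)
print(trace[-1])                               # voltage restored close to 1.0 p.u.
```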

Keywords: D-STATCOM, proportional controller, genetic algorithms.

2173 Self-Organizing Maps in Evolutionary Approach Meant for Dimensioning Routes to the Demand

Authors: J.-C. Créput, A. Koukam, A. Hajjam

Abstract:

We present a non-standard Euclidean vehicle routing problem that adds a level of clustering, and we revisit the use of self-organizing maps as a tool that naturally handles such problems. We present how they can be used as a main operator in an evolutionary algorithm to address the two conflicting objectives of minimizing route length and the distance from customers to bus stops, and to deal with capacity constraints. We apply the approach to a real-life case of combined clustering and vehicle routing for the transportation of the 780 employees of an enterprise. Based on a geographic information system, we discuss the influence of road infrastructures on the solutions generated.
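
As a minimal sketch of the self-organizing map operator referred to above, the code below performs one SOM training pass in which ring-ordered neurons (candidate stops/route positions) are pulled toward customer locations; capacity handling and the surrounding evolutionary algorithm are omitted, and all parameters are assumptions.

```python
# One SOM training pass over customer locations; parameters and topology are assumed.
import numpy as np

def som_epoch(neurons, customers, lr=0.5, radius=3.0, seed=0):
    """Each customer pulls its winner neuron and the winner's ring neighbours."""
    rng = np.random.default_rng(seed)
    m = len(neurons)
    for idx in rng.permutation(len(customers)):
        x = customers[idx]
        winner = int(np.argmin(np.linalg.norm(neurons - x, axis=1)))
        d = np.abs(np.arange(m) - winner)
        d = np.minimum(d, m - d)                    # distance along the ring of neurons
        h = np.exp(-(d ** 2) / (2 * radius ** 2))   # neighbourhood weights
        neurons += lr * h[:, None] * (x - neurons)
    return neurons

customers = np.random.default_rng(1).random((780, 2))   # e.g. employee locations
neurons = np.random.default_rng(2).random((60, 2))       # candidate stop positions on a ring
for epoch in range(30):                                   # shrink learning rate and radius
    neurons = som_epoch(neurons, customers, lr=0.5 * 0.9 ** epoch, radius=3.0 * 0.9 ** epoch)
```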

Keywords: Evolutionary algorithm, self-organizing map, clustering and vehicle routing.

2172 Utilization of Industrial Byproducts in Concrete Applications by Adopting Grey Taguchi Method for Optimization

Authors: V. K. Bansal, M. Kumar, P. P. Bansal, A. Batish

Abstract:

This paper presents the results of an experimental investigation carried out to evaluate the effects of partial replacement of cement and fine aggregate with industrial waste by-products on concrete strength properties. The Grey Taguchi approach has been used to optimize the mix proportions for the desired properties. In this research work, a ternary combination of industrial waste by-products has been used. The experiments have been designed using Taguchi's L9 orthogonal array with four factors having three levels each. The cement was partially replaced by ladle furnace slag (LFS), fly ash (FA) and copper slag (CS) at the 10%, 25% and 40% levels, and the fine aggregate (sand) was partially replaced with electric arc furnace slag (EAFS), iron slag (IS) and glass powder (GP) at the 20%, 30% and 40% levels. Three water to binder ratios, fixed at 0.40, 0.44 and 0.48, were used, and the curing age was fixed at 7, 28 and 90 days. Thus, a series of nine experiments was conducted on the specimens for water to binder ratios of 0.40, 0.44 and 0.48 at 7, 28 and 90 days of the water curing regime. It is evident from the investigations that the Grey Taguchi approach for optimization helps in identifying the factors affecting the final outcomes, i.e. the compressive strength and split tensile strength of concrete. For the materials and the range of parameters used in this research, the present study has established optimum mixes in terms of strength properties. The best possible levels of the mix proportions were determined by maximizing the compressive and split tensile strengths. To verify the results, the optimal mix was produced and tested. This mixture results in higher compressive and split tensile strengths than the other mixes. The compressive strength and split tensile strength of the optimal mixtures are also compared with the control concrete mixtures. The results show that the compressive strength and split tensile strength of concrete made with partial replacement of cement and fine aggregate are higher than those of the control concrete at all ages and w/c ratios. Based on the overall observations, it can be recommended that industrial waste by-products in ternary combinations can effectively be utilized as partial replacements of cement and fine aggregates in all concrete applications.
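
The sketch below illustrates the grey relational step applied to the L9 responses: strengths are normalised as larger-the-better, grey relational coefficients are computed with the usual distinguishing coefficient of 0.5, and their mean gives the grade used to rank the nine mixes. The strength values are placeholders, not the experimental data.

```python
# Grey relational grade for a Taguchi L9 design; response values are placeholders.
import numpy as np

def grey_relational_grade(responses, zeta=0.5):
    """responses: (runs x criteria) array, all larger-the-better."""
    norm = (responses - responses.min(0)) / (responses.max(0) - responses.min(0))
    delta = np.abs(1.0 - norm)                      # deviation from the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)                       # one grade per experimental run

strengths = np.array([[32.1, 3.1], [35.4, 3.4], [30.2, 2.9],
                      [38.0, 3.6], [33.3, 3.2], [36.7, 3.5],
                      [31.5, 3.0], [37.2, 3.7], [34.8, 3.3]])   # 9 runs of the L9 array
grades = grey_relational_grade(strengths)
print(grades.argmax() + 1)                          # run closest to the optimum
```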

Keywords: Analysis of variance, ANOVA, compressive strength, concrete, grey Taguchi method, industrial by-products, split tensile strength.

2171 Financial Regulations in the Process of Global Financial Crisis and Macroeconomics Impact of Basel III

Authors: M. Okan Tasar

Abstract:

Basel III (or the Third Basel Accord) is a global regulatory standard on bank capital adequacy, stress testing and market liquidity risk, agreed upon by the members of the Basel Committee on Banking Supervision in 2010-2011 and scheduled to be introduced from 2013 until 2018. Basel III is a comprehensive set of reform measures. These measures aim to: (1) improve the banking sector's ability to absorb shocks arising from financial and economic stress, whatever the source; (2) improve risk management and governance; and (3) strengthen banks' transparency and disclosures. Similarly, the reforms target: (1) bank-level, or micro-prudential, regulation, which will help raise the resilience of individual banking institutions in periods of stress; and (2) macro-prudential regulation of system-wide risks that can build up across the banking sector, as well as the pro-cyclical amplification of these risks over time. These two approaches to supervision are complementary, as greater resilience at the individual bank level reduces the risk of system-wide shocks. Regarding the macroeconomic impact of Basel III, the OECD estimates that the medium-term impact of Basel III implementation on GDP growth is in the range of -0.05 percent to -0.15 percent per year. On the other hand, economic output is mainly affected by an increase in bank lending spreads, as banks pass a rise in bank funding costs, due to higher capital requirements, on to their customers. Consequently, the estimated effects on GDP growth assume no active response from monetary policy; the impact of Basel III on economic output could be offset by a reduction (or delayed increase) in monetary policy rates of about 30 to 80 basis points. The aim of this paper is to create a framework based on the recent regulations in order to prevent financial crises. Thus, the experience gained in overcoming the global financial crisis can contribute to addressing financial crises that may occur in future periods. In the first part of the paper, the effects of the global crisis on the banking system are examined, together with the concept of financial regulation. In the second part, financial regulations, and Basel III in particular, are analyzed. The last section of this paper explores the possible consequences of the macroeconomic impacts of Basel III.

Keywords: Banking Systems, Basel III, Financial regulation, Global Financial Crisis.

2170 Measuring Business and Information Technology Value in BPR: An Empirical Study in the Japanese Enterprises

Authors: Michiko Miyamoto, Shuhei Kudo, Kayo Iizuka

Abstract:

This paper presents the results of an analysis of the relationship between business and information technology (IT) in business process reengineering (BPR). Data collected from 258 Japanese firms have been analyzed using structural equation modeling. The analysis was aimed at illuminating the success factors for achieving effective BPR, focusing on management factors (including organizational factors) and the management methods implemented (e.g., balanced scorecard, internal control). These results contribute to achieving effective BPR by showing the tasks and environments on which to focus.

Keywords: BPR, SEM, IS Success Model, user satisfaction

2169 Extracting Attributes for Twitter Hashtag Communities

Authors: Ashwaq Alsulami, Jianhua Shao

Abstract:

Various organisations often need to understand discussions on social media, such as what the trending topics are and the characteristics of the people engaged in the discussion. A number of approaches have been proposed to extract attributes that characterise a discussion group. However, these approaches are largely based on supervised learning, and as such they require a large amount of labelled data. In this paper, we propose an approach that does not require labelled data but relies on lexical sources to detect meaningful attributes for online discussion groups. Our findings show an acceptable level of accuracy in detecting attributes for Twitter discussion groups.

Keywords: Attributed community, attribute detection, community, social network.

2168 The Sign in the Communication Process

Authors: S. Pesina, T. Solonchak

Abstract:

In the process of information transmission (concept verbalization) we deal mostly with the substance (contents) and only then pay attention to the form. When recalling events from the remote past, we often cannot exactly reproduce the specific words heard or pronounced, or the syntactic structures. We remember events, feelings, images; we recall the general contents of the discourse. The thought acquires a specific language form only during the concept verbalization phase. With minimal time for pondering, and depending on the level of language competence, the grammatical and syntactic shaping often occurs automatically, using well-known models and stereotypes. This means that the language form adapts itself to the consciousness, and not vice versa.

Keywords: Lexical eidos, phenomenology, noema, polysemantic word, semantic core.

2167 Risk Management Approach for a Secure and Performant Integration of Automated Drug Dispensing Systems in Hospitals

Authors: Hind Bouami, Patrick Millot

Abstract:

Medication dispensing system is a life-critical system whose failure may result in preventable adverse events leading to longer patient stays in hospitals or patient death. Automation has led to great improvements in life-critical systems as it increased safety, efficiency, and comfort. However, critical risks related to medical organization complexity and automated solutions integration can threaten drug dispensing security and performance. Knowledge about the system’s complexity aspects and human machine parameters to control for automated equipment’s security and performance will help operators to secure their automation process and to optimize their system’s reliability. In this context, this study aims to document the operator’s situation awareness about automation risks and parameters involved in automation security and performance. Our risk management approach has been deployed in the North Luxembourg hospital center’s pharmacy, which is equipped with automated drug dispensing systems since 2009. With more than 4 million euros of gains generated, North Luxembourg hospital center’s success story was enabled by the management commitment, pharmacy’s involvement in the implementation and improvement of the automation project, and the close collaboration between the pharmacy and Sinteco’s firm to implement the necessary innovation and organizational actions for automated solutions integration security and performance. An analysis of the actions implemented by the hospital and the parameters involved in automated equipment’s integration security and performance has been made. The parameters to control for automated equipment’s integration security and performance are human aspects (6.25%), technical aspects (50%), and human-machine interaction (43.75%). The implementation of an anthropocentric analysis system before automation would have prevented and optimized the control of risks related to automation.

Keywords: Automated drug delivery systems, hospitals, human-centered automated system, risk management.

2166 Exploring the Applicability of a Rapid Health Assessment in India

Authors: Claudia Carbajal, Jija Dutt, Smriti Pahwa, Sumukhi Vaid, Karishma Vats

Abstract:

ASER Centre, the research and assessment arm of Pratham Education Foundation sees measurement as the first stage of action. ASER uses primary research to push and give empirical foundations to policy discussions at a multitude of levels. At a household level, common citizens use a simple assessment (a floor-level test) to measure learning across rural India. This paper presents the evidence on the applicability of an ASER approach to the health sector. A citizen-led assessment was designed and executed that collected information from young mothers with children up to a year of age. The pilot assessments were rolled-out in two different models: Paid surveyors and student volunteers. The survey covered three geographic areas: 1,239 children in the Jaipur District of Rajasthan, 2,086 in the Rae Bareli District of Uttar Pradesh, and 593 children in the Bhuj Block in Gujarat. The survey tool was designed to study knowledge of health-related issues, daily practices followed by young mothers and access to relevant services and programs. It provides insights on behaviors related to infant and young child feeding practices, child and maternal nutrition and supplementation, water and sanitation, and health services. Moreover, the survey studies the reasons behind behaviors giving policy-makers actionable pathways to improve implementation of social sector programs. Although data on health outcomes are available, this approach could provide a rapid annual assessment of health issues with indicators that are easy to understand and act upon so that measurements do not become an exclusive domain of experts. The results give many insights into early childhood health behaviors and challenges. Around 98% of children are breastfed, and approximately half are not exclusively breastfed (for the first 6 months). Government established diet diversity guidelines are met for less than 1 out of 10 children. Although most households are satisfied with the quality of drinking water, most tested households had contaminated water.

Keywords: Citizen-led assessment, infant and young children feeding, maternal nutrition, rapid health assessment supplementation, water and sanitation.

2165 Distributional Impacts of Changes in Value Added Tax Rates in the Czech Republic

Authors: Ondřej Bayer

Abstract:

The paper evaluates the ongoing reform of VAT in the Czech Republic in terms of its impacts on individual households, the main objective being to analyse the impact of the given changes on individual households. The adopted method is based on data on household consumption by individual household quintiles; the obtained data are subjected to a micro-simulation analysis. Results are discussed in terms of vertical tax justice. The analysis reveals that VAT behaves regressively, and a mere consolidation of rates at a higher level only increases the regressivity of this tax in the Czech Republic.

Keywords: Consolidation of rates, household quintiles, tax impact, VAT.

2164 The Use of S Curves in Technology Forecasting and its Application On 3D TV Technology

Authors: Gizem Intepe, Tufan Koc

Abstract:

S-curves are commonly used in technology forecasting. They show the path of product performance in relation to time or investment in R&D, and they are a useful tool for describing the inflection points and the limit of improvement of a technology. Companies use this information as a basis for their innovation strategies. However, inadequate use and some limitations of this technique lead to problems in decision making. In this paper, technology forecasting and its importance for company-level strategies will first be discussed. Secondly, the S-curve and its place among other forecasting techniques will be introduced. Thirdly, its use in technology forecasting will be discussed based on its advantages, disadvantages and limitations. Finally, an application of S-curves to 3D TV technology using patent data will be presented and the results will be discussed.
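
As a small illustration of the S-curve fitting exercise described here, the sketch below fits a three-parameter logistic curve to a cumulative patent count; the yearly figures are invented, and scipy's curve_fit is used purely for demonstration.

```python
# Logistic S-curve fit to an invented cumulative patent count.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, k, t0):
    """Saturation level L, growth rate k, inflection point t0."""
    return L / (1.0 + np.exp(-k * (t - t0)))

years = np.arange(2000, 2013)
cum_patents = np.array([5, 9, 16, 30, 55, 95, 150, 230, 320, 400, 455, 490, 510])  # assumed

(L, k, t0), _ = curve_fit(logistic, years, cum_patents, p0=[600, 0.5, 2008])
print(L, k, t0)             # estimated saturation limit and inflection year of the technology
```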

Keywords: Patent analysis, Technological forecasting, S-curves, 3D TV

2163 Comparison of Frequency Estimation Methods for Reflected Signals in Mobile Platforms

Authors: Kathrin Reinhold

Abstract:

Precise frequency estimation methods for pulse-shaped echoes are a prerequisite for determining the relative velocity between sensor and reflector. Signal frequencies are analysed using three different methods: the Fourier transform, the Chirp Z-transform and the MUSIC algorithm. Simulations of echoes are performed varying both the noise level and the number of reflecting points. The superposition of echoes with a random initial phase is found to severely influence the precision of frequency estimation for the FFT and MUSIC. The standard deviation of the frequency estimate is larger for the FFT than for MUSIC; however, MUSIC is more noise-sensitive. The distorting effect of superpositions is less pronounced in experimental data.
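
A minimal sketch of the FFT-based estimate, one of the three methods compared, is given below: a noisy pulse-shaped echo is windowed and the frequency of the spectral peak is taken as the estimate. The MUSIC and Chirp Z-transform variants are not reproduced, and the signal parameters are illustrative.

```python
# FFT peak-picking frequency estimate for a synthetic pulse-shaped echo.
import numpy as np

fs, f0, n = 1.0e6, 40.0e3, 1024                     # sample rate, true echo frequency, length
t = np.arange(n) / fs
envelope = np.exp(-((t - 0.5 * n / fs) ** 2) / (2 * (80e-6) ** 2))   # Gaussian pulse shape
echo = envelope * np.sin(2 * np.pi * f0 * t) + 0.1 * np.random.default_rng(3).normal(size=n)

spectrum = np.abs(np.fft.rfft(echo * np.hanning(n)))
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
f_est = freqs[np.argmax(spectrum)]
print(f_est)                                        # close to 40 kHz, limited by bin width
```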

Keywords: Frequency estimation, pulse-echo-method, superposition, echoes.
