Search results for: holistic approach to teaching mathematics in secondary school
4320 Multi-Line Flexible Alternating Current Transmission System (FACTS) Controller for Transient Stability Analysis of a Multi-Machine Power System Network
Authors: A. V. Naresh Babu, S. Sivanagaraju
Abstract:
Considerable progress has been achieved in transient stability analysis (TSA) with various FACTS controllers, but all of these controllers are associated with a single transmission line. This paper discusses a new approach, a multi-line FACTS controller, namely the interline power flow controller (IPFC), for TSA of a multi-machine power system network. A mathematical model of the IPFC, termed the power injection model (PIM), is presented and incorporated in the Newton-Raphson (NR) power flow algorithm. The reduced admittance matrix of a multi-machine power system network for a three-phase fault, without and with the IPFC, is then obtained, as required to draw the machine swing curves. A general approach based on the L-index is also discussed to find the best location of the IPFC to reduce the proximity of the power system to instability. Numerical results are presented for two test systems, a 6-bus and an 11-bus system. A MATLAB program was written to plot the variation of the generator rotor angle and speed difference curves without and with the IPFC for TSA, and a simple approach is presented to evaluate the critical clearing time for the test systems. The results obtained without and with the IPFC are compared and discussed.
Keywords: Flexible alternating current transmission system (FACTS), first swing stability, interline power flow controller (IPFC), power injection model (PIM).
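For readers unfamiliar with swing curves, the sketch below numerically integrates the classical swing equation for a single machine against an infinite bus and records the rotor angle trajectory for a fault cleared at a given time. It is only a minimal illustration of what the abstract's swing curves represent; the reactances, inertia and clearing time are made-up values, and the authors' multi-machine reduced-admittance and IPFC injection model is not reproduced.

import numpy as np

def swing_curve(Pm=0.9, E=1.1, V=1.0, X_pre=0.5, X_fault=2.0, X_post=0.6,
                H=5.0, f0=50.0, t_clear=0.15, t_end=2.0, dt=1e-3):
    """Integrate the classical swing equation for a single machine against
    an infinite bus (illustrative values, not taken from the paper)."""
    M = 2 * H / (2 * np.pi * f0)              # inertia coefficient, s^2/rad
    delta = np.arcsin(Pm * X_pre / (E * V))   # pre-fault rotor angle (rad)
    omega = 0.0                               # speed deviation (rad/s)
    t, hist = 0.0, []
    while t < t_end:
        X = X_fault if t < t_clear else X_post
        Pe = E * V / X * np.sin(delta)        # electrical power output
        delta += omega * dt                   # simple Euler integration
        omega += (Pm - Pe) / M * dt
        hist.append((t, np.degrees(delta), omega))
        t += dt
    return np.array(hist)

curve = swing_curve()
print("max rotor angle (deg):", curve[:, 1].max())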
4319 A CDA-Driven Study of World English Series Published by Cengage Heinle
Authors: Mohammad Amin Mozaheb, Jalal Farzaneh Dehkordi, Khojasteh Hosseinzadehpilehvar
Abstract:
English Language Teaching (ELT) is widely promoted across the world, and ELT textbooks play a pivotal role in this process. Since the biases of authors have been an issue of continuing interest to analysts over the past few years, the present study seeks to analyze an ELT textbook using Critical Discourse Analysis (CDA). To obtain the goal of the study, the listening section of a book called World English 3 (new edition) has been analyzed, using content-based analysis, in terms of the cultures and countries it mentions. The analysis indicates biases towards certain cultures. Moreover, some countries are shown as rich and powerful, while others are shown as poor, without considering the history behind them.
Keywords: ELT, textbooks, critical discourse analysis, World English.
4318 MIM: A Species Independent Approach for Classifying Coding and Non-Coding DNA Sequences in Bacterial and Archaeal Genomes
Authors: Achraf El Allali, John R. Rose
Abstract:
A number of competing methodologies have been developed to identify genes and classify DNA sequences into coding and non-coding sequences. This classification process is fundamental in gene finding and gene annotation tools and is one of the most challenging tasks in bioinformatics and computational biology. An information theory measure based on mutual information has shown good accuracy in classifying DNA sequences into coding and non-coding. In this paper we describe a species independent iterative approach that distinguishes coding from non-coding sequences using the mutual information measure (MIM). A set of sixty prokaryotes is used to extract universal training data. To facilitate comparisons with the published results of other researchers, a test set of 51 bacterial and archaeal genomes was used to evaluate MIM. These results demonstrate that MIM produces superior results while remaining species independent.
Keywords: Coding/non-coding classification, entropy, gene recognition, mutual information.
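As a rough illustration of the mutual information idea behind MIM, the sketch below computes the mutual information between nucleotides a fixed distance apart in a DNA string. The distance, the example sequence and the function name are illustrative; the authors' iterative, species-independent training procedure is not reproduced.

from collections import Counter
from math import log2

def mutual_information(seq, k=3):
    """Mutual information between bases k positions apart (illustrative
    simplification of an MIM-style feature, not the authors' code)."""
    pairs = [(seq[i], seq[i + k]) for i in range(len(seq) - k)]
    n = len(pairs)
    joint = Counter(pairs)
    left = Counter(p[0] for p in pairs)
    right = Counter(p[1] for p in pairs)
    mi = 0.0
    for (x, y), c in joint.items():
        pxy = c / n
        px, py = left[x] / n, right[y] / n
        mi += pxy * log2(pxy / (px * py))
    return mi

print(mutual_information("ATGGCGATCGATCGTAGCTAGCTAGGCTAGCATCGATCG", k=3))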
4317 GPU-Accelerated Triangle Mesh Simplification Using Parallel Vertex Removal
Authors: Thomas Odaker, Dieter Kranzlmueller, Jens Volkert
Abstract:
We present an approach to triangle mesh simplification designed to be executed on the GPU. We use a quadric error metric to calculate an error value for each vertex of the mesh and order all vertices based on this value. This step is followed by the parallel removal of a number of vertices with the lowest calculated error values. To allow for the parallel removal of multiple vertices, we use a set of per-vertex boundaries that prevent mesh foldovers even when simplification operations are performed on neighbouring vertices. We execute multiple iterations of calculating the vertex errors, ordering the error values and removing vertices until either a desired number of vertices remains in the mesh or a minimum error value is reached. This parallel approach speeds up the simplification process while maintaining mesh topology and avoiding foldovers at every step of the simplification.
Keywords: Computer graphics, half edge collapse, mesh simplification, precomputed simplification, topology preserving.
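A minimal sketch of the quadric error metric step mentioned above: each vertex accumulates the plane quadrics of its incident triangles, and the cost of a half-edge collapse is the quadric form evaluated at the target neighbour. The GPU parallelization and the per-vertex foldover boundaries are omitted, and all names are illustrative.

import numpy as np
from collections import defaultdict

def plane_quadric(p0, p1, p2):
    """Fundamental quadric K = p p^T of a triangle's supporting plane."""
    n = np.cross(p1 - p0, p2 - p0)
    n = n / np.linalg.norm(n)
    p = np.append(n, -np.dot(n, p0))          # plane coefficients [a, b, c, d]
    return np.outer(p, p)

def half_edge_collapse_errors(vertices, faces):
    """For each vertex, the cheapest quadric error of collapsing it onto one
    of its neighbours (the value used to order vertices for removal)."""
    Q = np.zeros((len(vertices), 4, 4))
    adj = defaultdict(set)
    for i, j, k in faces:
        K = plane_quadric(vertices[i], vertices[j], vertices[k])
        for a in (i, j, k):
            Q[a] += K
        adj[i] |= {j, k}; adj[j] |= {i, k}; adj[k] |= {i, j}
    errors = np.full(len(vertices), np.inf)
    for v, neighbours in adj.items():
        for u in neighbours:
            uh = np.append(vertices[u], 1.0)  # homogeneous target position
            errors[v] = min(errors[v], uh @ Q[v] @ uh)
    return errors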
4316 Hexavalent Chromium Pollution Abatement by use of Scrap Iron
Authors: Marius Gheju, Laura Cocheci
Abstract:
In this study, the reduction of Cr(VI) by scrap iron, a cheap and locally available industrial waste, was investigated in a continuous system. The greater scrap iron efficiency observed for the first two sections of the column filling indicates that most of the reduction process was carried out in the bottom half of the filling. This was ascribed to a constant decrease of the Cr(VI) concentration inside the filling as the water front passes from the bottom to the top end of the column. While the bottom section of the column filling was heavily passivated with secondary mineral phases, the top section was less affected by the passivation process; therefore the column filling would likely ensure the reduction of Cr(VI) for time periods longer than 216 hours. The experimental results indicate that fixed-bed columns packed with scrap iron could be successfully used for the first step of Cr(VI)-polluted wastewater treatment. However, the mass of the scrap iron filling should be carefully estimated, since it significantly affects the Cr(VI) reduction efficiency.
Keywords: Hexavalent chromium, heavy metals, scrap iron, reduction capacity, wastewater treatment.
4315 Computational Modeling in Strategic Marketing
Authors: Petr Cernohorsky, Jan Voracek
Abstract:
Well-developed strategic marketing planning is the essential prerequisite for establishing the right and unique competitive advantage. A typical market, however, is a heterogeneous and decentralized structure with natural involvement of individual or group subjectivity and irrationality. These features cannot be fully expressed with one-shot rigorous formal models based on, e.g., mathematics, statistics or empirical formulas. We present an innovative solution, extending the domain of agent-based computational economics towards the concept of hybrid modeling in service provider and consumer markets such as telecommunications. The behavior of the market is described by two classes of agents - consumer and service provider agents - whose internal dynamics are fundamentally different. Customers are rather free multi-state structures, adjusting behavior and preferences quickly in accordance with time and a changing environment. Producers, on the contrary, are traditionally structured companies with comparable internal processes and specific managerial policies. Their business momentum is higher and their immediate reaction possibilities are limited. This limitation underlines the importance of proper strategic planning as the main process advising managers in time whether to continue with more or less the same business or whether to consider the need for future structural changes that would ensure retention of existing customers or acquisition of new ones.
Keywords: Agent-based computational economics, hybrid modeling, strategic marketing, system dynamics.
4314 Creative Technology as Open Ended Learning Tool: A Case Study of Design School in Malaysia
Authors: Sri Kusumawati Md Daud, Fauzan Mustaffa, Hanafizan Hussain, Md Najib Osman
Abstract:
Does open-ended creative technology have a positive impact on learning design? Although many researchers have examined the impact of technology on design education, there is very little conclusive research on the impact of open-ended use of software on learning design. This paper sought to investigate a group of students' experience of a relatively wide range of software applications within the context of a design project. A typography design project was used to create a learning environment with the aim of inculcating design skills in the learners and increasing their creative problem-solving and critical thinking skills. The methods used in this study were a questionnaire survey and personal observation, focusing on individual and group responses during the completion of the task.
Keywords: Learning Tool, Creative Technology, Software, Software Skills, Typography Design.
4313 Patient-Specific Modeling Algorithm for Medical Data Based on AUC
Authors: Guilherme Ribeiro, Alexandre Oliveira, Antonio Ferreira, Shyam Visweswaran, Gregory Cooper
Abstract:
Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms based on the decision tree paradigm that use AUC as a metric to select an attribute. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the patient-specific decision path (PSDP) entropy-based and CART methods, the AUC-based patient-specific decision path models performed equivalently on area under the ROC curve (AUC). Our results provide support for patient-specific methods being a promising approach for making clinical predictions.
Keywords: Instance-based approach, area under the ROC curve, patient-specific decision path, clinical predictions.
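A sketch of AUC-based attribute selection in the spirit of the abstract: each candidate attribute is scored by the AUC obtained when instances are ranked by the empirical positive rate of their attribute value, and the best-scoring attribute is chosen. The patient-specific decision-path construction itself is not reproduced, and all names and data conventions are illustrative.

import numpy as np
from sklearn.metrics import roc_auc_score

def auc_of_attribute(x, y):
    """AUC obtained when each instance is scored by the empirical positive
    rate of its attribute value; y is a 0/1 numpy array of outcomes."""
    values = np.unique(x)
    pos_rate = {v: y[x == v].mean() for v in values}
    scores = np.array([pos_rate[v] for v in x])
    return roc_auc_score(y, scores)

def select_attribute(X, y, attribute_names):
    """Pick the attribute with the highest AUC as the next split."""
    aucs = {name: auc_of_attribute(X[:, j], y)
            for j, name in enumerate(attribute_names)}
    return max(aucs, key=aucs.get), aucs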
4312 Analysis of Linear Equalizers for Cooperative Multi-User MIMO Based Reporting System
Authors: S. Hariharan, P. Muthuchidambaranathan
Abstract:
In this paper, we consider a multi-user multiple input multiple output (MU-MIMO) based cooperative reporting system for a cognitive radio network. In the reporting network, the secondary users forward the primary user data to the common fusion center (FC). The FC is equipped with linear equalizers and an energy detector to make the decision about the spectrum. The primary user data are considered to be a digital video broadcasting - terrestrial (DVB-T) signal. The sensing channel and the reporting channel are assumed to be additive white Gaussian noise and independent identically distributed Rayleigh fading, respectively. We analyze the detection probability of the MU-MIMO system with linear equalizers and arrive at a closed-form expression for the average detection probability. The system performance is also investigated under various MIMO scenarios through Monte Carlo simulations.
Keywords: Cooperative MU-MIMO, DVB-T, Linear Equalizers.
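For readers unfamiliar with the linear equalizers mentioned above, the sketch below applies textbook zero-forcing and MMSE equalization to a flat MIMO reporting channel and then runs a simple energy detector on the equalized streams. The channel model, detection threshold and parameters are illustrative assumptions, not the authors' closed-form analysis.

import numpy as np

rng = np.random.default_rng(0)
Nt, Nr, N = 2, 4, 1000                 # transmit antennas, receive antennas, samples
sigma2 = 0.1                           # noise variance (illustrative)

H = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)
s = (rng.integers(0, 2, (Nt, N)) * 2 - 1).astype(complex)   # stand-in for the reported signal
noise = np.sqrt(sigma2 / 2) * (rng.standard_normal((Nr, N)) + 1j * rng.standard_normal((Nr, N)))
y = H @ s + noise

# Zero-forcing and MMSE equalizers
W_zf = np.linalg.pinv(H)
W_mmse = np.linalg.inv(H.conj().T @ H + sigma2 * np.eye(Nt)) @ H.conj().T

# Energy detection on the equalized streams: compare average energy to an illustrative threshold
for name, W in (("ZF", W_zf), ("MMSE", W_mmse)):
    s_hat = W @ y
    energy = np.mean(np.abs(s_hat) ** 2)
    print(name, "decision:", "signal present" if energy > 2 * sigma2 else "noise only")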
4311 Proposing Problem-Based Learning as an Effective Pedagogical Technique for Social Work Education
Authors: Christine K. Fulmer
Abstract:
Social work education is competency based in nature. There is an expectation that graduates of social work programs throughout the world are prepared to practice at a level of competence which is beneficial to both the well-being of individuals and the community. Experiential learning is one way to prepare students for competent practice. Problem-Based Learning (PBL) is a form of experiential education that has been successful in a number of disciplines in bridging the gap between theoretical concepts in the classroom and the real world. PBL aligns with the constructivist theoretical approach to learning, which emphasizes the integration of new knowledge with the beliefs students already hold. In addition, the basic tenets of PBL correspond well with the practice behaviors associated with social work practice, including multi-disciplinary collaboration and critical thinking. This paper makes an argument for utilizing PBL in social work education.
Keywords: Constructivist theoretical approach, experiential learning, pedagogy, problem-based learning, social work education.
4310 A Retrospective Analysis of a Professional Learning Community: How Teachers' Capacities Shaped It
Authors: S. Pancucci
Abstract:
The purpose of this paper is to describe the process of setting up a learning community within an elementary school in Ontario, Canada. The description is provided through reflection on and examination of field notes taken during the yearlong training and implementation process. Specifically, the impact of teachers' capacity on the creation of a learning community was of interest. This paper is intended to inform and add to the debate around the tensions that exist in implementing a bottom-up professional development model, like the learning community, in a top-down organizational structure. My reflections on the process illustrate that implementation of the learning community professional development model may be difficult and yet transformative in the professional lives of the teachers, students, and administration involved in the change process. I conclude by suggesting the need for a new model of professional development that requires a transformative shift in power dynamics and a shift in the view of what constitutes effective professional learning.
Keywords: Learning community model, professional development, teacher capacity, teacher leadership.
4309 Multi-Criteria Based Robust Markowitz Model under Box Uncertainty
Authors: Pulak Swain, A. K. Ojha
Abstract:
Portfolio optimization deals with the problem of efficient asset allocation. Risk and expected return are two conflicting criteria in such problems, where the investor prefers the return to be high and the risk to be low. Such problems can be solved using a multi-objective approach. However, the information available for the input parameters is generally ambiguous, and the input values can fluctuate around some nominal values. We cannot ignore the uncertainty in the input values, as it can affect the asset allocation drastically. We therefore use a robust optimization approach for problems in which the input parameters come under box uncertainty. In this paper, we solve the multi-criteria robust problem with the help of the E-constraint method.
Keywords: Portfolio optimization, multi-objective optimization, E-constraint method, box uncertainty, robust optimization.
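As an illustration of the E-constraint idea applied to the mean-variance problem, the sketch below minimizes portfolio variance subject to a lower bound on the worst-case expected return, with the worst case taken over a box around the nominal returns. The data, box half-widths and return bound are made-up values, and this is a simplified stand-in for the authors' formulation.

import numpy as np
from scipy.optimize import minimize

mu = np.array([0.10, 0.12, 0.07])             # nominal expected returns (illustrative)
delta = np.array([0.02, 0.03, 0.01])          # half-widths of the return boxes
Sigma = np.array([[0.09, 0.01, 0.00],
                  [0.01, 0.16, 0.02],
                  [0.00, 0.02, 0.04]])         # covariance matrix (illustrative)
eps = 0.08                                     # required expected return (the epsilon bound)

mu_worst = mu - delta                          # worst-case returns under box uncertainty

def variance(w):
    return w @ Sigma @ w

constraints = [{"type": "eq",   "fun": lambda w: np.sum(w) - 1.0},
               {"type": "ineq", "fun": lambda w: mu_worst @ w - eps}]  # return >= eps
bounds = [(0.0, 1.0)] * len(mu)

res = minimize(variance, x0=np.full(len(mu), 1 / 3),
               bounds=bounds, constraints=constraints)
print("weights:", res.x.round(3), "variance:", round(float(res.fun), 4))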
4308 Effect of Preheating Temperature and Chamber Pressure on the Properties of Porous NiTi Alloy Prepared by SHS Technique
Authors: Wisutmethangoon S., Denmud N., Sikong L.
Abstract:
The fabrication of porous NiTi shape memory alloys (SMAs) from elemental powder compacts was conducted by self-propagating high-temperature synthesis (SHS). The effects of the preheating temperature and the chamber pressure on the combustion characteristics as well as the final morphology and composition of the products were studied. Samples with porosity between 56.4% and 59.0% were obtained with preheating temperatures in the range of 200-300 °C and Ar-gas chamber pressures of 138 and 201 kPa. The pore structures were found to be dissimilar only in the samples processed with different preheating temperatures. The major phase in the porous product is NiTi, with small amounts of the secondary phases NiTi2 and Ni4Ti3. The preheating temperature and the chamber pressure have very little effect on the phase constitution. While the combustion temperature of the sample was notably increased by increasing the preheating temperature, it was only slightly changed by varying the chamber pressure.
Keywords: Combustion synthesis, porous materials, self propagating high temperature synthesis, shape memory alloy.
4307 Speed Sensorless Control with a Linearization by State Feedback of Asynchronous Machine Using a Model Reference Adaptive System
Authors: A. Larabi, M. S. Boucherit
Abstract:
In this paper, we present the association of PI regulators for the speed and stator currents with a control strategy using linearization by state feedback for an induction machine without a speed sensor, together with an adaptation of the rotor resistance. The rotor speed is estimated using the model reference adaptive system (MRAS) approach. This method consists of using two models: the first is the reference model and the second is an adjustable one, in which two components of the stator flux, obtained from the measurement of the currents and stator voltages, are estimated. The estimated rotor speed is then obtained by canceling the difference between the stator flux of the reference model and that of the adjustable one. Satisfactory simulation results are obtained and discussed in this paper to highlight the proposed approach.
Keywords: Asynchronous actuator, PI regulator, adaptive method with reference model, vector control.
4306 Modeling of Bio Scaffolds: Structural and Fluid Transport Characterization
Authors: Sahba Sadir, M. R. A. Kadir, A. Öchsner, M. N. Harun
Abstract:
Scaffolds play a key role in tissue engineering and can be produced in many different ways depending on the applications and the materials used. Most researchers have used an experimental trial-and-error approach to new biomaterials, but computer simulation applied to tissue engineering can offer a more exhaustive approach to test and screen biomaterials. This paper develops scaffold models and computational fluid dynamics simulations that show the value of computer simulation in determining the influence of the geometrical scaffold parameters (porosity, pore size and shape) on the permeability of scaffolds, the velocity magnitude, the pressure drop, the shear stress distribution and level, and the proper design of the scaffold geometry. The dynamic conditions of a micro fluid passing through the scaffold were characterized for tissue engineering applications and for the differentiation of tissues within scaffolds.
Keywords: Scaffold engineering, Tissue engineering, Cellular structure, Biomaterial, Computational fluid dynamics.
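One common way to extract the permeability mentioned in the abstract from such simulations is Darcy's law, k = Q·μ·L / (A·Δp), applied to the computed flow rate and pressure drop. The snippet below is only a back-of-the-envelope sketch with made-up values, not the authors' model.

def darcy_permeability(flow_rate, viscosity, length, area, pressure_drop):
    """Apparent permeability k = Q * mu * L / (A * dP) from a scaffold flow simulation."""
    return flow_rate * viscosity * length / (area * pressure_drop)

# Illustrative values: water-like fluid through a 2 mm scaffold sample
k = darcy_permeability(flow_rate=1e-8,      # m^3/s
                       viscosity=1e-3,      # Pa.s
                       length=2e-3,         # m
                       area=1e-4,           # m^2
                       pressure_drop=50.0)  # Pa
print(f"permeability: {k:.2e} m^2")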
4305 Developing New Media Credibility Scale: A Multidimensional Perspective
Authors: Hanaa Farouk Saleh
Abstract:
The main purposes of this study are to develop a scale that reflects emerging theoretical understandings of new media credibility, based on the evolution of credibility studies in Western research, to identify the determinants of credibility in the media and its components by comparing traditional and new media credibility scales, and to build an accumulative scale to test new media credibility. This approach builds on Western research conceptualizations of media credibility, which focus on four principal components: source (journalist), message (article), medium (newspaper, radio, TV, web, etc.), and organization (owner of the medium), adding user and cultural context as key components to assess new media credibility in particular. This study's value lies in its contribution to the conceptualization and development of new media credibility through the creation of a theoretical measurement tool. Future studies should explore this scale to test new media credibility, which represents a promising new approach in the efforts to define and measure the credibility of all media types.
Keywords: Credibility scale, media credibility components, new media credibility scale, scale development.
4304 Design Process and Real-Time Validation of an Innovative Autonomous Mid-Air Flight and Landing System
Authors: De Lellis E., Di Vito V., Garbarino L., Lai C., Corraro F.
Abstract:
This paper describes the design process and the real-time validation of an innovative autonomous mid-air flight and landing system developed by the Italian Aerospace Research Center in the framework of the Italian national funded project TECVOL (Technologies for the Autonomous Flight). The paper provides an insight into the whole development process of the system under study. In particular, the project framework is illustrated first, then the functional context and the adopted design and testing approach are described, and finally the purpose-designed on-ground validation test rig is addressed in detail. Furthermore, the hardware-in-the-loop validation of the autonomous mid-air flight and landing system by means of the real-time test rig is described and discussed.
Keywords: Autonomous landing, autonomous mid-air flight, design and test approach, real-time hardware-in-the-loop validation
4303 Formosa3: A Cloud-Enabled HPC Cluster in NCHC
Authors: Chin-Hung Li, Te-Ming Chen, Ying-Chuan Chen, Shuen-Tai Wang
Abstract:
This paper proposes a new approach to offering a private cloud service on HPC clusters. In particular, our approach relies on automatically scheduling users' customized environment requests as normal jobs in the batch system. After the virtualization request jobs finish, the guest operating systems are dismissed so that the compute nodes are released again for computing. We present initial work on the innovative integration of an HPC batch system and virtualization tools that aims at coexistence with the minimal interference required by a traditional HPC cluster. Given the design of the initial infrastructure, the proposed effort has the potential to positively impact the synergy model. The results from the experiment show that the goal of provisioning customized cluster environments can indeed be fulfilled by using virtual machines, and efficiency can be improved with proper setup and arrangements.
Keywords: Cloud Computing, HPC Cluster, Private Cloud, Virtualization.
4302 Incineration of Sludge in a Fluidized-Bed Combustor
Authors: Chien-Song Chyang, Yu-Chi Wang
Abstract:
For sludge disposal, incineration is considered to be better than direct burial because of regulations and space limitations in Taiwan. Additionally, burial after incineration can effectively prolong the lifespan of a landfill. Therefore, it is the most satisfactory method for treating sludge at present. Of the various incineration technologies, the fluidized bed incinerator is a suitable choice due to its fuel flexibility. In this work, sludge generated from industrial plants was treated in a pilot-scale vortexing fluidized bed. The moisture content of the sludge was 48.53%, and its LHV was 454.6 kcal/kg. Primary gas and secondary gas were fixed at 3 Nm3/min and 1 Nm3/min, respectively. Diesel burners with on-off controllers were used to control the temperature; the bed temperature was set to 750±20 °C, and the freeboard temperature to 850±20 °C. The experimental data show that the NO emission increased with bed temperature. The maximum NO emission was 139 ppm, which is within the regulatory limit. The CO emission was below 100 ppm throughout the operation period. The mean particle size of the fly ash collected from the baghouse decreased with operating time. The ratio of bottom ash to fly ash is about 3. Compared with the bottom ash, the potassium content in the fly ash is much higher, which implies that the potassium content is not the key factor in the aggregation of bottom ash.
Keywords: Sludge incineration, fluidized bed combustion, fly ash, bottom ash.
4301 MIT Automatic ECG Beat Tachycardia Detection Using Artificial Neural Network
Authors: R. Amandi, A. Shahbazi, A. Mohebi, M. Bazargan, Y. Jaberi, P. Emadi, A. Valizade
Abstract:
The application of neural networks to disease diagnosis has made great progress and is widely used by physicians. An electrocardiogram carries vital information about heart activity, and physicians use this signal for cardiac disease diagnosis, which was the main motivation for our study. In our work, the obtained tachycardia features are used for training and testing a neural network. In this study we use fuzzy probabilistic neural networks as an automatic technique for ECG signal analysis. As every real signal recorded by the equipment can have different artifacts, we needed to perform some preprocessing steps before feeding it to our system. The wavelet transform is used for extracting the morphological parameters of the ECG signal. The outcome of the approach for a variety of arrhythmias shows that the presented approach is superior to previously presented algorithms, with an average accuracy of about 95% for more than 7 tachyarrhythmias.
Keywords: Fuzzy Logic, Probabilistic Neural Network, Tachycardia, Wavelet Transform.
4300 Liver Lesion Extraction with Fuzzy Thresholding in Contrast Enhanced Ultrasound Images
Authors: Abder-Rahman Ali, Adélaïde Albouy-Kissi, Manuel Grand-Brochier, Viviane Ladan-Marcus, Christine Hoeffl, Claude Marcus, Antoine Vacavant, Jean-Yves Boire
Abstract:
In this paper, we present a new segmentation approach for focal liver lesions in contrast enhanced ultrasound imaging. This approach, based on a two-cluster Fuzzy C-Means methodology, considers type-II fuzzy sets to handle uncertainty due to the image modality (presence of speckle noise, low contrast, etc.), and to calculate the optimum inter-cluster threshold. Fine boundaries are detected by a local recursive merging of ambiguous pixels. The method has been tested on a representative database. Compared to both Otsu and type-I Fuzzy C-Means techniques, the proposed method significantly reduces the segmentation errors.
Keywords: Defuzzification, fuzzy clustering, image segmentation, type-II fuzzy sets.
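A plain type-I two-cluster Fuzzy C-Means threshold, sketched below on synthetic grey levels, illustrates the clustering step the approach builds on; the type-II extension, speckle handling and recursive boundary merging described in the abstract are not included, and all parameter values are illustrative.

import numpy as np

def fcm_threshold(pixels, m=2.0, iters=100, tol=1e-5, seed=0):
    """Two-cluster fuzzy C-means on grey levels; returns the inter-cluster threshold."""
    x = np.asarray(pixels, dtype=float).ravel()
    rng = np.random.default_rng(seed)
    u = rng.random((2, x.size))
    u /= u.sum(axis=0)                            # fuzzy memberships
    centers = np.zeros(2)
    for _ in range(iters):
        um = u ** m
        new_centers = um @ x / um.sum(axis=1)     # weighted cluster centres
        d = np.abs(x[None, :] - new_centers[:, None]) + 1e-12
        u_new = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=0))
        if np.max(np.abs(new_centers - centers)) < tol:
            centers = new_centers
            break
        centers, u = new_centers, u_new
    return centers.mean()                          # midpoint of the two centres as threshold

image = np.concatenate([np.random.normal(60, 10, 5000), np.random.normal(160, 15, 5000)])
print("threshold:", round(fcm_threshold(image), 1))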
4299 Further the Future: The Exploratory Study in 3D Animation Marketing Trend and Industry in Thailand
Authors: Pawit Mongkolprasit, Proud Arunrangsiwed
Abstract:
Lately, many media organizations in Thailand have started to produce 3D animation, so the required qualifications of personnel should be identified. As instructors in a school of animation and multimedia, the researchers have to prepare students to suit the needs of industry. The current study used an exploratory research design to establish knowledge about this issue, including the required qualifications of employees and the potential of the animation industry in Thailand. The interview sessions involved three key informants from three well-known organizations. The interview data were used to design a questionnaire for the confirmation phase. The overall results showed that the industry needs individuals with 3D animation skills, computer graphics skills, good communication skills, a high sense of responsibility, and the ability to finish a project on time. Moreover, it was also found that 3D animation is currently involved in various kinds of media, such as films, TV variety shows, TV advertising, online advertising, and mobile applications.
Keywords: Animation, marketing trend, animation industry, Thailand animation.
4298 Case Study Approach Using Scenario Analysis to Analyze Unabsorbed Head Office Overheads
Authors: K. C. Iyer, T. Gupta, Y. M. Bindal
Abstract:
Head office overhead (HOOH) is an indirect cost and is recovered through individual project billings by the contractor. Delay in a project impacts the absorption of the HOOH cost allocated to that particular project and thus diminishes the expected profit of the contractor. This unabsorbed HOOH cost is later claimed by contractors as damages. The subjective nature of the available formulae for computing unabsorbed HOOH is the difficulty that contractors and owners face, and they therefore dispute it. The paper attempts to bring together the rationale of various HOOH formulae by gathering a contractor's HOOH cost data on all of its projects, using a case study approach, and comparing the variations in the HOOH values using scenario analysis. The case study uses project data collected from four construction projects of a contractor in India to calculate unabsorbed HOOH costs with the various available formulae. Scenario analysis provides further variations in HOOH values after considering two independent situations, mainly scope changes and new projects during the delay period. Interestingly, one of the findings of this study reveals that, in spite of HOOH being absorbed by additional works available during the period of delay, a few formulae depict an increase in the value of unabsorbed HOOH, neglecting any absorption by the increase in scope. This indicates that these formulae are inappropriate for use in the case of a change to the scope of work. The results of this study can help both parties decide on an appropriate formula more objectively, considering the events causing the delay on a project and the contractor's position in respect of obtaining new projects.
Keywords: Absorbed and unabsorbed overheads, head office overheads, scenario analysis, scope variation
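To make the formula comparison concrete, the snippet below evaluates one widely cited HOOH formula, the Eichleay formula, on made-up figures. It is only an illustration of how such formulae work and is not presented as the authors' preferred method.

def eichleay_unabsorbed_hooh(contract_billings, total_billings,
                             total_hooh, performance_days, delay_days):
    """Eichleay formula: allocate head office overhead to the contract in
    proportion to billings, convert to a daily rate, and multiply by the delay."""
    allocable = total_hooh * contract_billings / total_billings
    daily_rate = allocable / performance_days
    return daily_rate * delay_days

# Illustrative figures only
claim = eichleay_unabsorbed_hooh(contract_billings=5_000_000,
                                 total_billings=25_000_000,
                                 total_hooh=2_000_000,
                                 performance_days=400,
                                 delay_days=60)
print(f"Unabsorbed HOOH claim: {claim:,.0f}")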
4297 Some Issues of Measurement of Impairment of Non-Financial Assets in the Public Sector
Authors: Mariam Vardiashvili
Abstract:
The economic significance of the asset impairment process is considerable. Impairment reflects the reduction of the future economic benefits or service potential embodied in an asset. The assets owned by public sector entities bring economic benefits or are used for the delivery of free-of-charge services. Consequently, they are classified as cash-generating and non-cash-generating assets. IPSAS 21 - Impairment of Non-Cash-Generating Assets and IPSAS 26 - Impairment of Cash-Generating Assets have been designed with this specificity in mind. When measuring the impairment of assets, it is important to select the relevant methods. For measurement of impaired non-cash-generating assets, IPSAS 21 recommends three methods: the depreciated replacement cost approach, the restoration cost approach, and the service units approach. Value in use of cash-generating assets (as per IPSAS 26) is measured by the discounted value of the cash flows to be received in the future. The article classifies assets in the public sector as non-cash-generating assets and cash-generating assets and also deals with the factors which should be considered when evaluating the impairment of assets. The essence of impairment of non-financial assets and the methods of its measurement are formulated according to IPSAS 21 and IPSAS 26. The main emphasis is put on the different methods of measuring the value in use of impaired cash-generating assets and non-cash-generating assets and on how to select among them. The traditional and the expected cash flow approaches for calculating the discounted value are reviewed. The article also discusses the issues of recognition of an impairment loss and its reflection in financial reporting. The article concludes that, whatever the functional purpose of the impaired asset and whichever method is used for measuring it, the presentation of realistic information regarding the value of the assets should be ensured in financial reporting. In the theoretical development of the issue, the methods of scientific abstraction, analysis and synthesis were used, and the research was carried out with a systemic approach. The research process drew on international accounting standards, theoretical research and publications of Georgian and foreign scientists.
Keywords: Non-cash-generating assets, cash-generating assets, recoverable value, recoverable service amount, value in use.
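The value-in-use measurement for cash-generating assets discussed above is essentially a present-value calculation. The sketch below discounts an illustrative stream of expected cash flows and, for simplicity, uses value in use as the recoverable amount when computing the impairment loss; all figures are made up.

def value_in_use(cash_flows, discount_rate):
    """Present value of expected future cash flows (value in use of a
    cash-generating asset); cash_flows[0] is received one period from now."""
    return sum(cf / (1 + discount_rate) ** (t + 1)
               for t, cf in enumerate(cash_flows))

# Illustrative: five years of expected cash inflows, 8% discount rate
viu = value_in_use([12_000, 12_000, 11_000, 10_000, 9_000], 0.08)
carrying_amount = 48_000
# Recognise a loss when the recoverable amount (approximated here by value in use)
# falls below the carrying amount
impairment_loss = max(0.0, carrying_amount - viu)
print(round(viu, 2), round(impairment_loss, 2))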
4296 Dynamic Threshold Adjustment Approach For Neural Networks
Authors: Hamza A. Ali, Waleed A. J. Rasheed
Abstract:
The use of neural networks for recognition applications is generally constrained by the inflexibility of their inherent parameters after the training phase. This means no adaptation is accommodated for input variations that have any influence on the network parameters. Attempts were made in this work to design a neural network that includes an additional mechanism that adjusts the threshold values according to the input pattern variations. The new approach is based on splitting the whole network into two subnets: a main traditional net and a supportive net. The first deals with the required output of trained patterns with predefined settings, while the second handles output generation dynamically, with a tuning capability for any newly applied input. This tuning comes in the form of an adjustment to the threshold values. Two levels of supportive net were studied; one implements an extended additional layer with an adjustable neuronal threshold setting mechanism, while the second implements an auxiliary net with a traditional architecture that performs dynamic adjustment of the threshold value of the main net, which is constructed in a dual-layer architecture. Experimental results and analysis of the proposed designs are quite satisfactory. The supportive layer approach achieved a recognition rate of over 90%, while the multiple-network technique shows a more effective and acceptable level of recognition. However, this is achieved at the price of network complexity and computation time. Recognition generalization may also be improved by combining the capabilities of all the innate structures with intelligent abilities, at the cost of further advanced learning phases.
Keywords: Classification, Recognition, Neural Networks, Pattern Recognition, Generalization.
4295 High Quality Speech Coding using Combined Parametric and Perceptual Modules
Authors: M. Kulesza, G. Szwoch, A. Czyżewski
Abstract:
A novel approach to speech coding using a hybrid architecture is presented. The advantages of parametric and perceptual coding methods are utilized together in order to create a speech coding algorithm assuring better signal quality than a traditional CELP parametric codec. Two approaches are discussed. One is based on the selection of voiced signal components that are encoded using a parametric algorithm, unvoiced components that are encoded perceptually, and transients that remain unencoded. The second approach uses perceptual encoding of the residual signal in a CELP codec. The algorithm applied for precise transient selection is described. The signal quality achieved using the proposed hybrid codec is compared to the quality of some standard speech codecs.
Keywords: CELP residual coding, hybrid codec architecture, perceptual speech coding, speech codecs comparison.
4294 Intelligent Agent Communication by Using DAML to Build Agent Community Ontology
Authors: Cheng-Hsiung Hung, Hong-Jie Dai, Jason Jen-Yen Chen
Abstract:
This paper presents a new approach for intelligent agent communication based on an ontology for an agent community. The DARPA agent markup language (DAML) is used to build the community ontology. This paper extends the agent management specification by the Foundation for Intelligent Physical Agents (FIPA) to develop an agent role called community facilitator (CF) that manages the community directory and community ontology. The CF helps build the agent community. Precise description of agent services in this community can thus be achieved, which facilitates agent communication. Furthermore, through ontology updates, agents with different ontologies are capable of communicating with each other. An example of an advanced traveler information system is included to illustrate the practicality of this approach.
Keywords: Intelligent agent communication, DARPA agent markup language (DAML), Community ontology, Advanced Traveler Information System (ATIS).
4293 Analyzing Periurban Fringe with Rough Set
Authors: Benedetto Manganelli, Beniamino Murgante
Abstract:
The distinction among urban, periurban and rural areas represents a classical example of uncertainty in land classification. Satellite images, geostatistical analysis and all kinds of spatial data are very useful in urban sprawl studies, but it is important to define precise rules for combining great amounts of data to build complex knowledge about the territory. Rough Set theory may be a useful method to employ in this field. It represents a different mathematical approach to uncertainty by capturing indiscernibility: two different phenomena can be indiscernible in some contexts and classified in the same way when combining the available information about them. This approach has been applied in a case study, comparing the results achieved with both the Map Algebra technique and Spatial Rough Sets. The case study area, Potenza Province, is particularly suitable for the application of this theory, because it includes 100 municipalities with different numbers of inhabitants and morphological features.
Keywords: Land Classification, Map Algebra, Periurban Fringe, Rough Set, Urban Planning, Urban Sprawl.
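The indiscernibility idea described above can be made concrete with rough set lower and upper approximations. The toy decision table below (illustrative records, not the Potenza data) groups municipalities by their attribute values and approximates the "urban" class.

from collections import defaultdict

# Illustrative decision table: name -> ((density_class, growth_class), label)
records = {
    "A": (("high", "fast"), "urban"),
    "B": (("high", "fast"), "periurban"),   # indiscernible from A on these attributes
    "C": (("low",  "slow"), "rural"),
    "D": (("high", "slow"), "urban"),
    "E": (("low",  "slow"), "rural"),
}

# Equivalence classes of the indiscernibility relation induced by the attributes
blocks = defaultdict(set)
for name, (attrs, _) in records.items():
    blocks[attrs].add(name)

target = {n for n, (_, label) in records.items() if label == "urban"}
lower = set().union(*(b for b in blocks.values() if b <= target))    # certainly urban
upper = set().union(*(b for b in blocks.values() if b & target))     # possibly urban
print("lower approximation:", sorted(lower))      # ['D']
print("upper approximation:", sorted(upper))      # ['A', 'B', 'D']
print("boundary (uncertain):", sorted(upper - lower))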
4292 Hybrid Method Using Wavelets and Predictive Method for Compression of Speech Signal
Authors: Karima Siham Aoubid, Mohamed Boulemden
Abstract:
The development of signal compression algorithms is making steady progress. These algorithms are continuously improved by new tools and aim to reduce, on average, the number of bits necessary for the signal representation while minimizing the reconstruction error. This article proposes the compression of an Arabic speech signal by a hybrid method combining the wavelet transform and linear prediction. The adopted approach rests, on the one hand, on the decomposition of the original signal by means of analysis filters, which is followed by the compression stage, and, on the other hand, on the application of an order-5 linear prediction to the compressed signal coefficients. The aim of this approach is the estimation of the prediction error, which is then coded and transmitted. The decoding operation is used to reconstitute the original signal. Thus, an adequate choice of the filter bank used in the transform is necessary to increase the compression rate while keeping the distortion imperceptible from an auditory point of view.
Keywords: Compression, linear prediction analysis, multiresolution analysis, speech signal.
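A minimal sketch of the wavelet-plus-prediction pipeline described above: the signal is decomposed with a wavelet filter bank (PyWavelets is assumed available), an order-5 linear predictor is fitted to each coefficient band by least squares, and the prediction residual, the quantity to be coded, is formed. The wavelet choice, the synthetic frame and the omitted quantization stage are illustrative assumptions, not the authors' settings.

import numpy as np
import pywt

def lpc_residual(x, order=5):
    """Fit an order-`order` linear predictor by least squares and return the residual."""
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a, y - X @ a

# Stand-in speech frame (the paper uses Arabic speech recordings)
t = np.linspace(0, 0.02, 320)
frame = np.sin(2 * np.pi * 180 * t) + 0.3 * np.sin(2 * np.pi * 900 * t)

coeffs = pywt.wavedec(frame, "db4", level=3)             # analysis filter bank
residuals = [lpc_residual(band)[1] for band in coeffs]   # prediction error per band
print([round(float(np.var(r)), 6) for r in residuals])   # residual energy to be coded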
4291 Physical Modeling of Oil Well Fire Extinguishing Using a Turbojet on a Barge
Authors: M. Abbaspour, D. Mansouri, N. Mansouri
Abstract:
There are reports of gas and oil well fires caused by different accidents. Many different methods are used for firefighting in the gas and oil industry. Traditional fire extinguishing techniques face many problems, are usually time consuming, and need a lot of equipment. Besides, they cause damage to facilities and create health and environmental problems. This article proposes an innovative approach to fire extinguishing in the oil and gas industry, especially applicable to burning oil wells located offshore. Fire extinguishment employing a turbojet is a novel approach which can help to extinguish the fire in a short period of time. Divergent and convergent turbojets modeled at laboratory scale, along with a high-pressure flame, were used. Different experiments were conducted to determine the relationship between the output discharges of the turbojet trumpet and the oil wells. The results were corrected, and the relationships between the dimensionless parameters of flame and fire extinguishment distances, as well as the output discharges of the turbojet and oil wells at specified distances, are demonstrated by specific curves.
Keywords: Burning well, fire extinguishment, gas/oil industry, simulation.