Search results for: real time data.
10875 Post Occupancy Life Cycle Analysis of a Green Building Energy Consumption at the University of Western Ontario in London - Canada
Authors: M. Bittencourt, E. K. Yanful, D. Velasquez, A. E. Jungles
Abstract:
The CMLP building was developed to be a model for sustainability, with strategies to reduce water use, energy use and pollution, and to provide a healthy environment for the building occupants. The aim of this paper is to investigate the environmental effects of the energy used by this building. An LCA (life cycle analysis) was carried out to measure the real environmental effects produced by the use of energy. The impact categories most affected by energy use were found to be human health effects and ecotoxicity. Natural gas extraction, uranium milling for nuclear energy production, and blasting for mining and infrastructure construction are the processes contributing the most to emissions in the human health effect category. Comparison of the LCA results of the CMLP building with those of a conventional building showed that the energy used by the CMLP building causes less damage to the environment and human health than that of a conventional building.
Keywords: Environmental impacts, green buildings, life cycle analysis, sustainability.
10874 K-Means for Spherical Clusters with Large Variance in Sizes
Authors: A. M. Fahim, G. Saake, A. M. Salem, F. A. Torkey, M. A. Ramadan
Abstract:
Data clustering is an important data exploration technique with many applications in data mining. The k-means algorithm is well known for its efficiency in clustering large data sets. However, this algorithm is suitable for spherical clusters of similar sizes and densities, and the quality of the resulting clusters decreases when the data set contains spherical clusters with large variance in sizes. In this paper, we introduce a competent procedure to overcome this problem. The proposed method is based on shifting the center of the large cluster toward the small cluster and recomputing the membership of the small cluster points. The experimental results reveal that the proposed algorithm produces satisfactory results.
Keywords: K-means, data clustering, cluster analysis.
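The center-shifting step described in the abstract above can be sketched in a few lines of Python. This is a minimal illustration only, not the authors' exact procedure: the shift factor `alpha` and the use of cluster cardinality to pick the "large" and "small" clusters are assumptions.

```python
import numpy as np

def shift_and_reassign(X, centers, labels, alpha=0.5):
    # After a standard k-means pass: move the center of the largest cluster
    # toward that of the smallest cluster, then recompute the membership of
    # the small cluster's points against all (shifted) centers.
    sizes = np.bincount(labels, minlength=len(centers))
    big, small = np.argmax(sizes), np.argmin(sizes)
    centers = centers.copy()
    centers[big] += alpha * (centers[small] - centers[big])      # shift toward small cluster
    small_pts = np.where(labels == small)[0]
    d = np.linalg.norm(X[small_pts, None, :] - centers[None, :, :], axis=2)
    labels = labels.copy()
    labels[small_pts] = np.argmin(d, axis=1)                     # recompute membership
    return centers, labels
```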
10873 Performance Evaluation of a Limited Round-Robin System
Authors: Yoshiaki Shikata
Abstract:
Performance of a limited Round-Robin (RR) rule is studied in order to clarify the characteristics of a realistic sharing model of a processor. Under the limited RR rule, the processor allocates to each request a fixed amount of time, called a quantum, in a fixed order. The sum of the requests being allocated these quanta is kept below a fixed value. Arriving requests that cannot be allocated quanta because of such a restriction are queued or rejected. Practical performance measures, such as the relationship between the mean sojourn time, the mean number of requests, or the loss probability and the quantum size, are evaluated via simulation. In the evaluation, the requested service time of an arriving request is converted into a quantum number. One of these quanta is included in an RR cycle, which means a series of quanta allocated to each request in a fixed order. The service time of the arriving request can be evaluated using the number of RR cycles required to complete the service, the number of requests receiving service, and the quantum size. Then an increase or decrease in the number of quanta that are necessary before service is completed is reevaluated at the arrival or departure of other requests. Tracking these events and calculations enables us to analyze the performance of our limited RR rule. In particular, we obtain the most suitable quantum size, which minimizes the mean sojourn time, for the case in which the switching time for each quantum is considered.
Keywords: Limited RR rule, quantum, processor sharing, sojourn time, performance measures, simulation, loss probability.
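The kind of evaluation described above can be illustrated with a toy simulation. The Python fragment below is a sketch under assumed parameters, not the author's simulator: requested service times are converted into whole quanta, at most `limit` requests share the processor, each in-service request receives one quantum plus a switching time per RR cycle, and arrivals are admitted at cycle boundaries.

```python
import random

def simulate_limited_rr(lam=0.8, mean_service=1.0, quantum=0.1, switch=0.01,
                        limit=10, horizon=10_000.0, seed=1):
    # Toy model: Poisson arrivals, exponential service demands converted into
    # quanta; requests beyond `limit` wait in a FIFO queue instead of being lost.
    random.seed(seed)
    t, next_arrival = 0.0, random.expovariate(lam)
    in_service, queue, sojourns = [], [], []       # request = [arrival_time, quanta_left]
    while t < horizon:
        if not in_service and next_arrival > t:
            t = next_arrival                       # idle until the next arrival
        while next_arrival <= t:                   # admit arrivals up to time t
            req = [next_arrival,
                   max(1, round(random.expovariate(1.0 / mean_service) / quantum))]
            (in_service if len(in_service) < limit else queue).append(req)
            next_arrival += random.expovariate(lam)
        for req in list(in_service):               # one RR cycle: one quantum each
            t += quantum + switch
            req[1] -= 1
            if req[1] == 0:                        # service completed
                sojourns.append(t - req[0])
                in_service.remove(req)
                if queue:
                    in_service.append(queue.pop(0))
    return sum(sojourns) / max(1, len(sojourns))

print("estimated mean sojourn time:", simulate_limited_rr())
```

Sweeping `quantum` in such a simulation is one way to look for the sojourn-time-minimizing quantum size when switching overhead is included.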
10872 User-Friendly Task Creation Using a CAD Integrated Robotic System on a Real Workcell
Authors: Alireza Changizi, Arash Rezaei, Jamal Muhammad, Jyrki Latokartano, Minna Lanz
Abstract:
Offline programming (OLP) is a simulation-based robot programming method, now widely used in industry, that generates robot motion code from a virtual model of the workcell built in simulation software. In this project, Delmia v5 is used as the simulation software. First, the workcell components were modelled in Catia v5, imported into a process file in Delmia, and placed roughly to form the virtual workcell. The robot was then added to the workcell from the Delmia library. The workcell was calibrated against the real-world workcell to obtain accurate code: tool calibration is the first step of the calibration scheme, after which the workcell equipment can be calibrated using the six-point calibration method. The generated code then needs to be reformatted to match the instruction set of the related controller. In the last stage, I/O signals were set up to accomplish robot cooperation and synchronize their motion. The results show the feasibility of the method and its effect on production line efficiency, and the positive and negative points of the implementation are discussed.
Keywords: Component, robotic, automated, production, offline programming, CAD.
10871 Coaching Leadership Traits Preferences of University and College Athletes
Authors: Idou Keinde
Abstract:
This study examined coaching leadership traits as preferred by athletes of universities and colleges of education located in Lagos State, South West Nigeria. Athletes from two universities (n=99) and two colleges of education (n=92) were involved as the study sample. The Leadership Trait Preference Questionnaire (LTPQ) was used to measure athletes’ preferences. Mean and Spearman rank order statistics were used to analyze the collected data. Results showed that the traits of friendliness and happiness, sense of humour and cheerfulness, and cooperation were most preferred irrespective of type of institution. College of education athletes were found to have higher mean preferences (M=34.54; SD=9.42) for leadership traits than their university counterparts (M=33.64; SD=9.46). A significantly strong relationship (rho = .81; p < 0.05) was found between the preferences of university and college of education athletes. It was recommended that coaches as leaders should from time to time exhibit emotive aspects of themselves to inspire athletes to higher performance.
Keywords: Coaching behavior, coach-athlete relationship, interscholastic games, leadership traits.
10870 Are XBRL-based Financial Reports Better than Non-XBRL Reports? A Quality Assessment
Authors: Zhenkun Wang, Simon S. Gao
Abstract:
Using a scoring system, this paper provides a comparative assessment of data quality between XBRL-formatted financial reports and non-XBRL financial reports. It shows a major improvement in the data quality of XBRL-formatted financial reports. Although XBRL-formatted financial reports did not show much advantage in quality at the beginning, they have lately displayed a large improvement in data quality in almost all aspects. With improved XBRL web data management, presentation and analysis applications, XBRL-formatted financial reports offer much better accessibility, accuracy and timeliness.
Keywords: Data quality, financial report, information, XBRL.
10869 Development of Map of Gridded Basin Flash Flood Potential Index: GBFFPI Map of QuangNam, QuangNgai, DaNang, Hue Provinces
Authors: Le Xuan Cau
Abstract:
Flash floods occur over short rainfall intervals, from 1 hour to 12 hours, in small and medium basins. They typically have two characteristics: large water flow and high flow velocity. A flash flood occurs at a hill valley site (a strip of lowland in the terrain) in a catchment with a large enough contributing area, steep basin slope and heavy rainfall. The risk of flash floods is determined through the Gridded Basin Flash Flood Potential Index (GBFFPI). The Flash Flood Potential Index (FFPI) is determined from a terrain slope flash flood index, a soil erosion flash flood index, a land cover flash flood index, a land use flash flood index and a rainfall flash flood index. To determine the GBFFPI, each cell in a map is considered as the outlet of a water accumulation basin; the GBFFPI of the cell is the basin-average value of the FFPI over the corresponding water accumulation basin. Based on GIS, a tool was developed to compute the GBFFPI using the ArcObjects SDK for .NET. The GBFFPI maps are built in two types: GBFFPI including the rainfall flash flood index (for real-time flash flood warning) and GBFFPI excluding the rainfall flash flood index. The GBFFPI tool can be used to identify high flash flood potential sites in a large region as quickly as possible. The GBFFPI improves on the conventional FFPI: it takes into account the basin response (the interaction of cells) and locates flash flood sites (strips of lowland in the terrain) more accurately, whereas the conventional FFPI considers each cell in isolation. The GBFFPI map of QuangNam, QuangNgai, DaNang and Hue was built and exported to Google Earth. The obtained map demonstrates the scientific basis of the GBFFPI.
Keywords: ArcObjects SDK for .NET, basin average value of FFPI, gridded basin flash flood potential index, GBFFPI map.
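The basin-averaging step can be sketched compactly. In the fragment below, `ffpi` is a flat array of per-cell FFPI values and `receiver[i]` is the index of the cell that cell i drains into (or -1 at an outlet), e.g. derived from a D8 flow-direction grid; this data layout is an assumption for illustration, not the authors' ArcObjects implementation.

```python
import numpy as np

def gridded_basin_ffpi(ffpi, receiver):
    # Each cell is treated as the outlet of its own accumulation basin and is
    # assigned the mean FFPI over all cells draining to it (including itself).
    ffpi = np.asarray(ffpi, dtype=float)
    sums, counts = ffpi.copy(), np.ones(len(ffpi))
    for i in range(len(ffpi)):        # push cell i's FFPI to every downstream cell
        j = receiver[i]
        while j != -1:
            sums[j] += ffpi[i]
            counts[j] += 1
            j = receiver[j]
    return sums / counts              # GBFFPI = basin-average FFPI

# tiny example: cells 0 and 1 drain into cell 2, which drains into outlet cell 3
print(gridded_basin_ffpi([0.2, 0.8, 0.5, 0.1], [2, 2, 3, -1]))
```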
10868 Preparation and Investigation of Photocatalytic Properties of ZnO Nanocrystals: Effect of Operational Parameters and Kinetic Study
Authors: N. Daneshvar, S. Aber, M. S. Seyed Dorraji, A. R. Khataee, M. H. Rasoulifard
Abstract:
ZnO nanocrystals with a mean diameter of 14 nm have been prepared by a precipitation method and examined as a photocatalyst for the UV-induced degradation of the insecticide diazinon as a representative organic pollutant in aqueous solution. The effects of various parameters, such as illumination time, amount of photocatalyst, initial pH and initial concentration of insecticide, on the photocatalytic degradation of diazinon were investigated to find the desired conditions. The desired parameters were also tested for the treatment of real water containing the insecticide. The photodegradation efficiency of diazinon was compared between commercial and prepared ZnO nanocrystals. The results indicated that the UV/ZnO process using the prepared nanocrystalline ZnO offered better electrical energy efficiency and quantum yield than commercial ZnO. On the basis of the Langmuir-Hinshelwood mechanism, the present study yielded a pseudo first-order kinetic model with a surface reaction rate constant of 0.209 mg l-1 min-1 and an adsorption equilibrium constant of 0.124 l mg-1.
Keywords: Zinc oxide nanopowder, electricity consumption, quantum yield, nanoparticles, photodegradation, kinetic model, insecticide.
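The kinetic analysis referred to above follows the standard Langmuir-Hinshelwood rate law; a restatement of that textbook form with the constants reported in the abstract (the linearized reciprocal form shown is the usual way the two constants are extracted, not a detail given in the abstract) is:

```latex
\[
r_0 \;=\; \left.-\frac{dC}{dt}\right|_{t=0} \;=\; \frac{k_r K C_0}{1 + K C_0}
\quad\Longrightarrow\quad
\frac{1}{r_0} \;=\; \frac{1}{k_r K}\,\frac{1}{C_0} \;+\; \frac{1}{k_r},
\qquad k_r = 0.209~\mathrm{mg\,l^{-1}\,min^{-1}},\quad K = 0.124~\mathrm{l\,mg^{-1}}.
\]
```

At low initial concentration (K C_0 << 1) this reduces to the pseudo first-order form ln(C_0/C) = k_r K t, consistent with the pseudo first-order model reported.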
10867 Modeling of Random Variable with Digital Probability Hyper Digraph: Data-Oriented Approach
Authors: A. Habibizad Navin, M. Naghian Fesharaki, M. Mirnia, M. Kargar
Abstract:
In this paper we introduce the Digital Probability Hyper Digraph for modeling a random variable as a hierarchical data-oriented model.
Keywords: Data-oriented models, data structure, digital probability hyper digraph, random variable, statistics and probability.
10866 Implementation of an Innovative Simplified Sliding Mode Observer-Based Robust Fault Detection in a Drum Boiler System
Authors: L. Khoshnevisan, H. R. Momeni, A. Ashraf-Modarres
Abstract:
One approach to designing a robust fault detection filter (RFDF) is based on sliding-mode theory. The main purpose of our study is to introduce an innovative simplified reference residual model generator that formulates the RFDF as a sliding-mode observer without any manipulation package or transformation matrix, through which the generated residual signals can be evaluated. The proposed design is therefore more explicit and requires fewer design parameters than approaches requiring a change of coordinates. To the best of the authors' knowledge, this is the first time that the sliding-mode technique has been applied to detect actuator and sensor faults in a real boiler. The design procedure is applied to a drum boiler in the Synvendska Kraft AB plant in Malmo, Sweden, a multivariable and strongly coupled system. It is demonstrated that both sensor and actuator faults can be robustly detected, and sensor faults can also be diagnosed and isolated through this method.
Keywords: Boiler, fault detection, robustness, simplified sliding-mode observer.
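For orientation, a generic textbook sliding-mode observer used as a residual generator has the form below; this is the standard construction, not the authors' simplified reference residual model:

```latex
\[
\dot{x} = A x + B u + E f, \qquad y = C x,
\]
\[
\dot{\hat{x}} = A\hat{x} + B u + L\,(y - C\hat{x}) + G\,\nu, \qquad
\nu = \rho\,\operatorname{sign}(y - C\hat{x}), \qquad
r = y - C\hat{x},
\]
```

where f collects actuator and sensor faults, the discontinuous injection term nu enforces the sliding motion, and a fault is declared when a norm of the residual r exceeds a chosen threshold.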
10865 Performance Analysis of Bluetooth Low Energy Mesh Routing Algorithm in Case of Disaster Prediction
Authors: Asmir Gogic, Aljo Mujcic, Sandra Ibric, Nermin Suljanovic
Abstract:
The ubiquity of natural disasters during the last few decades has raised serious questions about the prediction of such events and human safety. Every disaster, regardless of its proportion, has a precursor that manifests as a disruption of some environmental parameter such as temperature, humidity, pressure or vibration. In order to anticipate and monitor those changes, in this paper we propose an overall system for disaster prediction and monitoring based on a wireless sensor network (WSN). Furthermore, we introduce a modified and simplified WSN routing protocol built on top of the trickle routing algorithm. The routing algorithm was deployed over the Bluetooth Low Energy protocol in order to achieve low power consumption. Performance of the WSN was analyzed using a real-life system implementation, and estimates of WSN parameters such as battery lifetime, network size and packet delay were determined. Based on the performance of the WSN, the proposed system can be utilized for disaster monitoring and prediction due to its low power profile and mesh routing feature.
Keywords: Bluetooth Low Energy, disaster prediction, mesh routing protocols, wireless sensor networks.
10864 Effect of the Internet on Social Capital
Authors: Safaee Safiollah, Javadi Alimohammad, Javadi Maryam
Abstract:
Internet access is a vital part of the modern world and an important tool in the education of our children. It is present in schools, homes and even shopping malls, and mastering the use of the internet is likely to be an important skill for those entering the job markets of the future. An internet user can be anyone he or she wants to be in an online chat room, or play thrilling and challenging games against other players from all corners of the globe. It seems that, at present or in the near future, relationships in the real world may be neglected by many people as those in the virtual world increase in importance. The internet provides a fast mode of transportation that has brought freedom from family bonds and mixing with different cultures and new communities. This research is an attempt to study the effect of the internet on social capital. For this purpose, a survey was conducted on a sample of 168 students of Payame Noor University in the city of Kermanshah, Iran. The degree of social capital was found to be moderate. With the help of multi-variable regression, the variables of attraction to Iranian messaging and interest in the internet were found to have significant positive effects, while the variable of creating a cordial atmosphere had a significant negative effect.
Keywords: Internet, social capital, social participation, social trust.
10863 A Generic Approach to Achieve Optimal Server Consolidation by Using Existing Servers in Virtualized Data Center
Authors: Siyuan Jing, Kun She
Abstract:
Virtualization-based server consolidation has been proven to be an ideal technique to solve the server sprawl problem by consolidating multiple virtualized servers onto a few physical servers, leading to improved resource utilization and return on investment. In this paper, we solve this problem using existing servers, which are heterogeneous and diversely preferred by IT managers. Five practical consolidation rules are introduced, and a decision model is proposed to optimally allocate source services to physical target servers while maximizing the average resource utilization and preference value. Our model can be regarded as a multi-objective multi-dimension bin-packing (MOMDBP) problem with constraints, which is strongly NP-hard. An improved grouping genetic algorithm (GGA) is introduced for the problem. Extensive simulations were performed and the results are given.
Keywords: GGA-based heuristics, preference, real-world constraints, resource utilization, server consolidation.
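To make the bin-packing view concrete, the fragment below is a baseline first-fit-decreasing placement over two resource dimensions. It is an illustrative baseline only, not the improved GGA proposed in the paper, and the (cpu, memory) tuple layout is an assumption.

```python
def first_fit_decreasing(services, servers):
    # services: list of (cpu, mem) demands; servers: list of (cpu, mem) capacities.
    remaining = [list(cap) for cap in servers]
    placement = [None] * len(services)                 # None = could not be placed
    order = sorted(range(len(services)),
                   key=lambda i: sum(services[i]), reverse=True)   # largest demand first
    for i in order:
        for j, free in enumerate(remaining):
            if all(d <= f for d, f in zip(services[i], free)):
                placement[i] = j                       # place service i on server j
                remaining[j] = [f - d for d, f in zip(services[i], free)]
                break
    return placement

# example: four source services packed onto two existing heterogeneous servers
print(first_fit_decreasing([(2, 4), (1, 1), (3, 2), (2, 2)], [(4, 8), (4, 4)]))
```

A GGA-style approach replaces this greedy pass with a population of group-encoded packings evolved under the consolidation rules and preference objective.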
10862 Wireless Transmission of Big Data Using Novel Secure Algorithm
Authors: K. Thiagarajan, K. Saranya, A. Veeraiah, B. Sudha
Abstract:
This paper presents a novel algorithm for secure, reliable and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. Two-hop wireless networks consist of source, relay and destination nodes. Big data has to be transmitted from source to relay and from relay to destination with security deployed at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes of unknown location. The novel algorithm, which ensures secure and energy-balanced transmission of big data, includes selection of the data transmitting region, segmenting the selected region, determining the probability ratio for each node (capture node, non-capture node and eavesdropper node) in every segment, and evaluating the probability using a binary-based evaluation. If the transmission is secure, the two-hop transmission of big data resumes; otherwise, the attackers are blocked by the cooperative jamming scheme and the data is then transmitted over the two hops.
Keywords: Big data, cooperative jamming, energy balance, physical layer, two-hop transmission, wireless security.
10861 Power Line Carrier Equipment Supporting IP Traffic Transmission in the Enterprise Networks of Energy Companies
Authors: M. S. Anton Merkulov
Abstract:
This article discusses questions concerning the creation of small packet networks for energy companies using high voltage power line carrier (PLC) equipment with IP traffic transmission functionality. The main idea is to create converged PLC links between substations and dispatching centers in which packet data and voice are transmitted in one data flow. The article contains a description of the basic conception of the network, an evaluation of voice traffic transmission parameters, and a discussion of header compression techniques in relation to PLC links. The results of this exploration show that convergent packet PLC links can be very useful in the construction of small packet networks between substations in remote locations, such as deposits or sparsely populated areas.
Keywords: packet PLC, VoIP, time delay, packet traffic, overhead compression
10860 Operating System Based Virtualization Models in Cloud Computing
Authors: Dev Ras Pandey, Bharat Mishra, S. K. Tripathi
Abstract:
Cloud computing is ready to transform the structure of businesses and learning by supplying real-time applications and providing immediate help for small to medium sized businesses. The ability to run a hypervisor inside a virtual machine is an important feature of virtualization, called nested virtualization. In today’s growing field of information technology, many virtualization models are available that provide a convenient approach to implementation, but selecting a single model is difficult. This paper explains the applications of operating system based virtualization in cloud computing and identifies a suitable model given the models' different specifications and users' requirements. In the present paper, the most popular models were selected, based on container and hypervisor based virtualization. The selected models were compared against a wide range of user requirements, such as number of CPUs, memory size, nested virtualization support, live migration and commercial support, and the most suitable virtualization model was identified.
Keywords: Virtualization, OS based virtualization, container and hypervisor based virtualization.
10859 Experiment and Simulation of Laser Effect on Thermal Field of Porcine Liver
Authors: K. Ting, K. T. Chen, Y. L. Su, C. J. Chang
Abstract:
In medical therapy, lasers have been widely used for cosmetic, tumor and other treatments. During laser irradiation, thermal damage may be caused by excessive laser exposure. Thus, the establishment of a complete thermal analysis model is clinically helpful to physicians as reference data. In this study, porcine liver, in place of human tissue, was subjected to laser irradiation to establish experimental data on the surface thermal field and the thermal damage region under different conditions of power, laser irradiation time, and distance between the laser and the porcine liver. In the experimental process, the surface temperature distribution of the porcine liver was measured by an infrared thermal imager. In the simulation part, the Pennes bio-heat transfer equation was solved using the software SYSWELD, originally developed for welding processes. A double ellipsoid function as the laser source term is considered here for the first time in the prediction of the surface thermal field and internal tissue damage. The simulation results are compared with the experimental data to validate the mathematical model established herein.
Keywords: Laser, infrared thermal imager, bio-heat transfer, double ellipsoid function.
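For reference, the Pennes bio-heat transfer equation solved in the simulation has the standard form shown below (symbols are the usual ones: tissue density rho, specific heat c, conductivity k, blood perfusion rate omega_b, arterial temperature T_a, metabolic heat Q_m, and the double-ellipsoid laser source Q_laser; the abstract does not list the property values used):

```latex
\[
\rho c \,\frac{\partial T}{\partial t}
  \;=\; \nabla\!\cdot\!\left(k\,\nabla T\right)
  \;+\; \rho_b c_b\,\omega_b\,(T_a - T)
  \;+\; Q_m \;+\; Q_{\mathrm{laser}}.
\]
```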
10858 2D Spherical Spaces for Face Relighting under Harsh Illumination
Authors: Amr Almaddah, Sadi Vural, Yasushi Mae, Kenichi Ohara, Tatsuo Arai
Abstract:
In this paper, we propose a robust face relighting technique based on spherical space properties, aimed at reducing illumination effects on face recognition. Given a single 2D face image, we relight the face object by extracting the nine spherical harmonic bases and the face's spherical illumination coefficients. First, an internal training illumination database is generated by computing face albedo and face normals from 2D images under different lighting conditions. Based on the generated database, we analyze the target face pixels and compare them with the training bootstrap using pre-generated tiles. In this work, practical real-time processing speed and small image size were considered when designing the framework. In contrast to other works, our technique requires no 3D face models for the training process and takes a single 2D image as input. Experimental results on publicly available databases show that the proposed technique works well under severe lighting conditions, with significant improvements in face recognition rates.
Keywords: Face synthesis and recognition, face illumination recovery, 2D spherical spaces, vision for graphics.
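The nine-term spherical-harmonic lighting model that underlies this kind of relighting is the standard Lambertian approximation; in the usual notation (albedo lambda, lighting coefficients L_lm, Lambertian kernel coefficients A_l, surface normal n), the image intensity at pixel x is approximated by the expression below. This is the textbook approximation, not the paper's specific coefficient-recovery procedure:

```latex
\[
I(x) \;\approx\; \lambda(x)\sum_{l=0}^{2}\sum_{m=-l}^{l} \hat{A}_l\, L_{lm}\, Y_{lm}\!\big(n(x)\big).
\]
```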
10857 Curing Time Effect on Behavior of Cement Treated Marine Clay
Authors: H. W. Xiao, F. H. Lee
Abstract:
Cement stabilization has been widely used for improving the strength and stiffness of soft clayey soils. Cement treated soil specimens used to investigate the stress-strain behaviour in laboratory studies are usually cured for 7 days. This paper examines the effects of curing time on the strength and stress-strain behaviour of cement treated marine clay under triaxial loading conditions. Laboratory-prepared cement treated Singapore marine clay with different mix proportions S-C-W (soil solid-cement solid-water) and curing times (7 days to 180 days) was investigated through unconfined compressive strength tests and triaxial tests. The results show that the curing time has a significant effect on the unconfined compressive strength q_u, the isotropic compression behaviour and the stress-strain behaviour. Although the primary yield loci of cement treated soil specimens with the same mix proportion expand with curing time, they are very narrowly banded and have nearly the same shape after being normalized by the isotropic compression primary yield stress p'_py. The isotropic compression primary yield stress p'_py was shown to be linearly related to the unconfined compressive strength q_u for specimens with different curing times and mix proportions. The effect of curing time on the hardening behaviour diminishes at consolidation stresses higher than the isotropic compression primary yield stress, but its damping rate depends on the cement content.
Keywords: Cement treated soil, curing time effect, hardening behaviour, isotropic compression primary yield stress, unconfined compressive strength.
10856 Immobilization of Lipase Enzyme by Low Cost Material: A Statistical Approach
Authors: Md. Z. Alam, Devi R. Asih, Md. N. Salleh
Abstract:
Immobilization of lipase enzyme produced from palm oil mill effluent (POME) on activated carbon (AC), among several low cost support materials, was optimized. The results indicated that an immobilization of 94% was achieved with AC as the most suitable support material. A sequential optimization strategy based on a statistical experimental design, including the one-factor-at-a-time (OFAT) method, was used to determine the equilibrium time. Three components influencing lipase immobilization were optimized by response surface methodology (RSM) based on a face-centered central composite design (FCCCD). From the statistical analysis of the results, the optimum enzyme concentration loading, agitation rate and activated carbon dosage were found to be 30 U/ml, 300 rpm and 8 g/L respectively, with a maximum immobilization activity of 3732.9 U/g-AC after 2 hrs of immobilization. Analysis of variance (ANOVA) showed a high regression coefficient (R2) of 0.999, which indicated a satisfactory fit of the model to the experimental data. The parameters were statistically significant at p<0.05.
Keywords: Activated carbon, adsorption, immobilization, POME based lipase.
10855 Multi-matrix Real-coded Genetic Algorithm for Minimising Total Costs in Logistics Chain Network
Authors: Pupong Pongcharoen, Aphirak Khadwilard, Anothai Klakankhai
Abstract:
The importance of supply chain and logistics management has been widely recognised. Effective management of the supply chain can reduce costs and lead times and improve responsiveness to changing customer demands. This paper proposes a multi-matrix real-coded Genetic Algorithm (MRGA) based optimisation tool that minimises the total costs associated with supply chain logistics. Owing to the finite capacity constraints of all parties within the chain, a Genetic Algorithm (GA) often produces infeasible chromosomes during the initialisation and evolution processes. In the proposed algorithm, a chromosome initialisation procedure and crossover and mutation operations that always guarantee feasible solutions were embedded. The proposed algorithm was tested using three sizes of benchmarking dataset of logistic chain networks, which are typical of those faced by most global manufacturing companies. A half fractional factorial design was carried out to investigate the influence of alternative crossover and mutation operators by varying GA parameters. The analysis of the experimental results suggested that the quality of solutions obtained is sensitive to the ways in which the genetic parameters and operators are set.
Keywords: Genetic algorithm, logistics, optimisation, supply chain.
10854 Degeneracy of MIS under the Conditions of Instability: A Mathematical Formulation
Authors: Nazar Younis, Raied Salman
Abstract:
It has always been observed that the effectiveness of MIS as a support tool for management decisions degenerates over time after implementation, despite the substantial investments being made. This is true for organizations at the initial stages of MIS implementation, manual or computerized. A survey of a sample of middle to top managers in business and government institutions was conducted. A large ratio of respondents indicated that the MIS had lost its impact on day-to-day operations, and that the response lag time sometimes expands indefinitely. The data indicate an infant mortality phenomenon of the bathtub model. Reasons may include the monotonous nature of MIS delivery, irrelevance, irreverence, poor timeliness, and lack of adequate detail. All those reasons combine to create a degree of degeneracy. We investigate and model, as a bathtub model, the phenomenon of MIS degeneracy that afflicts MIS systems and renders them ineffective. A degeneracy index is developed to identify the status of the MIS system and possible remedies to prevent the onset of total collapse of the system to the point of being useless.
Keywords: MIS, management theory, information technology, information systems, IS, organizational environment, organizations, degeneracy, organizational change.
10853 A Metric-Set and Model Suggestion for Better Software Project Cost Estimation
Authors: Murat Ayyıldız, Oya Kalıpsız, Sırma Yavuz
Abstract:
Software project effort estimation is frequently seen as complex and expensive by individual software engineers. Software production is in a crisis: it suffers from excessive costs and is often out of control. It has been suggested that software production is out of control because we do not measure it, and you cannot control what you cannot measure. During the last decade, a number of studies on cost estimation have been conducted. The metric-set selection has a vital role in software cost estimation studies, but its importance has been ignored, especially in neural network based studies. In this study we have explored the reasons for those disappointing results and implemented different neural network models using an augmented set of new metrics. The results obtained are compared with previous studies using traditional metrics. To be able to make comparisons, two types of data have been used. The first part of the data is taken from the Constructive Cost Model (COCOMO'81), which is commonly used in previous studies, and the second part is collected according to the new metrics in a leading international company in Turkey. The accuracy of the selected metrics and the data samples is verified using statistical techniques. The model presented here is based on a Multi-Layer Perceptron (MLP). Another difficulty associated with cost estimation studies is the fact that data collection requires time and care. To make a more thorough use of the samples collected, the k-fold cross validation method is also implemented. It is concluded that, as long as an accurate and quantifiable set of metrics is defined and measured correctly, neural networks can be applied in software cost estimation studies with success.
Keywords: Software metrics, software cost estimation, neural network.
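The modelling step can be sketched in a few lines with scikit-learn. The feature matrix and effort values below are random placeholders, not the study's COCOMO'81 or company metrics, and the network size is an assumption; the sketch only shows an MLP regressor evaluated with k-fold cross-validation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score, KFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.random((60, 8))              # placeholder project metrics (60 projects, 8 metrics)
y = rng.random(60) * 100             # placeholder effort values (e.g. person-months)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0))
scores = cross_val_score(model, X, y,
                         cv=KFold(n_splits=5, shuffle=True, random_state=0),
                         scoring="neg_mean_absolute_error")
print("MAE per fold:", -scores)       # k-fold estimate of estimation error
```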
10852 Tracing Quality Cost in a Luggage Manufacturing Industry
Authors: S. B. Jaju, R. R. Lakhe
Abstract:
Quality costs are the costs associated with preventing, finding, and correcting defective work. Since the main language of corporate management is money, quality-related costs act as a means of communication between the staff of quality engineering departments and company managers. The objective of quality engineering is to minimize the total quality cost across the life of the product. Quality costs provide a benchmark against which improvement can be measured over time, a rupee-based report on quality improvement efforts, and an effective tool to identify, prioritize and select quality improvement projects. After reviewing the literature it was noticed that a simplified methodology for collecting quality cost data in a manufacturing industry was required. A quantified standard methodology is proposed for collecting data on the various elements of the quality cost categories for the manufacturing industry. In the light of the research carried out so far, it is also felt necessary to standardise the cost elements in each of the prevention, appraisal, internal failure and external failure cost categories. Here an attempt is made to standardise the various cost elements applicable to the manufacturing industry, and data is collected using the proposed quantified methodology. This paper discusses a case study carried out in the luggage manufacturing industry.
Keywords: Quality costs, PAF model, quantified methodology, case study.
10851 On the Algorithmic Iterative Solutions of Conjugate Gradient, Gauss-Seidel and Jacobi Methods for Solving Systems of Linear Equations
Authors: H. D. Ibrahim, H. C. Chinwenyi, H. N. Ude
Abstract:
In this paper, efforts were made to examine and compare the algorithmic iterative solutions of the conjugate gradient method against other methods such as the Gauss-Seidel and Jacobi approaches for solving systems of linear equations of the form Ax = b, where A is a real n x n symmetric and positive definite matrix. We performed the algorithmic iterative steps and obtained analytical solutions of a typical 3 x 3 symmetric and positive definite matrix using the three methods described in this paper (Gauss-Seidel, Jacobi and conjugate gradient). From the results obtained, we found that the conjugate gradient method converges to the exact solution in fewer iterative steps than the other two methods, which required many more iterations and much more time while only tending toward the exact solution.
Keywords: conjugate gradient, linear equations, symmetric and positive definite matrix, Gauss-Seidel, Jacobi, algorithm
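As a reference point for the comparison above, a textbook conjugate gradient implementation for a symmetric positive definite system is sketched below; the 3 x 3 matrix shown is an illustrative SPD example, not necessarily the one used in the paper.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=1000):
    # Standard CG for symmetric positive definite A.
    x = np.zeros(len(b)) if x0 is None else x0.astype(float)
    r = b - A @ x                      # residual
    p = r.copy()                       # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:      # converged
            break
        p = r + (rs_new / rs_old) * p  # A-conjugate update of the direction
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])  # SPD example
b = np.array([1.0, 2.0, 3.0])
print(conjugate_gradient(A, b))
```

In exact arithmetic CG terminates in at most n iterations for an n x n SPD system, which is consistent with the faster convergence reported in the abstract.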
10850 Weld Defect Detection in Industrial Radiography Based Digital Image Processing
Authors: N. Nacereddine, M. Zelmat, S. S. Belaïfa, M. Tridi
Abstract:
Industrial radiography is a famous technique for the identification and evaluation of discontinuities, or defects, such as cracks, porosity and foreign inclusions found in welded joints. Although this technique has been well developed, improving both the inspection process and operating time, it does suffer from several drawbacks. The poor quality of radiographic images is due to the physical nature of radiography as well as the small size of the defects and their poor orientation relative to the size and thickness of the evaluated parts. Digital image processing techniques allow the interpretation of the image to be automated, avoiding the presence of human operators and making the inspection system more reliable, reproducible and faster. This paper describes our attempt to develop and implement digital image processing algorithms for the purpose of automatic defect detection in radiographic images. Because of the complex nature of the considered images, and in order that the detected defect region represents the real defect as accurately as possible, the global and local preprocessing and segmentation methods must be chosen appropriately.
Keywords: Digital image processing, global and local approaches, radiographic film, weld defect.
10849 Impact of Stack Caches: Locality Awareness and Cost Effectiveness
Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang
Abstract:
Treating data based on its location in memory has received much attention in recent years due to its different properties, which offer important aspects for cache utilization. Stack data and non-stack data may interfere with each other’s locality in the data cache. One of the important aspects of stack data is that it has high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into stack and non-stack caches in order to keep stack data and non-stack data separate in different caches. We observe that the overall hit rate of the non-unified cache design is sensitive to the size of the non-stack cache. We then investigate the appropriate size and associativity for the stack cache to achieve a high hit ratio, especially when over 99% of accesses are directed to the stack cache. The results show that, on average, a stack cache hit rate of more than 99% is achieved using 2KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate when adding a small, fixed-size stack cache at level 1 to a unified cache architecture. The results show that the overall hit rate of the unified cache design with an added 1KB stack cache is improved by approximately 3.9% on average for the Rijndael benchmark. The stack cache is simulated using the SimpleScalar toolset.
Keywords: Hit rate, Locality of program, Stack cache, and Stack data.
10848 Basic Calibration and Normalization Techniques for Time Domain Reflectometry Measurements
Authors: Shagufta Tabassum
Abstract:
The study of dielectric properties in a binary mixture of liquids is very useful for understanding the liquid structure, molecular interactions, dynamics, and kinematics of the mixture. Time-domain reflectometry (TDR) is a powerful tool for studying the cooperative and molecular dynamics of H-bonded systems. Here we discuss the basic calibration and normalization procedure for TDR measurements. Our aim is to explain the different types of errors that occur during TDR measurements and how to minimize them.
Keywords: time domain reflectometry measurement technique, cable and connector loss, oscilloscope loss, normalization technique
10847 Parallel Text Processing: Alignment of Indonesian to Javanese Language
Authors: Aji P. Wibawa, Andrew Nafalski, Neil Murray, Wayan F. Mahmudy
Abstract:
Parallel text alignment is proposed as a way of aligning words in bahasa Indonesia to words in Javanese. Since a one-to-one word translator does not have the facility to translate pragmatic aspects of Javanese, the parallel text alignment model described here uses a phrase pair combination. The algorithm aligns the parallel text automatically from the beginning to the end of each sentence. Even though the results of the phrase pair combination outperform the previous algorithm, it is still inefficient: recording all possible combinations consumes more space in the database and is time consuming. The original algorithm is modified by applying an edit distance coefficient to improve data-storage efficiency. As a result, data-storage consumption is reduced by 90%, and the learning period is reduced as well (42 s).
Keywords: Parallel text alignment, phrase pair combination, edit distance coefficient, Javanese-Indonesian language.
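The edit distance underlying the coefficient can be illustrated with the standard Levenshtein recurrence; the normalisation by the longer string length shown below is an assumption for illustration, since the abstract does not define the coefficient exactly.

```python
def edit_distance(a, b):
    # Standard Levenshtein distance via dynamic programming.
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                               # delete all of a[:i]
    for j in range(n + 1):
        d[0][j] = j                               # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[m][n]

def edit_distance_coefficient(a, b):
    # Assumed normalisation: distance relative to the longer string.
    return edit_distance(a, b) / max(len(a), len(b), 1)

print(edit_distance_coefficient("omah", "griya"))   # toy Javanese word pair
```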
10846 An Approach for Reducing the End-to-end Delay and Increasing Network Lifetime in Mobile Adhoc Networks
Authors: R. Asokan, A. M. Natarajan
Abstract:
A mobile ad hoc network (MANET) is a collection of mobile devices which form a communication network with no preexisting wiring or infrastructure. Multiple routing protocols have been developed for MANETs. As MANETs gain popularity, their need to support real-time applications is growing as well. Such applications have stringent quality of service (QoS) requirements such as throughput, end-to-end delay, and energy. Due to the dynamic topology and bandwidth constraints, supporting QoS is a challenging task, and QoS-aware routing is an important building block for QoS support. The primary goal of a QoS-aware protocol is to determine the path from source to destination that satisfies the QoS requirements. This paper proposes a new energy and delay aware protocol called energy and delay aware TORA (EDTORA), based on an extension of the Temporally Ordered Routing Algorithm (TORA). Energy and delay verifications of the query packet are performed at each node. Simulation results show that the proposed protocol has higher performance than TORA in terms of network lifetime, packet delivery ratio and end-to-end delay.
Keywords: EDTORA, mobile ad hoc networks, QoS, routing, TORA.