Search results for: megaproject execution.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 275

125 Process Oriented Architecture for Emergency Scenarios in the Czech Republic

Authors: Tomáš Ludík, Josef Navrátil, Alena Langerová

Abstract:

Emergency situations are handled on the basis of emergency scenarios. In the Czech Republic, these scenarios have no uniform form: they are unstructured and developed primarily as plain text, which prevents emergency situations from being resolved efficiently. For this reason, the paper aims at defining a Process Oriented Architecture to support, and thus improve, the handling of emergency situations in the Czech Republic. The proposed Process Oriented Architecture is based on the Workflow Reference Model while taking into account the capabilities of Business Process Management Suites for implementing process oriented emergency scenarios. The architecture is verified by a Proof of Concept covering the reception of an emergency event at a district emergency operations centre, implemented with the Bonita Open Solution. The architecture created in this way is suitable not only for emergency management but also for educational purposes.

Keywords: Business Process Management Suite, Czech Republic, Emergency Scenarios, Process Execution, Process Oriented Architecture.

124 Parallel Branch and Bound Model Using Logarithmic Sampling (PBLS) for Symmetric Traveling Salesman Problem

Authors: Sheikh Muhammad Azam, Masood-ur-Rehman, Adnan Khalid Bhatti, Nadeem Daudpota

Abstract:

Very large and/or computationally complex optimization problems sometimes require parallel or high-performance computing to achieve a reasonable computation time. One of the most popular and most complicated problems of this family is the Traveling Salesman Problem. In this paper we introduce a branch and bound based algorithm for the solution of such complicated problems, focusing on the symmetric traveling salesman problem. We reviewed some of the already available algorithms and found a need for a new algorithm that yields an optimal or near-optimal solution. Based on the use of logarithmic sampling, the proposed algorithm produced a relatively optimal solution for the problem and showed excellent performance compared with the traditional algorithms of this series.
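
The PBLS algorithm itself is not given in the abstract; as context, a minimal plain branch-and-bound sketch for the symmetric TSP is shown below, with a simple cheapest-edge lower bound standing in for the paper's logarithmic-sampling step (an assumption, not the authors' method).

```python
import math

def tsp_branch_and_bound(dist):
    """Exact branch-and-bound for the symmetric TSP on a small distance matrix.

    dist[i][j] == dist[j][i] is the cost of travelling between cities i and j.
    A simple lower bound (sum of each unvisited city's cheapest edge) prunes
    branches that cannot beat the best tour found so far.
    """
    n = len(dist)
    best_cost = math.inf
    best_tour = None

    def lower_bound(visited, cost_so_far):
        # Optimistic completion cost: cheapest edge leaving every unvisited city.
        bound = cost_so_far
        for city in range(n):
            if city not in visited:
                bound += min(dist[city][k] for k in range(n) if k != city)
        return bound

    def branch(tour, cost_so_far):
        nonlocal best_cost, best_tour
        if len(tour) == n:                      # complete tour: close the cycle
            total = cost_so_far + dist[tour[-1]][tour[0]]
            if total < best_cost:
                best_cost, best_tour = total, tour[:]
            return
        for nxt in range(n):
            if nxt in tour:
                continue
            new_cost = cost_so_far + dist[tour[-1]][nxt]
            if lower_bound(set(tour) | {nxt}, new_cost) < best_cost:  # prune
                branch(tour + [nxt], new_cost)

    branch([0], 0)                              # fix city 0 as the start
    return best_tour, best_cost

# Example: a 4-city symmetric instance.
d = [[0, 10, 15, 20],
     [10, 0, 35, 25],
     [15, 35, 0, 30],
     [20, 25, 30, 0]]
print(tsp_branch_and_bound(d))                  # -> ([0, 1, 3, 2], 80)
```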

Keywords: Parallel execution, symmetric traveling salesman problem, branch and bound algorithm, logarithmic sampling.

123 Multipurpose Cadastre, Essential for Urban Development Plans in Iran

Authors: Mehrshad Khalaj, Elham Lashkari

Abstract:

The majority of research conducted on Iranian urban development plans indicates that they have been largely unsuccessful in terms of drafting, execution and goal achievement. A lack or shortage of essential statistics and information can be listed as an important reason for the failure of these plans. The lack of figures and information has become an evident shortcoming of the country's statistical authorities. This problem has forced urban planners themselves to embark on physical surveys, including real estate and land pricing and population and economic censuses of the city. Apart from the burden this places on urban developers, the possibility of errors in such surveys is high. In the present article, applying the interview technique, it is argued that utilizing a multipurpose cadastre system as a land information system is essential for urban development plans in Iran, as it can minimize or even remove the failures facing such plans.

Keywords: Multipurpose Cadastre, Urban Development Plan (UDP), Land Information System (LIS), Interview Technique

122 Impact of Moderating Role of e-Administration on Training, Performance Appraisal and Organizational Performance

Authors: Ejaz Ali, Muhammad Younas, Tahir Saeed

Abstract:

In this age of information technology, organizations are revisiting their approach to a great degree, and e-administration is the most popular area in which to proceed. In order to excel over their competitors, organizations spend a substantial share of their resources on e-administration, as it is the most effective, transparent and efficient way to achieve their short-term as well as long-term goals. E-administration, being a tool of ICT, plays a significant role in the effective management of HR practices, resulting in optimal performance of an organization. The present research was carried out to analyze the moderating role of e-administration in the relationships between training, performance appraisal and perceived organizational performance. The study is based on the RBV and AMO theories, which advocate that the use of the latest technology in the execution of human resource (HR) functions enables an organization to achieve and sustain competitive advantage, which leads to optimal firm performance.

Keywords: Human resource management, HR function, e-administration, performance appraisal, training, organizational performance.

121 Performance Evaluation of Task Scheduling Algorithm on LCQ Network

Authors: Zaki Ahmad Khan, Jamshed Siddiqui, Abdus Samad

Abstract:

The scheduling and mapping of tasks on a set of processors is considered a critical problem in parallel and distributed computing systems. This paper deals with the problem of dynamic scheduling on a special type of multiprocessor architecture known as the Linear Crossed Cube (LCQ) network. This multiprocessor is a hybrid network which combines the features of linear architectures as well as cube-based architectures. Two standard dynamic scheduling schemes, namely Minimum Distance Scheduling (MDS) and Two Round Scheduling (TRS), are implemented on the LCQ network. Parallel tasks are mapped and the imbalance of load is evaluated on different sets of processors in the LCQ network. The simulation results are analyzed in order to obtain the best solution for the given network in terms of remaining load imbalance and execution time. Other performance metrics, such as speedup and efficiency, are also evaluated with the given dynamic algorithms.
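
The abstract names the MDS and TRS schemes without detail; as a rough illustration only, the sketch below assigns each arriving task to the least-loaded processor within one hop and reports the load imbalance left. The four-node ring adjacency is a hypothetical placeholder for the LCQ topology, not the paper's implementation.

```python
def minimum_distance_schedule(tasks, adjacency):
    """Toy minimum-distance-style scheduler.

    tasks     : list of (origin_processor, weight) pairs
    adjacency : dict mapping each processor to its directly connected neighbours
                (a stand-in for the LCQ interconnection; hypothetical topology)
    Each task is placed on the least-loaded processor among the origin and its
    neighbours, i.e. within minimum distance of where it arrived.
    """
    load = {p: 0 for p in adjacency}
    for origin, weight in tasks:
        candidates = [origin] + list(adjacency[origin])
        target = min(candidates, key=lambda p: load[p])
        load[target] += weight
    ideal = sum(w for _, w in tasks) / len(load)
    imbalance = max(load.values()) - min(load.values())
    return load, ideal, imbalance

# Hypothetical 4-processor ring as a placeholder network.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
work = [(0, 5), (0, 3), (1, 4), (2, 2), (2, 6), (3, 1)]
loads, ideal, imbalance = minimum_distance_schedule(work, adj)
print(loads, ideal, imbalance)
```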

Keywords: Dynamic algorithm, Load imbalance, Mapping, Task scheduling.

120 Scheduling Multiple Workflow Using De-De Dodging Algorithm and PBD Algorithm in Cloud: Detailed Study

Authors: B. Arun Kumar, T. Ravichandran

Abstract:

Workflow scheduling is an important part of cloud computing; based on different criteria, it determines cost, execution time, and performance. A cloud workflow system is a platform service facilitating automation of distributed applications based on the new cloud infrastructure. An aspect which differentiates a cloud workflow system from others is its market-oriented business model, an innovation which challenges conventional workflow scheduling strategies. The Time and Cost optimization algorithm for scheduling Hybrid Clouds (TCHC), which decides which resources should be chartered from public providers, is combined with a new De-De algorithm ensuring that every instance of single and multiple workflows runs without deadlocks. To this end, two new concepts - the De-De Dodging Algorithm and the Priority Based Decisive Algorithm - are combined with conventional deadlock avoidance into one algorithm that maximizes active (not just allocated) resource use and reduces makespan.

Keywords: Workflow Scheduling, cloud workflow, TCHC algorithm, De-De Dodging Algorithm, Priority Based Decisive Algorithm (PBD), Makespan.

119 A Critical Review on the Development of a Theoretical Framework for Managing Environmental Impacts of Construction Project

Authors: Sami Mustafa M. E. Ahmed, Noor Amila Wan Abdullah Zawawi, Zulkipli B. Ghazali

Abstract:

The construction industry is considered one of the main contributors to natural resource depletion, is responsible for high levels of pollution, and is one of the factors behind climate change and other environmental threats. Considerable effort has been and is being made to reduce and control these impacts. Project Environmental Management (PEM) includes the processes required to ensure that the impacts of project execution on the surrounding environment remain within the limits stated in legal permits. The main aim of most research conducted on managing Environmental Impacts (EI) is to protect the planet from pollution. That research presents four major environmental elements: Environmental Management Systems (EMS), Environmental Design (ED), Environmental Planning (EP) and Environmental Impact Assessments (EIA). Although much has been said about environmental management for construction projects, almost everything remains to be said, and therefore to be explored or rediscovered, because incontestably almost everything remains to be done. This paper is aimed at reviewing some of what has been said about PEM. One of its objectives is also to explore and rediscover the whole view of managing EI problems by proposing a framework based on the relations between these environmental research areas.

Keywords: Environmental planning, sustainable design, EIA and EMS.

118 An Algorithm for an Optimal Staffing Problem in Open Shop Environment

Authors: Daniela I. Borissova, Ivan C. Mustakerov

Abstract:

The paper addresses a problem of optimal staffing in an open shop environment. The problem is to determine the optimal number of operators serving a given number of machines to fulfill a number of independent operations while minimizing staff idle time. Using a Gantt chart presentation, the problem is modeled as a two-dimensional cutting stock problem. A mixed-integer programming model is used to obtain the minimal job processing time (makespan) for a fixed number of machine operators. An algorithm for optimal open shop staffing is developed based on iteratively solving the formulated optimization task. The execution of the developed algorithm provides the optimal number of machine operators in the sense of minimum staff idle time, and the optimal makespan for that number of operators. The proposed algorithm is tested numerically on a real-life staffing problem. The testing results show its practical applicability to similar open shop staffing problems.
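
The abstract describes iterating a mixed-integer model over candidate operator counts; the paper's MIP formulation is not reproduced here. As a rough, non-optimal stand-in, the greedy list-scheduling sketch below estimates makespan and total idle time for a given number of operators on hypothetical operation durations, and can simply be looped over operator counts.

```python
import heapq

def makespan_and_idle(operation_times, n_operators):
    """Greedy estimate of makespan and total operator idle time.

    operation_times : list of independent operation durations
    n_operators     : how many operators share the work
    Longest-processing-time list scheduling: each operation goes to the operator
    who becomes free first. An approximation, not the paper's MIP optimum.
    """
    finish = [0] * n_operators            # time at which each operator becomes free
    heapq.heapify(finish)
    for t in sorted(operation_times, reverse=True):
        earliest = heapq.heappop(finish)
        heapq.heappush(finish, earliest + t)
    makespan = max(finish)
    idle = makespan * n_operators - sum(operation_times)
    return makespan, idle

# Hypothetical independent operation durations (minutes); sweep the operator count.
ops = [12, 7, 9, 4, 10, 6, 3, 8]
for k in range(1, 5):
    print(k, makespan_and_idle(ops, k))
```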

Keywords: Integer programming, open shop problem, optimal staffing.

117 Replicating Data Objects in Large-scale Distributed Computing Systems using Extended Vickrey Auction

Authors: Samee Ullah Khan, Ishfaq Ahmad

Abstract:

This paper proposes a novel game theoretical technique to address the problem of data object replication in large-scale distributed computing systems. The proposed technique draws inspiration from computational economic theory and employs the extended Vickrey auction. Specifically, players in a non-cooperative environment compete for server-side scarce memory space to replicate data objects so as to minimize the total network object transfer cost, while maintaining object concurrency. Optimization of such a cost in turn leads to load balancing, fault tolerance and reduced user access time. The method is experimentally evaluated against four well-known techniques from the literature: branch and bound, greedy, bin-packing and genetic algorithms. The experimental results reveal that the proposed approach outperforms the four techniques in both execution time and solution quality.
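
The extended Vickrey mechanism used in the paper is not detailed in the abstract; the sketch below only illustrates the basic single-item, second-price (Vickrey) pricing rule it generalizes, using hypothetical bids for one unit of replica memory.

```python
def vickrey_auction(bids):
    """Single-item sealed-bid Vickrey auction.

    bids : dict mapping player (e.g. a data object wanting replica space) to its
           reported valuation of the scarce server memory slot.
    The highest bidder wins but pays the second-highest bid, which makes
    truthful bidding a dominant strategy.
    """
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0
    return winner, price

# Hypothetical valuations: expected transfer-cost savings from hosting a replica.
bids = {"object_A": 14.0, "object_B": 9.5, "object_C": 11.2}
print(vickrey_auction(bids))   # -> ('object_A', 11.2)
```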

Keywords: Auctions, data replication, pricing, static allocation.

116 Attributes of Ethical Leadership and Ethical Guidelines in Malaysian Public Sector

Authors: M. Norazamina, A. Azizah, Y. Najihah Marha, A. Suraya

Abstract:

Malaysian Public Sector departments and agencies are responsible for providing efficient public services with zero corruption. However, corruption continues to occur due to the absence of ethical leadership and the proper execution of ethical guidelines. Thus, the objective of this paper is to explore the attributes of ethical leadership and ethical guidelines. This study employs qualitative research, analyzing data from interviews with key informants in the public sector using conceptual content analysis (NVivo 11). The study reveals eight attributes of ethical leadership: role model, attachment, ethical support, knowledgeable, discipline, leaders' spirituality encouragement, virtue values and shared values. Meanwhile, five attributes of ethical guidelines are identified: guidelines, communication, check and balance, concern for stakeholders, and compliance. These identified attributes should become the ethical identity and direction of the Malaysian Public Sector. This could enhance public trust, as well as the international community's trust, in the public sector.

Keywords: Check and balance, ethical guidelines, ethical leadership, public sector, spirituality encouragement.

115 Automatic Number Plate Recognition System Based on Deep Learning

Authors: T. Damak, O. Kriaa, A. Baccar, M. A. Ben Ayed, N. Masmoudi

Abstract:

In the last few years, Automatic Number Plate Recognition (ANPR) systems have become widely used in safety, security, and commercial applications. Accordingly, several methods and techniques have been developed to achieve better accuracy and real-time execution. This paper proposes a computer vision algorithm for Number Plate Localization (NPL) and Character Segmentation (CS). In addition, it proposes an improved method for Optical Character Recognition (OCR) based on Deep Learning (DL) techniques. In order to recognize the number on the plate detected by the NPL and CS steps, a Convolutional Neural Network (CNN) algorithm is proposed. A DL model is developed using four convolution layers, two max-pooling layers, and six fully connected layers. The model was trained on a number-image database on the Jetson TX2 NVIDIA target. An accuracy of 95.84% was achieved.
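
The exact layer sizes are not given in the abstract beyond "four convolution, two max-pooling, six fully connected"; the PyTorch sketch below is one plausible arrangement. The channel counts, the 32x32 grayscale input, and the 36-class output (digits plus letters) are assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

class PlateCharCNN(nn.Module):
    """Illustrative OCR network: 4 conv layers, 2 max-pooling layers, 6 FC layers.

    Input is assumed to be a 1x32x32 grayscale character crop; the 36 output
    classes stand for 0-9 and A-Z. All sizes are assumptions, not the paper's.
    """
    def __init__(self, n_classes=36):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 32x32 -> 16x16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 16x16 -> 8x8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 1024), nn.ReLU(),
            nn.Linear(1024, 512), nn.ReLU(),
            nn.Linear(512, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, n_classes),              # sixth fully connected layer
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Smoke test with a dummy batch of four character images.
model = PlateCharCNN()
print(model(torch.randn(4, 1, 32, 32)).shape)      # torch.Size([4, 36])
```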

Keywords: Automatic number plate recognition, character segmentation, convolutional neural network, CNN, deep learning, number plate localization.

114 A Virtual Reality Laboratory for Distance Education in Chemistry

Authors: J. Georgiou, K. Dimitropoulos, A. Manitsaris

Abstract:

Simulations play a major role in education, not only because they provide realistic models with which students can interact to acquire real-world experience, but also because they constitute safe environments in which students can repeat processes without any risk in order to grasp concepts and theories more easily. Virtual reality is widely recognized as a significant technological advance that can facilitate the learning process through the development of highly realistic 3D simulations supporting immersive and interactive features. The objective of this paper is to analyze the influence of virtual reality's use in chemistry instruction, as well as to present an integrated web-based learning environment for the simulation of chemical experiments. The proposed application constitutes a cost-effective solution for schools and universities without appropriate infrastructure and a valuable tool for distance learning and life-long education in chemistry. Its educational objectives are the familiarization of students with the equipment of a real chemical laboratory and the execution of virtual volumetric analysis experiments with the active participation of students.

Keywords: Chemistry, simulations, experiments, virtual reality.

113 Cross Country Comparison: Business Process Management Maturity, Social Business Process Management and Organizational Culture

Authors: Dalia Suša Vugec

Abstract:

In recent decades, business process management (BPM) has been the focus of a great number of researchers and organizations. There are many benefits derived from the implementation of BPM in organizations. However, it has also been noticed that traditional BPM lately faces some difficulties, such as the divide between models and their execution, lost innovations, and a lack of information fusion. As a result, a new discipline, called social BPM, has emerged, which incorporates principles of social software into BPM. On the other hand, many researchers point to organizational culture as a vital part of BPM success and maturity. Therefore, the goal of this study is to investigate the current state of BPM maturity and the usage of social BPM among organizations from Croatia, Slovenia and Austria, with regard to organizational culture as well. The paper presents the results of a survey conducted as part of the PROSPER project (IP-2014-09-3729), financed by the Croatian Science Foundation. The results indicate differences in the level of BPM maturity, the usage of social BPM and the dominant organizational culture in the observed organizations from different countries. These differences are further discussed in the paper.

Keywords: Business process management, BPM maturity, organizational culture, social BPM.

112 Prediction of in situ Permeability for Limestone Rock Using Rock Quality Designation Index

Authors: Ahmed T. Farid, Muhammed Rizwan

Abstract:

In geotechnical studies, the permeability of soil or rock is a highly important parameter. Permeability values for rock formations are more difficult to determine than for soil formations, as they depend on rock quality and fracturing. In this research, the in-situ permeability of limestone rock formations was predicted. The limestone rock permeability was evaluated using Lugeon tests (in-situ packer permeability). Different sites spread over the Riyadh region of Saudi Arabia were chosen for this study of predicting the in-situ permeability of limestone rock. Correlations were deduced between the in-situ permeability values of the limestone rock and the rock quality designation (RQD) values calculated during the execution of the boreholes in the study areas. The study was performed for different ranges of RQD values measured during drilling of the site boreholes. The developed correlations are recommended for the on-site determination of the in-situ permeability of limestone rock only. For other sedimentary rock formations, more studies are needed to establish the correlations applicable to each type.
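
The paper's actual correlations are not given in the abstract; the sketch below only shows how such a correlation could be fitted, using an exponential (log-linear) regression on entirely hypothetical RQD and Lugeon values standing in for packer-test data.

```python
import numpy as np

# Entirely hypothetical (RQD %, Lugeon) pairs standing in for packer-test data.
rqd = np.array([25, 35, 45, 55, 65, 75, 85, 95], dtype=float)
lugeon = np.array([38.0, 27.0, 18.0, 12.0, 8.5, 5.0, 3.2, 1.8])

# Fit Lugeon = a * exp(b * RQD) by linear regression on the log of Lugeon.
b, log_a = np.polyfit(rqd, np.log(lugeon), 1)
a = np.exp(log_a)
print(f"Lugeon ~ {a:.1f} * exp({b:.4f} * RQD)")

# Predict in-situ permeability (Lugeon units) for an RQD measured on site.
rqd_site = 60.0
print(a * np.exp(b * rqd_site))
```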

Keywords: Packer, permeability, rock, quality.

111 Assessment of Green and Smart IT Level: A Case Study on Public Research Institute

Authors: Han-Gook Kim, Dong-Suk Hong

Abstract:

As the latest advancement and trend in the IT field, Green & Smart IT has attracted more and more attention from researchers. This study focuses on the development of an assessment tool which can be used to evaluate the Green & Smart IT level within an organization. In order to achieve meaningful results, a comprehensive review of the relevant literature was performed in advance; then, a Delphi survey and other processes were employed to develop the assessment tool for the Green & Smart IT level. Two rounds of Delphi questionnaire surveys were conducted with 20 IT experts in the public sector. The results reveal that the top five weighted KPIs for evaluating the maturity of Green & Smart IT were: (1) electronic execution of business processes; (2) shutdown of unused IT devices; (3) virtualization of servers; (4) automation of constant temperature and humidity control; and (5) introduction of a smart-work system. Finally, the tool was applied to a case study of a public research institute in Korea. The findings presented in this study provide organizations with useful implications for the introduction and promotion of Green & Smart IT in the future.

Keywords: Assessment, Case Study, Delphi, Green & Smart IT

110 Towards Incorporating Context Awareness into Business Process Management

Authors: Xiaohui Zhao, Shahan Mafuz

Abstract:

Context-aware technologies provide system applications with awareness of environmental conditions, customer behaviours, object movements, etc. Further, with such a capability, system applications can intelligently adapt their responses to changing conditions. In regard to business operations, this promises businesses that their business processes can run more intelligently, adaptively and flexibly, and thereby improve customer experience, enhance the reliability of service delivery, or lower operational cost, making the business more competitive and sustainable. Aiming at realising such context-aware business process management, this paper first explores its potential benefits and then identifies some gaps between current business process management support and what is expected. In addition, some preliminary solutions are discussed with regard to context definition, rule-based process execution, run-time process evolution, etc. A framework is also presented to give a conceptual architecture of a context-aware business process management system to guide system implementation.

Keywords: Business process adaptation, business process evolution, business process modelling, and context awareness.

109 Factors of Effective Business Software Systems Development and Enhancement Projects Work Effort Estimation

Authors: Beata Czarnacka-Chrobot

Abstract:

The majority of Business Software Systems (BSS) Development and Enhancement Projects (D&EP) fail to meet the criteria of effectiveness, which leads to considerable financial losses. One of the fundamental reasons for such projects' exceptionally low success rate is improperly derived estimates of their costs and time. In the case of BSS D&EP, these attributes are determined by the work effort, while reliable and objective effort estimation still appears to be a great challenge to software engineering. This paper is therefore aimed at presenting the most important synthetic conclusions coming from the author's own studies concerning the main factors of effective BSS D&EP work effort estimation. Thanks to rational investment decisions made on the basis of reliable and objective criteria, it is possible to reduce losses caused not only by abandoned projects but also by large overruns of the time and costs of BSS D&EP execution.

Keywords: Benchmarking data, business software systems development and enhancement projects, effort estimation, software engineering economics, software functional size measurement.

108 Performance Improvements of DSP Applications on a Generic Reconfigurable Platform

Authors: Michalis D. Galanis, Gregory Dimitroulakos, Costas E. Goutis

Abstract:

Speedups from mapping four real-life DSP applications onto an embedded system-on-chip that couples coarse-grained reconfigurable logic with an instruction-set processor are presented. The reconfigurable logic is realized by a 2-Dimensional Array of Processing Elements. A design flow for improving application performance is proposed. Critical software parts, called kernels, are accelerated on the Coarse-Grained Reconfigurable Array. The kernels are detected by profiling the source code. For mapping the detected kernels onto the reconfigurable logic, a priority-based mapping algorithm has been developed. Two 4x4 array architectures, which differ in their interconnection structure among the Processing Elements, are considered. The experiments for eight different instances of a generic system show that important overall application speedups are achieved for the four applications. The performance improvements range from 1.86 to 3.67, with an average value of 2.53, compared with an all-software execution. These speedups are quite close to the maximum theoretical speedups imposed by Amdahl's law.
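
The comparison with Amdahl's law can be made explicit: if a fraction f of the runtime is spent in kernels accelerated by a factor s, the overall speedup is bounded by 1 / ((1 - f) + f / s). The sketch below uses hypothetical profile numbers, not the paper's measured fractions, to show how a reported speedup sits against that ceiling.

```python
def amdahl_speedup(kernel_fraction, kernel_speedup):
    """Overall speedup when only `kernel_fraction` of the runtime is accelerated."""
    return 1.0 / ((1.0 - kernel_fraction) + kernel_fraction / kernel_speedup)

# Hypothetical profile: kernels take 75% of runtime and run 8x faster on the array.
print(amdahl_speedup(0.75, 8))            # ~2.9x overall

# Upper bound if the kernels were infinitely fast.
print(1.0 / (1.0 - 0.75))                 # 4.0x ceiling
```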

Keywords: Reconfigurable computing, Coarse-grained reconfigurable array, Embedded systems, DSP, Performance

107 Sparse Networks-Based Speedup Technique for Proteins Betweenness Centrality Computation

Authors: Razvan Bocu, Sabin Tabirca

Abstract:

The study of proteomics has reached unexpected levels of interest, as a direct consequence of its discovered influence over some complex biological phenomena, such as problematic diseases like cancer. This paper presents the authors' latest achievements regarding the analysis of networks of proteins (interactome networks) by computing the betweenness centrality measure more efficiently. The paper introduces the concept of betweenness centrality and then describes how betweenness computation can help interactome network analysis. Current sequential implementations of the betweenness computation do not perform satisfactorily in terms of execution time. The paper's main contribution is a speedup technique for the betweenness computation, based on modified shortest path algorithms for sparse graphs. Three optimized generic algorithms for betweenness computation are described and implemented, and their performance tested against real biological data, which is part of the IntAct dataset.
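
The paper's modified shortest-path algorithms are not spelled out in the abstract; for reference, betweenness centrality itself can be computed on a sparse protein-interaction graph with networkx's implementation of Brandes' algorithm, as in the sketch below (toy edge list, not IntAct data).

```python
import networkx as nx

# Toy undirected interaction graph; each node is a protein, each edge an interaction.
edges = [("P1", "P2"), ("P2", "P3"), ("P3", "P4"),
         ("P2", "P5"), ("P5", "P6"), ("P3", "P5")]
G = nx.Graph(edges)

# Brandes' algorithm runs in O(V*E) on unweighted graphs, which is what makes
# betweenness computation feasible on large, sparse interactome networks.
bc = nx.betweenness_centrality(G, normalized=True)
for protein, score in sorted(bc.items(), key=lambda kv: -kv[1]):
    print(protein, round(score, 3))
```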

Keywords: Betweenness centrality, interactome networks, protein-protein interactions, sub-communities, sparse networks, speedup technique, IntAct.

106 Community Detection-based Analysis of the Human Interactome Network

Authors: Razvan Bocu, Sabin Tabirca

Abstract:

The study of proteomics has reached unexpected levels of interest, as a direct consequence of its discovered influence over some complex biological phenomena, such as problematic diseases like cancer. This paper presents a new technique that allows for an accurate analysis of the human interactome network. It is basically a two-step analysis process which first detects each protein's absolute importance through the betweenness centrality computation. The second step then determines the functionally related communities of proteins. For this purpose, we use a community detection technique that is based on the edge betweenness calculation. The new technique was thoroughly tested on real biological data, and the results reveal some interesting properties of the proteins involved in the carcinogenesis process. Apart from its experimental usefulness, the novel technique is also computationally effective in terms of execution time. Based on the analysis results, some topological features of cancer-mutated proteins are presented and a possible optimization solution for cancer drug design is suggested.
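
The two-step procedure (node betweenness, then edge-betweenness-based community detection) follows the Girvan-Newman idea; a minimal sketch with networkx on a toy graph, rather than the human interactome, might look like this:

```python
import networkx as nx
from networkx.algorithms.community import girvan_newman

G = nx.Graph([("P1", "P2"), ("P2", "P3"), ("P1", "P3"),   # one dense cluster
              ("P4", "P5"), ("P5", "P6"), ("P4", "P6"),   # a second cluster
              ("P3", "P4")])                              # single bridging edge

# Step 1: rank proteins by betweenness centrality.
ranking = nx.betweenness_centrality(G)

# Step 2: split into communities by repeatedly removing the highest-betweenness edge.
first_split = next(girvan_newman(G))
print(sorted(ranking, key=ranking.get, reverse=True)[:2])  # the two bridge proteins
print([sorted(c) for c in first_split])                    # [['P1','P2','P3'], ['P4','P5','P6']]
```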

Keywords: Betweenness centrality, interactome networks, protein-protein interactions, protein communities, cancer.

105 Single Event Transient Tolerance Analysis in 8051 Microprocessor Using Scan Chain

Authors: Jun Sung Go, Jong Kang Park, Jong Tae Kim

Abstract:

As semiconductor manufacturing technology evolves, the single event transient problem becomes a more significant issue. Single event transients have a critical impact on both combinational and sequential logic circuits, so it is important to evaluate the soft error tolerance of circuits at the design stage. In this paper, we present a soft error detection simulation using a scan chain. The simulation model generates a single event transient randomly in the circuit and detects the soft error during the execution of the test patterns. We verified this model by inserting a scan chain into an 8051 microprocessor using 65 nm CMOS technology. While the test patterns generated by an ATPG program pass through the scan chain, we insert a single event transient and count the number of soft errors per sub-module. The experiments show that the soft error rate per cell area of the SFR module is 277% larger than that of the other modules.

Keywords: Scan chain, single event transient, soft error, 8051 processor.

104 A Serializability Condition for Multi-step Transactions Accessing Ordered Data

Authors: Rafat Alshorman, Walter Hussak

Abstract:

In mobile environments, unspecified numbers of transactions arrive in continuous streams. To prove correctness of their concurrent execution a method of modelling an infinite number of transactions is needed. Standard database techniques model fixed finite schedules of transactions. Lately, techniques based on temporal logic have been proposed as suitable for modelling infinite schedules. The drawback of these techniques is that proving the basic serializability correctness condition is impractical, as encoding (the absence of) conflict cyclicity within large sets of transactions results in prohibitively large temporal logic formulae. In this paper, we show that, under certain common assumptions on the graph structure of data items accessed by the transactions, conflict cyclicity need only be checked within all possible pairs of transactions. This results in formulae of considerably reduced size in any temporal-logic-based approach to proving serializability, and scales to arbitrary numbers of transactions.
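
The pairwise reduction is the paper's contribution; the standard check it reduces to, building a conflict (precedence) graph over a finite schedule and testing it for cycles, can be sketched as follows (toy two-transaction schedule, not the temporal-logic encoding used by the authors).

```python
from itertools import combinations

def conflict_edges(schedule):
    """Precedence edges Ti -> Tj for a schedule of (txn, action, item) steps.

    Two steps conflict if they touch the same item, come from different
    transactions, and at least one of them is a write.
    """
    edges = set()
    for i, (t1, a1, x1) in enumerate(schedule):
        for t2, a2, x2 in schedule[i + 1:]:
            if t1 != t2 and x1 == x2 and 'w' in (a1, a2):
                edges.add((t1, t2))
    return edges

def has_cycle_between(t1, t2, edges):
    """Conflict cyclicity restricted to one pair of transactions."""
    return (t1, t2) in edges and (t2, t1) in edges

# Toy interleaving of two transactions over items x and y.
s = [("T1", "r", "x"), ("T2", "r", "y"), ("T2", "w", "x"),
     ("T1", "w", "y"), ("T1", "w", "x")]
edges = conflict_edges(s)
txns = {step[0] for step in s}
print(edges)
# True here: the pair T1, T2 is conflict-cyclic, so the schedule is not serializable.
print(any(has_cycle_between(a, b, edges) for a, b in combinations(txns, 2)))
```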

Keywords: multi-step transactions, serializability, directed graph.

103 Deviations and Defects of the Sub-Task’s Requirements in Construction Projects

Authors: Abdullah Almusharraf, Andrew Whyte

Abstract:

The pattern of sub-task deviations and defects should be identified and understood in order to improve the quality of practices in construction projects. Therefore, the susceptibility of sub-tasks to deviations and defects has been evaluated and classified via six classifications proposed in this study. Thirty-four case studies of a specific sub-task (compression members in constructed concrete structures) were collected from seven construction projects in order to examine the study's proposed classifications. The study revealed that the sub-task has a high sensitivity to deviation, with 91% of the cases recorded as deviations; however, only 19% of cases were recorded as defects. Another finding was that the actual work during the execution process is the major source of deviation for this sub-task (74%), while in only 26% of cases the deviation stemmed from both the design documentation and the actual work. These findings imply that the study's proposed classifications could be used to determine the pattern of each sub-task and to develop proactive actions to overcome sub-task deviations and defects.

Keywords: Sub-tasks, deviations, defects, quality, construction projects.

102 3D Network-on-Chip with on-Chip DRAM: An Empirical Analysis for Future Chip Multiprocessor

Authors: Thomas Canhao Xu, Bo Yang, Alexander Wei Yin, Pasi Liljeberg, Hannu Tenhunen

Abstract:

With the increasing number of on-chip components and the critical requirement for processing power, the Chip Multiprocessor (CMP) has gained wide acceptance in both academia and industry during the last decade. However, conventional bus-based on-chip communication schemes suffer from very high communication delay and low scalability in large scale systems. Network-on-Chip (NoC) has been proposed to solve the bottleneck of parallel on-chip communication by applying different network topologies which separate the communication phase from the computation phase. Observing that the memory bandwidth of the communication between on-chip components and off-chip memory has become a critical problem even in NoC-based systems, in this paper we propose a novel 3D NoC with on-chip Dynamic Random Access Memory (DRAM) in which different layers are dedicated to different functionalities such as processors, cache or memory. Results show that, by using our proposed architecture, average link utilization is reduced by 10.25% for SPLASH-2 workloads. Our proposed design requires 1.12% fewer execution cycles than the traditional design on average.

Keywords: 3D integration, network-on-chip, memory-on-chip, DRAM, chip multiprocessor.

101 Enhancement of Visual Comfort Using Parametric Double Skin Façades

Authors: Ahmed Ashraf Khamis, Sherif A. Ibrahim, Mahmoud ElKhatieb, Mohamed A. Barakat

Abstract:

Parametric design is deemed one of the icons of modern architectural trends, facilitating complex design decisions by altering various design parameters. Double skin façades are one of the applications used in parametric design. This paper aims to enhance several daylight parameters of a selected case study office building in Cairo using a parametric double skin façade. First, the design and optimization process was executed using the Grasshopper parametric design software package, in which the daylighting performance of the base case building model was compared with that of the double façade design, showing an enhancement in task plane illuminance of 180%. Second, execution drawings were made for the optimized design using Revit software. Finally, the computerized digital fabrication stages of the designed model at various scales are demonstrated to reach the final design decisions, using Simplify 3D for mock-up digital fabrication.

Keywords: Parametric design, Double skin façades, Digital Fabrication, Grasshopper, Simplify 3D.

100 The Public Law Studies: Relationship between Accountability, Environmental Education and Smart Cities

Authors: Aline Alves Bandeira, Luís Pedro Lima, Maria Cecília de Paula Silva, Paulo Henrique de Viveiros Tavares

Abstract:

Nowadays, the study of public policies regarding management efficiency is essential. Public policies concern what governments do or do not do, and the field has grown worldwide, contributing knowledge of technologies and methodologies that monitor and evaluate the performance of public administrators. The information published on official government websites needs to provide for the transparency and responsiveness of managers. Transparency is thus a primordial factor for the execution of accountability, providing services to citizens through the expansion of transparent, efficient and democratic information that values administrative eco-efficiency. The ecologically balanced management of a Smart City must optimize environmental education, building a fairer society that brings about equality in the use of quality environmental resources. Smart Cities add value to the construction of public management, enabling interaction between people, enhancing environmental education and the practical applicability of administrative eco-efficiency, fostering economic development and improving the quality of life.

Keywords: Accountability, environmental education, new public administration, smart cities.

99 Analysis of the Interference from Risk-Determining Factors of Cooperative and Conventional Construction Contracts

Authors: E. Harrer, M. Mauerhofer, T. Werginz

Abstract:

As a result of intensive competition, the building sector suffers from a high degree of rivalry. Furthermore, an unbalanced distribution of project risks can be observed: clients aim to shift their own risks into the sphere of the constructors or planners. The consequence of this is that the number of conflicts between the involved parties is inordinately high and even increasing. An alternative approach to counter these developments is cooperative project forms in the construction sector. This research compares conventional contract models with models containing partnering agreements, to examine the influence of an early integration of the involved parties on project risks. The goal is to identify deviations in the different project stages, from the design phase to the project transfer phase. These deviations are evaluated by a survey of experts from three spheres: clients, contractors and planners. By rating the influence of the participants on specific risk factors, it is possible to identify factors which are relevant for a smooth project execution.

Keywords: Collaborative work, construction industry, contract-models, influence, partnering, project management, risk.

98 A Characterized and Optimized Approach for End-to-End Delay Constrained QoS Routing

Authors: P.S.Prakash, S.Selvan

Abstract:

QoS routing aims to find paths between senders and receivers that satisfy the QoS requirements of the application while using network resources efficiently; the underlying routing algorithm must be able to find low-cost paths that satisfy the given QoS constraints. The problem of finding least-cost routes under such constraints is known to be NP-hard, and several algorithms have been proposed to find near-optimal solutions. However, these heuristics either impose relationships among the link metrics to reduce the complexity of the problem, which may limit their general applicability, or are too costly in terms of execution time to be applicable to large networks. In this paper, we analyze two algorithms, namely Characterized Delay Constrained Routing (CDCR) and Optimized Delay Constrained Routing (ODCR). The CDCR algorithm presents an approach to delay-constrained routing that captures the trade-off between cost minimization and the risk level regarding the delay constraint. ODCR uses an adaptive path weight function together with an additional constraint imposed on the path cost to restrict the search space, and hence finds a near-optimal solution much more quickly.
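
Neither CDCR nor ODCR is specified in the abstract; the underlying delay-constrained least-cost (DCLC) problem itself can be illustrated with a small exhaustive search that prunes partial paths violating the delay bound. This is exponential in the worst case, which is exactly why heuristics like the two analyzed here exist; the topology below is hypothetical.

```python
import math

def dclc_path(graph, src, dst, max_delay):
    """Least-cost path from src to dst whose total delay stays within max_delay.

    graph : dict  node -> list of (neighbour, cost, delay) edges
    Exhaustive depth-first search with delay and cost pruning; fine for tiny
    topologies, intractable in general (the problem is NP-hard).
    """
    best = (math.inf, None)

    def dfs(node, visited, cost, delay, path):
        nonlocal best
        if delay > max_delay or cost >= best[0]:
            return                                  # violates the bound or cannot improve
        if node == dst:
            best = (cost, path)
            return
        for nxt, c, d in graph[node]:
            if nxt not in visited:
                dfs(nxt, visited | {nxt}, cost + c, delay + d, path + [nxt])

    dfs(src, {src}, 0, 0, [src])
    return best

# Hypothetical topology: (neighbour, cost, delay) per link.
g = {"A": [("B", 1, 5), ("C", 4, 1)],
     "B": [("D", 1, 5)],
     "C": [("D", 4, 1)],
     "D": []}
print(dclc_path(g, "A", "D", max_delay=6))   # -> (8, ['A', 'C', 'D']); the cheap path is too slow
```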

Keywords: QoS, Delay, Routing, Optimization

97 Validation of Reverse Engineered Web Application Models

Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini

Abstract:

Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused attention on Web application design, development, analysis and testing, by studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. The use of traditional static source code analysis can be very difficult, due to the presence of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so we need a pruning mechanism.

Keywords: Validation, Dynamic Analysis, Mutation Analysis, Reverse Engineering, Web Applications

96 Schedule Management of an Enterprise Receiving Orders Considering Dependency between Unit Tasks of a Collaborative Project

Authors: Joseph Oh, Bo-Hyun Kim, Jae-Yong Baek

Abstract:

This study suggests how an order-receiving company can avoid disclosing schedule information on unit tasks to the order-placing company when carrying out a collaborative project on the value chain in an order-oriented industry. Specifically, it suggests methods for keeping schedule information confidential and categorizes potential situations by inter-task dependency. Lastly, an approach to selecting the optimal non-disclosure method is discussed. With the methods for not disclosing work-related information suggested in the study, order-receiving companies can logically deal with political issues relating to the question of whether or not to disclose information upon the execution of a collaborative project in cooperation with an order-placing firm. Moreover, order-placing companies can monitor undistorted information while respecting the legitimate rights of the order-receiving company. Therefore, it is fair to say that the suggestions made in this study will contribute to the smooth operation of collaborative inter-company projects.

Keywords: collaborative project, dependency, schedule management, unit task.
