Search results for: multiple criteria decision making analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11469

9129 An Investigation into the Use of an Atomistic, Hermeneutic, Holistic Approach in Education Relating to the Architectural Design Process

Authors: N. Pritchard

Abstract:

Within architectural education, students arrive fore-armed with their life-experience, knowledge gained from subject-based learning, and their brains and, more specifically, their imaginations. The learning-by-doing that they embark on in studio-based/project-based learning calls for supervision that allows the student to proactively undertake research and experimentation with design solution possibilities. The degree to which this supervision includes direction is subject to debate and differing opinion. It can be argued that if the student is to learn by doing, then design decision making within the design process needs to be instigated and owned by the student so that they have the ability to personally reflect on and evaluate those decisions. Within this premise lies the problem that the student's endeavours can become unstructured and unfocused as they work their way into a new and complex activity. A resultant weakness can be that the design activity is compartmentalised rather than holistic or comprehensive, and the student's reflections are consequently impoverished in terms of providing a positive, informative feedback loop. The construct proffered in this paper is that a supportive 'armature' or 'Heuristic-Framework' can be developed that facilitates a holistic approach and reflective learning. The normal explorations of architectural design comprise analysing the site and context, reviewing building precedents, and assimilating the briefing information. However, the student can still be compromised by 'not knowing what they need to know'. The long-serving triad 'Firmness, Commodity and Delight' provides a broad-brush framework of considerations to explore and integrate into good design. If this were further atomised into subdivisions formed from the disparate aspects of architectural design that need to be considered within the design process, then the student could sieve through the facts more methodically and reflectively, considering their interrelationships, conflicts and alliances. The words 'facts' and 'sieve' hold the acronym of the aspects that form the Heuristic-Framework: Function, Aesthetics, Context, Tectonics, Spatial, Servicing, Infrastructure, Environmental, Value and Ecological issues. The Heuristic could be used as a Hermeneutic Model, with each aspect of design being focused on and considered in abstraction and then considered in its relation to the other aspects and to the design proposal as a whole. Importantly, the heuristic could be used as a method for gathering information and enhancing the design brief. The more poetic, mysterious, intuitive, unconscious processes should still be able to occur for the student. The Heuristic-Framework should not be seen as comprehensive, prescriptive, formulaic, or inhibiting to the wide exploration of possibilities and solutions within the architectural design process.

Keywords: Atomistic, hermeneutic, holistic approach, architectural design studio education.

9128 Identification of Igneous Intrusions in South Zallah Trough, Sirt Basin, Libya

Authors: Mohamed A. Saleem

Abstract:

Using mostly seismic data, this study intends to show some examples of igneous intrusions found in some areas of the Sirt Basin and to explore the period of their emplacement as well as the interrelationships between these sills. The study area is located in the south of the Zallah Trough, south-west Sirt Basin, Libya, precisely between longitudes 18.35ᵒ E and 19.35ᵒ E and latitudes 27.8ᵒ N and 28.0ᵒ N. Based on a variety of criteria that are usually used as marks of igneous intrusions, 12 igneous intrusions (sills) have been detected and analysed using 3D seismic data. One or more of the following were used as identification criteria: high-amplitude reflectors paired with abrupt reflector terminations, vertical offsets or what is described as a dike-like connection, violation, the saucer form, and roughness. Because they lie between the hosting layers, the majority of these intrusions are classified as sills. Another distinguishing feature is the intersection geometry linking some of these sills. Each sill has been given a name simply to distinguish the sills from each other, such as S-1, S-2, … S-12. To avoid repetition of description, the common characteristics and some statistics of these sills are shown in summary tables, while the specific characteristics that are not common and have been noticed for each sill are described individually. The sills S-1, S-2, and S-3 are approximately parallel to one another, with their shape being governed by the syncline structure of their host layers. The faults that dominate the strata (pre-Upper Cretaceous strata) have a significant impact on the sills and have caused their discontinuity, while the upper layers have the shape of anticlines. S-1 and S-10 are the group's deepest and shallowest sills, respectively, with S-1 seated near the top of the basement and S-10 extending into the Upper Cretaceous sequence. The dramatic escalation of sill S-4 can be seen in north-south profiles. The majority of the interpreted sills are influenced by a large number of normal faults that strike in various directions and propagate vertically from the surface to the top of the basement. This indicates that the sediment sequences had already been deposited before the sills intruded and that the faults are younger, occurring more recently. The pre-Upper Cretaceous unit is the current geological host for sills S-1, S-2, … S-9, while sills S-10, S-11, and S-12 are hosted by the Upper Cretaceous unit. Over the sills S-1, S-2, and S-3, which are the deepest sills, the pre-Upper Cretaceous surface shows slight forced folding; this forced folding is also noticed above the right and left tips of sills S-8 and S-6, respectively, while the absence of these marks on the overlying sequences of layers supports the idea that the aforementioned sills were emplaced during the early Upper Cretaceous period.

Keywords: Sirt Basin, Zallah Trough, igneous intrusions, seismic data.

9127 An Economic Analysis of Phu Kradueng National Park

Authors: Chutarat Boontho

Abstract:

The purposes of this study were to evaluate the economic value of Phu Kradueng National Park by the travel cost method (TCM) and the contingent valuation method (CVM) and to estimate the demand for traveling and the willingness to pay. The data for this study were collected by conducting two large-scale surveys of users and non-users. A total of 1,016 users and 1,034 non-users were interviewed. The data were analyzed using multiple linear regression analysis and a logistic regression model, and the consumer surplus (CS) was computed as the integral of the demand function for trips. The findings were as follows: 1) Using the travel cost method, which provides an estimate of direct benefits to park users, we found that visitors' total willingness to pay per visit was 2,284.57 baht, of which 958.29 baht was travel cost, 1,129.82 baht was expenditure for accommodation, food, and services, and 166.66 baht was consumer surplus, or the visitors' net gain or satisfaction from the visit (the integral of the demand function for trips). 2) Thai visitors to Phu Kradueng National Park were further willing to pay an average of 646.84 baht per head per year to ensure the continued existence of Phu Kradueng National Park and to preserve their option to use it in the future. 3) Thai non-visitors, on the other hand, were willing to pay an average of 212.61 baht per head per year for the option and existence value provided by the park. 4) The total economic value of Phu Kradueng National Park to Thai visitors and non-visitors taken together stands today at 9,249.55 million baht per year. 5) The users' average willingness to pay for access to Phu Kradueng National Park rises from 40 baht to 84.66 baht per head per trip for improved services such as road improvement, increased cleanliness, and upgraded information. Further work is needed to investigate the potential market demand for bioprospecting in Phu Kradueng National Park and to investigate how a larger share of the economic benefits of tourism could be distributed to local residents.
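
To make the travel cost logic above concrete, the sketch below fits a hypothetical semi-log trip demand function to invented survey points and computes consumer surplus as the area under the demand curve above the current travel cost; the data, functional form and cost values are assumptions for illustration only, not the study's actual model or figures.

```python
import numpy as np

# Hedged illustration of the travel cost method (TCM) logic, not the authors' model:
# fit a semi-log trip demand function  trips = exp(a + b * travel_cost)  (b < 0)
# and take consumer surplus as the area under the demand curve above the current cost.

# Hypothetical survey points: travel cost per trip (baht) and trips taken per year.
cost = np.array([200, 400, 600, 800, 1000, 1200], dtype=float)
trips = np.array([6.0, 4.1, 2.9, 2.0, 1.4, 1.0])

# Ordinary least squares on log(trips) = a + b * cost.
b, a = np.polyfit(cost, np.log(trips), 1)

def demand(c):
    return np.exp(a + b * c)

# Consumer surplus above a current travel cost c0: integrate demand from c0 upward.
c0 = 600.0
grid = np.linspace(c0, c0 + 5000, 2001)          # truncate the upper tail numerically
cs_numeric = np.trapz(demand(grid), grid)

# For the semi-log form the integral has a closed form: trips(c0) / (-b),
# and the surplus per individual trip is simply -1/b.
cs_per_trip = -1.0 / b

print(f"fitted a = {a:.3f}, b = {b:.5f}")
print(f"consumer surplus above cost {c0:.0f} baht: {cs_numeric:.1f} (numeric)")
print(f"consumer surplus per trip: {cs_per_trip:.1f} baht")
```

The closed-form result is why TCM studies can report a single per-trip surplus figure alongside the fitted demand function.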

Keywords: Contingent Valuation Method, Travel Cost Method, Consumer surplus.

9126 Coding based Synchronization Algorithm for Secondary Synchronization Channel in WCDMA

Authors: Deng Liao, Dongyu Qiu, Ahmed K. Elhakeem

Abstract:

A new code synchronization algorithm is proposed in this paper for the secondary cell-search stage in wideband CDMA systems. Rather than using the Cyclically Permutable (CP) code in the Secondary Synchronization Channel (S-SCH) to simultaneously determine the frame boundary and scrambling code group, the new synchronization algorithm implements the same function with less system complexity and a lower Mean Acquisition Time (MAT). The Secondary Synchronization Code (SSC) is redesigned by splitting it into two sub-sequences. We treat the information of the scrambling code group as data bits and use simple time-diversity BCH coding for further reliability. This avoids involved and time-costly Reed-Solomon (RS) code computations and comparisons. Analysis and simulation results show that the Synchronization Error Rate (SER) yielded by the new algorithm in Rayleigh fading channels is close to that of the conventional algorithm in the standard. This new synchronization algorithm reduces system complexity, shortens the average cell-search time and can be implemented in the slot-based cell-search pipeline. By exploiting antenna diversity and pipelining the correlation processes, the new algorithm also shows its flexible application in multiple-antenna systems.
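
For readers unfamiliar with the baseline the paper improves on, the sketch below shows the conventional secondary cell-search idea: match the detected per-slot SSC indices against all cyclic shifts of every code-group sequence to recover the scrambling code group and frame boundary jointly. The code-group table here is randomly generated (not the 3GPP table), and the paper's split-SSC/BCH scheme is not implemented.

```python
import numpy as np

# Hedged sketch of the conventional CP-code secondary cell search: each frame carries a
# length-15 sequence of SSC indices, and the receiver matches the detected (possibly
# erroneous) sequence against every cyclic shift of every code-group sequence.
# Illustrative only; table and error model are assumptions, not the WCDMA standard.

rng = np.random.default_rng(7)
N_GROUPS, SLOTS, N_SSC = 64, 15, 16
group_table = rng.integers(0, N_SSC, size=(N_GROUPS, SLOTS))   # stand-in for the CP code table

true_group, true_shift = 13, 6
frame = np.roll(group_table[true_group], true_shift)

err_mask = rng.random(SLOTS) < 0.2                             # some slots detected wrongly
noisy = frame.copy()
noisy[err_mask] = rng.integers(0, N_SSC, size=err_mask.sum())

best = max(((g, s) for g in range(N_GROUPS) for s in range(SLOTS)),
           key=lambda gs: np.sum(noisy == np.roll(group_table[gs[0]], gs[1])))
print("estimated (group, frame shift):", best, " true:", (true_group, true_shift))
```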

Keywords: WCDMA cell-search, synchronization algorithm, secondary synchronization channel, antenna diversity.

9125 Technical Trading Rules in Emerging Stock Markets

Authors: Stefaan Pauwels, Koen Inghelbrecht, Dries Heyman, Pieter Marius

Abstract:

Literature reveals that many investors rely on technical trading rules when making investment decisions. If stock markets are efficient, one cannot achieve superior results by using these trading rules. However, if market inefficiencies are present, profitable opportunities may arise. The aim of this study is to investigate the effectiveness of technical trading rules in 34 emerging stock markets. The performance of the rules is evaluated by utilizing White's Reality Check and the Superior Predictive Ability test of Hansen, along with an adjustment for transaction costs. These tests are able to evaluate whether the best model performs better than a buy-and-hold benchmark. Further, they provide an answer to data snooping problems, which is essential to obtain unbiased outcomes. Based on our results, we conclude that technical trading rules are not able to outperform a naïve buy-and-hold benchmark on a consistent basis. However, we do find significant trading rule profits in 4 of the 34 investigated markets. We also present evidence that technical analysis is more profitable in crisis situations. Nevertheless, this result is relatively weak.
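
As a minimal illustration of what a single technical trading rule looks like before any Reality Check / SPA bootstrap is applied, the sketch below runs a moving-average crossover rule with proportional transaction costs against buy-and-hold on a synthetic price series; the parameters and data are assumptions, not the paper's 34-market dataset.

```python
import numpy as np

# Hedged sketch: one moving-average crossover rule with proportional transaction costs,
# compared against buy-and-hold on a synthetic price series. Illustrative only; the
# paper's rule universe and the Reality Check / SPA bootstrap tests are not reproduced.

rng = np.random.default_rng(3)
n = 1000
returns = rng.normal(0.0004, 0.015, n)                 # synthetic daily log-returns
prices = 100 * np.exp(np.cumsum(returns))

short_w, long_w, cost = 20, 100, 0.001                 # assumed rule parameters and cost per trade

def sma(x, w):
    out = np.full_like(x, np.nan)
    c = np.cumsum(np.insert(x, 0, 0.0))
    out[w - 1:] = (c[w:] - c[:-w]) / w
    return out

signal = (sma(prices, short_w) > sma(prices, long_w)).astype(float)   # 1 = long, 0 = cash
position = np.roll(signal, 1)                                         # act one day after the signal
position[0] = 0.0
trades = np.abs(np.diff(position, prepend=0.0))

rule_returns = position * returns - trades * cost
print(f"buy-and-hold total log-return: {returns.sum():.3f}")
print(f"MA crossover total log-return: {rule_returns.sum():.3f}")
```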

Keywords: technical trading rules, Reality Check, Superior Predictive Ability, emerging stock markets, data snooping

9124 Developing a Structured and Strategically Focused Performance Assessment System

Authors: Isabel Duarte de Almeida, João Vilas-Boas, Ana Abrantes Cabral

Abstract:

The number and adequacy of Performance Indicators (PIs) for organisational purposes are core to the success of organisations and a major concern to the sponsor of this research. This assignment developed a procedure to improve a firm’s performance assessment system by identifying two key PIs out of 28 initial ones, and by setting criteria and their relative importance to validate and rank the adequacy and the right number of operational metrics. The Analytical Hierarchy Process was used with a synthesis method to treat data coming from the management inquiries. Although organisational alignment has been achieved, business processes should also be targeted and PIs continuously revised.
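
A minimal sketch of the AHP step referred to above: priority weights from the principal eigenvector of a pairwise comparison matrix, plus Saaty's consistency ratio. The criteria names and judgement matrix are invented for illustration, not the sponsor firm's management-inquiry data.

```python
import numpy as np

# Hedged sketch of the Analytical Hierarchy Process: derive priority weights for
# candidate criteria from a pairwise comparison matrix and check consistency.
# The matrix below is an assumption for illustration only.

criteria = ["strategic alignment", "measurability", "data availability", "cost of collection"]
A = np.array([[1,   3,   5,   7],
              [1/3, 1,   3,   5],
              [1/5, 1/3, 1,   3],
              [1/7, 1/5, 1/3, 1]], dtype=float)   # Saaty 1-9 pairwise judgements

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                                   # priority weights

n = A.shape[0]
lam_max = eigvals.real[k]
CI = (lam_max - n) / (n - 1)                      # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]               # Saaty's random index
CR = CI / RI                                      # consistency ratio (acceptable if < 0.10)

for c, wi in sorted(zip(criteria, w), key=lambda t: -t[1]):
    print(f"{c:22s} {wi:.3f}")
print(f"lambda_max = {lam_max:.3f}, CR = {CR:.3f}")
```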

Keywords: Strategic performance assessment systems, Key Performance Indicators (KPIs), Analytical Hierarchy Process (AHP).

9123 Multi-Dimensional Concerns Mining for Web Applications via Concept-Analysis

Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini

Abstract:

Web applications have become very complex and crucial, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has therefore focused its attention on Web application design, development, analysis, and testing, by studying and proposing methodologies and tools. This paper proposes an approach to automatic multi-dimensional concern mining for Web applications, based on concept analysis, impact analysis, and token-based concern identification. This approach lets the user analyse and traverse Web software relevant to a particular concern (concept, goal, purpose, etc.) via multi-dimensional separation of concerns, in order to document, understand and test Web applications. This technique was developed in the context of the WAAT (Web Applications Analysis and Testing) project. A semi-automatic tool to support this technique is currently under development.

Keywords: Concepts Analysis, Concerns Mining, Multi-Dimensional Separation of Concerns, Impact Analysis.

9122 Genetic Algorithm for In-Theatre Military Logistics Search-and-Delivery Path Planning

Authors: Jean Berger, Mohamed Barkaoui

Abstract:

Discrete search path planning in a time-constrained, uncertain environment relying upon imperfect sensors is known to be hard, and the problem-solving techniques proposed so far to compute near real-time efficient path plans are mainly limited to providing few-move solutions. A new information-theoretic, open-loop decision model that explicitly incorporates false-alarm sensor readings is presented to solve a single-agent military logistics search-and-delivery path planning problem with anticipated feedback. The decision model consists of minimizing expected entropy considering anticipated possible observation outcomes over a given time horizon. The model captures uncertainty associated with observation events for all possible scenarios. Entropy represents a measure of uncertainty about the searched target location. Feedback information resulting from possible sensor observation outcomes along the projected path plan is exploited to update anticipated unit target occupancy beliefs. For the first time, a compact belief update formulation is generalized to explicitly include false positive observation events that may occur during plan execution. A novel genetic algorithm is then proposed to efficiently solve search path planning, providing near-optimal solutions for practical realistic problem instances. Given the run-time performance of the algorithm, a natural extension to a closed-loop environment to progressively integrate real visit outcomes on a rolling time horizon can easily be envisioned. Computational results show the value of the approach in comparison to alternate heuristics.
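
A hedged sketch of the kind of belief update and expected-entropy criterion described above, written in a generic Bayesian form with detection probability pd and false alarm probability pfa; it is not the paper's compact formulation or its genetic algorithm, and the grid, probabilities and prior are assumptions.

```python
import numpy as np

# Hedged sketch: Bayesian update of target-occupancy beliefs over grid cells with an
# imperfect sensor (detection probability pd, false alarm probability pfa), plus the
# expected posterior entropy used to rank candidate moves. Generic formulation only.

def update_beliefs(belief, visited_cell, observed_detection, pd=0.8, pfa=0.1):
    """belief[i] = P(target in cell i); the agent senses 'visited_cell' once."""
    if observed_detection:
        likelihood = np.full_like(belief, pfa)     # P(detect | target elsewhere)
        likelihood[visited_cell] = pd              # P(detect | target here)
    else:
        likelihood = np.full_like(belief, 1.0 - pfa)
        likelihood[visited_cell] = 1.0 - pd
    posterior = likelihood * belief
    return posterior / posterior.sum()

def expected_entropy(belief, cell, pd=0.8, pfa=0.1):
    """Expected posterior entropy if the agent next senses 'cell'."""
    p_det = pd * belief[cell] + pfa * (1.0 - belief[cell])   # predictive prob. of detection
    ent = lambda b: -np.sum(b * np.log2(b + 1e-12))
    return (p_det * ent(update_beliefs(belief, cell, True, pd, pfa)) +
            (1 - p_det) * ent(update_beliefs(belief, cell, False, pd, pfa)))

belief = np.full(9, 1.0 / 9)                  # uniform prior over a 3x3 grid (flattened)
belief = update_beliefs(belief, visited_cell=4, observed_detection=False)
best_next = min(range(9), key=lambda c: expected_entropy(belief, c))
print(belief.round(3), "next cell to sense:", best_next)
```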

Keywords: Search path planning, false alarm, search-and-delivery, entropy, genetic algorithm.

9121 The Performance Analysis of Error Saturation Nonlinearity LMS in Impulsive Noise based on Weighted-Energy Conservation

Authors: T Panigrahi, G Panda, Mulgrew

Abstract:

This paper introduces a new approach for the performance analysis of an adaptive filter with an error saturation nonlinearity in the presence of impulsive noise. The performance analysis of adaptive filters includes both the transient analysis, which shows how fast a filter learns, and the steady-state analysis, which shows how well a filter learns. The recursive expressions for mean-square deviation (MSD) and excess mean-square error (EMSE) are derived based on weighted-energy conservation arguments, which provide the transient behavior of the adaptive algorithm. The steady-state behavior for correlated input regressor data is also analyzed, so this approach leads to new performance results without restricting the input regression data to be white.
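
The sketch below runs an LMS filter with a smooth error-saturation nonlinearity in contaminated-Gaussian (impulsive) noise, purely to illustrate the class of algorithm being analyzed; the saturation function, noise model and parameters are assumptions, and the MSD/EMSE recursions derived in the paper are not reproduced.

```python
import numpy as np

# Hedged sketch of saturated-error LMS system identification in impulsive noise.
# Illustrative assumptions only; not the paper's weighted-energy-conservation analysis.

rng = np.random.default_rng(0)
N, M, mu, sat = 20000, 8, 0.01, 1.0            # samples, taps, step size, saturation level
w_true = rng.standard_normal(M)

x = rng.standard_normal(N)                     # white input regressor (could be colored)
noise = rng.standard_normal(N) * 0.05
impulses = (rng.random(N) < 0.01) * rng.standard_normal(N) * 20.0   # rare large outliers
d = np.convolve(x, w_true, mode="full")[:N] + noise + impulses

w = np.zeros(M)
msd = []
for n in range(M, N):
    u = x[n - M + 1:n + 1][::-1]               # regressor vector
    e = d[n] - w @ u                           # a-priori error
    g = sat * np.tanh(e / sat)                 # smooth error-saturation nonlinearity
    w = w + mu * g * u                         # saturated-error LMS update
    msd.append(np.sum((w_true - w) ** 2))      # mean-square deviation sample path

print("final MSD:", msd[-1])
```

The saturation keeps a single impulsive sample from injecting a huge update, which is exactly why this variant is studied under impulsive noise.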

Keywords: Error saturation nonlinearity, transient analysis, impulsive noise.

9120 A Multi-Feature Deep Learning Algorithm for Urban Traffic Classification with Limited Labeled Data

Authors: Rohan Putatunda, Aryya Gangopadhyay

Abstract:

Acoustic sensors, if embedded in smart street lights, can help in capturing the activities (car honking, sirens, events, traffic, etc.) in cities. Needless to say, the acoustic data from such scenarios are complex due to multiple audio streams originating from different events, and when decomposed into independent signals, the amount of retrieved data is small, which is inadequate to train deep neural networks. In this paper, we therefore address two challenges: a) separating the mixed signals, and b) developing an efficient acoustic classifier under data paucity. To address these challenges, we propose an architecture with supervised deep learning, where the initially captured mixed acoustic data are analyzed with the Fast Fourier Transform (FFT), followed by filtering the noise from the signal, and then decomposed into independent signals by fast independent component analysis (FastICA). To address the challenge of data paucity, we propose a multi-feature deep neural network whose high performance is reflected in our experiments when compared to a conventional convolutional neural network (CNN) and a multi-layer perceptron (MLP).
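
A minimal sketch of the pre-processing pipeline described above (FFT-based filtering followed by FastICA separation) on a synthetic two-source mixture; the signals, filter band and mixing matrix are assumptions, and the multi-feature DNN itself is not shown.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Hedged sketch: FFT band filtering, then FastICA source separation, on synthetic audio.
# Assumptions only; not the paper's acoustic dataset, filter bands or classifier.

fs = 8000
t = np.arange(0, 2.0, 1.0 / fs)
siren = np.sin(2 * np.pi * 700 * t) * (1 + 0.5 * np.sin(2 * np.pi * 3 * t))
honk = np.sign(np.sin(2 * np.pi * 400 * t)) * (t % 0.5 < 0.2)
sources = np.c_[siren, honk]

A = np.array([[1.0, 0.6], [0.4, 1.0]])          # unknown mixing matrix (two microphones)
mixed = sources @ A.T + 0.01 * np.random.default_rng(1).standard_normal((len(t), 2))

def fft_bandpass(sig, low=100, high=2000):
    """Zero out FFT bins outside [low, high] Hz - a crude noise filter."""
    spec = np.fft.rfft(sig, axis=0)
    freqs = np.fft.rfftfreq(sig.shape[0], d=1.0 / fs)
    spec[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spec, n=sig.shape[0], axis=0)

filtered = fft_bandpass(mixed)
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(filtered)          # estimated independent signals

print("recovered shape:", recovered.shape)       # (samples, 2) - one column per source
```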

Keywords: FFT, ICA, vehicle classification, multi-feature DNN, CNN, MLP.

9119 2D Graphical Analysis of Wastewater Influent Capacity Time Series

Authors: Monika Chuchro, Maciej Dwornik

Abstract:

The extraction of meaningful information from images could be an alternative method for time series analysis. In this paper, we propose a graphical analysis of time series grouped into a table with an adjusted colour scale for the numerical values. The advantages of this method are also discussed. The proposed method is easy to understand and is flexible enough to implement the standard methods of pattern recognition and verification, especially for noisy environmental data.
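
A minimal sketch of the proposed presentation: reshape a (synthetic) daily influent series into a weeks-by-weekdays table and render it with a colour scale so weekly and seasonal structure, and outliers, become visible at a glance. The data and layout are assumptions for illustration, not the authors' wastewater records.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hedged sketch: render a time series as a colour-coded table (rows = weeks, cols = weekdays).
# Synthetic influent data; not the authors' dataset or colour scale.

rng = np.random.default_rng(42)
days = 52 * 7
weekly = 100 + 20 * np.sin(2 * np.pi * np.arange(days) / 7)     # weekly cycle
annual = 30 * np.sin(2 * np.pi * np.arange(days) / 365)         # seasonal trend
inflow = weekly + annual + rng.normal(0, 8, days)               # noisy influent volume

table = inflow.reshape(52, 7)                                   # one year as a 52 x 7 table

plt.figure(figsize=(4, 8))
plt.imshow(table, aspect="auto", cmap="viridis")
plt.colorbar(label="influent volume")
plt.xlabel("day of week")
plt.ylabel("week of year")
plt.title("Time series as a colour-coded table")
plt.tight_layout()
plt.savefig("influent_table.png")
```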

Keywords: graphical analysis, time series, seasonality, noisy environmental data

9118 Comparative Study of the Static and Dynamic Analysis of Multi-Storey Irregular Building

Authors: Bahador Bagheri, Ehsan Salimi Firoozabad, Mohammadreza Yahyaei

Abstract:

As the world moves towards the adoption of Performance-Based Engineering philosophies in the seismic design of civil engineering structures, new seismic design provisions require structural engineers to perform both static and dynamic analysis for the design of structures. While a linear Equivalent Static Analysis is performed for regular buildings up to 90 m in height in zones I and II, Dynamic Analysis should be performed for regular and irregular buildings in zones IV and V. Dynamic Analysis can take the form of a dynamic Time History Analysis or a linear Response Spectrum Analysis. In the present study, multi-storey irregular buildings with 20 stories have been modeled using the software packages ETABS and SAP2000 v.15 for seismic zone V in India. This paper also deals with the effect of the variation of the building height on the structural response of the shear wall building. Dynamic responses of the building under actual earthquakes, EL-CENTRO 1949 and CHI-CHI Taiwan 1999, have been investigated. This paper highlights the accuracy and exactness of Time History Analysis in comparison with the most commonly adopted Response Spectrum Analysis and Equivalent Static Analysis.
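
To clarify what a linear time history analysis computes, the sketch below integrates a single-degree-of-freedom oscillator under a synthetic accelerogram with the Newmark constant-average-acceleration method; a real ETABS/SAP2000 building model has many modes and members, so this is only a conceptual reduction under assumed parameters, not the study's model or ground-motion records.

```python
import numpy as np

# Hedged sketch: linear time-history response of a single-degree-of-freedom oscillator,
# m*a + c*v + k*u = -m*ag, integrated with Newmark's constant-average-acceleration scheme.
# Period, damping and the accelerogram are assumptions; not El Centro or Chi-Chi records.

def newmark_sdof(ag, dt, period=1.0, damping=0.05, gamma=0.5, beta=0.25):
    omega = 2 * np.pi / period
    m, k = 1.0, omega ** 2                     # unit mass, k = m * omega^2
    c = 2 * damping * omega * m
    n = len(ag)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    a[0] = (-m * ag[0] - c * v[0] - k * u[0]) / m
    denom = m + gamma * dt * c + beta * dt ** 2 * k
    for i in range(n - 1):
        p = -m * ag[i + 1]
        u_pred = u[i] + dt * v[i] + dt ** 2 * (0.5 - beta) * a[i]
        v_pred = v[i] + dt * (1 - gamma) * a[i]
        a[i + 1] = (p - c * v_pred - k * u_pred) / denom
        u[i + 1] = u_pred + beta * dt ** 2 * a[i + 1]
        v[i + 1] = v_pred + gamma * dt * a[i + 1]
    return u

dt = 0.01
t = np.arange(0, 20, dt)
ag = 0.3 * 9.81 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.15 * t)   # synthetic accelerogram
u = newmark_sdof(ag, dt, period=1.0, damping=0.05)
print(f"peak displacement demand: {np.max(np.abs(u)):.4f} m")
```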

Keywords: Equivalent Static Analysis, Time history method, Response spectrum method, Reinforced concrete building, displacement.

9117 Economic Factorial Analysis of CO2 Emissions: The Divisia Index with Interconnected Factors Approach

Authors: Alexander Y. Vaninsky

Abstract:

This paper presents a method of economic factorial analysis of CO2 emissions based on the extension of the Divisia index to interconnected factors. This approach, contrary to the Kaya identity, considers the three main factors of CO2 emissions (gross domestic product, energy consumption, and population) as equally important and allows for accounting for all of them simultaneously. The three factors are included in the analysis together with their carbon intensities, which allows for obtaining a comprehensive picture of the change in CO2 emissions. A computer program in the R language, available for free download, automates the calculations. A case study of the U.S. carbon dioxide emissions is used as an example.
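
For orientation, the sketch below performs a conventional additive log-mean Divisia (LMDI) decomposition of a CO2 change into Kaya-style factors with made-up numbers; it illustrates Divisia-index factorial analysis in general and is not the paper's interconnected-factors extension or its R program.

```python
import math

# Hedged sketch: additive log-mean Divisia (LMDI) decomposition of a CO2 change into
# Kaya-style factors (population, affluence, energy intensity, carbon intensity).
# Conventional textbook form with invented numbers; NOT the interconnected-factors method.

def logmean(x, y):
    return (x - y) / (math.log(x) - math.log(y)) if x != y else x

year0 = dict(pop=300.0, gdp=15000.0, energy=95.0, co2=5500.0)   # hypothetical base year
year1 = dict(pop=320.0, gdp=18000.0, energy=98.0, co2=5300.0)   # hypothetical end year

def factors(d):
    return dict(P=d["pop"],
                A=d["gdp"] / d["pop"],          # affluence (GDP per capita)
                E=d["energy"] / d["gdp"],       # energy intensity of GDP
                C=d["co2"] / d["energy"])       # carbon intensity of energy

f0, f1 = factors(year0), factors(year1)
L = logmean(year1["co2"], year0["co2"])         # log-mean weight of total emissions

effects = {k: L * math.log(f1[k] / f0[k]) for k in f0}
print("total change:", year1["co2"] - year0["co2"])
print({k: round(v, 1) for k, v in effects.items()})
print("sum of effects:", round(sum(effects.values()), 1))   # equals total change (additive LMDI)
```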

Keywords: CO2 emissions, Economic analysis, Factorial analysis, Divisia index, Interconnected factors.

9116 Validation of Reverse Engineered Web Application Models

Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini

Abstract:

Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused its attention on Web application design, development, analysis and testing, by studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. The use of traditional static source code analysis may be very difficult because of the presence of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so we need a pruning mechanism.

Keywords: Validation, Dynamic Analysis, Mutation Analysis, Reverse Engineering, Web Applications

9115 Direct Design of Steel Bridge Using Nonlinear Inelastic Analysis

Authors: Boo-Sung Koh, Seung-Eock Kim

Abstract:

In this paper, a direct design using a nonlinear inelastic analysis is suggested. This paper also compares the load carrying capacity obtained by a nonlinear inelastic analysis with experimental results to verify the accuracy of the results. The allowable stress design results of a railroad through-plate-girder bridge and the safety factor from the nonlinear inelastic analysis were compared to examine the safety performance. As a result, the load safety factor for the nonlinear inelastic analysis was twice as high as the required safety factor under the allowable stress design standard specified in the civil engineering structure design standards for urban magnetic levitation railways, which further verified the advantages of the proposed direct design method.

Keywords: Direct design, nonlinear inelastic analysis, residual stress, initial geometric imperfection.

9114 The Urban Development Boundary as a Planning Tool for Sustainable Urban Form: The South African Situation

Authors: E. J. Cilliers

Abstract:

It is the living conditions in the cities that determine the future of our livelihood: "To change life, we must first change space" (Henri Lefebvre). Sustainable development is a utopian aspiration for South African cities (especially the case study of the Gauteng City Region), which are currently characterized by unplanned growth and increasing urban sprawl. While the reasons for poor environmental quality and living conditions are undoubtedly diverse and complex, having political, economic and social dimensions, it is argued that the prevailing approach to layout planning in South Africa is part of the problem. This article seeks a solution to the problem of sustainability from a spatial planning perspective. The spatial planning tool, the urban development boundary, is introduced as the concept that will ensure that empty talk is translated into a sustainable vision. The urban development boundary is a spatial planning tool that can be used and implemented to direct urban growth towards a more sustainable form. The urban development boundary aims to ensure planned urban areas, in contrast to the current unplanned areas characterized by urban sprawl and insufficient infrastructure. However, the success of the urban development boundary concept is subject to effective implementation measures, as well as adequate and efficient management. The concept of sustainable development can function as a driving force underlying societal change and transformation, but the interface between spatial planning and environmental management needs to be established (as these are the core aspects underlying sustainable development), and authorities need to understand and implement this interface consistently. This interface can, however, be realized through the objectives of the planning tool, the urban development boundary. The case study, the Gauteng City Region, is depicted as a site of economic growth and innovation, but there is a lack of good urban and regional governance, impacting on the design (layout) and function of urban areas and land use, as current authorities make uninformed decisions on development applications, leading to unsustainable urban forms and unsustainable nodes. Place and space concepts are thus critical matters applicable to the planning of the Gauteng City Region. The urban development boundary is thus explored as a planning tool to guide decision-making and create a sustainable urban form, leading to better environmental and living conditions and continued sustainability.

Keywords: Urban planning, sustainable urban form, urban development boundary, planning tool.

9113 Organizational Decision Based on Business Intelligence

Authors: Pejman Hosseinioun, Rose Shayeghi, Ghasem Ghorbani Rostam

Abstract:

Nowadays, obtaining traditional statistics and reports is not adequate for the needs of organizational managers. In a world filled with information, managers need to analyze raw data and transform it into knowledge. Therefore, various processes have been developed in this regard. In the meantime, artificial intelligence-based processes are used, and new topics such as business intelligence and knowledge discovery have emerged. The current paper seeks to study business intelligence and its applications in organizations.

Keywords: Business intelligence, business intelligence infrastructures, business processes.

9112 Generator of Hypotheses an Approach of Data Mining Based on Monotone Systems Theory

Authors: Rein Kuusik, Grete Lind

Abstract:

Generator of hypotheses is a new method for data mining. It makes it possible to classify the source data automatically and produces a particular enumeration of patterns. A pattern is an expression (in a certain language) describing facts in a subset of facts. The goal is to describe the source data via patterns and/or IF...THEN rules. The evaluation criteria used are deterministic (not probabilistic). The search results are trees, a form that is easy to comprehend and interpret. Generator of hypotheses uses a very effective algorithm based on the theory of monotone systems (MS) named MONSA (MONotone System Algorithm).

Keywords: data mining, monotone systems, pattern, rule.

9111 Performance Analysis of Organic Rankine Cycle Technology to Exploit Low-Grade Waste Heat to Power Generation in Indian Industry

Authors: Bipul Krishna Saha, Basab Chakraborty, Ashish Alex Sam, Parthasarathi Ghosh

Abstract:

The demand for energy is cumulatively increasing with time. Since the availability of conventional energy resources is gradually dying out, significant interest is being placed on searching for alternate energy resources and minimizing the wastage of energy in various fields. In such a perspective, low-grade waste heat from several industrial sources can be reused to generate electricity. The present work aims to further the adoption of Organic Rankine Cycle (ORC) technology in the Indian industrial sector. The present paper focuses on extending the previously reported idea to the next level through a comparative review of three different working fluids, using practical data from an Indian industrial plant. For a comprehensive study in the simulation platform Aspen Hysys®, v8.6, the waste heat data have been collected from a current coke oven gas plant in India. A parametric analysis of the non-regenerative ORC and the regenerative ORC is executed using the working fluids R-123, R-11 and R-21 for a subcritical ORC system. The primary goal is to determine the optimal working fluid considering various system parameters such as turbine work output, obtained system efficiency, irreversibility rate and second law efficiency under the applied heat source temperatures (160 °C to 180 °C). Selection of the turbo-expander is one of the most crucial tasks for low-temperature applications in an ORC system. The present work is an attempt to make a suitable recommendation for the appropriate configuration of the turbine. In a nutshell, this study justifies the proficiency of integrating ORC technology in the Indian perspective and also finds the appropriate parameters of all components integrated in the ORC system for building an ORC prototype.

Keywords: Organic Rankine cycle, regenerative organic Rankine cycle, waste heat recovery, Indian industry.

9110 Model Solutions for Performance-Based Seismic Analysis of an Anchored Sheet Pile Quay Wall

Authors: C. J. W. Habets, D. J. Peters, J. G. de Gijt, A. V. Metrikine, S. N. Jonkman

Abstract:

Conventional seismic designs of quay walls in ports are mostly based on pseudo-static analysis. A more advanced alternative is the Performance-Based Design (PBD) method, which evaluates permanent deformations and amounts of (repairable) damage under seismic loading. The aim of this study is to investigate the suitability of this method for anchored sheet pile quay walls that were not purposely designed for seismic loads. A research methodology is developed in which pseudo-static, permanent-displacement and finite element analysis are employed, calibrated with an experimental reference case that considers a typical anchored sheet pile wall. A reduction factor that accounts for deformation behaviour is determined for pseudo-static analysis. A model to apply traditional permanent displacement analysis on anchored sheet pile walls is proposed. Dynamic analysis is successfully carried out. From the research it is concluded that PBD evaluation can effectively be used for seismic analysis and design of this type of structure.

Keywords: Anchored sheet pile quay wall, simplified dynamic analysis, performance-based design, pseudo-static analysis.

9109 Universal Current-Mode OTA-C KHN Biquad

Authors: Dalibor Biolek, Viera Biolková, Zden─øk Kolka

Abstract:

A universal current-mode biquad is described which represents an economical variant of the well-known KHN (Kerwin, Huelsman, Newcomb) voltage-mode filter. The circuit consists of two multiple-output OTAs and two grounded capacitors. Utilizing a simple splitter of the input current and a pair of jumpers, all the basic 2nd-order transfer functions can be implemented. The principle is verified by Spice simulation at the level of a CMOS structure of the OTAs.
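
The sketch below evaluates the standard second-order transfer functions that a KHN state-variable topology provides simultaneously (low-pass, band-pass, high-pass, and their combinations into band-stop and all-pass); it is the generic textbook form with an assumed pole frequency and Q, not a model of the paper's current-mode OTA-C circuit.

```python
import numpy as np

# Hedged sketch: the generic biquad responses shared by KHN-type state-variable filters.
# Assumed f0 and Q for illustration; the OTA-C current-mode realization is not modeled.

f0, Q = 10e3, 0.707                 # assumed pole frequency and quality factor
w0 = 2 * np.pi * f0
f = np.logspace(2, 6, 400)
s = 1j * 2 * np.pi * f

D = s**2 + (w0 / Q) * s + w0**2     # common denominator of the biquad
H = {
    "lowpass":  w0**2 / D,
    "bandpass": (w0 / Q) * s / D,
    "highpass": s**2 / D,
}
H["bandstop"] = H["lowpass"] + H["highpass"]            # notch from LP + HP
H["allpass"] = H["lowpass"] - H["bandpass"] + H["highpass"]

for name, h in H.items():
    print(f"{name:9s} |H| at f0 = {np.interp(f0, f, np.abs(h)):.3f}")
```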

Keywords: Biquad, current mode, OTA.

9108 Analysis of Medical Data using Data Mining and Formal Concept Analysis

Authors: Anamika Gupta, Naveen Kumar, Vasudha Bhatnagar

Abstract:

This paper focuses on analyzing medical diagnostic data using classification rules in data mining and context reduction in formal concept analysis. It helps in finding redundancies among the various medical examination tests used in the diagnosis of a disease. Classification rules have been derived from positive and negative association rules using the concept lattice structure of Formal Concept Analysis. The context reduction technique given in Formal Concept Analysis, along with the classification rules, has been used to find redundancies among the various medical examination tests. It also finds out whether expensive medical tests can be replaced by cheaper ones.
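
A small sketch of the formal-concept machinery referred to above: the two derivation operators on a toy binary context of patients and positive test results, and a brute-force enumeration of all formal concepts (from which a concept lattice and association/classification rules can be read off). The context is invented, not the paper's medical data.

```python
from itertools import combinations

# Hedged sketch of Formal Concept Analysis basics on a toy object-attribute context.
# Illustrative data only; not the paper's dataset, context reduction or rule derivation.

context = {                      # object -> set of attributes (positive findings)
    "p1": {"testA", "testB", "diseaseX"},
    "p2": {"testA", "testB"},
    "p3": {"testA", "diseaseX"},
    "p4": {"testB"},
}
objects = set(context)
attributes = set().union(*context.values())

def common_attrs(objs):          # A' : attributes shared by every object in objs
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

def common_objs(attrs):          # B' : objects possessing every attribute in attrs
    return {o for o in objects if attrs <= context[o]}

concepts = set()                 # a formal concept is a closed pair (A, B): A' = B and B' = A
for r in range(len(objects) + 1):
    for objs in combinations(sorted(objects), r):
        B = frozenset(common_attrs(set(objs)))
        A = frozenset(common_objs(B))
        concepts.add((A, B))

for A, B in sorted(concepts, key=lambda c: (len(c[0]), sorted(c[0]))):
    print(sorted(A), "<->", sorted(B))
```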

Keywords: Data Mining, Formal Concept Analysis, Medical Data, Negative Classification Rules.

9107 The Relation between Social Capital and Trust with Social Network Analysis

Authors: Safak Baykal

Abstract:

The purpose of this study is to analyze the relationship between people's trust and social capital using Social Network Analysis. In this study, two aspects of social capital are focused on: bonding, homophilous social capital (BoSC) and bridging, heterophilous social capital (BrSC). These two aspects diverge from each other according to social theories. The other concept of the study is trust (Tr), namely interpersonal trust: the willingness to ascribe good intentions to, and have confidence in, the words and actions of other people. In this study, the sample group of 61 people was selected from a private firm in the defense industry. The relation between BoSC/BrSC and Tr is shown by using Social Network Analysis (SNA) and statistical analysis of a Likert-type questionnaire. The results of the analysis show that the Cronbach’s alpha value is 0.756 and that the social capital values (BoSC/BrSC) are not correlated with the Tr values of the people.
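
The two statistics reported above can be reproduced in a few lines: the sketch below computes Cronbach's alpha for a block of Likert items and the correlation between a social-capital score and a trust score, on randomly generated responses rather than the 61-person sample.

```python
import numpy as np

# Hedged sketch: Cronbach's alpha for Likert items and a Pearson correlation between
# composite scores. Simulated responses; not the study's questionnaire or sample.

rng = np.random.default_rng(11)
items = rng.integers(1, 6, size=(61, 8)).astype(float)    # 61 respondents x 8 Likert items

def cronbach_alpha(x):
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

bonding_score = items[:, :4].mean(axis=1)      # stand-in for bonding social capital (BoSC)
trust_score = items[:, 4:].mean(axis=1)        # stand-in for interpersonal trust (Tr)

r = np.corrcoef(bonding_score, trust_score)[0, 1]
print(f"Cronbach's alpha = {cronbach_alpha(items):.3f}")
print(f"Pearson r(BoSC, Tr) = {r:.3f}")
```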

Keywords: Social capital, interpersonal trust, social network analysis (SNA).

9106 Exploring Social Impact of Emerging Technologies from Futuristic Data

Authors: Heeyeul Kwon, Yongtae Park

Abstract:

Despite their highly touted benefits, emerging technologies have unleashed pervasive concerns regarding unintended and unforeseen social impacts. Thus, those wishing to create safe and socially acceptable products need to identify such side effects and mitigate them prior to market proliferation. Various methodologies in the field of technology assessment (TA), namely Delphi, impact assessment, and scenario planning, have been widely employed in such circumstances. However, the literature faces a major limitation in its sole reliance on participatory workshop activities; it misses the massive untapped source of futuristic information flooding through the Internet. This research thus seeks to gain insights into the utilization of futuristic data, future-oriented documents from the Internet, as a supplementary method to generate social impact scenarios whilst capturing the perspectives of experts from a wide variety of disciplines. To this end, network analysis is conducted based on the social keywords extracted from the futuristic documents by text mining, which is then used as a guide to produce a comprehensive set of detailed scenarios. Our proposed approach facilitates harmonized depictions of possible hazardous consequences of emerging technologies and thereby makes decision makers more aware of, and responsive to, broad qualitative uncertainties.

Keywords: Emerging technologies, futuristic data, scenario, text mining.

9105 Designing a Pre-Assessment Tool to Support the Achievement of Green Building Certifications

Authors: Jisun Mo, Paola Boarin

Abstract:

The impact of common buildings on the climate and environment has prompted people to get involved in green building standards aimed at implementing rating tools or certifications. Thus, green building rating systems were introduced to the construction industry, and the demand for certified green buildings has increased gradually and succeeded considerably in enhancing people’s environmental awareness. However, the existing certification process has been unsatisfactory in attracting stakeholders and/or professionals who are actively engaged in adopting a rating system. This is because they have faced recurring barriers regarding limited information in understanding the rating process, time-consuming procedures and higher costs, which have a direct influence on pursuing green building rating systems. To promote the achievement of green building certifications within the building industry more successfully, this paper aims at designing a Pre-Assessment Tool (PAT) framework that can help stakeholders and/or professionals engaged in the construction industry to clarify the basic knowledge, timeframe and extra costs needed to activate a green building certification. First, taking the first steps towards the rating tool seems to be complicated because an upfront commitment to understanding the overall rating procedure is required. This conceptual PAT framework can increase basic knowledge of the rating tool and the certification process, mainly in terms of the resources and information needed for each credit requirement. Second, the assessment process of rating tools is generally known as a “lengthy and time-consuming system”, contributing to unenthusiastic reactions concerning green building projects. The proposed framework can predict the timeframe needed, identifying how long it will take for a green project to process each credit requirement and the documentation required from the beginning of the certification process to final approval. Finally, most people often have the initial perception that pursuing green building certification costs more than constructing a non-green building, which makes it more difficult to execute rating tools. To overcome this issue, this PAT will help users to estimate the extra expenses, such as certification fees and third-party contributions, based on tracking the amount of time it takes to implement the rating tool throughout all the related stages. It can also prevent unexpected or hidden costs from occurring in the process of assessment. Therefore, this proposed PAT framework can be recommended as an effective method to support the decision-making of inexperienced users and play an important role in promoting green building certification.

Keywords: Barriers, certification process, green building rating systems, pre-assessment tool.

9104 Electron Density Discrepancy Analysis of Energy Metabolism Coenzymes

Authors: Alan Luo, Hunter N. B. Moseley

Abstract:

Many macromolecular structure entries in the Protein Data Bank (PDB) have a range of regional (localized) quality issues, be it derived from X-ray crystallography, Nuclear Magnetic Resonance (NMR) spectroscopy, or other experimental approaches. However, most PDB entries are judged by global quality metrics like R-factor, R-free, and resolution for X-ray crystallography or backbone phi-psi distribution statistics and average restraint violations for NMR. Regional quality is often ignored when PDB entries are re-used for a variety of structurally based analyses. The binding of ligands, especially ligands involved in energy metabolism, is of particular interest in many structurally focused protein studies. Using a regional quality metric that provides chemically interpretable information from electron density maps, a significant number of outliers in regional structural quality was detected across X-ray crystallographic PDB entries for proteins bound to biochemically critical ligands. In this study, a series of analyses was performed to evaluate both specific and general potential factors that could promote these outliers. In particular, these potential factors were the minimum distance to a metal ion, the minimum distance to a crystal contact, and the isotropic atomic b-factor. To evaluate these potential factors, Fisher’s exact tests were performed, using regional quality criteria of outlier (top 1%, 2.5%, 5%, or 10%) versus non-outlier compared to a potential factor metric above versus below a certain outlier cutoff. The results revealed a consistent general effect from region-specific normalized b-factors but no specific effect from metal ion contact distances and only a very weak effect from crystal contact distance as compared to the b-factor results. These findings indicate that no single specific potential factor explains a majority of the outlier ligand-bound regions, implying that human error is likely as important as these other factors. Thus, all factors, including human error, should be considered when regions of low structural quality are detected. Also, the downstream re-use of protein structures for studying ligand-bound conformations should screen the regional quality of the binding sites. Doing so prevents misinterpretation due to the presence of structural uncertainty or flaws in regions of interest.
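
A minimal sketch of the statistical test used above: a two-sided Fisher's exact test on a 2x2 table of outlier vs. non-outlier regions against a candidate factor split (here, high vs. low normalized b-factor). The counts are invented for illustration, not the PDB-derived data.

```python
import numpy as np
from scipy.stats import fisher_exact

# Hedged sketch: Fisher's exact test on a 2x2 contingency table of regional-quality
# outliers vs. a candidate factor. Invented counts; not the study's actual results.

#                      high b-factor   low b-factor
table = np.array([[  40,              10],      # outlier ligand regions (e.g. worst 5%)
                  [ 400,             550]])     # non-outlier ligand regions

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")
```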

Keywords: Biomacromolecular structure, coenzyme, electron density discrepancy analysis, X-ray crystallography.

9103 Multivariate Assessment of Mathematics Test Scores of Students in Qatar

Authors: Ali Rashash Alzahrani, Elizabeth Stojanovski

Abstract:

Data on various aspects of education are collected at the institutional and government level regularly. In Australia, for example, students at various levels of schooling undertake examinations in numeracy and literacy as part of NAPLAN testing, enabling longitudinal assessment of such data as well as comparisons between schools and states within Australia. Another source of educational data collected internationally is the PISA study, which collects data from several countries when students are approximately 15 years of age and enables comparisons of performance in science, mathematics and English between countries, as well as ranking of countries based on performance in these standardised tests. As well as student and school outcomes based on the tests taken as part of the PISA study, there is a wealth of other data collected in the study, including parental demographics and data related to teaching strategies used by educators. Overall, an abundance of educational data is available which has the potential to be used to help improve educational attainment and the teaching of content in order to improve learning outcomes. A multivariate assessment of such data enables multiple variables to be considered simultaneously and will be used in the present study to help develop profiles of students based on performance in mathematics, using data obtained from the PISA study.
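
A minimal sketch of the multivariate profiling approach described above, assuming k-means clustering on a few standardised student-level variables; the simulated variables and the choice of three clusters are assumptions, not the authors' final method or the actual PISA records for Qatar.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hedged sketch: standardise student-level variables and cluster them into profiles.
# Simulated data and an assumed number of clusters; not the actual PISA dataset.

rng = np.random.default_rng(2024)
n = 500
math_score = rng.normal(420, 80, n)
reading_score = math_score * 0.6 + rng.normal(180, 50, n)
escs_index = rng.normal(0, 1, n)                       # socio-economic status index
X = np.column_stack([math_score, reading_score, escs_index])

X_std = StandardScaler().fit_transform(X)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_std)

for c in range(3):
    members = X[km.labels_ == c]
    print(f"profile {c}: n={len(members):3d}, "
          f"mean math={members[:, 0].mean():.0f}, "
          f"mean reading={members[:, 1].mean():.0f}, "
          f"mean ESCS={members[:, 2].mean():+.2f}")
```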

Keywords: Cluster analysis, education, mathematics, profiles.

9102 An Exploratory Study of Reliability of Ranking vs. Rating in Peer Assessment

Authors: Yang Song, Yifan Guo, Edward F. Gehringer

Abstract:

Fifty years of research has found great potential for peer assessment as a pedagogical approach. With peer assessment, not only do students receive more copious assessments; they also learn to become assessors. In recent decades, more educational peer assessments have been facilitated by online systems. Those online systems are designed differently to suit different class settings and student groups, but they basically fall into two categories: rating-based and ranking-based. The rating-based systems ask assessors to rate the artifacts one by one following some review rubrics. The ranking-based systems allow assessors to review a set of artifacts and give a rank for each of them. Though there are different systems and a large number of users of each category, there is no comprehensive comparison on which design leads to higher reliability. In this paper, we designed algorithms to evaluate assessors' reliabilities based on their rating/ranking against the global ranks of the artifacts they have reviewed. These algorithms are suitable for data from both rating-based and ranking-based peer assessment systems. The experiments were done based on more than 15,000 peer assessments from multiple peer assessment systems. We found that the assessors in ranking-based peer assessments are at least 10% more reliable than the assessors in rating-based peer assessments. Further analysis also demonstrated that the assessors in ranking-based assessments tend to assess the more differentiable artifacts correctly, but there is no such pattern for rating-based assessors.
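
One simple way to express assessor reliability against a global ranking, applicable to both rating-based and ranking-based data, is a rank correlation; the sketch below uses Spearman's rho on invented scores and is an illustrative metric, not the authors' exact algorithms.

```python
import numpy as np
from scipy.stats import spearmanr

# Hedged sketch: score each assessor's reliability as the Spearman rank correlation
# between their scores and the global ranking of the artifacts they reviewed.
# Invented data; not the 15,000-assessment dataset or the paper's algorithms.

global_rank = np.array([1, 2, 3, 4, 5, 6])              # 1 = best artifact overall

assessors = {
    "rater_A":  np.array([95, 90, 70, 75, 60, 50]),     # ratings on a 0-100 rubric
    "rater_B":  np.array([80, 85, 90, 60, 70, 65]),
    "ranker_C": np.array([1, 2, 4, 3, 5, 6]),            # an explicit ranking (1 = best)
}

for name, scores in assessors.items():
    # Higher ratings should correspond to better (lower) global rank, so correlate scores
    # with the reversed rank; explicit rankings are negated to point the same way.
    direction = -1.0 if name.startswith("ranker") else 1.0
    rho, _ = spearmanr(direction * scores, -global_rank)
    print(f"{name}: reliability (Spearman rho vs. global ranking) = {rho:.2f}")
```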

Keywords: Peer assessment, peer rating, peer ranking, reliability.

9101 The Influences of Accountants’ Potential Performance on Their Working Process: Government Savings Bank, Northeast, Thailand

Authors: Prateep Wajeetongratana

Abstract:

The purpose of this research was to study the influence of accountants’ potential performance on their working process, with a case study of Government Savings Banks in the northeast of Thailand. The independent variables included accounting knowledge, accounting skill, accounting value, accounting ethics, and accounting attitude, while the dependent variable was the success of the working process. A total of 155 accountants working for Government Savings Banks were selected by random sampling. A questionnaire was used as a tool for collecting data. The statistics used in this research included percentage, mean, and multiple regression analysis.

The findings revealed that the majority of accountants were female, with an age between 35 and 40 years old. Most of the respondents had an undergraduate degree with ten years of experience. Moreover, the factors of accounting knowledge, accounting skill, accounting value, accounting ethics, and accounting attitude were rated at a high level. The findings from the regression analysis revealed a relationship in which the observed variables could explain at least 51 percent of the success in the accountants’ working process.

Keywords: Influence, Potential Performance, Success, Working Process.

9100 Identification of Promiscuous Epitopes for Cellular Immune Responses in the Major Antigenic Protein Rv3873 Encoded by Region of Difference 1 of Mycobacterium tuberculosis

Authors: Abu Salim Mustafa

Abstract:

Rv3873 is a relatively large size protein (371 amino acids in length) and its gene is located in the immunodominant genomic region of difference (RD)1 that is present in the genome of Mycobacterium tuberculosis but deleted from the genomes of all the vaccine strains of Bacillus Calmette Guerin (BCG) and most other mycobacteria. However, when tested for cellular immune responses using peripheral blood mononuclear cells from tuberculosis patients and BCG-vaccinated healthy subjects, this protein was found to be a major stimulator of cell mediated immune responses in both groups of subjects. In order to further identify the sequence of immunodominant epitopes and explore their Human Leukocyte Antigen (HLA)-restriction for epitope recognition, 24 peptides (25-mers overlapping with the neighboring peptides by 10 residues) covering the sequence of Rv3873 were synthesized chemically using fluorenylmethyloxycarbonyl chemistry and tested in cell mediated immune responses. The results of these experiments helped in the identification of an immunodominant peptide P9 that was recognized by people expressing varying HLA-DR types. Furthermore, it was also predicted to be a promiscuous binder with multiple epitopes for binding to HLA-DR, HLA-DP and HLA-DQ alleles of HLA-class II molecules that present antigens to T helper cells, and to HLA-class I molecules that present antigens to T cytotoxic cells. In addition, the evaluation of peptide P9 using an immunogenicity predictor server yielded a high score (0.94), which indicated a greater probability of this peptide to elicit a protective cellular immune response. In conclusion, P9, a peptide with multiple epitopes and ability to bind several HLA class I and class II molecules for presentation to cells of the cellular immune response, may be useful as a peptide-based vaccine against tuberculosis.

Keywords: Mycobacterium tuberculosis, Rv3873, peptides, vaccine
