Search results for: data driven decision making
27442 A Systematic Review on Challenges in Big Data Environment
Authors: Rimmy Yadav, Anmol Preet Kaur
Abstract:
Big Data has demonstrated vast potential for streamlining decision making and spotting business trends in fields such as manufacturing, finance, and information technology. This paper gives a multi-disciplinary overview of the research issues in big data and its procedures, instruments, and systems related to privacy, data storage management, network and energy utilization, fault tolerance, and data representation. Beyond this, the challenges and opportunities present in the Big Data platform are outlined.
Keywords: big data, privacy, data management, network and energy consumption
Procedia PDF Downloads 314
27441 Place-Making Theory behind Claremont Court
Authors: Sandra Costa-Santos, Nadia Bertolino, Stephen Hicks, Vanessa May, Camilla Lewis
Abstract:
This paper aims to elaborate the architectural theory on place-making that supported the Claremont Court housing scheme (Edinburgh, United Kingdom). Claremont Court (1959-62) is a large post-war mixed-development housing scheme designed by Basil Spence, which included ‘place-making’ as one of its founding principles. Although some stylistic readings of the housing scheme have been published, the theory on place-making that allegedly guided the design has yet to be clarified. Architecture allows us to mark or make a place within space in order to dwell. Under the framework of contemporary philosophical theories of place, this paper explores the relationship between place and dwelling through a cross-disciplinary reading of Claremont Court, with a view to developing an architectural theory on place-making. Since dwelling represents the way we are immersed in our world in an existential manner, this theme is relevant not just for architecture but also for philosophy and sociology. The research in this work is interpretive-historic in nature. It examines documentary evidence of the original architectural design, together with relevant literature in sociology, history, and architecture, through the lens of theories of place. First, the paper explores how the dwelling types originally included in Claremont Court supported ideas of dwelling or meanings of home. Then, it traces shared space and social ties in order to study the symbolic boundaries that allow the creation of a collective identity or sense of belonging. Finally, the relation between the housing scheme and the supporting theory is identified. The findings of this research reveal Scottish architect Basil Spence’s exploration of the meaning of home, as he changed his approach to mass housing while acting as President of the Royal Institute of British Architects (1958-60).
When the British Government was engaged in various ambitious building programmes, he sought to draw architecture into a wider socio-political debate as president of the RIBA, hence moving towards a more ambitious and innovative socio-architectural approach. Rather than trying to address the ‘genius loci’ with an architectural proposition, as has been stated, the research shows that the place-making theory behind the housing scheme was supported by notions of community based on shared space and dispositions. The design of the housing scheme was steered by a desire to foster social relations and collective identities, rather than by the idea of keeping the spirit of the place. This research is part of a cross-disciplinary project funded by the Arts and Humanities Research Council. The findings present Claremont Court as a signifier of Basil Spence’s attempt to address the post-war political debate on housing in the United Kingdom. They highlight the architect’s theoretical agenda and challenge current purely stylistic readings of Claremont Court, which fail to acknowledge its social relevance.
Keywords: architectural theory, dwelling, place-making, post-war housing
Procedia PDF Downloads 266
27440 The Formulation of R&D Strategy for Biofuel Technology: A Case Study of the Aviation Industry in Iran
Authors: Maryam Amiri, Ali Rajabzade, Gholam Reza Goudarzi, Reza Heidari
Abstract:
Technological and environmental changes are so fast that companies and industries have a strong incentive to pursue R&D in order to participate actively in the market and achieve a competitive advantage. The aviation industry and its subdivisions involve high-level technology and play a special role in the economic and social development of countries. Thus, to acquire new technologies and compete with the aviation industries of other countries, capability in R&D is required. An appropriate R&D strategy helps ensure that state-of-the-art technologies can be achieved. Biofuel technology is one of the newest technologies and has attracted worldwide discussion in the aviation industry. The purpose of this research has been the formulation of an R&D strategy for biofuel technology in the aviation industry of Iran. After reviewing the theoretical foundations of R&D methods and strategies, we classified R&D strategies into four main categories: internal R&D, collaborative R&D, outsourced R&D, and in-house R&D. Following this review, a model for formulating an R&D strategy aimed at developing biofuel technology in the Iranian aviation industry was offered. With regard to the requirements and characteristics of the industry and the technology, the model presents an integrated approach to R&D. Based on decision-making techniques and the analysis of structured expert opinion, four R&D strategies for different scenarios, aimed at developing biofuel technology in the Iranian aviation industry, were recommended. In this research, based on the common features of the R&D implementation process, a logical classification of these methods is presented as R&D strategies. R&D strategies and their characteristics were then developed according to the experts. Finally, we introduced a model that considers the role of the aviation industry and biofuel technology in R&D strategies, and for the various conditions and scenarios of the aviation industry, we formulated a specific R&D strategy.
Keywords: aviation industry, biofuel technology, R&D, R&D strategy
Procedia PDF Downloads 583
27439 Neuropsychological Testing in a Multi-Lingual Society: Normative Data for South African Adults in More Than Eight Languages
Authors: Sharon Truter, Ann B. Shuttleworth-Edwards
Abstract:
South Africa is a developing country with significant diversity in languages spoken and in the quality of education available, creating challenges for fair and accurate neuropsychological assessment when most available neuropsychological tests come from English-speaking developed countries. The aim of this research was to compare normative data on a spectrum of commonly used neuropsychological tests for English- and Afrikaans-speaking South Africans with relatively high quality of education and South Africans with relatively low quality of education who speak Afrikaans, Sesotho, Setswana, Sepedi, Tsonga, Venda, Xhosa or Zulu. The participants were all healthy adults aged 18-60 years, with 8-12 years of education. All the participants were tested in their first language on the following tests: two non-verbal tests (Rey Osterrieth Complex Figure Test and Bell Cancellation Test), four verbal fluency tests (category, phonemic, verb and 'any words'), one verbal learning test (Rey Auditory Verbal Learning Test) and three tests that have a verbal component (Trail Making Test A & B; Symbol Digit Modalities Test and Digit Span). Descriptive comparisons of mean scores and standard deviations across the language groups, and between the groups with relatively high versus low quality of education, highlight the importance of using normative data that take into account language and quality of education.
Keywords: cross-cultural, language, multi-lingual, neuropsychological testing, quality of education
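As a concrete illustration of why group-appropriate norms matter, the same raw score yields very different standardized scores depending on which normative mean and standard deviation it is judged against. The sketch below uses invented normative values, not the study's data:

```python
# Hypothetical sketch: standardizing one raw test score against two
# different normative groups. The means and SDs below are invented
# for illustration and are not the study's normative data.

def z_score(raw, norm_mean, norm_sd):
    """Standardized score relative to a normative group."""
    return (raw - norm_mean) / norm_sd

raw = 45  # e.g., a word-list learning total (hypothetical)
norms_high_qoe = (55, 8)  # mean, SD for a high quality-of-education group
norms_low_qoe = (42, 9)   # mean, SD for a low quality-of-education group

print(round(z_score(raw, *norms_high_qoe), 2))  # -1.25: looks impaired
print(round(z_score(raw, *norms_low_qoe), 2))   # 0.33: within normal range
```

The same performance is flagged as impaired against one set of norms and as unremarkable against the other, which is the core argument for language- and education-specific normative data.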
Procedia PDF Downloads 179
27438 An Integrated Label Propagation Network for Structural Condition Assessment
Authors: Qingsong Xiong, Cheng Yuan, Qingzhao Kong, Haibei Xiong
Abstract:
Deep-learning approaches based on vibration responses have attracted increasing attention in rapid structural condition assessment, yet obtaining sufficient measured training data with corresponding labels is costly and often inaccessible in practical engineering. This study proposes an integrated label propagation network for structural condition assessment, which is able to diffuse labels from continuously generated measurements of the intact structure to damage scenarios whose labels are missing. The integrated network embeds damage-sensitive feature extraction by a deep autoencoder and pseudo-label propagation by optimized fuzzy clustering, whose architecture and mechanism are elaborated. With a sophisticated network design and specific strategies for improving performance, the present network extends the strengths of self-supervised representation learning, unsupervised fuzzy clustering and supervised classification into an integrated framework for assessing damage conditions. Both numerical simulations and full-scale laboratory shaking table tests of a two-story building structure were conducted to validate its capability of detecting post-earthquake damage. The identification accuracy of the present network was 0.95 in the numerical validations and 0.86 on average in the laboratory case studies. It should be noted that the whole training procedure of all models involved in the network does not rely on any labeled data of damage scenarios, only on several samples of the intact structure, which indicates a significant superiority in model adaptability and feasible applicability in practice.
Keywords: autoencoder, condition assessment, fuzzy clustering, label propagation
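The paper's network couples autoencoder features with optimized fuzzy clustering; as a generic, much-simplified sketch of the pseudo-label propagation idea itself, unlabeled samples can repeatedly adopt the label of their nearest already-labeled neighbour. The data, distance rule, and labels below are illustrative assumptions, not the paper's method:

```python
# Minimal illustration of pseudo-label propagation (hypothetical data):
# labels spread from a few labeled samples to unlabeled neighbours by
# repeatedly adopting the label of the nearest already-labeled sample.

def propagate_labels(features, labels, n_iter=10):
    """features: list of floats; labels: list with None for unlabeled."""
    labels = list(labels)
    for _ in range(n_iter):
        updated = False
        for i, f in enumerate(features):
            if labels[i] is None:
                # candidates: (distance, label) of every labeled sample
                cand = [(abs(f - features[j]), labels[j])
                        for j in range(len(features)) if labels[j] is not None]
                if cand:
                    labels[i] = min(cand)[1]  # adopt the nearest label
                    updated = True
        if not updated:
            break
    return labels

# Two clusters around 0.0 ("intact") and 1.0 ("damaged"); one seed each.
feats = [0.05, 0.1, 0.9, 0.95, 0.12, 1.05]
seed = ["intact", None, None, "damaged", None, None]
print(propagate_labels(feats, seed))
```

In the paper this role is played by autoencoder-extracted features and fuzzy memberships rather than raw one-dimensional distances, but the diffusion of labels from the intact-structure measurements follows the same spirit.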
Procedia PDF Downloads 99
27437 The Role of Marketing in the Promotion of the Istanbul Brand
Authors: Ipek Krom, Nurdan Tumbek Tekeoglu
Abstract:
In our globalizing world, increased competition between cities has resulted in expanding investments in marketing activities. In order to promote tourism and reinvestment, cities have been using marketing activities to create more attractive sites and to use their resources more efficiently. In becoming a branded city, marketing activities play a major role in building brand value, which in turn results in the attraction of newcomers, revisits, settlements, reinvestments and the development of the city. This paper focuses on the Istanbul brand, which carries an important role in the promotion of Turkey as its cultural, economic and financial center. As one of the most historical and appealing metropolises in the world, with remains of ancient civilizations, Istanbul attracted 11,843,000 tourists in 2014. The increasing number of marketing activities developed by numerous actors in the private and public sectors is among the reasons why tourists prefer Istanbul. Among these reasons we can list the increasing number of hotels, developed infrastructure and better transportation, modern shopping malls, international festivals, exportation of Turkish TV series, gastronomy investments, congress tourism, health tourism, student exchange programs, expatriation opportunities, recreational activities and new tourism destinations. In this paper we explore the marketing activities that have made Istanbul one of the most visited metropolises in the world. Decision makers in the tourism sector were interviewed to provide better insight into the addressed topics.
Keywords: brand cities, marketing, tourism in Istanbul, tourism marketing
Procedia PDF Downloads 334
27436 Robust and Dedicated Hybrid Cloud Approach for Secure Authorized Deduplication
Authors: Aishwarya Shekhar, Himanshu Sharma
Abstract:
Data deduplication is one of the most important data compression techniques for eliminating duplicate copies of repeating data, and it has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. In this process, duplicate data is expunged, leaving only a single instance of the data to be stored; an index of each data item is, however, still maintained. Data deduplication is an approach for minimizing the amount of storage space an organization needs to retain its data. In most companies, the storage systems carry identical copies of numerous pieces of data. Deduplication eliminates these additional copies by saving just one copy of the data and replacing the other copies with pointers that lead back to the primary copy. To avoid this duplication of data and to preserve confidentiality in the cloud, we apply the concept of a hybrid cloud. A hybrid cloud is a fusion of at least one public cloud and one private cloud. As a proof of concept, we implement Java code that provides security as well as removing all types of duplicated data from the cloud.
Keywords: confidentiality, deduplication, data compression, hybridity of cloud
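The single-instance mechanism described above (one stored copy plus pointers back to it) is commonly implemented with content fingerprints; the abstract does not specify its exact scheme, so the SHA-256-keyed store below is an illustrative assumption rather than the authors' implementation:

```python
# Sketch of single-instance storage via content hashing (an assumption:
# the paper does not specify its dedup mechanism; SHA-256 fingerprints
# are a common choice). Each unique block is stored once; repeated
# names become pointers (index entries) back to the primary copy.

import hashlib

class DedupStore:
    def __init__(self):
        self.blocks = {}  # digest -> data (single instance per content)
        self.index = {}   # name -> digest (pointer to the primary copy)

    def put(self, name, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self.blocks.setdefault(digest, data)  # store only the first copy
        self.index[name] = digest
        return digest

    def get(self, name) -> bytes:
        return self.blocks[self.index[name]]

store = DedupStore()
store.put("a.txt", b"same payload")
store.put("b.txt", b"same payload")   # duplicate: no new block stored
store.put("c.txt", b"other payload")
print(len(store.index), len(store.blocks))  # 3 names, 2 unique blocks
```

Three logical files occupy only two physical blocks, which is exactly the storage and bandwidth saving the abstract describes.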
Procedia PDF Downloads 385
27435 A Review of Machine Learning for Big Data
Authors: Devatha Kalyan Kumar, Aravindraj D., Sadathulla A.
Abstract:
Big data is now rapidly expanding in engineering, science, and many other domains. The potential of large or massive data is undoubtedly significant, and it requires new ways of thinking and new learning techniques to address the various big data challenges. Machine learning is continuously unleashing its power in a wide range of applications. This paper surveys the latest advances in research on machine learning for big data processing. First, it reviews the machine learning techniques used in recent studies, such as deep learning, representation learning, transfer learning, active learning, and distributed and parallel learning. It then focuses on the challenges of machine learning for big data and possible solutions.
Keywords: active learning, big data, deep learning, machine learning
Procedia PDF Downloads 448
27434 Strengthening Legal Protection of Personal Data through Technical Protection Regulation in Line with Human Rights
Authors: Tomy Prihananto, Damar Apri Sudarmadi
Abstract:
Indonesia recognizes the right to privacy as a human right. Indonesia provides legal protection for data management activities because the protection of personal data is a part of human rights. This paper aims to describe the arrangement of data management and data protection in Indonesia. It is descriptive research with a qualitative approach, collecting data through a literature study. The results show a comprehensive arrangement of data protection in which encryption methods are set up as a technical requirement. The arrangements on encryption and on the protection of personal data are mutually reinforcing in protecting personal data. Indonesia has two important and immediately enacted laws that provide protection for the privacy of information, which is part of human rights.
Keywords: Indonesia, protection, personal data, privacy, human rights, encryption
Procedia PDF Downloads 186
27433 Evaluation of Triage Performance: Nurse Practice and Problem Classifications
Authors: Atefeh Abdollahi, Maryam Bahreini, Babak Choobi Anzali, Fatemeh Rasooli
Abstract:
Introduction: Triage is a central part of the organization of care in the emergency department (ED). It is used to describe the sorting of patients by treatment priority in the ED. The accurate triage of injured patients has reduced fatalities and improved resource usage. Moreover, nurses’ knowledge and skill are important factors in triage decision-making. The ability to assign an appropriate triage level and identify the need for intervention is crucial to guiding safe and effective emergency care. Methods: This is a prospective cross-sectional study designed for emergency nurses working in four public university hospitals. Five triage workshops were conducted every three months for emergency nurses, based on a standard Emergency Severity Index (ESI) version 4 triage slide set approved by the Iranian Ministry of Health. The items most influential on triage performance were identified through brainstorming in the workshops and then peer reviewed by an expert panel of five emergency physicians and two head registered nurses. The factors that might distract a nurse’s attention from proper decisions included patients’ past medical diseases, the natural tricks of triage, and system failure. After permission had been obtained, emergency nurses participated in the study and were given the structured questionnaire. Data were analysed with SPSS 21.0. Results: 92 emergency nurses enrolled in the study. 30% of nurses reported a past history of chronic disease as the most influential confounding factor in ascertaining the triage level; other important factors were a history of prior admission and past histories of myocardial infarction and heart failure, at 20, 17 and 11%, respectively. Regarding difficulties in triage practice, 54.3% reported that discussion with patients and family members was difficult, and 8.7% declared that it is hard to stay in a single triage room the whole day.
Among the participants, 45.7% and 26.1% evaluated the triage workshops as moderately and highly effective, respectively. 56.5% reported overcrowding as the most important system-based difficulty. Nurses were mainly doubtful when differentiating between triage levels 2 and 3 of the ESI system. No significant correlation was found between nurses’ work record in triage and their uncertainty in determining the triage level or the reported difficulties. Conclusion: The work record of nurses seemed to have little effect on triage problems and issues. To correct the deficits, training workshops should be carried out, followed by continuous refresher training and supportive supervision.
Keywords: assessment, education, nurse, triage
Procedia PDF Downloads 236
27432 An Emergentist Defense of Incompatibility between Morally Significant Freedom and Causal Determinism
Authors: Lubos Rojka
Abstract:
The common perception of morally responsible behavior is that it presupposes freedom of choice, and that free decisions and actions are determined not by natural events but by a person. In other words, the moral agent has the ability and the possibility of doing otherwise when making morally responsible decisions, and natural causal determinism cannot fully account for morally significant freedom. The incompatibility between a person’s morally significant freedom and causal determinism appears to be a natural position. Nevertheless, some of the most influential philosophical theories of moral responsibility are compatibilist or semi-compatibilist, and they exclude the requirement of alternative possibilities, which contradicts the claims of classical incompatibilism. The compatibilists often employ Frankfurt-style thought experiments to prove their theory. The goal of this paper is to examine the role of imaginary Frankfurt-style examples in compatibilist accounts. More specifically, the compatibilist accounts defended by John Martin Fischer and Michael McKenna will be situated within the broader understanding of a person elaborated by Harry Frankfurt, Robert Kane and Walter Glannon. Deeper analysis reveals that the exclusion of alternative possibilities based on Frankfurt-style examples is problematic and misleading. A more comprehensive account of moral responsibility and morally significant (source) freedom requires higher-order complex theories of human will and consciousness, in which rational and self-creative abilities, and a real possibility to choose otherwise at least on some occasions during a lifetime, are necessary. Theoretical moral reasons and their logical relations seem to require a sort of higher-order agent-causal incompatibilism. The ability of theoretical or abstract moral reasoning requires complex (strongly emergent) mental and conscious properties, among which are an effective free will, together with first- and second-order desires.
Such a hierarchical theoretical model unifies reasons-responsiveness, mesh theory and emergentism. It is incompatible with physical causal determinism, because such determinism allows only non-systematic processes that may be hard to predict, not complex (strongly) emergent systems. An agent’s effective will and conscious reflectivity is the starting point of a morally responsible action, which explains why a decision is 'up to the subject'. A free decision does not always have a complete causal history. This kind of emergentist source hyper-incompatibilism seems to be the best direction in the search for an adequate explanation of moral responsibility in the traditional (merit-based) sense. Physical causal determinism as a universal theory would exclude morally significant freedom and responsibility in the traditional sense because it would exclude the emergence of, and supervenience by, the essential complex properties of human consciousness.
Keywords: consciousness, free will, determinism, emergence, moral responsibility
Procedia PDF Downloads 167
27431 The Impact of Human Rights Legislation and Evolution
Authors: Emad Eid Nemr Danyal
Abstract:
The problem of respect for human rights in Southeast Asia has become a prime issue and is attracting the attention of the worldwide community. The Association of Southeast Asian Nations (ASEAN) made human rights one of its fundamental concerns in the ASEAN Charter in 2008, and subsequently the ASEAN Intergovernmental Commission on Human Rights (AICHR) was established. The AICHR is the Southeast Asian human rights body charged with the responsibilities, functions and powers to promote and protect human rights. However, by the end of 2016, the protective role assigned to the AICHR had not yet been fulfilled. This is shown by numerous cases of human rights violations that are still ongoing and have not yet been resolved. One case that has recently come to light is the human rights violations against the Rohingya people in Myanmar. Using a legal-normative approach, the study examines the urgency of establishing a human rights tribunal in Southeast Asia capable of making decisions binding on ASEAN members or responsible parties. The findings suggest that ASEAN needs regional courts to address human rights abuses within the ASEAN region. In addition, the study highlights three essential factors that ASEAN must consider when establishing a human rights tribunal, namely: the significant differences in democracy and human rights development among the members, the consistent implementation of the principle of non-interference, and the financial difficulty of sustaining the court.
Keywords: sustainable development, human rights, the right to development, the human rights-based approach to development, environmental rights, economic development, social sustainability, human rights protection, human rights violations, workers’ rights, justice, security
Procedia PDF Downloads 17
27430 A Concept Analysis of Self-Efficacy for Cancer Pain Management
Authors: Yi-Fung Lin, Yuan-Mei Liao
Abstract:
Background: Pain is common among patients with cancer and is also one of the most disturbing symptoms. As this suffering is subjective, pain could be alleviated effectively if patients proactively participated in their pain self-management. However, not everyone can carry out self-management well, because human behavior is a product of the cognitive process, in which 'self-efficacy' plays an essential role in shaping behavior. Methods: We used the eight steps of concept analysis proposed by Walker and Avant to clarify the concept of “self-efficacy for cancer pain management.” A comprehensive literature review was conducted for relevant publications published between 1977 and 2021. We used several keywords, including self-efficacy, self-management, concept analysis, conceptual framework, and cancer pain, to search the following databases: PubMed, CINAHL, Web of Science, and Embase. Results: We identified three defining attributes for the concept of self-efficacy for cancer pain management: pain management abilities, confidence, and continuous pain monitoring. We recognized six skills related to pain management abilities: problem-solving, decision-making, resource utilization, forming partnerships between medical professionals and patients, planning actions, and self-regulation. Five antecedents for the concept were identified: pain experience, existing cancer pain, pain-related knowledge, a belief in pain management, and physical/mental state. Consequences related to self-efficacy for cancer pain management were the achievement of pain self-management, good pain control, a satisfying quality of life, and sustained motivation.
Conclusions: This analysis provides researchers with a clearer understanding of the concept of “self-efficacy for cancer pain management.” The findings presented here provide a foundation for future research and nursing interventions to enhance self-efficacy for cancer pain management.
Keywords: cancer pain, concept analysis, self-efficacy, self-management
Procedia PDF Downloads 72
27429 Efficiency and Scale Elasticity in Network Data Envelopment Analysis: An Application to International Tourist Hotels in Taiwan
Authors: Li-Hsueh Chen
Abstract:
Efficient operation is increasingly important for hotel managers. Unlike the manufacturing industry, hotels cannot store their products. In addition, many hotels provide room service and food and beverage service simultaneously. When the efficiencies of hotels are evaluated, their internal structure should be considered. Hence, based on the operational characteristics of hotels, this study proposes a DEA model to simultaneously assess the efficiencies of the room production division, the food and beverage production division, the room service division, and the food and beverage service division. However, not only the enhancement of efficiency but also the adjustment of scale can improve performance. In terms of the adjustment of scale, scale elasticity, or returns to scale, can help managers make decisions concerning expansion or contraction. In order to construct a reasonable approach to measuring the efficiencies and scale elasticities of hotels, this study builds an alternative variable-returns-to-scale-based two-stage network DEA model combining parallel and series structures to explore the scale elasticities of the whole system and of the four divisions above, based on data from the international tourist hotel industry in Taiwan. The results may provide valuable information on operational performance and scale for managers and decision makers.
Keywords: efficiency, scale elasticity, network data envelopment analysis, international tourist hotel
Procedia PDF Downloads 226
27428 Assessing the Actions of the Farm Managers to Execute Field Operations at Opportune Times
Authors: G. Edwards, N. Dybro, L. J. Munkholm, C. G. Sørensen
Abstract:
Planning agricultural operations requires an understanding of when fields are ready for operations. However, determining a field’s readiness is a difficult process that can involve large amounts of data and an experienced farm manager. A consequence of this is that operations are often executed when fields are unready, or partially unready, which can compromise results, incurring environmental impacts, decreased yield and increased operational costs. In order to assess the timeliness of operations’ execution, a new scheme is introduced to quantify the aptitude of farm managers in planning operations. Two criteria are presented by which the execution of operations can be evaluated as to their exploitation of a field’s readiness window. A dataset containing the execution dates of spring and autumn operations on 93 fields in Iowa, USA, over two years was considered as an example and used to demonstrate how operations’ executions can be evaluated. The execution dates were compared with simulated data to gain a measure of how disparate the actual execution was from the ideal execution. The presented tool is able to evaluate the spring operations better than the autumn operations, as data was lacking to correctly parameterise the crop model. Further work is needed on the underlying models of the decision support tool in order for its situational knowledge to emulate reality more consistently. However, the assessment methods and evaluation criteria presented offer a standard by which operations' execution proficiency can be quantified, and they could be used to identify farm managers who require decisional support when planning operations, or as a means of incentivising and promoting the use of sustainable farming practices.
Keywords: operation management, field readiness, sustainable farming, workability
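One simple way to quantify how disparate an actual execution is from the ideal, in the spirit of the readiness-window comparison above, is to count the days by which an operation falls outside a simulated readiness window. The dates and scoring rule below are illustrative assumptions, not the paper's exact criteria:

```python
# Hypothetical sketch of the timeliness idea: score how far an
# operation's actual execution date falls outside a simulated
# readiness window (the window dates and the day-count scoring
# are assumptions for illustration).

from datetime import date

def days_outside_window(executed, window_start, window_end):
    """0 if executed inside the readiness window, else days of deviation."""
    if executed < window_start:
        return (window_start - executed).days
    if executed > window_end:
        return (executed - window_end).days
    return 0

window = (date(2014, 4, 10), date(2014, 4, 20))
print(days_outside_window(date(2014, 4, 15), *window))  # inside the window
print(days_outside_window(date(2014, 4, 25), *window))  # 5 days late
```

Aggregating such deviations across a season's operations gives one possible proficiency score per farm manager, which could then flag managers who would benefit from decisional support.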
Procedia PDF Downloads 391
27427 Intertwined Lives: Narratives of Children with Disabilities and Their Siblings
Authors: Shyamani Hettiarachchi
Abstract:
The experiences of children with disabilities and their siblings are seldom documented in Sri Lanka. The aim of this study was to uncover the narratives of young children with disabilities and their siblings in Sri Lanka. Fifteen children with disabilities and fifteen siblings were included in this study. The participants were offered opportunities to engage in artwork and story-making activities. Narratives on the artwork and stories were gathered, and the data were analyzed using the key principles of Framework Analysis to determine the key themes. The key themes to emerge were love, protectiveness, insecurity and visibility. The results highlight the need to take account of the experiences of children with disabilities and their siblings in order to appreciate how they understand and cope with disability.
Keywords: art, children with disabilities, narratives, siblings, story making
Procedia PDF Downloads 278
27426 Artificial Neural Network-Based Prediction of Effluent Quality of Wastewater Treatment Plant Employing Data Preprocessing Approaches
Authors: Vahid Nourani, Atefeh Ashrafi
Abstract:
Prediction of treated wastewater quality is a matter of growing importance in the water treatment process. In this regard, the artificial neural network (ANN), as a robust data-driven approach, has been widely used for forecasting the effluent quality of wastewater treatment. However, developing an ANN model based on appropriate input variables is a major concern, due to the numerous parameters collected from the treatment process, whose number is increasing with the development of electronic sensors. Various studies have been conducted, using different clustering methods, to classify the most related and effective input variables. Nonetheless, the selection of dominant input variables among wastewater treatment parameters, which could effectively lead to more accurate prediction of water quality, has often been overlooked. In the presented study, two ANN models were developed with the aim of forecasting the effluent quality of the Tabriz city wastewater treatment plant. Biochemical oxygen demand (BOD) was utilized as the target parameter for water quality. Model A used principal component analysis (PCA), a linear variance-based clustering method, for input selection. Model B used the variables identified by the mutual information (MI) measure. With the optimal ANN structure, model B showed up to a 15% increase in the determination coefficient (DC) compared with model A. Thus, this study highlights the advantage of the MI measure over PCA in selecting dominant input variables for ANN modeling of wastewater treatment plant performance.
Keywords: artificial neural networks, biochemical oxygen demand, principal component analysis, mutual information, Tabriz wastewater treatment plant, wastewater treatment plant
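The MI measure used for input selection scores how much knowing a candidate input reduces uncertainty about the target. As a generic illustration (not the study's estimator or data), discrete MI can be computed directly from joint counts:

```python
# Sketch of mutual-information-based input ranking (a generic
# illustration, not the paper's exact MI estimator): discretize each
# candidate input and the target, then score I(X; Y) from joint counts.

import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in nats from two equal-length discrete sequences."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum(c / n * math.log((c / n) / (px[x] / n * py[y] / n))
               for (x, y), c in pxy.items())

# Toy data: input A mirrors the target, input B is unrelated.
target = [0, 0, 1, 1, 0, 1, 1, 0]
input_a = [0, 0, 1, 1, 0, 1, 1, 0]   # fully informative
input_b = [0, 1, 0, 1, 0, 1, 0, 1]  # carries no information about target

print(mutual_information(input_a, target) > mutual_information(input_b, target))
```

Ranking candidate inputs by such a score and keeping the top-scoring ones is the essence of MI-based input selection; unlike PCA, the score is computed against the target itself, which is one plausible reason model B outperformed model A.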
Procedia PDF Downloads 132
27425 Ranking the Factors That Influence the Construction Project Success: The Jordanian Perspective
Authors: Ghanim A. Bekr
Abstract:
Project success is what must be achieved for a project to be acceptable to the client, stakeholders, and end-users who will be affected by it. Studying project success and its critical success factors (CSFs) is a means of improving project effectiveness. This research attempts to identify which variables influence the success of project implementation. Through an extensive literature review and interviews, 83 factors in 7 groups were selected for questionnaire respondents to score. Responses from 66 professionals with an average of 15 years of experience in different types of construction projects in Jordan were collected and analyzed using SPSS, and the most important factors for various success criteria are presented, with the relative importance index used to rank the categories. The research revealed that the most significant groups of factors are client-related factors, contractor-related factors, project manager (PM)-related factors, and project-management-related factors. The top ten sub-factors are: the client's insistence on a short project duration, availability of skilled labor, the client's insistence on a high level of quality, the client's capability to take risk, previous experience of the PM in similar projects, previous experience of the contractor in similar projects, timely decision-making by the client or the client's representative, the client's insistence on a low project cost, project management experience gained in previous projects, and the flow of information among parties. The results should help construction professionals take proactive measures toward the successful completion of construction projects in Jordan.
Keywords: construction projects, critical success factors, Jordan, project success
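The relative importance index used to rank the factors can be computed directly from the questionnaire scores. The sketch below assumes a 1-5 scoring scale; the factor names and scores are illustrative, not the study's data.

```python
# Relative importance index for factors scored on a 1-5 scale:
# RII = sum(scores) / (A * N), where A is the highest possible score
# and N is the number of respondents.
def rii(scores, highest=5):
    return sum(scores) / (highest * len(scores))

# Hypothetical responses from five respondents per factor.
responses = {
    "client insists on short project time": [5, 4, 5, 5, 4],
    "availability of skilled labour": [4, 5, 4, 4, 5],
    "flow of information among parties": [3, 3, 4, 3, 4],
}

# Rank factors from most to least important by RII.
ranked = sorted(responses, key=lambda f: rii(responses[f]), reverse=True)
```

A higher RII places the factor higher in the ranking, which is how the study orders its 83 factors within and across the 7 groups.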
Procedia PDF Downloads 165
27424 Partial Differential Equation-Based Modeling of Brain Response to Stimuli
Authors: Razieh Khalafi
Abstract:
The brain is the information processing centre of the human body. Stimuli, in the form of information, are transferred to the brain, and the brain then decides how to respond to them. In this research, we propose a new partial differential equation that analyses EEG signals and establishes a relationship between incoming stimuli and the brain's response to them. To test the proposed model, a set of external stimuli was applied to it, and the model's outputs were checked against real EEG data. The results show that the model reproduces the EEG signal well. The proposed model is useful not only for modelling the EEG response to external stimuli but also for modelling the brain's response to internal stimuli.
Keywords: brain, stimuli, partial differential equation, response, EEG signal
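The abstract does not state the form of the proposed equation, so the following is only a generic illustration of the idea: a 1D diffusion PDE with a stimulus source term, stepped with explicit finite differences, maps an input stimulus to a simulated signal at a probe location. All coefficients are hypothetical.

```python
import numpy as np

# Illustrative stand-in PDE (not the paper's model):
#   u_t = D * u_xx + s(x),
# solved with an explicit finite-difference scheme, where s is a
# localized external stimulus and u is the simulated field.
D, dx, dt, nx, nt = 0.1, 0.1, 0.01, 50, 200
u = np.zeros(nx)
stimulus = np.zeros(nx)
stimulus[nx // 2] = 1.0            # localized external stimulus

response = []                      # signal trace at a probe location
for _ in range(nt):
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2
    u = u + dt * (D * lap + stimulus)
    response.append(u[nx // 2 + 5])
```

The recorded `response` plays the role of the model output that would be compared against real EEG data.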
Procedia PDF Downloads 557
27423 Developing a Model for the Relation between Heritage and Place Identity
Authors: A. Arjomand Kermani, N. Charbgoo, M. Alalhesabi
Abstract:
Given the great acceleration of change and the need for new development in cities on the one hand, and conservation and regeneration approaches on the other, place identity and its relation to heritage contexts have taken on new importance. This relation is generally a mutual and complex one. The significant point is that the process of identifying something as heritage, rather than merely a historical phenomenon, brings that which may be inherited into the realm of identity. Place identity and its attributes and components have been studied and discussed in planning and urban design as well as in environmental psychology and phenomenology; however, the relation between the physical environment (especially heritage) and identity has been neglected in the planning literature. This article reviews the knowledge in this field and develops a model of the relation between these two major concepts, heritage and identity. To build this conceptual model, we draw on the available literature on place identity and heritage environments in environmental psychology as well as planning, using a descriptive-analytical methodology, to understand how they can inform planning strategies and governance policies. A cross-disciplinary analysis is essential to understand the nature of place identity and heritage context and to develop a more holistic model of their relationship for use in the planning process and decision making. Moreover, this broader perspective would enable both social scientists and planners to learn from one another's expertise for a fuller understanding of community dynamics. The result indicates that a combination of these perspectives can provide a richer understanding, not only of how planning shapes our experience of place, but also of how place identity can shape community planning and development.
Keywords: heritage, inter-disciplinary study, place identity, planning
Procedia PDF Downloads 425
27422 CNN-Based Compressor Mass Flow Estimator in Industrial Aircraft Vapor Cycle System
Authors: Justin Reverdi, Sixin Zhang, Saïd Aoues, Fabrice Gamboa, Serge Gratton, Thomas Pellegrini
Abstract:
In vapor cycle systems, the mass flow sensor plays a key role in various monitoring and control tasks. However, physical sensors can be inaccurate, heavy, cumbersome, expensive, or highly sensitive to vibrations, which is especially problematic when they are embedded in an aircraft. A virtual sensor, built on other standard sensors, is a good alternative. This paper has two main objectives. First, a data-driven model using a convolutional neural network is proposed to estimate the mass flow of the compressor; we show that it significantly outperforms the standard polynomial regression model (thermodynamic maps) in terms of both the standard MSE metric and engineering performance metrics. Second, a semi-automatic segmentation method is proposed to compute the engineering performance metrics for real datasets, as the standard MSE metric can be misleading when analyzing the dynamic behavior of vapor cycle systems.
Keywords: deep learning, convolutional neural network, vapor cycle system, virtual sensor
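A virtual mass-flow sensor of the kind described maps a window of standard sensor channels to a scalar estimate. The numpy sketch below of a tiny convolution-plus-pooling forward pass is illustrative only: the channel count, window length, and random weights are assumptions, and the paper's actual network and training procedure are not reproduced here.

```python
import numpy as np

def conv1d(x, kernels, bias):
    # x: (channels, time); kernels: (out, in, k). Valid 1D convolution
    # followed by a ReLU nonlinearity.
    out_c, in_c, k = kernels.shape
    t_out = x.shape[1] - k + 1
    out = np.zeros((out_c, t_out))
    for o in range(out_c):
        for t in range(t_out):
            out[o, t] = np.sum(kernels[o] * x[:, t:t + k]) + bias[o]
    return np.maximum(out, 0.0)

rng = np.random.default_rng(0)
window = rng.normal(size=(4, 64))          # 4 sensor channels, 64 samples
w1, b1 = rng.normal(size=(8, 4, 5)) * 0.1, np.zeros(8)
w2 = rng.normal(size=(8,)) * 0.1

features = conv1d(window, w1, b1).mean(axis=1)  # global average pooling
mass_flow_estimate = float(features @ w2)       # linear read-out
```

In a trained model the weights would be fitted by regression against measured mass flow; here they are random, so only the data flow is shown.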
Procedia PDF Downloads 64
27421 The Various Legal Dimensions of Genomic Data
Authors: Amy Gooden
Abstract:
When human genomic data is considered, this is often done through only one dimension of the law, or the interplay between the various dimensions is not considered, thus providing an incomplete picture of the legal framework. This research considers and analyzes the various dimensions in South African law applicable to genomic sequence data, including property rights, personality rights, and intellectual property rights. The effective use of personal genomic sequence data requires the acknowledgement and harmonization of the rights applicable to such data.
Keywords: artificial intelligence, data, law, genomics, rights
Procedia PDF Downloads 144
27420 Orchestra Course Outcomes in Terms of Values Education
Authors: Z. Kurtaslan, H. Hakan Okay, E. Can Dönmez, I. Kuçukdoğan
Abstract:
Music education aims to bring individuals up, in the most appropriate way and to advanced levels, as a balanced whole: physically, cognitively, affectively, and kinesthetically, while making a major contribution to the individual's physical and spiritual development. The most crucial aim of music education, an influential educational medium in itself, is to instill a love of music; yet among its educational aims are concepts such as affinity, friendship, goodness, philanthropy, responsibility, and respect, all crucial to bringing individuals up as a balanced whole. One of the most essential assets of music education is training in making music together, which solidifies musical knowledge and fosters cooperation. This habit requires the internalization of values such as responsibility, patience, cooperativeness, respect, self-control, friendship, and fairness; if musicians lack these values, the ensemble will in time become a cacophony. In this qualitative research, the attitudes of music teacher candidates in orchestra and chamber music classes are examined in terms of values.
Keywords: education, music, orchestra/chamber music, values
Procedia PDF Downloads 506
27419 Application of Stochastic Models to Annual Extreme Streamflow Data
Authors: Karim Hamidi Machekposhti, Hossein Sedghi
Abstract:
This study was designed to find the best stochastic model, using time series analysis, for the annual extreme streamflow (peak and maximum streamflow) of the Karkheh River in Iran. An auto-regressive integrated moving average (ARIMA) model was used to simulate these series and forecast them into the future. For the analysis, annual extreme streamflow data from the Jelogir Majin station (upstream of the Karkheh dam reservoir) for the years 1958-2005 were used. A visual inspection of the time plot shows a slight increasing trend; therefore, the series is not stationary. The non-stationarity observed in the auto-correlation function (ACF) and partial auto-correlation function (PACF) plots of annual extreme streamflow was removed using first-order differencing (d = 1) before developing the ARIMA model. The ARIMA(4,1,1) model was found to be the most suitable for simulating the annual extreme streamflow of the Karkheh River. The model proved appropriate for forecasting ten years of annual extreme streamflow and can assist decision makers in establishing priorities for water demand. The Statistical Analysis System (SAS) and Statistical Package for the Social Sciences (SPSS) packages were used to determine the best model for this series.
Keywords: stochastic models, ARIMA, extreme streamflow, Karkheh river
Procedia PDF Downloads 150
27418 A Novel Guided Search Based Multi-Objective Evolutionary Algorithm
Authors: A. Baviskar, C. Sandeep, K. Shankar
Abstract:
Solving multi-objective optimization problems requires fast convergence and a good spread of solutions. Though existing evolutionary algorithms (EAs) can achieve this, the computational effort can be reduced further by hybridizing them with innovative strategies. This study focuses on converging to the Pareto front faster while retaining the advantages of the Strength Pareto Evolutionary Algorithm-II (SPEA-II) for a better spread. Two approaches based on optimizing the objective functions independently are implemented. In the first method, the decision variables corresponding to the optima of the individual objective functions are used strategically to guide the search towards the Pareto front. In the second method, boundary points of the Pareto front are calculated and their decision variables are seeded into the initial population. Both methods are applied to various constrained and unconstrained multi-objective test functions. The proposed guided-search-based algorithm is observed to give better convergence and diversity than several well-known existing algorithms (such as NSGA-II and SPEA-II) in considerably fewer iterations.
Keywords: boundary points, evolutionary algorithms (EAs), guided search, strength pareto evolutionary algorithm-II (SPEA-II)
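The seeding idea, optimizing each objective independently and injecting the resulting decision vectors into the initial population so the search starts near the extremes of the Pareto front, can be sketched as below on Schaffer's classic two-objective test function; the population size and the choice of local optimizer are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Schaffer's two-objective problem: f1(x) = x^2, f2(x) = (x - 2)^2.
f1 = lambda x: float(x[0] ** 2)
f2 = lambda x: float((x[0] - 2.0) ** 2)

rng = np.random.default_rng(0)
pop_size, bounds = 20, (-5.0, 5.0)
population = rng.uniform(*bounds, size=(pop_size, 1))

# Replace the first individuals with the single-objective optima, so the
# EA begins with solutions at both ends of the Pareto front.
for i, f in enumerate((f1, f2)):
    res = minimize(f, x0=np.array([0.5]), bounds=[bounds])
    population[i] = res.x
```

The seeded population would then be evolved with the usual SPEA-II operators; only the initialization differs from the standard algorithm.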
Procedia PDF Downloads 278
27417 Bridge Members Segmentation Algorithm of Terrestrial Laser Scanner Point Clouds Using Fuzzy Clustering Method
Authors: Donghwan Lee, Gichun Cha, Jooyoung Park, Junkyeong Kim, Seunghee Park
Abstract:
3D shape models of existing structures are required for many purposes, such as safety and operations management. Traditional 3D modeling methods are based on manual or semi-automatic reconstruction from close-range images, which is expensive and time-consuming. The terrestrial laser scanner (TLS) is a common survey technique for measuring a 3D shape model quickly and accurately, and is used in construction-site and cultural heritage management. However, processing a TLS point cloud has many limitations, because the raw point cloud is a massive volume of data, so the capability to carry out useful analyses on unstructured 3D points is also limited. Segmentation therefore becomes an essential step whenever points must be grouped by common attributes. In this paper, a member segmentation algorithm is presented to separate a raw point cloud that contains only 3D coordinates. The paper presents a clustering approach based on a fuzzy method for this objective: fuzzy c-means (FCM) is reviewed and used in combination with a similarity-driven cluster merging method. It is applied to a point cloud acquired with a Leica Scan Station C10/C5 at the test bed, a pedestrian bridge connecting the first and second engineering buildings at Sungkyunkwan University in Korea, about 32 m long and 2 m wide. The 3D point cloud of the test bed was constructed from a TLS measurement and divided into individual members by the segmentation algorithm. Experimental analyses of the results from the proposed unsupervised segmentation process are promising: because the point cloud is segmented, it can be processed to manage the configuration of each member.
Keywords: fuzzy c-means (FCM), point cloud, segmentation, terrestrial laser scanner (TLS)
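A minimal fuzzy c-means sketch on a toy 3D point cloud (two separated blobs standing in for bridge members) illustrates the clustering step; the study additionally applies a similarity-driven cluster merging method to real TLS data, which is not reproduced here.

```python
import numpy as np

def fuzzy_c_means(points, c=2, m=2.0, n_iter=50, seed=0):
    # Alternate between the FCM center update and the membership update
    # u_ik proportional to d_ik^(-2/(m-1)), normalized over clusters.
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), c))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        w = u ** m
        centers = (w.T @ points) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(points[:, None, :] - centers[None], axis=2) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

rng = np.random.default_rng(1)
blob_a = rng.normal(loc=(0, 0, 0), scale=0.2, size=(100, 3))
blob_b = rng.normal(loc=(5, 0, 0), scale=0.2, size=(100, 3))
cloud = np.vstack([blob_a, blob_b])

centers, memberships = fuzzy_c_means(cloud)
labels = memberships.argmax(axis=1)   # hard assignment per point
```

Each label group corresponds to one segmented "member"; in the paper these groups are further merged when neighbouring clusters are sufficiently similar.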
Procedia PDF Downloads 238
27416 Project-Based Learning in Engineering Education
Authors: M. Greeshma, V. Ashvini, P. Jayarekha
Abstract:
Project-based learning (PBL) is a student-driven educational framework that offers students the opportunity for in-depth investigation of course topics. This paper presents the need for PBL in engineering education so that students graduate with the capacity to design and implement solutions to complex problems. The implementation strategy of PBL and its related challenges are presented, along with a case study that energizes the engineering curriculum with relevance to the real world of technology and describes its benefits to students.
Keywords: PBL, engineering education, curriculum, implement complex
Procedia PDF Downloads 476
27415 Data Analysis Tool for Predicting Water Scarcity in Industry
Authors: Tassadit Issaadi Hamitouche, Nicolas Gillard, Jean Petit, Valerie Lavaste, Celine Mayousse
Abstract:
Water is a fundamental resource for industry. It is taken from the environment, either from municipal distribution networks or from various natural sources such as the sea, rivers, and aquifers. Once used, water is discharged into the environment or reprocessed at the plant or at treatment plants. These withdrawals and discharges have a direct impact on natural water resources. The impacts may concern the quantity of water available or the quality of the water used, or they may be less direct and more complex to measure, such as the health of the population downstream of a watercourse. Based on the analysis of data (meteorology, river characteristics, physicochemical substances), we wish to: predict water stress episodes and anticipate prefectoral decrees, which can affect plant performance, and propose improvement solutions; help industrialists choose the location of a new plant; visualize possible interactions between companies to optimize exchanges and encourage the pooling of water treatment solutions; and set up circular economies around the issue of water. Developing a system for collecting, processing, and using data related to water resources requires making its functional constraints explicit. The system must be able to store a large amount of data from sensors, the main type of data in plants and their environment. In addition, manufacturers need 'near-real-time' processing of information in order to make the best decisions (for example, to be rapidly notified of an event that would significantly impact water resources). Finally, the visualization of data must be adapted to its temporal and geographical dimensions.
In this study, we set up an infrastructure centered on the TICK application stack (Telegraf, InfluxDB, Chronograf, and Kapacitor), a set of loosely coupled but tightly integrated open-source projects designed to manage huge amounts of time-stamped information. The software architecture is coupled with the cross-industry standard process for data mining (CRISP-DM) methodology. The architecture and methodology demonstrated their effectiveness on the case study of forecasting a river level over a 7-day horizon. Water management, and the plant activities that depend on this resource, should be considerably improved thanks, on the one hand, to the learning that allows periods of water stress to be anticipated and, on the other, to an information system able to warn decision-makers with alerts created from the formalization of prefectoral decrees.
Keywords: data mining, industry, machine learning, shortage, water resources
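The 7-day-horizon river-level learning mentioned in the case study can be sketched, under simple assumptions, as a lag-feature regression; the synthetic seasonal series, the 14-day lag window, and the linear model are all illustrative stand-ins for the real sensor data flowing through the TICK stack.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Two years of synthetic daily river levels with a seasonal cycle,
# standing in for time-stamped sensor data stored in InfluxDB.
rng = np.random.default_rng(0)
days = np.arange(730)
level = (2.0 + 0.5 * np.sin(2 * np.pi * days / 365)
         + rng.normal(scale=0.02, size=days.size))

# Predict the level 7 days ahead from the previous 14 daily readings.
lags, horizon = 14, 7
X = np.array([level[t - lags:t] for t in range(lags, len(level) - horizon)])
y = level[lags + horizon:]

model = LinearRegression().fit(X, y)
next_week = model.predict(level[-lags:].reshape(1, -1))[0]
```

An alert rule (in Kapacitor, say) could then compare such forecasts against the thresholds formalized from prefectoral decrees.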
Procedia PDF Downloads 122
27414 Evaluating Multiple Diagnostic Tests: An Application to Cervical Intraepithelial Neoplasia
Authors: Areti Angeliki Veroniki, Sofia Tsokani, Evangelos Paraskevaidis, Dimitris Mavridis
Abstract:
The plethora of diagnostic test accuracy (DTA) studies has led to the increased use of systematic reviews and meta-analyses of DTA studies. Clinicians and healthcare professionals often consult DTA meta-analyses to make informed decisions about the optimum test to choose and use in a given setting. For example, the human papilloma virus (HPV) DNA test, the HPV mRNA test, and cytology can all be used to diagnose cervical intraepithelial neoplasia grade 2+ (CIN2+). But which test is the most accurate? Studies directly comparing test accuracy are not always available, and comparisons between multiple tests create a network of DTA studies that can be synthesized through a network meta-analysis of diagnostic tests (DTA-NMA). Our aim is to summarize the DTA-NMA methods for at least three index tests presented in the methodological literature, and we illustrate their application using a real dataset on the comparative accuracy of HPV DNA, HPV mRNA, and cytology tests for cervical cancer. A search was conducted in PubMed, Web of Science, and Scopus from inception until the end of July 2019 to identify full-text research articles describing a DTA-NMA method for three or more index tests. Since the joint classification of the results of one index test against another, among those with and without the target condition, is rarely reported in DTA studies, only methods requiring the 2x2 table of each index test against the reference standard were included. Studies of any design published in English were eligible for inclusion, and relevant unpublished material was also included. Ten relevant studies were finally included, and their methodology was evaluated. The DTA-NMA methods presented in the literature are described, together with their advantages and disadvantages.
In addition, using 37 studies for cervical cancer obtained from a published Cochrane review as a case study, the identified DTA-NMA methods are applied to determine the most promising test (in terms of sensitivity and specificity) for use as the best screening test to detect CIN2+. In conclusion, different approaches to the comparative DTA meta-analysis of multiple tests may lead to different results and hence may influence decision-making. Acknowledgment: This research is co-financed by Greece and the European Union (European Social Fund, ESF) through the Operational Programme "Human Resources Development, Education and Lifelong Learning 2014-2020" in the context of the project "Extension of Network Meta-Analysis for the Comparison of Diagnostic Tests" (MIS 5047640).
Keywords: colposcopy, diagnostic test, HPV, network meta-analysis
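The 2x2 tables required by the included methods yield each index test's sensitivity and specificity directly, which are the quantities pooled across the network. The counts below are illustrative, not the Cochrane review's data.

```python
# Per-test 2x2 table against the reference standard, as
# (true positives, false positives, false negatives, true negatives).
def accuracy(tp, fp, fn, tn):
    return {"sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp)}

# Hypothetical counts for the three index tests compared in the case study.
tables = {
    "HPV DNA":  (90, 20, 10, 80),
    "HPV mRNA": (85, 10, 15, 90),
    "cytology": (70,  5, 30, 95),
}
results = {test: accuracy(*t) for test, t in tables.items()}
```

A DTA-NMA pools such study-level pairs across the whole network, so indirect comparisons between tests never evaluated head-to-head become possible.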
Procedia PDF Downloads 142
27413 Big Brain: A Single Database System for a Federated Data Warehouse Architecture
Authors: X. Gumara Rigol, I. Martínez de Apellaniz Anzuola, A. Garcia Serrano, A. Franzi Cros, O. Vidal Calbet, A. Al Maruf
Abstract:
Traditional federated architectures for data warehousing work well when corporations have existing regional data warehouses and need to aggregate data at a global level. Schibsted Media Group has been maturing from a decentralised organisation into a more globalised one and needed to build regional data warehouses for some brands at the same time as the global one. In this paper, we present the architectural alternatives studied and why a custom federated approach was the recommendation for the implementation. Although the data warehouses are logically federated, the implementation uses a single database system, which brings many advantages: cost reduction, and improved data access for global users, allowing consumers of the data to work with a common data model for detailed analysis across different geographies while keeping a flexible layer for local, specific needs in the same place.
Keywords: data integration, data warehousing, federated architecture, Online Analytical Processing (OLAP)
Procedia PDF Downloads 238