Search results for: editing tool
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5039

4229 Mineral Deposits in Spatial Planning Systems – Review of European Practices

Authors: Alicja Kot-Niewiadomska

Abstract:

Securing sustainable access to raw materials is vital for the growth of the European economy and for the goals laid down in the Europe 2020 strategy. One of the most important sources of mineral raw materials is primary deposits, and their efficient management, including extraction, will ensure the competitiveness of the European economy. A critical element of this approach is mineral deposit safeguarding, and its most important tool is spatial planning. The safeguarding of deposits should be understood as safeguarding land access and protecting the area against development that may potentially prevent the use of the deposit and the necessary mining activities. Many European Union countries have successfully integrated their mineral policy and spatial policy, which has secured the proper place of mineral deposits in their spatial planning systems. These systems, in turn, are widely recognized as the most important mineral deposit safeguarding tool, the essence of which is to ensure long-term access to deposit resources. The examples of Austria, Portugal, Slovakia, the Czech Republic, Sweden, and the United Kingdom, discussed in the paper, are often cited as good practices in this area. Although none of these countries has managed to avoid social and environmental conflicts related to mining activities, the solutions they implement certainly deserve special attention, and for many countries, including Poland, they can be a potential source of solutions aimed at improving the protection of mineral deposits.

Keywords: mineral deposits, land use planning, mineral deposit safeguarding, European practices

Procedia PDF Downloads 167
4228 Comet Assay: A Promising Tool for the Risk Assessment and Clinical Management of Head and Neck Tumors

Authors: Sarim Ahmad

Abstract:

The single-cell gel electrophoresis assay (SCGE, known as the comet assay) is a simple, sensitive, state-of-the-art technique for quantifying DNA damage and repair at the individual-cell level in in vivo and in vitro samples of eukaryotic cells and some prokaryotic cells. It is widely used in areas including human biomonitoring, genotoxicology, and ecological monitoring, and as a research tool for studying DNA damage and repair in different cell types in response to a range of DNA-damaging agents, cancer risk, and therapy. The method involves encapsulating cells in a low-melting-point agarose suspension, lysing the cells under neutral or alkaline (pH > 13) conditions, and electrophoresing the suspended lysed cells, resulting in structures resembling comets as observed by fluorescence microscopy; the intensity of the comet tail relative to the head reflects the number of DNA breaks. The likely basis for this is that loops containing a break lose their supercoiling and become free to extend towards the anode. This is followed by staining the DNA and measuring fluorescence to determine the extent of DNA damage, either by manual scoring or automatically with imaging software. The assay can therefore predict an individual’s tumor sensitivity to radiation and various chemotherapeutic drugs, assess oxidative stress within tumors, and detect the extent of DNA damage in various cancerous and precancerous lesions of the oral cavity.
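The scoring described above, whether manual or automated, reduces to comparing fluorescence in the comet tail with the total comet fluorescence. A minimal sketch of two standard comet-assay metrics is given below; the function names and the simplified tail-moment variant (tail length weighted by percent tail DNA) are illustrative assumptions, not taken from any particular scoring package.

```python
def percent_tail_dna(head_intensity, tail_intensity):
    """Percentage of total comet fluorescence found in the tail.

    One of the standard comet-assay metrics reported by scoring software;
    higher values indicate more DNA breaks.
    """
    total = head_intensity + tail_intensity
    if total == 0:
        raise ValueError("comet has no measurable fluorescence")
    return 100.0 * tail_intensity / total

def tail_moment(tail_length, head_intensity, tail_intensity):
    """Simplified tail moment: tail length weighted by the tail-DNA fraction."""
    return tail_length * percent_tail_dna(head_intensity, tail_intensity) / 100.0

# A lightly damaged cell: 90% of fluorescence remains in the head.
print(percent_tail_dna(900, 100))   # → 10.0
print(tail_moment(25, 900, 100))    # → 2.5
```

In practice the intensities come from summing pixel values in the segmented head and tail regions of the microscope image.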

Keywords: comet assay, single cell gel electrophoresis, DNA damage, early detection test

Procedia PDF Downloads 291
4227 TACTICAL: RAM Image Retrieval in Linux Using Protected Mode Architecture’s Paging Technique

Authors: Sedat Aktas, Egemen Ulusoy, Remzi Yildirim

Abstract:

This article explains how to obtain a RAM image from a computer running Linux and the steps to follow when acquiring it. Taking a RAM image means dumping the physical memory at a given instant and writing it to a file; the process can be likened to taking a picture of everything in the computer’s memory at that moment. This step is essential for tools that analyze RAM images, such as Volatility, because an image must be acquired before these tools can analyze memory. Such tools are used extensively in digital forensics, the set of processes for examining the information on any computer or server on behalf of official authorities. In this article, the protected-mode architecture of the Linux operating system is examined, and a method for saving an image of kernel and system memory to disk via a kernel driver is followed. The tables and access methods of the operating system are examined based on its basic architecture, and the most appropriate methods and their application are presented. Since the literature contains no article directly addressing this topic on Linux, this study aims to contribute to the literature on obtaining RAM images. LiME can be mentioned as a similar tool, but its memory-dumping method is not documented. Given how frequently such tools are used in work on RAM images, contributing to the field of digital forensics has been the main motivation of this study.
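A memory-acquisition tool of the kind discussed here must first learn which physical address ranges actually hold system RAM, since blindly reading reserved or device-mapped regions can hang the machine; on Linux this layout is exported in `/proc/iomem`. The sketch below parses text in that format to enumerate the top-level "System RAM" ranges a dumper would iterate over. The sample content and addresses are invented for illustration, and this is only the range-discovery step, not the dumping method of LiME or of the tool described in the paper.

```python
import re

# Text in the format of Linux's /proc/iomem (addresses are hypothetical).
SAMPLE_IOMEM = """\
00000000-00000fff : Reserved
00001000-0009ffff : System RAM
000a0000-000fffff : PCI Bus 0000:00
00100000-3fffffff : System RAM
  01000000-01ffffff : Kernel code
40000000-7fffffff : PCI Bus 0000:00
"""

def system_ram_ranges(iomem_text):
    """Return (start, end) physical-address pairs for top-level System RAM."""
    ranges = []
    for line in iomem_text.splitlines():
        # re.match anchors at the start, so indented child entries
        # (e.g. "Kernel code") are skipped automatically.
        m = re.match(r"([0-9a-f]+)-([0-9a-f]+) : System RAM$", line)
        if m:
            ranges.append((int(m.group(1), 16), int(m.group(2), 16)))
    return ranges

for start, end in system_ram_ranges(SAMPLE_IOMEM):
    print(f"dump {end - start + 1:#x} bytes starting at {start:#x}")
```

Reading the real `/proc/iomem` with unredacted addresses requires root privileges.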

Keywords: linux, paging, addressing, ram-image, memory dumping, kernel modules, forensic

Procedia PDF Downloads 109
4226 River Offtake Management Using Mathematical Modelling Tool: A Case Study of the Gorai River, Bangladesh

Authors: Sarwat Jahan, Asker Rajin Rahman

Abstract:

Management of the offtake of any fluvial river is highly sensitive in terms of long-term sustainability where water flow and sediment transport vary widely throughout a hydrological year. The Gorai River is a major distributary of the Ganges River in Bangladesh and a primary source of fresh water for the south-west part of the country. Every year, significant siltation at the Gorai offtake disconnects it from the Ganges during the dry season. As a result, the socio-economic and environmental condition of the downstream areas has been deteriorating for decades. To improve the overall situation of the Gorai offtake and its dependent areas, a study was conducted by the Institute of Water Modelling, Bangladesh, in 2022. Simulations using the mathematical morphological modelling tool MIKE 21C of DHI Water & Environment, Denmark, revealed the need for dredging and river training structures at the Gorai offtake to ensure significant dry-season flow downstream. The dry-season flow is found to increase significantly with the proposed river interventions, which also improves environmental conditions in terms of salinity in the south-west zone of the country. This paper summarizes the primary findings of the developed mathematical model for improving the existing condition of the Gorai River.

Keywords: Gorai river, mathematical modelling, offtake, siltation, salinity

Procedia PDF Downloads 92
4225 Improving the Detection of Depression in Sri Lanka: Cross-Sectional Study Evaluating the Efficacy of a 2-Question Screen for Depression

Authors: Prasad Urvashi, Wynn Yezarni, Williams Shehan, Ravindran Arun

Abstract:

Introduction: Primary health services are often the first point of contact that patients with mental illness have with the healthcare system. A number of tools have been developed to increase the detection of depression in the context of primary care. However, one challenge among many is applying these tools within the limited primary care consultation timeframe. Therefore, short depression-screening questionnaires that are as effective as more comprehensive diagnostic tools may help improve detection rates among patients visiting a primary care setting. Objective: To develop and determine the sensitivity and specificity of a 2-Question Questionnaire (2-QQ) to screen for depression in a suburban primary care clinic in Ragama, Sri Lanka. The purpose is to develop a short, culturally adapted screening tool for depression in order to increase its detection in the Sri Lankan patient population. Methods: This was a cross-sectional study involving two steps. Step one: verbal administration of the 2-QQ to patients by their primary care physician. Step two: completion of the Peradeniya Depression Scale (PDS), a validated diagnostic tool for depression, by the patient after their consultation with the primary care physician. The results from the PDS were then correlated with the results from the 2-QQ for each patient to determine the sensitivity and specificity of the 2-QQ. Results: A score of 1 or above on the 2-QQ was most sensitive but least specific. Thus, setting the threshold at this level is effective for correctly identifying depressed patients, but it also inaccurately captures patients who are not depressed. A score of 6 on the 2-QQ was most specific but least sensitive: setting the threshold at this level is effective for correctly identifying patients without depression, but not very effective at capturing patients with depression.
Discussion: In the context of primary care, it may be worthwhile to set the 2-QQ screen at a lower threshold for positivity (such as a score of 1 or above). This would give the test high sensitivity and thus capture the majority of patients who have depression. On the other hand, with a low threshold for positivity, patients who do not have depression but score 1 or higher on the 2-QQ will be falsely identified as positive for depression. However, the benefits of identifying patients who present with depression may outweigh the harms of falsely identifying a non-depressed patient. It is our hope that the 2-QQ will serve as a quick primary screen for depression in the primary care setting and as a catalyst to identify and treat individuals with depression.
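The sensitivity/specificity trade-off at the heart of this abstract comes down to four counts per threshold: screen-positive and screen-negative patients among those the reference test (here, the PDS) classifies as depressed or not depressed. A minimal sketch of the computation follows; the counts are invented purely to illustrate how a low threshold trades specificity for sensitivity, and are not the study's data.

```python
def screen_metrics(tp, fn, fp, tn):
    """Sensitivity and specificity of a screen against a reference diagnosis.

    tp/fn: depressed patients (per the reference test) who screened
    positive/negative; fp/tn: non-depressed patients who screened
    positive/negative.
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts at a low threshold (score >= 1): most depressed
# patients are caught, at the cost of many false positives.
low = screen_metrics(tp=28, fn=2, fp=40, tn=30)
# Hypothetical counts at a high threshold (score of 6): few false
# positives, but many depressed patients are missed.
high = screen_metrics(tp=10, fn=20, fp=4, tn=66)
print(f"low threshold:  sensitivity={low[0]:.2f}, specificity={low[1]:.2f}")
print(f"high threshold: sensitivity={high[0]:.2f}, specificity={high[1]:.2f}")
```
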

Keywords: depression, primary care, screening tool, Sri Lanka

Procedia PDF Downloads 254
4224 Reasonableness to Strengthen Citizen Participation in Mexican Anti-Corruption Policies

Authors: Amós García Montaño

Abstract:

In a democracy, a public policy must be developed within the regulatory framework and must consider citizen participation in its planning, design, execution, and evaluation stages; these factors are necessary for the policy to have both legal support and sufficient legitimacy to operate. However, the complexity and magnitude of certain public problems make it difficult to build consensus among members of society, leading to unstable and unsuccessful scenarios for exercising the right to citizen participation and for generating effective and efficient public policies. This is the case for public policies against corruption, an issue that in Mexico is difficult to define and generates conflicting opinions. To provide a possible solution to this delicate reality, this paper analyzes the principle of reasonableness as a tool for identifying the basic elements that guarantee a fundamental level of the exercise of the right to citizen participation in the fight against corruption, adopting elements of human rights indicator methodologies. In this sense, the paper observes the relevance of having a legal framework that establishes obligations to incorporate proactive and transversal citizen participation in this matter. It also notes the need to monitor the operation of the various citizen participation mechanisms in the decision-making processes of the institutions involved in fighting and preventing corruption, which would improve the perception of citizens as relevant actors in this field. It is concluded that the principle of reasonableness is a very useful tool for identifying basic elements that facilitate the fulfillment of human rights commitments in the field of public policies.

Keywords: anticorruption, public participation, public policies, reasonableness

Procedia PDF Downloads 80
4223 Machine Translation Analysis of Chinese Dish Names

Authors: Xinyu Zhang, Olga Torres-Hostench

Abstract:

This article presents a comparative study evaluating the quality of machine translation (MT) output for Chinese gastronomic nomenclature. Chinese gastronomic culture is gaining increasing international recognition. The nomenclature of Chinese gastronomy not only reflects a specific aspect of culture but is also related to other areas of society, such as philosophy and traditional medicine. Chinese dish names are composed of several types of cultural references, such as ingredients, colors, flavors, culinary techniques, cooking utensils, toponyms, anthroponyms, metaphors, and historical tales, among others. These cultural references pose some of the biggest difficulties in translation, usually requiring the use of translation techniques. Given the scarcity of Chinese food-related translation studies, especially in Chinese-Spanish translation, and the current massive use of MT, the quality of MT output for Chinese dish names is questioned. Fifty Chinese dish names with different types of cultural components were selected for this study. First, all of these dish names were translated by three different MT tools (Google Translate, Baidu Translate, and Bing Translator). Second, a questionnaire was designed and completed by 12 Chinese online users (Chinese graduates of a Hispanic Philology major) in order to find out user preferences regarding the collected MT output. Finally, human translation techniques were observed and analyzed to identify which translation techniques appeared most often in the preferred MT proposals. The results reveal that the MT output for Chinese gastronomic nomenclature is not of high quality. It would be advisable not to rely on MT in contexts such as restaurant menus or TV culinary shows. However, the MT output could be used as an aid for tourists to get a general idea of a dish (the main ingredients, for example).
Literal translation turned out to be the most observed technique, followed by borrowing, generalization, and adaptation, while amplification, particularization, and transposition were infrequently observed, possibly because current MT engines are limited to matching equivalent terms and offering literal translations without taking into account the overall contextual meaning of the dish name, which is essential to applying those less observed techniques. This could give insight into the post-editing of Chinese dish name translations. By observing and analyzing the translation techniques in the machine translators' proposals, post-editors could better decide which techniques to apply in each case so as to correct mistakes and improve the quality of the translation.

Keywords: Chinese dish names, cultural references, machine translation, translation techniques

Procedia PDF Downloads 132
4222 Neural Network Models for Actual Cost and Actual Duration Estimation in Construction Projects: Findings from Greece

Authors: Panagiotis Karadimos, Leonidas Anthopoulos

Abstract:

Predicting actual cost and actual duration in construction projects remains a persistent problem for the construction sector. This paper addresses the problem with modern methods and data from past public construction projects: 39 bridge projects constructed in Greece with a similar type of available data were examined. Relating each project’s attributes to the actual cost and the actual duration, correlation analysis was performed and the most appropriate predictive project variables were defined. Additionally, the most efficient subgroup of variables was selected with the attribute-selection function of the WEKA application. The selected variables were used as input neurons for neural network models, which were constructed with the FANN Tool application. The optimum neural network model for predicting the actual cost produced a mean squared error of 3.84886e-05 and was based on the budgeted cost and the quantity of deck concrete. The optimum neural network model for predicting the actual duration produced a mean squared error of 5.89463e-05 and was likewise based on the budgeted cost and the quantity of deck concrete.
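To make the modelling step concrete, the sketch below trains the simplest possible stand-in: a single linear neuron on two normalised inputs (budgeted cost, quantity of deck concrete) predicting actual cost, fitted by stochastic gradient descent on mean squared error. The data points are invented and exactly linear; the authors trained real multilayer networks with the FANN Tool on 39 Greek bridge projects, so this is only a didactic illustration of the fitting loop.

```python
# Invented, exactly-linear training pairs: ((budgeted_cost, deck_concrete),
# actual_cost), all normalised to [0, 1].
data = [((0.2, 0.3), 0.27), ((0.5, 0.4), 0.53), ((0.8, 0.7), 0.83),
        ((0.4, 0.6), 0.49), ((0.9, 0.8), 0.93)]

w1 = w2 = b = 0.0   # a single linear neuron: pred = w1*x1 + w2*x2 + b
lr = 0.1
for _ in range(5000):                      # epochs of stochastic gradient descent
    for (x1, x2), y in data:
        err = w1 * x1 + w2 * x2 + b - y    # prediction error for this sample
        w1 -= lr * err * x1
        w2 -= lr * err * x2
        b -= lr * err

mse = sum((w1 * x1 + w2 * x2 + b - y) ** 2 for (x1, x2), y in data) / len(data)
print(f"mean squared error after training: {mse:.2e}")
```

With realisable data the loop converges to a near-zero MSE, mirroring the very small errors (on the order of 1e-05) the paper reports for its trained networks.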

Keywords: actual cost and duration, attribute selection, bridge construction, neural networks, predicting models, FANN TOOL, WEKA

Procedia PDF Downloads 132
4221 Online Delivery Approaches of Post Secondary Virtual Inclusive Media Education

Authors: Margot Whitfield, Andrea Ducent, Marie Catherine Rombaut, Katia Iassinovskaia, Deborah Fels

Abstract:

In North America, learning how to create inclusive media, such as closed captioning (CC) and audio description (AD), is restricted to proprietary, company-based training in the private sector. We are delivering, through synchronous and asynchronous online learning, the first Canadian post-secondary, practice-based continuing education course package in inclusive media for broadcast production and processes. Despite the prevalence of CC and AD taught within the field of translation studies in Europe, North America has no comparable field of study. This novel approach to audiovisual translation (AVT) education develops evidence-based methodological innovations stemming from user-study research with blind/low-vision and Deaf/hard-of-hearing audiences for television and theatre undertaken at Ryerson University. Knowledge outcomes from the courses include: a) understanding how CC/AD fit within disability/regulatory frameworks in Canada; b) knowledge of how CC/AD could be employed in the initial stages of production development within broadcasting; c) writing and/or speaking techniques designed for media; d) hands-on practice in captioning re-speaking techniques and open-source technologies, or in AD techniques; and e) understanding of audio production technologies and editing techniques. The case study of curriculum development and deployment, involving first-time online course delivery by academic and practitioner instructors in introductory Captioning and Audio Description courses (CDIM 101 and 102), will compare the two instructors' approaches to learning design, including the ratio of synchronous to asynchronous classroom time and the engagement tools of the meeting software platform, such as breakout rooms and polling. Student reception of these two approaches will be analysed using qualitative thematic analysis and quantitative survey analysis.
Thus far, anecdotal conversations with students suggest that they prefer synchronous over asynchronous learning within our hands-on online course delivery method.

Keywords: inclusive media theory, broadcasting practices, AVT post secondary education, respeaking, audio description, learning design, virtual education

Procedia PDF Downloads 178
4220 Technical, Environmental and Financial Assessment for Optimal Sizing of Run-of-River Small Hydropower Project: Case Study in Colombia

Authors: David Calderon Villegas, Thomas Kaltizky

Abstract:

Run-of-river (RoR) hydropower projects represent a viable, clean, and cost-effective alternative to dam-based plants and provide decentralized power production. However, the cost-effectiveness of RoR schemes depends on the proper selection of site and design flow, which is a challenging task because it requires multivariate analysis. In this respect, this study presents the development of an investment decision-support tool for assessing the optimal size of an RoR scheme considering technical, environmental, and cost constraints. The net present value (NPV) from a project perspective is used as the objective function for supporting the investment decision. The tool has been tested by applying it to an actual RoR project recently proposed in Colombia. The results show that the optimum in financial terms does not match the flow that maximizes energy generation from the river's available flow. For the case study, the flow that maximizes energy corresponds to 5.1 m3/s, whereas a flow of 2.1 m3/s maximizes the investor's NPV. Finally, a sensitivity analysis is performed to determine the NPV as a function of changes in the debt rate, electricity prices, and CapEx. Even in the worst-case scenario, the optimal size represents a positive business case, with an NPV of USD 2.2 million and an IRR 1.5 times higher than the discount rate.
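The divergence between the energy-maximising and NPV-maximising design flows can be reproduced with a toy NPV objective function. In the sketch below, the CapEx and revenue scalings as functions of design flow are entirely invented (they are not the study's cost model); they merely illustrate how superlinear investment cost and diminishing energy returns can push the financial optimum below the energy optimum, as the paper finds (2.1 vs 5.1 m3/s).

```python
def npv(cash_flows, discount_rate):
    """Net present value of yearly cash flows; index 0 is the initial investment."""
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

def project_npv(design_flow_m3s, rate=0.10, years=20):
    """Toy RoR business case as a function of design flow (all scalings invented)."""
    capex = -8.0e5 * design_flow_m3s ** 1.3          # superlinear investment cost
    annual_revenue = 5.5e5 * design_flow_m3s ** 0.5  # diminishing energy returns
    return npv([capex] + [annual_revenue] * years, rate)

for q in (2.1, 5.1):
    print(f"design flow {q} m3/s -> NPV {project_npv(q) / 1e6:.2f} MUSD")
```

With these assumed scalings the smaller design flow wins financially even though the larger one harvests more energy, which is the qualitative behaviour the decision-support tool is built to detect.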

Keywords: small hydropower, renewable energy, RoR schemes, optimal sizing, objective function

Procedia PDF Downloads 129
4219 Key Parameters Analysis of the Stirring Systems in the Optimization Procedures

Authors: T. Gomes, J. Manzi

Abstract:

The inclusion of stirring systems in calculation and optimization procedures has suffered a significant lack of attention, which can affect the results, because such systems provide additional energy to the process and promote a better distribution of mass and energy. This is meaningful for reactive systems, particularly for the continuous stirred-tank reactor (CSTR), for which the key variables and parameters, as well as the operating conditions of the stirring system, can play a pivotal role; the literature shows that neglecting these factors can lead to sub-optimal results. It is also well known that the sole use of the First Law of Thermodynamics as an optimization tool cannot yield satisfactory results, whereas the joint use of the First and Second Laws, condensed into the so-called entropy generation minimization (EGM) procedure, has shown itself able to drive the system towards better results. Therefore, the main objective of this paper is to determine the effects of the key parameters of the stirring system in optimization procedures by means of EGM applied to reactive systems. Such considerations have been made possible by dimensional analysis according to the Rayleigh and Buckingham method, which takes into account the physical and geometric parameters and the variables of the reactive system. For a simulation based on the production of propylene glycol, the results show a significant increase in the conversion rate from 36% (non-optimized system) to 95% (optimized system), with a consequent reduction of by-products. In addition, it has been possible to establish the influence of the work of the stirrer in the optimization procedure, which can be described as a function of the fluid viscosity and consequently of the temperature.
The conclusions also indicate that the use of entropic analysis as an optimization tool has proved to be simple, easy to apply, and computationally inexpensive.

Keywords: stirring systems, entropy, reactive system, optimization

Procedia PDF Downloads 243
4218 Surface Roughness in the Incremental Forming of Drawing Quality Cold Rolled CR2 Steel Sheet

Authors: Zeradam Yeshiwas, A. Krishnaia

Abstract:

The aim of this study is to verify the resulting surface roughness of parts formed by the single-point incremental forming (SPIF) process for ISO 3574 drawing quality cold rolled CR2 steel. The chemical composition of drawing quality cold rolled CR2 steel comprises 0.12 percent carbon, 0.5 percent manganese, 0.035 percent sulfur, and 0.04 percent phosphorus, with the remainder iron and negligible impurities. The experiments were performed on a 3-axis vertical CNC milling machining center equipped with a tool setup comprising a fixture and forming tools specifically designed and fabricated for the process. The CNC milling machine was used to transfer the tool path code generated in the Mastercam 2017 environment into three-dimensional motions through the linear incremental progress of the spindle. Blanks of drawing quality cold rolled CR2 steel sheet, 1 mm thick, were fixed along their periphery by a fixture, and hardened high-speed steel (HSS) tools with hemispherical tips of 8, 10, and 12 mm diameter were employed to fabricate sample parts. To investigate surface roughness, hyperbolic-cone specimens were fabricated based on the chosen experimental design. The effect of process parameters on surface roughness was studied using three important process parameters: tool diameter, feed rate, and step depth. The Taylor-Hobson Surtronic 3+ profilometer, in which a small tip is dragged across a surface while its deflection is recorded, was used to determine the surface roughness of the parts in terms of the arithmetic mean deviation (Rₐ). Finally, the optimum process parameters and the main factor affecting surface roughness were found using the Taguchi design of experiments and ANOVA.
A Taguchi design with three factors and three levels per factor was adopted; the standard orthogonal array L9 (3³) was selected using the array selection table. The finishing roughness parameter Rₐ was measured for each combination of the control factors in the experimental design, with four roughness measurements taken per component and averaged. Since the lowest value of Rₐ is what matters for surface roughness improvement, the ‘‘smaller-the-better’’ equation was used for the calculation of the S/N ratio. The effect of each control factor on surface roughness was then analyzed with an S/N response table. Optimum surface roughness was obtained at a feed rate of 1500 mm/min, a tool radius of 12 mm, and a step depth of 0.5 mm. The ANOVA result shows that step depth is the dominant factor affecting surface roughness (91.1%).
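The "smaller-the-better" signal-to-noise ratio used above has a standard closed form, S/N = -10 log10((1/n) Σ yᵢ²) in decibels, which can be sketched directly; the Rₐ readings below are invented for illustration and are not the study's measurements.

```python
import math

def sn_smaller_is_better(measurements):
    """Taguchi "smaller-the-better" S/N ratio in dB:
    S/N = -10 * log10( (1/n) * sum(y_i^2) ).
    Larger S/N is better, so smoother (lower-Ra) trials score higher."""
    n = len(measurements)
    return -10.0 * math.log10(sum(y * y for y in measurements) / n)

# Four Ra readings (micrometres) per trial, values invented for illustration.
trial_a = [1.42, 1.38, 1.45, 1.40]   # rougher surface -> lower S/N
trial_b = [0.61, 0.58, 0.63, 0.60]   # smoother surface -> higher S/N
print(f"trial A: {sn_smaller_is_better(trial_a):.2f} dB")
print(f"trial B: {sn_smaller_is_better(trial_b):.2f} dB")
```

Averaging these per-trial S/N values by factor level is what populates the S/N response table used to rank the control factors.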

Keywords: incremental forming, SPIF, drawing quality steel, surface roughness, roughness behavior

Procedia PDF Downloads 60
4217 Multi-Objectives Genetic Algorithm for Optimizing Machining Process Parameters

Authors: Dylan Santos De Pinho, Nabil Ouerhani

Abstract:

The energy consumption of machine tools is becoming critical for machine-tool builders and end-users for economic, ecological, and legislative reasons. Many machine-tool builders are seeking solutions that reduce the energy consumption of machine tools while preserving the same productivity rate and the same quality of machined parts. In this paper, we present the first results of a project conducted jointly by academic and industrial partners to reduce the energy consumption of a Swiss-type lathe. We employ genetic algorithms to find optimal machining parameters: the set of parameters that leads to the best trade-off between energy consumption, part quality, and tool lifetime. Three main machining process parameters are considered in our optimization technique, namely depth of cut, spindle rotation speed, and material feed rate; these have been identified as the most influential in the configuration of the Swiss-type machining process. A state-of-the-art multi-objective genetic algorithm has been used. The algorithm combines three fitness functions, objective functions that evaluate a set of parameters against the three objectives: energy consumption, quality of the machined parts, and tool lifetime. In this paper, we focus on the fitness function related to energy consumption. Four different energy-related fitness functions have been investigated and compared. The first refers to the Kienzle cutting force model. The second uses the material removal rate (MRR) as an indicator of energy consumption. The two others are non-deterministic, learning-based functions: one uses a simple neural network to learn the relation between the process parameters and the energy consumption from experimental data, and the other uses Lasso regression to determine the same relation.
The goal is then to find out which fitness function best predicts the energy consumption of a Swiss-type machining process for a given set of machining process parameters. Once determined, these functions may be used for optimization purposes: determining the optimal machining process parameters that lead to minimum energy consumption. The performance of the four fitness functions has been evaluated. The Tornos DT13 Swiss-type lathe was used to carry out the experiments, and a mechanical part involving various Swiss-type machining operations was selected. The evaluation process starts with generating a set of CNC (Computer Numerical Control) programs for machining the part, each considering a different set of machining process parameters. During machining, the power consumption of the spindle is measured, and all collected data are assigned to the appropriate CNC program and thus to the corresponding set of machining process parameters. The evaluation approach consists in calculating the correlation between the normalized measured power consumption and the normalized power consumption prediction for each of the four fitness functions. The evaluation shows that the Lasso and neural network fitness functions have the highest correlation coefficient, at 97%. The material removal rate (MRR) fitness function has a correlation coefficient of 90%, whereas the Kienzle-based fitness function has a correlation coefficient of 80%.
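The MRR-based fitness function and the correlation-based evaluation can both be sketched compactly. Below, MRR is taken as depth of cut × feed × cutting speed (a common simplification for turning) and compared against measured spindle power via the Pearson coefficient; the parameter sets and power readings are invented for illustration, not the paper's experimental data.

```python
import math

def mrr(depth_of_cut, feed_rate, cutting_speed):
    """Material removal rate: ap (mm) * f (mm/rev) * vc (mm/min) -> mm^3/min.
    Used, as in the paper, as a cheap proxy for cutting energy."""
    return depth_of_cut * feed_rate * cutting_speed

def pearson(xs, ys):
    """Pearson correlation between predicted and measured values (the
    evaluation criterion applied to each fitness function)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented (depth, feed, speed) sets and spindle-power readings in watts.
params = [(0.5, 0.05, 8000), (1.0, 0.08, 9000),
          (1.5, 0.10, 10000), (2.0, 0.12, 11000)]
measured_power_w = [210, 480, 930, 1550]
predicted = [mrr(*p) for p in params]
print(f"correlation: {pearson(predicted, measured_power_w):.3f}")
```

The same correlation computation, applied to each of the four prediction models, yields the 97%/90%/80% ranking reported above.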

Keywords: adaptive machining, genetic algorithms, smart manufacturing, parameters optimization

Procedia PDF Downloads 143
4216 Case Study: Optimization of Contractor’s Financing through Allocation of Subcontractors

Authors: Helen S. Ghali, Engy Serag, A. Samer Ezeldin

Abstract:

In many countries, the construction industry relies heavily on outsourcing models to execute projects and expand businesses to fit the diverse market. Such extensive integration of subcontractors has become an influential factor in the contractor's cash flow management; accordingly, subcontractors' financial terms are pivotal to the health of the contractor's cash flow. The aim of this research is to study the contractor's cash flow with respect to the owner's and subcontractors' payment management plans, considering variable advance payment, payment frequency, and lag and retention policies. The model is developed to provide contractors with a decision-support tool that can assist in selecting the optimum subcontracting plan to minimize the contractor's financing limits and optimize profit. The model is built using Microsoft Excel VBA, with a genetic algorithm as the optimization tool. Three objective functions are investigated: minimizing the highest negative overdraft value, minimizing the net present worth of the overdraft, and maximizing the project net profit. The model is validated on a full-scale project that includes both self-performed and subcontracted work packages. The results show potential in optimizing the contractor's negative cash flow values while assisting contractors in selecting suitable subcontractors to achieve the objective function.
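The first objective function named above, the highest negative overdraft, falls out of a running cash balance of receipts minus payments per period. A minimal sketch follows; the monthly figures are invented to show how lagged owner payments against earlier subcontractor outflows open a mid-project financing gap, and this is not the paper's VBA model.

```python
def cumulative_cash_flow(inflows, outflows):
    """Running net cash position per period (receipts minus payments)."""
    balance, profile = 0.0, []
    for cash_in, cash_out in zip(inflows, outflows):
        balance += cash_in - cash_out
        profile.append(balance)
    return profile

def max_overdraft(profile):
    """Highest negative balance: the first objective the model minimises."""
    return min(0.0, min(profile))

# Invented monthly figures (kUSD): owner payments arrive lagged, while
# subcontractor and self-performed costs are paid from month one.
inflows  = [0, 0, 300, 300, 400, 500]
outflows = [150, 200, 250, 250, 300, 250]
profile = cumulative_cash_flow(inflows, outflows)
print(profile)                 # → [-150.0, -350.0, -300.0, -250.0, -150.0, 100.0]
print(max_overdraft(profile))  # → -350.0
```

A genetic algorithm over subcontracting plans would, in effect, search for the payment-term combination that shrinks this -350 kUSD trough.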

Keywords: cash flow optimization, payment plan, procurement management, subcontracting plan

Procedia PDF Downloads 127
4215 The Holistic Nursing WebQuest: An Interactive Teaching/Learning Strategy

Authors: Laura M. Schwarz

Abstract:

WebQuests are an internet-based interactive teaching/learning tool that uses a scaffolded methodology. WebQuests employ critical thinking, afford inquiry-based constructivist learning, and readily employ Bloom's Taxonomy. WebQuests have generally been used as instructional technology tools in primary and secondary education and have more recently grown in popularity in higher education. The study of the efficacy of WebQuests as an instructional approach, however, has been limited, particularly in the nursing education arena. The purpose of this mixed-methods study was to determine nursing students' perceptions of the effectiveness of the Nursing WebQuest as a teaching/learning strategy for holistic nursing-related content. Quantitative findings (N=42) suggested that learners were active participants, used reflection, thought of new ideas, used analysis skills, discovered something new, and assessed the worth of something while taking part in the WebQuests. Qualitative findings indicated that participants found the WebQuests easy to understand and navigate; clear and organized; interactive; a good alternative learning format; and built on a variety of quality resources. Participants saw as drawbacks the additional time and work required and the occasional failed link or a link that caused them to lose their place in the WebQuest. Recommendations include using a larger sample size and more diverse populations from various programs and universities. In conclusion, WebQuests were found to be an effective teaching/learning tool, as positively assessed by study participants.

Keywords: holistic nursing, nursing education, teaching/learning strategy, WebQuests

Procedia PDF Downloads 124
4214 Suggestions to the Legislation about Medical Ethics and Ethics Review in the Age of Medical Artificial Intelligence

Authors: Xiaoyu Sun

Abstract:

In recent years, the rapid development of Artificial Intelligence (AI) has greatly advanced medicine, pharmaceuticals, and related fields, and the research and development of medical AI by scientific and commercial organizations is on a fast track. Ethics review is one of the critical procedures of registration for getting such products approved and launched. However, the current SOPs for ethics review are not sufficient to guide the healthy and rapid development of artificial intelligence in healthcare in China. The Ethical Review Measures for Biomedical Research Involving Human Beings was enacted by the National Health Commission of the People's Republic of China (NHC) on December 1st, 2016; from a legislative design perspective, however, it was neither updated in a timely manner nor in line with international trends in AI development. It was therefore a welcome step that the NHC published a consultation paper on an updated version on March 16th, 2021. Based on the most recent laws and regulations in the United States and the EU, and on in-depth interviews with 11 subject matter experts in China (including lawmakers, regulators, key members of ethics review committees, heads of Regulatory Affairs in the SaMD industry, and data scientists), several suggestions are proposed on top of the updated version. Although the new version indicates that Ethics Review Committees are to be created at the national, provincial, and individual-institute levels, the review authority of each level is not clarified. The suggestion is that the precise scope of review authority for each level should be defined based on a risk analysis and management model: complicated leading technologies, such as gene editing, should be reviewed by the national Ethics Review Committee, while it should be the job of institute-level Ethics Review Committees to review and approve clinical studies with less risk, such as an innovative cream to treat acne.
Furthermore, to standardize the research and development of artificial intelligence in healthcare in the age of AI, clearer guidance should be given on data security at the data, algorithm, and application layers in the process of ethics review. In addition, transparency and responsibility, two of the six principles in the Rome Call for AI Ethics, could be further strengthened in the updated version. Managing and developing AI well for the benefit of human beings is a goal shared by all countries. By learning from countries with more experience, China could become one of the most advanced countries in artificial intelligence in healthcare.

Keywords: biomedical research involving human beings, data security, ethics committees, ethical review, medical artificial intelligence

Procedia PDF Downloads 166
4213 Cheiloscopy: A Study on Predominant Lip Print Patterns among the Gujarati Population

Authors: Pooja Ahuja, Tejal Bhutani, M. S. Dahiya

Abstract:

Cheiloscopy, the study of lip prints, is a forensic investigation technique that deals with the identification of individuals based on lip patterns. The objectives of this study were to determine the predominant lip print pattern found among the Gujarati population, to evaluate whether any sex difference exists, and to study the permanence of the pattern over a six-month duration. The study comprised 100 healthy individuals (50 males and 50 females) of the Gujarati population of the Gandhinagar region of Gujarat state, India, in the age group of 18 to 25 years. Using the Suzuki and Tsuchihashi classification, lip prints were divided into four quadrants and also classified on the basis of the peripheral shape of the lips. The materials used to record the lip prints were dark brown lipstick, cellophane tape, and white bond paper. Lipstick was applied uniformly, and lip prints were taken on the glued portion of the cellophane tape, which was then stuck onto white bond paper. These lip prints were analyzed with a magnifying lens and virtually with a stereo microscope. The analysis showed the branched pattern, Type II (29.57 percent), to be the most predominant in the Gujarati population. The branched pattern, Type II (35.60 percent), and the long vertical pattern, Type I (28.28 percent), were most prevalent in males and females, respectively, and large full lips were predominant in both sexes. The study concludes that lip prints in any form can be an effective tool for the identification of an individual in closed or open groups.

Keywords: cheiloscopy, lip pattern, predominant, Gujarati population

Procedia PDF Downloads 296
4212 Multiperson Drone Control with Seamless Pilot Switching Using Onboard Camera and Openpose Real-Time Keypoint Detection

Authors: Evan Lowhorn, Rocio Alba-Flores

Abstract:

Traditional classification Convolutional Neural Networks (CNN) attempt to classify an image in its entirety. This becomes problematic when trying to perform classification with a drone’s camera in real-time due to unpredictable backgrounds. Object detectors with bounding boxes can be used to isolate individuals and other items, but the original backgrounds remain within these boxes. These basic detectors have been regularly used to determine what type of object an item is, such as “person” or “dog.” Recent advancement in computer vision, particularly with human imaging, is keypoint detection. Human keypoint detection goes beyond bounding boxes to fully isolate humans and plot points, or Regions of Interest (ROI), on their bodies within an image. ROIs can include shoulders, elbows, knees, heads, etc. These points can then be related to each other and used in deep learning methods such as pose estimation. For drone control based on human motions, poses, or signals using the onboard camera, it is important to have a simple method for pilot identification among multiple individuals while also giving the pilot fine control options for the drone. To achieve this, the OpenPose keypoint detection network was used with body and hand keypoint detection enabled. OpenPose supports the ability to combine multiple keypoint detection methods in real-time with a single network. Body keypoint detection allows simple poses to act as the pilot identifier. The hand keypoint detection with ROIs for each finger can then offer a greater variety of signal options for the pilot once identified. For this work, the individual must raise their non-control arm to be identified as the operator and send commands with the hand on their other arm. The drone ignores all other individuals in the onboard camera feed until the current operator lowers their non-control arm. 
When another individual wishes to operate the drone, they simply raise their arm once the current operator relinquishes control, and they can then begin controlling the drone with their other hand. This is all performed mid-flight, with no landing or script editing required. When using a desktop with a discrete NVIDIA GPU, the drone's 2.4 GHz Wi-Fi connection, combined with restricting OpenPose to body and hand detection only, allows this control method to perform as intended while maintaining the responsiveness required for practical use.
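The operator-selection rule described above (raise the non-control arm to become the pilot) can be sketched in a few lines. The keypoint indices, confidence threshold, and coordinates below are illustrative stand-ins for OpenPose-style output, not the authors' implementation:

```python
# Minimal sketch of the pilot-selection rule: the operator is whichever
# detected person raises one arm (wrist above shoulder). Keypoints are
# assumed to be OpenPose-style (x, y, confidence) triples, with image
# y increasing downward; the indices here are illustrative.

R_SHOULDER, R_WRIST = 2, 4   # assumed body-keypoint indices
L_SHOULDER, L_WRIST = 5, 7

def arm_raised(person, shoulder, wrist, min_conf=0.3):
    sx, sy, sc = person[shoulder]
    wx, wy, wc = person[wrist]
    return sc > min_conf and wc > min_conf and wy < sy  # wrist above shoulder

def select_pilot(people):
    """Return the index of the first person with a raised arm, else None."""
    for i, person in enumerate(people):
        if arm_raised(person, R_SHOULDER, R_WRIST) or \
           arm_raised(person, L_SHOULDER, L_WRIST):
            return i
    return None

# Two hypothetical detections: only the second person raises a wrist
# above the shoulder line.
person_a = {R_SHOULDER: (100, 200, 0.9), R_WRIST: (110, 320, 0.9),
            L_SHOULDER: (160, 200, 0.9), L_WRIST: (150, 330, 0.8)}
person_b = {R_SHOULDER: (400, 210, 0.9), R_WRIST: (390, 120, 0.9),
            L_SHOULDER: (460, 205, 0.9), L_WRIST: (470, 300, 0.9)}

print(select_pilot([person_a, person_b]))
```

Once a pilot index is selected, the same per-person keypoints (here, the hand ROIs) would be routed to the gesture classifier, and everyone else's detections ignored until the pilot lowers the identifying arm.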

Keywords: computer vision, drone control, keypoint detection, openpose

Procedia PDF Downloads 181
4211 A Pilot Study on the Development and Validation of an Instrument to Evaluate Inpatient Beliefs, Expectations and Attitudes toward Reflexology (IBEAR)-16

Authors: Samuel Attias, Elad Schiff, Zahi Arnon, Eran Ben-Arye, Yael Keshet, Ibrahim Matter, Boker Lital Keinan

Abstract:

Background: Despite the extensive use of manual therapies, reflexology in particular, no validated tools have been developed to evaluate patients' beliefs, attitudes, and expectations regarding reflexology. Such tools, however, are essential for improving the results of reflexology treatment by better adjusting it to patients' attitudes and expectations. They also enable assessing correlations with the clinical results of interventional studies using reflexology. Methods: The IBEAR (Inpatient Beliefs, Expectations and Attitudes toward Reflexology) tool contains 25 questions (8 demographic and 17 specifically addressing reflexology) and was constructed in several stages: brainstorming by a multidisciplinary team of experts; evaluation of each of the proposed questions by the experts' team; and assessment of the experts' degree of agreement on each question, based on a 1-7 Likert scale (1 = do not agree at all; 7 = agree completely). Cronbach's alpha was computed to evaluate the questionnaire's reliability, while factor analysis was used for further validation (228 patients). The questionnaire was tested and re-tested (48 h) on a group of 199 patients to ensure clarity and reliability, using the Pearson coefficient and the Kappa test, and was modified into its final form based on these results. Results: After its construction, the IBEAR questionnaire passed the expert group's preliminary consensus, evaluation of the questions' clarity (from 5.1 to 7.0), inner validation (from 5.5 to 7), and structural validation (from 5.5 to 6.75). Factor analysis pointed to two content worlds, dividing the questionnaire into 4 questions discussing attitudes and expectations versus 5 questions on beliefs and attitudes. Of the 221 questionnaires collected, Cronbach's alpha was calculated on the nine questions relating to beliefs, expectations, and attitudes regarding reflexology; this measure stood at 0.716 (satisfactory reliability). At the test-retest stage, 199 research participants filled in the questionnaire a second time. The Pearson coefficient for all questions ranged between 0.73 and 0.94 (good to excellent reliability). As for dichotomous answers, Kappa scores ranged between 0.66 and 1.0 (moderate to high). One question was removed from the IBEAR following questionnaire validation. Conclusions: The present study provides evidence that the proposed IBEAR-16 questionnaire is a valid and reliable tool for the characterization of potential reflexology patients and may be effectively used in settings that include the evaluation of inpatients' beliefs, expectations, and attitudes toward reflexology.
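Cronbach's alpha, the reliability coefficient used above, can be computed from scratch; the small Likert response matrix below (rows = respondents, columns = items) is invented for illustration:

```python
# Cronbach's alpha from a respondent-by-item matrix of Likert answers.
# The responses are made up; they are not the IBEAR study's data.

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    k = len(rows[0])                       # number of items
    item_vars = [variance([r[i] for r in rows]) for i in range(k)]
    total_var = variance([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses = [
    [5, 4, 5, 4],
    [3, 3, 4, 3],
    [4, 4, 4, 5],
    [2, 3, 2, 3],
    [5, 5, 4, 4],
]
print(round(cronbach_alpha(responses), 3))
```

Values above roughly 0.7, like the study's 0.716, are conventionally read as satisfactory internal consistency for a new scale.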

Keywords: reflexology, attitude, expectation, belief, CAM, inpatient

Procedia PDF Downloads 227
4210 EasyModel: Web-based Bioinformatics Software for Protein Modeling Based on Modeller

Authors: Alireza Dantism

Abstract:

Presently, describing the function of a protein sequence is one of the most common problems in biology. Usually, this problem can be facilitated by studying the three-dimensional structure of proteins. In the absence of an experimental protein structure, comparative modeling often provides a useful three-dimensional model of the protein, derived from at least one known protein structure. Comparative modeling predicts the three-dimensional structure of a given protein sequence (target) mainly based on its alignment with one or more proteins of known structure (templates). Comparative modeling consists of five main steps: 1. detection of similarity between the target sequence and at least one known template structure; 2. alignment of the target sequence and the template(s); 3. building a model based on the alignment with the selected template(s); 4. prediction of model errors; 5. optimization of the built model. There are many computer programs and web servers that automate the comparative modeling process. One of their most important advantages is that they make comparative modeling available to both experts and non-experts, who can easily do their own modeling without the need for programming knowledge; some experts, however, prefer to do their modeling manually with the help of programming, because this lets them maximize the accuracy of the modeling. In this study, a web-based tool called EasyModel has been designed to predict the tertiary structure of proteins using the PHP and Python programming languages. EasyModel receives as user inputs the unknown sequence (the target) and a protein structure file (the template) that shares a percentage of similarity with the target, predicts the tertiary structure of the target, and presents the results in the form of graphs and constructed protein files.
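Step 1 of the pipeline above (finding a template similar to the target) can be sketched with a naive percent-identity ranking. Real servers rely on alignment search tools such as BLAST; the sequences and template names here are toy stand-ins, and gaps are ignored:

```python
# Rank candidate templates by simple percent identity to the target.
# Sequences are already "aligned" in this toy example; real pipelines
# run a proper alignment search first.

def percent_identity(a, b):
    """Identity over the shorter of two already-aligned sequences."""
    n = min(len(a), len(b))
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return 100.0 * matches / n

target = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
templates = {
    "tmpl_1": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",  # identical toy template
    "tmpl_2": "MKTAYLAKQRQISFVKAHFSRQLEDRLGLIEVQ",  # a few substitutions
    "tmpl_3": "MSTNYIVKQGQISFAKSHESRQLEERAGLIKVQ",  # more distant
}

ranked = sorted(templates,
                key=lambda t: percent_identity(target, templates[t]),
                reverse=True)
best = ranked[0]
print(best, round(percent_identity(target, templates[best]), 1))
```

The highest-identity template would then be passed to the alignment and model-building steps (e.g., via Modeller, as the tool's name suggests).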

Keywords: structural bioinformatics, protein tertiary structure prediction, modeling, comparative modeling, modeller

Procedia PDF Downloads 91
4209 Use of Front-Face Fluorescence Spectroscopy and Multiway Analysis for the Prediction of Olive Oil Quality Features

Authors: Omar Dib, Rita Yaacoub, Luc Eveleigh, Nathalie Locquet, Hussein Dib, Ali Bassal, Christophe B. Y. Cordella

Abstract:

The potential of front-face fluorescence coupled with chemometric techniques, namely parallel factor analysis (PARAFAC) and multiple linear regression (MLR), as a rapid analysis tool to characterize Lebanese virgin olive oils was investigated. Fluorescence fingerprints were acquired directly on 102 Lebanese virgin olive oil samples in the range of 280-540 nm in excitation and 280-700 nm in emission. A PARAFAC model with seven components was considered optimal, with a residual of 99.64% and a core consistency value of 78.65. The model revealed seven main fluorescence profiles in olive oil, mainly associated with tocopherols, polyphenols, chlorophyllic compounds, and oxidation/hydrolysis products. Twenty-three MLR models based on PARAFAC scores were generated, the majority of which showed a good correlation coefficient (R > 0.7 for 12 predicted variables) and thus satisfactory prediction performance. Acid value, peroxide value, and Delta K had the models with the highest predictive ability, with R values of 0.89, 0.84, and 0.81, respectively. Among fatty acids, linoleic and oleic acids were also highly predicted, with R values of 0.8 and 0.76, respectively. Factors contributing to the models' construction were related to common fluorophores found in olive oil, mainly chlorophyll, polyphenols, and oxidation products. This study demonstrates the interest of front-face fluorescence as a promising tool for the quality control of Lebanese virgin olive oils.
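The regression step described above (predicting a quality index from PARAFAC component scores) can be sketched as an ordinary least-squares fit via the normal equations. The two-component scores and peroxide values below are invented for illustration, not the study's data:

```python
# Multiple linear regression y = b0 + b1*x1 + b2*x2 on PARAFAC-style
# scores, solved through the normal equations with Gaussian elimination.
# All numbers are hypothetical.

def fit_mlr(X, y):
    """Least-squares coefficients [b0, b1, b2, ...] for X with intercept."""
    A = [[1.0] + list(row) for row in X]       # design matrix with intercept
    p = len(A[0])
    # Augmented normal equations: (A^T A | A^T y)
    M = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(p)]
         + [sum(A[k][i] * y[k] for k in range(len(A)))] for i in range(p)]
    for i in range(p):                          # forward elimination
        piv = M[i][i]
        for j in range(i + 1, p):
            f = M[j][i] / piv
            M[j] = [mj - f * mi for mj, mi in zip(M[j], M[i])]
    b = [0.0] * p                               # back substitution
    for i in reversed(range(p)):
        b[i] = (M[i][p] - sum(M[i][j] * b[j] for j in range(i + 1, p))) / M[i][i]
    return b

scores = [(0.2, 1.1), (0.5, 0.9), (0.8, 0.4), (1.1, 0.2), (1.4, 0.1)]
peroxide = [3.1, 4.0, 5.2, 6.1, 7.0]            # hypothetical lab values
b0, b1, b2 = fit_mlr(scores, peroxide)
pred = [b0 + b1 * x1 + b2 * x2 for x1, x2 in scores]
print([round(v, 2) for v in pred])
```

In the study, one such model is fitted per predicted variable (acidity, peroxide value, Delta K, fatty acids, and so on), and R between `pred` and the laboratory values summarizes each model's quality.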

Keywords: front-face fluorescence, Lebanese virgin olive oils, multiple linear regression, PARAFAC analysis

Procedia PDF Downloads 451
4208 CRISPR/Cas9 Based Gene Stacking in Plants for Virus Resistance Using Site-Specific Recombinases

Authors: Sabin Aslam, Sultan Habibullah Khan, James G. Thomson, Abhaya M. Dandekar

Abstract:

Losses due to viral diseases pose a serious threat to crop production. The quick breakdown of resistance to viruses like Cotton Leaf Curl Virus (CLCuV) demands a proficient technology for engineering durable resistance. Gene stacking has recently emerged as a potential approach for integrating multiple genes in crop plants. In the present study, recombinase technology has been used for site-specific gene stacking. A target vector (pG-Rec) was designed to engineer a predetermined specific site in the plant genome at which genes can be stacked repeatedly. Using Agrobacterium-mediated transformation, pG-Rec was transformed into Coker-312 along with Nicotiana tabacum L. cv. Xanthi and Nicotiana benthamiana. Transgene analysis of the target lines was conducted through junction PCR. The transgene-positive target lines were used for further transformations to site-specifically stack two genes of interest using the Bxb1 and PhiC31 recombinases. First, Cas9 driven by multiplex gRNAs (targeting the Rep gene of CLCuV) was site-specifically integrated into the target lines, as confirmed by junction PCR and real-time PCR. The resulting plants were subsequently used to stack the second gene of interest (the AVP3 gene from Arabidopsis, for enhancing cotton plant growth). Each gene addition is accompanied by the removal of the marker genes, which can thus be recycled in the next round of gene stacking. Consequently, marker-free transgenic plants were produced with two genes stacked at the specific site. These transgenic plants can be potential germplasm for introducing resistance against various strains of cotton leaf curl virus (CLCuV) and abiotic stresses. The results demonstrate gene stacking in crop plants, a technology that can be used to introduce multiple genes sequentially at predefined genomic sites.
The current climate change scenario highlights the value of such technologies, as they allow major environmental challenges to be tackled through several traits in a single step. After the resulting plants are evaluated for virus resistance, the lines can serve as a starting point for stacking further genes in cotton for other traits, as well as for molecular breeding with elite cotton lines.

Keywords: cotton, CRISPR/Cas9, gene stacking, genome editing, recombinases

Procedia PDF Downloads 151
4207 Fish Scales as a Nonlethal Screening Tool for Assessing the Effects of Surface Water Contaminants in Cyprinus Carpio

Authors: Shahid Mahboob, Hafiz Muhammad Ashraf, Salma Sultana, Tayyaba Sultana, Khalid Al-Ghanim, Fahid Al-Misned, Zubair Ahmedd

Abstract:

There is an increasing need for an effective tool to estimate the risks derived from the large number of pollutants released into the environment by human activities. Typical screening procedures are highly invasive or lethal to the fish. Recent studies show that fish scales respond biochemically to a range of contaminants, including toxic metals, organic compounds, and endocrine disruptors. The present study evaluated the effects of surface water contaminants on Cyprinus carpio in the Ravi River by comparing DNA extracted non-lethally from their scales to DNA extracted from the scales of fish collected from a controlled fish farm. A single, random sampling was conducted. Fish were broadly categorised into three weight categories (W1, W2, and W3). The experimental samples in the W1, W2, and W3 categories had an average DNA concentration (µg/µl) lower than that of the control samples. All control samples had a single DNA band, whereas the experimental samples showed one to two bands in W1 fish, two bands in W2 fish, and fragmentation in the form of three bands in W3 fish. These bands reflect the effects of pollution on fish in the Ravi River. On the basis of the findings of this study, we propose that fish scales can be successfully employed as a new non-lethal tool for evaluating the effects of surface water contaminants.

Keywords: fish scales, Cyprinus carpio, heavy metals, non-invasive, DNA fragmentation

Procedia PDF Downloads 408
4206 The Position of Cooperatives and Social Economy in Solving the Problems of Today's Society

Authors: Mohammad Abbasi

Abstract:

Cooperatives around the world, relying on the principle of mutual self-help, are a natural tool for social and economic development and for securing the interests of local communities. The social economy consists of institutions, cooperatives, mutual organizations, unions, and associations whose activities have both social and economic aspects. Given the nature of cooperative companies, it can be claimed that all cooperatives have social and economic goals, because every cooperative is formed to meet the common needs of its members. These needs are sometimes for housing or health services, and sometimes members seek access through the cooperative to products and services, employment, and continuous income (in most rural areas of Iran, for example, needs are of this type). This article examines the broad ways in which Iran's cooperatives participate in the social economy, offers innovative approaches to solving social issues and problems, and discusses the potential for innovation and growth in using the cooperative model to meet economic and social needs. Cooperatives whose activities are mostly "social" are discussed, many of which work in cooperation with government programs in fields such as health, housing, and childcare, and which have proven to offer a cost-effective model for providing services. The conclusion of this discussion shows that the cooperative model of economic activity, with a hundred years of history in Iran, has been able to prove its value as a tool of social, technological, and economic innovation. With about 10 million members in Iran, cooperatives have shown that they are well known and trusted by the people.

Keywords: types of cooperatives, social economy, Iran, non-governmental organizations, justice, consumption pattern

Procedia PDF Downloads 15
4205 Using Assessment Criteria as a Pedagogic Tool to Develop Argumentative Essay Writing

Authors: Sruti Akula

Abstract:

Assessment criteria are mostly used for assessing skills like writing and speaking. However, they can also be used as a pedagogic tool to develop writing skills. A study was conducted with higher secondary learners (Class XII, Kendriya Vidyalaya) to investigate the effectiveness of assessment criteria in developing argumentative essay writing. In order to raise awareness about the features of the argumentative essay, assessment criteria were shared with the learners. Along with these, self-evaluation checklists were given to the learners to guide them through the writing process. During the study, learners wrote multiple drafts with the help of the assessment criteria, the self-evaluation checklists, and teacher feedback at different stages of their writing. It was observed that learners became more aware of the features of the argumentative essay, which in turn improved their argumentative essay writing. In addition, the self-evaluation checklists improved their ability to reflect on their work, thereby increasing learner autonomy in the class. Hence, it can be claimed that both assessment criteria and self-evaluation checklists are effective pedagogic tools for developing argumentative essay writing, and teachers can be trained to create and use such tools to develop learners' writing skills in an effective way. The presentation will discuss the approach adopted in the study to teach argumentative essay writing, along with its rationale. The tools used in the study will be shared, and the data collected in the form of written scripts, self-evaluation checklists, and student interviews will be analyzed to validate the claims. Finally, practical implications of the study will be put forward, such as ways of using assessment criteria and checklists to raise learner awareness and autonomy and to keep learners informed about task requirements and genre features.

Keywords: argumentative essay writing, assessment criteria, self-evaluation checklists, pedagogic

Procedia PDF Downloads 505
4204 Application of Fuzzy Analytical Hierarchical Process in Evaluation Supply Chain Performance Measurement

Authors: Riyadh Jamegh, AllaEldin Kassam, Sawsan Sabih

Abstract:

In modern market trends, organizations face a high-pressure environment characterized by globalization, high competition, and customer orientation, so it is crucial to know and control the weak and strong points of the supply chain in order to improve performance. Performance measurement is therefore presented as an important tool of supply chain management, because it enables organizations to control, understand, and improve their efficiency. This paper aims to identify supply chain performance measurement (SCPM) by using the Fuzzy Analytical Hierarchical Process (FAHP). In our real application, the performance of organizations is estimated based on four parameters: the cost parameter indicator (CPI), the inventory turnover parameter indicator (INPI), the raw material parameter indicator (RMPI), and the safety stock level parameter indicator (SSPI); these indicators vary in their impact on performance depending on the policies and strategies of the organization. In this research, the FAHP technique has been used to identify the importance of these parameters. A first fuzzy inference (FIR1) is then applied to identify the performance indicator of each factor, depending on the factor's importance and its value, and a second fuzzy inference (FIR2) integrates the effect of these indicators to identify the SCPM, which represents the required output. The developed approach provides an effective tool for the evaluation of supply chain performance measurement.
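The FAHP weighting step can be illustrated with Buckley's geometric-mean method on triangular fuzzy numbers, one common FAHP variant. The pairwise importance judgments among the four indicators below are invented for demonstration and are not taken from the study:

```python
import math

# Buckley's geometric-mean FAHP on triangular fuzzy numbers (l, m, u).
# The pairwise judgments are hypothetical; entry [i][j] expresses how
# much more important indicator i is than indicator j.

LABELS = ["CPI", "INPI", "RMPI", "SSPI"]
ONE = (1.0, 1.0, 1.0)
M = [
    [ONE, (2, 3, 4), (1, 2, 3), (3, 4, 5)],
    [(1/4, 1/3, 1/2), ONE, (1/2, 1, 2), (1, 2, 3)],
    [(1/3, 1/2, 1), (1/2, 1, 2), ONE, (2, 3, 4)],
    [(1/5, 1/4, 1/3), (1/3, 1/2, 1), (1/4, 1/3, 1/2), ONE],
]

def fuzzy_geo_mean(row):
    """Component-wise geometric mean of a row of triangular fuzzy numbers."""
    n = len(row)
    return tuple(math.prod(t[k] for t in row) ** (1 / n) for k in range(3))

def fahp_weights(matrix):
    gms = [fuzzy_geo_mean(row) for row in matrix]
    totals = [sum(g[k] for g in gms) for k in range(3)]
    # Fuzzy weight: divide the lower bound by the upper total and vice versa.
    fuzzy_w = [(g[0] / totals[2], g[1] / totals[1], g[2] / totals[0])
               for g in gms]
    crisp = [(l + m + u) / 3 for l, m, u in fuzzy_w]  # centroid defuzzification
    s = sum(crisp)
    return [c / s for c in crisp]                      # normalize to sum to 1

weights = fahp_weights(M)
print({lab: round(w, 3) for lab, w in zip(LABELS, weights)})
```

The resulting crisp weights would then feed the first fuzzy inference (FIR1) as the importance of each indicator.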

Keywords: fuzzy performance measurements, supply chain, fuzzy logic, key performance indicator

Procedia PDF Downloads 140
4203 Social Media Marketing in Russia

Authors: J. A. Ageeva, Z. S. Zavyalova

Abstract:

The article considers social media as a tool for business promotion. We analyze and compare the SMM experience in Western countries and Russia. A short review of Russian social networks is given, including their peculiar features, and the main problems and perspectives of Russian SMM are described.

Keywords: social media, social networks, marketing, SMM

Procedia PDF Downloads 548
4202 Design of a New Architecture of IDS Called BiIDS (IDS Based on Two Principles of Detection)

Authors: Yousef Farhaoui

Abstract:

An IDS is a tool used to improve the level of security. In this paper, we present different architectures of IDS. We also discuss measures that define the effectiveness of an IDS, as well as very recent work on the standardization and homogenization of IDS. Finally, we propose a new model of IDS called BiIDS (an IDS based on two principles of detection).

Keywords: intrusion detection, architectures, characteristic, tools, security

Procedia PDF Downloads 458
4201 An Investigation on Physics Teachers’ Views Towards Context Based Learning Approach

Authors: Medine Baran, Abdulkadir Maskan, Mehmet Ikbal Yetişir, Mukadder Baran, Azmi Türkan, Şeyma Yaşar

Abstract:

The purpose of this study was to determine the views of physics teachers from several secondary schools in Turkey towards the context-based learning approach. In the study, the context-based learning opinion questionnaire developed by the researchers for use as the data collection tool was piloted with 250 physics teachers. The questionnaire, examined by the researchers and field experts, was initially made up of 53 items; following the evaluation process, it included 37 items. In this way, the reliability and validity process of the measurement tool was completed. In the end, the finalized questionnaire was applied to 144 physics teachers from several secondary schools in different cities in Turkey (F: 73, M: 71). The participants were selected on the basis of ease of access. The results revealed no remarkable difference between the views of the physics teachers with respect to their gender, region, or school. However, when the individual items in the questionnaire were considered, it was found that the participants agreed strikingly on some of the choices: there were large differences between the frequencies of those who agreed and those who disagreed with 16 items in the questionnaire. Therefore, as the next phase of the present study, further research has been planned using these same questions. Based on these questions, which received opposing responses, physics teachers will be asked for their views about the results of the study using the interview technique, one of the qualitative research techniques. In this way, the results will be evaluated both by the researchers and by the participants, and the problems and difficulties will be determined. As a result, related suggestions can be put forward.

Keywords: context-based learning, physics teachers, views

Procedia PDF Downloads 371
4200 A Construction Management Tool: Determining a Project Schedule Typical Behaviors Using Cluster Analysis

Authors: Natalia Rudeli, Elisabeth Viles, Adrian Santilli

Abstract:

Delays in the construction industry are a global phenomenon. Many construction projects experience extensive delays, exceeding the initially estimated completion time. The main purpose of this study is to identify typical behaviors of construction projects in order to develop a prognosis and management tool. Knowing a construction project's schedule tendency will enable evidence-based decision-making, allowing corrective action to be taken before delays occur. This study presents an innovative approach that uses the cluster analysis method to support predictions during Earned Value Analyses. A clustering analysis was used to predict the future behavior of scheduling and of the principal Earned Value Management (EVM) and Earned Schedule (ES) indexes in construction projects. The analysis was made using a database of 90 different construction projects. It was validated with additional data extracted from the literature and with another 15 contrasting projects. For all projects, planned and executed schedules were collected, and the EVM and ES principal indexes were calculated. A complete linkage classification method was used; the cluster analysis thus considers that the distance (or similarity) between two clusters must be measured by their most disparate elements, i.e., that the distance is given by the maximum span among their components. Finally, through the use of the EVM and ES indexes and Tukey and Fisher pairwise comparisons, the statistical dissimilarity was verified and four clusters were obtained. It can be said that construction projects show an average delay of 35% of their planned completion time. Furthermore, four typical behaviors were found, and for each of the obtained clusters, the interim milestones and the necessary rhythms of construction were identified.
In general, the detected typical behaviors are: (1) projects that perform 5% of the work in the first two tenths of the schedule and then maintain a constant rhythm until completion (greater than 10% for each remaining tenth), being able to finish in the initially estimated time; (2) projects that start with an adequate construction rate but suffer minor delays, culminating in a total delay of almost 27% of the planned time; (3) projects that start with a performance below the planned rate and end up with an average delay of 64%; and (4) projects that begin with a poor performance, suffer great delays, and end up with an average delay of 120% of the planned completion time. The obtained clusters compose a tool to identify the behavior of new construction projects by comparing their current work performance to the validated database, thus allowing the correction of initial estimations towards more accurate completion schedules.
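The complete-linkage rule described above (cluster distance = distance between the most disparate members) can be sketched in a few lines of Python. The project profiles below (early progress, final delay fraction) are invented for illustration, not taken from the study's database:

```python
# Agglomerative clustering with the complete-linkage rule: merge the two
# clusters whose *farthest* pair of members is closest. The points are
# hypothetical project profiles (early progress, final delay fraction).

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def complete_linkage(points, n_clusters):
    clusters = [[p] for p in points]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # Complete linkage: maximum span between the two clusters.
                d = max(dist(a, b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters[j]
        del clusters[j]
    return clusters

projects = [(0.05, 0.00), (0.06, 0.02),    # on-time behavior
            (0.04, 0.27), (0.05, 0.30),    # minor delays
            (0.02, 0.64), (0.03, 0.60),    # large delays
            (0.01, 1.20), (0.01, 1.15)]    # severe delays
groups = complete_linkage(projects, 4)
print(sorted(len(g) for g in groups))
```

With profiles this well separated, the four recovered groups mirror the four typical behaviors; a new project would be assigned to the cluster whose members it is closest to under the same maximum-span distance.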

Keywords: cluster analysis, construction management, earned value, schedule

Procedia PDF Downloads 257